Category Archives: AWS

A Step-by-Step Guide to Deploying Cloud Foundry on AWS

1. Configure the Following on AWS

  • VPC (Virtual Private Cloud)
  • IAM User
  • Elastic IP
  • Key Pair
  • Security Group
  • Route 53

a. VPC – Configuring the VPC

Select Launch VPC Wizard


Click on Select to continue

Enter the IPv4 CIDR details and click Create VPC

VPC Created Successfully

b. IAM User

Go to IAM under Services

Click Add User

Enter a username and click Next: Permissions

Select the AdministratorAccess policy and click Review

Click on Create User

The user is created successfully; download the CSV file containing the access keys

c. Elastic IP

Select Elastic IPs from the EC2 Dashboard

Select Allocate new address

Select Allocate Address

Elastic IP allocated successfully

d. Key Pair

Select Key Pairs from EC2 Dashboard

Click on Create Key Pair

Enter the name and click on Create

The key pair is created and the .pem file is downloaded automatically

e. Security Group

Select Security Groups from EC2 Dashboard

Select Create Security Group

Enter the details and click Create

The bosh security group is created

Select the bosh security group, choose Edit inbound rules, and add the rules BOSH needs (typically SSH on port 22, the BOSH Agent on 6868, the BOSH Director on 25555, and all traffic within the security group)

f. Route 53

Select Route 53 from Services

Click on Hosted Zones

Click on Create Hosted Zone

Enter the domain name and click Create

The nbos.co hosted zone is created

Add an A record for *.nbos.co and save the record set


2. Create an Instance to Deploy Bosh Director

a. Create an EC2 Instance

Select Launch Instance from EC2 Dashboard

b. Select Amazon Linux

c. Select t2.micro and click Next

d. Select the network (VPC) and subnet created earlier and proceed to Configure Security Group

e. Select the bosh security group and click Review and Launch

f. Click Launch to launch the instance

g. Wait until the instance is launched and shows the running state

3. Configure the Bosh CLI and Deploy the Bosh Director

a. Login to the EC2 instance

b. Verify the bosh version
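The command snippets were lost with the screenshots; as a sketch, downloading a bosh CLI release binary on the instance and checking its version might look like this (the version number below is only an example — pick a current release from the cloudfoundry/bosh-cli releases page):

```shell
# Download a bosh CLI release binary (v5.4.0 is an example version)
curl -Lo bosh \
  https://github.com/cloudfoundry/bosh-cli/releases/download/v5.4.0/bosh-cli-5.4.0-linux-amd64
chmod +x bosh
sudo mv bosh /usr/local/bin/bosh

# Verify the version
bosh --version
```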

c. Install the following packages
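The package list was in a screenshot. Based on the BOSH CLI dependency documentation, a plausible set for an Amazon Linux / RHEL-style host is the following (adjust for your AMI):

```shell
# Dependencies bosh create-env typically needs on a RHEL-style host
sudo yum install -y gcc gcc-c++ ruby ruby-devel mysql-devel \
  postgresql-devel postgresql-libs sqlite-devel libxslt-devel \
  libxml2-devel patch openssl
```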

d. Make a directory named bosh

e. Clone the bosh-deployment (Bosh Director) template

Update the cloud-config.yml file under bosh-deployment/aws
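Steps d and e boil down to something like this (the repository URL is the official cloudfoundry/bosh-deployment project):

```shell
mkdir bosh && cd bosh
# Official BOSH Director deployment templates
git clone https://github.com/cloudfoundry/bosh-deployment.git
```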

f. Deploy the Bosh Director using create-env
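The create-env invocation was in a screenshot; based on the official bosh-deployment AWS instructions it might look like the following — every -v value here (IPs, region, key name, subnet ID, credentials, key path) is a placeholder you replace with the values created in step 1:

```shell
bosh create-env bosh-deployment/bosh.yml \
  --state=state.json \
  --vars-store=creds.yml \
  -o bosh-deployment/aws/cpi.yml \
  -v director_name=bosh-1 \
  -v internal_cidr=10.0.0.0/24 \
  -v internal_gw=10.0.0.1 \
  -v internal_ip=10.0.0.6 \
  -v access_key_id=<IAM-user-access-key> \
  -v secret_access_key=<IAM-user-secret-key> \
  -v region=us-east-1 \
  -v az=us-east-1a \
  -v default_key_name=bosh \
  -v default_security_groups=[bosh] \
  -v subnet_id=<subnet-id> \
  --var-file private_key=~/bosh.pem
```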

Wait until it completes

g. Connect to the Bosh Director
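A sketch of connecting, following the bosh-deployment AWS guide — 10.0.0.6 is the example internal IP used above, and the CA certificate and admin password come out of the creds.yml generated by create-env:

```shell
# Alias the new environment, using the CA cert stored in creds.yml
bosh alias-env bosh-1 -e 10.0.0.6 \
  --ca-cert <(bosh int ./creds.yml --path /director_ssl/ca)

# Log in as admin; the password was generated into creds.yml
export BOSH_CLIENT=admin
export BOSH_CLIENT_SECRET=$(bosh int ./creds.yml --path /admin_password)

# Should print the director's name and UUID
bosh -e bosh-1 env
```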

4. Deploy Cloud Foundry

a. Clone cf-deployment

b. Upload a stemcell to the director
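Steps a and b might look like this — the stemcell line is an example; use the stemcell your cf-deployment release notes call for:

```shell
# Official Cloud Foundry deployment manifests
git clone https://github.com/cloudfoundry/cf-deployment.git

# Upload an AWS stemcell to the director (example stemcell shown)
bosh -e bosh-1 upload-stemcell \
  https://bosh.io/d/stemcells/bosh-aws-xen-hvm-ubuntu-trusty-go_agent
```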

c. Update the cf-deployment.yml under cf-deployment with the content below

d. Deploy Cloud Foundry
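The deploy command was in a screenshot; based on the cf-deployment README it might look like the following, with nbos.co as the system domain from the Route 53 step and deployment-vars.yml as the vars store referenced in step 5:

```shell
bosh -e bosh-1 -d cf deploy cf-deployment/cf-deployment.yml \
  -o cf-deployment/operations/aws.yml \
  -v system_domain=nbos.co \
  --vars-store cf-deployment/deployment-vars.yml
```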

Wait until the deployment completes

e. Associate the Elastic IP with the router instance created by cf-deployment, which is mapped to *.nbos.co in DNS

5. Log in to cf

a. Set the cf target

b. Retrieve the admin password from cf-deployment/deployment-vars.yml, generated while deploying cf-deployment

c. Verify cf API
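Steps a–c might be sketched as follows — api.nbos.co assumes the system domain configured earlier, and /cf_admin_password is the variable name cf-deployment generates into the vars store:

```shell
# Target the CF API on the system domain created earlier
cf api api.nbos.co --skip-ssl-validation

# Pull the generated admin password out of deployment-vars.yml
CF_ADMIN_PASSWORD=$(bosh int cf-deployment/deployment-vars.yml \
  --path /cf_admin_password)

# Authenticate and list orgs; a successful listing confirms the API works
cf auth admin "$CF_ADMIN_PASSWORD"
cf orgs
```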


Are you planning to move your existing GBs or TBs of data to S3?
The Node.js script below might help you get started and upload your data to S3.

Prerequisite:

Ensure that the aws-sdk node module is configured. If not, follow the AWS documentation: https://aws.amazon.com/sdk-for-node-js/

s3 Options:

Usually accessKeyId and secretAccessKey are enough to get it working. But if you are behind a proxy, set the commented s3Options fields accordingly.

Source Directory

For Windows, the directory delimiter needs to be escaped, e.g. c:\\mysite\\uploads

For Linux you can use something like '/var/www/site/uploads'
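The script itself did not survive; a minimal sketch using the aws-sdk v2 module, with placeholder credentials, bucket name, and source directory, might look like:

```javascript
// Bulk-upload a local directory to S3 (sketch; all values are placeholders)
const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk');

const s3Options = {
  accessKeyId: 'YOUR_ACCESS_KEY_ID',
  secretAccessKey: 'YOUR_SECRET_ACCESS_KEY',
  region: 'us-east-1',
  // Behind a proxy? Uncomment and adjust:
  // httpOptions: { proxy: 'http://proxy.example.com:8080' },
};

const sourceDir = '/var/www/site/uploads'; // Windows: 'c:\\mysite\\uploads'
const bucket = 'my-bucket';

// Recursively collect every file path under dir
function listFiles(dir) {
  return fs.readdirSync(dir).reduce((acc, name) => {
    const full = path.join(dir, name);
    return fs.statSync(full).isDirectory()
      ? acc.concat(listFiles(full))
      : acc.concat(full);
  }, []);
}

async function uploadAll() {
  const s3 = new AWS.S3(s3Options);
  for (const file of listFiles(sourceDir)) {
    // Use forward slashes for the S3 key even on Windows
    const key = path.relative(sourceDir, file).split(path.sep).join('/');
    await s3
      .upload({ Bucket: bucket, Key: key, Body: fs.createReadStream(file) })
      .promise();
    console.log('uploaded', key);
  }
}

uploadAll().catch(console.error);
```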


Use Case:

A PHR (Patient Health Record) application gets patients' health reports from various hospitals and labs. The health records are in XML format, and they need to be processed and stored in a database.

Let's see how we can use Amazon AWS to implement this use case.

Step 1: Create an Amazon Simple Storage Service (S3) bucket where the documents will be uploaded.

Step 2: Create an AWS Lambda function using NodeJs. Configure the Lambda function to be triggered when an XML file is uploaded to S3.

Step 3:

Create a MySQL database using the AWS Relational Database Service (RDS).

Step 4: 

The Lambda NodeJs code can be downloaded from the Wavelabs Git Repo.

Package the JS file into a zip file and upload it as the NodeJS Lambda function. Make the necessary changes in the index.js file as per your requirements.
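If the repo is not handy, here is a hedged sketch of what such an index.js could look like. It assumes the xml2js and mysql npm modules, environment variables for the RDS endpoint, and a hypothetical health_records table and patient XML schema — adapt all of these to your own patient.xml:

```javascript
// Lambda handler sketch: fetch the uploaded XML from S3, parse it,
// and insert a row into MySQL (table/columns are hypothetical)
const AWS = require('aws-sdk');
const mysql = require('mysql');
const xml2js = require('xml2js');

const s3 = new AWS.S3();

exports.handler = (event, context, callback) => {
  // The S3 put event carries the bucket and object key that triggered us
  const bucket = event.Records[0].s3.bucket.name;
  const key = decodeURIComponent(
    event.Records[0].s3.object.key.replace(/\+/g, ' ')
  );

  s3.getObject({ Bucket: bucket, Key: key }, (err, data) => {
    if (err) return callback(err);
    xml2js.parseString(data.Body.toString(), (parseErr, result) => {
      if (parseErr) return callback(parseErr);

      const record = result.patient; // hypothetical root element
      const conn = mysql.createConnection({
        host: process.env.RDS_HOST, // RDS endpoint from Step 3
        user: process.env.RDS_USER,
        password: process.env.RDS_PASSWORD,
        database: 'phr',
      });
      conn.query(
        'INSERT INTO health_records (patient_id, report) VALUES (?, ?)',
        [record.id[0], JSON.stringify(record)],
        (qErr) => {
          conn.end();
          callback(qErr, qErr ? null : 'stored ' + key);
        }
      );
    });
  });
};
```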

Output:

From the AWS dashboard, upload a sample XML file (use the file patient.xml from the repo) to the S3 bucket.

Connect to MySQL using the command line or Workbench and see the results.