Resize AWS EC2 Volume

In this exercise, we will try to resize a volume for an EC2 instance. A few notes on EBS volume resizing:

- Size (any volume type) and IOPS (some volume types) can be increased for EBS volumes
- Repartitioning is required after resizing
- The volume is still usable while the size is being increased
- The EBS volume size can't be decreased

Check the volume before the resize:

```
[root@ip-111-111-111-111 ec2-user]# lsblk
NAME      MAJ:MIN RM SIZE RO TYPE MOUNTPOINTS
xvda      202:0    0   8G  0 disk
├─xvda1   202:1    0   8G  0 part /
├─xvda127 259:0    0   1M  0 part
└─xvda128 259:1    0  10M  0 part
[root@ip-111-111-111-111 ec2-user]# df -h
Filesystem      Size  Used Avail Use% Mounted on
devtmpfs        4.
```
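The resize flow described above can be sketched as below. The volume id is a placeholder, and the commands assume an XFS root filesystem on partition 1 of `/dev/xvda` (as in the `lsblk` output):

```shell
# Increase the volume size to 16 GiB (volume id is a placeholder)
aws ec2 modify-volume --volume-id vol-0123456789abcdef0 --size 16

# Wait for the modification to reach the "optimizing" or "completed" state
aws ec2 describe-volumes-modifications --volume-ids vol-0123456789abcdef0

# On the instance: grow the partition, then the filesystem
sudo growpart /dev/xvda 1
sudo xfs_growfs -d /    # use resize2fs for an ext4 filesystem instead
```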

Docker With ECR

Install Docker on EC2:

```
$ sudo yum update -y
$ sudo amazon-linux-extras install docker
$ sudo service docker start
$ sudo usermod -a -G docker ec2-user
$ docker info
```

Or install it at launch time with EC2 user data:

```
#!/bin/sh
yum update -y
amazon-linux-extras install docker
service docker start
usermod -a -G docker ec2-user
chkconfig docker on
```

Create an image with a sample Dockerfile:

```
FROM alpine
LABEL description="Running Docker from EC2"
WORKDIR /src
RUN echo "Hello world" > hello.
```
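Since the post covers ECR, pushing the built image to a private ECR repository typically looks like the sketch below. The account id, region, and repository name are placeholders:

```shell
REGION=ap-southeast-2
ACCOUNT_ID=111111111111
REPO=myapp

# Create the repository and log Docker in to the private registry
aws ecr create-repository --repository-name "$REPO"
aws ecr get-login-password --region "$REGION" \
  | docker login --username AWS --password-stdin "$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com"

# Build, tag, and push the image
docker build -t "$REPO" .
docker tag "$REPO:latest" "$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO:latest"
docker push "$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO:latest"
```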

How to Use AWS KMS to Encrypt Data

Create a KMS CMK:

```
# create a key and save the key id from the response
aws kms create-key --description "Test CMK"

# create an alias for the key
aws kms create-alias --target-key-id {key_id} --alias-name "alias/testcmk"
aws kms list-keys
```

Encrypt and decrypt a file:

```
echo "this is a test message" > test.txt
aws kms encrypt --key-id "alias/testcmk" --plaintext file://test.txt --output text --query CiphertextBlob | base64 --decode > test.txt.encrypted
aws kms decrypt --ciphertext-blob fileb://test.
```
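Direct `kms encrypt` is limited to small payloads (4 KB of plaintext), so for larger files KMS is normally used for envelope encryption: generate a data key under the CMK, encrypt locally, and store only the encrypted copy of the data key. A sketch, assuming `jq` and `openssl` are available (the local cipher choice is an illustration, not part of the post):

```shell
# Request a 256-bit data key under the CMK; the response contains a
# plaintext copy and a KMS-encrypted copy of the same key
aws kms generate-data-key --key-id "alias/testcmk" --key-spec AES_256 > datakey.json
jq -r .Plaintext      datakey.json | base64 --decode > datakey.bin
jq -r .CiphertextBlob datakey.json | base64 --decode > datakey.encrypted

# Encrypt a large file locally with the plaintext key, then discard it;
# only datakey.encrypted needs to be stored alongside the data
openssl enc -aes-256-cbc -in big.dat -out big.dat.enc -pass file:./datakey.bin
rm datakey.bin datakey.json
```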

Deployment With AWS Codedeploy

Env Setup

- Create a new EC2 service role which has access to the S3 bucket
- Launch a new EC2 instance with the role from the previous step attached
- Install the CodeDeploy agent on the EC2 instance:

```
sudo yum update
sudo yum install ruby
sudo yum install wget
cd /home/ec2-user
wget
chmod +x ./install
sudo ./install auto
sudo service codedeploy-agent status
```

- Create a CodeDeploy service role and attach the AWSCodeDeployRole policy

Code Deploy

Create the application and upload the code:

```
aws deploy create-application --application-name mywebapp
aws deploy push --application-name mywebapp --s3-location s3://{bucket_name}/webapp.
```
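After the push, a deployment is normally kicked off with `create-deployment` (the `aws deploy push` output prints a ready-made version of this command). The deployment group name and bundle key below are placeholders:

```shell
aws deploy create-deployment \
  --application-name mywebapp \
  --deployment-group-name mywebapp-group \
  --s3-location bucket={bucket_name},key=webapp.zip,bundleType=zip

# Follow the deployment status with the id returned above
aws deploy get-deployment --deployment-id {deployment_id}
```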

Add Contact Form in Hugo Site

- Create a lambda function to send the contact message
- Set up SES and verify the sender email address
- Create an IAM role for the lambda with the policy below:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "ses:SendEmail",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```

Create the lambda function:

```
const aws = require("aws-sdk");
const ses = new aws.SES({ region: "ap-southeast-2" });
exports.handler = async function (event) {
  console.
```
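Packaging and creating the function from the CLI can be sketched as below. The function name and role ARN are placeholders, and the runtime is an assumption: the `aws-sdk` v2 `require` in the snippet above relies on a runtime that bundles it (nodejs16.x or earlier); newer Node runtimes ship SDK v3 instead:

```shell
# Zip the handler and create the function under the IAM role from the policy above
zip function.zip index.js
aws lambda create-function \
  --function-name contact-form \
  --runtime nodejs16.x \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::111111111111:role/contact-form-lambda-role
```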

Configure Encryption on a S3 Bucket

- Create an S3 bucket with the name tw-testbucket-2021abc
- Attach the bucket policy below; this can be generated with the help of the AWS Policy Generator:

```
{
  "Id": "Policy1627604722484",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1627604720916",
      "Action": [
        "s3:PutObject"
      ],
      "Effect": "Deny",
      "Resource": "arn:aws:s3:::tw-testbucket-2021abc/*",
      "Condition": {
        "StringNotEquals": {
          "s3:x-amz-server-side-encryption": "aws:kms"
        }
      },
      "Principal": "*"
    }
  ]
}
```

- Upload a file without encryption enabled: the upload will fail with an Access Denied error
- Upload a file with encryption enabled
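The two uploads can be tried from the CLI as below (`test.txt` is a placeholder file); only the request that sends the SSE-KMS header passes the policy's `StringNotEquals` condition:

```shell
# Denied by the bucket policy: no x-amz-server-side-encryption header is sent
aws s3 cp test.txt s3://tw-testbucket-2021abc/test.txt

# Accepted: the object is uploaded with SSE-KMS encryption
aws s3 cp test.txt s3://tw-testbucket-2021abc/test.txt --sse aws:kms
```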

AWS Cross Account Access

In this exercise, I will try a few ways to access resources in Account A as a user in Account B.

Setup

- Have two accounts ready, Account A and Account B
- Go to the Account A console
- Create a role with "Another AWS account" as the type of trusted entity, with the role name crossaccountrole
- Attach policies, for example AmazonS3FullAccess

Access Account A resources from Account B

Access from the console:

- Go to the Account B console
- Click switch role
- Provide the account id of Account A and the role name crossaccountrole created earlier
- Now we should be able to access S3 in Account A
- Or use https://signin.
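The same role can also be assumed from the CLI with STS while running under Account B's credentials. The account id and session name below are placeholders:

```shell
# Assume the cross-account role defined in Account A
aws sts assume-role \
  --role-arn arn:aws:iam::111111111111:role/crossaccountrole \
  --role-session-name crossaccount-test

# Export the AccessKeyId, SecretAccessKey, and SessionToken from the
# response as AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_SESSION_TOKEN,
# then subsequent calls run against Account A:
aws s3 ls
```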

Use Route 53 and Cloudfront for My Website

Recently I tested migrating a WordPress site to Hugo and hosting it as an AWS S3 static website. In this exercise, I will use Route 53 for the DNS service and CloudFront for content delivery. Before starting, a working S3 static website should be ready; please refer to How to Create Site With Hugo to see how to set up a Hugo site in S3. Request a certificate in AWS Certificate Manager. Add domain names: *.
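Requesting the certificate can also be done from the CLI, sketched below with a placeholder domain. Note that a certificate used by CloudFront must be issued in the us-east-1 region:

```shell
aws acm request-certificate \
  --region us-east-1 \
  --domain-name example.com \
  --subject-alternative-names "*.example.com" \
  --validation-method DNS
```

With DNS validation, ACM returns a CNAME record to add in Route 53 before the certificate is issued.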

Mount S3 Bucket to a Local System

In this exercise, we will try to mount an S3 bucket to the local system with the s3fs package.

Mount the S3 bucket to a local directory:

```
aws configure
aws s3 mb s3://testbucket
sudo cp -r ~/.aws /root

# install s3fs
sudo yum install s3fs-fuse -y
mkdir /mnt/s3data
sudo s3fs {BUCKET_name} /mnt/s3data -o allow_other -o default_acl=public-read -o use_cache=/tmp/s3fs
echo "<html><h1>test</h1></html>" > /mnt/s3data/index.html

# check the file in the local folder and in s3
ll /mnt/s3data
aws s3 ls s3://{BUCKET_name}
```

Access the S3 bucket data from a Docker container: once mounted, we can access the data in a container as well.
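One simple way to do this is to bind-mount the mounted directory into the container; the nginx image and port below are illustrative choices, not part of the post:

```shell
# Share the s3fs mount with an nginx container as its (read-only) web root
docker run -d --name s3web \
  -p 8080:80 \
  -v /mnt/s3data:/usr/share/nginx/html:ro \
  nginx

# The index.html written to the mount is now served from the container
curl http://localhost:8080/index.html
```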