AWS

Add Contact Form in Hugo Site

Create a Lambda function to send the contact message.

Set up SES and verify the sender email address.

Create an IAM role for the Lambda function with the policy details below:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "VisualEditor0",
          "Effect": "Allow",
          "Action": [
            "ses:SendEmail",
            "logs:CreateLogStream",
            "logs:PutLogEvents"
          ],
          "Resource": "*"
        }
      ]
    }

Create the Lambda function:

    const aws = require("aws-sdk");
    const ses = new aws.SES({ region: "ap-southeast-2" });
    exports.handler = async function (event) { console.
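The handler body is not shown in full in this excerpt, but the surrounding setup can also be scripted with the AWS CLI. A minimal sketch, assuming hypothetical names: a sender address sender@example.com, a role called contact-form-lambda-role, a Lambda trust policy saved as lambda-trust.json, the SES/logs policy above saved as ses-send.json, and the handler packaged as function.zip.

    # verify the sender address with SES in the same region as the function
    aws ses verify-email-identity --email-address sender@example.com --region ap-southeast-2

    # create the execution role and attach the inline policy shown above
    aws iam create-role --role-name contact-form-lambda-role \
        --assume-role-policy-document file://lambda-trust.json
    aws iam put-role-policy --role-name contact-form-lambda-role \
        --policy-name ses-send --policy-document file://ses-send.json

    # create the Lambda function from the packaged handler
    aws lambda create-function --function-name contact-form \
        --runtime nodejs18.x --handler index.handler \
        --zip-file fileb://function.zip \
        --role arn:aws:iam::123456789012:role/contact-form-lambda-role \
        --region ap-southeast-2

The account ID 123456789012 is a placeholder; substitute the role ARN returned by create-role.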

Configure Encryption on a S3 Bucket

Create an S3 bucket with the name tw-testbucket-2021abc.

Attach the bucket policy below; this can be generated with the help of the AWS Policy Generator:

    {
      "Id": "Policy1627604722484",
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "Stmt1627604720916",
          "Action": [
            "s3:PutObject"
          ],
          "Effect": "Deny",
          "Resource": "arn:aws:s3:::tw-testbucket-2021abc/*",
          "Condition": {
            "StringNotEquals": {
              "s3:x-amz-server-side-encryption": "aws:kms"
            }
          },
          "Principal": "*"
        }
      ]
    }

Upload a file without encryption enabled: the upload will fail with an Access Denied error. Then upload a file with encryption enabled.
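The deny-unless-encrypted behaviour can be exercised from the AWS CLI. A minimal sketch, assuming the policy above is saved as policy.json and a local test file named file.txt (both names are placeholders):

    # attach the bucket policy
    aws s3api put-bucket-policy --bucket tw-testbucket-2021abc --policy file://policy.json

    # upload without server-side encryption: denied by the policy
    aws s3 cp file.txt s3://tw-testbucket-2021abc/file.txt

    # upload with SSE-KMS requested: allowed
    aws s3 cp file.txt s3://tw-testbucket-2021abc/file.txt --sse aws:kms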

AWS Cross Account Access

In this exercise, I will try a few ways to access resources in Account A as a user in Account B.

Setup: have two accounts ready, Account A and Account B. Go to the Account A console and create a role with "Another AWS account" as the type of trusted entity, with the role name crossaccountrole. Attach policies to it, for example AmazonS3FullAccess.

Access an Account A resource from Account B. From the console: go to the Account B console, click Switch Role, and provide the account ID of Account A and the role name crossaccountrole created earlier. Now we should be able to access Account A's S3 buckets from Account B. Or use https://signin. …
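The same role can also be assumed from the AWS CLI while using Account B credentials. A minimal sketch, with 111111111111 standing in for Account A's ID:

    # request temporary credentials for the role in Account A
    aws sts assume-role \
        --role-arn arn:aws:iam::111111111111:role/crossaccountrole \
        --role-session-name cross-account-test

    # export the AccessKeyId, SecretAccessKey and SessionToken from the response,
    # after which the CLI acts as the role in Account A
    export AWS_ACCESS_KEY_ID=...
    export AWS_SECRET_ACCESS_KEY=...
    export AWS_SESSION_TOKEN=...
    aws s3 ls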

Use Route 53 and Cloudfront for My Website

Recently I tested migrating a WordPress site to Hugo and hosting it as an AWS S3 static website. In this exercise, I will use Route 53 for the DNS service and CloudFront for content delivery. Before starting, a working S3 static website should be ready; please refer to How to Create Site With Hugo to see how to set up a Hugo site in S3.

Request a certificate in AWS Certificate Manager and add the domain names: *. …
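The certificate request can also be made from the AWS CLI. A rough sketch, with example.com as a placeholder domain; note that a certificate used by CloudFront has to be requested in us-east-1:

    # request a certificate covering the apex and wildcard names
    aws acm request-certificate \
        --domain-name example.com \
        --subject-alternative-names "*.example.com" \
        --validation-method DNS \
        --region us-east-1

    # look up the DNS validation record to add in Route 53
    aws acm describe-certificate --certificate-arn <CERTIFICATE_ARN> --region us-east-1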

Mount S3 Bucket to a Local System

In this exercise, we will try to mount an S3 bucket to the local system with the s3fs package.

Mount the S3 bucket to a local directory:

    aws configure
    aws s3 mb s3://testbucket

    # copy the AWS credentials for root, since s3fs is run with sudo
    sudo cp -r ~/.aws /root

    # install s3fs
    sudo yum install s3fs-fuse -y

    # mount the bucket
    sudo mkdir /mnt/s3data
    sudo s3fs {BUCKET_name} /mnt/s3data -o allow_other -o default_acl=public-read -o use_cache=/tmp/s3fs

    # write a test file into the mounted directory
    echo "<html><h1>test</h1></html>" | sudo tee /mnt/s3data/index.html

    # check the file in the local folder and in S3
    ll /mnt/s3data
    aws s3 ls s3://{BUCKET_name}

Access the S3 bucket data from a Docker container: once mounted, we can access the data from a container as well, as sketched below.
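For example (a sketch, assuming Docker is installed on the same host), the mount point can simply be bind-mounted into a container:

    # read the file from inside a throwaway container
    docker run --rm -v /mnt/s3data:/s3data:ro alpine cat /s3data/index.html

    # or serve the bucket contents with nginx
    docker run --rm -d -p 8080:80 -v /mnt/s3data:/usr/share/nginx/html:ro nginx
    curl http://localhost:8080/index.html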