Overview
I decided to take on the AWS version of this project after completing the same challenge using GCP. My hope was to broaden my knowledge of other cloud platforms, since my only prior cloud experience had been with Google.
While working through this version of the project, I was able to complete each step in less time using many of the tools and skills I learned during my first attempt. There was still a learning curve since I had to contend with a whole new platform, but there were enough similarities to help speed up the process. Below is a brief overview of how I completed each phase of the project.
Certification
I already have two GCP certifications, so I decided to skip this step for now and opted to take the AWS Cloud Practitioner Essentials course instead. The course was very helpful in getting me up to speed on AWS terms and resources, and I plan on taking the Cloud Practitioner exam in the near future.
Infrastructure as Code and CI/CD
When I attempted the challenge using GCP, I initially deployed all of my resources directly from the console or command line and then had to go back and define and deploy them using Terraform. For my AWS project, I decided to use SAM and CloudFormation as my IaC tools right from the start.
I created a GitHub repo and cloned it locally, pushing updates once all resources were successfully defined and deployed. One advantage I found while using SAM was the option of starting from a template. I began with the "Hello World" SAM template and edited and added resources accordingly. This sped up the whole process dramatically, since the template already defined an API and a Lambda function, two necessary pieces of the challenge. The CloudFormation documentation was very detailed, but I struggled a bit with all the nested syntax when defining resources; making sense of the required versus optional properties was challenging.
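To give a sense of the shape of a SAM template, here is a minimal sketch of a function wired to an API endpoint. The resource names, paths, and runtime below are illustrative rather than copied from my repo:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Cloud Resume Challenge backend (illustrative sketch)

Resources:
  # Lambda function backing the visitor counter
  VisitorCounterFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: visitor_counter/
      Handler: app.lambda_handler
      Runtime: python3.12
      Policies:
        # SAM policy template granting CRUD access to the table
        - DynamoDBCrudPolicy:
            TableName: Website_Hits
      Events:
        CountApi:
          Type: Api          # SAM creates the API Gateway resources for you
          Properties:
            Path: /count
            Method: get

The Transform line is what expands the short AWS::Serverless::* resource types into the full, much more verbose CloudFormation resources at deploy time.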
For CI/CD I used GitHub Actions to run tests on my Lambda function, deploy resources using SAM, and sync my website's files to my S3 bucket. I also configured my GitHub workflow to deploy my infrastructure only after a successful Pytest run against my Lambda function. Testing is something I could have improved in my GCP project, so I am glad that I spent a bit more time on this section.
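Here is a trimmed-down sketch of that kind of workflow. The job names, paths, bucket name, and region are assumptions for illustration; the key piece is the needs: dependency, which keeps the deploy job from running unless the tests pass:

name: test-and-deploy
on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install pytest boto3
      - run: python -m pytest tests/

  deploy:
    needs: test          # deploy only runs if the test job succeeds
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/setup-sam@v2
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - run: sam build && sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
      - run: aws s3 sync ./website s3://my-resume-bucket   # bucket name is a placeholder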
Creating a Static Website Using S3 Bucket
For my webpage I used the same template as my GCP resume page, but I updated the projects and skills sections to include my AWS project. To ensure that my website is served over HTTPS, I created a CloudFront distribution and attached an SSL certificate. Passing my S3 bucket name in my distribution settings gave CloudFront access to my website files. The final step in creating my static website was to create a DNS record pointing my custom Route53 domain to my CloudFront distribution.
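In CloudFormation terms, those last two pieces look roughly like the sketch below. The domain, bucket name, and certificate ARN are placeholders; the HostedZoneId on the alias record is the fixed zone ID AWS uses for all CloudFront distributions:

  WebsiteDistribution:
    Type: AWS::CloudFront::Distribution
    Properties:
      DistributionConfig:
        Enabled: true
        DefaultRootObject: index.html
        Aliases:
          - resume.example.com                              # placeholder domain
        Origins:
          - Id: S3Origin
            DomainName: my-resume-bucket.s3.amazonaws.com   # placeholder bucket
            S3OriginConfig: {}
        DefaultCacheBehavior:
          TargetOriginId: S3Origin
          ViewerProtocolPolicy: redirect-to-https           # force HTTPS
          CachePolicyId: 658327ea-f89d-4fab-a63d-7e88639e58f6   # managed CachingOptimized policy
        ViewerCertificate:
          AcmCertificateArn: arn:aws:acm:us-east-1:123456789012:certificate/EXAMPLE   # placeholder; cert must live in us-east-1
          SslSupportMethod: sni-only

  DnsRecord:
    Type: AWS::Route53::RecordSet
    Properties:
      HostedZoneName: example.com.                          # placeholder zone
      Name: resume.example.com.
      Type: A
      AliasTarget:
        DNSName: !GetAtt WebsiteDistribution.DomainName
        HostedZoneId: Z2FDTNDATAQYW2                        # CloudFront's global hosted zone ID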
Website Visitor Counter
Both the GCP and AWS versions of this project make use of a serverless function and a database to create and update a website visitor counter. After deploying an empty DynamoDB table using SAM, I read up on the Boto3 documentation for creating, retrieving, and updating DynamoDB items. I tested my Lambda function directly from the AWS console until it produced the desired results, incrementing the count each time the function was invoked. I also confirmed that my API was working as expected and handled CORS properly.
import json

import boto3


def lambda_handler(event, context):
    # Get the service resource.
    dynamodb = boto3.resource('dynamodb')
    # Get the existing table 'Website_Hits'
    table = dynamodb.Table('Website_Hits')
    # If a record exists, read the current count and increment it
    try:
        views = table.get_item(Key={'hits': 'site_hits'})
        current_views = views['Item']['total_views']
        updated_views = current_views + 1
        table.put_item(Item={'hits': 'site_hits', 'total_views': updated_views})
    # Set the count to 1 if no record exists yet
    except KeyError:
        table.put_item(Item={'hits': 'site_hits', 'total_views': 1})
        updated_views = 1
    # CORS headers so the website's JavaScript can call this API
    headers = {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Methods": "*",
        "Access-Control-Allow-Headers": "*",
    }
    return {
        "statusCode": 200,
        "headers": headers,
        "body": json.dumps({"total_views": str(updated_views)}),
    }
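As a side note, Boto3 also supports doing the read and increment in a single call. The get_item/put_item pattern above isn't atomic, so two simultaneous visits could in theory record only one hit; update_item with an ADD expression avoids that and also creates the item if it doesn't exist yet. A minimal sketch of that alternative, using the same table object:

# Atomic increment: DynamoDB applies ADD server-side, creating the
# item with total_views = 1 if it does not exist yet.
response = table.update_item(
    Key={'hits': 'site_hits'},
    UpdateExpression='ADD total_views :inc',
    ExpressionAttributeValues={':inc': 1},
    ReturnValues='UPDATED_NEW',
)
updated_views = response['Attributes']['total_views']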
Next Steps
The obvious next step is to continue to learn more about AWS and consider a certification. What I appreciated most about this experience was discovering the similarities and differences between the Amazon and Google environments.
I did prefer some aspects of AWS, such as the SAM templates and Identity and Access Management (IAM). I found using IAM in AWS to be more straightforward and user friendly than in Google. I recall spending a few hours troubleshooting service account privileges in GCP, and I didn't experience the same issues with AWS. On the other hand, the process of setting up a static website was less involved and more straightforward with GCP (the downside is that it is a lot more expensive to serve HTTPS traffic with GCP than with AWS). Ultimately, I would make the case that GCP is more beginner friendly for the Cloud Resume Challenge.
Lastly, my resume page itself could use some further refinements and updates. I focused primarily on the infrastructure end of the project, so I think it's now time to polish up the website and work on its content.