Cloud Resume Challenge On AWS - My Journal

Breaking into the cloud by learning.

The Challenge

After earning my AWS Certified Developer Associate certification, I came across AWS Serverless Hero Forrest Brazeal's Cloud Resume Challenge, and I was motivated by the stories of other participants who landed jobs after coming from non-tech roles.

The challenge intrigued me, and I saw an opportunity to get my hands dirty building a real-world application as I try to break into the cloud.

The challenge dares you to build a resume website with a visitor counter; the catch is that you have to build and deploy it entirely with AWS services. Along the way you learn how to correctly deploy serverless applications, CI/CD pipelines, networking, security, and more.

The challenge's first requirement is to pass the AWS Certified Cloud Practitioner exam, and I had already completed two certifications:

  • AWS Certified Cloud Practitioner
  • AWS Certified Developer Associate

so I decided to jump straight in.

I had zero hands-on experience building on AWS, but I had a good theoretical understanding of the services from my certification exams, and I thought this could be an opportunity to level up.

This article isn't a detailed tutorial; rather, it offers guidance and remarks about the path I followed to complete the challenge, as well as some experiments and "side quests" I took on to test and learn along the way.

If you want to have a look at the result, here are the interesting links:


(Architecture diagram: cloud-resume-architecture.png)

This challenge was completed in a series of steps. I separated the tasks into frontend and backend, and I started with the frontend.


The Frontend

The challenge starts with an HTML/CSS version of your resume. I thought about overengineering it with a JavaScript framework, but I decided to keep the resume simple and sweet with the OG stack: plain HTML and CSS. For the UI, I made the website resemble what a traditional resume would look like.

Once the website is done, the challenge requires you to make it publicly accessible by hosting it on Amazon Simple Storage Service (S3) and enabling its static website hosting capabilities.

Amazon S3 gives your website a long URL that's not easy to remember, so the next step was to set up my subdomain record in Route 53 and obtain an SSL/TLS certificate from AWS Certificate Manager. All of the content is served by CloudFront, AWS's CDN, and the bucket is secured using an Origin Access Identity (OAI) so it can only be read through CloudFront.

Setting up the S3 bucket as origin was easy, but correctly arranging the DNS routing was more thorny.

I used GitHub for version control and created a CI workflow with GitHub Actions that uploads the site to the correct S3 bucket on every push. Everything was going great! I was now on the second part of the challenge.
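A minimal sketch of such a workflow is below; the bucket name, site directory, region, and secret names are placeholders of my own, not the exact values I used.

```yaml
# .github/workflows/frontend.yml — sketch; bucket, paths, and secret names are assumptions
name: deploy-frontend

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      # Credentials come from GitHub secrets, never from the repo itself
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      # Mirror the site directory to the bucket, deleting removed files
      - name: Sync site to S3
        run: aws s3 sync ./site s3://my-resume-bucket --delete
```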


The Backend

The second part of the challenge took me some time to get my head around. I had to create a simple visitor counter: REST API calls to an API created in API Gateway trigger a Lambda function, which updates and returns a counter stored in a DynamoDB table. All of this had to be defined using SAM templates.
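A SAM template for this pattern might look roughly like the sketch below; the resource names, API path, and handler are my assumptions rather than the exact template I wrote.

```yaml
# template.yaml — sketch; resource and attribute names are assumptions
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: Visitor counter API

Resources:
  VisitCounterFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: app.lambda_handler
      Runtime: python3.8
      Environment:
        Variables:
          TABLE_NAME: !Ref VisitTable
      # SAM policy template granting least-privilege access to the one table
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref VisitTable
      # Implicitly creates the API Gateway REST API and wires the route
      Events:
        GetCount:
          Type: Api
          Properties:
            Path: /count
            Method: get

  VisitTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: id
        Type: String
```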

Since this was my first time using AWS serverless services, I decided to set up AWS SAM locally for development and testing. I used Docker to create a network where my DynamoDB table could interact with my Lambda function and API Gateway REST API.
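One way to wire this up is to run DynamoDB Local on a shared Docker network that SAM's containers can join; the network and service names below are assumptions for illustration.

```yaml
# docker-compose.yml — sketch; service and network names are assumptions
version: "3.8"

services:
  # Amazon's official local DynamoDB image
  dynamodb-local:
    image: amazon/dynamodb-local
    ports:
      - "8000:8000"

networks:
  default:
    name: sam-local
```

With this running, `sam local start-api --docker-network sam-local` lets the Lambda container reach the table at `http://dynamodb-local:8000` instead of the real AWS endpoint.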

After my local development environment was set up, I created my SAM template, and my knowledge from the AWS Certified Developer Associate exam was rather helpful, as this material is heavily tested there. I wrote the Lambda function in Python 3.8 with boto3, the AWS SDK for Python. The function uses DynamoDB's atomic counter feature to update the table and returns a JSON response with the visit count. The Lambda function was tested with pytest and moto.
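A minimal sketch of such a handler, assuming a table with a string partition key `id` and a numeric `visit_count` attribute (both names are my assumptions, not necessarily the real schema):

```python
import json
import os

# Table name is an assumption; in the real app it comes from the SAM template.
TABLE_NAME = os.environ.get("TABLE_NAME", "visit-counter")


def _default_table():
    # boto3 is imported lazily so the module can be unit-tested without the SDK.
    import boto3
    return boto3.resource("dynamodb").Table(TABLE_NAME)


def lambda_handler(event, context, table=None):
    """Atomically increment the visit counter and return the new value."""
    table = table if table is not None else _default_table()
    # "ADD" is DynamoDB's atomic counter: concurrent invocations each apply
    # a consistent increment, with no read-modify-write race.
    result = table.update_item(
        Key={"id": "visits"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(result["Attributes"]["visit_count"])
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```

Passing the table in as an optional argument makes the handler easy to unit test with pytest, using either a simple stub or a moto-mocked table, without touching real AWS.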

Before I packaged and deployed my SAM application I tested my Lambda function, API Gateway, and DynamoDB table locally.

Lastly, I created a CI/CD pipeline with GitHub Actions that automatically runs the tests and deploys the SAM app whenever I push code to the main branch. I learned a lot about GitHub Actions and about using GitHub secrets to keep my credentials secure.

What I learned and where to next

This challenge was very rewarding, and I learned so much from the AWS documentation, Stack Overflow posts, and other participants' blog posts.

Shortly, I will be sitting the AWS Certified Solutions Architect Associate exam, and I hope this project helps me grow in the cloud engineering realm. Lastly, I would like to thank Forrest for creating this challenge.