Breaking into Cloud Security: My Cloud Resume Challenge
As a fresh graduate who recently started my career as a cybersecurity engineer, I was given the opportunity to explore cloud infrastructure in my workplace, and I quickly became interested in cloud security. As I previously had little experience with cloud technologies, I began searching for learning resources online.
During this search, I came across the Cloud Resume Challenge, a project designed to teach cloud fundamentals by building a real-world application. It seemed like the perfect opportunity for me not only to learn about cloud technologies, but also to gain exposure to related concepts such as CI/CD pipelines. Without much hesitation, I decided to take on the challenge, focusing on the goal of learning cloud and CI/CD security.
You can find the completed project at my website: resume.wongyx.com
Architecture Overview
For the challenge, I chose AWS as the cloud provider since I have some prior experience with it from my job. I decided to use Cloudflare as my DNS provider since it is also my domain registrar.
CI/CD Flow
As for CI/CD, I used GitHub Actions to configure my pipeline, integrating tests along the way to secure my project (discussed in greater detail later in this post).
Step 1: Frontend
Since the main point of my project is to learn about securing the cloud, I decided to cheat a little on the HTML and CSS of my website by relying on Claude to generate them for me. These files are stored in an AWS S3 bucket, which serves them as a static website through CloudFront acting as the CDN. An SSL/TLS certificate provisioned in AWS Certificate Manager enables HTTPS traffic for the website.
For the security of this part, I configured my S3 bucket to allow traffic only from CloudFront by making use of AWS Origin Access Control (OAC). I also enabled DNSSEC on Cloudflare to enhance the security of my DNS.
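Under the hood, OAC works through a bucket policy that grants read access to the CloudFront service principal, but only for requests originating from my specific distribution. Roughly like this (the bucket name, account ID, and distribution ID below are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudFrontReadViaOAC",
      "Effect": "Allow",
      "Principal": { "Service": "cloudfront.amazonaws.com" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-resume-bucket/*",
      "Condition": {
        "StringEquals": {
          "AWS:SourceArn": "arn:aws:cloudfront::123456789012:distribution/EXAMPLE123"
        }
      }
    }
  ]
}
```

With this policy in place, requests that go directly to the bucket URL are denied, so visitors can only reach the site through CloudFront.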
Step 2: Backend
The backend for my website is hosted on serverless infrastructure, using DynamoDB as the database for my visitor counter and AWS Lambda to execute my Python code. AWS API Gateway receives the API calls made by the JavaScript embedded in my website whenever someone visits it.
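The counter logic itself is small. Below is a minimal sketch of the Lambda handler, assuming a table keyed on a single id attribute (the table and attribute names are illustrative, not my exact code):

```python
import json
import os

import boto3

# Table name injected via a Lambda environment variable (illustrative default).
TABLE_NAME = os.environ.get("TABLE_NAME", "visitor-counter")

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def lambda_handler(event, context):
    # ADD increments atomically, so concurrent visits don't race each other.
    response = table.update_item(
        Key={"id": "visitor_count"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visits"])
    return {"statusCode": 200, "body": json.dumps({"count": count})}
```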
As code testing is not one of my main focuses for this project, I decided to keep it simple by writing only a smoke test using Playwright. The test checks that my visitor counter loads successfully on the website and that the count is updated on refresh.
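A minimal sketch of that smoke test using Playwright's sync Python API (the element selector is an assumption; the real test targets the actual markup of the page):

```python
import re

from playwright.sync_api import sync_playwright

URL = "https://resume.wongyx.com"
SELECTOR = "#visitor-count"  # illustrative; depends on the page's markup


def read_count(page):
    # Wait until the counter element actually contains a number,
    # since the value arrives from the API after the page loads.
    page.wait_for_function(
        f"document.querySelector('{SELECTOR}')?.innerText.match(/\\d+/)"
    )
    return int(re.search(r"\d+", page.inner_text(SELECTOR)).group())


def test_visitor_counter():
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()

        page.goto(URL)
        first = read_count(page)

        page.reload()
        second = read_count(page)

        assert second > first, "counter did not increase on refresh"
        browser.close()


if __name__ == "__main__":
    test_visitor_counter()
```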
To secure the backend of my project, I added a throttling policy to my API Gateway stage as protection against DDoS attacks. I also configured the CORS policy in my Lambda code to allow only my domain as the origin. I wanted to configure AWS WAF to further secure my website, but it costs $5 per month, and since I want to keep this project as low-cost as possible, I decided not to implement the WAF in the end.
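For the CORS part, the idea is simply that every response from the Lambda carries my domain as the only allowed origin. A sketch of a small response helper (hypothetical, to illustrate the headers involved):

```python
import json

ALLOWED_ORIGIN = "https://resume.wongyx.com"


def with_cors(payload, status=200):
    # Browsers on any other origin will refuse to expose this response
    # to the calling script, since the header names only my domain.
    return {
        "statusCode": status,
        "headers": {
            "Access-Control-Allow-Origin": ALLOWED_ORIGIN,
            "Access-Control-Allow-Methods": "GET,OPTIONS",
        },
        "body": json.dumps(payload),
    }
```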
Step 3: Infrastructure as Code
This is the part where things started to get more complicated for me, as I had no prior experience with IaC. I wanted to learn an IaC tool that would be useful for me in my future career, and I ended up choosing Terraform as it is cloud agnostic and widely used in the industry.
As I wanted my project to mimic real-world deployment practices, I created two separate AWS accounts for test and production, and provisioned both environments using the same Terraform code. Since I had already deployed the infrastructure manually in the earlier steps of the challenge, I had a clear idea of the resources required. To speed up the process, I used Claude to generate an initial Terraform template for the infrastructure. I then reviewed and modified the generated configuration to ensure it accurately reflected my deployed setup. By studying and refining the generated template, I gradually built an understanding of how Terraform defines infrastructure declaratively and manages relationships between resources.
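To give a flavour of the pattern (simplified, not my exact configuration): an environment variable is threaded through resource names, and each environment is applied with its own variable file and its own account's credentials, e.g. terraform apply -var-file=test.tfvars:

```hcl
variable "environment" {
  description = "Deployment environment: test or prod"
  type        = string
}

# The same declaration produces a parallel stack in each AWS account;
# only the supplied variable values differ between environments.
resource "aws_dynamodb_table" "visitor_counter" {
  name         = "visitor-counter-${var.environment}"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}
```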
Initially, Terraform state for the project was stored locally. However, I realized that this approach would not be suitable once I integrated Terraform with a CI/CD pipeline. To address this, I configured the state to be stored remotely in an S3 bucket, with a DynamoDB table used for state locking to prevent multiple deployments from modifying the infrastructure simultaneously.
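The change itself is a short backend block (the bucket and table names below are placeholders):

```hcl
terraform {
  backend "s3" {
    bucket         = "example-terraform-state"        # remote state storage
    key            = "cloud-resume/terraform.tfstate"
    region         = "us-east-1"
    encrypt        = true
    dynamodb_table = "terraform-state-lock"           # holds the state lock
  }
}
```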
From a security perspective, I ensured that the IAM roles created in AWS follow the principle of least privilege. Each service is granted only the minimum permissions required to interact with other services, reducing the potential attack surface of the infrastructure.
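For example, the visitor-counter Lambda only ever updates one item, so its role can be scoped to a single action on a single table. An illustrative policy document (not my exact policy):

```hcl
data "aws_iam_policy_document" "lambda_dynamodb" {
  statement {
    sid    = "VisitorCounterUpdateOnly"
    effect = "Allow"

    # The one action the function needs, on the one table it touches.
    actions   = ["dynamodb:UpdateItem"]
    resources = [aws_dynamodb_table.visitor_counter.arn]
  }
}
```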
Step 4: CI/CD Pipeline
CI/CD pipelines were relatively new to me as well, so I was quite excited when I finally reached this part. I first set up the pipeline to automate the deployment of my AWS Lambda function along with the infrastructure provisioned using Terraform. When pull requests are merged into the main branch of my GitHub repository, the changes are first deployed to the test environment, where the smoke test runs. A failing smoke test triggers an automatic rollback to the previous working version. If the smoke test passes, I can then manually trigger the deployment to production, which has the same smoke test and rollback in place.
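A simplified skeleton of the workflow (AWS credential setup and the rollback steps are omitted; file and directory names are illustrative; the manual gate is a GitHub environment configured to require approval):

```yaml
name: deploy

on:
  push:
    branches: [main]

jobs:
  deploy-test:
    runs-on: ubuntu-latest
    environment: test
    steps:
      - uses: actions/checkout@v4
      - name: Deploy to test
        working-directory: infra
        run: terraform init && terraform apply -auto-approve -var-file=test.tfvars
      - name: Smoke test
        run: |
          pip install playwright && playwright install chromium
          python tests/smoke_test.py

  deploy-prod:
    needs: deploy-test
    runs-on: ubuntu-latest
    # Promotion to prod waits for a manual approval configured
    # on the "production" environment in GitHub.
    environment: production
    steps:
      - uses: actions/checkout@v4
      - name: Deploy to production
        working-directory: infra
        run: terraform init && terraform apply -auto-approve -var-file=prod.tfvars
```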
To allow my CI/CD pipeline to interact with AWS securely, I configured OpenID Connect (OIDC) authentication between GitHub Actions and AWS instead of using long-lived access keys. This eliminates the need to store permanent access keys in the repository or GitHub secrets, reducing the risk of credential leakage while aligning with modern cloud security best practices. I also ensured that the IAM role follows the principle of least privilege by defining fine-grained permissions in the IAM policy, granting access only to the specific AWS resources and actions required for deployment. In particular, I restricted IAM-related permissions to read-only access, preventing the pipeline from creating or modifying IAM roles or policies. This approach limits the potential impact of a compromised workflow while still allowing the CI/CD pipeline to function as intended.
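In the workflow, the OIDC exchange comes down to granting the job permission to request an ID token and then assuming the pre-configured role (the role ARN below is a placeholder):

```yaml
jobs:
  deploy-test:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required to request the OIDC token
      contents: read
    steps:
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          # Short-lived credentials via an assumed role; no access keys stored.
          role-to-assume: arn:aws:iam::123456789012:role/github-actions-deploy
          aws-region: us-east-1
```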
Securing the Pipeline
My next step was to implement security in the CI/CD pipeline itself. I configured a pre-commit hook that runs gitleaks on every git commit to ensure that I do not accidentally push any secrets to my public repository. I also generated a GPG signing key using RSA 3072 to sign my commits, and set my GitHub repository to accept only signed commits.
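The gitleaks hook is wired up through the pre-commit framework; my configuration boils down to a few lines like these (the pinned version is illustrative):

```yaml
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/gitleaks/gitleaks
    rev: v8.18.4   # pin to a current release
    hooks:
      - id: gitleaks
```

Commit signing itself only needs git pointed at the key, via git config user.signingkey and commit.gpgsign true.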
I configured SAST tools to run in the pipeline whenever a pull request is created, preventing vulnerable code from being merged into the main branch. I used GitHub CodeQL to analyze my Python code for security vulnerabilities and tfsec to scan my Terraform configuration, with the checks failing if any high or critical issues are detected. I also added a branch protection rule on GitHub to block the merge if either scan fails.
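A sketch of the PR-triggered scan jobs (action versions and the Terraform directory are illustrative; tfsec's --minimum-severity flag is what enforces the high/critical threshold):

```yaml
name: sast

on:
  pull_request:
    branches: [main]

jobs:
  codeql:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      security-events: write   # needed to upload CodeQL results
    steps:
      - uses: actions/checkout@v4
      - uses: github/codeql-action/init@v3
        with:
          languages: python
      - uses: github/codeql-action/analyze@v3

  tfsec:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aquasecurity/tfsec-action@v1.0.3
        with:
          working_directory: infra
          additional_args: --minimum-severity HIGH
```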
After learning about the Shai-Hulud supply chain attack in the npm ecosystem in September 2025, I also decided to add Software Composition Analysis (SCA) to my CI/CD pipeline. I used Syft to generate an SBOM in the CycloneDX format, which is then analyzed by Grype. As with the SAST checks, any findings of high or critical severity block the merge into the main branch.
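The SCA stage is essentially two commands: Syft writes the SBOM, and Grype scans it, with --fail-on high failing the job at high severity and above. A sketch of the job (the install method and paths are illustrative):

```yaml
name: sca

on:
  pull_request:
    branches: [main]

jobs:
  sbom-scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Syft and Grype
        run: |
          curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sudo sh -s -- -b /usr/local/bin
          curl -sSfL https://raw.githubusercontent.com/anchore/grype/main/install.sh | sudo sh -s -- -b /usr/local/bin
      - name: Generate SBOM (CycloneDX)
        run: syft dir:. -o cyclonedx-json=sbom.json
      - name: Scan SBOM
        run: grype sbom:sbom.json --fail-on high   # high also catches critical
```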
Conclusion
Overall, I found this project to be both challenging and rewarding to work through. It allowed me to explore several areas that I had previously only read about, including cloud infrastructure, CI/CD automation, and security practices in modern development workflows. Building the system end-to-end gave me a deeper appreciation of how security can be integrated throughout the development lifecycle rather than treated as a separate step. In particular, working with cloud infrastructure and implementing security checks in the pipeline gave me a glimpse into the world of cloud security and DevSecOps. It is an area that I find particularly interesting and hope to continue exploring as I grow in my career.