Architecture to Overcome API Gateway Payload Limit

Today's Agenda:
  • Architecture to Overcome API Gateway Payload Limit

  • Deploying Apps to EKS with GitHub Actions and ArgoCD

  • Reimagining cloud strategy for AI-first enterprises

  • Docker Desktop 4.33 Introduces Build Checks for Optimized Dockerfiles

  • Google Adds Gemini AI to BigQuery and Looker to Simplify Data Tasks

  • Microsoft to Explore Kernel Access Alternatives Post-CrowdStrike Outage

Read Time: 4 minutes

Architecture to Overcome API Gateway Payload Limit

Namaste 🙏 TechOps Soldiers!

Good to see you back on a fresh week; hope you had a fantabulous weekend.

Today's use case is about "Overcoming AWS API Gateway Payload Limits" with an ideal architecture example.

Imagine you're developing a document management mobile app, where users can upload large files such as PDFs, presentations, and spreadsheets. The app needs to store these documents securely and efficiently.

For this, utilizing AWS services like API Gateway and S3 is an ideal choice due to their scalability, security, and ease of integration.

A mobile client can directly upload files to an S3 bucket via API Gateway, as depicted below:

While this might seem straightforward, it has a critical limitation: API Gateway imposes a payload size limit of 10MB.

In this setup:

  1. Mobile Client sends a request to API Gateway.

  2. API Gateway attempts to upload the file to S3 Bucket.

  3. Files under 10MB succeed, while those over 10MB fail.

This architecture is not only inefficient but also unsuitable for applications dealing with larger files.

Key considerations for API Gateway:

| Feature | Value/Range |
| --- | --- |
| Maximum routes per API | 300 |
| Maximum integrations per API | 300 |
| Maximum stages per API | 10 |
| Maximum request payload size | 10 MB |
| Maximum response payload size | 10 MB |
| Request rate limit | 10,000 requests per second |
| Burst limit | 5,000 requests |
| Maximum integration timeout | 30 seconds |
| API key limit per account | 10,000 API keys |
| Deployment limit per stage | 10 deployments |

Don't worry, we've got this covered 😃

Have you heard about Pre-Signed URLs?

A pre-signed URL is a URL you can hand to your users to grant temporary access to a specific object in an S3 bucket. With this URL, users can upload files directly to S3 without holding AWS credentials or passing the file through an intermediary service like API Gateway.

Pre-signed URLs are time-limited and can be configured to allow only specific actions (like uploads), ensuring secure and controlled access to your S3 buckets.

Let's transition to a better architecture, as shown below:

In this setup:

  1. Mobile Client sends a request to API Gateway.

  2. API Gateway invokes a Lambda Function.

  3. Lambda Function generates a pre-signed URL for the S3 bucket.

  4. Lambda Function returns the pre-signed URL to API Gateway.

  5. API Gateway forwards the pre-signed URL to the Mobile Client.

  6. Mobile Client uses the pre-signed URL to upload the file directly to S3 Bucket.

Why is this the Better Choice?

This improved architecture offers several benefits:

  • No File Size Limitations: By allowing direct uploads to S3, the file size constraints of API Gateway are eliminated.

  • Enhanced Performance: Direct uploads reduce latency and improve the overall performance of the file upload process.

  • Scalability: This architecture can handle a larger number of uploads without additional complexity.

  • Security: Pre-signed URLs ensure that only authorized users can upload files, maintaining security without sacrificing convenience.

This way, you ensure a seamless, scalable, and efficient file upload process for your mobile app, enhancing user experience and operational efficiency.

In my experience, architecture design is a subjective choice; it differs drastically across individuals and systems.

This solution is one of the choices - That's it!

📖 Resources & Tutorials

This blog explains how to use GitHub Actions and ArgoCD to deploy applications on Amazon EKS, focusing on key techniques to simplify CI/CD workflows. It includes steps for setting up EKS clusters, configuring ArgoCD, and connecting GitHub repositories for smooth deployments.

MIT Technology Review Insights explores how AI advancements are transforming businesses. While AI shows significant benefits, achieving its full potential requires substantial cloud infrastructure investment.

📈 Trends & Updates

Docker Desktop 4.33 now features Docker Build checks, helping developers follow best practices for building container images. These checks provide real-time feedback, identify issues early, and improve build performance. Developers can access these checks through both the CLI and Docker Desktop Builds view.

Google Cloud introduces its AI chatbot Gemini into BigQuery and Looker, helping data professionals with code generation, data analysis, and performance optimization. This enhancement aims to make data engineering tasks easier and more efficient. Additionally, BigQuery now supports Apache Spark and Kafka for better real-time data analysis.

Following a major outage caused by a CrowdStrike update, Microsoft plans to seek alternatives to direct kernel access for partners. The incident, which affected 8.5 million Windows systems, has sparked debate among SecOps professionals about the necessity and risks of kernel access. Potential alternatives, such as VBS enclaves and Azure Attestation, aim to enhance system security and resilience.

Want To Advertise in TechOps Examples?

Our newsletter puts your products and services in front of the right people - engineering leaders and senior engineers - who make important tech decisions and big purchases.

Did someone forward this email to you? Sign up here