Links
AWS Lambda now offers improved observability for Kafka event source mappings, allowing users to monitor event polling, scaling, and processing with Amazon CloudWatch Logs and metrics. This enhancement helps troubleshoot issues quickly, reducing operational overhead and mean time to resolution. It's available for both Amazon Managed Streaming for Apache Kafka and self-managed Kafka setups.
AWS now supports response streaming in API Gateway, allowing REST APIs to send responses progressively. This reduces wait times, improves user experience in applications like AI chatbots, and handles larger payloads more efficiently.
AWS Lambda Managed Instances lets you run Lambda functions on EC2 instances while keeping the serverless experience. This feature provides access to specialized compute options and cost savings for steady workloads without the hassle of managing infrastructure. You can configure capacity providers to optimize for your specific needs.
AWS Lambda now officially supports Rust for building serverless applications. The article explains how to set up and deploy Rust-based Lambda functions using Cargo Lambda and the AWS Cloud Development Kit (CDK). It covers prerequisites, function creation, testing, and deployment steps.
AWS Lambda now allows asynchronous invocations with a maximum payload size of 1 MB, up from 256 KB. This change enables developers to send more complex data in a single event, simplifying data handling for event-driven applications. The higher limit applies both to direct invocations through the Lambda API and to events delivered by other AWS services.
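As a rough illustration of the new limit, a client could check the serialized event size before an asynchronous (Event-type) invocation. This is a hedged sketch: the 1 MB figure is the async limit described above, while the helper name and usage are ours.

```python
import json

# Async (Event-type) invocations now accept payloads up to 1 MB
# (previously 256 KB).
ASYNC_PAYLOAD_LIMIT = 1024 * 1024  # bytes

def fits_async_limit(event: dict) -> bool:
    """Return True if the serialized event fits the async payload limit."""
    return len(json.dumps(event).encode("utf-8")) <= ASYNC_PAYLOAD_LIMIT

# With boto3, this check would guard an Event-type invocation, e.g.:
#   lambda_client.invoke(FunctionName="my-fn", InvocationType="Event",
#                        Payload=json.dumps(event))
```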
New Relic migrated its Lambda Extension from Go to Rust, resulting in a 40% reduction in billed duration and improved memory efficiency. The rewrite also enhanced reliability and introduced a more robust telemetry pipeline.
This article explains how to use AWS Lambda durable functions for building multi-step applications and AI workflows. It describes features like automatic retries, state management, and execution suspension, allowing developers to handle complex scenarios efficiently. It also provides a sample order processing workflow demonstrating these capabilities.
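The retry-and-checkpoint idea behind such workflows can be sketched in plain Python. This is an illustration of the pattern only, not the actual durable functions SDK, whose API differs; the helper name and order-processing steps are ours.

```python
def run_step(name, fn, state, max_retries=3):
    """Run one workflow step with automatic retries, checkpointing its
    result into `state` so a resumed execution can skip completed work.
    Illustrative sketch of the retry/state-management ideas, not the
    real durable functions API."""
    if name in state:  # already completed on a previous attempt
        return state[name]
    last_error = None
    for _ in range(max_retries):
        try:
            state[name] = fn()
            return state[name]
        except Exception as exc:
            last_error = exc
    raise last_error

# Sample order-processing flow: each step's result is checkpointed.
state = {}
run_step("reserve_inventory", lambda: "reserved", state)
run_step("charge_card", lambda: "charged", state)
```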
This article analyzes the security features of AWS Lambda Managed Instances, focusing on their Bottlerocket-based architecture and access restrictions. It highlights the limitations on IAM role modifications and instance access, while exploring the underlying components and network configurations that enhance security.
AWS introduced Durable Functions for Lambda, allowing developers to create multi-step applications directly in their code. These functions can track progress, handle retries, and pause execution for up to a year without incurring costs. This feature streamlines state management and simplifies complex workflows.
This article outlines the public roadmap for AWS Lambda, detailing current and upcoming features. It encourages customer feedback and clarifies how to request new features or report issues.
AWS Lambda response streaming has increased its maximum response payload size to 200 MB, ten times the previous limit, enhancing performance for latency-sensitive applications. This improvement allows for direct processing of large datasets and media files without the need for intermediate services, thereby reducing time to first byte (TTFB) for end-users. The new limit is applicable across all supported AWS Regions and compatible with both Node.js managed and custom runtimes.
Amazon Redshift will discontinue support for Python user-defined functions (UDFs) after June 30, 2026, encouraging users to migrate to Lambda UDFs which offer better integration, scalability, and security. The article provides a detailed guide on how to transition existing Python UDFs to Lambda UDFs, including setup, testing, and monitoring through AWS tools.
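For reference, a Redshift scalar Lambda UDF receives argument rows in a batch and must return one result per row. A minimal sketch of that handler contract (the upper-casing logic is a stand-in for whatever the original Python UDF did):

```python
import json

def handler(event, context):
    """Redshift scalar Lambda UDF: `event["arguments"]` is a list of
    argument rows; the response must carry one result per row, in order,
    under a JSON "results" key with a "success" flag."""
    try:
        results = [row[0].upper() if row[0] is not None else None
                   for row in event["arguments"]]
        return json.dumps({"success": True, "results": results})
    except Exception as exc:
        return json.dumps({"success": False, "error_msg": str(exc)})
```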
AWS Lambda now integrates with GitHub Actions, allowing automatic deployment of Lambda functions whenever code changes are pushed to GitHub repositories. This new feature simplifies the CI/CD process by eliminating the need for custom scripts and manual configurations, supporting both .zip file and container image deployments while streamlining permissions and error handling.
AWS Lambda requires careful consideration for observability due to its serverless nature, which complicates monitoring and debugging. This guide explores the challenges of implementing OpenTelemetry with AWS Lambda, offers insights into instrumentation methods like AWS Distro for OpenTelemetry (ADOT) and custom SDKs, and discusses deployment options for telemetry data collection, all while emphasizing the importance of understanding the Lambda execution lifecycle.
A critical vulnerability in AWS Lambda functions allows attackers to exploit OS command injection through S3 file uploads, potentially compromising AWS credentials and enabling further malicious actions such as phishing via AWS SES. The article highlights the importance of proper configuration and vulnerability scanning to prevent such attacks in event-driven architectures.
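The vulnerable pattern boils down to interpolating an attacker-controlled S3 object key into a shell command. A sketch of the unsafe versus safe approach (the `echo` command stands in for whatever file-processing tool the function shells out to; names are illustrative, not from the article):

```python
import subprocess

def process_upload_unsafe(key: str) -> str:
    # VULNERABLE: the S3 object key is interpolated into a shell string,
    # so a key like "x; curl attacker.example" runs arbitrary commands.
    return subprocess.run(f"echo {key}", shell=True,
                          capture_output=True, text=True).stdout

def process_upload_safe(key: str) -> str:
    # SAFE: an argument list with no shell; the key is passed to the
    # command as literal data, never parsed for metacharacters.
    return subprocess.run(["echo", key],
                          capture_output=True, text=True).stdout
```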
AWS ECS Fargate can struggle with sudden traffic spikes due to slow autoscaling, leading to potential 503 errors. To mitigate this, the article suggests offloading traffic to a Lambda function during high-traffic periods by creating an additional target group and using CloudWatch metrics to trigger scaling actions. This setup allows existing Fargate tasks to handle the load more effectively while new tasks are being provisioned.
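An overflow Lambda registered in the additional target group just needs to return the response shape an ALB expects. A minimal sketch (handler body and message are ours, assuming the offload setup described above):

```python
import json

def fallback_handler(event, context):
    """ALB-target Lambda that absorbs overflow traffic while new
    Fargate tasks are still provisioning. Returns the statusCode /
    headers / body shape the Application Load Balancer expects."""
    return {
        "statusCode": 200,
        "statusDescription": "200 OK",
        "isBase64Encoded": False,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "message": "served by Lambda overflow target",
            "path": event.get("path", "/"),
        }),
    }
```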
A scalable mass email service was built using AWS services including SES, SQS, Lambda, S3, and CloudWatch to efficiently handle high volumes of emails while ensuring reliability and deliverability. The article provides an overview of the architecture, real-world use cases, pricing predictions, and step-by-step implementation details, along with challenges faced and solutions implemented during the project. Future improvements are suggested, such as adding a user-friendly interface and analytics functionality.
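The SQS-to-Lambda sending stage of such a pipeline typically uses partial batch responses so only failed messages return to the queue (this requires `ReportBatchItemFailures` enabled on the event source mapping). A sketch with the SES call stubbed out; the details are ours, not the article's exact code:

```python
import json

def send_email(msg: dict) -> None:
    """Stand-in for ses.send_email(...) via boto3; raises on bad input."""
    if not msg.get("to"):
        raise ValueError("missing recipient")

def handler(event, context):
    """SQS-triggered sender: report failed message IDs back to Lambda
    so only those messages are retried, not the whole batch."""
    failures = []
    for record in event["Records"]:
        try:
            send_email(json.loads(record["body"]))
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```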
A startup experienced a silent crash in AWS Lambda, where Node.js functions failed mid-execution without any logs or errors. Despite extensive evidence and escalation through AWS support channels, the company received no constructive engagement and was ultimately blamed for the issue, leading them to migrate their entire infrastructure to Azure.
An OpenAI-compatible API can be effectively deployed using AWS Lambda and an Application Load Balancer (ALB) to work around the constraints of API Gateway's authentication model. By setting up the ALB to route traffic directly to the Lambda function, developers can maintain a seamless integration with the OpenAI Python client, ensuring a consistent API experience. This approach offers flexibility and security when exposing custom AI services.
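For the OpenAI client to parse responses, the Lambda behind the ALB must return a chat-completions-shaped body in the ALB response format. A hedged sketch (the echo "model" and field values are placeholders; only the response shapes follow the documented formats):

```python
import json
import time

def handler(event, context):
    """ALB-target Lambda returning an OpenAI-style chat completion, so a
    client pointed at the ALB via base_url can consume it unchanged."""
    req = json.loads(event.get("body") or "{}")
    user_msg = next((m["content"] for m in req.get("messages", [])
                     if m.get("role") == "user"), "")
    completion = {
        "id": "chatcmpl-local",                  # placeholder ID
        "object": "chat.completion",
        "created": int(time.time()),
        "model": req.get("model", "custom-model"),
        "choices": [{
            "index": 0,
            "finish_reason": "stop",
            "message": {"role": "assistant", "content": f"echo: {user_msg}"},
        }],
    }
    return {"statusCode": 200, "statusDescription": "200 OK",
            "isBase64Encoded": False,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(completion)}
```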
AWS has introduced a new feature that allows for the deployment of AWS Lambda functions directly through GitHub Actions, simplifying the CI/CD process with a declarative YAML configuration. This improvement eliminates the need for manual packaging and configuration steps, enhancing developer experience and security through seamless IAM integration. Users can easily set up a workflow to automatically deploy their functions with minimal effort.
This article shares best practices from AWS Lambda's experience handling billions of asynchronous invocations, emphasizing the importance of scalability and reliability in serverless applications. It outlines techniques such as simple queueing, consistent hashing, and shuffle-sharding to mitigate issues like noisy neighbors and traffic spikes, ensuring efficient load distribution and fault tolerance. Additionally, it highlights proactive monitoring and resilience strategies to maintain service quality during high-demand periods.
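The shuffle-sharding idea can be illustrated in a few lines: each tenant is deterministically assigned a small subset of shards, so two tenants rarely share their full subset and a noisy neighbor degrades at most the shards it lands on. An illustrative sketch of the concept, not Lambda's actual implementation:

```python
import hashlib

def shuffle_shard(tenant: str, num_shards: int = 16, shard_size: int = 2):
    """Deterministically pick `shard_size` distinct shards for a tenant
    by consuming bytes of a hash of its identifier. Same tenant always
    maps to the same subset; different tenants rarely share all shards."""
    digest = hashlib.sha256(tenant.encode()).digest()
    chosen, candidates = [], list(range(num_shards))
    for i in range(shard_size):
        idx = digest[i] % len(candidates)
        chosen.append(candidates.pop(idx))  # pop keeps picks distinct
    return sorted(chosen)
```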