6 links
tagged with all of: performance + serverless
Links
After traditional Redis-based rate limiting struggled during a traffic spike, the author moved to Cloudflare Durable Objects for a more efficient solution. The new approach significantly reduced latency, improved reliability, and lowered costs while handling thousands of concurrent requests in a serverless environment.
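The article's own code isn't reproduced here, but the core idea it rests on — each client key's counter living in a single coordination point (a Durable Object) rather than a shared Redis instance — can be sketched as a fixed-window limiter. All names below are hypothetical illustrations, not the author's API:

```typescript
// Minimal fixed-window rate limiter: the kind of per-key state a single
// Durable Object instance would own. Hypothetical sketch, not the
// author's implementation.
class FixedWindowLimiter {
  private windowStart = 0; // start of the current window (ms)
  private count = 0;       // requests seen in the current window

  constructor(
    private readonly limit: number,    // max requests per window
    private readonly windowMs: number, // window length in ms
  ) {}

  // Returns true if a request arriving at time `now` (ms) is allowed.
  allow(now: number): boolean {
    if (now - this.windowStart >= this.windowMs) {
      // A new window has begun: reset the counter.
      this.windowStart = now;
      this.count = 0;
    }
    if (this.count < this.limit) {
      this.count++;
      return true;
    }
    return false;
  }
}
```

Because a Durable Object processes events for one key serially, a counter like this needs no locks and no network round-trip to a shared store, which is where the latency win comes from.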
Cloudflare has launched an open beta of Workers Tracing, a feature that lets developers trace requests through their serverless applications. The tool improves visibility into application performance and aids debugging by providing detailed insights into request paths and execution times, helping users understand their application's behavior and optimize accordingly.
After two years on serverless Cloudflare Workers, the Unkey team moved to stateful Go servers, improving API performance with a roughly sixfold latency reduction. The shift simplified their architecture, enabled self-hosting, and removed the complexities imposed by serverless limitations, ultimately improving both developer experience and operational efficiency.
Lambdaliths — monolithic applications deployed as a single AWS Lambda function — are a point of ongoing debate in the serverless community. They can simplify development and improve portability, but may bring longer cold starts, coarser scaling, and a loss of fine-grained telemetry compared with the function-per-endpoint approach. Ultimately, the choice between a Lambdalith and single-route functions depends on the application's needs and traffic patterns.
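The pattern under debate is easy to picture: instead of deploying one function per endpoint, a single handler receives every request and dispatches internally. A minimal hypothetical sketch (types and routes invented for illustration):

```typescript
// Hypothetical "Lambdalith": one deployed function serves every route,
// dispatching on method + path itself rather than via per-endpoint deploys.
type HttpEvent = { method: string; path: string };
type HttpResponse = { statusCode: number; body: string };

function handler(event: HttpEvent): HttpResponse {
  const route = `${event.method} ${event.path}`;
  switch (route) {
    case "GET /health":
      return { statusCode: 200, body: "ok" };
    case "GET /users":
      return { statusCode: 200, body: JSON.stringify(["alice", "bob"]) };
    default:
      // One shared 404 for unknown routes — no per-route configuration.
      return { statusCode: 404, body: "not found" };
  }
}
```

The trade-off the summary describes follows directly: the whole switch (and every route's dependencies) ships in one bundle, so cold starts grow with the app, and platform metrics see only one function rather than one per endpoint.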
The article describes how Vercel built "serverless servers": an architecture that lets serverless functions behave more like long-lived servers, handling requests fluidly for better scalability and efficiency. It walks through the technical challenges encountered during development and how they were resolved to improve performance and user experience.
The article outlines a "shard and conquer" approach to eliminating cold starts in serverless computing: by splitting workloads into smaller, manageable shards, requests are more likely to reach already-warm capacity, reducing cold-start latency and improving resource utilization in cloud environments.
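One plausible building block for a scheme like this — an assumption for illustration, not necessarily the article's mechanism — is deterministic shard selection: hashing a stable request key (say, a tenant ID) to one of N shards, so repeat traffic lands on the same, already-warm instance instead of triggering a cold start elsewhere:

```typescript
// Hypothetical shard router: map a request key to one of `shardCount`
// shards deterministically, so the same key always hits the same shard.
function pickShard(key: string, shardCount: number): number {
  // FNV-1a hash: simple, fast, and deterministic across calls.
  let hash = 0x811c9dc5;
  for (let i = 0; i < key.length; i++) {
    hash ^= key.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0; // keep it unsigned 32-bit
  }
  return hash % shardCount;
}
```

Determinism is the point: any stateless router replica computes the same shard for the same key, concentrating warmth without shared coordination state.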