29 links tagged with serverless
Links
After facing challenges with traditional Redis-based rate limiting during a traffic spike, the author transitioned to using Cloudflare Durable Objects for a more efficient solution. This new approach significantly reduced latency, improved reliability, and lowered costs while effectively managing thousands of concurrent requests in a serverless environment.
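The pattern behind that move is simple to sketch: route every client key to its own Durable Object and let that single object own the counter, so no external store or distributed lock is needed. A minimal sketch, assuming the `@cloudflare/workers-types` typings, a fixed one-minute window, and illustrative binding and limit values that are not taken from the post:

```ts
// Minimal fixed-window rate limiter as a Durable Object (illustrative names/values).
export class RateLimiter {
  state: DurableObjectState;
  constructor(state: DurableObjectState) {
    this.state = state;
  }

  async fetch(request: Request): Promise<Response> {
    const now = Date.now();
    const windowMs = 60_000; // assumed: 1-minute window
    const limit = 100;       // assumed: max requests per window

    // The stored counter survives across requests because all requests for
    // this object ID are routed to the same instance.
    let data = (await this.state.storage.get<{ start: number; count: number }>("window"))
      ?? { start: now, count: 0 };

    if (now - data.start >= windowMs) {
      data = { start: now, count: 0 }; // roll over to a new window
    }
    data.count++;
    await this.state.storage.put("window", data);

    return data.count <= limit
      ? new Response("ok")
      : new Response("rate limited", { status: 429 });
  }
}

// Worker entry point: map each client key to its own Durable Object instance.
export default {
  async fetch(request: Request, env: { RATE_LIMITER: DurableObjectNamespace }): Promise<Response> {
    const key = request.headers.get("cf-connecting-ip") ?? "anonymous";
    const id = env.RATE_LIMITER.idFromName(key);
    return env.RATE_LIMITER.get(id).fetch(request);
  },
};
```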
AutoKitteh is a developer platform designed for workflow automation and orchestration using vanilla Python, offering a flexible alternative to no/low-code solutions. It supports self-hosting and a cloud option, providing a scalable serverless environment for various operational needs, along with built-in API integrations and advanced engineering features. The platform is open-source and focuses on durability and reliability for long-running workflows.
Backendless is a platform that enables developers to build applications without the hassle of managing server infrastructure. It offers a range of features including real-time data, user management, and push notifications, allowing for rapid application development. By leveraging Backendless, developers can focus on creating unique user experiences rather than worrying about backend complexities.
Elastic's transformation to a serverless architecture for Elastic Cloud Serverless involved shifting from a stateful system to a stateless design, leveraging cloud-native object storage and Kubernetes for orchestration. The changes aimed to meet evolving customer needs for simplified infrastructure management and scalability while optimizing performance and reducing operational complexity. Key strategies included using a push model for control and data communication, automated upgrades, and flexible usage-based pricing.
The article showcases Vercel, a platform designed for frontend developers to build, deploy, and optimize websites and applications effortlessly. It highlights Vercel's features, including serverless functions, automatic scaling, and support for modern frameworks, emphasizing its role in enhancing developer productivity and user experience. Additionally, it discusses integration with popular tools and the importance of performance in web development.
Cloudflare has introduced an open beta for Workers Tracing, a new feature that allows developers to trace requests through their serverless applications. This tool aims to enhance visibility into application performance and assist in debugging by providing detailed insights about request paths and execution times. With Workers Tracing, users can better understand their application's behavior and optimize performance accordingly.
AWS has introduced specialized Model Context Protocol (MCP) servers for Amazon ECS, EKS, and AWS Serverless, enhancing AI-assisted development by providing real-time contextual responses and service-specific guidance. These open-source solutions streamline application development, enabling faster deployments and more accurate interactions with AWS services through natural language commands. The MCP servers aid in managing deployments, troubleshooting, and leveraging the latest AWS features effectively.
Alex Seaton from Man Group presented at QCon London 2025 on transitioning from a high-maintenance MongoDB server farm to a serverless database solution using object storage for hedge fund trading applications. He emphasized the advantages of serverless architecture, including improved storage management and concurrency models, while also addressing challenges like clock drift and the complexities of Conflict-Free Replicated Data Types (CRDTs). Key takeaways highlighted the need for careful management of global state and the subtleties involved in using CRDTs and distributed locking mechanisms.
The article discusses the announcement of Databricks Neon, a serverless Postgres offering intended to complement Databricks' data analytics capabilities. It highlights features like automatic scaling, easy integration with existing tools, and improved performance for data professionals. The launch aims to simplify data management and accelerate analytics workflows for organizations.
Featherless AI is now an Inference Provider on the Hugging Face Hub, enhancing serverless AI inference capabilities with a wide range of supported models. Users can easily integrate Featherless AI into their projects using client SDKs for both Python and JavaScript, with flexible billing options depending on their API key usage. PRO users receive monthly inference credits and access to additional features.
Cloud Run now offers general availability of NVIDIA GPU support, enabling developers to run AI workloads with enhanced flexibility, cost-efficiency, and rapid scaling capabilities. Users can take advantage of pay-per-second billing, automatic scaling to zero, and seamless deployment across multiple regions, making GPU acceleration more accessible than ever. Additionally, the service supports a range of use cases, including model fine-tuning and batch processing, without the need for quota requests.
Firecracker, an open-source software developed by AWS, enables the creation and management of lightweight virtual machines that enhance the performance and security of serverless applications like AWS Lambda. The article discusses its applications in Amazon Bedrock AgentCore for AI agents and the Aurora DSQL serverless relational database, highlighting the benefits of session isolation, fast VM cloning, and efficient memory management.
Amazon DocumentDB Serverless is now generally available, providing a configuration that automatically scales compute and memory based on application demand, leading to significant cost savings. It supports existing MongoDB-compatible APIs and allows for easy transitions from provisioned instances without data migration, making it ideal for variable, multi-tenant, and mixed-use workloads. Users can manage capacity effectively and only pay for what they use in terms of DocumentDB Capacity Units (DCUs).
AWS Step Functions now support JSONata, a powerful query and transformation language that simplifies data manipulation within state machines. The article demonstrates various use cases for filtering, sorting, and transforming JSON data, emphasizing the ease and maintainability JSONata brings compared to the previous JSONPath syntax. It also highlights the importance of adhering to software engineering best practices when integrating orchestration and business logic.
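For flavor, here is what a JSONata expression looks like inside a state definition, written as a TypeScript object literal for readability. The `QueryLanguage` and `Output` fields and the `$states.input` variable are part of the Step Functions JSONata mode; the `orders`/`total` fields and the 100 threshold are made-up example data, not from the article:

```ts
// Sketch of an ASL Pass state that filters and sorts its input with JSONata.
const filterAndSortState = {
  Type: "Pass",
  QueryLanguage: "JSONata",
  Output: {
    // Keep only orders above 100 and sort them by total, descending.
    topOrders:
      "{% $sort($states.input.orders[total > 100], function($a, $b) { $a.total < $b.total }) %}",
  },
  End: true,
};
```

The same filter-and-sort would take noticeably more plumbing with JSONPath's `InputPath`/`ResultSelector` fields, which is the maintainability point the article makes.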
The article reflects on a decade of Netlify, highlighting its evolution and impact on modern web development. It discusses key milestones, innovations, and the influence of Netlify on the serverless architecture and JAMstack ecosystem. The author shares insights into the future direction of the platform and its community.
Lamatic offers a serverless platform for building and deploying generative AI applications quickly and efficiently, featuring a collaborative builder, pre-built templates, and seamless integration of third-party data sources. With capabilities like automated workflows, real-time tracing, and a managed GenAI tech stack, users can develop high-performance AI solutions without the complexities of infrastructure management. The platform ensures data security and provides extensive support for users to achieve their AI goals.
The article discusses the innovative approach taken by Vercel in building serverless servers, emphasizing the fluid architecture that allows for scalability and efficiency. It explores the technical challenges faced during development and how they were overcome to enhance performance and user experience.
Google Cloud has launched a serverless version of Apache Spark integrated within BigQuery, aimed at simplifying data processing and analytics. This new offering eliminates the need for cluster management, reduces costs, and enhances performance while providing a unified development experience in BigQuery Studio, allowing users to seamlessly work with both Spark and BigQuery.
Lambdaliths, monolithic applications deployed as a single AWS Lambda function, have sparked debate within the serverless community over their advantages and disadvantages. While they can simplify development and improve portability, they may lead to longer cold starts, reduced scalability, and a loss of fine-grained telemetry compared to the function-per-endpoint approach. Ultimately, the choice between a Lambdalith and single-route functions depends on specific application needs and traffic patterns. A minimal sketch of the pattern follows below.
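For context, a Lambdalith is just one handler that does its own routing. A minimal sketch, assuming an API Gateway HTTP API (payload v2) event, the `@types/aws-lambda` typings, and made-up routes:

```ts
import type { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";

// One Lambda function handles every route itself instead of one function per endpoint.
export const handler = async (
  event: APIGatewayProxyEventV2
): Promise<APIGatewayProxyResultV2> => {
  const route = `${event.requestContext.http.method} ${event.rawPath}`;
  switch (route) {
    case "GET /users":
      return { statusCode: 200, body: JSON.stringify([{ id: 1, name: "Ada" }]) };
    case "POST /orders":
      return { statusCode: 201, body: JSON.stringify({ accepted: true }) };
    default:
      return { statusCode: 404, body: "not found" };
  }
};
```

Because all routes share one function, they also share one cold start, one deployment unit, and one set of metrics, which is exactly the trade-off the article weighs.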
After two years of running on serverless Cloudflare Workers, the Unkey team transitioned to stateful Go servers to improve API performance, cutting latency by a factor of six. The shift simplified their architecture, enabled self-hosting, and removed the complexities associated with serverless limitations, ultimately improving developer experience and operational efficiency.
Migrating from the Serverless Framework to the Cloud Development Kit (CDK) involves a manual process of preserving resources, recreating them in the CDK, and importing them to maintain functionality. The author shares insights from their own migration experience, including steps to prevent resource deletion, compare CloudFormation outputs, and deploy a sample application that demonstrates the migration techniques.
Ownstats allows users to host their own privacy-focused website analytics on AWS using a serverless architecture. It includes backend infrastructure for data processing, a frontend React application, and a JavaScript client library for sending analytics data, all while requiring specific local installations and AWS permissions. Documentation for the project is available online.
Scaleway has been added as a new Inference Provider on the Hugging Face Hub, allowing users to easily access various AI models through a serverless API. The service features competitive pricing, low latency, and supports advanced functionalities like structured outputs and multimodal processing, making it suitable for production use. Users can manage their API keys and preferences directly within their accounts for seamless integration.
The article discusses strategies for eliminating cold starts in serverless computing by implementing a "shard and conquer" approach. By breaking down workloads into smaller, manageable pieces, the technique aims to enhance performance and reduce latency during function execution. This method is particularly beneficial for optimizing resource utilization in cloud environments.
The AWS Lambda team shares best practices for handling billions of asynchronous invocations, emphasizing the importance of scalability and reliability in serverless applications. The article outlines techniques such as simple queueing, consistent hashing, and shuffle-sharding to mitigate issues like noisy neighbors and traffic spikes, ensuring efficient load distribution and fault tolerance. It also highlights proactive monitoring and resilience strategies for maintaining service quality during high-demand periods.
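Shuffle-sharding in particular is easy to sketch: each tenant is hashed to a small, deterministic subset of shards, so two tenants rarely share their full set and a noisy neighbor degrades only its own subset. A minimal sketch, with shard counts and tenant IDs chosen purely for illustration:

```ts
import { createHash } from "node:crypto";

// Deterministically assign a tenant to a small pseudo-random subset of shards.
function shuffleShard(tenantId: string, totalShards: number, shardsPerTenant: number): number[] {
  const picked = new Set<number>();
  let round = 0;
  while (picked.size < shardsPerTenant) {
    // Hash tenantId plus a round counter until enough distinct shards are chosen.
    const digest = createHash("sha256").update(`${tenantId}:${round++}`).digest();
    picked.add(digest.readUInt32BE(0) % totalShards);
  }
  return [...picked];
}

// Two tenants almost never land on an identical shard set, which limits blast radius.
console.log(shuffleShard("tenant-a", 64, 4));
console.log(shuffleShard("tenant-b", 64, 4));
```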
Dacadoo successfully transformed its API service from a virtual machine to a Kubernetes-based architecture and finally to a fully serverless solution on AWS, achieving a remarkable 78% reduction in cloud costs and significantly lowering operational maintenance efforts. The transition enhanced scalability, availability, and automation, while complying with regulatory requirements for sensitive health data. This journey highlights the benefits of adopting managed services and modern cloud technologies.
A new PDF chatbot called Vectorless eliminates the need for vector embeddings by utilizing Large Language Models to intelligently select documents and detect page relevance in real-time. This stateless, privacy-first solution processes files entirely within the user's browser, maintaining full document context and enabling accurate, contextual answers without server storage or pre-indexing.
AWS has introduced a new feature that allows for the deployment of AWS Lambda functions directly through GitHub Actions, simplifying the CI/CD process with a declarative YAML configuration. This improvement eliminates the need for manual packaging and configuration steps, enhancing developer experience and security through seamless IAM integration. Users can easily set up a workflow to automatically deploy their functions with minimal effort.
Cloudflare has launched Containers in public beta, allowing developers to deploy Docker container images on its global edge network, which enhances performance by reducing latency. This new feature integrates with Cloudflare Workers, enabling the execution of complex Linux-based applications while offering benefits like global deployment, scale-to-zero pricing, and programmability.