Links
This article details how to build a Docker-based machine learning inference service that includes automated security scanning, testing, and deployment. It walks through the architecture, CI/CD pipeline, and real-world usage of a Flask API serving a Hugging Face model locally.
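The core of such a service is a small Flask endpoint wrapping the model. A minimal sketch of that shape, with a stub standing in for the Hugging Face pipeline so it runs without downloading weights (the route name and payload fields are assumptions, not the article's exact API):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In the article's setup this would be a Hugging Face pipeline, e.g.:
#   from transformers import pipeline
#   model = pipeline("sentiment-analysis")
# A stub stands in here so the sketch runs offline.
def model(text):
    return [{"label": "POSITIVE", "score": 0.99}]

@app.route("/predict", methods=["POST"])
def predict():
    # Accept {"text": "..."} and return the model's prediction as JSON.
    text = request.get_json(force=True).get("text", "")
    return jsonify(model(text))

# Exercise the endpoint with Flask's built-in test client (no server needed):
client = app.test_client()
resp = client.post("/predict", json={"text": "great service"})
print(resp.get_json())
```

In a Docker-based pipeline like the one described, this app would be the container's entrypoint, with the CI stages running the security scan and tests against the image before deployment.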
OpenTinker is a framework for agentic reinforcement learning, offering a range of training scenarios and environments. It features both data-dependent and data-free paradigms, with single-turn and multi-turn interaction modes for various use cases. The setup involves cloning the repository, installing dependencies, and configuring an authentication system for API access.
LiteLLM is a lightweight proxy server that lets you call many LLM APIs through a consistent OpenAI-style format, translating inputs per provider and adding robust features such as retry logic, budget management, and logging. It supports providers including OpenAI, Azure, and Hugging Face, offers both synchronous and asynchronous interaction models, and can be set up via Docker with environment variables for secure API key management.
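The proxy's routing is declared in a config file mapping client-facing model names to provider backends, with API keys pulled from the environment. A minimal sketch (model names and the Azure deployment are placeholders):

```yaml
# litellm proxy config — placeholder model names; keys read from env vars
model_list:
  - model_name: gpt-4o              # alias clients request via the OpenAI format
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: azure-gpt
    litellm_params:
      model: azure/my-deployment    # hypothetical Azure deployment name
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
```

Clients then point any OpenAI-compatible SDK at the proxy's URL and request `gpt-4o` or `azure-gpt`; the proxy handles the provider-specific translation.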
WAHA is a self-hosted WhatsApp HTTP API that can be set up on your server in under five minutes, provided you have Docker installed. The guide details steps for sending your first text message via the API, including session management, QR code scanning, and example payloads for message sending.
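Once a session is authenticated via QR scan, sending a text is a single HTTP POST. A sketch of the payload shape, assuming WAHA's default port 3000 and the `sendText` endpoint; the request itself is shown but not sent here, since it needs a running instance:

```python
import json

# Assumed deployment: WAHA listening on its default port 3000.
WAHA_URL = "http://localhost:3000/api/sendText"

payload = {
    "session": "default",           # session name started after the QR scan
    "chatId": "1234567890@c.us",    # WhatsApp chat ID: phone number + "@c.us"
    "text": "Hello from WAHA!",
}

# To actually send (requires a running WAHA instance):
# import urllib.request
# req = urllib.request.Request(
#     WAHA_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# urllib.request.urlopen(req)

print(json.dumps(payload))
```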