4 links tagged with all of: llm + privacy
Links
A React Native plugin provides access to Apple's on-device Foundation Models framework (part of Apple Intelligence), giving developers local LLM APIs for generating structured outputs and managing sessions. Because all inference runs on-device, data stays private; the plugin also offers TypeScript support, custom tools, and session management, making it suitable for AI-powered mobile applications.
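To make the "structured outputs plus sessions" idea concrete, here is a minimal sketch of what calling such an on-device model might look like. The session type, method names, and schema shape below are assumptions for illustration, not the plugin's actual exports.

```typescript
// Hypothetical sketch: LLMSession and its methods are stand-ins, not the
// plugin's real API. Structured output is modeled as "pass a JSON schema,
// get back a typed object".
interface LLMSession {
  respond(prompt: string): Promise<string>;
  respondStructured<T>(prompt: string, jsonSchema: object): Promise<T>;
  release(): Promise<void>;
}

interface Recipe {
  title: string;
  ingredients: string[];
}

// Inference happens against the on-device model, so prompts never leave the phone.
export async function suggestRecipe(session: LLMSession): Promise<Recipe> {
  const schema = {
    type: "object",
    properties: {
      title: { type: "string" },
      ingredients: { type: "array", items: { type: "string" } },
    },
    required: ["title", "ingredients"],
  };
  const recipe = await session.respondStructured<Recipe>(
    "Suggest a quick pasta dish as JSON.",
    schema
  );
  await session.release();
  return recipe;
}
```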
Octo is a zero-telemetry coding assistant that works with any OpenAI-compatible or Anthropic-compatible LLM API and lets users switch models mid-conversation. It includes built-in Docker support and customizable configuration, and it runs well against local LLMs. Octo prioritizes user privacy while keeping coding tasks manageable through a straightforward interface.
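This is not Octo's code or configuration, but the general pattern behind "OpenAI-compatible" endpoints that such tools build on: point the standard client at a local server and treat the model name as a per-request choice. The URL, API key, and model name here are assumptions for the example.

```typescript
// Generic illustration of the OpenAI-compatible endpoint pattern, not Octo itself.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // e.g. a local Ollama server (assumed)
  apiKey: "not-needed-locally",
});

async function ask(model: string, question: string): Promise<string> {
  // Because the wire format is shared, "switching models mid-conversation"
  // is just a different `model` string on the next request.
  const completion = await client.chat.completions.create({
    model,
    messages: [{ role: "user", content: question }],
  });
  return completion.choices[0]?.message?.content ?? "";
}

ask("llama3.1", "Explain this stack trace to me.").then(console.log);
```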
React Native RAG is a new library that adds Retrieval-Augmented Generation (RAG) to local large language models (LLMs), producing context-rich responses by retrieving relevant information from an on-device knowledge base. It offers privacy, offline functionality, and scalability, and provides a modular toolkit developers can customize. The library integrates with React Native ExecuTorch for efficient on-device processing.
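For readers new to RAG, here is a minimal sketch of the retrieval step itself, not React Native RAG's actual API: embed the query, rank stored chunks by cosine similarity, and prepend the best matches to the prompt. The `embed` parameter is an assumed stand-in for whatever on-device embedding model (e.g. one run via ExecuTorch) supplies the vectors.

```typescript
// Minimal RAG retrieval sketch; names and signatures are illustrative.
type Chunk = { text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

async function buildRagPrompt(
  question: string,
  store: Chunk[],
  embed: (text: string) => Promise<number[]>, // on-device embedder (assumed)
  topK = 3
): Promise<string> {
  const q = await embed(question);
  const context = store
    .map((c) => ({ text: c.text, score: cosine(q, c.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map((s) => s.text)
    .join("\n---\n");
  // The local LLM answers from this retrieved context rather than its weights alone.
  return `Use only the context below to answer.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}
```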
The article presents OpenSkills, a tool that allows users to run Claude Skills locally on their Mac using any LLM, ensuring privacy and full control over data processing. It provides a detailed guide on installation and configuration, highlighting its compatibility with various AI tools and the ability to process sensitive documents without uploading them. Users can also create custom skills or utilize Anthropic's official skills in a sandboxed environment.
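As a rough sketch of the underlying idea (not OpenSkills' actual implementation): a Claude Skill is essentially a folder containing a SKILL.md of instructions, so "running it with any LLM" amounts to loading that file and supplying it as the system prompt to whatever local model you choose. The paths, endpoint, and model name below are assumptions.

```typescript
// Conceptual sketch only; OpenSkills' real installation and sandboxing differ.
import { readFile } from "node:fs/promises";
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // local OpenAI-compatible server (assumed)
  apiKey: "local",
});

async function runSkill(skillDir: string, task: string): Promise<string> {
  // The skill's instructions and the document being processed stay on the
  // machine when the model itself is local.
  const skillInstructions = await readFile(`${skillDir}/SKILL.md`, "utf8");
  const res = await client.chat.completions.create({
    model: "llama3.1",
    messages: [
      { role: "system", content: skillInstructions },
      { role: "user", content: task },
    ],
  });
  return res.choices[0]?.message?.content ?? "";
}

runSkill("./skills/example-skill", "Summarize this quarter's expense report.").then(console.log);
```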