Abbey is a self-hosted configurable AI interface with workspaces, document chats, YouTube chats, and more. Find our hosted version at https://abbey.us.ai.
Scala client for the OpenAI API and other major LLM providers
Google's NotebookLM, but local
The .NET library to consume 100+ APIs: OpenAI, Anthropic, Google, DeepSeek, Cohere, Mistral, Azure, xAI, Perplexity, Groq, Ollama, LocalAI, and many more!
BabyCommandAGI tests what happens when you combine a CLI with an LLM, two interfaces that predate the GUI. Based on BabyAGI and built on the latest LLM APIs. Imagine an LLM and a CLI having a conversation; it's exciting to think about what could happen. I hope you'll try it out.
A GUI-based AI development tool with integrated Metaphor support
The Anthropic API wrapper for Delphi leverages cutting-edge models, including Anthropic's advanced Claude series, to deliver powerful features for chat interactions, vision processing, caching, and efficient batch processing.
A standalone agent runner that executes tasks using MCP (Model Context Protocol) tools via the Anthropic Claude, AWS Bedrock, and OpenAI APIs. It enables AI agents to run autonomously in cloud environments and interact with various systems securely.
A native Android app created with React Native on the Expo framework. Chat with assistants from OpenAI, Anthropic and Mistral.
Use AI to do anything with the selected text!
Client for the Claude AI models via the Anthropic API
A self-contained Google Colab Notebook that implements RAG using Postgres pgvector, pgai, and LLMs. Supports local-Ollama/OpenAI/Anthropic
Token counter for Anthropic
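Anthropic's Messages API exposes a token-counting endpoint (`POST /v1/messages/count_tokens`) that a counter like this would typically call. The sketch below is a minimal illustration, not any listed repo's implementation: it builds the request body that endpoint expects and adds a rough offline estimate. The model name and the ~4-characters-per-token heuristic are assumptions for illustration, and no request is actually sent.

```python
import json

# Endpoint a token counter would POST to (no request is sent in this sketch).
API_URL = "https://api.anthropic.com/v1/messages/count_tokens"


def build_count_request(model: str, text: str) -> dict:
    """Return the JSON body for counting tokens in a single user message."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": text}],
    }


def rough_token_estimate(text: str) -> int:
    # Offline fallback: ~4 characters per token is a common rule of thumb,
    # NOT Anthropic's actual tokenizer.
    return max(1, len(text) // 4)


# Model name is an illustrative assumption.
body = build_count_request("claude-3-5-sonnet-20241022", "Hello, Claude!")
print(json.dumps(body))
print(rough_token_estimate("Hello, Claude!"))
```

A real counter would send `body` with an `x-api-key` header and an `anthropic-version` header, and read the exact count from the JSON response.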
Materials/code for a short presentation on using AI with Rocket League
Neovim Plugin to Use LLMs to Code
Chat With Documents is a Streamlit application designed to facilitate interactive, context-aware conversations with large language models (LLMs) by leveraging Retrieval-Augmented Generation (RAG). Users can upload documents or provide URLs, and the app indexes the content using a vector store called Chroma to supply relevant context during chats.
Rocket League Replay Analysis with multiple AI agents (OpenAI, Anthropic, Gemini)
Large Language Model API interface
Interact with local and commercial LLMs via a Streamlit UI
A simple web app that allows you to run multiple LLMs in parallel.
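Running multiple LLMs in parallel usually means fanning the same prompt out to several providers concurrently, since the calls are network-bound. A minimal sketch of that pattern, using stand-in stub functions rather than real vendor SDKs (the stubs and their names are assumptions for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in providers; a real app would swap in actual SDK or HTTP calls.
def query_openai(prompt: str) -> str:
    return f"[openai] echo: {prompt}"


def query_anthropic(prompt: str) -> str:
    return f"[anthropic] echo: {prompt}"


def run_in_parallel(prompt: str, providers: dict) -> dict:
    """Send one prompt to every provider concurrently; return name -> reply."""
    # Threads overlap well here because each call mostly waits on the network.
    with ThreadPoolExecutor(max_workers=len(providers)) as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in providers.items()}
        return {name: fut.result() for name, fut in futures.items()}


results = run_in_parallel(
    "ping", {"openai": query_openai, "anthropic": query_anthropic}
)
print(results)
```

Collecting futures into a dict keeps each reply tied to its provider, so the UI can render the answers side by side as they complete.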