I've built Baa, an LLM client for the web and desktop, with useful features for anyone working with LLMs: chat, manage prompts, construct functions, and run experiments with multiple models from OpenAI and Anthropic. Here are some early details on what Baa is and what it can do.
I created Baa because I couldn't find an existing LLM client that was spot on for my needs. AI and LLMs have fundamentally changed how I work, and increasingly what I do, but I struggled to find a way of using them that suited me. What I wanted:
A pretty, familiar, modern UI.
Access to data, local and cloud.
No vendor lock-in.
Self-hosting.
Desktop apps.
Sharing features.
Developer features.
Multiple agents (OpenAI and Anthropic): switch between different providers, such as OpenAI and Anthropic, to benefit from their respective capabilities and strengths.
Desktop web app: use Baa as an app on your desktop; it will continue to use online services for data and auth.
Chat: the usual chat with GPT, but with useful additions such as system prompts, forking chats and multiple models.
When working with code or queries there are useful utilities like forking a chat, downloading files, starting experiments and, of course, copying (a sketch of chat forking follows this list).
Handle prompts: organise, create, edit and share your prompts.
Construct functions: create, edit, organise and share functions.
Experiment: try out different approaches or ideas; save and iterate on AI functions, completions, chains and agents.
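To make chat forking concrete, here's a minimal sketch of the idea, not Baa's actual code (the Chat and ChatMessage shapes are assumptions): copy the conversation up to a chosen message so the new branch can continue with a different prompt or model.

```typescript
// Hypothetical shapes for illustration; Baa's real types may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface Chat {
  id: string;
  title: string;
  systemPrompt?: string;
  messages: ChatMessage[];
}

// Fork a chat: copy everything up to (and including) a chosen message,
// so the new branch can continue independently of the original.
function forkChat(source: Chat, atMessageIndex: number): Chat {
  return {
    id: crypto.randomUUID(), // available in Node 18 and modern browsers
    title: `${source.title} (fork)`,
    systemPrompt: source.systemPrompt,
    messages: source.messages.slice(0, atMessageIndex + 1),
  };
}
```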
Desktop app: everything local (auth, storage and LLMs), removing any cloud dependencies.
Optional online community for sharing chats, prompts and functions.
Sharing: chats, prompts, functions and experiments.
Local models: support local LLMs (see the sketch after this list).
Local storage: store chats, prompts, functions and experiments locally.
Local auth: authenticate using local authentication.
Open Interpreter integration: https://openinterpreter.com
VS Code integration: open files and experiments in VS Code.
Two-way audio: home assistant and agents.
Multimodal and vision.
OpenAI plugins (complete).
Images.
Model view.
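On local models: LangChain JS already ships chat-model wrappers for local runtimes such as Ollama, so much of Baa's existing chat code should carry over. A hedged sketch, assuming a local Ollama server on its default port (the model name and import paths vary by version and are assumptions, not Baa's code):

```typescript
// Sketch only: talking to a locally hosted model via Ollama.
import { ChatOllama } from "langchain/chat_models/ollama";
import { HumanMessage, SystemMessage } from "langchain/schema";

async function askLocalModel(question: string) {
  const model = new ChatOllama({
    baseUrl: "http://localhost:11434", // Ollama's default endpoint
    model: "llama2",                   // any model pulled locally
  });

  const reply = await model.call([
    new SystemMessage("You are Baa, a helpful assistant."),
    new HumanMessage(question),
  ]);
  return reply.content;
}
```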
Anthropic integrated in 30 minutes
Recently Anthropic widened access to Claude 2, and it took just 30 minutes to integrate their API with Baa! Think what we could do to improve the UI of local LLMs.
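Most of that speed comes from LangChain's shared chat-model interface: swapping providers looks roughly like the sketch below (import paths and model names are assumptions that depend on your LangChain JS version; this isn't Baa's actual code).

```typescript
// Sketch: two providers behind one interface; the rest of the chat
// pipeline doesn't care which one is in use.
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ChatAnthropic } from "langchain/chat_models/anthropic";
import { BaseChatModel } from "langchain/chat_models/base";
import { HumanMessage, SystemMessage } from "langchain/schema";

type Provider = "openai" | "anthropic";

function getChatModel(provider: Provider): BaseChatModel {
  return provider === "openai"
    ? new ChatOpenAI({ modelName: "gpt-4" })
    : new ChatAnthropic({ modelName: "claude-2" });
}

async function ask(provider: Provider, question: string) {
  const model = getChatModel(provider);
  const reply = await model.call([
    new SystemMessage("You are Baa, a helpful assistant."),
    new HumanMessage(question),
  ]);
  return reply.content;
}
```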
Closed & open source
I'm not sure. Please, truly, please let me know your thoughts. Email me.
How is Baa developed?
Node.js 18: an open-source, cross-platform JavaScript runtime environment.
Next.js 13: enables full-stack web applications by extending the latest React features.
NextAuth.js: a configurable authentication framework for Next.js 13.
LangChain JS: an AI orchestration layer for building intelligent apps.
Tailwind CSS: a utility-first CSS framework providing predefined classes that can be mixed and matched to style each element.
shadcn/ui: reusable components built using Radix UI and Tailwind CSS.
Azure Cosmos DB: a fully managed platform-as-a-service (PaaS) NoSQL database used to store chat history.
https://www.helicone.ai: experienced first-hand the pain of building internal tooling and monitoring for LLMs at scale? Helicone solves those problems for you (a wiring sketch follows this list).
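As an example of how that last piece slots in: Helicone works as a proxy in front of the OpenAI API, so routing Baa's requests through it is mostly a matter of changing the base URL and adding one header. A hedged sketch using the openai Node SDK (the proxy URL and header name are my understanding of Helicone's setup, so double-check against their docs):

```typescript
import OpenAI from "openai";

// Route OpenAI calls through Helicone's proxy so every request is logged
// and monitored; only the base URL and an auth header change.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: "https://oai.helicone.ai/v1",
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
  },
});

async function chat(prompt: string) {
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
  });
  return completion.choices[0].message.content;
}
```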
What's stopping Baa from release?
For cloud - actually one, or maybe two, things: how to securely (like, really securely) manage API keys provided by you, and how to bill you for API usage.
For local - the Electron app exists, and so does auth. Currently, lightweight document storage is the blocker; I'll likely write a JSON-file API for LangChain, mirroring the Azure Cosmos integration (rough sketch below).
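Roughly, that JSON-file store would implement LangChain's chat-message-history interface and just read and write a file instead of a Cosmos container. A sketch, assuming the class and helper names I remember from LangChain JS (they may differ by version, and this is not Baa's actual code):

```typescript
import { promises as fs } from "fs";
import {
  BaseListChatMessageHistory,
  BaseMessage,
  StoredMessage,
  mapChatMessagesToStoredMessages,
  mapStoredMessagesToChatMessages,
} from "langchain/schema";

// Sketch only: persist one chat's messages to a JSON file on disk,
// mirroring what the Cosmos-backed history does in the cloud.
export class JsonFileChatMessageHistory extends BaseListChatMessageHistory {
  lc_namespace = ["baa", "stores", "message", "jsonfile"];

  constructor(private filePath: string) {
    super();
  }

  async getMessages(): Promise<BaseMessage[]> {
    try {
      const raw = await fs.readFile(this.filePath, "utf-8");
      const stored: StoredMessage[] = JSON.parse(raw);
      return mapStoredMessagesToChatMessages(stored);
    } catch {
      return []; // no file yet means an empty history
    }
  }

  async addMessage(message: BaseMessage): Promise<void> {
    const messages = await this.getMessages();
    const stored = mapChatMessagesToStoredMessages([...messages, message]);
    await fs.writeFile(this.filePath, JSON.stringify(stored, null, 2));
  }

  async clear(): Promise<void> {
    await fs.writeFile(this.filePath, "[]");
  }
}
```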
Why a sheep? Why Baa?
-soon- but I'm Welsh 🏴.
Why build this?
-soon-
Build this for mobile?
-soon- but no.
Working on this now, as of early October 2023. Let's go.