Minimalist web-searching app with an AI assistant that runs directly from your browser. Uses Web-LLM, Ratchet-ML, Wllama and SearXNG. Demo: https://felladrin-minisearch.hf.space
Updated Jul 26, 2024 - TypeScript
Library to supercharge your use of large language models
Empower Your Productivity with Local AI Assistants
Test and evaluate LLMs and model configurations, across all the scenarios that matter for your application
Pickup Line Generator is a fun and creative web application that helps you craft the perfect pickup line for your crush. Simply input a description of your crush and choose a style, and our AI-powered generator will create unique and creative pickup lines tailored to your preferences.
A minimalistic LLM chat UI
Generate any Tattoo using AI
LinguFlow, a low-code tool designed for LLM application development, simplifies the building, debugging, and deployment process for developers.
Official TypeScript wrapper for DeepInfra Inference API
Forked version of https://github.com/alfazh123/ParaFaze with state-of-the-art over-engineering :)
Repository containing code for setting up RAG on your machine. Implements both OpenAI and Hugging Face LLMs and embedding models.
🧠 Dump all your files and chat with them using your Generative AI Second Brain, powered by LLMs (GPT-3.5/4, Private, Anthropic, Vertex AI) & Embeddings 🧠
LLM Warehouse is a platform for showcasing LLMs, letting users explore each model and try it out through the web-app interface.
Multi-Agent Conversation Framework in TypeScript
Your personal code reviewer powered by LLMs (OpenAI GPT-3.5/4, Llama, Falcon, Azure AI) & Embeddings ⚡️ Improve code quality and catch bugs before you break production 🚀