Generate any Tattoo using AI
Updated Jun 15, 2024 - TypeScript
Pickup Line Generator is a fun and creative web application that helps you craft the perfect pickup line for your crush. Simply input a description of your crush and choose a style, and our AI-powered generator will create unique and creative pickup lines tailored to your preferences.
A minimalistic LLM chat UI
🧠 Dump all your files and chat with them using your Generative AI Second Brain, powered by LLMs (GPT 3.5/4, Private, Anthropic, VertexAI) & Embeddings 🧠
Fork of https://github.com/alfazh123/ParaFaze with a state-of-the-art dose of over-engineering :)
LLM Warehouse is a platform for showcasing LLMs, allowing users to explore and try out each model through the web-app interface.
Official TypeScript wrapper for DeepInfra Inference API
Your personal code reviewer powered by LLMs (OpenAI GPT-3.5/4, Llama, Falcon, Azure AI) & Embeddings ⚡️ Improve code quality and catch bugs before you break production 🚀
A Framework for Narrative Agents
Repository containing code for setting up RAG on your machine, with support for both OpenAI and Hugging Face LLMs and embedding models
LinguFlow, a low-code tool designed for LLM application development, simplifies the building, debugging, and deployment process for developers.
Empower Your Productivity with Local AI Assistants
This project collects GPU benchmarks from various cloud providers and converts them into fixed per-token costs, so you can compare providers, pick GPUs efficiently, and serve LLMs cost-effectively.
Library to supercharge your use of large language models
Test and evaluate LLMs and model configurations, across all the scenarios that matter for your application
Multi-Agent Conversation Framework in TypeScript
Minimalist web-searching app with an AI assistant that runs directly from your browser. Uses Web-LLM, Wllama and SearXNG. Demo: https://felladrin-minisearch.hf.space