AICI: Prompts as (Wasm) Programs
Updated Jul 25, 2024 - Rust
An LLM inference engine written in pure Rust and CUDA (still under development).
A minimalistic LLM-powered Telegram assistant written in Rust that uses a self-contained SQLite database and is very easy to install.
A lightweight and extensible LLM inference serving benchmark tool written in Rust.
A terminal-style user interface for chatting with AI characters using Llama LLMs, with all processing done locally.
ChatFlameBackend is a backend for chat applications built on the Candle ML framework, with a focus on the Mistral model.