This is the repository holding code and data for "FrugalML: How to Use ML Prediction APIs More Accurately and Cheaply".
Updated Jan 12, 2021 - Python
AccIo - Enterprise LLM: Unifying intelligence at your command!
A Python-based WebSocket server for CLI LLaVA inference.
HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformers. Start hugging for NLP now! 😊 HugNLP will be released to @HugAILab.
Logical verification of probabilistic/language model 'intuitions'.
Inference Llama 2 in one file of pure C
Specify what you want it to build, the AI asks for clarification, and then builds it.
Automating the deployment of the Takeoff Server on AWS for LLMs
A workflow for training T5 language models.
Experimental autonomous AI LLM & RAG IETF reviewer
Code and analysis for optimizing dynamic neural networks, investigating and implementing various optimization techniques to enhance them.
Plug-and-play implementation of "Tree of Thoughts: Deliberate Problem Solving with Large Language Models", which elevates model reasoning by at least 70%.
A framework for multiple LLM models to operate in a non-adversarial fashion based on the structure of a bee colony working together to maintain a hive.
Trying out Google's new Gemma large language model.
OpenAI's Code Interpreter in your terminal, running locally
Deploy your favourite LLM onto Kubernetes.