matthew-kersting/clip-demo
clip-demo

Quick demo of the CLIP model

The CLIP model was proposed in Learning Transferable Visual Models From Natural Language Supervision (Radford et al., 2021). This project demonstrates CLIP applied to a zero-shot image classification task.
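To illustrate what "zero-shot classification" means here: CLIP scores an image against a set of candidate text labels by comparing their embeddings, with no task-specific training. Below is a minimal NumPy sketch of that scoring step; the embeddings are made-up toy vectors standing in for the outputs of CLIP's image and text encoders (real CLIP embeddings are 512-dimensional), and the logit scale of 100 mirrors the value CLIP typically learns.

```python
import numpy as np

# Toy stand-ins for CLIP encoder outputs (illustrative, not real embeddings).
image_emb = np.array([0.9, 0.1, 0.0])                 # one image
text_embs = np.array([[1.0, 0.0, 0.0],                # "a photo of a dog"
                      [0.0, 1.0, 0.0],                # "a photo of a cat"
                      [0.0, 0.0, 1.0]])               # "a photo of a car"

# L2-normalize, as CLIP does before computing similarities.
image_emb = image_emb / np.linalg.norm(image_emb)
text_embs = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)

# Cosine similarity between the image and each candidate label,
# scaled by CLIP's learned logit scale (roughly 100 after training).
logits = 100.0 * text_embs @ image_emb

# Softmax over the labels gives per-class probabilities.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(probs.argmax())  # index of the best-matching label
```

The label whose text embedding is most similar to the image embedding wins, which is why the candidate class names can be changed freely at inference time.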

The only dependency you should need is Docker.

Start the demo by running `docker-compose up` in the root of the project, then navigate to `localhost:8501`.
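For reference, a minimal `docker-compose.yml` for a setup like this might look as follows. This is a sketch only; the service name and build context are assumptions, and the repo's actual file may differ. Port 8501 is Streamlit's default, which matches the address above.

```yaml
version: "3.8"
services:
  clip-demo:            # illustrative service name
    build: .            # assumes a Dockerfile in the project root
    ports:
      - "8501:8501"     # expose Streamlit's default port as localhost:8501
```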
