
SkillHunter

SkillHunter is all about helping you and your mates identify the most in-demand skills in the job market so you can land a new position as soon as possible. Cut the bullshit and prepare yourself in the most efficient way – no need to learn everything!

What if you've been asking yourself the wrong question: "What do I need to learn?" A better one might be: "What don't I need to learn?" The latter saves you a ton of time by cutting out unnecessary work at the point where time is most valuable – the very beginning, when everything is still unknown – and that's exactly where SkillHunter shines.

Installation

Download this project to a machine with Docker Compose installed:

git clone https://github.com/spyker77/skillhunter.git
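Then switch into the downloaded project's folder (the folder name follows the repository name):

cd skillhunter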

Usage

Update the environment variables inside docker-compose.yml and run the following bash command from the downloaded project's folder. It will build the image (if it doesn't exist yet), then create and start the containers in detached mode.

Note ⚠️

Due to forced HTTPS in production, it might be a good idea to use "ENVIRONMENT=development" first – this will let you avoid SSL-related errors in the local browser.

docker-compose up -d
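If everything came up fine, the containers' status and the logs of the web service can be checked with the standard Docker Compose commands:

docker-compose ps
docker-compose logs -f web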

On the first run you may also need to apply migrations to the fresh database:

docker-compose exec web python manage.py migrate
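If you want to double-check which migrations have been applied, Django's built-in command works inside the container as well:

docker-compose exec web python manage.py showmigrations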

In order to run tests, try this:

docker-compose exec web pytest --cov --cov-report=term-missing
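While iterating, you can also run just a subset of the tests with pytest's usual selection flags – the keyword below is only an illustration:

docker-compose exec web pytest -k "scraper"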

Tada 🎉

By now you should be up and running. Open http://localhost:8000 in your browser. Note that to see the project in full color, you need to fill the database once by loading the list of job titles to parse and the skills to identify...

docker-compose exec web python manage.py loaddata jobs.json
docker-compose exec web python manage.py loaddata skills.json

...and run scrapers to collect initial data on available vacancies...

docker-compose exec web python manage.py scrape_hh
docker-compose exec web python manage.py scrape_indeed
docker-compose exec web python manage.py scrape_sh

...or run the scrapers periodically using cron (see the sketch below) and additionally clean the database of outdated records:

docker-compose exec web python manage.py purge_db
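If you go the cron route, the entries below are only an illustrative sketch: the repository path, schedule, and choice of commands are assumptions to adapt to your own setup (the -T flag disables pseudo-TTY allocation, which cron doesn't provide):

# Illustrative schedule – adjust the path, times and commands to your environment
0 3 * * * cd /path/to/skillhunter && docker-compose exec -T web python manage.py scrape_hh
0 4 * * * cd /path/to/skillhunter && docker-compose exec -T web python manage.py scrape_indeed
0 5 * * * cd /path/to/skillhunter && docker-compose exec -T web python manage.py scrape_sh
0 6 * * 0 cd /path/to/skillhunter && docker-compose exec -T web python manage.py purge_db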

Tech Stack

  • Docker
  • Python
  • Django
  • Django REST framework
  • Swagger UI
  • PostgreSQL
  • Redis
  • Tailwind CSS
  • Jinja
  • Beautiful Soup
  • aiohttp
  • Pytest
  • Travis CI

Contributing

Pull requests are really welcome. For major changes, please open an issue first to discuss what you would like to change.

Also, make sure to update tests as appropriate 🙏

License

This project is licensed under the terms of the MIT license.
