Bookworm is a FastAPI-based book management API with an AI recommendation system, backed by Postgres as the database. It's easy to deploy, manage, and scale. Since the project is solo-maintained, GitHub Copilot is used as a reviewer for basic code reviews.
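The core domain object is a book record. As a rough illustration, a book schema might look like the following sketch (the field names here are assumptions for illustration, not the project's actual models):

```python
from dataclasses import dataclass, field, asdict

@dataclass
class Book:
    # Hypothetical fields; the real schema lives in the project's models.
    id: int
    title: str
    author: str
    genres: list[str] = field(default_factory=list)

book = Book(id=1, title="Dune", author="Frank Herbert", genres=["sci-fi"])
print(asdict(book)["title"])  # -> Dune
```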
Before you begin, you'll need to install one piece of software: Docker is required to run the application. Visit the Docker Desktop page and follow the installation steps.
**Important:** For the application to deploy, the `.env.dev` file must be configured properly. Check `env/README.md` before proceeding.
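For orientation, a hypothetical `.env.dev` might look like the following. The variable names below are illustrative assumptions; the authoritative list is in `env/README.md`:

```sh
# Hypothetical values -- consult env/README.md for the real variable names.
POSTGRES_USER=postgres
POSTGRES_PASSWORD=changeme
POSTGRES_DB=books
LLM_MODEL=llama3.2
```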
The following commands help developers spin up the Bookworm local dev server. There are two options for running Bookworm: using Docker or using Make.
**Using Docker**

Tear down any previous stack, including volumes:

```sh
docker compose down -v
```

Build and set up the application:

```sh
docker compose up -d --build
```

This command will do the following:

- Build the Docker images and deploy the `api`, `db`, `ai`, and `redis` services.
- Start up the database.
- Run the API using uvicorn on port `8000`.
Pull the LLM model:

```sh
docker exec -it bookworm-ai ollama pull llama3.2
```

**Note:** Change the LLM model name according to the value provided in the `.env.*` file; the default is set to `llama3.2`.
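Since the model name is driven by configuration, the application presumably resolves it from the environment with a fallback to the default. A minimal sketch of that pattern (the variable name `LLM_MODEL` is an assumption, not confirmed by the project):

```python
import os

def resolve_model_name() -> str:
    # "LLM_MODEL" is a hypothetical variable name; the real one is set in .env.*.
    # Falls back to the documented default when the variable is unset.
    return os.environ.get("LLM_MODEL", "llama3.2")

print(resolve_model_name())
```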
Load the database with the sample book data:

```sh
docker exec -i pg-db psql -U postgres books < utils/data/db_loader.sql
```

**Using Make**

```sh
make destroy
make setup
```
**Note:** `make setup` is a collection of three targets: it builds the application first, `pull-model` downloads the LLM from the Ollama hub, and finally `load_data` loads the database with some book data. Additionally, a `books_admin` user is added out of the box.
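The composition described above could be expressed in a Makefile roughly as follows. This is a hypothetical sketch assuming a `build` target name; only `setup`, `pull-model`, `load_data`, and `destroy` are confirmed by this README:

```make
# Hypothetical sketch; the project's real Makefile is the source of truth.
setup: build pull-model load_data

build:
	docker compose up -d --build

pull-model:
	docker exec -it bookworm-ai ollama pull llama3.2

load_data:
	docker exec -i pg-db psql -U postgres books < utils/data/db_loader.sql
```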
- Use the `uv` or `poetry` package for dependency management.
- Implement `pgai` for book recommendation.
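Moving recommendations to `pgai` would push embedding work into Postgres itself, but the underlying idea is vector similarity. A minimal, pgai-independent sketch of that idea using cosine similarity over toy embeddings (all vectors and titles below are made up for illustration):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Toy embeddings standing in for real model output.
catalog = {
    "Dune": [0.9, 0.1, 0.0],
    "Foundation": [0.8, 0.2, 0.1],
    "Pride and Prejudice": [0.1, 0.9, 0.2],
}

def recommend(query: list[float], top_k: int = 2) -> list[str]:
    # Rank catalog titles by similarity to the query embedding.
    ranked = sorted(catalog, key=lambda t: cosine_similarity(query, catalog[t]), reverse=True)
    return ranked[:top_k]

print(recommend([0.85, 0.15, 0.05]))  # -> ['Dune', 'Foundation']
```

In production this ranking would happen inside Postgres via a vector index rather than in Python, which is the motivation for adopting `pgai`.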