Bibliotype is a web application that generates a personalised “Reading DNA” dashboard from a user's Goodreads or StoryGraph export file and provides visual insights into reading habits and preferences.
The app uses a Python backend with Pandas for data analysis and calls the Gemini API to generate a creative, AI-powered vibe for each user's unique reading taste.
## Prioritised

### Ongoing
- Settings modal
- Lock down canonical genres and improve mapping (improve the fiction/non-fiction split)
- StoryGraph support ✅ follow-ups:
  - Concurrent uploads stuck at 50%; revoke the previous upload
  - Reader type doesn't update live during enrichment
- Polling backoff after 90% complete: currently a fixed 5s. Stretch to 10–15s once percent > 90% to reduce DB churn during the long tail.
- `enrichmentPoller` lifecycle (`x-show` → `x-if`): banner DOM elements stay rendered when complete; switching to `x-if` would unmount cleanly and remove the dead `$store.enrichment` references.
- Existing-book title-update policy: `update_or_create` on an ISBN match keeps the oldest row's title. Decide: update titles on match, or document keep-oldest as the invariant. Likely intentional; investigate before changing.
- Probe absolute positioning
  - The cover-art probe element from the earlier UI review needs a designer pass to coordinate with the comparative-analytics tile.
- Skeleton/placeholder cohesion pass (handoff §3)
  - The three skeleton templates work but have inconsistent copy ("Still figuring out…" / "Still discovering…" / "Still fetching…") and incoherent design.
### Todo

- Add a "see password" button; the "forgot password" link is too close to the input
- Fix the button hover animation
- Adjust the reader type calculations. Different colour for different types? Pixel-square background for the banner? Animated?
- (Double check, but probably fine) Check that when a user generates their DNA, the book covers get set and load; right now they only appear after refreshing.
- (Need to link methodology pages or something) Update the sources for comparative analytics
- Improve the AI-generated vibe; add LLM metrics with PostHog
- Save the LLM vibe against the DNA dictionary, cache it for a month, and only refetch if the dictionary has changed
  - User A uploads a CSV; a vibe is generated
  - User B uploads the same CSV; we want to reuse that same vibe instead of hitting Gemini again
  - So we need to keep a mapping of dictionary hashes to LLM vibes and reuse the cached vibe when the same dictionary is presented again
  - This would also prevent several LLM vibes being generated during testing
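The caching idea above boils down to a hash-keyed lookup. A minimal in-memory sketch — `get_or_generate_vibe` and the `generate` callback are hypothetical names, and the real version would persist in the database with a roughly one-month TTL:

```python
import hashlib
import json

# Hypothetical in-memory cache; the real version would be a DB table
# keyed by the dictionary hash, with a ~1 month expiry.
_vibe_cache: dict[str, str] = {}

def dictionary_hash(dna: dict) -> str:
    """Stable hash of the DNA dictionary: same stats -> same key."""
    canonical = json.dumps(dna, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def get_or_generate_vibe(dna: dict, generate) -> str:
    """Return the cached vibe for this DNA dictionary, or generate one.

    `generate` stands in for the Gemini call; it is only invoked on a
    cache miss, so identical CSV uploads reuse the same vibe.
    """
    key = dictionary_hash(dna)
    if key not in _vibe_cache:
        _vibe_cache[key] = generate(dna)
    return _vibe_cache[key]
```

Serialising with `sort_keys=True` means two users whose exports produce the same statistics hash to the same key regardless of dictionary key order.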
- Allow the user to delete their profile. A settings panel, maybe? "Make private" could go there too, along with opting out of recs; display name, email update, and change-password functionality could also live there.
  - Make sure profiles are public by default
  - Allow users to opt out of recommendations
- "How similar are you" / similarity percentage for two or more people
  - Add a page for this
  - Allow similarity comparison with multiple users (public profiles only)
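One candidate metric for the similarity page is Jaccard similarity over each user's set of books, which generalises naturally to two or more users. A sketch under that assumption (the function name and inputs are illustrative):

```python
def similarity_percentage(shelves: list[set[str]]) -> float:
    """Jaccard similarity across two or more users' book sets.

    Returns the share of books common to every user out of all books
    any of them has read, as a percentage. `shelves` holds each
    user's set of book identifiers.
    """
    if len(shelves) < 2:
        raise ValueError("need at least two users to compare")
    common = set.intersection(*shelves)
    total = set.union(*shelves)
    if not total:
        return 0.0
    return 100.0 * len(common) / len(total)
```

A weighted variant (e.g. factoring in ratings or genres) could replace this later without changing the page's contract.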
- Sign-up form: validate password and everything else on blur (done)
- Long author names and genre names cut off the count when hovering on the chart
## Local Development Setup (Docker)

This is the recommended method for local development. It creates a consistent, isolated environment with a dedicated PostgreSQL database, mirroring a production setup.

**Prerequisites:**

- Docker and Docker Compose
- Poetry
- **Clone the repository:**

  ```bash
  git clone https://github.com/your-username/bibliotype.git
  cd bibliotype
  ```
- **Create your environment file:** Create a file named `.env` in the project root. This file is ignored by Git and will hold your secret keys.

  ```bash
  # .env
  SECRET_KEY="generate-a-new-secret-key"
  GEMINI_API_KEY="your-real-gemini-api-key"

  # Credentials for the local PostgreSQL container
  POSTGRES_DB=bibliotype_db
  POSTGRES_USER=bibliotype_user
  POSTGRES_PASSWORD=yoursecurepassword123
  ```
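If you need a value for `SECRET_KEY`, one way to generate a random 50-character key with only the standard library (Django also ships `get_random_secret_key()` in `django.core.management.utils` for the same purpose):

```python
import secrets

# Same alphabet/length shape as Django's default secret keys.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*(-_=+)"

def generate_secret_key(length: int = 50) -> str:
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_secret_key())
```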
- **Build and run the containers:** From the project root, run the following command. The `-d` flag runs the services in the background.

  ```bash
  docker-compose -f docker-compose.local.yml up --build -d
  ```
The first time you start the Docker environment, you need to set up the database. Open a new terminal window and run these commands:
- **Apply database migrations:** This command creates all the necessary tables in the new PostgreSQL database.

  ```bash
  docker-compose -f docker-compose.local.yml exec web poetry run python manage.py migrate
  ```
- **Load initial data:** This command populates the database with a large catalog of books and pre-calculated community analytics from a local fixture file. This is the fastest way to get started.

  ```bash
  docker-compose -f docker-compose.local.yml exec web poetry run python manage.py loaddata core/fixtures/initial_data.json
  ```
- **Create a superuser:** This allows you to access the Django admin panel at `/admin/`.

  ```bash
  docker-compose -f docker-compose.local.yml exec web poetry run python manage.py createsuperuser
  ```
You can now access the application at http://127.0.0.1:8000.
If you update the book list in `seed_books.py` and want to regenerate the `initial_data.json` fixture, follow these steps:
```bash
docker-compose -f docker-compose.local.yml down -v
docker-compose -f docker-compose.local.yml up --build -d
docker-compose -f docker-compose.local.yml exec web poetry run python manage.py migrate
docker-compose -f docker-compose.local.yml exec web poetry run python manage.py seed_books
docker-compose -f docker-compose.local.yml exec web poetry run python manage.py seed_analytics
docker-compose -f docker-compose.local.yml exec web poetry run python manage.py dumpdata core.Book core.Author core.Genre core.AggregateAnalytics --indent 2 > core/fixtures/initial_data.json
```

Then commit the updated `initial_data.json` file to Git.
## Production Deployment

This guide outlines the steps to deploy the application to a production environment on a fresh Ubuntu 22.04 server (e.g., a DigitalOcean VPS). The stack uses Docker Compose, Nginx as a reverse proxy, and GitHub Actions for fully automated CI/CD.
- **Create an Ubuntu 22.04 server:** Provision a new VPS and ensure you can connect via SSH using your public key.
- **Create a deployment user:**
  - Log in as `root` and create a dedicated non-root user for the deployment.

    ```bash
    # Replace 'deploy' with your preferred username
    adduser deploy
    ```
  - Grant this user `sudo` privileges and add them to the `docker` group (this requires installing Docker first; see the next step).

    ```bash
    usermod -aG sudo deploy
    ```
  - Copy your SSH key to the new user so you can log in directly:

    ```bash
    # This command copies the keys from the root user
    rsync --archive --chown=deploy:deploy ~/.ssh /home/deploy/
    ```
  - Log out and log back in as your new `deploy` user.
- **Install software & configure the firewall:**
  - Install Docker, Nginx, and Certbot.

    ```bash
    # Install Docker
    curl -fsSL https://get.docker.com -o get-docker.sh
    sudo sh get-docker.sh
    sudo usermod -aG docker $USER  # Add current user to docker group

    # Install Nginx and Certbot
    sudo apt update
    sudo apt install nginx python3-certbot-nginx -y
    ```
  - Configure the firewall to allow web and SSH traffic.

    ```bash
    sudo ufw allow OpenSSH
    sudo ufw allow 'Nginx Full'
    sudo ufw enable
    ```
  - **Important:** Log out and log back in to apply the Docker group permissions. Verify with `docker ps`.
- **Clone the repository:**

  ```bash
  git clone https://github.com/your-username/bibliotype.git app
  cd app
  ```
- **Create the production `.env` file:** Create a new `.env` file for production secrets. Use strong, unique credentials.

  ```bash
  nano .env
  ```

  Paste and edit the following content:

  ```bash
  # .env (Production)
  SECRET_KEY="..."
  GEMINI_API_KEY="..."
  POSTGRES_DB=bibliotype_prod_db
  POSTGRES_USER=bibliotype_prod_user
  POSTGRES_PASSWORD="..."
  DEBUG=False
  ALLOWED_HOSTS="your_domain.com,www.your_domain.com"
  ```
- **Create the static files directory:** This empty folder on the host will be mapped into the container so Nginx can access the collected static files.

  ```bash
  mkdir staticfiles
  ```
- **Create the Nginx config:** Create a new configuration file for your site.

  ```bash
  sudo nano /etc/nginx/sites-available/bibliotype
  ```

  Paste the following configuration, replacing `your_domain.com` and `/home/deploy/app/` with your actual values.

  ```nginx
  server {
      listen 80;
      server_name your_domain.com www.your_domain.com;
      return 301 https://$host$request_uri;
  }

  server {
      listen 443 ssl http2;
      server_name your_domain.com www.your_domain.com;

      # Path for static files
      location /static/ {
          alias /home/deploy/app/staticfiles/;
      }

      # Proxy requests to the Django app
      location / {
          proxy_pass http://127.0.0.1:8000;
          proxy_set_header Host $host;
          proxy_set_header X-Real-IP $remote_addr;
          proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
          proxy_set_header X-Forwarded-Proto $scheme;
      }

      # SSL settings will be added by Certbot below
  }
  ```
- **Enable the site & get an SSL certificate:**
  - Activate the configuration by creating a symlink.

    ```bash
    sudo ln -s /etc/nginx/sites-available/bibliotype /etc/nginx/sites-enabled/
    ```
  - Point your domain's DNS A records to your server's IP address.
  - Run Certbot to obtain an SSL certificate and automatically update the Nginx config.

    ```bash
    sudo certbot --nginx -d your_domain.com -d www.your_domain.com
    ```
- **Create a deploy-specific SSH key:** On your local machine, create a new SSH key pair dedicated to this deployment. Do not use your personal key.

  ```bash
  ssh-keygen -t ed25519 -C "github-deploy-bibliotype" -f ~/.ssh/bibliotype_deploy_key
  ```
- **Add the public key to the server:** Copy the content of the public key (`cat ~/.ssh/bibliotype_deploy_key.pub`) and paste it as a new line in your server's `/home/deploy/.ssh/authorized_keys` file.
Add Secrets to GitHub Repository: Go to
Your Repo > Settings > Secrets and variables > Actionsand add the following repository secrets:DO_SSH_HOST: Your server's IP address.DO_SSH_USERNAME: Your deployment username (e.g.,deploy).DO_SSH_KEY: The content of the private key (cat ~/.ssh/bibliotype_deploy_key).DOCKERHUB_USERNAME: Your Docker Hub username.DOCKERHUB_TOKEN: A Docker Hub access token.
- **Configure passwordless `sudo`:** The deployment script needs to fix file permissions, so allow your deploy user to run `sudo` without a password.
  - Run `sudo visudo` on your server.
  - Add this line at the very bottom of the file (replace `deploy` if you used a different username):

    ```
    deploy ALL=(ALL) NOPASSWD: ALL
    ```
Commit your final `docker-compose.prod.yml`, `.github/workflows/deploy.yml`, and `settings.py` files to your repository and push to the `main` branch.

```bash
git push origin main
```

The GitHub Action will now run and automatically build, test, and deploy your application. Subsequent pushes to `main` will automatically update the live site.