
[Architecture diagram: assets/architecture.svg]

🦙 Prompt Engineering with Ollama & Qwen3

A hands-on prompt engineering course powered entirely by local open-source models.
No API keys. No cloud costs. Just your machine and a lightweight LLM.



📖 Overview

This repository contains 6 structured Jupyter notebooks that teach core prompt engineering techniques using Ollama and the Qwen3 0.6B model. Everything runs locally on your machine: no OpenAI keys, no cloud dependencies.

Each notebook is self-contained, and the series progressively builds your skills from basic prompting principles to advanced text transformation.


πŸ—‚οΈ Repository Structure

```
prompt-engineering-ollama/
│
├── 📂 notebooks/
│   ├── 01-guidelines-for-prompting.ipynb       # Principles & tactics for effective prompts
│   ├── 02-iterative-prompt-development.ipynb   # Refine prompts through iteration
│   ├── 03-summarizing.ipynb                    # Summarize text with topic focus
│   ├── 04-inferring.ipynb                      # Sentiment analysis & topic extraction
│   ├── 05-expanding.ipynb                      # Generate tailored long-form content
│   └── 06-transforming.ipynb                   # Translation, grammar, tone & format
│
├── 📂 assets/
│   └── architecture.svg                        # Architecture diagram
│
├── 📂 docs/
│   └── SETUP.md                                # Detailed setup instructions
│
├── .gitignore
├── CONTRIBUTING.md
├── LICENSE
├── README.md
└── requirements.txt
```

🚀 Quick Start

Step 1: Install Ollama

Download and install from ollama.com/download (Windows, macOS, or Linux), then verify the install:

```shell
ollama --version
```

Step 2: Pull the Model & Start the Server

```shell
ollama pull qwen3:0.6b
ollama serve
```

> **Note:** Leave the server running in a separate terminal. It listens on http://localhost:11434.
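To confirm the server is reachable before opening the notebooks, you can poll its `/api/tags` endpoint, which lists the models you have pulled. A minimal sketch using only the Python standard library:

```python
# Health check: Ollama answers GET /api/tags when it is up;
# a connection error means `ollama serve` isn't running yet.
import urllib.request
import urllib.error

def ollama_ready(base_url="http://localhost:11434"):
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as r:
            return r.status == 200
    except (urllib.error.URLError, OSError):
        return False

print(ollama_ready())  # True once the server is running
```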

Step 3: Clone & Set Up the Project

```shell
git clone https://github.com/<your-username>/prompt-engineering-ollama.git
cd prompt-engineering-ollama

pip install -r requirements.txt
jupyter notebook
```
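The notebooks only need an HTTP client and Jupyter, so requirements.txt is expected to look roughly like the sketch below; check the file in the repo for the exact packages and version pins.

```
requests>=2.28
jupyter>=1.0
```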

Step 4: Open Any Notebook & Run

Navigate to the notebooks/ folder in Jupyter and start with 01-guidelines-for-prompting.ipynb.


📓 Lab Notebooks

| # | Notebook | What You'll Learn |
|----|----------|-------------------|
| 01 | Guidelines for Prompting | Two core principles: write clear instructions & give the model time to think |
| 02 | Iterative Prompt Development | Systematically refine prompts using a product fact sheet as a case study |
| 03 | Summarizing | Condense text with word limits, topic focus, and extract-vs-summarize strategies |
| 04 | Inferring | Detect sentiment, extract emotions, identify topics from reviews and articles |
| 05 | Expanding | Generate personalized customer service emails from review context |
| 06 | Transforming | Translate languages, fix grammar, adjust tone, and convert formats |
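As a taste of notebook 01, the "write clear instructions" principle is commonly applied by wrapping the input text in delimiters so the model cannot confuse it with the instructions. A minimal sketch (the notebook's own examples may differ):

```python
# Delimiter tactic: keep the instruction separate from the (untrusted) input.
text = "Ollama lets you run open-source language models on your own machine."

prompt = (
    "Summarize the text delimited by <article> tags in at most 10 words.\n"
    f"<article>{text}</article>"
)
print(prompt)  # pass this string to get_completion(prompt)
```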

βš™οΈ How It Works

Every notebook uses a shared helper function that sends prompts to Ollama's local REST API:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "qwen3:0.6b"

def get_completion(prompt, temperature=0.2):
    """Send a single prompt to the local Ollama server and return the text."""
    payload = {
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
        "options": {"temperature": temperature}
    }
    # Generous timeout: local CPU inference can take a while on long prompts.
    r = requests.post(OLLAMA_URL, json=payload, timeout=300)
    r.raise_for_status()
    return r.json()["response"]
```

Data flow: Notebook → Python requests → Ollama REST API → Qwen3 inference → JSON response → Notebook output


πŸ› οΈ System Requirements

| Component | Minimum | Recommended |
|-----------|---------|-------------|
| OS | Windows 10 / macOS 12 / Ubuntu 20.04 | Latest stable |
| RAM | 4 GB | 8 GB+ |
| Python | 3.9 | 3.10–3.12 |
| Disk | ~1 GB (model + deps) | 2 GB+ |

πŸ› Troubleshooting

| Problem | Solution |
|---------|----------|
| Connection refused | Make sure `ollama serve` is running in a separate terminal |
| Model not found | Run `ollama pull qwen3:0.6b` |
| Port conflict on 11434 | Stop other Ollama instances or reboot |
| Slow responses | Close heavy applications to free RAM |
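For the port-conflict row, a quick way to see whether something is already bound to Ollama's default port is a plain socket probe (a stdlib sketch; `lsof -i :11434` on macOS/Linux gives the same answer with the process name):

```python
# Returns True if some process is already listening on the given port.
import socket

def port_in_use(port=11434, host="127.0.0.1"):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        return s.connect_ex((host, port)) == 0

print(port_in_use())  # True means Ollama (or something else) holds the port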

🤝 Contributing

Contributions are welcome! Please read CONTRIBUTING.md for guidelines.


📄 License

This project is licensed under the MIT License; see the LICENSE file for details.


Built with ❤️ for open-source AI education
