This document contains a structured set of homework assignments of varying difficulty levels for students learning DevOps practices using the DevOps Demo project as an example. The assignments cover backend development, frontend development, infrastructure configuration, and observability.
Level 1: Beginner
- Adding simple fields (description, created_at, is_active)
- Basic CRUD operations and validation

Level 2: Intermediate
- Updating dependencies and Python version
- Implementing pagination and filtering
- UI/UX improvements

Level 3: Advanced
- Complex fields (tags, priority, due_date, attachments)
- Extended API functionality
- Additional metrics and dashboards

Level 4: Expert
- Workflow and item states
- User system and authorization
- Comments and JSON metadata
- Extended observability

Contents:
- Level 1: Beginner
- Level 2: Intermediate
- Level 3: Advanced
- Level 4: Expert
- Additional Tasks (Optional)
- Execution Instructions
- Evaluation Criteria
- Tips
Tasks at this level are designed for beginners and cover basic operations for adding simple fields to the Item model, creating migrations, and basic integration with frontend and infrastructure components.
Objective: Learn to add new fields to an existing model, create migrations, and integrate changes between backend, frontend, and infrastructure components.
Backend Tasks:

1. Update Model:
   - Open `backend/app/models.py`
   - Add field `description: Mapped[str | None] = mapped_column(String(500), nullable=True)` to the `Item` model
   - Ensure the field has the correct type and length constraints

2. Create Migration:
   - Generate the migration: `alembic revision --autogenerate -m "add_description_to_items"`
   - Check the generated migration file in `backend/alembic/versions/`
   - Ensure the migration adds column `description VARCHAR(500) NULL` to the `items` table

3. Update Pydantic Schemas:
   - Open `backend/app/schemas.py`
   - Add field `description: str | None = None` to the `ItemCreate` schema
   - Add field `description: str | None` to the `ItemOut` schema
   - Add validation `Field(None, max_length=500)` to limit the length

4. Update CRUD Operations:
   - Verify that the CRUD operations in `backend/app/crud.py` work with the new field automatically
   - If needed, add explicit handling for the new field

5. API-Level Validation:
   - Ensure Pydantic automatically validates the maximum length
   - Add tests for the validation (optional)
Frontend Tasks:

1. Update Creation Form:
   - Open `frontend/src/App.jsx`
   - Add a `<textarea>` element for entering the description
   - Add state for the description value: `const [description, setDescription] = useState('')`
   - Add a change handler: `onChange={(e) => setDescription(e.target.value)}`

2. Display Description:
   - Update the item list to show the description
   - Limit the displayed length (e.g., first 100 characters)
   - Add a "Read more" button to expand long descriptions

3. Frontend Validation:
   - Add a maximum length check (500 characters)
   - Show a character counter: `{description.length}/500`
   - Block form submission if the description exceeds the limit

4. Update API Requests:
   - Ensure `description` is included in the POST request when creating an item
   - Verify the description is displayed after fetching items from the API
Infrastructure Tasks:

1. Automatic Migration Application:
   - Verify that `docker-compose.yml` contains the command for applying migrations on startup
   - The command should be: `python -m alembic -c /app/alembic.ini upgrade head`

2. Prometheus Metrics:
   - Open `backend/app/metrics.py`
   - Add a metric for items with a description: `items_with_description = Counter('items_with_description_total', 'Items with description')`
   - Add a metric for items without one: `items_without_description = Counter('items_without_description_total', 'Items without description')`
   - Update the CRUD operations to increment the corresponding metrics

3. Grafana Dashboard:
   - Open Grafana (http://localhost:3000)
   - Find the "DevOps Demo Dashboard"
   - Add a new panel "Items with/without Description"
   - Use the queries `items_with_description_total` and `items_without_description_total`
   - Create a pie chart or bar chart for visualization
Expected Result:
- User can add items with a description through the frontend form
- The description is displayed in the item list with the ability to expand the full text
- The migration is applied automatically on `make up`
- Metrics are displayed in the Grafana dashboard
- Validation works on both backend and frontend

Verification:

```bash
# Apply seed data
make seed

# Start services
make up

# Check in browser (http://localhost:8080)
# - Description field is displayed in the form
# - Description is displayed in the item list

# Check in Grafana (http://localhost:3000)
# - New metric is displayed on the dashboard
```

Objective: Learn to work with dates and times, add automatic value setting on record creation, and implement date sorting.
Backend Tasks:

1. Update Model:
   - Add field `created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now())` to the `Item` model
   - Use `func.now()` to set the current time automatically on creation
   - Use `timezone=True` for timezone-aware storage

2. Create Migration:
   - Create the migration: `alembic revision --autogenerate -m "add_created_at_to_items"`
   - For existing records, set a default value: `op.execute("UPDATE items SET created_at = NOW() WHERE created_at IS NULL")`

3. Update Schemas:
   - Add `created_at: datetime | None` to the `ItemOut` schema
   - Do not add it to `ItemCreate` (it is set automatically on the backend)

4. Sorting Endpoint:
   - Update the `GET /items` endpoint to support a `sort` parameter
   - Add validation: `sort: str | None = Query(None, regex="^(created_at|name):(asc|desc)$")`
   - Implement sorting via SQLAlchemy: `order_by(Item.created_at.desc() if sort == "created_at:desc" else Item.created_at.asc())`
Frontend Tasks:

1. Display Date:
   - Install a date library: `npm install date-fns`
   - Import functions: `import { formatDistanceToNow, format } from 'date-fns'`
   - Display the date in a convenient format: `formatDistanceToNow(new Date(item.created_at), { addSuffix: true })`
   - Alternatively: `format(new Date(item.created_at), 'dd.MM.yyyy HH:mm')`

2. Date Sorting:
   - Add a "Sort by date" button in the UI
   - Add state for the sort direction: `const [sortBy, setSortBy] = useState('created_at:desc')`
   - Update the API request to include the parameter: `?sort=${sortBy}`
   - Toggle the sort direction on click: `setSortBy(sortBy === 'created_at:desc' ? 'created_at:asc' : 'created_at:desc')`

3. Visual Indication:
   - Show an arrow icon to indicate the sort direction
   - Highlight the active sort button
Infrastructure Tasks:

1. Performance Index:
   - Add an index on the `created_at` column in the migration
   - Use: `op.create_index('ix_items_created_at', 'items', ['created_at'])`
   - This improves the performance of sorted queries

2. Prometheus Metrics:
   - Add a metric for item age: `items_age_days = Histogram('items_age_days', 'Age of items in days', buckets=[1, 7, 30, 90, 365])`
   - Calculate the age: `(datetime.now(timezone.utc) - item.created_at).days`
   - Update the endpoint to calculate and record the metric

3. Grafana Dashboard:
   - Add an "Items Age Distribution" panel with a histogram
   - Add an "Items Created Over Time" panel with a time series graph
   - Use the query `rate(items_created_total[5m])` for the item creation graph
Tip: Use `func.now()` in SQLAlchemy for automatic creation time setting; on the Python side prefer `datetime.now(timezone.utc)` (`datetime.utcnow()` is deprecated since Python 3.12).
Objective: Learn to implement soft delete through archiving, add filtering, and manage record state.
Backend Tasks:

1. Update Model:
   - Add field `is_active: Mapped[bool] = mapped_column(Boolean, default=True, nullable=False)` to the `Item` model
   - Default value: `True` (active)

2. Create Migration:
   - Create the migration: `alembic revision --autogenerate -m "add_is_active_to_items"`
   - For existing records, set `is_active = True`: `op.execute("UPDATE items SET is_active = TRUE WHERE is_active IS NULL")`

3. Archive Endpoint:
   - Add a `PATCH /items/{item_id}/archive` endpoint
   - Set `is_active = False` for the archived item
   - Return 404 if the item is not found
   - Return 400 if the item is already archived

4. Unarchive Endpoint:
   - Add a `PATCH /items/{item_id}/unarchive` endpoint
   - Set `is_active = True` for the unarchived item

5. Filtering in GET /items:
   - Add parameter `include_archived: bool = Query(False)`
   - By default, show only active items: `filter(Item.is_active == True)`
   - If `include_archived=True`, show all items
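The archive endpoint's decision logic (404 / 400 / success) can be written down independently of FastAPI. A minimal sketch, modeling items as plain dicts for illustration:

```python
def archive_item(items: dict[int, dict], item_id: int) -> int:
    """Return the HTTP status the archive endpoint would produce:
    404 for an unknown id, 400 if already archived, 200 on success.

    `items` maps id -> {"is_active": bool, ...}; in the real handler
    the lookup and update go through the CRUD layer instead."""
    item = items.get(item_id)
    if item is None:
        return 404
    if not item["is_active"]:
        return 400
    item["is_active"] = False
    return 200
```

The unarchive endpoint is the mirror image: flip the already-inactive check and set `is_active = True`.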
Frontend Tasks:

1. Archive Button:
   - Add an "Archive" button for each item
   - On click, call `PATCH /items/{item_id}/archive`
   - After successful archiving, refresh the item list

2. Archived Toggle:
   - Add a checkbox or toggle "Show archived"
   - When enabled, add the parameter `include_archived=true` to the API request
   - Refresh the item list when the toggle changes

3. Visual Display:
   - Display archived items in gray
   - Add strikethrough text for archived items
   - Show an "Archived" badge next to archived items

4. Unarchiving:
   - Add an "Unarchive" button for archived items
   - On click, call `PATCH /items/{item_id}/unarchive`
Infrastructure Tasks:

1. Prometheus Metrics:
   - Add a metric: `items_total = Counter('items_total', 'Total items', ['status'])`
   - Increment `items_total.labels(status='active')` and `items_total.labels(status='archived')`
   - Update the CRUD operations to record the metrics

2. Grafana Dashboard:
   - Add an "Active vs Archived Items" panel with a pie chart
   - Use the queries `items_total{status="active"}` and `items_total{status="archived"}`
   - Add a stat panel with the current count of active and archived items

3. Grafana Alert:
   - Create an alert rule: `items_total{status="archived"} > 1000`
   - Configure a notification channel (email, Slack, etc.)
   - The alert should trigger when the archived items count exceeds 1000
Tasks at this level require deeper understanding of the system, including dependency updates, implementation of more complex logic, and performance improvements.
Objective: Learn to update project dependencies, check version compatibility, and verify functionality after updates.
Backend Tasks:

1. Check Current Versions:
   - Open `backend/pyproject.toml`
   - Check the current versions of all dependencies
   - Use `pip list --outdated` to find outdated packages

2. Update Dependencies:
   - Check the latest stable versions on PyPI
   - Update the versions in `pyproject.toml` or `requirements.txt`
   - For production, pin specific versions (not `latest`)
   - Example: `fastapi = "^0.115.0"` → `fastapi = "^0.120.0"`

3. Compatibility Check:
   - Check the changelogs of updated packages for breaking changes
   - Pay special attention to major versions (1.0 → 2.0)

4. Testing After Update:
   - Install the updated dependencies: `pip install -e ".[dev]"`
   - Run the tests: `make test-backend`
   - Check linting: `make lint-backend`
   - Check type checking: `make type-check`
Infrastructure Tasks:

1. Update Docker Images:
   - Check the current versions of base images on Docker Hub
   - Update `FROM python:3.12-slim` if newer versions are available
   - Check image sizes before and after the update

2. Rebuild Images:
   - Execute: `make build-image`
   - Check the size: `make image-size`
   - Compare sizes before and after the update

3. Update docker-compose:
   - Check whether changes are needed in `docker-compose.yml`
   - Update service versions if needed (Prometheus, Grafana, etc.)

4. Functionality Check:
   - Start the services: `make up`
   - Check the health checks of all services
   - Verify all endpoints work correctly
Tip: Use `pip list --outdated` or check PyPI to find new versions.

Expected Result:
- All dependencies updated to the latest stable versions
- Docker images rebuilt and working correctly
- Tests pass: `make test-docker`
- Image sizes have not increased significantly (or have even decreased)
Objective: Learn to update Python version, check code and dependency compatibility with new version.
Backend Tasks:

1. Update Configuration:
   - Open `backend/pyproject.toml`
   - Update `requires-python = ">=3.13"`
   - Update `target-version = ["py313"]` in `.ruff.toml`
   - Update `python_version = "3.13"` in the `[tool.mypy]` section of `pyproject.toml`

2. Code Check:
   - Verify the code works on Python 3.13
   - Fix all deprecation warnings
   - Check out new Python 3.13 features (if needed)

3. Testing:
   - Install Python 3.13 locally
   - Create a new virtualenv: `python3.13 -m venv .venv`
   - Install the dependencies: `pip install -e ".[dev]"`
   - Run the tests: `make test-backend`
Infrastructure Tasks:

1. Update Dockerfile:
   - Open `backend/Dockerfile`
   - Update the base image: `FROM python:3.13-slim`
   - Verify all commands work with Python 3.13

2. Rebuild Images:
   - Rebuild the Docker images: `make build-image`
   - Check the image sizes
   - Start the services: `make up`

3. Update Documentation:
   - Update `README.md` with the new Python version
   - Update `docs/prerequisites.md` with the new version
   - Update `docs/local-setup.md` with instructions for Python 3.13

4. Functionality Check:
   - Run the tests in Docker: `make test-docker`
   - Check the Python version in the container: `docker compose exec api python --version`
   - It should show: `Python 3.13.x`
Verification:

```bash
make build-image
make up
make test-docker
docker compose exec api python --version  # Should show 3.13.x
```

Objective: Learn to use enums for field value constraints, implement filtering on backend and frontend, and add metrics for categories.
Backend Tasks:

1. Create Enum:
   - Create an enum for categories: `class ItemCategory(str, Enum): WORK = "work"; PERSONAL = "personal"; SHOPPING = "shopping"; OTHER = "other"`
   - Use `str, Enum` for compatibility with Pydantic and SQLAlchemy

2. Update Model:
   - Add field `category: Mapped[ItemCategory] = mapped_column(Enum(ItemCategory), nullable=False)` to the `Item` model
   - Set a default value: `default=ItemCategory.OTHER`

3. Create Migration:
   - Create the migration: `alembic revision --autogenerate -m "add_category_to_items"`
   - For existing records, set the default value

4. Filtering Endpoint:
   - Update `GET /items` to support the parameter `category: ItemCategory | None = Query(None)`
   - Add filtering: `filter(Item.category == category)` if a category is specified

5. Categories List Endpoint:
   - Add a `GET /categories` endpoint
   - Return the list of all available categories: `[{"value": "work", "label": "Work"}, ...]`
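The enum and the `GET /categories` payload fit together like this; a self-contained sketch (the `list_categories` helper name is illustrative, and deriving the label by capitalizing the value is an assumption — real labels may be curated):

```python
from enum import Enum


class ItemCategory(str, Enum):
    """str + Enum base lets the values serialize as plain strings
    in Pydantic schemas and SQLAlchemy Enum() columns."""
    WORK = "work"
    PERSONAL = "personal"
    SHOPPING = "shopping"
    OTHER = "other"


def list_categories() -> list[dict[str, str]]:
    """Payload for the GET /categories endpoint."""
    return [{"value": c.value, "label": c.value.capitalize()} for c in ItemCategory]
```

Because `ItemCategory` subclasses `str`, FastAPI can use it directly as a query parameter type and will reject unknown values with a 422.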
Frontend Tasks:

1. Category Selection:
   - Add a `<select>` element for choosing the category when creating an item
   - Options: Work, Personal, Shopping, Other
   - Add state: `const [category, setCategory] = useState('other')`

2. Category Filtering:
   - Add buttons/checkboxes for filtering by category
   - When a category is selected, add the parameter `?category=${category}` to the API request
   - Highlight the active category

3. Badge Display:
   - Display a badge with the category name next to each item
   - Use different colors for different categories
   - Style via CSS classes

4. Multiple Filtering:
   - Add the ability to select several categories at once
   - Use an array: `const [selectedCategories, setSelectedCategories] = useState([])`
   - Send a request per category, or implement backend support for multi-category filtering
Infrastructure Tasks:

1. Performance Index:
   - Add an index on the `category` column in the migration
   - Use: `op.create_index('ix_items_category', 'items', ['category'])`

2. Prometheus Metrics:
   - Add a metric: `items_total = Counter('items_total', 'Total items', ['category'])`
   - Increment it for each category: `items_total.labels(category=item.category.value).inc()`

3. Grafana Dashboard:
   - Add an "Items by Category" panel with a pie chart or bar chart
   - Use the query `items_total` grouped by `category`
   - Add stat panels for each category

4. Loki Logging:
   - Add logging when an item's category changes
   - Use structured logging: `logger.info("Category changed", extra={"item_id": item_id, "old_category": old, "new_category": new})`

Tip: Use an Enum in Python and a `<select>` in React for category selection.
Objective: Learn to implement pagination for large datasets, improve performance and user experience.
Backend Tasks:

1. Pagination Parameters:
   - Add parameters to `GET /items`: `skip: int = Query(0, ge=0)`, `limit: int = Query(10, ge=1, le=100)`
   - Validation: `skip >= 0`, `limit` from 1 to 100

2. Pagination Implementation:
   - Use SQLAlchemy `offset()` and `limit()`: `query.offset(skip).limit(limit)`
   - Calculate the total: `total = await db.scalar(select(func.count()).select_from(Item))`

3. Response with Metadata:
   - Change the response format: `{ "items": [...], "total": 100, "skip": 0, "limit": 10, "has_next": true, "has_prev": false }`

4. Parameter Validation:
   - Check that `skip` does not exceed `total`
   - Return a 400 error if the parameters are invalid
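The metadata part of the response is pure arithmetic and is worth isolating so it can be unit-tested without a database. A sketch under the same bounds as the `Query(...)` validation (the helper name is illustrative):

```python
def paginate_meta(total: int, skip: int, limit: int) -> dict:
    """Build the pagination metadata for the GET /items response.

    has_next: more rows exist beyond this page.
    has_prev: at least one row was skipped."""
    if skip < 0 or not (1 <= limit <= 100):
        raise ValueError("invalid pagination parameters")
    return {
        "total": total,
        "skip": skip,
        "limit": limit,
        "has_next": skip + limit < total,
        "has_prev": skip > 0,
    }
```

The frontend's `currentPage`/`totalPages` math in the next section is derived from the same three numbers.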
Frontend Tasks:

1. Pagination UI:
   - Add "Previous" and "Next" page buttons
   - Show the current page and the total number of pages
   - Calculate: `currentPage = Math.floor(skip / limit) + 1`, `totalPages = Math.ceil(total / limit)`

2. Items Per Page Selection:
   - Add a dropdown for choosing 10, 20, or 50 items per page
   - On change, reset `skip` to 0 and update the request

3. Loading State:
   - Show a loading indicator while fetching a new page
   - Use state: `const [loading, setLoading] = useState(false)`

4. Navigation:
   - Add buttons to jump to the first/last page
   - Show page numbers for direct navigation (if there are not too many pages)
Infrastructure Tasks:

1. Prometheus Metrics:
   - Add a metric: `api_requests_total = Counter('api_requests_total', 'API requests', ['endpoint', 'method', 'has_pagination'])`
   - Increment with the label `has_pagination=true` when pagination is used

2. Grafana Dashboard:
   - Add a pagination usage graph: `rate(api_requests_total{has_pagination="true"}[5m])`
   - Add a stat panel with the average number of items per page

3. Alert:
   - Create an alert if the average items per page exceeds 50
   - This may indicate performance issues
Expected Result:
- The endpoint `GET /items?skip=0&limit=10` returns pagination metadata
- The frontend displays pagination with navigation buttons
- Metrics track pagination usage
- Performance is improved for large datasets
Tasks at this level require deep understanding of the system, including work with complex relationships between models, files, and implementation of advanced functionality.
Objective: Learn to work with many-to-many relationships in SQLAlchemy, implement complex queries, and manage related data.
Backend Tasks:

1. Create Tag Model:
   - Create a `Tag` model with fields `id` and `name` (unique)
   - Add `__tablename__ = "tags"` and `name: Mapped[str] = mapped_column(String(50), unique=True, nullable=False)`

2. Association Table:
   - Create an association table: `item_tags = Table('item_tags', Base.metadata, Column('item_id', Integer, ForeignKey('items.id')), Column('tag_id', Integer, ForeignKey('tags.id')))`

3. Many-to-Many Relationship:
   - Add to the `Item` model: `tags: Mapped[list[Tag]] = relationship("Tag", secondary=item_tags, back_populates="items")`
   - Add to the `Tag` model: `items: Mapped[list[Item]] = relationship("Item", secondary=item_tags, back_populates="tags")`

4. Create Migration:
   - Create a migration for the `tags` and `item_tags` tables
   - Add a unique index on `(item_id, tag_id)` to avoid duplicates

5. Endpoints:
   - `POST /items/{item_id}/tags` - add a tag to an item (create the tag if it doesn't exist)
   - `DELETE /items/{item_id}/tags/{tag_id}` - remove a tag from an item
   - `GET /items?tags=tag1,tag2` - filter by tags (items that have all specified tags)
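The "items that have all specified tags" semantics of `GET /items?tags=tag1,tag2` is a subset check. A database-free sketch of that filter, modeling items as dicts for illustration (in SQL it becomes a join on `item_tags` with `GROUP BY ... HAVING COUNT(DISTINCT tag_id) = N`):

```python
def filter_by_tags(items: list[dict], required: list[str]) -> list[dict]:
    """Keep only items whose tag set contains ALL required tags.

    An empty `required` list matches everything, mirroring an
    omitted ?tags= parameter."""
    wanted = set(required)
    return [item for item in items if wanted <= set(item["tags"])]
```

Using a set subset test (`<=`) rather than repeated `in` checks keeps the intent ("all of") explicit.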
Frontend Tasks:

1. Tag Input:
   - Add a field for entering tags (comma-separated or separate inputs)
   - Use an input that supports adding multiple tags
   - Alternatively: use a library such as `react-tag-input`

2. Tag Display:
   - Display tags as badges/chips on each item
   - Style the badges with different colors
   - Add the ability to remove tags (if the user has permission)

3. Tag Filtering:
   - Make tags clickable to filter items
   - Show only items containing the selected tag
   - Highlight the active tag

4. Autocomplete:
   - Add autocomplete when entering tags
   - Request the list of existing tags from the API
   - Suggest existing tags while typing
Infrastructure Tasks:

1. Indexes:
   - Add indexes on the `item_tags` table to improve performance
   - Indexes on `item_id` and `tag_id` separately
   - A composite index on `(item_id, tag_id)` for fast lookups

2. Prometheus Metrics:
   - Add a metric: `tags_total = Gauge('tags_total', 'Total number of tags')`
   - Add: `items_per_tag_avg = Histogram('items_per_tag_avg', 'Average items per tag')`

3. Grafana Dashboard:
   - Add a "Most Popular Tags" panel with the top 10 tags
   - Use a query that counts items per tag
   - Create a bar chart or word cloud

4. Loki Logging:
   - Log the creation of new tags
   - Log tag changes on items

Tip: Use `Table()` for the association table in SQLAlchemy and `relationship()` with the `secondary` parameter.
Objective: Learn to use numeric fields with constraints, implement validation at different levels, and visualize numeric values.
Backend Tasks:

1. Update Model:
   - Add field `priority: Mapped[int] = mapped_column(Integer, nullable=False, default=3)`
   - Default value: 3 (medium priority)

2. Pydantic Validation:
   - Add validation in the schema: `priority: int = Field(ge=1, le=5, description="Priority from 1 (lowest) to 5 (highest)")`
   - Pydantic will automatically check that the value is in the range 1-5

3. Create Migration:
   - Create the migration: `alembic revision --autogenerate -m "add_priority_to_items"`
   - Add a check constraint: `op.create_check_constraint('ck_items_priority_range', 'items', 'priority >= 1 AND priority <= 5')`

4. Priority Sorting:
   - Add sorting by priority (highest first): `order_by(Item.priority.desc())`
   - Combine it with other sorting (e.g., by date)

5. Filtering Endpoint:
   - Add parameter `priority_min: int | None = Query(None, ge=1, le=5)`
   - Filter: `filter(Item.priority >= priority_min)` if specified
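The same 1-5 range is enforced three times (Pydantic `Field(ge=1, le=5)`, the DB check constraint, the frontend control); a plain-Python sketch of that invariant plus the "highest first" sort (function names are illustrative):

```python
def validate_priority(priority: int) -> int:
    """Equivalent of Field(ge=1, le=5) and the DB check constraint."""
    if not 1 <= priority <= 5:
        raise ValueError(f"priority must be between 1 and 5, got {priority}")
    return priority


def sort_by_priority(items: list[dict]) -> list[dict]:
    """Highest priority first; Python's sort is stable, so items with
    equal priority keep their existing (e.g. by-date) order."""
    return sorted(items, key=lambda i: -i["priority"])
```

Stability is what makes "combine with other sorting" work: sort by date first, then by priority.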
Frontend Tasks:

1. Visual Display:
   - Display the priority as stars (1-5 stars)
   - Alternatively: a numeric indicator with color coding
   - Colors: red (5), orange (4), yellow (3), green (2), gray (1)

2. Change Priority:
   - Add a dropdown or slider for changing the priority
   - Dropdown: a `<select>` with options 1-5
   - Slider: use `<input type="range" min="1" max="5">`

3. High Priority Display:
   - Display items with priority 4-5 at the top of the list
   - Highlight high priority with a brighter color
   - Add a "!" icon for the highest priority

4. Filtering:
   - Add a "Show only high priority" filter
   - Use the parameter `priority_min=4` in the API request
Infrastructure Tasks:

1. Prometheus Metrics:
   - Add a metric: `items_total = Counter('items_total', 'Total items', ['priority'])`
   - Increment it for each priority level: `items_total.labels(priority=str(item.priority)).inc()`

2. Grafana Dashboard:
   - Add an "Items by Priority" panel with a bar chart
   - Display the distribution of items by priority
   - Add stat panels for each priority level

3. Alert:
   - Create an alert if the count of priority-5 items exceeds 20
   - This may indicate the system is overloaded with high-priority tasks
Objective: Learn to work with dates and times for deadlines, implement date validation, and track overdue tasks.
Backend Tasks:

1. Update Model:
   - Add field `due_date: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), nullable=True)` to the `Item` model
   - Use `timezone=True` for correct timezone handling

2. Validation:
   - Add validation in Pydantic: `due_date: datetime | None = Field(None, description="Due date cannot be in the past")`
   - Create a custom validator with `@field_validator('due_date')` to check the date is not in the past

3. Create Migration:
   - Create the migration: `alembic revision --autogenerate -m "add_due_date_to_items"`
   - Add an index on `due_date` for fast lookup of overdue items

4. Overdue Endpoint:
   - Add a `GET /items/overdue` endpoint
   - Filter: `filter(Item.due_date < datetime.now(timezone.utc))`
   - Return only items that have a `due_date` and `is_active=True`

5. Filtering Endpoint:
   - Add parameter `due_before: datetime | None = Query(None)`
   - Filter: `filter(Item.due_date < due_before)` if specified
Frontend Tasks:

1. Date Picker:
   - Install a library (`npm install react-datepicker`) or use the native `<input type="date">`
   - Add a date picker for choosing the deadline
   - Validate that the date is not in the past

2. Display Deadline:
   - Display the deadline in a convenient format: "In 3 days" or "Overdue by 2 days"
   - Use `date-fns`: `formatDistanceToNow(new Date(item.due_date), { addSuffix: true })`

3. Visual Indication:
   - Highlight overdue items in red
   - Add a "!" icon for overdue items
   - Show a progress bar indicating how close the deadline is

4. Overdue Filter:
   - Add a "Show only overdue" checkbox
   - When enabled, call `GET /items/overdue`
   - Refresh the list when the filter changes
Infrastructure Tasks:

1. Index:
   - Add an index on the `due_date` column for fast lookup of overdue items
   - Use: `op.create_index('ix_items_due_date', 'items', ['due_date'])`

2. Prometheus Metrics:
   - Add a metric: `items_overdue_total = Gauge('items_overdue_total', 'Number of overdue items')`
   - Add: `items_due_soon_total = Gauge('items_due_soon_total', 'Number of items due within 7 days')`
   - Update the metrics on each request or via a scheduled task

3. Grafana Dashboard:
   - Add an "Overdue Items" stat panel
   - Add an "Items Due Soon" panel (deadline within 7 days)
   - Create a time series graph with the trend of overdue items

4. Alert:
   - Create an alert if the count of overdue items exceeds 10
   - Configure notifications for critical alerts
Objective: Learn to implement file upload and storage, handle multipart/form-data requests, and manage file storage.
Backend Tasks:

1. Create Attachment Model:
   - Create an `Attachment` model with fields: `id`, `item_id` (foreign key), `filename`, `file_path`, `file_size`, `uploaded_at`
   - Add a relationship to `Item`: `attachments: Mapped[list[Attachment]] = relationship("Attachment", back_populates="item")`

2. Create Migration:
   - Create a migration for the `attachments` table
   - Add a foreign key constraint on `item_id`

3. Endpoints:
   - `POST /items/{item_id}/attachments` - upload a file (multipart/form-data)
   - `GET /items/{item_id}/attachments` - list an item's files
   - `GET /items/{item_id}/attachments/{attachment_id}` - download a file
   - `DELETE /items/{item_id}/attachments/{attachment_id}` - delete a file

4. File Validation:
   - Check the file size (maximum 10MB)
   - Check the file type (allow only certain types, e.g., images and PDFs)
   - Generate a unique filename to avoid conflicts

5. File Storage:
   - Store files in the `/app/data/uploads/` directory
   - Create subdirectories per `item_id` for organization
   - Store the metadata in the database
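The validation and unique-name steps can be combined into one helper that runs before anything touches disk. A sketch; the extension whitelist is an example, and the helper name is illustrative:

```python
import uuid
from pathlib import PurePosixPath

MAX_SIZE = 10 * 1024 * 1024  # the 10MB limit from the task
ALLOWED_EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif", ".pdf"}  # example whitelist


def safe_storage_name(filename: str, size: int) -> str:
    """Validate size and extension, then return a unique on-disk name
    so two uploads of 'report.pdf' never collide.

    The original filename is kept only in the database metadata;
    the random name is what gets written under /app/data/uploads/."""
    if size > MAX_SIZE:
        raise ValueError("file exceeds the 10MB limit")
    ext = PurePosixPath(filename).suffix.lower()
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError(f"file type {ext!r} is not allowed")
    return f"{uuid.uuid4().hex}{ext}"
```

Discarding the client-supplied name for storage also sidesteps path-traversal tricks like `../../etc/passwd` in the filename.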
Frontend Tasks:

1. File Upload:
   - Add an `<input type="file">` for file selection
   - Support drag-and-drop via an `onDrop` handler
   - Show a preview for images before upload

2. File Display:
   - Show the file list under each item
   - Display the file type icon, name, size, and upload date
   - Add a button to download the file

3. Progress Bar:
   - Show a progress bar during upload
   - Use `XMLHttpRequest` with its `upload.onprogress` event (plain `fetch` does not expose upload progress)
   - Update the progress bar as the upload proceeds

4. File Deletion:
   - Add a delete button for each file
   - Confirm the deletion via a confirm dialog
   - Refresh the list after deletion
Infrastructure Tasks:

1. Volume for Files:
   - Configure a volume in `docker-compose.yml`: `volumes: - ./data/uploads:/app/data/uploads`
   - Ensure the directory is created automatically
   - Configure access permissions

2. Prometheus Metrics:
   - Add a metric: `attachments_total = Counter('attachments_total', 'Total number of attachments')`
   - Add: `attachments_size_bytes_total = Counter('attachments_size_bytes_total', 'Total size of attachments in bytes')`

3. Grafana Dashboard:
   - Add a "Storage Usage" panel with the total file size
   - Display a graph of storage usage growth
   - Add a stat panel with the file count

4. Alert:
   - Create an alert if the total file size exceeds 1GB
   - Configure automatic cleanup of old files (optional)

5. Backup:
   - Configure backups for the file directory
   - Use a cron job or scheduled task for regular backups

Tip: Use `UploadFile` from FastAPI and store files in the `/app/data/uploads/` directory.
Objective: Learn to update major dependency versions, fix breaking changes, and keep project up to date.
Backend Tasks:

1. Update Dependencies:
   - Update all dependencies to the latest major versions
   - Check each library's changelog for breaking changes
   - Example updates:
     - FastAPI 0.115 → 0.120+
     - SQLAlchemy 2.0.36 → 2.0.40+
     - Pydantic 2.9.2 → 2.10+
     - Uvicorn 0.30.0 → 0.32+

2. Fix Breaking Changes:
   - Fix API changes (deprecated functions, parameter changes)
   - Update the code according to new best practices
   - Fix type hint changes

3. Testing:
   - Run all tests: `make test-backend`
   - Fix tests that fail due to breaking changes
   - Check linting: `make lint-backend`
   - Check type checking: `make type-check`

4. Update Documentation:
   - Update the code examples in the documentation
   - Update the versions in README.md
Infrastructure Tasks:

1. Update Docker Images:
   - Update the base images if needed
   - Rebuild the images: `make build-image`
   - Check the image sizes

2. Compatibility Check:
   - Check compatibility with the other services (Prometheus, Grafana, Loki)
   - Update the service versions in docker-compose.yml if needed

3. CI/CD Pipeline:
   - Update the CI/CD pipeline for the new versions
   - Verify all tests pass in CI
Tasks at this level require expert-level knowledge and include complex architecture, real-time functionality, and migrations to new programming language versions.
Objective: Learn to implement state machine for state management, validate state transitions, and visualize workflow.
Backend Tasks:

1. Create Status Enum:
   - Create an enum: `class ItemStatus(str, Enum): TODO = "todo"; IN_PROGRESS = "in_progress"; REVIEW = "review"; DONE = "done"; CANCELLED = "cancelled"`

2. Update Model:
   - Add field `status: Mapped[ItemStatus] = mapped_column(Enum(ItemStatus), default=ItemStatus.TODO, nullable=False)`

3. Transition Validation:
   - Create a mapping of allowed transitions: `ALLOWED_TRANSITIONS = { ItemStatus.TODO: [ItemStatus.IN_PROGRESS, ItemStatus.CANCELLED], ItemStatus.IN_PROGRESS: [ItemStatus.REVIEW, ItemStatus.CANCELLED], ItemStatus.REVIEW: [ItemStatus.DONE, ItemStatus.IN_PROGRESS], ItemStatus.DONE: [], ItemStatus.CANCELLED: [] }`
   - Validate each transition before changing the status

4. Endpoints:
   - `PATCH /items/{item_id}/status` - change the status with transition validation
   - `GET /items?status=todo` - filter by status
   - `GET /items/stats` - statistics by status
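Putting the enum and the transition table together, the state machine fits in a few lines (the `transition` helper name is illustrative; the table itself is the one defined above):

```python
from enum import Enum


class ItemStatus(str, Enum):
    TODO = "todo"
    IN_PROGRESS = "in_progress"
    REVIEW = "review"
    DONE = "done"
    CANCELLED = "cancelled"


# Terminal states (DONE, CANCELLED) have no outgoing transitions.
ALLOWED_TRANSITIONS: dict[ItemStatus, list[ItemStatus]] = {
    ItemStatus.TODO: [ItemStatus.IN_PROGRESS, ItemStatus.CANCELLED],
    ItemStatus.IN_PROGRESS: [ItemStatus.REVIEW, ItemStatus.CANCELLED],
    ItemStatus.REVIEW: [ItemStatus.DONE, ItemStatus.IN_PROGRESS],
    ItemStatus.DONE: [],
    ItemStatus.CANCELLED: [],
}


def transition(current: ItemStatus, new: ItemStatus) -> ItemStatus:
    """Raise on a forbidden transition; the PATCH endpoint would
    translate the ValueError into a 400 response."""
    if new not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.value} to {new.value}")
    return new
```

Centralizing the rules in one dict means the endpoint, the tests, and the `status_transitions_total` metric all consult the same source of truth.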
Frontend Tasks:

1. Kanban Board:
   - Display items grouped by status in columns
   - Use drag-and-drop for status changes
   - Library: `react-beautiful-dnd` or `@dnd-kit/core`

2. Color Coding:
   - Different colors for different statuses
   - Visual indication of the current status

3. Statistics:
   - Show the count of items in each status
   - Display a progress bar with the completion progress
Infrastructure Tasks:

1. Metrics:
   - `items_total{status="todo"}` for each status
   - `status_transitions_total{from="todo",to="in_progress"}` for workflow tracking

2. Grafana Dashboard:
   - A kanban-like "Items by Status" panel
   - A graph of transitions between statuses

3. Alert:
   - Alert if items are stuck in the "in_progress" status for more than 7 days

Tip: Use pydantic for transition validation, or the `transitions` library.
Objective: Learn to implement user system, foreign keys between models, and assignment management.
Backend Tasks:

1. User Model:
   - Create a `User` model with fields: `id`, `username`, `email`, `created_at`
   - Add uniqueness constraints on `username` and `email`

2. Update Item:
   - Add `assignee_id: Mapped[int | None] = mapped_column(ForeignKey("users.id"), nullable=True)`
   - Add a relationship: `assignee: Mapped[User | None] = relationship("User")`

3. Endpoints:
   - `GET /users` - list users
   - `POST /users` - create a user
   - `PATCH /items/{item_id}/assign` - assign an item to a user
   - `GET /items?assignee_id=1` - filter by assigned user
Frontend Tasks:

1. User Selection:
   - A dropdown for selecting a user when creating/editing an item
   - Display the avatar/initials of the assigned user

2. Filtering:
   - Filter by assigned user
   - A "My Profile" page with the list of assigned items

Infrastructure Tasks:

1. Metrics:
   - `items_total{assignee="user1"}` for each user
   - Alert if one user has more than 50 assigned items
Objective: Learn to implement real-time functionality via WebSockets, manage comments, and synchronize state between clients.
Backend Tasks:

1. Comment Model:
   - Create a `Comment` model with fields: `id`, `item_id`, `author_id`, `content`, `created_at`, `updated_at`

2. Endpoints:
   - `GET /items/{item_id}/comments` - list comments
   - `POST /items/{item_id}/comments` - create a comment
   - `PATCH /items/{item_id}/comments/{comment_id}` - edit a comment
   - `DELETE /items/{item_id}/comments/{comment_id}` - delete a comment

3. WebSocket:
   - Add a WebSocket endpoint for real-time comment updates
   - Use FastAPI's WebSocket support
   - Broadcast new comments to all connected clients
Frontend Tasks:

1. Comments Section:
   - A form for adding comments
   - Display the author and creation time
   - Ability to edit and delete one's own comments

2. Real-Time Updates:
   - Connect to the WebSocket
   - Update automatically when new comments arrive
Infrastructure Tasks:

1. Metrics:
   - `comments_total`, `comments_per_item_avg`
   - WebSocket connection monitoring
Objective: Learn to use JSONB types in PostgreSQL, implement search on JSON fields, and manage structured metadata.
Backend Tasks:

1. Update Model:
   - Add a JSON field mapped to a PostgreSQL JSONB column. Note: `metadata` is a reserved attribute name on SQLAlchemy declarative models (it clashes with `Base.metadata`), so map the column under a different attribute name, e.g. `meta: Mapped[dict | None] = mapped_column("metadata", JSON, nullable=True)`

2. Search:
   - A search endpoint: `GET /items?metadata.color=blue`
   - Full-text search: `GET /items/search?q=metadata.location:office`

3. Validation:
   - A Pydantic schema for the metadata
   - Structure checking
Frontend Tasks:

1. Metadata UI:
   - Key-value pairs for adding metadata
   - Search by metadata
   - Display the metadata

Infrastructure Tasks:

1. Indexes:
   - A GIN index on the JSONB column
   - Metrics for metadata usage

Tip: Use the JSON type in SQLAlchemy and dict in Pydantic.
- Write unit tests for new fields and validation
- Write integration tests for new endpoints
- Add frontend tests (React Testing Library)
- Configure code coverage reports
- Update API documentation (Swagger/OpenAPI)
- Add usage examples for new fields
- Update README.md with description of new features
- Add JSDoc comments for React components
- Add migrations to CI/CD pipeline
- Configure automatic testing after migrations
- Add health check for new endpoints
- Configure automatic database backup
- Add monitoring for new metrics
- Improve form design for new fields
- Add frontend validation (React Hook Form)
- Add loading states and error handling
- Add animations and transitions
- Make application responsive for mobile devices
1. Backend:

```bash
# 1. Update model (models.py)
# 2. Update schemas (schemas.py)
# 3. Update CRUD (crud.py)
# 4. Update endpoints (main.py)
# 5. Create migration
cd backend
alembic revision --autogenerate -m "add_field_name"
# 6. Verify migration
alembic upgrade head
```

2. Frontend:

```bash
# 1. Update components (App.jsx)
# 2. Add fields to forms
# 3. Update data display
# 4. Verify in browser
cd frontend
npm run dev
```

3. Infrastructure:

```bash
# 1. Check docker-compose.yml
# 2. Add Prometheus metrics (metrics.py)
# 3. Update Grafana dashboard
# 4. Restart services
make down
make up
```

4. Testing:

```bash
make test-docker   # Backend tests
make lint          # Code check
make type-check    # Type checking
```

- Functionality: Does the new field/endpoint work correctly?
- Code: Are best practices and code style followed?
- Tests: Are tests added for new functionality?
- Migrations: Are migrations created and applied correctly?
- Validation: Is validation added at API level?
- UI/UX: Is the interface user-friendly?
- Validation: Is frontend validation added?
- Error Handling: Are errors handled correctly?
- Responsive: Does it work on mobile devices?
- Docker: Are Docker images and volumes configured correctly?
- Metrics: Are Prometheus metrics added?
- Grafana: Are dashboards updated?
- Logging: Is logging added to Loki?
- Alerts: Are alerts configured (if needed)?
- Always verify migrations on test database before production
- Use type hints for better IDE support
- Follow DRY principle (Don't Repeat Yourself)
- Write comments for complex code parts
- Test not only "happy path", but also edge cases
- Check Docker image sizes after changes
- Monitor metrics after adding new functionality
- Document changes in Infrastructure (docker-compose, volumes, etc.)
- Use Git for versioning changes
- Create separate branches for each task
- Write descriptive commit messages
- Always run tests before commit
Good luck with completing the assignments!