55 changes: 55 additions & 0 deletions .github/workflows/ci.yml
@@ -128,6 +128,54 @@ jobs:
echo "Testing LB handler in Docker environment..."
docker run --rm flash-lb-cpu:test ./test-lb-handler.sh

tarball:
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'
steps:
- name: Checkout repository
uses: actions/checkout@v4

- name: Set up uv
uses: astral-sh/setup-uv@v4
with:
enable-cache: true

- name: Build tarball
env:
PYTHON_VERSION: "3.11"
run: bash scripts/build-tarball.sh

- name: Test tarball in bare ubuntu container
run: |
TARBALL=$(ls dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
docker run --rm -v "$(pwd)/dist:/dist" ubuntu:22.04 \
bash -c "tar xzf /dist/$(basename $TARBALL) -C /opt && /opt/flash-worker/bootstrap.sh --test"

Comment on lines +148 to +153 (Copilot AI, Mar 6, 2026):

In the tarball self-test step, TARBALL=$(ls dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz) and then $(basename $TARBALL) can behave unexpectedly if the glob matches 0 or >1 files (or if filenames contain spaces). Consider using a safer selection pattern (e.g., ensure exactly one match) and quoting variables when passing them into basename/the docker command.
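The glob-ambiguity fix this comment asks for can be sketched as follows. This is a minimal illustration, not the PR's code: the versioned filename is a made-up stand-in, and the artifact is created inline so the sketch is runnable.

```shell
#!/usr/bin/env bash
# Sketch: select exactly one tarball or fail, instead of trusting `ls` + an unchecked glob.
set -euo pipefail

# Stand-in artifact so the sketch runs anywhere; real builds produce this in dist/.
mkdir -p dist
: > dist/flash-worker-v1.2.3-py3.11-linux-x86_64.tar.gz

shopt -s nullglob
tarballs=(dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
if [ "${#tarballs[@]}" -ne 1 ]; then
  echo "Expected exactly one tarball in dist/, found ${#tarballs[@]}" >&2
  exit 1
fi
TARBALL="${tarballs[0]}"
# Quote when passing into basename (and later into docker) so spaces cannot word-split.
echo "Selected: $(basename "$TARBALL")"
```

With `nullglob` set, zero matches yields an empty array rather than a literal pattern, so both the 0-match and the >1-match cases fail loudly before docker ever runs.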
- name: Upload tarball artifact
uses: actions/upload-artifact@v4
with:
name: flash-worker-tarball
path: dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz
retention-days: 30
overwrite: true

- name: Post artifact link on PR
env:
GH_TOKEN: ${{ github.token }}
run: |
TARBALL=$(basename dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
RUN_URL="${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"
BODY="**Tarball artifact:** [\`${TARBALL}\`](${RUN_URL}#artifacts)

To test: \`FLASH_WORKER_TARBALL_URL=<download-url> flash deploy\`"
Comment on lines +166 to +170 (Copilot AI, Mar 6, 2026):

TARBALL=$(basename dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz) can expand to multiple paths; basename will then output multiple lines and the PR comment link text becomes ambiguous. Consider selecting a single artifact deterministically (or failing if more than one match) before composing the comment body.
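One way to make the selection deterministic before composing the comment body; again a hedged sketch under the same assumed naming scheme, not the author's fix. Without `nullglob`, an unmatched glob stays literal, so the sketch also checks that the match is a real file.

```shell
#!/usr/bin/env bash
set -euo pipefail
mkdir -p dist
: > dist/flash-worker-v1.2.3-py3.11-linux-x86_64.tar.gz   # stand-in artifact

matches=(dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
# One match that actually exists, or bail before posting an ambiguous comment.
if [ "${#matches[@]}" -ne 1 ] || [ ! -f "${matches[0]}" ]; then
  echo "Ambiguous or missing tarball (${#matches[@]} matches)" >&2
  exit 1
fi
TARBALL="$(basename "${matches[0]}")"
BODY="**Tarball artifact:** \`${TARBALL}\`"
echo "$BODY"
```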

# Delete previous tarball comments to keep PR clean
gh api "repos/${{ github.repository }}/issues/${{ github.event.pull_request.number }}/comments" \
--jq '.[] | select(.body | contains("Tarball artifact:")) | .id' | \
xargs -I{} gh api -X DELETE "repos/${{ github.repository }}/issues/comments/{}" 2>/dev/null || true

gh pr comment "${{ github.event.pull_request.number }}" --body "$BODY"

docker-validation:
runs-on: ubuntu-latest
needs: [test, lint, docker-test, docker-test-lb-cpu]
@@ -229,6 +277,13 @@ jobs:
cache-from: type=gha,scope=gpu
cache-to: type=gha,mode=max,scope=gpu

tarball-release:
needs: [release]
if: needs.release.outputs.release_created
uses: ./.github/workflows/release-tarball.yml
with:
tag_name: ${{ needs.release.outputs.tag_name }}

docker-prod-cpu:
runs-on: ubuntu-latest
needs: [release]
60 changes: 60 additions & 0 deletions .github/workflows/release-tarball.yml
@@ -0,0 +1,60 @@
name: Release Tarball

on:
workflow_dispatch:
inputs:
dry_run:
description: "Dry run (build but don't upload)"
required: false
default: "false"
type: boolean

# Triggered by release job in ci.yml via workflow_call
# or manually via workflow_dispatch
workflow_call:
inputs:
tag_name:
required: true
type: string

permissions:
contents: write

jobs:
build-tarball:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4

- name: Set up uv
uses: astral-sh/setup-uv@v4
with:
enable-cache: true

- name: Build tarball
env:
PYTHON_VERSION: "3.11"
run: bash scripts/build-tarball.sh

- name: Test tarball in bare ubuntu container
run: |
TARBALL=$(ls dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
docker run --rm -v "$(pwd)/dist:/dist" ubuntu:22.04 \
bash -c "tar xzf /dist/$(basename $TARBALL) -C /opt && /opt/flash-worker/bootstrap.sh --test"

- name: Upload tarball to GitHub Release
if: inputs.dry_run != 'true' && inputs.tag_name != ''
env:
GH_TOKEN: ${{ github.token }}
run: |
TARBALL=$(ls dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
Comment on lines +42 to +51 (Copilot AI, Mar 6, 2026):

Similar to CI, TARBALL=$(ls dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz) + $(basename $TARBALL) assumes exactly one match. If multiple artifacts exist (or no match), this step can behave unpredictably; consider enforcing a single match and quoting variables in the docker command.

Suggested change:
-          TARBALL=$(ls dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
-          docker run --rm -v "$(pwd)/dist:/dist" ubuntu:22.04 \
-            bash -c "tar xzf /dist/$(basename $TARBALL) -C /opt && /opt/flash-worker/bootstrap.sh --test"
+          shopt -s nullglob
+          tarballs=(dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
+          if [ "${#tarballs[@]}" -ne 1 ]; then
+            echo "Expected exactly one tarball in dist/, found ${#tarballs[@]} matches."
+            exit 1
+          fi
+          TARBALL="${tarballs[0]}"
+          docker run --rm -v "$(pwd)/dist:/dist" ubuntu:22.04 \
+            bash -c "tar xzf /dist/$(basename \"$TARBALL\") -C /opt && /opt/flash-worker/bootstrap.sh --test"
       - name: Upload tarball to GitHub Release
         if: inputs.dry_run != 'true' && inputs.tag_name != ''
         env:
           GH_TOKEN: ${{ github.token }}
         run: |
-          TARBALL=$(ls dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
+          shopt -s nullglob
+          tarballs=(dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz)
+          if [ "${#tarballs[@]}" -ne 1 ]; then
+            echo "Expected exactly one tarball in dist/, found ${#tarballs[@]} matches."
+            exit 1
+          fi
+          TARBALL="${tarballs[0]}"
gh release upload "${{ inputs.tag_name }}" "$TARBALL" --clobber

- name: Upload tarball as artifact (for dry runs)
if: inputs.dry_run == 'true' || inputs.tag_name == ''
uses: actions/upload-artifact@v4
with:
name: flash-worker-tarball
path: dist/flash-worker-v*-py3.11-linux-x86_64.tar.gz
retention-days: 7
24 changes: 24 additions & 0 deletions Makefile
@@ -2,6 +2,9 @@ IMAGE = runpod/flash
TAG = $(or $(FLASH_IMAGE_TAG),local)
FULL_IMAGE = $(IMAGE):$(TAG)
FULL_IMAGE_CPU = $(IMAGE)-cpu:$(TAG)
VERSION = $(shell python3 -c "import re; print(re.search(r'__version__\s*=\s*\"([^\"]+)\"', open('src/version.py').read()).group(1))")
# Must match base image Python: pytorch:2.9.1-cuda12.8-cudnn9-runtime and python:3.11-slim
TARBALL_PYTHON_VERSION ?= 3.11

# Detect host platform for local builds
ARCH := $(shell uname -m)
@@ -56,6 +59,27 @@ clean: # Remove build artifacts and cache files
find . -type f -name "*.pyc" -delete
find . -type f -name "*.pkl" -delete

# Tarball targets (process-injectable runtime)
tarball: # Build self-contained runtime tarball (runs in Docker, linux/amd64)
docker run --rm --platform linux/amd64 \
-e PYTHON_VERSION=$(TARBALL_PYTHON_VERSION) \
-e UV_CACHE_DIR=/workspace/dist/.uv-cache \
-v $(PWD):/workspace -w /workspace \
python:3.11-slim \
bash -c 'apt-get update -qq && apt-get install -y -qq curl > /dev/null 2>&1 && pip install uv -q && bash scripts/build-tarball.sh'

Comment on lines +62 to +70 (Copilot AI, Mar 6, 2026):

make tarball installs uv inside the Docker container, but this Makefile also has a global parse-time check that errors if uv isn't installed on the host. That makes the tarball targets unusable on machines without host uv; consider moving the uv presence check into only the targets that require host uv, or gating it behind a variable for Docker-only targets.
tarball-test: tarball # Test tarball in bare ubuntu container
docker run --rm --platform linux/amd64 -v $(PWD)/dist:/dist ubuntu:22.04 \
bash -c 'tar xzf /dist/flash-worker-v$(VERSION)-py$(TARBALL_PYTHON_VERSION)-linux-x86_64.tar.gz -C /opt && /opt/flash-worker/bootstrap.sh --test'

tarball-test-local: # Test tarball injection with mounted file (no rebuild)
docker run --rm --platform linux/amd64 \
-v $(PWD)/dist/flash-worker-v$(VERSION)-py$(TARBALL_PYTHON_VERSION)-linux-x86_64.tar.gz:/tmp/flash-worker.tar.gz \
ubuntu:22.04 \
bash -c 'set -e; FW_DIR=/opt/flash-worker; mkdir -p $$FW_DIR; \
tar xzf /tmp/flash-worker.tar.gz -C $$FW_DIR --strip-components=1; \
$$FW_DIR/bootstrap.sh --test'

setup: dev # Initialize project and sync dependencies
@echo "Setup complete. Development environment ready."

41 changes: 41 additions & 0 deletions scripts/bootstrap.sh
@@ -0,0 +1,41 @@
#!/bin/sh
# Flash Worker bootstrap -- entry point for process-injected runtime.
# Launched by dockerArgs after tarball extraction.
set -e

FW_DIR="$(cd "$(dirname "$0")" && pwd)"

# Self-test mode (used by tarball-test targets)
if [ "$1" = "--test" ]; then
echo "Flash Worker bootstrap self-test"
echo "FW_DIR: $FW_DIR"
echo "Python: $("$FW_DIR/python/bin/python3" --version)"
echo "uv: $("$FW_DIR/uv" --version)"
"$FW_DIR/venv/bin/python" -c "import pydantic; print(f'pydantic {pydantic.__version__}')"
"$FW_DIR/venv/bin/python" -c "import fastapi; print(f'fastapi {fastapi.__version__}')"
echo "Version: $(cat "$FW_DIR/.version")"
echo "Self-test passed"
exit 0
fi

# Isolated flash-worker environment
export PATH="$FW_DIR/venv/bin:$FW_DIR/python/bin:$FW_DIR:$PATH"
export VIRTUAL_ENV="$FW_DIR/venv"
PYTHON_MINOR=$("$FW_DIR/python/bin/python3" -c "import sys; print(f'{sys.version_info.major}.{sys.version_info.minor}')")
export PYTHONPATH="$FW_DIR/src:$VIRTUAL_ENV/lib/python${PYTHON_MINOR}/site-packages${PYTHONPATH:+:$PYTHONPATH}"

# Signal tarball mode for dependency installer
export FLASH_WORKER_INSTALL_DIR="$FW_DIR"

# Mode detection (same contract as Docker images)
ENDPOINT_TYPE="${FLASH_ENDPOINT_TYPE:-qb}"

if [ "$ENDPOINT_TYPE" = "lb" ]; then
exec uvicorn lb_handler:app \
--host 0.0.0.0 \
--port 80 \
--timeout-keep-alive 600 \
--app-dir "$FW_DIR/src"
else
exec python3 "$FW_DIR/src/handler.py"
fi
158 changes: 158 additions & 0 deletions scripts/build-tarball.sh
@@ -0,0 +1,158 @@
#!/usr/bin/env bash
# Build a self-contained flash-worker tarball for process injection.
# Output: dist/flash-worker-v{VERSION}-py{PYTHON_VERSION}-linux-x86_64.tar.gz
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"

# Read version from source
VERSION=$(python3 -c "
import re
text = open('$REPO_ROOT/src/version.py').read()
print(re.search(r'__version__\\s*=\\s*\"([^\"]+)\"', text).group(1))
")
echo "Building flash-worker tarball v${VERSION}"

# Configuration
PYTHON_VERSION="${PYTHON_VERSION:-3.11}"

# Validate Python version against project requirement (>=3.10, <3.15)
PY_MINOR=$(echo "$PYTHON_VERSION" | cut -d. -f2)
if [ "$PY_MINOR" -lt 10 ] || [ "$PY_MINOR" -ge 15 ]; then
echo "ERROR: Python ${PYTHON_VERSION} is outside project requirement (>=3.10, <3.15)"
exit 1
fi

UV_VERSION="0.7.19"
UV_URL="https://github.com/astral-sh/uv/releases/download/${UV_VERSION}/uv-x86_64-unknown-linux-gnu.tar.gz"

# Build in container-local tmpdir to avoid macOS case-insensitive filesystem issues
BUILD_DIR="/tmp/flash-worker-build"
TARBALL_ROOT="$BUILD_DIR/flash-worker"
OUTPUT_DIR="$REPO_ROOT/dist"
TARBALL_NAME="flash-worker-v${VERSION}-py${PYTHON_VERSION}-linux-x86_64.tar.gz"

# Clean previous build
rm -rf "$BUILD_DIR"
Comment on lines +31 to +37 (Copilot AI, Mar 6, 2026):

The build uses a fixed BUILD_DIR=/tmp/flash-worker-build and unconditionally rm -rf's it. This can break concurrent builds on the same machine and can delete unrelated data if the variable is ever changed unexpectedly; consider using mktemp -d (and a trap cleanup) to make the build directory unique and safer.

Suggested change:
-BUILD_DIR="/tmp/flash-worker-build"
+BUILD_DIR="$(mktemp -d -t flash-worker-build.XXXXXX)"
 TARBALL_ROOT="$BUILD_DIR/flash-worker"
 OUTPUT_DIR="$REPO_ROOT/dist"
 TARBALL_NAME="flash-worker-v${VERSION}-py${PYTHON_VERSION}-linux-x86_64.tar.gz"
-# Clean previous build
-rm -rf "$BUILD_DIR"
+cleanup() {
+  if [ -n "${BUILD_DIR:-}" ] && [ -d "$BUILD_DIR" ]; then
+    rm -rf "$BUILD_DIR"
+  fi
+}
+trap cleanup EXIT
mkdir -p "$TARBALL_ROOT" "$OUTPUT_DIR" "$OUTPUT_DIR/.cache"

# 1. Install Python via uv (handles version resolution and caching)
echo "Installing Python ${PYTHON_VERSION} via uv..."
uv python install "$PYTHON_VERSION"
PYTHON_BIN=$(uv python find --python-preference only-managed "$PYTHON_VERSION")
PYTHON_INSTALL_DIR=$(cd "$(dirname "$PYTHON_BIN")/.." && pwd -P)
cp -r "$PYTHON_INSTALL_DIR" "$TARBALL_ROOT/python"
Comment on lines +40 to +45 (Copilot AI, Mar 6, 2026):

The script pins UV_VERSION for the tarball, but uses whatever uv is on PATH for uv python install and uv export. This can make builds non-reproducible across environments (host uv version != tarball uv). Consider extracting/using the pinned uv binary for all uv operations (or asserting the host uv --version matches UV_VERSION).
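The assertion variant this comment mentions could look like the sketch below. The `uv` shell function is a stand-in for the real host binary so the sketch runs anywhere; drop it to check the actual installation.

```shell
#!/usr/bin/env bash
set -euo pipefail
UV_VERSION="0.7.19"            # the version pinned in build-tarball.sh

uv() { echo "uv 0.7.19"; }     # stand-in for the host uv binary in this sketch

# `uv --version` prints "uv X.Y.Z"; take the second field.
HOST_UV_VERSION="$(uv --version | awk '{print $2}')"
if [ "$HOST_UV_VERSION" != "$UV_VERSION" ]; then
  echo "ERROR: host uv ${HOST_UV_VERSION} does not match pinned ${UV_VERSION}" >&2
  exit 1
fi
echo "uv version OK: ${HOST_UV_VERSION}"
```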

# Verify installation
if [ ! -f "$TARBALL_ROOT/python/bin/python3" ]; then
echo "ERROR: Python installation failed - python3 binary not found"
exit 1
fi
PYTHON_FULL_VERSION=$("$TARBALL_ROOT/python/bin/python3" -c "import sys; v=sys.version_info; print(f'{v.major}.{v.minor}.{v.micro}')")
echo " Python ${PYTHON_FULL_VERSION} installed"

# 2. Download uv static binary
echo "Downloading uv ${UV_VERSION}..."
if [ -f "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz" ]; then
echo " Using cached uv download"
tar xzf "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz" -C "$TARBALL_ROOT" --no-same-owner --strip-components=1 "uv-x86_64-unknown-linux-gnu/uv" 2>/dev/null || true
else
curl -fsSL "$UV_URL" -o "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz"
tar xzf "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz" -C "$TARBALL_ROOT" --no-same-owner --strip-components=1 "uv-x86_64-unknown-linux-gnu/uv" 2>/dev/null || true
fi
Comment on lines +59 to +63 (Copilot AI, Mar 6, 2026):

tar ... 2>/dev/null || true suppresses extraction failures for the bundled uv, which can make subsequent failures (e.g., at chmod +x) harder to diagnose. Consider removing || true / stderr suppression and explicitly erroring if the expected uv binary isn't extracted.

Suggested change:
-  tar xzf "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz" -C "$TARBALL_ROOT" --no-same-owner --strip-components=1 "uv-x86_64-unknown-linux-gnu/uv" 2>/dev/null || true
+  tar xzf "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz" -C "$TARBALL_ROOT" --no-same-owner --strip-components=1 "uv-x86_64-unknown-linux-gnu/uv"
 else
   curl -fsSL "$UV_URL" -o "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz"
-  tar xzf "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz" -C "$TARBALL_ROOT" --no-same-owner --strip-components=1 "uv-x86_64-unknown-linux-gnu/uv" 2>/dev/null || true
+  tar xzf "$OUTPUT_DIR/.cache/uv-${UV_VERSION}.tar.gz" -C "$TARBALL_ROOT" --no-same-owner --strip-components=1 "uv-x86_64-unknown-linux-gnu/uv"
 fi
+if [ ! -f "$TARBALL_ROOT/uv" ]; then
+  echo "ERROR: Failed to extract uv binary to $TARBALL_ROOT/uv"
+  exit 1
+fi
chmod +x "$TARBALL_ROOT/uv"

# 3. Create venv using portable Python
echo "Creating virtual environment..."
"$TARBALL_ROOT/python/bin/python3" -m venv "$TARBALL_ROOT/venv"

# Fix venv symlinks to be relative (python -m venv creates absolute symlinks)
cd "$TARBALL_ROOT/venv/bin"
for link in python python3 python3.*; do
[ -L "$link" ] || continue
target=$(readlink "$link")
case "$target" in
/*) # Absolute path — make relative to ../../python/bin/
basename=$(basename "$target")
ln -sf "../../python/bin/$basename" "$link"
;;
esac
done
cd "$REPO_ROOT"

# 4. Export and install production dependencies
echo "Installing production dependencies..."
cd "$REPO_ROOT"
# Use the host uv to export requirements (it reads pyproject.toml/uv.lock)
uv export --format requirements-txt --no-dev --no-hashes > "$BUILD_DIR/requirements.txt"

# Install into the tarball's venv using the tarball's uv
"$TARBALL_ROOT/uv" pip install \
--python "$TARBALL_ROOT/venv/bin/python" \
-r "$BUILD_DIR/requirements.txt"

# 4b. Fix shebangs in venv/bin scripts to use portable /usr/bin/env path
# (uv pip install stamps absolute build-time paths in console_scripts)
for script in "$TARBALL_ROOT/venv/bin/"*; do
[ -f "$script" ] || continue
head -1 "$script" | grep -q "^#!.*$BUILD_DIR" || continue
sed -i "1s|^#!.*|#!/usr/bin/env python3|" "$script"
done

# 5. Copy source files
echo "Copying source files..."
cp -r "$REPO_ROOT/src/"*.py "$TARBALL_ROOT/src/" 2>/dev/null || true
mkdir -p "$TARBALL_ROOT/src"
for f in "$REPO_ROOT/src/"*.py; do
[ -f "$f" ] && cp "$f" "$TARBALL_ROOT/src/"
done
Comment on lines +104 to +109 (Copilot AI, Mar 6, 2026):

cp -r "$REPO_ROOT/src/"*.py "$TARBALL_ROOT/src/" runs before $TARBALL_ROOT/src is created and its failure is ignored, and then the script copies the same files again in a loop. Creating the destination directory first and using a single copy mechanism would avoid silent failures and reduce duplication.
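A consolidated copy step along the lines this comment suggests might look like the sketch below; the temporary directories stand in for the script's REPO_ROOT and TARBALL_ROOT so it is self-contained.

```shell
#!/usr/bin/env bash
set -euo pipefail
# Illustrative stand-ins for the script's REPO_ROOT / TARBALL_ROOT.
REPO_ROOT="$(mktemp -d)"
TARBALL_ROOT="$(mktemp -d)"
mkdir -p "$REPO_ROOT/src"
echo 'print("hello")' > "$REPO_ROOT/src/handler.py"

mkdir -p "$TARBALL_ROOT/src"           # create the destination before any copy
shopt -s nullglob
srcs=("$REPO_ROOT/src/"*.py)
if [ "${#srcs[@]}" -eq 0 ]; then
  echo "ERROR: no Python sources in $REPO_ROOT/src" >&2
  exit 1
fi
cp "${srcs[@]}" "$TARBALL_ROOT/src/"   # one copy pass, no ignored failures
echo "Copied ${#srcs[@]} file(s)"
```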
# Copy test scripts (used by --test flag)
for f in "$REPO_ROOT/src/"*.sh; do
[ -f "$f" ] && cp "$f" "$TARBALL_ROOT/src/" && chmod +x "$TARBALL_ROOT/src/$(basename "$f")"
done
# Copy test JSON files
if [ -d "$REPO_ROOT/src/tests" ]; then
cp -r "$REPO_ROOT/src/tests" "$TARBALL_ROOT/src/tests"
fi

# 6. Copy bootstrap script
cp "$REPO_ROOT/scripts/bootstrap.sh" "$TARBALL_ROOT/bootstrap.sh"
chmod +x "$TARBALL_ROOT/bootstrap.sh"

# 7. Write version file for cache invalidation
echo "$VERSION" > "$TARBALL_ROOT/.version"

# 8. Write MANIFEST.json
# Use sha256sum on Linux, shasum on macOS
if command -v sha256sum >/dev/null 2>&1; then
SHA_CMD="sha256sum"
else
SHA_CMD="shasum -a 256"
fi
CONTENTS_SHA=$(find "$TARBALL_ROOT" -type f -exec $SHA_CMD {} \; | sort | $SHA_CMD | cut -d' ' -f1)
cat > "$TARBALL_ROOT/MANIFEST.json" <<MANIFEST
{
"version": "${VERSION}",
"python_version": "${PYTHON_FULL_VERSION}",
"uv_version": "${UV_VERSION}",
"platform": "x86_64-unknown-linux-gnu",
"sha256": "${CONTENTS_SHA}"
}
MANIFEST

# 9. Package tarball
echo "Packaging tarball..."
cd "$BUILD_DIR"
tar czf "$OUTPUT_DIR/$TARBALL_NAME" flash-worker/

# 10. Report
TARBALL_SIZE=$(du -h "$OUTPUT_DIR/$TARBALL_NAME" | cut -f1)
echo ""
echo "Tarball built: $OUTPUT_DIR/$TARBALL_NAME"
echo "Size: $TARBALL_SIZE"
echo "Version: $VERSION"
echo "SHA256: $($SHA_CMD "$OUTPUT_DIR/$TARBALL_NAME" | cut -d' ' -f1)"

# Cleanup build dir (keep cache)
rm -rf "$BUILD_DIR"
4 changes: 4 additions & 0 deletions src/constants.py
@@ -50,3 +50,7 @@
"""Number of times the Flash-deployed endpoint will attempt to unpack the worker-flash tarball from mounted volume."""
DEFAULT_TARBALL_UNPACK_INTERVAL = 30
"""Time in seconds the Flash-deployed endpoint will wait between tarball unpack attempts."""

# Process Injection (Tarball Mode)
FLASH_WORKER_INSTALL_DIR_ENV = "FLASH_WORKER_INSTALL_DIR"
"""Env var set by bootstrap.sh to signal tarball mode. Value is the extraction directory."""