Add problems 245, 252, 253: interactive problems with 3 test cases ea… #34
Workflow file for this run

name: Sync to HuggingFace
on:
  push:
    branches: [main]
    paths:
      - 'algorithmic/problems/**'
      - 'research/problems/**'
  release:
    types: [published]
  workflow_dispatch: # Allow manual trigger
jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install huggingface_hub
        run: pip install huggingface_hub
      - name: Export dataset
        run: python scripts/export_hf_dataset.py
      - name: Upload to HuggingFace
        run: |
          python -c "
          from huggingface_hub import HfApi, CommitOperationAdd
          import os
          from pathlib import Path
          api = HfApi(token=os.environ['HF_TOKEN'])
          # Collect all files for single commit
          operations = []
          # Add data files
          data_dir = Path('hf_export/data')
          for f in data_dir.iterdir():
              operations.append(CommitOperationAdd(
                  path_in_repo=f'data/{f.name}',
                  path_or_fileobj=str(f)
              ))
          # Add README
          operations.append(CommitOperationAdd(
              path_in_repo='README.md',
              path_or_fileobj='hf_export/README.md'
          ))
          # Single commit with all files
          api.create_commit(
              repo_id='FrontierCS/Frontier-CS',
              repo_type='dataset',
              operations=operations,
              commit_message='Update dataset'
          )
          print(f'Uploaded {len(operations)} files')
          "
        env:
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
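The upload step assumes `scripts/export_hf_dataset.py` has already produced `hf_export/data/*` plus `hf_export/README.md`. That script is not part of this run's workflow file; the following is a minimal stdlib-only sketch of the layout the upload step expects. The record fields (`id`, `split`) and one-JSONL-per-split convention are assumptions for illustration, not the repo's actual export schema.

```python
# Hypothetical sketch of scripts/export_hf_dataset.py.
# Walks the two problem directories the workflow triggers on
# (algorithmic/problems/**, research/problems/**) and writes the
# hf_export/ tree that the "Upload to HuggingFace" step commits.
import json
from pathlib import Path


def export_dataset(repo_root: Path, out_root: Path) -> int:
    """Write one JSONL file per split into out_root/data; return the file count."""
    data_dir = out_root / "data"
    data_dir.mkdir(parents=True, exist_ok=True)
    written = 0
    for split in ("algorithmic", "research"):
        problems_dir = repo_root / split / "problems"
        if not problems_dir.is_dir():
            continue
        # One record per problem directory; fields here are illustrative.
        records = [
            {"id": p.name, "split": split}
            for p in sorted(problems_dir.iterdir())
            if p.is_dir()
        ]
        out_file = data_dir / f"{split}.jsonl"
        out_file.write_text("".join(json.dumps(r) + "\n" for r in records))
        written += 1
    # Dataset card picked up by the upload step as README.md.
    (out_root / "README.md").write_text("# Frontier-CS dataset export\n")
    return written


if __name__ == "__main__":
    n = export_dataset(Path("."), Path("hf_export"))
    print(f"Exported {n} data file(s) to hf_export/")
```

Run from the repo root, this leaves everything the inline `python -c` upload script reads under `hf_export/`, so the two steps stay decoupled.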