
Commit 6d6f83b (1 parent: 9fc6adc)

Fix dapi docs: remove duplication from jobs.md, add generate() params, fix broken links, fix index nav and code example

File tree

5 files changed: +47 -97 lines

docs/examples/apps.md (2 additions, 2 deletions)

```diff
@@ -209,8 +209,8 @@ After discovering applications, you can:
 
 1. **[Submit jobs](../jobs.md)** using the discovered applications
 2. **[Explore job examples](mpm.md)** for specific workflows
-3. **[Check system resources](../api/systems.md)** for execution requirements
-4. **[Manage files](../api/files.md)** for job inputs and outputs
+3. **[Check system resources](../systems.md)** for execution requirements
+4. **[Manage files](../files.md)** for job inputs and outputs
 
 ## Troubleshooting
 
```

docs/examples/tms_credentials.md (1 addition, 1 deletion)

```diff
@@ -41,4 +41,4 @@ for system_id in systems:
 
 All methods auto-detect your username. Pass `username="other_user"` to override.
 
-See the [Systems API Reference](../api/systems.md) for full details.
+See [Systems](../systems.md) for full details.
```

docs/index.md (13 additions, 10 deletions)

````diff
@@ -11,24 +11,22 @@
 ```python
 from dapi import DSClient
 
-# Initialize client (handles authentication automatically)
 ds = DSClient()
 
-# Submit a job
+input_uri = ds.files.to_uri("/MyData/analysis/input/")
+
 job_request = ds.jobs.generate(
     app_id="matlab-r2023a",
-    input_dir_uri="/MyData/analysis/input/",
-    script_filename="run_analysis.m"
+    input_dir_uri=input_uri,
+    script_filename="run_analysis.m",
+    allocation="your_allocation",
 )
 job = ds.jobs.submit(job_request)
-
-# Monitor progress
-final_status = job.monitor()
-
-# Query research databases
-df = ds.db.ngl.read_sql("SELECT * FROM SITE LIMIT 10")
+job.monitor()
 ```
 
+For background on DesignSafe compute environments, storage, and workflow design, see the [DesignSafe Workflows guide](https://kks32.github.io/ds-workflows/).
+
 ## Getting Started
 
 - [Installation](installation.md)
@@ -38,11 +36,16 @@ df = ds.db.ngl.read_sql("SELECT * FROM SITE LIMIT 10")
 ## User Guide
 
 - [Jobs](jobs.md) -- submit and monitor computational jobs
+- [Apps](apps.md) -- find applications and their IDs
+- [Files](files.md) -- path translation, upload, download
+- [Systems](systems.md) -- queues and TMS credentials
 - [Database Access](database.md) -- query DesignSafe research databases
 
 ## Examples
 
 - [MPM Job Submission](examples/mpm.md)
+- [OpenSees MP](examples/opensees.md)
+- [OpenFOAM](examples/openfoam.md)
 - [PyLauncher Parameter Sweeps](examples/pylauncher.md)
 - [Database Queries](examples/database.md)
 
````
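The revised index example now routes the My Data path through `ds.files.to_uri()` before passing it to `generate()`. As a rough illustration of why that translation step exists (this is a hypothetical sketch, not dapi's actual implementation; the storage system ID and username are placeholder assumptions), a local path can be mapped to a Tapis-style URI like so:

```python
def to_tapis_uri(path: str, username: str,
                 system: str = "designsafe.storage.default") -> str:
    """Hypothetical sketch of a My Data path -> Tapis URI mapping.

    Not dapi's real implementation -- it only illustrates why the docs
    translate "/MyData/..." paths before calling ds.jobs.generate().
    """
    prefix = "/MyData/"
    if not path.startswith(prefix):
        raise ValueError(f"expected a path under {prefix}, got {path!r}")
    relative = path[len(prefix):]
    return f"tapis://{system}/{username}/{relative}"

print(to_tapis_uri("/MyData/analysis/input/", "jdoe"))
# tapis://designsafe.storage.default/jdoe/analysis/input/
```

In real use, `ds.files.to_uri()` performs this translation with the authenticated user's actual username and storage system.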

docs/jobs.md (30 additions, 83 deletions)

````diff
@@ -45,44 +45,7 @@ for job in jobs:
 raw = ds.jobs.list(output="raw")
 ```
 
-## Application Discovery
-
-### Finding Applications
-
-```python
-all_apps = ds.apps.find("", verbose=False)
-print(f"Found {len(all_apps)} applications")
-
-matlab_apps = ds.apps.find("matlab", verbose=True)
-opensees_apps = ds.apps.find("opensees", verbose=True)
-mpm_apps = ds.apps.find("mpm", verbose=True)
-```
-
-### Getting Application Details
-
-```python
-app_details = ds.apps.get_details("mpm-s3", verbose=True)
-
-print(f"App: {app_details.id}")
-print(f"Version: {app_details.version}")
-print(f"Execution System: {app_details.jobAttributes.execSystemId}")
-print(f"Max Runtime: {app_details.jobAttributes.maxMinutes} minutes")
-print(f"Default Cores: {app_details.jobAttributes.coresPerNode}")
-```
-
-### Available Applications
-
-| Application | App ID | Description |
-|-------------|--------|-------------|
-| Agnostic | `designsafe-agnostic-app` | General-purpose Python/OpenSees/PyLauncher execution |
-| MATLAB | `matlab-r2023a` | MATLAB computational environment |
-| OpenSees | `opensees-express` | Structural analysis framework |
-| OpenSees MP | `opensees-mp-s3` | OpenSees parallel (MPI) analysis |
-| MPM | `mpm-s3` | Material Point Method simulations |
-| ADCIRC | `adcirc-v55` | Coastal circulation modeling |
-| LS-DYNA | `ls-dyna` | Explicit finite element analysis |
-
-The Agnostic App (`designsafe-agnostic-app`) runs Python scripts, OpenSeesPy, and PyLauncher parameter sweeps on TACC systems. It includes Python 3.12 with OpenSeesPy pre-installed and supports configurable TACC module loading. It runs in serial mode (`isMpi: false`), which is what PyLauncher workflows need.
+For finding applications and their IDs, see [Apps](apps.md).
 
 ## Job Submission
 
@@ -107,6 +70,29 @@ job = ds.jobs.submit(job_request)
 print(f"Job submitted: {job.uuid}")
 ```
 
+### `generate()` Parameters
+
+| Parameter | Type | Description |
+|---|---|---|
+| `app_id` | str | Tapis application ID (see [Apps](apps.md)) |
+| `input_dir_uri` | str | Tapis URI of the input directory (use `ds.files.to_uri()`) |
+| `script_filename` | str | Main script file in the input directory |
+| `max_minutes` | int | Wall-clock time limit |
+| `allocation` | str | TACC project allocation code |
+| `node_count` | int | Number of compute nodes |
+| `cores_per_node` | int | CPU cores per node |
+| `memory_mb` | int | Memory per node in MB |
+| `queue` | str | SLURM queue/partition name |
+| `job_name` | str | Human-readable job name |
+| `description` | str | Job description |
+| `tags` | list | List of string tags for filtering |
+| `archive_system` | str | Tapis system for output archiving (default: DesignSafe storage) |
+| `archive_path` | str | Path on archive system for outputs |
+| `input_dir_param_name` | str | Name of the file input parameter (default: `"Input Directory"`; some apps use different names, e.g. `"Case Directory"` for OpenFOAM) |
+| `extra_file_inputs` | list | Additional file inputs beyond the main input directory |
+| `extra_env_vars` | list | Environment variables as `[{"key": "...", "value": "..."}]` |
+| `extra_scheduler_options` | list | SLURM scheduler options as `[{"name": "...", "arg": "..."}]` |
+
 ### Advanced Configuration
 
 ```python
@@ -306,62 +292,23 @@ Cancellation may not be immediate. Jobs in terminal states (FINISHED, FAILED, et
 
 ## Resuming Monitoring
 
-```python
-from dapi import SubmittedJob
+Reconnect to a previously submitted job using its UUID.
 
-job_uuid = "12345678-1234-1234-1234-123456789abc"
-resumed_job = SubmittedJob(ds._tapis, job_uuid)
-final_status = resumed_job.monitor()
+```python
+job = ds.jobs.get("12345678-1234-1234-1234-123456789abc")
+final_status = job.monitor()
 ```
 
 (pylauncher)=
 ## Parameter Sweeps with PyLauncher
 
-[PyLauncher](https://github.com/TACC/pylauncher) runs many independent tasks within a single SLURM allocation -- ideal for parameter studies. dapi generates sweep commands, task lists, and launcher scripts.
-
-```python
-ds = DSClient()
-
-sweep = {
-    "ALPHA": [0.3, 0.5, 3.7],
-    "BETA": [1.1, 2.0, 3.0],
-}
-
-# Preview (dry run)
-ds.jobs.parametric_sweep.generate(
-    'python3 simulate.py --alpha ALPHA --beta BETA',
-    sweep,
-    preview=True,
-)
-
-# Generate sweep files
-ds.jobs.parametric_sweep.generate(
-    'python3 simulate.py --alpha ALPHA --beta BETA '
-    '--output "$WORK/sweep_$SLURM_JOB_ID/run_ALPHA_BETA"',
-    sweep,
-    "/home/jupyter/MyData/sweep_demo/",
-    debug="host+job",
-)
-
-# Submit
-job = ds.jobs.parametric_sweep.submit(
-    "/MyData/sweep_demo/",
-    app_id="designsafe-agnostic-app",
-    allocation="your_allocation",
-    node_count=1,
-    cores_per_node=48,
-    max_minutes=30,
-)
-job.monitor()
-```
-
-For a full walkthrough with OpenSees, see the [PyLauncher example](examples/pylauncher.md).
+See the [PyLauncher example](examples/pylauncher.md) for a full walkthrough, or the [PyLauncher OpenSees example](examples/pylauncher_opensees.md) for a structural engineering use case.
 
 ## Bulk Operations
 
 ```python
 job_uuids = ["uuid1", "uuid2", "uuid3"]
-jobs = [SubmittedJob(ds._tapis, uuid) for uuid in job_uuids]
+jobs = [ds.jobs.get(uuid) for uuid in job_uuids]
 
 for job in jobs:
     status = job.get_status()
````
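The `generate()` parameter table added to docs/jobs.md documents list-of-dict shapes for `extra_env_vars` and `extra_scheduler_options`. A minimal sketch of assembling those keyword arguments, with placeholder values throughout (the app ID, allocation, node counts, and scheduler option below are illustrative assumptions, not recommended settings):

```python
# Placeholder kwargs for ds.jobs.generate(), following the shapes in the
# parameter table added by this commit. All values are illustrative.
params = {
    "app_id": "opensees-mp-s3",
    "script_filename": "model.tcl",
    "max_minutes": 60,
    "allocation": "your_allocation",  # TACC project allocation code
    "node_count": 2,
    "cores_per_node": 48,
    "tags": ["demo", "opensees"],
    # Environment variables use {"key": ..., "value": ...} dicts:
    "extra_env_vars": [{"key": "OMP_NUM_THREADS", "value": "1"}],
    # SLURM scheduler options use {"name": ..., "arg": ...} dicts:
    "extra_scheduler_options": [
        {"name": "mail", "arg": "--mail-user=you@example.org"},
    ],
}

# With a live DSClient this would become (not executed here):
#   input_uri = ds.files.to_uri("/MyData/opensees/input/")
#   job = ds.jobs.submit(ds.jobs.generate(input_dir_uri=input_uri, **params))
print(len(params))
```

Building the dict separately makes it easy to reuse a base configuration across jobs and override only what changes.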

docs/quickstart.md (1 addition, 1 deletion)

````diff
@@ -1,6 +1,6 @@
 # Quick Start
 
-```python
+```bash
 pip install dapi
 ```
 
````
