docs: recommend Jobs API instead of history/queue endpoints#742

Open
christian-byrne wants to merge 4 commits into main from docs/recommend-jobs-api

Conversation

@christian-byrne
Contributor

Summary

Updates cloud API documentation to recommend the Jobs API (/api/jobs) instead of legacy /history, /history_v2, and /queue endpoints.

Changes

  • Add new "Jobs API" section documenting:
    • GET /api/jobs - List jobs with filtering (status, output_type, workflow_id, sorting, pagination)
    • GET /api/jobs/{job_id} - Get full job details including workflow and outputs
  • Update complete example snippets to use /api/jobs/{id} instead of /api/history_v2/{id}
  • Update Chinese translations
  • Add note about legacy endpoints being maintained for compatibility
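The listing endpoint summarized above can be sketched as a small client-side query builder. The filter names (status, output_type, workflow_id, sort_by) come from this PR's summary; the builder itself and its pagination handling are illustrative assumptions, not docs content:

```typescript
// Sketch of a query builder for GET /api/jobs. Filter names follow the PR
// summary; a numeric "limit" for pagination is an assumption.
interface ListJobsOptions {
  status?: string;       // comma-separated, e.g. "completed,failed"
  output_type?: string;  // "image" | "video" | "audio" (cloud also takes "3d")
  workflow_id?: string;
  sort_by?: string;      // "create_time" (default) | "execution_time"
  limit?: number;
}

function buildJobsQuery(options: ListJobsOptions): string {
  const params = new URLSearchParams();
  for (const [key, value] of Object.entries(options)) {
    if (value !== undefined) params.set(key, String(value));
  }
  const query = params.toString();
  return query ? `/api/jobs?${query}` : "/api/jobs";
}
```
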

Related

Notion ticket: COM-13971

@comfyui-wiki
Member

Is this PR good to merge now?

christian-byrne and others added 4 commits on May 4, 2026 at 10:23
- Add Jobs API section with list/get endpoints documentation
- Update complete example to use /api/jobs/{id} instead of /api/history_v2
- Add note about legacy endpoints being maintained for compatibility
- Update both English and Chinese examples

Amp-Thread-ID: https://ampcode.com/threads/T-019c0c6f-94c9-70a9-90c7-dee4048a242d
Co-authored-by: Amp <amp@ampcode.com>
- Add /api/jobs and /api/jobs/{job_id} to routes table with (recommended) label
- Mark /history and /queue GET endpoints as (legacy)
- Add detailed Jobs API section with parameters, response examples
- Update both English and Chinese translations

Amp-Thread-ID: https://ampcode.com/threads/T-019c0c6f-94c9-70a9-90c7-dee4048a242d
Co-authored-by: Amp <amp@ampcode.com>
@christian-byrne force-pushed the docs/recommend-jobs-api branch from 1ae3a27 to 6960c75 (May 4, 2026 17:24)
@christian-byrne
Contributor Author

Just rebased onto latest main and resolved the merge conflicts in snippets/cloud/complete-example.mdx and snippets/zh/cloud/complete-example.mdx (kept the /api/jobs/{id} recommendation, which is the whole point of this PR). PR is now MERGEABLE — should be good to merge once CI passes.

Contributor

@MillerMedia left a comment

Inline anchors below. Highest-signal:

  • comms_routes.mdx and api-reference.mdx disagree on sort_by field names. The cloud API accepts [create_time, execution_time]; comms_routes.mdx invented created_at and execution_duration. One side returns 400 / silent ignore.
  • output_type TypeScript signature drops a fourth value (3d) that the cloud API actually accepts.
  • create_time example is shown as 10-digit seconds while execution_*_time in the same payload is 13-digit ms — internally inconsistent regardless of which is right.
  • The workflow field has a [REDACTED] round-trip footgun that's undocumented here — a real surprise for anyone reusing returned workflows in /api/prompt.

Broader gaps not anchored to a single line:

  • The comms_routes.mdx Jobs API subsection mixes cloud-only fields/filters (e.g. workflow_id response field, the cloud status enum) with OSS-shared behavior, without distinguishing them. A self-hoster following the example response will see fields their server doesn't return.
  • The cloud Jobs API also supports cancellation (POST /api/jobs/{job_id}/cancel), not documented here. The legacy Note in api-reference.mdx claims the Jobs API replaces /api/queue (the cloud cancel surface), so cancellation belongs in the cloud reference.
  • The cloud list/detail responses include a structured error object for failed jobs that's not in the TS interfaces or examples.
  • No 401/403/404 documentation. The cloud API distinguishes "wrong id" from "not yours" via different status codes — readers can't tell which is which without the docs covering it.
  • Chinese mirrors (zh/development/cloud/api-reference.mdx, zh/development/comfyui-server/comms_routes.mdx, snippets/zh/cloud/complete-example.mdx) replicate every issue above. Localization is also half-done: console.log / print string literals stayed English.
  • The PR body's Related: COM-13971 leaks an internal-only Notion ticket on a public docs PR.

| Parameter | Type | Description |
|-----------|------|-------------|
| `status` | string | Filter by status (comma-separated): `pending`, `in_progress`, `completed`, `failed`, `cancelled` |
| `workflow_id` | string | Filter by workflow ID |
| `sort_by` | string | Sort field: `created_at` (default), `execution_duration` |
Contributor

@MillerMedia May 5, 2026

sort_by values contradict api-reference.mdx in this same PR (and the actual API).

The live cloud API accepts sort_by values create_time (default) and execution_time; api-reference.mdx (lines 472, 501, 540) uses those values. This row says created_at and execution_duration, neither of which is accepted server-side.

A reader following this table will send sort_by=created_at and get either a 400 or a silently-ignored sort. Should be:

| `sort_by` | string | Sort field: `create_time` (default), `execution_time` |
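A small client-side guard makes the corrected values hard to get wrong. The two accepted strings come from this review; the guard itself is only a sketch:

```typescript
// Accepted sort_by values per api-reference.mdx in this PR.
type SortBy = "create_time" | "execution_time";

const SORT_BY_VALUES = ["create_time", "execution_time"] as const;

// Rejects the invented values (created_at, execution_duration) client-side
// instead of producing a 400 or a silently ignored sort server-side.
function isValidSortBy(value: string): value is SortBy {
  return (SORT_BY_VALUES as readonly string[]).includes(value);
}
```
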

Comment on lines +80 to +84
"create_time": 1706540000,
"workflow_id": "workflow-uuid",
"outputs_count": 2,
"preview_output": {"filename": "output.png", "type": "output"},
"execution_start_time": 1706540001000,
Contributor

@MillerMedia May 5, 2026

Timestamp units mismatched in the same payload, and create_time is the wrong unit.

The API actually returns create_time in milliseconds (consistent with execution_start_time and execution_end_time). The example here shows 1706540000 (10 digits = seconds), while execution_start_time in the same object shows 1706540001000 (13 digits = milliseconds).

Clients parsing this as an example will either:

  • Treat both fields the same and be off by 1000×, or
  • Treat create_time as ms (matching the actual response) and display 1970 dates.

Fix the example to use 13-digit ms throughout: "create_time": 1706540000000.
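The failure mode is easy to demonstrate with a defensive parser. This is a sketch that assumes the review is right that all cloud timestamps are 13-digit milliseconds; the heuristic threshold is my own:

```typescript
// Normalize a cloud timestamp to a Date. Values below ~1e12 are almost
// certainly seconds (10 digits), so scale them up rather than producing
// a date in January 1970.
function parseCloudTimestamp(value: number): Date {
  return new Date(value < 1e12 ? value * 1000 : value);
}
```

With this, 1706540000 (seconds) and 1706540000000 (milliseconds) resolve to the same late-January-2024 instant instead of being off by 1000×.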


async function listJobs(options: {
status?: string;
output_type?: "image" | "video" | "audio";
Contributor

@MillerMedia May 5, 2026

output_type enum is missing "3d".

The live API accepts a fourth value, "3d", in addition to image | video | audio. The TypeScript signature here drops it, so anyone copy-pasting this type into their own SDK will get a compile error the moment they try to filter for 3D outputs.

-  output_type?: "image" | "video" | "audio";
+  output_type?: "image" | "video" | "audio" | "3d";

The Python docstring on line 539 has the same issue.

if (options.limit !== undefined) params.set("limit", String(options.limit));

const response = await fetch(`${BASE_URL}/api/jobs?${params}`, {
headers: getHeaders(),
Contributor

getHeaders() / get_headers() are referenced but never defined.

The TypeScript example on this line calls getHeaders(); the Python equivalent on line 561 calls get_headers(); the curl block above (line 465) uses -H "X-API-Key: $COMFY_CLOUD_API_KEY". Three different auth conventions in one section, and the two helper-function ones are phantom — nothing in the diff (or earlier on the page) introduces them.

A developer copy-pasting these snippets gets a ReferenceError/NameError. Either define the helpers up front in the page (a one-liner that returns { "X-API-Key": ... }) or inline the header inside each snippet so it matches the curl block.

Same issue at line 600 (getJobDetails) and line 625 (get_job_details).
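One way to resolve this is to define the helper up front, matching the curl block's X-API-Key convention. The env var name COMFY_CLOUD_API_KEY comes from the curl example; the helper body itself is an assumption, not existing docs content:

```typescript
// A possible definition of the phantom helper: read the key from the same
// env var the curl block uses and send it as X-API-Key.
function getHeaders(): Record<string, string> {
  const apiKey = process.env.COMFY_CLOUD_API_KEY;
  if (!apiKey) throw new Error("COMFY_CLOUD_API_KEY is not set");
  return { "X-API-Key": apiKey };
}
```
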

interface JobDetailResponse {
id: string;
status: "pending" | "in_progress" | "completed" | "failed" | "cancelled";
workflow?: Record<string, any>;
Contributor

@MillerMedia May 5, 2026

workflow is typed as Record<string, any> and hides a real footgun.

When the API returns a workflow, sensitive credentials inside extra_data are redacted before the response is sent. Specifically: if the original submission contained extra_data.api_key_comfy_org, the returned value of that field is replaced with the literal string "[REDACTED]". The field is preserved (not removed), so existence checks still pass, but the value is not usable.

A developer reading these docs and round-tripping a returned workflow back into /api/prompt will silently submit "[REDACTED]" as their API key and then debug a confusing auth failure. This is exactly the case where the type comment matters more than the type itself.

At minimum, add a short callout above this interface: "Returned workflows have extra_data.api_key_comfy_org replaced with "[REDACTED]" — strip or replace this field before resubmitting."
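Beyond the callout, the docs could show a defensive helper. A sketch, with field names taken from this review (extra_data.api_key_comfy_org, the "[REDACTED]" sentinel); the workflow shape is otherwise assumed:

```typescript
// Strip the redacted credential before resubmitting a returned workflow to
// /api/prompt, so the server never sees "[REDACTED]" as an API key.
function sanitizeReturnedWorkflow(
  workflow: Record<string, any>,
): Record<string, any> {
  const copy = structuredClone(workflow);
  if (copy.extra_data?.api_key_comfy_org === "[REDACTED]") {
    // Remove the field entirely rather than blanking it, so existence
    // checks fail fast instead of passing with an unusable value.
    delete copy.extra_data.api_key_comfy_org;
  }
  return copy;
}
```
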

}

// Get full job details including outputs
const job = await getJobDetails(promptId);
Contributor

@MillerMedia May 5, 2026

promptId is undefined, and the example silently equates prompt-id with job-id.

promptId (here) and prompt_id (Python, line 631) aren't introduced anywhere on the page. The endpoint is /api/jobs/{job_id} and the path parameter is job_id, but the example passes a variable called promptId — the only reason this works is that the cloud platform happens to use the same UUID for both.

This page never says they're the same value. A developer who got their prompt id from /api/prompt will reasonably assume they need to do a separate lookup to get a job id. Either:

  • Rename the variable to jobId/job_id and show how to obtain it (e.g. const jobId = (await fetch('/api/prompt'…)).prompt_id), or
  • Add a one-line callout: "The prompt_id returned by /api/prompt is the same as job_id."

(The JSON example response above also uses "id": "prompt-uuid" rather than a real UUID, which compounds the confusion — the id field is a UUID in the actual response.)
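The first option can be sketched in a few lines. The equivalence of prompt_id and job_id is the reviewer's claim, and the helper names are hypothetical:

```typescript
// On the cloud platform the prompt_id returned by /api/prompt doubles as
// the Jobs API job_id, so no separate lookup is needed.
function jobIdFromPromptResponse(response: { prompt_id: string }): string {
  return response.prompt_id;
}

function jobDetailUrl(jobId: string): string {
  return `/api/jobs/${encodeURIComponent(jobId)}`;
}
```
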

</CodeGroup>

<Note>
**Legacy Endpoints:** The `/api/history`, `/api/history_v2`, and `/api/queue` endpoints are maintained for compatibility with local ComfyUI but the Jobs API is recommended for new integrations.
Contributor

@MillerMedia May 5, 2026

The compatibility justification is factually wrong, and undersells the actual deprecation status.

Two issues:

  1. "maintained for compatibility with local ComfyUI" — local ComfyUI's API surface is /history, /history/{prompt_id}, /queue. There is no /api/ prefix and no _v2 variant. /api/history_v2 is a cloud-only endpoint that has no analog in local ComfyUI. So the stated reason for keeping the legacy surface doesn't hold.

  2. "maintained" is softer than reality. These endpoints are deprecated, and some of them already return a 404 that redirects callers to /api/jobs/{prompt_id} instead of serving the legacy response. A reader of this Note may build new integrations on endpoints that already 404.

Suggested rewrite:

Legacy Endpoints: /api/history, /api/history_v2, and /api/queue are deprecated and may already return 404 — new integrations must use the Jobs API. Existing integrations should migrate; see migration mapping for field equivalences.
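The migration mapping that rewrite points at does not exist yet; a sketch of what it might contain, with legacy-to-Jobs-API pairs inferred from this review rather than confirmed equivalences:

```typescript
// Candidate endpoint mapping for the suggested migration table.
const LEGACY_TO_JOBS_API: Record<string, string> = {
  "/api/history": "/api/jobs",
  "/api/history_v2/{prompt_id}": "/api/jobs/{job_id}",
  "/api/queue": "/api/jobs?status=pending,in_progress",
  // Cancellation moves from the /api/queue POST surface to
  // POST /api/jobs/{job_id}/cancel (per this review).
};
```
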
