Crawler support to automatically create/update package yaml files #156

@kmturley

Description

Currently, yaml files are added and updated manually via pull requests. This requires substantial time to ensure the files are formatted correctly and the links are valid.

Crawlers such as https://crawlee.dev allow scripts to read an API or webpage and extract the data needed.

A crawler could be run in a pipeline or manually as a developer step. Either way, crawlers can check many sites/pages and automate the work required to create the data files.
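As a minimal sketch of the extraction step, the snippet below pulls a plugin name and download link out of a hardcoded page fragment. The markup, selectors, and field names are all hypothetical; a real crawler (e.g. Crawlee's CheerioCrawler) would fetch live pages and use proper DOM selectors rather than regexes.

```typescript
// A hypothetical plugin page snippet; this markup is an assumption
// for illustration, not any real site's structure.
const html = `
  <h1 class="plugin-name">Example Plugin</h1>
  <a class="download" href="https://example.com/plugin-1.0.0.zip">Download</a>
`;

// Extract the fields a yaml package file would need from the page text.
function extract(page: string): { name: string; url: string } {
  const name = page.match(/class="plugin-name">([^<]+)</)?.[1] ?? '';
  const url = page.match(/href="([^"]+\.zip)"/)?.[1] ?? '';
  return { name, url };
}

const meta = extract(html);
console.log(meta);
```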

Example workflow:
(workflow diagram)

This would allow crawlers to generate yaml files, which could then be submitted via PRs and go through the existing validation/virus-scanning checks.
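The yaml-generation step could then turn the extracted metadata into a package file ready to commit. A sketch under stated assumptions: the field names below are hypothetical, not the repository's actual schema, and a real pipeline would likely serialize with a library such as js-yaml instead of hand-building the string.

```typescript
// Hypothetical shape of the metadata a crawler would extract
// (field names are assumptions, not the repository's actual schema).
interface PackageMeta {
  name: string;
  author: string;
  homepage: string;
  date: string;
  tags: string[];
}

// Serialize the extracted metadata into a simple yaml document.
function toYaml(meta: PackageMeta): string {
  const lines: string[] = [
    `name: ${meta.name}`,
    `author: ${meta.author}`,
    `homepage: ${meta.homepage}`,
    `date: ${meta.date}`,
    'tags:',
    ...meta.tags.map((t) => `  - ${t}`),
  ];
  return lines.join('\n') + '\n';
}

// Example: data a crawler might have scraped from a plugin page.
const scraped: PackageMeta = {
  name: 'Example Plugin',
  author: 'Example Author',
  homepage: 'https://example.com/plugin',
  date: '2024-01-01',
  tags: ['Effect', 'Reverb'],
};

const yamlText = toYaml(scraped);
console.log(yamlText);
```

The generated text would be written to a file under the repository's package directory and submitted as a PR, at which point the normal validation and virus-scanning checks apply.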

Metadata

Labels: enhancement (New feature or request)
Status: In progress
Milestone: none