feat: add @deck.gl-community/visgl-skills — AI agent skills for deck.gl#534
Conversation
…oodles, and DeckBuilder Co-authored-by: charlieforward9 <62311337+charlieforward9@users.noreply.github.com>
Happy to engage on this.
I've been getting very deep into the noodles.gl AI assistant, which also uses a JSON format for linking config to layer schemas. I've also been inspired by @ryanbaumann's 2025 summit session. It feels like there should be a discussion between what @chrisgervang and @akre54 are doing with the node operators in Noodles and your idea of adding Zod schemas directly to deck. I got started with googlemaps/platform-ai#63 to explore the AI map language.
Thanks for the ping @charlieforward9! To give a bit of context, one of the original goals of Noodles.gl's JSON format was to encode program structure and type information to drive UI (like generating a color picker for color types) and to enable complex data pipelines (filtering, aggregating, joining) that the standard deck.gl JSON schema doesn't natively support. Those goals are still interesting, but they're somewhat orthogonal to what this PR is trying to do.

On the broader question of LLM ergonomics: I'm not sure the skills/intermediate-representation approach is the right direction these days. The consensus seems to be that agentic coding tools like Claude Code work best when they can read clean documentation and write native code directly. A JSON layer in-between tends to add complexity that the model then has to reason around.

Building an intermediate JSON spec makes a lot of sense for UI-driven low-code tools. But for LLM code generation, what actually moves the needle is letting agents write pure TS backed by good type definitions and agent-facing docs, and that's probably the better path forward. Can you share more about the Zod direction? That seems compelling.
Adds a new module that handles both approaches discussed in #534:

- Pattern A (native TypeScript): typed factory functions for all common layer types with sensible defaults, backed by llms.txt reference docs for LLM code generation (addresses akre54's docs-first feedback)
- Pattern B (JSON descriptors): fully serializable layer configs with dot-path accessors, `validateDescriptor`, and `hydrateDescriptor` for low-code UIs and server-side LLM output (the noodle approach)

Also includes the `DeckBuilder` fluent compositor, viewport helpers (`fitViewport`, `getBoundingBox`), and a comprehensive llms.txt that serves as the single agent-facing reference for both patterns. 14/14 tests passing.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
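The Web Mercator zoom fit mentioned among the viewport helpers can be sketched in a few lines. This is a hypothetical illustration of the technique; the actual `fitViewport` in the module may use a different signature and handle more edge cases:

```typescript
// Hypothetical sketch of a Web Mercator zoom fit; the module's real
// fitViewport may differ in signature and details.
type Bounds = [minLng: number, minLat: number, maxLng: number, maxLat: number];

const TILE_SIZE = 512; // deck.gl uses 512px Web Mercator tiles

// Latitude -> normalized Mercator Y in [0, 1]
function latToMercatorY(lat: number): number {
  const rad = (lat * Math.PI) / 180;
  return (1 - Math.log(Math.tan(rad) + 1 / Math.cos(rad)) / Math.PI) / 2;
}

function fitViewport(bounds: Bounds, width: number, height: number, padding = 0) {
  const [minLng, minLat, maxLng, maxLat] = bounds;
  // Fractions of the world the bounding box spans at zoom 0
  const dx = (maxLng - minLng) / 360;
  const dy = Math.abs(latToMercatorY(maxLat) - latToMercatorY(minLat));
  // Largest zoom at which the box still fits each screen dimension
  const zoomX = Math.log2((width - 2 * padding) / TILE_SIZE / dx);
  const zoomY = Math.log2((height - 2 * padding) / TILE_SIZE / dy);
  return {
    longitude: (minLng + maxLng) / 2,
    latitude: (minLat + maxLat) / 2,
    zoom: Math.min(zoomX, zoomY),
  };
}
```

The key design point is taking the minimum of the two per-axis zooms so the whole bounding box stays visible.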
New module providing typed helpers that let AI coding agents (Claude Code, Openclaw, Copilot, etc.) construct deck.gl visualisations with minimal boilerplate — either via programmatic factory functions or fully JSON-serializable noodle descriptors.
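To make the factory-function side concrete, here is a minimal sketch of what a Pattern A factory could look like. The names, option shape, and defaults below are illustrative assumptions, not the module's exact API:

```typescript
// Hypothetical sketch of a typed layer factory; the real layer-skills.ts
// API may use different names and defaults.
interface LayerDescriptor {
  type: string;
  id: string;
  props: Record<string, unknown>;
}

interface ScatterplotOptions {
  id?: string;
  data: unknown[];
  getPosition?: string; // dot-path into each datum
  radiusMinPixels?: number;
  filled?: boolean;
}

// Returns a plain, serializable descriptor with sensible defaults filled in
function createScatterplot(options: ScatterplotOptions): LayerDescriptor {
  return {
    type: "ScatterplotLayer",
    id: options.id ?? "scatterplot",
    props: {
      data: options.data,
      getPosition: options.getPosition ?? "position",
      radiusMinPixels: options.radiusMinPixels ?? 2,
      filled: options.filled ?? true,
    },
  };
}
```

Because the factory returns plain data rather than a live layer instance, the same output can feed either code generation (Pattern A) or the JSON descriptor pipeline (Pattern B).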
Module structure
- `src/layer-skills.ts` — Factory functions returning plain `LayerDescriptor` objects for the six most common layer types (`ScatterplotLayer`, `PathLayer`, `PolygonLayer`, `TextLayer`, `IconLayer`, `HeatmapLayer`), all with typed options and sensible defaults
- `src/noodles.ts` — JSON-serializable layer recipe system; field accessors encoded as dot-path strings (`"meta.size"`) that `hydrateNoodle` resolves to runtime functions; `validateNoodle` provides pre-flight error reporting
- `src/deck-builder.ts` — Fluent `DeckBuilder` that composes layers + view state into a single `DeckConfig`
- `src/viewport-skills.ts` — `createViewState`, `getBoundingBox`, `fitViewport` (Web Mercator zoom fit)
- `src/data-skills.ts` — `createColorAccessor` (linear interpolation), `createRadiusAccessor`, `flattenGeoJSON`, `extractPositions`

Key design: noodles
The noodle system is the core AI-agent primitive. An agent emits pure data; the runtime hydrates it:
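A minimal sketch of how that hydration could work. The noodle shape and function names below are illustrative assumptions rather than the module's exact schema:

```typescript
// Hypothetical sketch of dot-path hydration; the real noodles.ts schema
// and hydrateNoodle signature may differ.
interface Noodle {
  layer: string;
  accessors: Record<string, string>; // prop name -> dot-path into each datum
}

// Resolve "meta.size" against { meta: { size: 3 } } -> 3
function resolvePath(obj: unknown, path: string): unknown {
  return path.split(".").reduce<unknown>(
    (o, key) => (o == null ? undefined : (o as Record<string, unknown>)[key]),
    obj
  );
}

// Turn pure-data accessor strings into the runtime functions deck.gl expects
function hydrateNoodle(noodle: Noodle) {
  const props: Record<string, (d: unknown) => unknown> = {};
  for (const [prop, path] of Object.entries(noodle.accessors)) {
    props[prop] = (d) => resolvePath(d, path);
  }
  return { layer: noodle.layer, props };
}
```

The agent-facing payload stays pure JSON (serializable, diffable, validatable server-side), and only the client-side hydration step introduces functions.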
No new runtime dependencies — peers only on `@deck.gl/core` and `@deck.gl/layers`.