diff --git a/CHANGELOG.md b/CHANGELOG.md
index 8b08dda44..2a197ed15 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -4,6 +4,7 @@
### Added
+- `slides create-from-markdown`: import slidey-flavored decks with per-slide YAML frontmatter (`layout:`, `content:`), `## Notes` speaker notes, Font Awesome icon shortcodes, mermaid diagrams, `::cols::`/`::col2::`/`::col3::`/`::right::` columns, and `::boxes::`/`::arrows::` icon-row blocks. New flags: `--fa-style`, `--mmdc`, `--strict`, `--keep-temp-images`, `--no-notes` — thanks @njreid.
- Calendar: add `calendar events --sort=start|end|summary|calendar` and `--order=asc|desc` so `--all` output can be returned chronologically across calendars instead of per-calendar API iteration order. Also documents `now` in the `--from`/`--to` help strings (already accepted by `timeparse`) — the relative form agents need when planning "from now on" — thanks @gado-ships-it.
- Calendar: add `calendar events --location` to include event locations in table output. Embedded newlines in the location string are collapsed so multi-line addresses still render on one row — thanks @gado-ships-it.
- Auth: add `gog auth import --client --email` with `--refresh-token-stdin`, `--refresh-token-file`, or `--refresh-token-env` for non-interactive token import without exposing secrets in argv — thanks @jcarnegie.
diff --git a/docs/commands/gog-slides-create-from-markdown.md b/docs/commands/gog-slides-create-from-markdown.md
index 69dc6aae0..538b2e814 100644
--- a/docs/commands/gog-slides-create-from-markdown.md
+++ b/docs/commands/gog-slides-create-from-markdown.md
@@ -28,15 +28,20 @@ gog slides (slide) create-from-markdown
[flags]
| `--disable-commands` | `string` | | Comma-separated list of disabled commands; dot paths allowed |
| `-n` `--dry-run` `--dryrun` `--noop` `--preview` | `bool` | | Do not make changes; print intended actions and exit successfully |
| `--enable-commands` | `string` | | Comma-separated list of enabled commands; dot paths allowed (restricts CLI) |
+| `--fa-style` | `string` | solid | Default Font Awesome style when shortcode has no prefix |
| `-y` `--force` `--assume-yes` `--yes` | `bool` | | Skip confirmations for destructive commands |
| `--gmail-no-send` | `bool` | false | Block Gmail send operations (agent safety) |
| `-h` `--help` | `kong.helpFlag` | | Show context-sensitive help. |
| `-j` `--json` `--machine` | `bool` | false | Output JSON to stdout (best for scripting) |
+| `--keep-temp-images` | `bool` | | Don't delete temporary Drive uploads after import |
+| `--mmdc` | `string` | mmdc | Path to mermaid CLI (mmdc); empty disables diagram rendering |
| `--no-input` `--non-interactive` `--noninteractive` | `bool` | | Never prompt; fail instead (useful for CI) |
+| `--no-notes` | `bool` | | Discard ## Notes sections instead of inserting as speaker notes |
| `--parent` | `string` | | Destination folder ID |
| `-p` `--plain` `--tsv` | `bool` | false | Output stable, parseable text to stdout (TSV; no colors) |
| `--results-only` | `bool` | | In JSON mode, emit only the primary result (drops envelope fields like nextPageToken) |
| `--select` `--pick` `--project` | `string` | | In JSON mode, select comma-separated fields (best-effort; supports dot paths). Desire path: use --fields for most commands. |
+| `--strict` | `bool` | | Treat skipped FA/diagram assets as fatal |
| `-v` `--verbose` | `bool` | | Enable verbose logging |
| `--version` | `kong.VersionFlag` | | Print version and exit |
| `--wrap-untrusted` | `bool` | false | In JSON/raw output, wrap fetched text fields in external untrusted-content markers |
diff --git a/docs/slides-markdown.md b/docs/slides-markdown.md
index c04879e7e..fc64e25fd 100644
--- a/docs/slides-markdown.md
+++ b/docs/slides-markdown.md
@@ -1,43 +1,112 @@
# Google Slides from Markdown
-`gog slides create-from-markdown` creates a new Google Slides deck from a small Markdown subset.
+`gog slides create-from-markdown` accepts both vanilla and slidey-flavored
+markdown. Slidey conventions are documented here.
+
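+A minimal invocation, reading a deck from a local file:
+
+```bash
+gog slides create-from-markdown "Roadmap" --content-file ./slides.md
+```
+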
+## Per-slide frontmatter
+
+Each slide may begin with a YAML frontmatter block. Recognized keys:
+
+| Key | Values | Behavior |
+|-----------|-------------------------------------------------------|----------|
+| `layout` | `title`, `hero`, `statement`, `center`, `default`, `two-cols`, `three-cols` | Picks the slide's visual treatment. Unknown values fall back to `default`. |
+| `content` | `wide`, `narrow` | Parsed but not yet applied (Slides has fixed text-box widths). |
-```bash
-gog slides create-from-markdown "Roadmap" --content-file ./slides.md
```
+---
+layout: hero
+---
+
+# univrs
-## File Structure
+Unfolding Nested Intent · Valid · Reliable · Safe
+```
-Separate slides with a line containing only `---`. Each slide needs a `##` heading; slides without a heading are ignored.
+A bare `---` line is a slide separator unless it opens a contiguous YAML
+header, i.e. key-value lines and a closing `---` with no intervening blank
+or body line.
-````markdown
-## Roadmap
+## Speaker notes
-- Ship auth migration
-- Polish backup restore
-- Review raw API PRs
+A trailing `## Notes` (or `### Notes`) section becomes the slide's speaker
+notes. The heading and everything after it are removed from the body. FA
+icon shortcodes inside notes are stripped to plain text.
----
+```
+## Topic
-## Launch Notes
+body
-Short paragraphs become body text.
+## Notes
----
+- speaker hint one
+- speaker hint two
+```
+
+## Font Awesome icons
+
+Inline shortcodes `:fa-name:`, `:fas-name:`, `:far-name:`, `:fab-name:`,
+`:fal-name:`, `:fad-name:` resolve to FA Free SVGs fetched from
+`cdn.jsdelivr.net`, rasterized locally with `rsvg-convert` or ImageMagick, and
+inserted as PNG images. If no SVG rasterizer is available, icons are skipped
+with a warning; `--strict` makes that fatal. Style derivation:
+
+| Prefix | Resolved style |
+|---------|----------------|
+| `fa-` | `--fa-style` (default `solid`) |
+| `fas-` | `solid` |
+| `far-` | `regular` |
+| `fab-` | `brands` |
+| `fal-`, `fad-` | `solid` (FA Free has no light/duotone) |
+
+Icons placed at the start of a bullet item render as a small inline image
+to the left of the bullet text. Mid-paragraph icons are dropped.
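+
+For example, a bullet list with leading icon shortcodes:
+
+```
+- :fa-truck-fast: Fast shipping
+- :fab-github: Source on GitHub
+- :far-clock: Timeline
+```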
+
+## Mermaid diagrams
-## CLI Example
+Fenced code blocks tagged `mermaid` are rendered to PNG via the local
+`mmdc` binary (configurable with `--mmdc`) and inserted as a full-width
+image. If `mmdc` is missing, the diagram is skipped with a warning;
+`--strict` makes it fatal.
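+
+For example, a slide containing a standard mermaid flowchart:
+
+````
+## Pipeline
+
+```mermaid
+graph LR
+  Parse --> Render --> Upload
+```
+````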
-```text
-gog auth doctor --check
+## Multi-column layouts
+
+```
+::cols::
+
+left column markdown
+
+::col2::
+
+middle / right column markdown
+
+::col3::
+
+third column markdown
+
+::/cols::
```
-````
-## Supported Markdown
+`::right::` is accepted as a synonym for `::col2::` (slidey-style).
+In `layout: two-cols` and `layout: three-cols` slides, `::col2::`,
+`::col3::`, and `::right::` may also be used without an opening `::cols::`;
+the content after the slide title becomes the first column.
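+
+For example, a `two-cols` slide using `::right::` without an opening
+`::cols::`:
+
+```
+---
+layout: two-cols
+---
+
+## Compare
+
+left column content
+
+::right::
+
+right column content
+```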
-- `## Heading` becomes the slide title.
-- `- item` and `* item` become bullet lists.
-- Plain lines become body text.
-- Fenced code blocks become code text.
-- Inline emphasis markers such as `**bold**`, `_italic_`, and backticks are stripped to plain text.
+## ::boxes:: and ::arrows::
+
+```
+::boxes::
+:fa-rectangle-ad: Campaigns
+:fa-headset: Support Tickets
+::/boxes::
+
+::arrows::
+
+### Step One
+
+### Step Two
+
+::/arrows::
+```
-The command is intentionally layout-light: it creates title/body slides from text content. Use `slides create-from-template` when you need exact branding, placeholder replacement, or predesigned layouts.
+Both render as bulleted lists in the body. Boxes use bullet glyphs;
+arrows use `→`.
diff --git a/go.mod b/go.mod
index b55de350e..915651714 100644
--- a/go.mod
+++ b/go.mod
@@ -9,6 +9,7 @@ require (
github.com/muesli/termenv v0.16.0
github.com/stretchr/testify v1.11.1
github.com/yosuke-furukawa/json5 v0.1.1
+ github.com/yuin/goldmark v1.8.2
golang.org/x/net v0.53.0
golang.org/x/oauth2 v0.36.0
golang.org/x/term v0.42.0
diff --git a/go.sum b/go.sum
index 6669bbc12..623e64f06 100644
--- a/go.sum
+++ b/go.sum
@@ -82,6 +82,8 @@ github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
github.com/yosuke-furukawa/json5 v0.1.1 h1:0F9mNwTvOuDNH243hoPqvf+dxa5QsKnZzU20uNsh3ZI=
github.com/yosuke-furukawa/json5 v0.1.1/go.mod h1:sw49aWDqNdRJ6DYUtIQiaA3xyj2IL9tjeNYmX2ixwcU=
+github.com/yuin/goldmark v1.8.2 h1:kEGpgqJXdgbkhcOgBxkC0X0PmoPG1ZyoZ117rDVp4zE=
+github.com/yuin/goldmark v1.8.2/go.mod h1:ip/1k0VRfGynBgxOz0yCqHrbZXhcjxyuS66Brc7iBKg=
go.opentelemetry.io/auto/sdk v1.2.1 h1:jXsnJ4Lmnqd11kwkBV2LgLoFMZKizbCi5fNZ/ipaZ64=
go.opentelemetry.io/auto/sdk v1.2.1/go.mod h1:KRTj+aOaElaLi+wW1kO/DZRXwkF4C5xPbEe3ZiIhN7Y=
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.68.0 h1:0Qx7VGBacMm9ZENQ7TnNObTYI4ShC+lHI16seduaxZo=
diff --git a/internal/cmd/slides.go b/internal/cmd/slides.go
index fb1a2bcc1..aa6e98d7e 100644
--- a/internal/cmd/slides.go
+++ b/internal/cmd/slides.go
@@ -14,9 +14,6 @@ import (
"github.com/steipete/gogcli/internal/ui"
)
-// Debug flag for slides creation
-var debugSlides = false
-
var newSlidesService = googleapi.NewSlides
type SlidesCmd struct {
@@ -192,11 +189,16 @@ func (c *SlidesCreateCmd) Run(ctx context.Context, flags *RootFlags) error {
}
type SlidesCreateFromMarkdownCmd struct {
- Title string `arg:"" name:"title" help:"Presentation title"`
- Content string `name:"content" help:"Markdown content (inline)"`
- ContentFile string `name:"content-file" help:"Read markdown content from file"`
- Parent string `name:"parent" help:"Destination folder ID"`
- Debug bool `name:"debug" help:"Show debug output"`
+ Title string `arg:"" name:"title" help:"Presentation title"`
+ Content string `name:"content" help:"Markdown content (inline)"`
+ ContentFile string `name:"content-file" help:"Read markdown content from file"`
+ Parent string `name:"parent" help:"Destination folder ID"`
+ Debug bool `name:"debug" help:"Show debug output"`
+ FAStyle string `name:"fa-style" help:"Default Font Awesome style when shortcode has no prefix" default:"solid"`
+ MMDC string `name:"mmdc" help:"Path to mermaid CLI (mmdc); empty disables diagram rendering" default:"mmdc"`
+ Strict bool `name:"strict" help:"Treat skipped FA/diagram assets as fatal"`
+ KeepTempImages bool `name:"keep-temp-images" help:"Don't delete temporary Drive uploads after import"`
+ NoNotes bool `name:"no-notes" help:"Discard ## Notes sections instead of inserting as speaker notes"`
}
func (c *SlidesCreateFromMarkdownCmd) Run(ctx context.Context, flags *RootFlags) error {
@@ -206,7 +208,6 @@ func (c *SlidesCreateFromMarkdownCmd) Run(ctx context.Context, flags *RootFlags)
return usage("empty title")
}
- // Get markdown content
var markdown string
var err error
switch {
@@ -214,7 +215,7 @@ func (c *SlidesCreateFromMarkdownCmd) Run(ctx context.Context, flags *RootFlags)
var data []byte
data, err = os.ReadFile(c.ContentFile)
if err != nil {
- return fmt.Errorf("failed to read content file: %w", err)
+ return fmt.Errorf("read content file: %w", err)
}
markdown = string(data)
case c.Content != "":
@@ -223,21 +224,39 @@ func (c *SlidesCreateFromMarkdownCmd) Run(ctx context.Context, flags *RootFlags)
return usage("either --content or --content-file is required")
}
+ parsed, err := ParseMarkdownToSlides(markdown, ParseOptions{DefaultFAStyle: c.FAStyle})
+ if err != nil {
+ return fmt.Errorf("parse markdown: %w", err)
+ }
+ if len(parsed) == 0 {
+ return fmt.Errorf("no slides found in markdown")
+ }
if c.Debug {
- debugSlides = true
+ fmt.Fprintf(os.Stderr, "parsed %d slides\n", len(parsed))
}
- parsedSlides := ParseMarkdownToSlides(markdown)
- if len(parsedSlides) == 0 {
- return fmt.Errorf("no slides found in markdown")
+ pipelineCfg := DefaultAssetPipelineConfig()
+ pipelineCfg.MMDCPath = c.MMDC
+ pipelineCfg.Strict = c.Strict
+ pipelineCfg.KeepTempImages = c.KeepTempImages
+ pipelineCfg.DefaultFAStyle = c.FAStyle
+
+ opts := CreatePresentationFromMarkdownOptions{
+ Title: title,
+ Parent: c.Parent,
+ Slides: parsed,
+ Pipeline: pipelineCfg,
+ NoNotes: c.NoNotes,
}
- if dryRunErr := dryRunExit(ctx, flags, "slides.create-from-markdown", map[string]any{
- "title": title,
- "slides": len(parsedSlides),
- "parent": strings.TrimSpace(c.Parent),
- "content_file": strings.TrimSpace(c.ContentFile),
- }); dryRunErr != nil {
- return dryRunErr
+ if flags.DryRun {
+ return dryRunExit(ctx, flags, "slides.create-from-markdown", map[string]any{
+ "title": title,
+ "slides": len(parsed),
+ "parent": strings.TrimSpace(c.Parent),
+ "content_file": strings.TrimSpace(c.ContentFile),
+ "no_notes": c.NoNotes,
+ "batch_update": buildSlideyDryRunBatchUpdate(parsed),
+ })
}
account, err := requireAccount(flags)
@@ -245,44 +264,24 @@ func (c *SlidesCreateFromMarkdownCmd) Run(ctx context.Context, flags *RootFlags)
return err
}
- // Create Slides service
slidesSvc, err := newSlidesService(ctx, account)
if err != nil {
return err
}
-
- // Create presentation from markdown
- presentation, err := CreatePresentationFromMarkdown(title, markdown, slidesSvc)
+ driveSvc, err := newDriveService(ctx, account)
if err != nil {
return err
}
- // Move to parent folder if specified
- if c.Parent != "" {
- var parentDriveSvc *drive.Service
- parentDriveSvc, err = newDriveService(ctx, account)
- if err != nil {
- return err
- }
+ opts.SlidesService = slidesSvc
+ opts.DriveService = driveSvc
- _, err = parentDriveSvc.Files.Update(presentation.PresentationId, &drive.File{}).
- AddParents(c.Parent).
- SupportsAllDrives(true).
- Context(ctx).
- Do()
- if err != nil {
- return fmt.Errorf("failed to move presentation to folder: %w", err)
- }
- }
-
- // Get presentation link
- var driveSvc *drive.Service
- driveSvc, err = newDriveService(ctx, account)
+ created, err := CreatePresentationFromMarkdownV2(ctx, opts)
if err != nil {
return err
}
- file, err := driveSvc.Files.Get(presentation.PresentationId).
+ file, err := driveSvc.Files.Get(created.PresentationId).
Fields("id, name, webViewLink").
SupportsAllDrives(true).
Context(ctx).
@@ -292,17 +291,23 @@ func (c *SlidesCreateFromMarkdownCmd) Run(ctx context.Context, flags *RootFlags)
}
if outfmt.IsJSON(ctx) {
+ presentation, err := slidesSvc.Presentations.Get(created.PresentationId).Context(ctx).Do()
+ if err != nil {
+ return err
+ }
return outfmt.WriteJSON(ctx, os.Stdout, map[string]any{
"presentation": presentation,
"file": file,
})
}
- u.Out().Printf("Created presentation with %d slides", len(ParseMarkdownToSlides(markdown)))
- u.Out().Printf("id\t%s", presentation.PresentationId)
- u.Out().Printf("name\t%s", file.Name)
- if file.WebViewLink != "" {
- u.Out().Printf("link\t%s", file.WebViewLink)
+ if created != nil {
+ u.Out().Printf("Created presentation with %d slides", len(parsed))
+ u.Out().Printf("id\t%s", created.PresentationId)
+ u.Out().Printf("name\t%s", file.Name)
+ if file.WebViewLink != "" {
+ u.Out().Printf("link\t%s", file.WebViewLink)
+ }
}
return nil
}
diff --git a/internal/cmd/slides_assets.go b/internal/cmd/slides_assets.go
new file mode 100644
index 000000000..358f9feef
--- /dev/null
+++ b/internal/cmd/slides_assets.go
@@ -0,0 +1,411 @@
+package cmd
+
+import (
+ "bytes"
+ "context"
+ "fmt"
+ "io"
+ "net/http"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "strings"
+ "time"
+
+ "google.golang.org/api/drive/v3"
+)
+
+// AssetMap pairs parsed AST references with uploaded Drive ImageRefs.
+// Icons is keyed by IconRef value (Style+Name); Diagrams is keyed by
+// DiagramBlock.ID.
+type AssetMap struct {
+ Icons map[IconRef]ImageRef
+ Diagrams map[string]ImageRef
+}
+
+// NewAssetMap returns an empty initialized AssetMap.
+func NewAssetMap() AssetMap {
+ return AssetMap{
+ Icons: map[IconRef]ImageRef{},
+ Diagrams: map[string]ImageRef{},
+ }
+}
+
+// AssetPipelineConfig holds the runtime knobs for the pipeline.
+type AssetPipelineConfig struct {
+ HTTPClient *http.Client
+ MMDCPath string
+ SVGRasterizerPath string
+ Strict bool
+ KeepTempImages bool
+ DefaultFAStyle string
+}
+
+// DefaultAssetPipelineConfig returns a config with sane defaults: 30s
+// HTTP timeout, mmdc on PATH, non-strict, no image retention.
+func DefaultAssetPipelineConfig() AssetPipelineConfig {
+ return AssetPipelineConfig{
+ HTTPClient: &http.Client{Timeout: 30 * time.Second},
+ MMDCPath: "mmdc",
+ SVGRasterizerPath: "",
+ Strict: false,
+ KeepTempImages: false,
+ DefaultFAStyle: "solid",
+ }
+}
+
+func faSVGURL(style, name string) string {
+ return fmt.Sprintf(
+ "https://cdn.jsdelivr.net/npm/@fortawesome/fontawesome-free@6/svgs/%s/%s.svg",
+ style, name,
+ )
+}
+
+func mmdcCommandArgs(mmdcPath, in, out string) []string {
+ return []string{mmdcPath, "-i", in, "-o", out, "-b", "transparent", "--scale", "2"}
+}
+
+func svgRasterizerArgs(binary, in, out string) []string {
+	switch filepath.Base(binary) {
+	case "magick", "convert":
+		return []string{binary, in, "-background", "none", "-resize", "128x128", out}
+	default:
+		return []string{binary, "-w", "128", "-h", "128", "-f", "png", "-o", out, in}
+	}
+}
+
+func findSVGRasterizer() (string, error) {
+ for _, candidate := range []string{"rsvg-convert", "magick", "convert"} {
+ if path, err := exec.LookPath(candidate); err == nil {
+ return path, nil
+ }
+ }
+ return "", fmt.Errorf("SVG rasterizer not found (install librsvg rsvg-convert or ImageMagick)")
+}
+
+// renderMermaidWithBinary writes source to a temp .mmd, runs mmdc, and
+// returns the rendered PNG bytes. The temp files are cleaned up.
+func renderMermaidWithBinary(ctx context.Context, mmdcPath, source string) ([]byte, error) {
+ dir, err := os.MkdirTemp("", "gogcli-mermaid-*")
+ if err != nil {
+ return nil, err
+ }
+ defer os.RemoveAll(dir)
+ in := filepath.Join(dir, "in.mmd")
+ out := filepath.Join(dir, "out.png")
+ if writeErr := os.WriteFile(in, []byte(source), 0o600); writeErr != nil {
+ return nil, writeErr
+ }
+ args := mmdcCommandArgs(mmdcPath, in, out)
+ cmd := exec.CommandContext(ctx, args[0], args[1:]...) // #nosec G204 — args constructed from validated config
+ output, err := cmd.CombinedOutput()
+ if err != nil {
+		// Surface stderr so the user can see why mmdc failed (Puppeteer
+		// Chromium download, Mermaid syntax error, etc.); a bare exit code
+		// is not actionable on its own.
+ trimmed := strings.TrimSpace(string(output))
+ if trimmed != "" {
+ return nil, fmt.Errorf("mmdc failed: %w: %s", err, trimmed)
+ }
+ return nil, fmt.Errorf("mmdc failed: %w", err)
+ }
+ return os.ReadFile(out) // #nosec G304 -- output path is inside a freshly-created temp directory.
+}
+
+func rasterizeSVGToPNG(ctx context.Context, svg []byte) ([]byte, error) {
+ rasterizer, err := findSVGRasterizer()
+ if err != nil {
+ return nil, err
+ }
+ return rasterizeSVGToPNGWith(ctx, svg, rasterizer)
+}
+
+func rasterizeSVGToPNGWithOptional(ctx context.Context, svg []byte, rasterizer string) ([]byte, error) {
+ if rasterizer != "" {
+ return rasterizeSVGToPNGWith(ctx, svg, rasterizer)
+ }
+ return rasterizeSVGToPNG(ctx, svg)
+}
+
+func rasterizeSVGToPNGWith(ctx context.Context, svg []byte, rasterizer string) ([]byte, error) {
+ dir, err := os.MkdirTemp("", "gogcli-svg-*")
+ if err != nil {
+ return nil, err
+ }
+ defer os.RemoveAll(dir)
+ in := filepath.Join(dir, "in.svg")
+ out := filepath.Join(dir, "out.png")
+ if writeErr := os.WriteFile(in, svg, 0o600); writeErr != nil {
+ return nil, writeErr
+ }
+ args := svgRasterizerArgs(rasterizer, in, out)
+ cmd := exec.CommandContext(ctx, args[0], args[1:]...) // #nosec G204 -- tool path and args are controlled by local config.
+ output, err := cmd.CombinedOutput()
+ if err != nil {
+ trimmed := strings.TrimSpace(string(output))
+ if trimmed != "" {
+ return nil, fmt.Errorf("rasterize SVG: %w: %s", err, trimmed)
+ }
+ return nil, fmt.Errorf("rasterize SVG: %w", err)
+ }
+ return os.ReadFile(out) // #nosec G304 -- output path is inside a freshly-created temp directory.
+}
+
+func fetchFAIconFromURL(ctx context.Context, client *http.Client, url string) ([]byte, error) {
+ req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
+ if err != nil {
+ return nil, err
+ }
+ resp, err := client.Do(req)
+ if err != nil {
+ return nil, fmt.Errorf("fetch %s: %w", url, err)
+ }
+ defer resp.Body.Close()
+ if resp.StatusCode != http.StatusOK {
+ return nil, fmt.Errorf("fetch %s: HTTP %d", url, resp.StatusCode)
+ }
+ return io.ReadAll(resp.Body)
+}
+
+// Uploader abstracts the Drive operations the pipeline needs. The real
+// implementation (Task 14) wraps drive.Service; tests use fakeDriveUploader.
+type Uploader interface {
+ UploadAsset(ctx context.Context, name, mime string, body []byte) (ImageRef, error)
+ DeleteAsset(ctx context.Context, fileID string) error
+}
+
+// AssetPipeline resolves all FA icon and mermaid diagram references in a
+// slice of Slides into ImageRefs by fetching/rendering them and uploading
+// to Drive via the Uploader.
+type AssetPipeline struct {
+ Config AssetPipelineConfig
+ Uploader Uploader
+
+ // uploaded tracks Drive file IDs created by this pipeline so Cleanup
+ // can delete them when --keep-temp-images is false.
+ uploaded []string
+}
+
+// Resolve walks all slides, collects unique IconRefs and DiagramBlocks,
+// fetches/renders/uploads each, and returns the resulting AssetMap.
+//
+// Per-asset failures are logged (warn-and-skip) unless Config.Strict.
+func (p *AssetPipeline) Resolve(ctx context.Context, slides []Slide) (AssetMap, error) {
+ am := NewAssetMap()
+
+ icons := collectIconRefs(slides)
+ diagrams := collectDiagrams(slides)
+
+ for ref := range icons {
+ body, resolvedStyle, err := fetchFAIconWithStyleFallback(ctx, p.Config.HTTPClient, ref)
+ if err != nil {
+ if p.Config.Strict {
+ return am, err
+ }
+ fmt.Fprintf(os.Stderr, "warning: skipping FA icon :%s-%s: %v\n", ref.Style, ref.Name, err)
+ continue
+ }
+ png, err := rasterizeSVGToPNGWithOptional(ctx, body, p.Config.SVGRasterizerPath)
+ if err != nil {
+ if p.Config.Strict {
+ return am, err
+ }
+ fmt.Fprintf(os.Stderr, "warning: skipping FA icon :%s-%s: %v\n", ref.Style, ref.Name, err)
+ continue
+ }
+ ir, err := p.Uploader.UploadAsset(ctx, fmt.Sprintf("fa-%s-%s.png", resolvedStyle, ref.Name), "image/png", png)
+ if err != nil {
+ if p.Config.Strict {
+ return am, err
+ }
+ fmt.Fprintf(os.Stderr, "warning: skipping FA icon :%s-%s: upload: %v\n", ref.Style, ref.Name, err)
+ continue
+ }
+ am.Icons[ref] = ir
+ p.uploaded = append(p.uploaded, ir.DriveFileID)
+ }
+
+ for blockID, source := range diagrams {
+ if p.Config.MMDCPath == "" {
+ if p.Config.Strict {
+ return am, fmt.Errorf("mmdc not configured; cannot render mermaid diagram %s", blockID)
+ }
+ fmt.Fprintf(os.Stderr, "warning: mmdc not configured; skipping mermaid diagram %s\n", blockID)
+ continue
+ }
+ png, err := renderMermaidWithBinary(ctx, p.Config.MMDCPath, source)
+ if err != nil {
+ if p.Config.Strict {
+ return am, err
+ }
+ fmt.Fprintf(os.Stderr, "warning: skipping mermaid diagram %s: %v\n", blockID, err)
+ continue
+ }
+ ir, err := p.Uploader.UploadAsset(ctx, blockID+".png", "image/png", png)
+ if err != nil {
+ if p.Config.Strict {
+ return am, err
+ }
+ fmt.Fprintf(os.Stderr, "warning: skipping mermaid diagram %s: upload: %v\n", blockID, err)
+ continue
+ }
+ am.Diagrams[blockID] = ir
+ p.uploaded = append(p.uploaded, ir.DriveFileID)
+ }
+
+ return am, nil
+}
+
+// Cleanup deletes every Drive file the pipeline uploaded, unless
+// Config.KeepTempImages is true.
+func (p *AssetPipeline) Cleanup(ctx context.Context) error {
+ if p.Config.KeepTempImages {
+ return nil
+ }
+ var firstErr error
+ for _, id := range p.uploaded {
+ if err := p.Uploader.DeleteAsset(ctx, id); err != nil && firstErr == nil {
+ firstErr = err
+ }
+ }
+ return firstErr
+}
+
+// collectIconRefs walks all slides, deduping IconRef values.
+func collectIconRefs(slides []Slide) map[IconRef]struct{} {
+ out := map[IconRef]struct{}{}
+ var walkBlocks func([]Block)
+ walkBlocks = func(blocks []Block) {
+ for _, b := range blocks {
+ switch v := b.(type) {
+ case ParagraphBlock:
+ if r, ok := leadingIcon(v.Inlines); ok {
+ out[r] = struct{}{}
+ }
+ case BulletsBlock:
+ for _, item := range v.Items {
+ if r, ok := leadingIcon(item.Inlines); ok {
+ out[r] = struct{}{}
+ }
+ }
+ case HeadingBlock:
+ if r, ok := leadingIcon(v.Inlines); ok {
+ out[r] = struct{}{}
+ }
+ case ColumnsBlock:
+ for _, col := range v.Columns {
+ walkBlocks(col)
+ }
+ case IconRowsBlock:
+ for _, row := range v.Rows {
+ if row.Icon != nil {
+ out[*row.Icon] = struct{}{}
+ }
+ }
+ }
+ }
+ }
+ for _, s := range slides {
+ walkBlocks(s.Body)
+ }
+ return out
+}
+
+func leadingIcon(inlines []Inline) (IconRef, bool) {
+ if len(inlines) == 0 {
+ return IconRef{}, false
+ }
+ ref, ok := inlines[0].(IconRef)
+ return ref, ok
+}
+
+// collectDiagrams walks all slides for DiagramBlocks, returning {ID: source}.
+func collectDiagrams(slides []Slide) map[string]string {
+ out := map[string]string{}
+ var walkBlocks func([]Block)
+ walkBlocks = func(blocks []Block) {
+ for _, b := range blocks {
+ switch v := b.(type) {
+ case DiagramBlock:
+ out[v.ID] = v.Source
+ case ColumnsBlock:
+ for _, col := range v.Columns {
+ walkBlocks(col)
+ }
+ }
+ }
+ }
+ for _, s := range slides {
+ walkBlocks(s.Body)
+ }
+ return out
+}
+
+// fetchFAIconWithStyleFallback fetches the SVG for ref. If the requested
+// style returns 404 (common for users who write `:fa-dev:` when "dev" is
+// only published under brands/), it tries the other free-tier styles in a
+// fixed order: brands, regular, solid. Returns the body, the style that
+// actually served, and the final error.
+func fetchFAIconWithStyleFallback(ctx context.Context, client *http.Client, ref IconRef) ([]byte, string, error) {
+ tried := map[string]bool{}
+ order := []string{ref.Style, "brands", "regular", "solid"}
+ var lastErr error
+ for _, style := range order {
+ if style == "" || tried[style] {
+ continue
+ }
+ tried[style] = true
+ body, err := fetchFAIconFromURL(ctx, client, faSVGURL(style, ref.Name))
+ if err == nil {
+ return body, style, nil
+ }
+ lastErr = err
+ // Only fall through on 404; other errors (network, 5xx) shouldn't
+ // trigger style guessing.
+ if !strings.Contains(err.Error(), "HTTP 404") {
+ return nil, ref.Style, err
+ }
+ }
+ return nil, ref.Style, lastErr
+}
+
+// DriveUploader implements Uploader by writing temporary files to Drive,
+// granting public read access, and reading the WebContentLink. Mirrors
+// the pattern in slides_add_slide.go.
+type DriveUploader struct {
+ Svc *drive.Service
+}
+
+func (d *DriveUploader) UploadAsset(ctx context.Context, name, mime string, body []byte) (ImageRef, error) {
+ created, err := d.Svc.Files.Create(&drive.File{
+ Name: name,
+ MimeType: mime,
+ }).Media(bytes.NewReader(body)).Fields("id, webContentLink").Context(ctx).Do()
+ if err != nil {
+ return ImageRef{}, fmt.Errorf("upload %s: %w", name, err)
+ }
+ if _, err := d.Svc.Permissions.Create(created.Id, &drive.Permission{
+ Type: "anyone",
+ Role: "reader",
+ }).Context(ctx).Do(); err != nil {
+ // Best-effort cleanup so a permission failure doesn't orphan the upload.
+ _ = d.Svc.Files.Delete(created.Id).Context(ctx).Do()
+ return ImageRef{}, fmt.Errorf("permission %s: %w", created.Id, err)
+ }
+ url := created.WebContentLink
+ if url == "" {
+ got, err := d.Svc.Files.Get(created.Id).Fields("webContentLink").Context(ctx).Do()
+ if err != nil {
+ _ = d.Svc.Files.Delete(created.Id).Context(ctx).Do()
+ return ImageRef{}, fmt.Errorf("get url for %s: %w", created.Id, err)
+ }
+ url = got.WebContentLink
+ }
+ return ImageRef{DriveFileID: created.Id, PublicURL: url}, nil
+}
+
+func (d *DriveUploader) DeleteAsset(ctx context.Context, fileID string) error {
+ return d.Svc.Files.Delete(fileID).Context(ctx).Do()
+}
diff --git a/internal/cmd/slides_assets_test.go b/internal/cmd/slides_assets_test.go
new file mode 100644
index 000000000..c392b3c6e
--- /dev/null
+++ b/internal/cmd/slides_assets_test.go
@@ -0,0 +1,212 @@
+package cmd
+
+import (
+ "context"
+ "fmt"
+ "io"
+ "net/http"
+ "net/http/httptest"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "runtime"
+ "strings"
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+ "github.com/stretchr/testify/require"
+)
+
+func TestFetchFAIcon_OK(t *testing.T) {
+ srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+ _, _ = io.WriteString(w, "")
+ }))
+ t.Cleanup(srv.Close)
+
+ body, err := fetchFAIconFromURL(context.Background(), srv.Client(), srv.URL+"/x.svg")
+ require.NoError(t, err)
+ assert.Equal(t, "", string(body))
+}
+
+func TestFetchFAIcon_404(t *testing.T) {
+ srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+ http.NotFound(w, r)
+ }))
+ t.Cleanup(srv.Close)
+
+ _, err := fetchFAIconFromURL(context.Background(), srv.Client(), srv.URL+"/x.svg")
+ require.Error(t, err)
+	assert.Contains(t, err.Error(), "404")
+}
+
+func TestFASVGURL(t *testing.T) {
+ cases := []struct {
+ style, name, expected string
+ }{
+ {"solid", "truck-fast", "https://cdn.jsdelivr.net/npm/@fortawesome/fontawesome-free@6/svgs/solid/truck-fast.svg"},
+ {"brands", "github", "https://cdn.jsdelivr.net/npm/@fortawesome/fontawesome-free@6/svgs/brands/github.svg"},
+ {"regular", "clock", "https://cdn.jsdelivr.net/npm/@fortawesome/fontawesome-free@6/svgs/regular/clock.svg"},
+ }
+ for _, tc := range cases {
+ assert.Equal(t, tc.expected, faSVGURL(tc.style, tc.name))
+ }
+}
+
+func TestMMDCCommandArgs(t *testing.T) {
+ args := mmdcCommandArgs("/usr/bin/mmdc", "/tmp/in.mmd", "/tmp/out.png")
+ assert.Equal(t, []string{"/usr/bin/mmdc", "-i", "/tmp/in.mmd", "-o", "/tmp/out.png", "-b", "transparent", "--scale", "2"}, args)
+}
+
+func TestSVGRasterizerArgs(t *testing.T) {
+ assert.Equal(t,
+ []string{"/usr/bin/rsvg-convert", "-w", "128", "-h", "128", "-f", "png", "-o", "/tmp/out.png", "/tmp/in.svg"},
+ svgRasterizerArgs("/usr/bin/rsvg-convert", "/tmp/in.svg", "/tmp/out.png"),
+ )
+ assert.Equal(t,
+ []string{"/opt/homebrew/bin/magick", "/tmp/in.svg", "-background", "none", "-resize", "128x128", "/tmp/out.png"},
+ svgRasterizerArgs("/opt/homebrew/bin/magick", "/tmp/in.svg", "/tmp/out.png"),
+ )
+}
+
+func TestRenderMermaid_BinaryMissing(t *testing.T) {
+ _, err := renderMermaidWithBinary(context.Background(), "/nonexistent/mmdc-binary", "graph TD\nA-->B")
+ require.Error(t, err)
+}
+
+func TestRasterizeSVGToPNGWithFakeRasterizer(t *testing.T) {
+ rasterizer := fakeSVGRasterizer(t)
+ png, err := rasterizeSVGToPNGWith(context.Background(), []byte(""), rasterizer)
+ require.NoError(t, err)
+ assert.Equal(t, []byte("PNG"), png)
+}
+
+type fakeDriveUploader struct {
+ uploaded []string // file IDs in upload order
+ deleted []string
+}
+
+func (f *fakeDriveUploader) UploadAsset(ctx context.Context, name, mime string, body []byte) (ImageRef, error) {
+ id := fmt.Sprintf("file-%d", len(f.uploaded)+1)
+ f.uploaded = append(f.uploaded, id)
+ return ImageRef{DriveFileID: id, PublicURL: "https://drive.example/" + id}, nil
+}
+
+func (f *fakeDriveUploader) DeleteAsset(ctx context.Context, id string) error {
+ f.deleted = append(f.deleted, id)
+ return nil
+}
+
+func fakeSVGRasterizer(t *testing.T) string {
+ t.Helper()
+ dir := t.TempDir()
+ name := "rsvg-convert"
+ if runtime.GOOS == "windows" {
+ name += ".exe"
+ }
+
+ src := filepath.Join(dir, "main.go")
+ code := `package main
+
+import (
+ "fmt"
+ "os"
+)
+
+func main() {
+ out := ""
+ for i := 1; i < len(os.Args); i++ {
+ if os.Args[i] == "-o" && i+1 < len(os.Args) {
+ out = os.Args[i+1]
+ break
+ }
+ }
+ if out == "" {
+ _, _ = fmt.Fprintln(os.Stderr, "missing -o")
+ os.Exit(2)
+ }
+ if err := os.WriteFile(out, []byte("PNG"), 0o600); err != nil {
+ _, _ = fmt.Fprintln(os.Stderr, err)
+ os.Exit(1)
+ }
+}
+`
+ require.NoError(t, os.WriteFile(src, []byte(code), 0o600))
+
+ path := filepath.Join(dir, name)
+ cmd := exec.CommandContext(t.Context(), "go", "build", "-o", path, src)
+ out, err := cmd.CombinedOutput()
+ require.NoError(t, err, string(out))
+ return path
+}
+
+func TestAssetPipeline_CollectsUniqueIcons(t *testing.T) {
+ cfg := DefaultAssetPipelineConfig()
+ cfg.SVGRasterizerPath = fakeSVGRasterizer(t)
+ cfg.HTTPClient = &http.Client{Transport: roundTripFunc(func(r *http.Request) (*http.Response, error) {
+ return &http.Response{StatusCode: 200, Body: io.NopCloser(strings.NewReader("")), Header: http.Header{}}, nil
+ })}
+ cfg.MMDCPath = "" // disable mmdc; no diagrams in test
+
+ uploader := &fakeDriveUploader{}
+ p := &AssetPipeline{Config: cfg, Uploader: uploader}
+
+ slides := []Slide{
+ {Body: []Block{ParagraphBlock{Inlines: []Inline{
+ IconRef{Style: "solid", Name: "truck-fast"},
+ TextRun{Text: " hello "},
+ IconRef{Style: "solid", Name: "truck-fast"}, // duplicate, should not re-upload
+ }}}},
+ {Body: []Block{IconRowsBlock{Kind: "boxes", Rows: []IconRow{
+ {Icon: &IconRef{Style: "brands", Name: "github"}, Text: "GitHub"},
+ }}}},
+ }
+
+ am, err := p.Resolve(context.Background(), slides)
+ require.NoError(t, err)
+ assert.Equal(t, 2, len(am.Icons), "two unique icons, no duplicates")
+ assert.Equal(t, 2, len(uploader.uploaded), "exactly two Drive uploads")
+}
+
+func TestAssetPipeline_StrictFailsWhenMMDCDisabled(t *testing.T) {
+ cfg := DefaultAssetPipelineConfig()
+ cfg.MMDCPath = ""
+ cfg.Strict = true
+
+ p := &AssetPipeline{Config: cfg, Uploader: &fakeDriveUploader{}}
+ slides := []Slide{{Body: []Block{
+ DiagramBlock{Kind: "mermaid", Source: "graph TD\nA-->B", ID: "block-1"},
+ }}}
+
+ _, err := p.Resolve(context.Background(), slides)
+ require.Error(t, err)
+ assert.Contains(t, err.Error(), "mmdc not configured")
+}
+
+func TestCollectIconRefs_OnlyLeadingParagraphAndHeadingIcons(t *testing.T) {
+ leading := IconRef{Style: "solid", Name: "file"}
+ mid := IconRef{Style: "solid", Name: "truck-fast"}
+
+ got := collectIconRefs([]Slide{{
+ Body: []Block{
+ HeadingBlock{Inlines: []Inline{leading, TextRun{Text: " Rethink"}}},
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "middle "}, mid}},
+ },
+ }})
+
+ assert.Contains(t, got, leading)
+ assert.NotContains(t, got, mid)
+}
+
+func TestAssetPipeline_Cleanup(t *testing.T) {
+ uploader := &fakeDriveUploader{}
+ p := &AssetPipeline{Config: DefaultAssetPipelineConfig(), Uploader: uploader}
+ uploader.uploaded = []string{"file-1", "file-2"}
+ p.uploaded = []string{"file-1", "file-2"}
+
+ require.NoError(t, p.Cleanup(context.Background()))
+ assert.Equal(t, []string{"file-1", "file-2"}, uploader.deleted)
+}
+
+// Compile-time assertion that *DriveUploader
+// satisfies the Uploader interface.
+var _ Uploader = (*DriveUploader)(nil)
diff --git a/internal/cmd/slides_e2e_test.go b/internal/cmd/slides_e2e_test.go
new file mode 100644
index 000000000..7ec5d9b86
--- /dev/null
+++ b/internal/cmd/slides_e2e_test.go
@@ -0,0 +1,90 @@
+package cmd
+
+import (
+ "context"
+ "os"
+ "path/filepath"
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+ "github.com/stretchr/testify/require"
+)
+
+func TestSlideyFixture_ParsesAndRenders(t *testing.T) {
+ path := filepath.Join("..", "..", "testdata", "slidey", "index.md")
+ data, err := os.ReadFile(path)
+ require.NoError(t, err)
+
+ parsed, err := ParseMarkdownToSlides(string(data), ParseOptions{})
+ require.NoError(t, err)
+ assert.GreaterOrEqual(t, len(parsed), 30, "fixture should produce ~30+ slides")
+
+	// Expect hero/title/statement, two-cols, three-cols, notes, FA icons, and a mermaid diagram.
+ var sawHero, sawTwoCols, sawThreeCols, sawNotes, sawIcon, sawDiagram bool
+ for _, s := range parsed {
+ switch s.Frontmatter.Layout {
+ case "hero", "title", "statement":
+ sawHero = true
+ case "two-cols":
+ sawTwoCols = true
+ case "three-cols":
+ sawThreeCols = true
+ }
+ if s.Notes != "" {
+ sawNotes = true
+ }
+ var walk func([]Block)
+ walk = func(blocks []Block) {
+ for _, b := range blocks {
+ switch v := b.(type) {
+ case ParagraphBlock:
+ for _, in := range v.Inlines {
+ if _, ok := in.(IconRef); ok {
+ sawIcon = true
+ }
+ }
+ case BulletsBlock:
+ for _, item := range v.Items {
+ for _, in := range item.Inlines {
+ if _, ok := in.(IconRef); ok {
+ sawIcon = true
+ }
+ }
+ }
+ case IconRowsBlock:
+ for _, row := range v.Rows {
+ if row.Icon != nil {
+ sawIcon = true
+ }
+ }
+ case ColumnsBlock:
+ for _, col := range v.Columns {
+ walk(col)
+ }
+ case DiagramBlock:
+ sawDiagram = true
+ }
+ }
+ }
+ walk(s.Body)
+ }
+ assert.True(t, sawHero, "fixture should contain a hero/title/statement slide")
+ assert.True(t, sawTwoCols, "fixture should contain a two-cols slide")
+ assert.True(t, sawThreeCols, "fixture should contain a three-cols slide")
+ assert.True(t, sawNotes, "fixture should contain ## Notes sections")
+ assert.True(t, sawIcon, "fixture should contain FA shortcodes")
+ assert.True(t, sawDiagram, "fixture should contain mermaid blocks")
+
+ // Renderer should produce a non-empty BatchUpdate plan with a fake asset map.
+ am := NewAssetMap()
+ for ref := range collectIconRefs(parsed) {
+ am.Icons[ref] = ImageRef{DriveFileID: "x", PublicURL: "https://example/x"}
+ }
+ for id := range collectDiagrams(parsed) {
+ am.Diagrams[id] = ImageRef{DriveFileID: "y", PublicURL: "https://example/y"}
+ }
+ reqs, notes := RenderSlides(parsed, am, defaultPageGeometry())
+ assert.NotEmpty(t, reqs)
+ assert.NotEmpty(t, notes)
+ _ = context.Background() // reserved for future
+}
diff --git a/internal/cmd/slides_formatter.go b/internal/cmd/slides_formatter.go
index d2468fef1..081d31be8 100644
--- a/internal/cmd/slides_formatter.go
+++ b/internal/cmd/slides_formatter.go
@@ -1,184 +1,613 @@
package cmd
import (
+ "context"
"fmt"
+ "os"
"strings"
+ "unicode/utf16"
+ "google.golang.org/api/drive/v3"
"google.golang.org/api/slides/v1"
)
-const slideElementTitle = "title"
+const (
+ slideLineHeightPT = 22
+ iconImageSizePT = 18
+ iconImageGutterPT = 24
+ diagramVisualLines = 6
+ maxDiagramHeightPT = 160
+)
-// SlidesToAPIRequests converts slide structures to Google Slides API batch update requests
-func SlidesToAPIRequests(slideData []Slide) ([]*slides.Request, map[int]string) {
- var requests []*slides.Request
- slideIDs := make(map[int]string)
+// SlideNotesPlan tells the second BatchUpdate which slide gets which
+// speaker-notes text. SlideIndex maps to the i-th slide created.
+type SlideNotesPlan struct {
+ SlideIndex int
+ SlideID string
+ Text string
+}
- for i, slide := range slideData {
- slideID := fmt.Sprintf("slide_%d", i+1)
- slideIDs[i] = slideID
+// RenderSlides converts a parsed Slide AST plus an AssetMap into the
+// initial BatchUpdate requests AND a notes plan to apply after the
+// presentation is created.
+func RenderSlides(in []Slide, assets AssetMap, g LayoutGeometry) ([]*slides.Request, []SlideNotesPlan) {
+ var reqs []*slides.Request
+ var notes []SlideNotesPlan
- // Create blank slide
- requests = append(requests, &slides.Request{
+ for i, slide := range in {
+ slideID := fmt.Sprintf("slide_%d", i+1)
+ reqs = append(reqs, &slides.Request{
CreateSlide: &slides.CreateSlideRequest{
- ObjectId: slideID,
- SlideLayoutReference: &slides.LayoutReference{
- PredefinedLayout: "BLANK",
- },
+ ObjectId: slideID,
+ SlideLayoutReference: &slides.LayoutReference{PredefinedLayout: "BLANK"},
},
})
- // Add title box
- titleID := fmt.Sprintf("title_%d", i+1)
- requests = append(requests, &slides.Request{
- CreateShape: &slides.CreateShapeRequest{
- ObjectId: titleID,
- ShapeType: "TEXT_BOX",
- ElementProperties: &slides.PageElementProperties{
- PageObjectId: slideID,
- Transform: &slides.AffineTransform{
- ScaleX: 1,
- ScaleY: 1,
- TranslateX: 72 * 0.5, // 0.5 inches from left
- TranslateY: 72 * 0.5, // 0.5 inches from top
- Unit: "PT",
- },
- Size: &slides.Size{
- Width: &slides.Dimension{Magnitude: 612 - 72, Unit: "PT"},
- Height: &slides.Dimension{Magnitude: 100, Unit: "PT"},
- },
- },
- },
- })
+ layout := MapSlideyLayout(slide.Frontmatter.Layout)
- // Add title text
- for _, elem := range slide.Elements {
- if elem.Type == slideElementTitle {
- requests = append(requests, &slides.Request{
- InsertText: &slides.InsertTextRequest{
- ObjectId: titleID,
- Text: elem.Content,
- InsertionIndex: 0,
- },
- })
+ // Title box (skipped for SectionHeader layouts — those put the
+ // title in the body box at large size; see Task 16).
+ if layout != LayoutKindSectionHeader && slide.Title != "" {
+ reqs = append(reqs, renderTitleBox(slideID, i+1, slide.Title, g)...)
+ }
- // Make title bold
- requests = append(requests, &slides.Request{
+ explicitColumnCount := explicitColumnsCount(slide.Body)
+ if (layout == LayoutKindDefault || layout == LayoutKindCenter) && explicitColumnCount > 0 {
+ if explicitColumnCount == 2 {
+ layout = LayoutKindTwoCols
+ } else {
+ layout = LayoutKindThreeCols
+ }
+ }
+
+ switch layout {
+ case LayoutKindSectionHeader:
+ // Body box is one large centered text box. Title is rendered
+ // inline at 44pt; everything else at the standard size.
+ bodyID := fmt.Sprintf("body_%d", i+1)
+ reqs = append(reqs, createTextBox(bodyID, slideID, SingleBodyBox(g)))
+ text := blocksToPlainText(slide.Body)
+ if text != "" {
+ reqs = append(reqs, &slides.Request{
+ InsertText: &slides.InsertTextRequest{ObjectId: bodyID, Text: text},
+ })
+ }
+ // Style first paragraph (the h1 line) at 44pt bold.
+ if firstLineLen := utf16CodeUnits(strings.SplitN(text, "\n", 2)[0]); firstLineLen > 0 {
+ reqs = append(reqs, &slides.Request{
UpdateTextStyle: &slides.UpdateTextStyleRequest{
- ObjectId: titleID,
+ ObjectId: bodyID,
TextRange: &slides.Range{
- Type: "ALL",
+ Type: "FIXED_RANGE",
+ StartIndex: int64Ptr(0),
+ EndIndex: int64Ptr(firstLineLen),
},
Style: &slides.TextStyle{
- Bold: true,
- FontSize: &slides.Dimension{
- Magnitude: 36,
- Unit: "PT",
- },
+ Bold: true,
+ FontSize: &slides.Dimension{Magnitude: 44, Unit: "PT"},
},
Fields: "bold,fontSize",
},
})
}
- }
-
- // Add body box
- bodyID := fmt.Sprintf("body_%d", i+1)
- requests = append(requests, &slides.Request{
- CreateShape: &slides.CreateShapeRequest{
- ObjectId: bodyID,
- ShapeType: "TEXT_BOX",
- ElementProperties: &slides.PageElementProperties{
- PageObjectId: slideID,
- Transform: &slides.AffineTransform{
- ScaleX: 1,
- ScaleY: 1,
- TranslateX: 72 * 0.5,
- TranslateY: 72 * 1.5, // Below title
- Unit: "PT",
+ if text != "" {
+ reqs = append(reqs, &slides.Request{
+ UpdateParagraphStyle: &slides.UpdateParagraphStyleRequest{
+ ObjectId: bodyID,
+ TextRange: &slides.Range{Type: "ALL"},
+ Style: &slides.ParagraphStyle{Alignment: "CENTER"},
+ Fields: "alignment",
},
- Size: &slides.Size{
- Width: &slides.Dimension{Magnitude: 612 - 72, Unit: "PT"},
- Height: &slides.Dimension{Magnitude: 300, Unit: "PT"},
+ })
+ }
+ reqs = appendBlockImages(reqs, slide.Body, slideID, assets, SingleBodyBox(g))
+ case LayoutKindTwoCols, LayoutKindThreeCols:
+ n := 2
+ if layout == LayoutKindThreeCols {
+ n = 3
+ }
+ boxes := ColumnBoxes(g, n)
+ // Find the first ColumnsBlock; if absent, fall back to splitting body evenly.
+ cols := findColumnsBlock(slide.Body, n)
+ for ci := 0; ci < n; ci++ {
+ colID := fmt.Sprintf("body_%d_col%d", i+1, ci+1)
+ reqs = append(reqs, createTextBox(colID, slideID, boxes[ci]))
+ text := blocksToPlainText(cols[ci])
+ if text != "" {
+ reqs = append(reqs, &slides.Request{
+ InsertText: &slides.InsertTextRequest{ObjectId: colID, Text: text},
+ })
+ }
+ reqs = appendBlockImages(reqs, cols[ci], slideID, assets, boxes[ci])
+ }
+ default:
+ // LayoutKindDefault, LayoutKindCenter — single body box.
+ bodyText := blocksToPlainText(slide.Body)
+ bodyID := fmt.Sprintf("body_%d", i+1)
+ reqs = append(reqs, createTextBox(bodyID, slideID, SingleBodyBox(g)))
+ if bodyText != "" {
+ reqs = append(reqs, &slides.Request{
+ InsertText: &slides.InsertTextRequest{ObjectId: bodyID, Text: bodyText},
+ })
+ }
+ if layout == LayoutKindCenter && bodyText != "" {
+ reqs = append(reqs, &slides.Request{
+ UpdateParagraphStyle: &slides.UpdateParagraphStyleRequest{
+ ObjectId: bodyID,
+ TextRange: &slides.Range{Type: "ALL"},
+ Style: &slides.ParagraphStyle{Alignment: "CENTER"},
+ Fields: "alignment",
},
- },
- },
- })
+ })
+ }
+ reqs = appendBlockImages(reqs, slide.Body, slideID, assets, SingleBodyBox(g))
+ }
- // Build body content
- var bodyContent strings.Builder
- for _, elem := range slide.Elements {
- if elem.Type != slideElementTitle {
- switch elem.Type {
- case "body":
- bodyContent.WriteString(elem.Content)
- bodyContent.WriteString("\n")
- case "bullets":
- for _, item := range elem.Items {
- bodyContent.WriteString("• ")
- bodyContent.WriteString(item)
- bodyContent.WriteString("\n")
- }
- case inlineTypeCode:
- bodyContent.WriteString("```\n")
- bodyContent.WriteString(elem.Content)
- bodyContent.WriteString("\n```\n")
+ if slide.Notes != "" {
+ notes = append(notes, SlideNotesPlan{SlideIndex: i, SlideID: slideID, Text: slide.Notes})
+ }
+ }
+ return reqs, notes
+}
+
+func appendBlockImages(reqs []*slides.Request, blocks []Block, slideID string, assets AssetMap, box BoxRect) []*slides.Request {
+ line := 0
+ for i, b := range blocks {
+ switch v := b.(type) {
+ case ParagraphBlock:
+ if icon, ok := leadingIcon(v.Inlines); ok {
+ if img, ok := assets.Icons[icon]; ok {
+ reqs = append(reqs, createIconImageRequest(slideID, img.PublicURL, box, line))
+ }
+ }
+ case HeadingBlock:
+ if icon, ok := leadingIcon(v.Inlines); ok {
+ if img, ok := assets.Icons[icon]; ok {
+ reqs = append(reqs, createIconImageRequest(slideID, img.PublicURL, box, line))
+ }
+ }
+ case DiagramBlock:
+ if ir, ok := assets.Diagrams[v.ID]; ok {
+ reqs = append(reqs, createImageRequest(slideID, ir.PublicURL, box.LeftPT, box.TopPT+float64(line)*slideLineHeightPT, box.WidthPT, minFloat(box.HeightPT, maxDiagramHeightPT)))
+ }
+ case BulletsBlock:
+ for j, item := range v.Items {
+ if len(item.Inlines) == 0 {
+ continue
+ }
+ icon, ok := item.Inlines[0].(IconRef)
+ if !ok {
+ continue
+ }
+ if img, ok := assets.Icons[icon]; ok {
+ reqs = append(reqs, createIconImageRequest(slideID, img.PublicURL, box, line+j))
}
}
+ case IconRowsBlock:
+ for j, row := range v.Rows {
+ if row.Icon == nil {
+ continue
+ }
+ if img, ok := assets.Icons[*row.Icon]; ok {
+ reqs = append(reqs, createIconImageRequest(slideID, img.PublicURL, box, line+j))
+ }
+ }
+ case ColumnsBlock:
+ // Columns are rendered by the column-layout branch with real box positions.
+ }
+ line += blockVisualLines(b)
+ if i < len(blocks)-1 {
+ line++
}
+ }
+ return reqs
+}
- // Add body text if there's content
- if bodyContent.Len() > 0 {
- requests = append(requests, &slides.Request{
- InsertText: &slides.InsertTextRequest{
- ObjectId: bodyID,
- Text: bodyContent.String(),
- InsertionIndex: 0,
+func blockVisualLines(block Block) int {
+ switch v := block.(type) {
+ case ParagraphBlock:
+ return textVisualLines(inlinesToText(v.Inlines))
+ case HeadingBlock:
+ return textVisualLines(inlinesToText(v.Inlines))
+ case BulletsBlock:
+ return maxInt(1, len(v.Items))
+ case CodeBlock:
+ return textVisualLines(v.Source)
+ case IconRowsBlock:
+ return maxInt(1, len(v.Rows))
+ case DiagramBlock:
+ return diagramVisualLines
+ case ColumnsBlock:
+ return textVisualLines(blocksToPlainText([]Block{v}))
+ default:
+ return 1
+ }
+}
+
+func textVisualLines(s string) int {
+ if s == "" {
+ return 1
+ }
+ return strings.Count(s, "\n") + 1
+}
+
+func maxInt(a, b int) int {
+ if a > b {
+ return a
+ }
+ return b
+}
+
+func createImageRequest(slideID, url string, left, top, width, height float64) *slides.Request {
+ return &slides.Request{
+ CreateImage: &slides.CreateImageRequest{
+ Url: url,
+ ElementProperties: &slides.PageElementProperties{
+ PageObjectId: slideID,
+ Transform: &slides.AffineTransform{
+ ScaleX: 1, ScaleY: 1,
+ TranslateX: left, TranslateY: top,
+ Unit: "PT",
+ },
+ Size: &slides.Size{
+ Width: &slides.Dimension{Magnitude: width, Unit: "PT"},
+ Height: &slides.Dimension{Magnitude: height, Unit: "PT"},
},
- })
+ },
+ },
+ }
+}
+
+func createIconImageRequest(slideID, url string, box BoxRect, line int) *slides.Request {
+ return createImageRequest(
+ slideID,
+ url,
+ box.LeftPT-iconImageGutterPT,
+ box.TopPT+float64(line)*slideLineHeightPT,
+ iconImageSizePT,
+ iconImageSizePT,
+ )
+}
+
+func minFloat(a, b float64) float64 {
+ if a < b {
+ return a
+ }
+ return b
+}
+
+func renderTitleBox(slideID string, oneBased int, title string, g LayoutGeometry) []*slides.Request {
+ titleID := fmt.Sprintf("title_%d", oneBased)
+ box := TitleBox(g)
+ return []*slides.Request{
+ createTextBox(titleID, slideID, box),
+ {InsertText: &slides.InsertTextRequest{ObjectId: titleID, Text: title}},
+ {UpdateTextStyle: &slides.UpdateTextStyleRequest{
+ ObjectId: titleID,
+ TextRange: &slides.Range{Type: "ALL"},
+ Style: &slides.TextStyle{
+ Bold: true,
+ FontSize: &slides.Dimension{Magnitude: 28, Unit: "PT"},
+ },
+ Fields: "bold,fontSize",
+ }},
+ }
+}
+
+func createTextBox(objectID, slideID string, box BoxRect) *slides.Request {
+ return &slides.Request{
+ CreateShape: &slides.CreateShapeRequest{
+ ObjectId: objectID,
+ ShapeType: "TEXT_BOX",
+ ElementProperties: &slides.PageElementProperties{
+ PageObjectId: slideID,
+ Transform: &slides.AffineTransform{
+ ScaleX: 1, ScaleY: 1,
+ TranslateX: box.LeftPT, TranslateY: box.TopPT,
+ Unit: "PT",
+ },
+ Size: &slides.Size{
+ Width: &slides.Dimension{Magnitude: box.WidthPT, Unit: "PT"},
+ Height: &slides.Dimension{Magnitude: box.HeightPT, Unit: "PT"},
+ },
+ },
+ },
+ }
+}
+
+// blocksToPlainText is the simplest body-text extraction: paragraphs
+// joined by blank lines, bullets prefixed with "• ", code blocks shown
+// verbatim. Inline icons are skipped; diagrams reserve blank text lines
+// so later text does not overlap the CreateImage request.
+func blocksToPlainText(blocks []Block) string {
+ var b strings.Builder
+ for i, blk := range blocks {
+ if i > 0 {
+ b.WriteString("\n\n")
+ }
+ switch v := blk.(type) {
+ case ParagraphBlock:
+ b.WriteString(inlinesToText(v.Inlines))
+ case HeadingBlock:
+ b.WriteString(inlinesToText(v.Inlines))
+ case BulletsBlock:
+ for j, item := range v.Items {
+ if j > 0 {
+ b.WriteString("\n")
+ }
+ b.WriteString(strings.Repeat(" ", item.Indent))
+ if v.Ordered {
+ fmt.Fprintf(&b, "%d. ", j+1)
+ } else {
+ b.WriteString("• ")
+ }
+ b.WriteString(inlinesToText(item.Inlines))
+ }
+ case CodeBlock:
+ b.WriteString(v.Source)
+ case ColumnsBlock:
+ // Tasks 16/17 render columns as separate boxes; here we
+ // flatten so the renderer still produces output.
+ for ci, col := range v.Columns {
+ if ci > 0 {
+ b.WriteString("\n\n")
+ }
+ b.WriteString(blocksToPlainText(col))
+ }
+ case IconRowsBlock:
+ for j, row := range v.Rows {
+ if j > 0 {
+ b.WriteString("\n")
+ }
+ if v.Kind == "arrows" {
+ b.WriteString("→ ")
+ } else {
+ b.WriteString("• ")
+ }
+ b.WriteString(row.Text)
+ }
+ case DiagramBlock:
+ b.WriteString(strings.Repeat("\n", diagramVisualLines-1))
}
}
+ return b.String()
+}
- return requests, slideIDs
+// CreatePresentationFromMarkdownOptions controls the slidey-aware
+// orchestrator. Wired from SlidesCreateFromMarkdownCmd in slides.go.
+type CreatePresentationFromMarkdownOptions struct {
+ Title string
+ Parent string
+ Slides []Slide
+ SlidesService *slides.Service
+ DriveService *drive.Service
+ Pipeline AssetPipelineConfig
+ NoNotes bool
}
-// CreatePresentationFromMarkdown creates a Google Slides presentation from markdown
-func CreatePresentationFromMarkdown(title string, markdown string, service *slides.Service) (*slides.Presentation, error) {
- // Parse markdown to slides
- slidesData := ParseMarkdownToSlides(markdown)
+// CreatePresentationFromMarkdownV2 is the slidey orchestrator. It:
+//
+// 1. Runs the asset pipeline (uploads icons + diagrams to Drive),
+// 2. Creates the presentation,
+// 3. Reads its page size to derive LayoutGeometry,
+// 4. Renders the first BatchUpdate (slides + content + image refs),
+// 5. Re-fetches the presentation, finds notes object IDs,
+// 6. Renders the second BatchUpdate (speaker notes),
+// 7. Cleans up the temp Drive files.
+func CreatePresentationFromMarkdownV2(ctx context.Context, opts CreatePresentationFromMarkdownOptions) (*slides.Presentation, error) {
+ pipeline := &AssetPipeline{
+ Config: opts.Pipeline,
+ Uploader: &DriveUploader{Svc: opts.DriveService},
+ }
+ defer func() {
+ if cleanupErr := pipeline.Cleanup(ctx); cleanupErr != nil {
+ fmt.Fprintf(os.Stderr, "warning: asset cleanup: %v\n", cleanupErr)
+ }
+ }()
- if len(slidesData) == 0 {
- return nil, fmt.Errorf("no slides found in markdown")
+ assets, err := pipeline.Resolve(ctx, opts.Slides)
+ if err != nil {
+ return nil, fmt.Errorf("resolve assets: %w", err)
}
- // Create presentation
- presentation, err := service.Presentations.Create(&slides.Presentation{
- Title: title,
- }).Do()
+ created, err := opts.SlidesService.Presentations.Create(&slides.Presentation{Title: opts.Title}).Context(ctx).Do()
if err != nil {
- return nil, fmt.Errorf("failed to create presentation: %w", err)
+ return nil, fmt.Errorf("create presentation: %w", err)
}
- // Convert to API requests
- requests, slideIDs := SlidesToAPIRequests(slidesData)
+ if opts.Parent != "" && opts.DriveService != nil {
+ _, moveErr := opts.DriveService.Files.Update(created.PresentationId, &drive.File{}).
+ AddParents(opts.Parent).
+ SupportsAllDrives(true).
+ Context(ctx).
+ Do()
+ if moveErr != nil {
+ return nil, fmt.Errorf("move to parent: %w", moveErr)
+ }
+ }
+
+ g := geometryFromPresentation(created)
- // Execute batch update
- if len(requests) > 0 {
- _, err = service.Presentations.BatchUpdate(presentation.PresentationId, &slides.BatchUpdatePresentationRequest{
- Requests: requests,
- }).Do()
+ mainReqs, notesPlan := buildPopulateRequests(created, opts.Slides, assets, g)
+ if len(mainReqs) > 0 {
+ if _, err := opts.SlidesService.Presentations.BatchUpdate(
+ created.PresentationId,
+ &slides.BatchUpdatePresentationRequest{Requests: mainReqs},
+ ).Context(ctx).Do(); err != nil {
+ return nil, fmt.Errorf("populate slides: %w", err)
+ }
+ }
+
+ if !opts.NoNotes && len(notesPlan) > 0 {
+ populated, err := opts.SlidesService.Presentations.Get(created.PresentationId).Context(ctx).Do()
if err != nil {
- return nil, fmt.Errorf("failed to populate slides: %w", err)
+ return nil, fmt.Errorf("re-fetch presentation: %w", err)
+ }
+ notesReqs := buildNotesRequests(populated, notesPlan)
+ if len(notesReqs) > 0 {
+ if _, err := opts.SlidesService.Presentations.BatchUpdate(
+ created.PresentationId,
+ &slides.BatchUpdatePresentationRequest{Requests: notesReqs},
+ ).Context(ctx).Do(); err != nil {
+ return nil, fmt.Errorf("apply notes: %w", err)
+ }
}
}
- // Debug output
- if debugSlides {
- fmt.Printf("[DEBUG] Created presentation with %d slides\n", len(slidesData))
- for i, slideID := range slideIDs {
- fmt.Printf(" Slide %d: %s - %s\n", i+1, slideID, slidesData[i].Title)
+ return created, nil
+}
+
+func buildPopulateRequests(created *slides.Presentation, in []Slide, assets AssetMap, g LayoutGeometry) ([]*slides.Request, []SlideNotesPlan) {
+ mainReqs, notesPlan := RenderSlides(in, assets, g)
+ mainReqs = append(mainReqs, deleteExistingSlideRequests(created)...)
+ return mainReqs, notesPlan
+}
+
+func deleteExistingSlideRequests(p *slides.Presentation) []*slides.Request {
+ if p == nil {
+ return nil
+ }
+ reqs := make([]*slides.Request, 0, len(p.Slides))
+ for _, s := range p.Slides {
+ if s == nil || s.ObjectId == "" {
+ continue
}
+ reqs = append(reqs, &slides.Request{
+ DeleteObject: &slides.DeleteObjectRequest{ObjectId: s.ObjectId},
+ })
}
+ return reqs
+}
- return presentation, nil
+func geometryFromPresentation(p *slides.Presentation) LayoutGeometry {
+	if p == nil || p.PageSize == nil || p.PageSize.Width == nil || p.PageSize.Height == nil {
+		return defaultPageGeometry()
+	}
+	// Slides PageSize is in EMU; 1pt = 12700 EMU. Magnitude is already float64.
+	w := p.PageSize.Width.Magnitude / 12700.0
+	h := p.PageSize.Height.Magnitude / 12700.0
+	if p.PageSize.Width.Unit == "PT" {
+		w = p.PageSize.Width.Magnitude
+		h = p.PageSize.Height.Magnitude
+	}
+ return LayoutGeometry{PageWidthPT: w, PageHeightPT: h, MarginPT: 36, GutterPT: 24, BodyTopPT: 108}
+}
+
+func buildNotesRequests(p *slides.Presentation, plan []SlideNotesPlan) []*slides.Request {
+ var reqs []*slides.Request
+ for _, np := range plan {
+ page, _ := findSlidesPageByID(p, np.SlideID)
+ if page == nil {
+ continue
+ }
+ notesID := findSpeakerNotesObjectID(page)
+ if notesID == "" {
+ continue
+ }
+ // Freshly-created slides have empty notes boxes; a DeleteText{ALL}
+ // against an empty box errors out with "startIndex 0 must be less
+ // than endIndex 0", so just InsertText.
+ if np.Text == "" {
+ continue
+ }
+ reqs = append(reqs, &slides.Request{
+ InsertText: &slides.InsertTextRequest{ObjectId: notesID, Text: np.Text},
+ })
+ }
+ return reqs
+}
+
+func buildSlideyDryRunBatchUpdate(slideData []Slide) *slides.BatchUpdatePresentationRequest {
+ g := defaultPageGeometry()
+ assets := NewAssetMap()
+ // Stub asset map: every IconRef gets a placeholder URL; same for diagrams.
+ for ref := range collectIconRefs(slideData) {
+ assets.Icons[ref] = ImageRef{
+ DriveFileID: "dryrun",
+ PublicURL: fmt.Sprintf("gogcli://pending/fa-%s-%s", ref.Style, ref.Name),
+ }
+ }
+ for id := range collectDiagrams(slideData) {
+ assets.Diagrams[id] = ImageRef{
+ DriveFileID: "dryrun",
+ PublicURL: fmt.Sprintf("gogcli://pending/diagram-%s", id),
+ }
+ }
+ mainReqs, _ := RenderSlides(slideData, assets, g)
+ return &slides.BatchUpdatePresentationRequest{Requests: mainReqs}
+}
+
+// SlidesToAPIRequests is retained as a thin wrapper for any legacy caller.
+func SlidesToAPIRequests(in []Slide) ([]*slides.Request, map[int]string) {
+ reqs, _ := RenderSlides(in, NewAssetMap(), defaultPageGeometry())
+ ids := map[int]string{}
+ for i := range in {
+ ids[i] = fmt.Sprintf("slide_%d", i+1)
+ }
+ return reqs, ids
+}
+
+func defaultPageGeometry() LayoutGeometry {
+ // Standard 16:9 Slides page = 10in x 5.625in = 720pt x 405pt.
+ return LayoutGeometry{
+ PageWidthPT: 720, PageHeightPT: 405,
+ MarginPT: 36, GutterPT: 24, BodyTopPT: 108,
+ }
+}
+
+func int64Ptr(v int64) *int64 { return &v }
+
+func utf16CodeUnits(s string) int64 {
+ return int64(len(utf16.Encode([]rune(s))))
+}
+
+// findColumnsBlock returns the column contents from the first ColumnsBlock,
+// padded/truncated to exactly n columns. Surrounding blocks are preserved:
+// prefix content is prepended to the first column, suffix content appended to
+// the last column.
+func findColumnsBlock(blocks []Block, n int) [][]Block {
+ out := make([][]Block, n)
+ found := false
+ for _, b := range blocks {
+ if c, ok := b.(ColumnsBlock); ok {
+ if !found {
+ for i := 0; i < n; i++ {
+ if i < len(c.Columns) {
+ out[i] = append(out[i], c.Columns[i]...)
+ }
+ }
+ found = true
+ continue
+ }
+ }
+ if found {
+ out[n-1] = append(out[n-1], b)
+ } else {
+ out[0] = append(out[0], b)
+ }
+ }
+ if found {
+ return out
+ }
+ // No explicit ColumnsBlock — split top-level body roughly evenly.
+ for i, b := range blocks {
+ out[i%n] = append(out[i%n], b)
+ }
+ return out
+}
+
+func explicitColumnsCount(blocks []Block) int {
+ for _, b := range blocks {
+ c, ok := b.(ColumnsBlock)
+ if !ok {
+ continue
+ }
+ switch len(c.Columns) {
+ case 2:
+ return 2
+ case 3:
+ return 3
+ }
+ }
+ return 0
}
diff --git a/internal/cmd/slides_formatter_test.go b/internal/cmd/slides_formatter_test.go
new file mode 100644
index 000000000..7ee8059d3
--- /dev/null
+++ b/internal/cmd/slides_formatter_test.go
@@ -0,0 +1,371 @@
+package cmd
+
+import (
+ "strings"
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+ "github.com/stretchr/testify/require"
+ "google.golang.org/api/slides/v1"
+)
+
+func defaultGeometry() LayoutGeometry {
+ return LayoutGeometry{PageWidthPT: 720, PageHeightPT: 405, MarginPT: 36, GutterPT: 24, BodyTopPT: 108}
+}
+
+func TestRenderSlide_DefaultLayout_TitlePlusBody(t *testing.T) {
+ s := Slide{
+ Title: "Hello",
+ Body: []Block{
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "World"}}},
+ },
+ }
+ reqs, _ := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+
+ // Expect: CreateSlide, CreateShape (title), InsertText (title),
+ // UpdateTextStyle (title bold), CreateShape (body), InsertText (body).
+ require.GreaterOrEqual(t, len(reqs), 6)
+ assert.NotNil(t, reqs[0].CreateSlide)
+ // Find at least one InsertText with "Hello" and one with "World".
+ var sawHello, sawWorld bool
+ for _, r := range reqs {
+ if r.InsertText != nil {
+ if r.InsertText.Text == "Hello" {
+ sawHello = true
+ }
+ if r.InsertText.Text == "World" {
+ sawWorld = true
+ }
+ }
+ }
+ assert.True(t, sawHello)
+ assert.True(t, sawWorld)
+}
+
+func TestRenderSlide_NotesRequestsReturned(t *testing.T) {
+ s := Slide{Title: "T", Notes: "speaker hint"}
+ _, notesPlan := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+
+	// notesPlan is a slice of {SlideIndex, SlideID, Text} entries we feed
+	// into the second BatchUpdate after discovering notes object IDs.
+ require.Equal(t, 1, len(notesPlan))
+ assert.Equal(t, 0, notesPlan[0].SlideIndex)
+ assert.Equal(t, "speaker hint", notesPlan[0].Text)
+}
+
+func TestRenderSlide_HeroLayoutLargeTitleNoTitleBox(t *testing.T) {
+ s := Slide{
+ Frontmatter: SlideFrontmatter{Layout: "hero"},
+ Body: []Block{
+ HeadingBlock{Level: 1, Inlines: []Inline{TextRun{Text: "Big Wordmark"}}},
+ },
+ }
+ reqs, _ := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+
+ // No separate title text box — find the body insert and the 44pt style.
+ var sawLargeStyle bool
+ for _, r := range reqs {
+ if r.UpdateTextStyle != nil && r.UpdateTextStyle.Style != nil &&
+ r.UpdateTextStyle.Style.FontSize != nil &&
+ r.UpdateTextStyle.Style.FontSize.Magnitude == 44 {
+ sawLargeStyle = true
+ }
+ }
+ assert.True(t, sawLargeStyle, "hero h1 should be styled at 44pt")
+}
+
+func TestRenderSlide_CenterLayoutWithOnlyTitleDoesNotStyleEmptyBody(t *testing.T) {
+ s := Slide{
+ Frontmatter: SlideFrontmatter{Layout: "center"},
+ Title: "Only title",
+ }
+ reqs, _ := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+
+ for _, r := range reqs {
+ if r.UpdateParagraphStyle != nil && r.UpdateParagraphStyle.ObjectId == "body_1" {
+ t.Fatal("must not style an empty body text box")
+ }
+ }
+}
+
+func TestRenderSlide_HeroStyleRangeUsesUTF16(t *testing.T) {
+ s := Slide{
+ Frontmatter: SlideFrontmatter{Layout: "hero"},
+ Body: []Block{
+ HeadingBlock{Level: 1, Inlines: []Inline{TextRun{Text: "A 🐢"}}},
+ },
+ }
+ reqs, _ := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+
+ for _, r := range reqs {
+ if r.UpdateTextStyle != nil && r.UpdateTextStyle.TextRange != nil &&
+ r.UpdateTextStyle.TextRange.Type == "FIXED_RANGE" {
+ require.NotNil(t, r.UpdateTextStyle.TextRange.EndIndex)
+ assert.Equal(t, int64(4), *r.UpdateTextStyle.TextRange.EndIndex)
+ return
+ }
+ }
+ t.Fatal("expected fixed-range hero text style")
+}
+
+func TestRenderSlide_TwoColumnsCreateTwoBodyBoxes(t *testing.T) {
+ s := Slide{
+ Frontmatter: SlideFrontmatter{Layout: "two-cols"},
+ Title: "T",
+ Body: []Block{
+ ColumnsBlock{Columns: [][]Block{
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "left"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "right"}}}},
+ }},
+ },
+ }
+ reqs, _ := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+ // Expect a CreateShape per column (in addition to title shape).
+ shapeCount := 0
+ for _, r := range reqs {
+ if r.CreateShape != nil {
+ shapeCount++
+ }
+ }
+ assert.GreaterOrEqual(t, shapeCount, 3, "title + 2 column body boxes")
+}
+
+func TestRenderSlide_ExplicitColumnsWithoutLayoutCreateColumnBoxes(t *testing.T) {
+ s := Slide{
+ Title: "T",
+ Body: []Block{
+ ColumnsBlock{Columns: [][]Block{
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "left"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "right"}}}},
+ }},
+ },
+ }
+ reqs, _ := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+
+ var columnShapes []string
+ for _, r := range reqs {
+ if r.CreateShape != nil && strings.Contains(r.CreateShape.ObjectId, "_col") {
+ columnShapes = append(columnShapes, r.CreateShape.ObjectId)
+ }
+ }
+ assert.ElementsMatch(t, []string{"body_1_col1", "body_1_col2"}, columnShapes)
+}
+
+func TestRenderSlide_ThreeColumnsCreateThreeBodyBoxes(t *testing.T) {
+ s := Slide{
+ Frontmatter: SlideFrontmatter{Layout: "three-cols"},
+ Title: "T",
+ Body: []Block{
+ ColumnsBlock{Columns: [][]Block{
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "A"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "B"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "C"}}}},
+ }},
+ },
+ }
+ reqs, _ := RenderSlides([]Slide{s}, NewAssetMap(), defaultGeometry())
+ shapeCount := 0
+ for _, r := range reqs {
+ if r.CreateShape != nil {
+ shapeCount++
+ }
+ }
+ assert.GreaterOrEqual(t, shapeCount, 4, "title + 3 column body boxes")
+}
+
+func TestFindColumnsBlock_PreservesSurroundingContent(t *testing.T) {
+ got := findColumnsBlock([]Block{
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "Intro"}}},
+ ColumnsBlock{Columns: [][]Block{
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "Left"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "Right"}}}},
+ }},
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "After"}}},
+ }, 2)
+
+ require.Equal(t, 2, len(got))
+ assert.Equal(t, "Intro\n\nLeft", blocksToPlainText(got[0]))
+ assert.Equal(t, "Right\n\nAfter", blocksToPlainText(got[1]))
+}
+
+func TestBuildPopulateRequests_DeleteDefaultSlideAfterCreatedSlides(t *testing.T) {
+ reqs, _ := buildPopulateRequests(
+ &slides.Presentation{Slides: []*slides.Page{{ObjectId: "default-slide"}}},
+ []Slide{{Title: "Imported"}},
+ NewAssetMap(),
+ defaultGeometry(),
+ )
+
+ require.NotEmpty(t, reqs)
+ assert.NotNil(t, reqs[0].CreateSlide)
+ require.NotNil(t, reqs[len(reqs)-1].DeleteObject)
+ assert.Equal(t, "default-slide", reqs[len(reqs)-1].DeleteObject.ObjectId)
+}
+
+func TestRenderSlide_DiagramEmitsCreateImage(t *testing.T) {
+ bid := "block-test-1"
+ s := Slide{
+ Title: "T",
+ Body: []Block{DiagramBlock{Kind: "mermaid", Source: "graph TD\nA-->B", ID: bid}},
+ }
+ am := NewAssetMap()
+ am.Diagrams[bid] = ImageRef{DriveFileID: "f1", PublicURL: "https://drive.example/f1"}
+
+ reqs, _ := RenderSlides([]Slide{s}, am, defaultGeometry())
+ var sawImage bool
+ for _, r := range reqs {
+ if r.CreateImage != nil && r.CreateImage.Url == "https://drive.example/f1" {
+ sawImage = true
+ }
+ }
+ assert.True(t, sawImage)
+}
+
+func TestBlocksToPlainText_ReservesDiagramSpace(t *testing.T) {
+ got := blocksToPlainText([]Block{
+ DiagramBlock{Kind: "mermaid", Source: "graph TD\nA-->B", ID: "diagram-1"},
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "After"}}},
+ })
+
+ assert.Equal(t, strings.Repeat("\n", diagramVisualLines+1)+"After", got)
+}
+
+func TestRenderSlide_BulletWithLeadingIconEmitsImage(t *testing.T) {
+ icon := IconRef{Style: "solid", Name: "truck-fast"}
+ s := Slide{
+ Title: "T",
+ Body: []Block{
+ BulletsBlock{Items: []BulletItem{
+ {Inlines: []Inline{icon, TextRun{Text: " Fulfilment"}}},
+ }},
+ },
+ }
+ am := NewAssetMap()
+ am.Icons[icon] = ImageRef{DriveFileID: "f2", PublicURL: "https://drive.example/f2"}
+
+ reqs, _ := RenderSlides([]Slide{s}, am, defaultGeometry())
+ var sawIcon bool
+ for _, r := range reqs {
+ if r.CreateImage != nil && r.CreateImage.Url == "https://drive.example/f2" {
+ sawIcon = true
+ assert.Less(t, r.CreateImage.ElementProperties.Transform.TranslateX, SingleBodyBox(defaultGeometry()).LeftPT)
+ }
+ }
+ assert.True(t, sawIcon)
+}
+
+func TestBlocksToPlainText_PreservesOrderedAndNestedLists(t *testing.T) {
+ got := blocksToPlainText([]Block{
+ BulletsBlock{Ordered: true, Items: []BulletItem{
+ {Inlines: []Inline{TextRun{Text: "first"}}},
+ {Indent: 1, Inlines: []Inline{TextRun{Text: "second"}}},
+ }},
+ BulletsBlock{Items: []BulletItem{
+ {Indent: 2, Inlines: []Inline{TextRun{Text: "nested"}}},
+ }},
+ })
+
+ assert.Equal(t, "1. first\n 2. second\n\n • nested", got)
+}
+
+func TestRenderSlide_IconRowsEmitImages(t *testing.T) {
+ icon := IconRef{Style: "solid", Name: "headset"}
+ s := Slide{
+ Title: "T",
+ Body: []Block{
+ IconRowsBlock{Kind: "boxes", Rows: []IconRow{{Icon: &icon, Text: "Support"}}},
+ },
+ }
+ am := NewAssetMap()
+ am.Icons[icon] = ImageRef{DriveFileID: "f3", PublicURL: "https://drive.example/f3"}
+
+ reqs, _ := RenderSlides([]Slide{s}, am, defaultGeometry())
+ var sawIcon bool
+ for _, r := range reqs {
+ if r.CreateImage != nil && r.CreateImage.Url == "https://drive.example/f3" {
+ sawIcon = true
+ assert.Less(t, r.CreateImage.ElementProperties.Transform.TranslateX, SingleBodyBox(defaultGeometry()).LeftPT)
+ }
+ }
+ assert.True(t, sawIcon)
+}
+
+func TestRenderSlide_HeadingLeadingIconEmitsImage(t *testing.T) {
+ icon := IconRef{Style: "solid", Name: "file"}
+ s := Slide{
+ Title: "T",
+ Body: []Block{
+ HeadingBlock{Level: 2, Inlines: []Inline{icon, TextRun{Text: " Rethink"}}},
+ },
+ }
+ am := NewAssetMap()
+ am.Icons[icon] = ImageRef{DriveFileID: "f5", PublicURL: "https://drive.example/f5"}
+
+ reqs, _ := RenderSlides([]Slide{s}, am, defaultGeometry())
+ var sawIcon bool
+ for _, r := range reqs {
+ if r.CreateImage != nil && r.CreateImage.Url == "https://drive.example/f5" {
+ sawIcon = true
+ }
+ }
+ assert.True(t, sawIcon)
+}
+
+func TestRenderSlide_IconImagePositionAccountsForBlankLinesBetweenBlocks(t *testing.T) {
+ icon := IconRef{Style: "solid", Name: "file"}
+ s := Slide{
+ Title: "T",
+ Body: []Block{
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "Intro"}}},
+ HeadingBlock{Level: 2, Inlines: []Inline{icon, TextRun{Text: " Rethink"}}},
+ },
+ }
+ am := NewAssetMap()
+ am.Icons[icon] = ImageRef{DriveFileID: "f5", PublicURL: "https://drive.example/f5"}
+
+ reqs, _ := RenderSlides([]Slide{s}, am, defaultGeometry())
+ for _, r := range reqs {
+ if r.CreateImage != nil && r.CreateImage.Url == "https://drive.example/f5" {
+ assert.Equal(t, float64(152), r.CreateImage.ElementProperties.Transform.TranslateY)
+ return
+ }
+ }
+ t.Fatal("expected icon image request")
+}
+
+func TestRenderSlide_ColumnDiagramEmitsCreateImage(t *testing.T) {
+ bid := "block-column-1"
+ s := Slide{
+ Frontmatter: SlideFrontmatter{Layout: "two-cols"},
+ Title: "T",
+ Body: []Block{
+ ColumnsBlock{Columns: [][]Block{
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "left"}}}},
+ {DiagramBlock{Kind: "mermaid", Source: "graph TD\nA-->B", ID: bid}},
+ }},
+ },
+ }
+ am := NewAssetMap()
+ am.Diagrams[bid] = ImageRef{DriveFileID: "f4", PublicURL: "https://drive.example/f4"}
+
+ reqs, _ := RenderSlides([]Slide{s}, am, defaultGeometry())
+ var sawImage bool
+ for _, r := range reqs {
+ if r.CreateImage != nil && r.CreateImage.Url == "https://drive.example/f4" {
+ sawImage = true
+ }
+ }
+ assert.True(t, sawImage)
+}
+
+func TestDeleteExistingSlideRequests(t *testing.T) {
+ reqs := deleteExistingSlideRequests(&slides.Presentation{Slides: []*slides.Page{
+ {ObjectId: "p"},
+ nil,
+ {ObjectId: "slide_existing"},
+ }})
+
+ require.Equal(t, 2, len(reqs))
+ assert.Equal(t, "p", reqs[0].DeleteObject.ObjectId)
+ assert.Equal(t, "slide_existing", reqs[1].DeleteObject.ObjectId)
+}
diff --git a/internal/cmd/slides_layout.go b/internal/cmd/slides_layout.go
new file mode 100644
index 000000000..5508a6c94
--- /dev/null
+++ b/internal/cmd/slides_layout.go
@@ -0,0 +1,88 @@
+package cmd
+
+// LayoutKind enumerates the renderer's internal layout categories.
+type LayoutKind int
+
+const (
+ LayoutKindDefault LayoutKind = iota
+ LayoutKindCenter
+ LayoutKindSectionHeader // title / hero / statement
+ LayoutKindTwoCols
+ LayoutKindThreeCols
+)
+
+const slideyLayoutTitle = "title"
+
+// MapSlideyLayout maps a slidey frontmatter layout name to a LayoutKind.
+// Unknown values fall back to LayoutKindDefault.
+func MapSlideyLayout(name string) LayoutKind {
+ switch name {
+ case "center":
+ return LayoutKindCenter
+ case slideyLayoutTitle, "hero", "statement":
+ return LayoutKindSectionHeader
+ case "two-cols":
+ return LayoutKindTwoCols
+ case "three-cols":
+ return LayoutKindThreeCols
+ default:
+ return LayoutKindDefault
+ }
+}
+
+// LayoutGeometry holds the per-presentation geometry constants used to
+// position text and image boxes. Sizes are in points (PT).
+type LayoutGeometry struct {
+ PageWidthPT float64
+ PageHeightPT float64
+ MarginPT float64
+ GutterPT float64
+ BodyTopPT float64 // top edge of the body area (below the title)
+}
+
+// BoxRect is a positioned rectangle in points.
+type BoxRect struct {
+ LeftPT, TopPT, WidthPT, HeightPT float64
+}
+
+// ColumnBoxes returns n side-by-side body box rectangles computed from
+// the page geometry. Each box fills the body area vertically
+// (pageHeight - bodyTop - margin).
+func ColumnBoxes(g LayoutGeometry, n int) []BoxRect {
+ if n < 1 {
+ return nil
+ }
+ innerWidth := g.PageWidthPT - 2*g.MarginPT - float64(n-1)*g.GutterPT
+ colWidth := innerWidth / float64(n)
+ height := g.PageHeightPT - g.BodyTopPT - g.MarginPT
+
+ out := make([]BoxRect, n)
+ for i := 0; i < n; i++ {
+ out[i] = BoxRect{
+ LeftPT: g.MarginPT + float64(i)*(colWidth+g.GutterPT),
+ TopPT: g.BodyTopPT,
+ WidthPT: colWidth,
+ HeightPT: height,
+ }
+ }
+ return out
+}
+
+// SingleBodyBox returns one full-width body box at the body-top.
+func SingleBodyBox(g LayoutGeometry) BoxRect {
+ return BoxRect{
+ LeftPT: g.MarginPT,
+ TopPT: g.BodyTopPT,
+ WidthPT: g.PageWidthPT - 2*g.MarginPT,
+ HeightPT: g.PageHeightPT - g.BodyTopPT - g.MarginPT,
+ }
+}
+
+// TitleBox returns the title-bar box at the top of the slide.
+func TitleBox(g LayoutGeometry) BoxRect {
+ return BoxRect{
+ LeftPT: g.MarginPT,
+ TopPT: g.MarginPT,
+ WidthPT: g.PageWidthPT - 2*g.MarginPT,
+ HeightPT: g.BodyTopPT - g.MarginPT,
+ }
+}
diff --git a/internal/cmd/slides_layout_test.go b/internal/cmd/slides_layout_test.go
new file mode 100644
index 000000000..414fe8b93
--- /dev/null
+++ b/internal/cmd/slides_layout_test.go
@@ -0,0 +1,45 @@
+package cmd
+
+import (
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+)
+
+func TestMapSlideyLayout(t *testing.T) {
+ cases := map[string]LayoutKind{
+ "": LayoutKindDefault,
+ "default": LayoutKindDefault,
+ "center": LayoutKindCenter,
+ "title": LayoutKindSectionHeader,
+ "hero": LayoutKindSectionHeader,
+ "statement": LayoutKindSectionHeader,
+ "two-cols": LayoutKindTwoCols,
+ "three-cols": LayoutKindThreeCols,
+ "unknown-lay": LayoutKindDefault,
+ }
+ for in, want := range cases {
+ assert.Equal(t, want, MapSlideyLayout(in), "layout=%q", in)
+ }
+}
+
+func TestColumnBoxes_TwoColumns(t *testing.T) {
+ g := LayoutGeometry{PageWidthPT: 720, PageHeightPT: 405, MarginPT: 36, GutterPT: 24, BodyTopPT: 108}
+ boxes := ColumnBoxes(g, 2)
+ assert.Equal(t, 2, len(boxes))
+ // width = (720 - 2*36 - (2-1)*24) / 2 = (720 - 72 - 24)/2 = 624/2 = 312
+ assert.InDelta(t, 36, boxes[0].LeftPT, 0.001)
+ assert.InDelta(t, 312, boxes[0].WidthPT, 0.001)
+ assert.InDelta(t, 312, boxes[1].WidthPT, 0.001)
+ assert.InDelta(t, 36+312+24, boxes[1].LeftPT, 0.001)
+}
+
+func TestColumnBoxes_ThreeColumns(t *testing.T) {
+ g := LayoutGeometry{PageWidthPT: 720, PageHeightPT: 405, MarginPT: 36, GutterPT: 24, BodyTopPT: 108}
+ boxes := ColumnBoxes(g, 3)
+ assert.Equal(t, 3, len(boxes))
+ // width = (720 - 72 - 48) / 3 = 600/3 = 200
+ assert.InDelta(t, 200, boxes[0].WidthPT, 0.001)
+ assert.InDelta(t, 200, boxes[1].WidthPT, 0.001)
+ assert.InDelta(t, 200, boxes[2].WidthPT, 0.001)
+}
diff --git a/internal/cmd/slides_markdown.go b/internal/cmd/slides_markdown.go
index d9fabeef8..0f4c765ed 100644
--- a/internal/cmd/slides_markdown.go
+++ b/internal/cmd/slides_markdown.go
@@ -4,224 +4,186 @@ import (
"strings"
)
-// SlideLayout represents the layout type for a slide
-type SlideLayout string
-
-const (
- LayoutTitleOnly SlideLayout = "TITLE"
- LayoutTitleAndBody SlideLayout = "TITLE_AND_BODY"
- LayoutTitleAndTwoColumns SlideLayout = "TITLE_AND_TWO_COLUMNS"
- LayoutSectionHeader SlideLayout = "SECTION_HEADER"
- LayoutBlank SlideLayout = "BLANK"
-)
-
-// SlideElement represents an element on a slide
-type SlideElement struct {
- Type string // "title", "body", "bullets", "code"
- Content string
- Items []string // for bullet lists
- IsBold bool
- IsItalic bool
+// ParseOptions configures the markdown parser.
+type ParseOptions struct {
+ DefaultFAStyle string // "solid"|"regular"|"brands"; empty → "solid"
}
-// Slide represents a single slide
-type Slide struct {
- Title string
- Layout SlideLayout
- Elements []SlideElement
+// ParseMarkdownToSlides parses a slidey-flavored markdown deck into a
+// slice of Slide AST nodes. Returns an error if frontmatter is malformed.
+func ParseMarkdownToSlides(markdown string, opts ParseOptions) ([]Slide, error) {
+ if opts.DefaultFAStyle == "" {
+ opts.DefaultFAStyle = "solid"
+ }
+ blocks, err := splitMarkdownIntoSlideBlocks(markdown)
+ if err != nil {
+ return nil, err
+ }
+ out := make([]Slide, 0, len(blocks))
+ ids := &blockIDGenerator{}
+ for _, b := range blocks {
+ out = append(out, parseSlideFromBlock(b, opts, ids))
+ }
+ return out, nil
}
-// ParseMarkdownToSlides parses markdown into slide structures
-func ParseMarkdownToSlides(markdown string) []Slide {
- var slides []Slide
-
- // Split by slide separators (--- on its own line)
- lines := strings.Split(markdown, "\n")
- var currentSlide strings.Builder
- inSlide := false
-
- for _, line := range lines {
- if strings.TrimSpace(line) == literalMarkdownTripleDash {
- if currentSlide.Len() > 0 {
- slide := parseSlide(currentSlide.String())
- if slide.Title != "" {
- slides = append(slides, slide)
- }
- currentSlide.Reset()
- }
- inSlide = false
- } else {
- if !inSlide {
- inSlide = true
- }
- if currentSlide.Len() > 0 {
- currentSlide.WriteString("\n")
- }
- currentSlide.WriteString(line)
- }
- }
+func parseSlideFromBlock(b slideBlock, opts ParseOptions, ids *blockIDGenerator) Slide {
+ body, notesText := splitOutNotes(b.Body)
+ body = normalizeShorthandColumns(body, b.Frontmatter.Layout)
+ parsed := parseBlocksWithIDs(body, opts.DefaultFAStyle, ids)
- // Handle the last slide
- if currentSlide.Len() > 0 {
- slide := parseSlide(currentSlide.String())
- if slide.Title != "" {
- slides = append(slides, slide)
- }
+ slide := Slide{
+ Frontmatter: b.Frontmatter,
+ Body: parsed,
+ Notes: stripFAShortcodes(notesText),
}
- return slides
+ if !layoutSkipsTitleHoist(b.Frontmatter.Layout) {
+ title, remaining := hoistTitle(parsed)
+ slide.Title = title
+ slide.Body = remaining
+ }
+ return slide
}
-// parseSlide parses a single slide's markdown
-func parseSlide(text string) Slide {
- slide := Slide{
- Layout: LayoutTitleAndBody,
- }
-
- lines := strings.Split(text, "\n")
- var currentElement *SlideElement
- var inCodeBlock bool
- var codeContent strings.Builder
-
- for _, line := range lines {
- // Handle code blocks
- if strings.HasPrefix(line, "```") {
- if inCodeBlock {
- // End code block
- if currentElement != nil {
- currentElement.Content = codeContent.String()
- slide.Elements = append(slide.Elements, *currentElement)
- }
- inCodeBlock = false
- currentElement = nil
- codeContent.Reset()
- } else {
- // Start code block
- inCodeBlock = true
- currentElement = &SlideElement{
- Type: "code",
- }
- }
+// splitOutNotes scans the body for a line whose trimmed content is
+// exactly "## Notes" or "### Notes" (case-sensitive), skipping lines
+// inside fenced code blocks. Everything from that heading to the end is
+// returned as raw notes text (without the heading itself); the returned
+// body is everything before it.
+func splitOutNotes(body string) (newBody string, notes string) {
+ lines := strings.Split(body, "\n")
+ var fenceChar byte
+ fenceLen := 0
+ for i, line := range lines {
+ if isFenceDelimiter(line) {
+ fenceChar, fenceLen = updateMarkdownFenceState(line, fenceChar, fenceLen)
continue
}
-
- if inCodeBlock {
- if codeContent.Len() > 0 {
- codeContent.WriteString("\n")
- }
- codeContent.WriteString(line)
+ if fenceLen > 0 {
continue
}
-
- // Skip empty lines
- if strings.TrimSpace(line) == "" {
- continue
+ t := strings.TrimSpace(line)
+ if t == "## Notes" || t == "### Notes" {
+ b := strings.Join(lines[:i], "\n")
+ n := strings.TrimSpace(strings.Join(lines[i+1:], "\n"))
+ return b, n
}
+ }
+ return body, ""
+}
- // Title (## heading for slides)
- if strings.HasPrefix(line, "## ") {
- title := strings.TrimPrefix(line, "## ")
- // Remove formatting markers
- title = stripInlineFormatting(title)
- slide.Title = title
- slide.Elements = append(slide.Elements, SlideElement{
- Type: "title",
- Content: title,
- })
- continue
+// hoistTitle returns the first h1 (or h2 fallback) inline text and the
+// blocks with that heading removed.
+func hoistTitle(blocks []Block) (string, []Block) {
+ // First pass: look for h1.
+ for i, b := range blocks {
+ if h, ok := b.(HeadingBlock); ok && h.Level == 1 {
+ return inlinesToText(h.Inlines), removeIndex(blocks, i)
}
-
- // Bullet points
- if strings.HasPrefix(line, "- ") || strings.HasPrefix(line, "* ") {
- item := strings.TrimPrefix(strings.TrimPrefix(line, "- "), "* ")
- item = stripInlineFormatting(item)
-
- // Find or create bullets element
- var bulletsElement *SlideElement
- for i := range slide.Elements {
- if slide.Elements[i].Type == "bullets" {
- bulletsElement = &slide.Elements[i]
- break
- }
- }
-
- if bulletsElement == nil {
- slide.Elements = append(slide.Elements, SlideElement{
- Type: "bullets",
- Items: []string{item},
- })
- } else {
- bulletsElement.Items = append(bulletsElement.Items, item)
- }
- continue
+ }
+ // Fallback: first h2.
+ for i, b := range blocks {
+ if h, ok := b.(HeadingBlock); ok && h.Level == 2 {
+ return inlinesToText(h.Inlines), removeIndex(blocks, i)
}
-
- // Regular paragraph
- content := stripInlineFormatting(line)
- slide.Elements = append(slide.Elements, SlideElement{
- Type: "body",
- Content: content,
- })
}
-
- // Determine layout based on content
- slide.Layout = determineLayout(slide)
-
- return slide
+ return "", blocks
}
-// stripInlineFormatting removes markdown formatting from text
-func stripInlineFormatting(text string) string {
- // Remove bold/italic markers
- text = strings.ReplaceAll(text, "**", "")
- text = strings.ReplaceAll(text, "__", "")
- text = strings.ReplaceAll(text, "*", "")
- text = strings.ReplaceAll(text, "_", "")
-
- // Remove code markers
- text = strings.ReplaceAll(text, "`", "")
+func removeIndex(s []Block, i int) []Block {
+ out := make([]Block, 0, len(s)-1)
+ out = append(out, s[:i]...)
+ out = append(out, s[i+1:]...)
+ return out
+}
- // Remove links but keep text [text](url) -> text
- // Simple approach: just remove brackets and parens for now
+func inlinesToText(inlines []Inline) string {
+ var b strings.Builder
+ for _, in := range inlines {
+ if tr, ok := in.(TextRun); ok {
+ b.WriteString(tr.Text)
+ }
+ }
+ return b.String()
+}
- return text
+func layoutSkipsTitleHoist(layout string) bool {
+ switch layout {
+ case slideyLayoutTitle, "hero", "statement":
+ return true
+ }
+ return false
}
-// determineLayout chooses the best layout for a slide
-func determineLayout(slide Slide) SlideLayout {
- hasTitle := false
- hasBullets := false
- hasBody := false
- hasCode := false
-
- for _, elem := range slide.Elements {
- switch elem.Type {
- case slideElementTitle:
- hasTitle = true
- case "bullets":
- hasBullets = true
- case "body":
- hasBody = true
- case "code":
- hasCode = true
+func normalizeShorthandColumns(body, layout string) string {
+ if layout != "two-cols" && layout != "three-cols" {
+ return body
+ }
+ if hasExplicitColumnsBlock(body) || !hasShorthandColumnMarker(body) {
+ return body
+ }
+
+ lines := strings.Split(body, "\n")
+ columnStart := 0
+ for columnStart < len(lines) && strings.TrimSpace(lines[columnStart]) == "" {
+ columnStart++
+ }
+ if columnStart < len(lines) && headingRE.MatchString(lines[columnStart]) {
+ columnStart++
+ for columnStart < len(lines) && strings.TrimSpace(lines[columnStart]) == "" {
+ columnStart++
}
}
- // No title = blank layout
- if !hasTitle {
- return LayoutBlank
+ var out []string
+ out = append(out, lines[:columnStart]...)
+ out = append(out, colsOpen)
+ out = append(out, lines[columnStart:]...)
+ if len(out) == 0 || strings.TrimSpace(out[len(out)-1]) != "" {
+ out = append(out, "")
}
+ out = append(out, colsClose)
+ return strings.Join(out, "\n")
+}
- // Code slides often need more space
- if hasCode {
- return LayoutTitleAndBody
+func hasExplicitColumnsBlock(body string) bool {
+ var fenceChar byte
+ fenceLen := 0
+ for _, line := range strings.Split(body, "\n") {
+ if isFenceDelimiter(line) {
+ fenceChar, fenceLen = updateMarkdownFenceState(line, fenceChar, fenceLen)
+ continue
+ }
+ if fenceLen > 0 {
+ continue
+ }
+ if strings.TrimSpace(line) == colsOpen {
+ return true
+ }
}
+ return false
+}
- // Bullets or body = title + body
- if hasBullets || hasBody {
- return LayoutTitleAndBody
+func hasShorthandColumnMarker(body string) bool {
+ var fenceChar byte
+ fenceLen := 0
+ for _, line := range strings.Split(body, "\n") {
+ if isFenceDelimiter(line) {
+ fenceChar, fenceLen = updateMarkdownFenceState(line, fenceChar, fenceLen)
+ continue
+ }
+ if fenceLen > 0 {
+ continue
+ }
+ switch strings.TrimSpace(line) {
+ case colMarker2, colMarker3, colMarkerAlt:
+ return true
+ }
}
+ return false
+}
- // Just a title = title only
- return LayoutTitleOnly
+func isFenceDelimiter(line string) bool {
+ _, _, ok := markdownFenceMarker(line)
+ return ok
}
diff --git a/internal/cmd/slides_markdown_ast.go b/internal/cmd/slides_markdown_ast.go
new file mode 100644
index 000000000..e3c327eb7
--- /dev/null
+++ b/internal/cmd/slides_markdown_ast.go
@@ -0,0 +1,100 @@
+package cmd
+
+// SlideFrontmatter holds per-slide YAML frontmatter values.
+type SlideFrontmatter struct {
+ Layout string // "title"|"hero"|"center"|"default"|"two-cols"|"three-cols"|"statement"|""
+	Content string            // "wide"|"narrow"|"" — parsed but not rendered in this PR
+ Raw map[string]string // forward-compat for unknown keys
+}
+
+// Slide is the parsed form of one markdown slide. Replaces the legacy
+// flat-Element shape used by the original parser.
+type Slide struct {
+ Frontmatter SlideFrontmatter
+ Title string // hoisted h1 (or h2 fallback); empty for title/hero/statement layouts
+ Body []Block // ordered top-level blocks
+	Notes       string           // speaker-notes text with Font Awesome shortcodes stripped
+}
+
+// Block is a top-level body block.
+type Block interface{ isBlock() }
+
+type ParagraphBlock struct {
+ Inlines []Inline
+}
+
+type BulletItem struct {
+ Inlines []Inline
+ Indent int // number of leading 2-space indents (0 = top level)
+}
+
+type BulletsBlock struct {
+ Items []BulletItem
+ Ordered bool
+}
+
+type CodeBlock struct {
+ Lang string
+ Source string
+}
+
+type HeadingBlock struct {
+ Level int
+ Inlines []Inline
+}
+
+type ColumnsBlock struct {
+	Columns [][]Block // 2- or 3-element outer slice
+}
+
+type IconRow struct {
+ Icon *IconRef // nil if line had no shortcode
+ Text string
+}
+
+type IconRowsBlock struct {
+ Kind string // "boxes" | "arrows"
+ Rows []IconRow
+}
+
+type DiagramBlock struct {
+ Kind string // "mermaid" only for now
+ Source string
+ ID string // stable ID assigned by the parser; used as AssetMap.Diagrams key
+}
+
+func (ParagraphBlock) isBlock() {}
+func (BulletsBlock) isBlock() {}
+func (CodeBlock) isBlock() {}
+func (HeadingBlock) isBlock() {}
+func (ColumnsBlock) isBlock() {}
+func (IconRowsBlock) isBlock() {}
+func (DiagramBlock) isBlock() {}
+
+// Inline is an inline run inside text.
+type Inline interface{ isInline() }
+
+type TextRun struct {
+ Text string
+ Bold bool
+ Italic bool
+ Code bool
+}
+
+// IconRef is an unresolved Font Awesome shortcode (style+name).
+// After the asset pipeline runs, an ImageRef is looked up by this value
+// from AssetMap.Icons.
+type IconRef struct {
+ Style string // "solid"|"regular"|"brands"
+ Name string
+}
+
+func (TextRun) isInline() {}
+func (IconRef) isInline() {}
+
+// ImageRef is the result of uploading an asset (icon SVG or rendered
+// diagram PNG) to Drive.
+type ImageRef struct {
+ DriveFileID string
+ PublicURL string
+}
diff --git a/internal/cmd/slides_markdown_ast_test.go b/internal/cmd/slides_markdown_ast_test.go
new file mode 100644
index 000000000..17ca1bbbe
--- /dev/null
+++ b/internal/cmd/slides_markdown_ast_test.go
@@ -0,0 +1,16 @@
+package cmd
+
+import "testing"
+
+func TestBlockMarkerMethods(t *testing.T) {
+ var _ Block = ParagraphBlock{}
+ var _ Block = BulletsBlock{}
+ var _ Block = CodeBlock{}
+ var _ Block = HeadingBlock{}
+ var _ Block = ColumnsBlock{}
+ var _ Block = IconRowsBlock{}
+ var _ Block = DiagramBlock{}
+
+ var _ Inline = TextRun{}
+ var _ Inline = IconRef{}
+}
diff --git a/internal/cmd/slides_markdown_blocks.go b/internal/cmd/slides_markdown_blocks.go
new file mode 100644
index 000000000..2122dc6bf
--- /dev/null
+++ b/internal/cmd/slides_markdown_blocks.go
@@ -0,0 +1,338 @@
+package cmd
+
+import (
+ "fmt"
+ "regexp"
+ "strings"
+
+ "github.com/yuin/goldmark"
+ gast "github.com/yuin/goldmark/ast"
+ gtext "github.com/yuin/goldmark/text"
+)
+
+var headingRE = regexp.MustCompile(`^(#{1,6})(?:\s+(.*))?$`)
+
+type blockIDGenerator struct {
+ next uint64
+}
+
+// nextBlockID returns the next stable per-deck block ID ("block-1",
+// "block-2", ...). A nil receiver is tolerated, but the counter it
+// creates on the fly is not shared with the caller, so pass one
+// generator per deck to keep IDs unique.
+func (g *blockIDGenerator) nextBlockID() string {
+ if g == nil {
+ g = &blockIDGenerator{}
+ }
+ g.next++
+ return fmt.Sprintf("block-%d", g.next)
+}
+
+const (
+ colsOpen = "::cols::"
+ colsClose = "::/cols::"
+ colMarker2 = "::col2::"
+ colMarker3 = "::col3::"
+ colMarkerAlt = "::right::" // synonym for col2
+ boxesOpen = "::boxes::"
+ boxesClose = "::/boxes::"
+ arrowsOpen = "::arrows::"
+ arrowsClose = "::/arrows::"
+)
+
+// parseBlocks turns body markdown into top-level blocks. CommonMark parsing
+// is delegated to goldmark; slidey-specific directive blocks are recognized
+// by a thin line scanner before each ordinary markdown chunk is parsed.
+func parseBlocks(body string) []Block {
+ return parseBlocksWithIDs(body, "solid", &blockIDGenerator{})
+}
+
+func parseBlocksWithIDs(body string, defaultFAStyle string, ids *blockIDGenerator) []Block {
+ lines := strings.Split(body, "\n")
+ var out []Block
+ var chunk []string
+ flushChunk := func() {
+ text := strings.Join(chunk, "\n")
+ chunk = nil
+ if strings.TrimSpace(text) == "" {
+ return
+ }
+ out = append(out, parseGoldmarkBlocks(text, defaultFAStyle, ids)...)
+ }
+
+ i := 0
+ var fenceChar byte
+ fenceLen := 0
+ for i < len(lines) {
+ line := lines[i]
+ trimmed := strings.TrimSpace(line)
+ if isFenceDelimiter(line) {
+ fenceChar, fenceLen = updateMarkdownFenceState(line, fenceChar, fenceLen)
+ chunk = append(chunk, line)
+ i++
+ continue
+ }
+
+ if fenceLen == 0 {
+ // Columns block.
+ if trimmed == colsOpen {
+ flushChunk()
+ i++
+ cols, consumed := consumeColumnsBlock(lines[i:], defaultFAStyle, ids)
+ i += consumed
+ out = append(out, cols)
+ continue
+ }
+
+ if trimmed == boxesOpen || trimmed == arrowsOpen {
+ flushChunk()
+ kind := "boxes"
+ closeMarker := boxesClose
+ if trimmed == arrowsOpen {
+ kind = "arrows"
+ closeMarker = arrowsClose
+ }
+ rows, consumed := consumeIconRowsBlock(lines[i+1:], kind, closeMarker, defaultFAStyle)
+ i += consumed + 1
+ out = append(out, rows)
+ continue
+ }
+ }
+
+ chunk = append(chunk, line)
+ i++
+ }
+ flushChunk()
+
+ return out
+}
+
+func parseGoldmarkBlocks(markdown string, defaultFAStyle string, ids *blockIDGenerator) []Block {
+ source := []byte(markdown)
+ doc := goldmark.DefaultParser().Parse(gtext.NewReader(source))
+ var out []Block
+ for n := doc.FirstChild(); n != nil; n = n.NextSibling() {
+ out = append(out, goldmarkBlockToBlocks(n, source, defaultFAStyle, ids)...)
+ }
+ return out
+}
+
+func goldmarkBlockToBlocks(n gast.Node, source []byte, defaultFAStyle string, ids *blockIDGenerator) []Block {
+ switch v := n.(type) {
+ case *gast.Heading:
+ return []Block{HeadingBlock{Level: v.Level, Inlines: goldmarkInlines(v, source, defaultFAStyle, false, false)}}
+ case *gast.Paragraph:
+ return []Block{ParagraphBlock{Inlines: goldmarkInlines(v, source, defaultFAStyle, false, false)}}
+ case *gast.TextBlock:
+ return []Block{ParagraphBlock{Inlines: goldmarkInlines(v, source, defaultFAStyle, false, false)}}
+ case *gast.FencedCodeBlock:
+ lang := string(v.Language(source))
+ code := strings.TrimSuffix(string(v.Lines().Value(source)), "\n")
+ if lang == "mermaid" {
+ return []Block{DiagramBlock{Kind: "mermaid", Source: code, ID: ids.nextBlockID()}}
+ }
+ return []Block{CodeBlock{Lang: lang, Source: code}}
+ case *gast.CodeBlock:
+ return []Block{CodeBlock{Source: strings.TrimSuffix(string(v.Lines().Value(source)), "\n")}}
+ case *gast.List:
+ return []Block{goldmarkListToBlock(v, source, defaultFAStyle)}
+ case *gast.Blockquote:
+ var out []Block
+ for c := v.FirstChild(); c != nil; c = c.NextSibling() {
+ out = append(out, goldmarkBlockToBlocks(c, source, defaultFAStyle, ids)...)
+ }
+ return out
+ case *gast.ThematicBreak:
+ return nil
+ default:
+ if n.HasChildren() {
+ var out []Block
+ for c := n.FirstChild(); c != nil; c = c.NextSibling() {
+ out = append(out, goldmarkBlockToBlocks(c, source, defaultFAStyle, ids)...)
+ }
+ return out
+ }
+ if n.Lines() != nil && n.Lines().Len() > 0 {
+ text := strings.TrimSpace(string(n.Lines().Value(source)))
+ if text != "" {
+ return []Block{ParagraphBlock{Inlines: parseInlines(text, defaultFAStyle)}}
+ }
+ }
+ return nil
+ }
+}
+
+func goldmarkListToBlock(list *gast.List, source []byte, defaultFAStyle string) BulletsBlock {
+ var items []BulletItem
+ appendGoldmarkListItems(list, source, defaultFAStyle, 0, &items)
+ return BulletsBlock{Ordered: list.IsOrdered(), Items: items}
+}
+
+func appendGoldmarkListItems(list *gast.List, source []byte, defaultFAStyle string, indent int, items *[]BulletItem) {
+ for n := list.FirstChild(); n != nil; n = n.NextSibling() {
+ item, ok := n.(*gast.ListItem)
+ if !ok {
+ continue
+ }
+ var inlines []Inline
+ var nested []*gast.List
+ for c := item.FirstChild(); c != nil; c = c.NextSibling() {
+ switch v := c.(type) {
+ case *gast.Paragraph:
+ if len(inlines) > 0 {
+ inlines = append(inlines, TextRun{Text: " "})
+ }
+ inlines = append(inlines, goldmarkInlines(v, source, defaultFAStyle, false, false)...)
+ case *gast.TextBlock:
+ if len(inlines) > 0 {
+ inlines = append(inlines, TextRun{Text: " "})
+ }
+ inlines = append(inlines, goldmarkInlines(v, source, defaultFAStyle, false, false)...)
+ case *gast.List:
+ nested = append(nested, v)
+ }
+ }
+ if len(inlines) > 0 {
+ *items = append(*items, BulletItem{Indent: indent, Inlines: inlines})
+ }
+ for _, nestedList := range nested {
+ appendGoldmarkListItems(nestedList, source, defaultFAStyle, indent+1, items)
+ }
+ }
+}
+
+func goldmarkInlines(parent gast.Node, source []byte, defaultFAStyle string, bold, italic bool) []Inline {
+ var out []Inline
+ for n := parent.FirstChild(); n != nil; n = n.NextSibling() {
+ switch v := n.(type) {
+ case *gast.Text:
+ text := string(v.Value(source))
+ if v.SoftLineBreak() {
+ text += " "
+ } else if v.HardLineBreak() {
+ text += "\n"
+ }
+ out = append(out, styleTextRuns(parseInlines(text, defaultFAStyle), bold, italic, false)...)
+ case *gast.String:
+ out = append(out, styleTextRuns(parseInlines(string(v.Value), defaultFAStyle), bold, italic, v.IsCode())...)
+ case *gast.CodeSpan:
+ out = append(out, TextRun{Text: goldmarkInlinePlainText(v, source), Code: true})
+ case *gast.Emphasis:
+ out = append(out, goldmarkInlines(v, source, defaultFAStyle, bold || v.Level >= 2, italic || v.Level == 1)...)
+ default:
+ if n.HasChildren() {
+ out = append(out, goldmarkInlines(n, source, defaultFAStyle, bold, italic)...)
+ }
+ }
+ }
+ return out
+}
+
+func goldmarkInlinePlainText(parent gast.Node, source []byte) string {
+ var b strings.Builder
+ _ = gast.Walk(parent, func(n gast.Node, entering bool) (gast.WalkStatus, error) {
+ if !entering {
+ return gast.WalkContinue, nil
+ }
+ switch v := n.(type) {
+ case *gast.Text:
+ b.Write(v.Value(source))
+ case *gast.String:
+ b.Write(v.Value)
+ }
+ return gast.WalkContinue, nil
+ })
+ return b.String()
+}
+
+func styleTextRuns(in []Inline, bold, italic, code bool) []Inline {
+ if !bold && !italic && !code {
+ return in
+ }
+ out := make([]Inline, 0, len(in))
+ for _, item := range in {
+ if tr, ok := item.(TextRun); ok {
+ tr.Bold = tr.Bold || bold
+ tr.Italic = tr.Italic || italic
+ tr.Code = tr.Code || code
+ out = append(out, tr)
+ continue
+ }
+ out = append(out, item)
+ }
+ return out
+}
+
+func consumeColumnsBlock(lines []string, defaultFAStyle string, ids *blockIDGenerator) (ColumnsBlock, int) {
+ var current []string
+ var columns [][]string
+ flush := func() {
+ columns = append(columns, append([]string(nil), current...))
+ current = nil
+ }
+
+ consumed := 0
+ var fenceChar byte
+ fenceLen := 0
+ for consumed < len(lines) {
+ line := lines[consumed]
+ trimmed := strings.TrimSpace(line)
+ if isFenceDelimiter(line) {
+ fenceChar, fenceLen = updateMarkdownFenceState(line, fenceChar, fenceLen)
+ }
+ if fenceLen == 0 {
+ switch trimmed {
+ case colsClose:
+ flush()
+ consumed++
+ return columnsBlockFromRaw(columns, defaultFAStyle, ids), consumed
+ case colMarker2, colMarker3, colMarkerAlt:
+ flush()
+ consumed++
+ continue
+ }
+ }
+ current = append(current, line)
+ consumed++
+ }
+ // EOF without close — still flush what we have.
+ flush()
+ return columnsBlockFromRaw(columns, defaultFAStyle, ids), consumed
+}
+
+func columnsBlockFromRaw(raw [][]string, defaultFAStyle string, ids *blockIDGenerator) ColumnsBlock {
+ cb := ColumnsBlock{}
+ for _, col := range raw {
+ body := strings.Join(col, "\n")
+ cb.Columns = append(cb.Columns, parseBlocksWithIDs(body, defaultFAStyle, ids))
+ }
+ return cb
+}
+
+func consumeIconRowsBlock(lines []string, kind, closeMarker, defaultFAStyle string) (IconRowsBlock, int) {
+ block := IconRowsBlock{Kind: kind}
+ consumed := 0
+ for consumed < len(lines) {
+ line := strings.TrimSpace(lines[consumed])
+ consumed++
+ if line == "" {
+ continue
+ }
+ if line == closeMarker {
+ return block, consumed
+ }
+ block.Rows = append(block.Rows, parseIconRow(line, defaultFAStyle))
+ }
+ return block, consumed
+}
+
+func parseIconRow(line, defaultFAStyle string) IconRow {
+ line = strings.TrimSpace(line)
+ if m := headingRE.FindStringSubmatch(line); m != nil {
+ line = strings.TrimSpace(m[2])
+ }
+ inlines := parseInlines(line, defaultFAStyle)
+ if len(inlines) == 0 {
+ return IconRow{Text: line}
+ }
+ if icon, ok := inlines[0].(IconRef); ok {
+ return IconRow{Icon: &icon, Text: strings.TrimSpace(inlinesToText(inlines[1:]))}
+ }
+ return IconRow{Text: strings.TrimSpace(inlinesToText(inlines))}
+}
diff --git a/internal/cmd/slides_markdown_blocks_test.go b/internal/cmd/slides_markdown_blocks_test.go
new file mode 100644
index 000000000..23a470ca0
--- /dev/null
+++ b/internal/cmd/slides_markdown_blocks_test.go
@@ -0,0 +1,179 @@
+package cmd
+
+import (
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+ "github.com/stretchr/testify/require"
+)
+
+func TestParseBlocks_Paragraph(t *testing.T) {
+ got := parseBlocks("Hello world.\n")
+ assert.Equal(t, []Block{
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "Hello world."}}},
+ }, got)
+}
+
+func TestParseBlocks_BulletList(t *testing.T) {
+ got := parseBlocks("- one\n- two **bold**\n- three\n")
+ assert.Equal(t, []Block{
+ BulletsBlock{Items: []BulletItem{
+ {Indent: 0, Inlines: []Inline{TextRun{Text: "one"}}},
+ {Indent: 0, Inlines: []Inline{TextRun{Text: "two "}, TextRun{Text: "bold", Bold: true}}},
+ {Indent: 0, Inlines: []Inline{TextRun{Text: "three"}}},
+ }},
+ }, got)
+}
+
+func TestParseBlocks_OrderedList(t *testing.T) {
+ got := parseBlocks("1. first\n2. second\n")
+ assert.Equal(t, []Block{
+ BulletsBlock{Ordered: true, Items: []BulletItem{
+ {Indent: 0, Inlines: []Inline{TextRun{Text: "first"}}},
+ {Indent: 0, Inlines: []Inline{TextRun{Text: "second"}}},
+ }},
+ }, got)
+}
+
+func TestParseBlocks_CodeBlock(t *testing.T) {
+ input := "```go\nfunc main() {}\n```\n"
+ got := parseBlocks(input)
+ assert.Equal(t, []Block{
+ CodeBlock{Lang: "go", Source: "func main() {}"},
+ }, got)
+}
+
+func TestParseBlocks_Heading(t *testing.T) {
+ got := parseBlocks("### Subsection\n")
+ assert.Equal(t, []Block{
+ HeadingBlock{Level: 3, Inlines: []Inline{TextRun{Text: "Subsection"}}},
+ }, got)
+}
+
+func TestParseBlocks_BareHeading(t *testing.T) {
+ got := parseBlocks("##\n")
+ assert.Equal(t, []Block{
+ HeadingBlock{Level: 2},
+ }, got)
+}
+
+func TestParseBlocks_Mixed(t *testing.T) {
+ input := "## Topic\n\nIntro paragraph.\n\n- bullet 1\n- bullet 2\n\nFollowup.\n"
+ got := parseBlocks(input)
+ assert.Equal(t, []Block{
+ HeadingBlock{Level: 2, Inlines: []Inline{TextRun{Text: "Topic"}}},
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "Intro paragraph."}}},
+ BulletsBlock{Items: []BulletItem{
+ {Inlines: []Inline{TextRun{Text: "bullet 1"}}},
+ {Inlines: []Inline{TextRun{Text: "bullet 2"}}},
+ }},
+ ParagraphBlock{Inlines: []Inline{TextRun{Text: "Followup."}}},
+ }, got)
+}
+
+func TestParseBlocks_TwoColumns(t *testing.T) {
+ input := "::cols::\n\nleft side text\n\n::col2::\n\nright side text\n\n::/cols::\n"
+ got := parseBlocks(input)
+ assert.Equal(t, []Block{
+ ColumnsBlock{Columns: [][]Block{
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "left side text"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "right side text"}}}},
+ }},
+ }, got)
+}
+
+func TestParseBlocks_ThreeColumns(t *testing.T) {
+ input := "::cols::\n\nA\n\n::col2::\n\nB\n\n::col3::\n\nC\n\n::/cols::\n"
+ got := parseBlocks(input)
+ assert.Equal(t, []Block{
+ ColumnsBlock{Columns: [][]Block{
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "A"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "B"}}}},
+ {ParagraphBlock{Inlines: []Inline{TextRun{Text: "C"}}}},
+ }},
+ }, got)
+}
+
+func TestParseBlocks_MermaidBlock(t *testing.T) {
+ input := "```mermaid\nflowchart LR\n A --> B\n```\n"
+ got := parseBlocks(input)
+ require.Equal(t, 1, len(got))
+ d, ok := got[0].(DiagramBlock)
+ require.True(t, ok)
+ assert.Equal(t, "mermaid", d.Kind)
+ assert.Equal(t, "flowchart LR\n A --> B", d.Source)
+ assert.NotEmpty(t, d.ID)
+}
+
+func TestParseBlocks_RightSynonymForCol2(t *testing.T) {
+ input := "::cols::\n\nA\n\n::right::\n\nB\n\n::/cols::\n"
+ got := parseBlocks(input)
+ require.Equal(t, 1, len(got))
+ col, ok := got[0].(ColumnsBlock)
+ assert.True(t, ok)
+ assert.Equal(t, 2, len(col.Columns))
+}
+
+func TestParseBlocks_ColumnMarkersInsideFenceStayCode(t *testing.T) {
+ input := "::cols::\n\n```md\n::right::\n```\n\n::right::\n\nright side\n\n::/cols::\n"
+ got := parseBlocks(input)
+ require.Equal(t, 1, len(got))
+ col, ok := got[0].(ColumnsBlock)
+ require.True(t, ok)
+ require.Equal(t, 2, len(col.Columns))
+ require.Equal(t, 1, len(col.Columns[0]))
+ code, ok := col.Columns[0][0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "::right::", code.Source)
+ assert.Equal(t, "right side", blocksToPlainText(col.Columns[1]))
+}
+
+func TestParseBlocks_ColumnMarkersInsideTildeFenceStayCode(t *testing.T) {
+ input := "::cols::\n\n~~~md\n::right::\n~~~\n\n::right::\n\nright side\n\n::/cols::\n"
+ got := parseBlocks(input)
+ require.Equal(t, 1, len(got))
+ col, ok := got[0].(ColumnsBlock)
+ require.True(t, ok)
+ require.Equal(t, 2, len(col.Columns))
+ require.Equal(t, 1, len(col.Columns[0]))
+ code, ok := col.Columns[0][0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "::right::", code.Source)
+ assert.Equal(t, "right side", blocksToPlainText(col.Columns[1]))
+}
+
+func TestParseBlocks_MismatchedFenceMarkersStayCode(t *testing.T) {
+ input := "::cols::\n\n````md\n~~~\n::right::\n```\n::boxes::\n````\n\n::right::\n\nright side\n\n::/cols::\n"
+ got := parseBlocks(input)
+ require.Equal(t, 1, len(got))
+ col, ok := got[0].(ColumnsBlock)
+ require.True(t, ok)
+ require.Equal(t, 2, len(col.Columns))
+ require.Equal(t, 1, len(col.Columns[0]))
+ code, ok := col.Columns[0][0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "~~~\n::right::\n```\n::boxes::", code.Source)
+ assert.Equal(t, "right side", blocksToPlainText(col.Columns[1]))
+}
+
+func TestParseBlocks_IconRows(t *testing.T) {
+ input := "::boxes::\n:fa-headset: Support Tickets\n:fab-github: GitHub\n::/boxes::\n"
+ got := parseBlocks(input)
+ assert.Equal(t, []Block{
+ IconRowsBlock{Kind: "boxes", Rows: []IconRow{
+ {Icon: &IconRef{Style: "solid", Name: "headset"}, Text: "Support Tickets"},
+ {Icon: &IconRef{Style: "brands", Name: "github"}, Text: "GitHub"},
+ }},
+ }, got)
+}
+
+func TestParseBlocks_ArrowRowsStripHeadingMarkers(t *testing.T) {
+ input := "::arrows::\n### Screen-scrape legacy systems.\n### Pray nothing leaks.\n::/arrows::\n"
+ got := parseBlocks(input)
+ assert.Equal(t, []Block{
+ IconRowsBlock{Kind: "arrows", Rows: []IconRow{
+ {Text: "Screen-scrape legacy systems."},
+ {Text: "Pray nothing leaks."},
+ }},
+ }, got)
+}
diff --git a/internal/cmd/slides_markdown_frontmatter.go b/internal/cmd/slides_markdown_frontmatter.go
new file mode 100644
index 000000000..64fd59f04
--- /dev/null
+++ b/internal/cmd/slides_markdown_frontmatter.go
@@ -0,0 +1,211 @@
+package cmd
+
+import (
+ "fmt"
+ "regexp"
+ "strings"
+
+ "gopkg.in/yaml.v3"
+)
+
+// slideBlock is the intermediate form between raw markdown and the parsed
+// Slide AST: per-slide frontmatter + the raw body markdown for that slide.
+type slideBlock struct {
+ Frontmatter SlideFrontmatter
+ Body string
+}
+
+var yamlKeyLineRE = regexp.MustCompile(`^[A-Za-z_][A-Za-z0-9_-]*:\s`)
+
+// splitMarkdownIntoSlideBlocks walks markdown line by line, splits on bare
+// "---" separators, and detects per-slide frontmatter using the rule from
+// the design spec (§4.1):
+//
+// 1. A "---" at file start, or immediately following another "---" separator
+// (only blank lines between), opens a frontmatter candidate.
+// 2. The next non-blank line must match a YAML key (^[A-Za-z_][\w-]*:\s).
+// If not, the original "---" is a separator and the candidate is abandoned.
+// 3. Scan the contiguous YAML header lines; a blank or non-key line before
+// the closing "---" abandons the candidate so key-value prose after a slide
+// separator stays body text.
+func splitMarkdownIntoSlideBlocks(markdown string) ([]slideBlock, error) {
+ // Normalize CRLF so downstream regex matches and body strings stay clean
+ // regardless of authoring platform.
+ markdown = strings.ReplaceAll(markdown, "\r\n", "\n")
+ lines := strings.Split(markdown, "\n")
+ var blocks []slideBlock
+
+ i := 0
+ for i < len(lines) {
+ // Try to consume a frontmatter block at the current position.
+ // tryConsumeFrontmatter will consume the opening "---" itself, so if
+ // the current position IS a "---" that turns out to be a frontmatter
+ // opener, it is removed from the body.
+ fm, after, ok, err := tryConsumeFrontmatter(lines, i)
+ if err != nil {
+ return nil, err
+ }
+ if ok {
+ i = after
+ // Skip the blank line(s) separating frontmatter from body.
+ for i < len(lines) && strings.TrimSpace(lines[i]) == "" {
+ i++
+ }
+ } else {
+ // Not frontmatter: if we're sitting on a "---" it was already
+ // determined to be a plain separator — skip it plus trailing blanks.
+ if i < len(lines) && isBareDelimiter(lines[i]) {
+ i++
+ for i < len(lines) && strings.TrimSpace(lines[i]) == "" {
+ i++
+ }
+ }
+ fm = SlideFrontmatter{Raw: map[string]string{}}
+ }
+
+ // Consume body lines until the next bare "---" separator or EOF.
+ // Markdown fences are handled here, before Goldmark sees the slide
+ // body, so delimiter examples inside code blocks stay intact.
+ bodyStart := i
+ var fenceChar byte
+ fenceLen := 0
+ for i < len(lines) {
+ inFence := fenceLen > 0
+ if !inFence && isBareDelimiter(lines[i]) {
+ break
+ }
+ fenceChar, fenceLen = updateMarkdownFenceState(lines[i], fenceChar, fenceLen)
+ i++
+ }
+ bodyLines := lines[bodyStart:i]
+ body := strings.Join(bodyLines, "\n")
+ if strings.TrimSpace(body) == "" {
+ continue
+ }
+ blocks = append(blocks, slideBlock{Frontmatter: fm, Body: body})
+
+ // Leave the "---" in place; the next iteration will call
+ // tryConsumeFrontmatter which will decide if it opens frontmatter or
+ // is a plain separator.
+ }
+
+ return blocks, nil
+}
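The opener rule from §4.1 can be exercised in isolation. The sketch below is standalone and hypothetical (the `opensFrontmatter` helper is not part of the package); it uses the same key-line regex as `yamlKeyLineRE` to show how the first non-blank line after a `---` decides frontmatter versus plain separator, including the requirement of whitespace after the colon:

```go
package main

import (
	"fmt"
	"regexp"
)

// Same pattern as yamlKeyLineRE above: a YAML-looking key, a colon,
// then at least one whitespace character before the value.
var keyLineRE = regexp.MustCompile(`^[A-Za-z_][A-Za-z0-9_-]*:\s`)

// opensFrontmatter reports whether the first non-blank line after a "---"
// promotes that "---" to a frontmatter opener (hypothetical helper name).
func opensFrontmatter(nextNonBlank string) bool {
	return keyLineRE.MatchString(nextNonBlank)
}

func main() {
	fmt.Println(opensFrontmatter("layout: hero"))    // true: frontmatter
	fmt.Println(opensFrontmatter("plain text body")) // false: plain separator
	fmt.Println(opensFrontmatter("layout:hero"))     // false: no space after colon
}
```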
+
+func tryConsumeFrontmatter(lines []string, start int) (SlideFrontmatter, int, bool, error) {
+ // Skip leading blank lines.
+ i := start
+ for i < len(lines) && strings.TrimSpace(lines[i]) == "" {
+ i++
+ }
+ if i >= len(lines) || !isBareDelimiter(lines[i]) {
+ return SlideFrontmatter{}, start, false, nil
+ }
+
+	// First non-blank line after "---" must look like a YAML key line
+	// (any key is accepted, not just the known frontmatter keys).
+ j := i + 1
+ for j < len(lines) && strings.TrimSpace(lines[j]) == "" {
+ j++
+ }
+ if j >= len(lines) {
+ return SlideFrontmatter{}, start, false, nil
+ }
+ if !isFrontmatterStartKey(lines[j]) {
+ return SlideFrontmatter{}, start, false, nil
+ }
+
+ // Find closing "---" before normal body content begins. This intentionally
+ // keeps "Problem: hard\n\nbody\n---" as slide content, not frontmatter.
+ closeIdx := -1
+ for k := j; k < len(lines); k++ {
+ if isBareDelimiter(lines[k]) {
+ closeIdx = k
+ break
+ }
+ trimmed := strings.TrimSpace(lines[k])
+ if trimmed == "" {
+ return SlideFrontmatter{}, start, false, nil
+ }
+ if !isFrontmatterStartKey(lines[k]) {
+ return SlideFrontmatter{}, start, false, nil
+ }
+ }
+ if closeIdx == -1 {
+ return SlideFrontmatter{}, start, false, nil
+ }
+
+ yamlText := strings.Join(lines[i+1:closeIdx], "\n")
+ fm, err := parseSlideFrontmatter(yamlText)
+ if err != nil {
+ return SlideFrontmatter{}, start, false, fmt.Errorf("frontmatter at line %d: %w", i+1, err)
+ }
+ return fm, closeIdx + 1, true, nil
+}
+
+func parseSlideFrontmatter(yamlText string) (SlideFrontmatter, error) {
+ raw := map[string]string{}
+ if strings.TrimSpace(yamlText) != "" {
+ var m map[string]any
+ if err := yaml.Unmarshal([]byte(yamlText), &m); err != nil {
+ return SlideFrontmatter{}, err
+ }
+ for k, v := range m {
+ raw[k] = fmt.Sprintf("%v", v)
+ }
+ }
+ return SlideFrontmatter{
+ Layout: raw["layout"],
+ Content: raw["content"],
+ Raw: raw,
+ }, nil
+}
+
+func isFrontmatterStartKey(line string) bool {
+ trimmed := strings.TrimSpace(line)
+ return yamlKeyLineRE.MatchString(trimmed)
+}
+
+func isBareDelimiter(line string) bool {
+ return strings.TrimSpace(line) == literalMarkdownTripleDash
+}
+
+func updateMarkdownFenceState(line string, fenceChar byte, fenceLen int) (byte, int) {
+ char, length, ok := markdownFenceMarker(line)
+ if !ok {
+ return fenceChar, fenceLen
+ }
+ if fenceLen == 0 {
+ return char, length
+ }
+ if char == fenceChar && length >= fenceLen && markdownFenceCloser(line, length) {
+ return 0, 0
+ }
+ return fenceChar, fenceLen
+}
+
+func markdownFenceMarker(line string) (byte, int, bool) {
+ trimmed := strings.TrimSpace(line)
+ if len(trimmed) < 3 {
+ return 0, 0, false
+ }
+ char := trimmed[0]
+ if char != '`' && char != '~' {
+ return 0, 0, false
+ }
+ length := 0
+ for length < len(trimmed) && trimmed[length] == char {
+ length++
+ }
+ if length < 3 {
+ return 0, 0, false
+ }
+ return char, length, true
+}
+
+func markdownFenceCloser(line string, markerLen int) bool {
+ trimmed := strings.TrimSpace(line)
+ if markerLen > len(trimmed) {
+ return false
+ }
+ return strings.TrimSpace(trimmed[markerLen:]) == ""
+}
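The fence-state machine above can be condensed into a standalone sketch (hypothetical `isSeparator` helper, not the package's API) that shows the behavior the tests rely on: a run of three or more backticks or tildes opens a fence, only a closer with the same character and at least the same length (and nothing after it) closes it, and while a fence is open a `---` line is body text, never a slide separator:

```go
package main

import (
	"fmt"
	"strings"
)

// isSeparator flags which lines are live "---" slide separators,
// tracking fence state the same way updateMarkdownFenceState does.
func isSeparator(lines []string) []bool {
	out := make([]bool, len(lines))
	var fenceChar byte
	fenceLen := 0
	for i, line := range lines {
		trimmed := strings.TrimSpace(line)
		inFence := fenceLen > 0
		if !inFence && trimmed == "---" {
			out[i] = true
		}
		// Fence marker: a leading run of 3+ identical ` or ~ characters.
		if len(trimmed) >= 3 && (trimmed[0] == '`' || trimmed[0] == '~') {
			char := trimmed[0]
			n := 0
			for n < len(trimmed) && trimmed[n] == char {
				n++
			}
			if n >= 3 {
				switch {
				case fenceLen == 0:
					fenceChar, fenceLen = char, n // open
				case char == fenceChar && n >= fenceLen && strings.TrimSpace(trimmed[n:]) == "":
					fenceChar, fenceLen = 0, 0 // close
				}
			}
		}
	}
	return out
}

func main() {
	lines := []string{"# A", "```yaml", "---", "```", "---", "# B"}
	fmt.Println(isSeparator(lines)) // prints [false false false false true false]
}
```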
diff --git a/internal/cmd/slides_markdown_frontmatter_test.go b/internal/cmd/slides_markdown_frontmatter_test.go
new file mode 100644
index 000000000..648ef3670
--- /dev/null
+++ b/internal/cmd/slides_markdown_frontmatter_test.go
@@ -0,0 +1,168 @@
+package cmd
+
+import (
+ "strings"
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+ "github.com/stretchr/testify/require"
+)
+
+func TestSplitMarkdownIntoSlideBlocks(t *testing.T) {
+ cases := []struct {
+ name string
+ input string
+ expected []slideBlock
+ }{
+ {
+ name: "single slide no frontmatter",
+ input: "# Hello\n\nbody\n",
+ expected: []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# Hello\n\nbody\n"},
+ },
+ },
+ {
+ name: "two slides separated by ---",
+ input: "# A\n\n---\n\n# B\n",
+ expected: []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n"},
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# B\n"},
+ },
+ },
+ {
+ name: "leading frontmatter then content",
+ input: "---\nlayout: hero\n---\n\n# Title\n",
+ expected: []slideBlock{
+ {Frontmatter: SlideFrontmatter{Layout: "hero", Raw: map[string]string{"layout": "hero"}}, Body: "# Title\n"},
+ },
+ },
+ {
+ name: "frontmatter on second slide",
+ input: "# A\n\n---\nlayout: center\n---\n\n# B\n",
+ expected: []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n"},
+ {Frontmatter: SlideFrontmatter{Layout: "center", Raw: map[string]string{"layout": "center"}}, Body: "# B\n"},
+ },
+ },
+ {
+ name: "frontmatter with content key",
+ input: "---\nlayout: center\ncontent: wide\n---\n\nbody\n",
+ expected: []slideBlock{
+ {Frontmatter: SlideFrontmatter{
+ Layout: "center",
+ Content: "wide",
+ Raw: map[string]string{"layout": "center", "content": "wide"},
+ }, Body: "body\n"},
+ },
+ },
+ {
+ name: "bare --- at slide start is separator not frontmatter",
+ input: "# A\n\n---\n\nplain text body\n",
+ expected: []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n"},
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "plain text body\n"},
+ },
+ },
+ }
+
+ for _, tc := range cases {
+ t.Run(tc.name, func(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks(tc.input)
+ require.NoError(t, err)
+ require.Equal(t, len(tc.expected), len(got))
+ for i := range tc.expected {
+ assert.Equal(t, tc.expected[i].Frontmatter, got[i].Frontmatter, "slide %d frontmatter", i)
+ assert.Equal(t, tc.expected[i].Body, got[i].Body, "slide %d body", i)
+ }
+ })
+ }
+}
+
+func TestSplitMarkdownIntoSlideBlocks_UnclosedFrontmatterStaysBody(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks("---\nlayout: hero\n\n# never closed\n")
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "layout: hero\n\n# never closed\n"},
+ }, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_MalformedClosedFrontmatter(t *testing.T) {
+ _, err := splitMarkdownIntoSlideBlocks("---\nlayout: [\n---\n\n# broken\n")
+ require.Error(t, err)
+ assert.Contains(t, strings.ToLower(err.Error()), "frontmatter")
+}
+
+func TestSplitMarkdownIntoSlideBlocks_SkipsEmptySeparatorChunks(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks("# A\n\n---\n\n")
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n"},
+ }, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_EmptyInput(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks("")
+ require.NoError(t, err)
+ assert.Empty(t, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_SkipsMetadataOnlyFrontmatter(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks("---\ntitle: Deck\n---\n\n---\n\n# First\n")
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# First\n"},
+ }, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_KeyValueBodyAfterSeparator(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks("# A\n\n---\n\nProblem: hard\n\n## B\nbody\n")
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n"},
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "Problem: hard\n\n## B\nbody\n"},
+ }, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_KeyValueBodyBeforeNextSeparator(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks("# A\n\n---\n\nlayout: responsive\n\nbody\n\n---\n\n# C\n")
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n"},
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "layout: responsive\n\nbody\n"},
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# C\n"},
+ }, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_FrontmatterMayStartWithUnknownKey(t *testing.T) {
+ got, err := splitMarkdownIntoSlideBlocks("---\nbackground: dark\nlayout: hero\n---\n\n# B\n")
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{
+ Layout: "hero",
+ Raw: map[string]string{
+ "background": "dark",
+ "layout": "hero",
+ },
+ }, Body: "# B\n"},
+ }, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_DelimiterInsideFenceStaysBody(t *testing.T) {
+ input := "# A\n\n```markdown\n---\nlayout: hero\n---\n```\n\n---\n\n# B\n"
+ got, err := splitMarkdownIntoSlideBlocks(input)
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n\n```markdown\n---\nlayout: hero\n---\n```\n"},
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# B\n"},
+ }, got)
+}
+
+func TestSplitMarkdownIntoSlideBlocks_DelimiterInsideTildeFenceStaysBody(t *testing.T) {
+ input := "# A\n\n~~~yaml\n---\nlayout: hero\n---\n~~~\n\n---\n\n# B\n"
+ got, err := splitMarkdownIntoSlideBlocks(input)
+ require.NoError(t, err)
+ require.Equal(t, []slideBlock{
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# A\n\n~~~yaml\n---\nlayout: hero\n---\n~~~\n"},
+ {Frontmatter: SlideFrontmatter{Raw: map[string]string{}}, Body: "# B\n"},
+ }, got)
+}
diff --git a/internal/cmd/slides_markdown_inlines.go b/internal/cmd/slides_markdown_inlines.go
new file mode 100644
index 000000000..b6b6cc50c
--- /dev/null
+++ b/internal/cmd/slides_markdown_inlines.go
@@ -0,0 +1,95 @@
+package cmd
+
+import (
+ "regexp"
+ "strings"
+)
+
+// faShortcodeRE matches :fa-name:, :fas-name:, :far-name:, :fab-name:,
+// :fal-name:, :fad-name:.
+var faShortcodeRE = regexp.MustCompile(`:fa([srlbd])?-([a-z0-9][a-z0-9-]*):`)
+
+// emphasisRE matches **bold**, __bold__, _italic_, *italic*, `code`.
+// Greedy, non-nested. We process emphasis on text spans between FA shortcodes.
+var emphasisRE = regexp.MustCompile(
+ "(\\*\\*[^*\\n]+\\*\\*)|(__[^_\\n]+__)|(\\*[^*\\n]+\\*)|(_[^_\\n]+_)|(`[^`\\n]+`)",
+)
+
+// parseInlines tokenizes a single line of markdown text into Inline runs.
+// FA shortcodes are extracted first (so emphasis processing doesn't see
+// the colons inside them), then emphasis is applied to the remaining text.
+func parseInlines(text string, defaultFAStyle string) []Inline {
+ var out []Inline
+
+ idxs := faShortcodeRE.FindAllStringSubmatchIndex(text, -1)
+ cursor := 0
+ for _, m := range idxs {
+ // Append text before the icon.
+ if m[0] > cursor {
+ out = append(out, parseEmphasis(text[cursor:m[0]])...)
+ }
+ stylePrefix := ""
+ if m[2] != -1 {
+ stylePrefix = text[m[2]:m[3]]
+ }
+ name := text[m[4]:m[5]]
+ out = append(out, IconRef{Style: faStyleFromPrefix(stylePrefix, defaultFAStyle), Name: name})
+ cursor = m[1]
+ }
+ if cursor < len(text) {
+ out = append(out, parseEmphasis(text[cursor:])...)
+ }
+ return out
+}
+
+func faStyleFromPrefix(prefix, defaultStyle string) string {
+ switch prefix {
+ case "":
+ return defaultStyle
+ case "s":
+ return "solid"
+ case "r":
+ return "regular"
+ case "b":
+ return "brands"
+ case "l", "d":
+ // FA Free has no light or duotone; substitute with solid.
+ return "solid"
+ default:
+ return defaultStyle
+ }
+}
+
+func parseEmphasis(s string) []Inline {
+ var out []Inline
+ cursor := 0
+ for _, m := range emphasisRE.FindAllStringIndex(s, -1) {
+ if m[0] > cursor {
+ out = append(out, TextRun{Text: s[cursor:m[0]]})
+ }
+ token := s[m[0]:m[1]]
+ switch {
+ case strings.HasPrefix(token, "**") && strings.HasSuffix(token, "**"):
+ out = append(out, TextRun{Text: token[2 : len(token)-2], Bold: true})
+ case strings.HasPrefix(token, "__") && strings.HasSuffix(token, "__"):
+ out = append(out, TextRun{Text: token[2 : len(token)-2], Bold: true})
+ case strings.HasPrefix(token, "`") && strings.HasSuffix(token, "`"):
+ out = append(out, TextRun{Text: token[1 : len(token)-1], Code: true})
+ case strings.HasPrefix(token, "*") && strings.HasSuffix(token, "*"):
+ out = append(out, TextRun{Text: token[1 : len(token)-1], Italic: true})
+ case strings.HasPrefix(token, "_") && strings.HasSuffix(token, "_"):
+ out = append(out, TextRun{Text: token[1 : len(token)-1], Italic: true})
+ }
+ cursor = m[1]
+ }
+ if cursor < len(s) {
+ out = append(out, TextRun{Text: s[cursor:]})
+ }
+ return out
+}
+
+// stripFAShortcodes removes :fa*-name: tokens from text (used for speaker
+// notes which can't render images).
+func stripFAShortcodes(text string) string {
+ return faShortcodeRE.ReplaceAllString(text, "")
+}
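The two-pass ordering matters: icons are cut out before emphasis runs so the colons inside `:fa-…:` tokens never reach the emphasis regex. A standalone sketch of that first pass (hypothetical `splitOnIcons` helper, not the package's API) with the same shortcode regex:

```go
package main

import (
	"fmt"
	"regexp"
)

// Same shortcode pattern as faShortcodeRE above.
var faRE = regexp.MustCompile(`:fa([srlbd])?-([a-z0-9][a-z0-9-]*):`)

// splitOnIcons cuts the icon tokens out of a line, leaving the plain-text
// spans that a second pass would then scan for **bold**/_italic_/`code`.
func splitOnIcons(text string) (spans, icons []string) {
	cursor := 0
	for _, m := range faRE.FindAllStringIndex(text, -1) {
		spans = append(spans, text[cursor:m[0]])
		icons = append(icons, text[m[0]:m[1]])
		cursor = m[1]
	}
	spans = append(spans, text[cursor:])
	return spans, icons
}

func main() {
	spans, icons := splitOnIcons("Ship :fa-truck-fast: **fast**")
	fmt.Println(spans) // text spans, later given emphasis parsing
	fmt.Println(icons) // icon tokens, later resolved via faStyleFromPrefix
}
```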
diff --git a/internal/cmd/slides_markdown_inlines_test.go b/internal/cmd/slides_markdown_inlines_test.go
new file mode 100644
index 000000000..dba9241f0
--- /dev/null
+++ b/internal/cmd/slides_markdown_inlines_test.go
@@ -0,0 +1,64 @@
+package cmd
+
+import (
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+)
+
+func TestParseInlines_PlainText(t *testing.T) {
+ got := parseInlines("hello world", "solid")
+ assert.Equal(t, []Inline{TextRun{Text: "hello world"}}, got)
+}
+
+func TestParseInlines_Emphasis(t *testing.T) {
+ got := parseInlines("plain **bold** _ital_ `code` end", "solid")
+ assert.Equal(t, []Inline{
+ TextRun{Text: "plain "},
+ TextRun{Text: "bold", Bold: true},
+ TextRun{Text: " "},
+ TextRun{Text: "ital", Italic: true},
+ TextRun{Text: " "},
+ TextRun{Text: "code", Code: true},
+ TextRun{Text: " end"},
+ }, got)
+}
+
+func TestParseInlines_FAShortcodes(t *testing.T) {
+ got := parseInlines("Welcome :fa-truck-fast: to :fab-github: here", "solid")
+ assert.Equal(t, []Inline{
+ TextRun{Text: "Welcome "},
+ IconRef{Style: "solid", Name: "truck-fast"},
+ TextRun{Text: " to "},
+ IconRef{Style: "brands", Name: "github"},
+ TextRun{Text: " here"},
+ }, got)
+}
+
+func TestParseInlines_FAStyleDerivation(t *testing.T) {
+ cases := []struct {
+ shortcode string
+ defaultStyle string
+ expectedStyle string
+ expectedName string
+ }{
+ {":fa-database:", "solid", "solid", "database"},
+ {":fas-headset:", "solid", "solid", "headset"},
+ {":far-clock:", "solid", "regular", "clock"},
+ {":fab-github:", "solid", "brands", "github"},
+ {":fal-flask:", "solid", "solid", "flask"}, // free-tier substitution
+ {":fad-bug:", "solid", "solid", "bug"}, // free-tier substitution
+ {":fa-database:", "regular", "regular", "database"}, // default override
+ }
+ for _, tc := range cases {
+ t.Run(tc.shortcode, func(t *testing.T) {
+ got := parseInlines(tc.shortcode, tc.defaultStyle)
+ assert.Equal(t, []Inline{IconRef{Style: tc.expectedStyle, Name: tc.expectedName}}, got)
+ })
+ }
+}
+
+func TestStripFAShortcodes(t *testing.T) {
+ got := stripFAShortcodes(":fa-truck-fast: Orders and :fab-github: GitHub")
+ assert.Equal(t, " Orders and GitHub", got)
+}
diff --git a/internal/cmd/slides_markdown_test.go b/internal/cmd/slides_markdown_test.go
new file mode 100644
index 000000000..91ecc17b9
--- /dev/null
+++ b/internal/cmd/slides_markdown_test.go
@@ -0,0 +1,164 @@
+package cmd
+
+import (
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+ "github.com/stretchr/testify/require"
+)
+
+func TestParseMarkdownToSlides_TitleHoistFromH1(t *testing.T) {
+ input := "# Hello\n\nbody text\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Equal(t, "Hello", got[0].Title)
+ require.Equal(t, 1, len(got[0].Body))
+ assert.IsType(t, ParagraphBlock{}, got[0].Body[0])
+}
+
+func TestParseMarkdownToSlides_TitleFallbackToH2(t *testing.T) {
+ input := "## Topic Heading\n\n- a\n- b\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Equal(t, "Topic Heading", got[0].Title)
+}
+
+func TestParseMarkdownToSlides_HeroLayoutKeepsH1InBody(t *testing.T) {
+ input := "---\nlayout: hero\n---\n\n# Big Wordmark\n\nsubline\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Equal(t, "", got[0].Title, "title should not be hoisted on hero")
+ require.GreaterOrEqual(t, len(got[0].Body), 1)
+ first, ok := got[0].Body[0].(HeadingBlock)
+ require.True(t, ok)
+ assert.Equal(t, 1, first.Level)
+}
+
+func TestParseMarkdownToSlides_NotesExtraction(t *testing.T) {
+ input := "## Topic\n\nbody\n\n## Notes\n\n- speaker note one\n- speaker note two\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Contains(t, got[0].Notes, "speaker note one")
+ assert.Contains(t, got[0].Notes, "speaker note two")
+ for _, b := range got[0].Body {
+ if h, ok := b.(HeadingBlock); ok && len(h.Inlines) > 0 {
+ if tr, ok := h.Inlines[0].(TextRun); ok {
+ assert.NotEqual(t, "Notes", tr.Text, "Notes heading should be removed from body")
+ }
+ }
+ }
+}
+
+func TestParseMarkdownToSlides_NotesStripsFAShortcodes(t *testing.T) {
+ input := "## Topic\n\nbody\n\n## Notes\n\n:fa-truck-fast: Orders matter\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.NotContains(t, got[0].Notes, ":fa-truck-fast:")
+ assert.Contains(t, got[0].Notes, "Orders matter")
+}
+
+func TestParseMarkdownToSlides_NotesHeadingInsideFenceStaysBody(t *testing.T) {
+ input := "## Topic\n\n```md\n## Notes\nkeep as code\n```\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Empty(t, got[0].Notes)
+ require.Equal(t, 1, len(got[0].Body))
+ code, ok := got[0].Body[0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "## Notes\nkeep as code", code.Source)
+}
+
+func TestParseMarkdownToSlides_NotesHeadingInsideTildeFenceStaysBody(t *testing.T) {
+ input := "## Topic\n\n~~~md\n## Notes\nkeep as code\n~~~\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Empty(t, got[0].Notes)
+ require.Equal(t, 1, len(got[0].Body))
+ code, ok := got[0].Body[0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "## Notes\nkeep as code", code.Source)
+}
+
+func TestParseMarkdownToSlides_NotesHeadingInsideLongFenceStaysBody(t *testing.T) {
+ input := "## Topic\n\n````md\n```\n## Notes\nkeep as code\n````\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Empty(t, got[0].Notes)
+ require.Equal(t, 1, len(got[0].Body))
+ code, ok := got[0].Body[0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "```\n## Notes\nkeep as code", code.Source)
+}
+
+func TestParseMarkdownToSlides_DiagramIDsAreUniqueAndDeterministic(t *testing.T) {
+ input := "## One\n\n```mermaid\ngraph TD\nA-->B\n```\n\n---\n\n## Two\n\n```mermaid\ngraph TD\nC-->D\n```\n"
+ first, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ second, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+
+ firstDiagrams := collectDiagrams(first)
+ secondDiagrams := collectDiagrams(second)
+ assert.Equal(t, map[string]string{
+ "block-1": "graph TD\nA-->B",
+ "block-2": "graph TD\nC-->D",
+ }, firstDiagrams)
+ assert.Equal(t, firstDiagrams, secondDiagrams)
+}
+
+func TestParseMarkdownToSlides_ShorthandColumns(t *testing.T) {
+ input := "---\nlayout: two-cols\n---\n\n## Topic\n\nleft\n\n::right::\n\nright\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ assert.Equal(t, "Topic", got[0].Title)
+ require.Equal(t, 1, len(got[0].Body))
+ cols, ok := got[0].Body[0].(ColumnsBlock)
+ require.True(t, ok)
+ require.Equal(t, 2, len(cols.Columns))
+ assert.Equal(t, "left", blocksToPlainText(cols.Columns[0]))
+ assert.Equal(t, "right", blocksToPlainText(cols.Columns[1]))
+}
+
+func TestParseMarkdownToSlides_ShorthandMarkerInsideFenceStaysCode(t *testing.T) {
+ input := "---\nlayout: two-cols\n---\n\n## Syntax\n\n```md\n::right::\n```\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ require.Equal(t, 1, len(got[0].Body))
+ code, ok := got[0].Body[0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "::right::", code.Source)
+}
+
+func TestParseMarkdownToSlides_ShorthandMarkerInsideTildeFenceStaysCode(t *testing.T) {
+ input := "---\nlayout: two-cols\n---\n\n## Syntax\n\n~~~md\n::right::\n~~~\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ require.Equal(t, 1, len(got[0].Body))
+ code, ok := got[0].Body[0].(CodeBlock)
+ require.True(t, ok)
+ assert.Equal(t, "::right::", code.Source)
+}
+
+func TestParseMarkdownToSlides_ShorthandColumnsWithColsMentionInsideFence(t *testing.T) {
+ input := "---\nlayout: two-cols\n---\n\n## Syntax\n\n```md\n::cols::\n```\n\nleft\n\n::right::\n\nright\n"
+ got, err := ParseMarkdownToSlides(input, ParseOptions{})
+ require.NoError(t, err)
+ require.Equal(t, 1, len(got))
+ require.Equal(t, 1, len(got[0].Body))
+ cols, ok := got[0].Body[0].(ColumnsBlock)
+ require.True(t, ok)
+ require.Equal(t, 2, len(cols.Columns))
+ assert.Equal(t, "::cols::\n\nleft", blocksToPlainText(cols.Columns[0]))
+ assert.Equal(t, "right", blocksToPlainText(cols.Columns[1]))
+}
diff --git a/testdata/slidey/index.md b/testdata/slidey/index.md
new file mode 100644
index 000000000..2c4f90d1b
--- /dev/null
+++ b/testdata/slidey/index.md
@@ -0,0 +1,969 @@
+---
+title: univrs — Executive Pitch
+---
+
+---
+layout: hero
+---
+
+# univrs
+
+Unfolding Nested Intent · Valid · Reliable · Safe
+
+## Run the whole company for **less** than the cost of **one** Salesforce seat.
+
+## Notes
+
+- Open with the economic and organizational claim together.
+- Frame this as company infrastructure, not an app pitch.
+- The rest of the deck explains why this becomes necessary, not merely attractive.
+- CEO: this is operating leverage.
+- CFO: this is a cost-curve change, not a tooling swap.
+- CIO and architecture: this is simplification of core primitives.
+- Say it with ambition, not just thrift: we can unify how the company operates.
+
+---
+
+## Why do teams buy & use SaaS?
+
+In short: to plan, agree, track, record & report activities from the viewpoint of their role.
+
+### Why specific products?
+
+- Familiarity, "Industry Standard"
+
+---
+
+---
+layout: statement
+---
+
+## Not just CRM.
+
+::boxes::
+:fa-rectangle-ad: Campaigns
+:fa-headset: Support Tickets
+:fa-dev: Product & Eng Tracking
+:fa-people-group: HR & Recruiting
+:fa-truck-fast: Orders & Fulfilment
+:fa-building: Facilities
+::/boxes::
+
+## Notes
+
+- The problem is fragmented operations across the whole business.
+- The ambition is one operating core for the company.
+- CPO should hear that product and internal operations can share one semantic spine.
+- CEO should hear that this is how execution scales coherently across functions.
+
+---
+layout: default
+---
+
+## An accumulating failure mode
+
+- Every team buys their own tools. We have >1000.
+- Every tool interprets identity & permissions differently.
+- Every new workflow adds another translation layer.
+
+## Notes
+
+- This is the default fate of a growing company.
+- Tool sprawl looks adaptive until it becomes structural drag.
+- Every workflow adds translation, identity, and coordination cost.
+
+---
+layout: center
+---
+
+## This is not abundance
+
+## It is **operational fragmentation**.
+
+We are spending too much energy stitching together a company that should already know how it works.
+
+## Notes
+
+- Name the core disease: fragmentation, not lack of features.
+- The company is paying to reconcile systems instead of run operations.
+- This is a structural problem, not a vendor-specific one.
+- Product framing: the company experience is fragmented because the underlying model is fragmented.
+
+---
+layout: center
+content: wide
+---
+
+## Growth should make us sharper...
+
+- Every new hire → bigger bill.
+- Every new customer → bigger bill.
+- Every new domain boundary → more license cost, more glue, more maintenance.
+
+## Instead, growth makes the stack **noisier**, **slower**, more **error-prone** _and_ more **expensive**.
+
+## Notes
+
+- Growth should improve leverage, but the SaaS stack makes growth more expensive.
+- The cost model is hostile to scale because it rises with every operational dimension.
+- This is why the problem is economic, not just technical.
+- CFO: opex rises with headcount, customers, and domain count.
+- CEO: the company becomes slower exactly when it should be compounding.
+- Product framing: every new workflow should make the system smarter, not more fragmented.
+
+---
+layout: three-cols
+---
+
+## We keep paying to connect what should already be connected
+
+::cols::
+
+## **60–70%**
+
+of IT budget goes to running and maintaining existing systems.
+
+[Gartner / Deloitte ↗](https://www.solix.com/blog/how-legacy-systems-are-draining-your-it-budget-and-what-to-do-about-it/)
+
+::col2::
+
+## **1,000+**
+
+SaaS apps in our stack today. Industry average is 80–400.
+
+[Zylo, 2026 ↗](https://zylo.com/blog/saas-statistics)
+
+::col3::
+
+## **+15–20%**
+
+integration spend growth, year over year — faster than overall IT.
+
+[Integrate.io ↗](https://www.integrate.io/blog/data-integration-adoption-rates-enterprises/)
+
+## Notes
+
+- The exact figures are less important than the shape of the spend.
+- Running old systems and integrating them is eating the budget.
+- Glue is the hidden operating tax we are trying to remove.
+- CIO: this is why the roadmap never gets cleaner on its own.
+- Head of architecture: integration complexity is becoming the architecture.
+- Product framing: too much effort goes into translation, not into better operating experiences.
+
+---
+
+## Audit today: a fire drill
+
+### Exports
+
+### Spreadsheets
+
+### Archaeology
+
+## Notes
+
+- Keep this short and memorable.
+- Audit work today is reconstructive because the systems were not built to preserve truth cleanly.
+- That same weakness appears in compliance, reporting, and operations.
+
+---
+layout: three-cols
+---
+
+## Why this moment is different
+
+##
+
+## :fas-file:
+
+### Rethink
+
+The agentic era demands a complete overhaul of how records are kept.
+
+::col2::
+
+##
+
+## :fa-database:
+
+### Reshape
+
+Enterprises need to not just **own** their data, but _control_ and _evolve_ the **shape** of it.
+
+::col3::
+
+##
+
+## :fa-house-circle-check:
+
+### Safeguard
+
+Fine-grained permissions empower **humans** and **AI agents** to act, but do so safely and in the best interests of the company.
+
+## Notes
+
+- AI changes the requirement, not just the interface.
+- Owning data is insufficient if the verbs and permissions are not owned too.
+- This is why the argument is timely rather than academic.
+- CPO: agents need product-safe verbs, not screen-level improvisation.
+- CIO: AI without policy-native execution increases risk faster than it increases output.
+- The next generation of software is not better dashboards; it is systems that can act safely.
+
+---
+
+## AI today is pretending the UI is the product
+
+::arrows::
+
+### Screen-scrape legacy systems.
+
+### Hope the query selectors still match the HTML.
+
+### Pray nothing leaks.
+
+::/arrows::
+
+## Notes
+
+- Current AI automation is brittle and unsafe because it is UI-driven.
+- Screen scraping is the wrong abstraction for enterprise action.
+- This sets up the need for a typed execution surface.
+- Product framing: the interface is not the system. The operating model is.
+
+---
+layout: default
+---
+
+## Oh, but we did API integration!
+
+- Even if systems exchange _syntax_, we get "semantic drift"
+
+- Does a Salesforce "OrderItem" == a JIRA "Component" == a BuildKite "System"?
+
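+In univrs the noun gets one definition. A sketch in this deck's own KDL component style (the `OrderItem` fields here are illustrative, not a shipped schema):
+
+```kdl
+component OrderItem {
+  field sku string required
+  field quantity int
+}
+```
+
+One definition, referenced by every domain, instead of three vendors' guesses at the same noun.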
+---
+
+## univrs defines fields and rules for agents to act within.
+
+```mermaid
+flowchart LR
+ classDef danger fill:#fff5f6,stroke:#ef223a,color:#0d1730,stroke-width:2px;
+ classDef core fill:#1866ee,stroke:#0d1730,color:#ffffff,stroke-width:2px;
+ classDef surface fill:#ffffff,stroke:#d9dce5,color:#0d1730,stroke-width:1.5px;
+ classDef accent fill:#f1f5fe,stroke:#b9cdfb,color:#1866ee,stroke-width:1.5px;
+ classDef muted fill:#f1f3f8,stroke:#d9dce5,color:#5f6880,stroke-dasharray: 4 4;
+
+ subgraph Pixels[AI today · pixels]
+ direction TB
+ URL[crm-legacy.example.com/case/482?v=2]:::surface
+ S1[div.x-1f9 → ???]:::muted
+ S2[button.act-prim moved]:::muted
+ S3["[role=textbox] stale ref"]:::muted
+ S4[modal confirm timeout]:::muted
+ S5[table cell selector drift]:::muted
+ FAIL[selectors break after release · audit trail = screen recording]:::danger
+ URL --> S1 --> S2 --> S3 --> S4 --> S5 --> FAIL
+ end
+
+ subgraph Verbs[univrs · verbs]
+ direction TB
+ V1[verb Resolve]:::core
+ V2[actor = agent:triage-bot · target = engineering.Ticket:7f3a · transition = Resolve · policy = AgentMayResolve]:::accent
+ V3[Cedar checks · event recorded · replayable outcome]:::core
+ V1 --> V2 --> V3
+ end
+```
+
+## Notes
+
+- Here is the needed interface for AI: verbs, policy, and recorded outcomes.
+- This is the bridge from the problem to the architecture.
+- Once you believe this, an owned operational core becomes necessary.
+- CPO: this is how automation becomes product behavior rather than brittle scripting.
+- Head of architecture: this is typed execution with replayable outcomes.
+- CIO and CFO: safer automation reduces both incident risk and integration spend.
+- This is the shift from automating clicks to expressing intent.
+
+---
+layout: center
+---
+
+## The credibility question:
+
+> "Isn't this just a brittle custom build?"
+
+## No
+
+It is one schema, one event log, one permission model, and one execution surface.
+
+## Notes
+
+- Answer the obvious objection directly once the need is established.
+- The point is simplification of primitives, not bespoke complexity.
+- That is what makes the architecture durable.
+- Head of architecture: the durability comes from fewer core abstractions.
+- CIO: this is less stack sprawl, not more.
+
+---
+layout: center
+---
+
+## The company should remember what it does.
+
+Everything is _first_ recorded as an event.
+
+No lock-step sync jobs. No "which copy is right?"
+
+Current state is a _projection_ in memory, with regular snapshots.
+
+Even **schema evolution** _itself_ is events.
+
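+As a sketch, a schema change lands on the same log as any other fact (the event name and `change` field here are illustrative):
+
+```kdl
+event "SchemaPromoted" {
+  at "2026-06-01T09:00:00Z"
+  actor "user:admin"
+  target "schema:logistics"
+  change "add-domain"
+}
+```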
+## Notes
+
+- Start showing why this core works.
+- Facts are recorded first; views are derived afterward.
+- That removes the synchronization problem at the root.
+- Product framing: memory is a feature. The system should learn, explain, and replay.
+
+---
+
+## Every team sees the same company through a different lens
+
+```mermaid
+flowchart TB
+ classDef core fill:#1866ee,stroke:#0d1730,color:#ffffff,stroke-width:2px;
+ classDef surface fill:#ffffff,stroke:#d9dce5,color:#0d1730,stroke-width:1.5px;
+ classDef accent fill:#f1f5fe,stroke:#b9cdfb,color:#1866ee,stroke-width:1.5px;
+
+ subgraph EventLog
+ direction TB
+ e1[Ticket.Created]
+ e2[Order.Placed]
+ e3[Ticket.Resolved]
+ e4[Invoice.Issued]
+ e5[Case.Escalated]
+ e6[Payment.Received]
+ end
+ SalesLens[Sales lens pipeline · forecast · ARR]:::accent
+ SupportLens[Support lens queues · SLA · CSAT]:::accent
+ FinanceLens[Finance lens A/R · revenue · audit]:::accent
+
+ EventLog --> SalesLens
+ EventLog --> SupportLens
+ EventLog --> FinanceLens
+```
+
+Same data, different lenses. No sync jobs. No translation layers.
+
+## Notes
+
+- Integration becomes projection over one log rather than synchronization between systems.
+- This is how the company gets many views without many copies.
+- It is a simpler and cheaper integration model.
+- Product framing: one operating core, many purpose-built experiences.
+
+---
+layout: center
+---
+
+## Trust is a built-in behavior.
+
+Every action gates through a [**Cedar policy**](https://cedar.dev) _before_ it runs.
+
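+As a sketch in the deck's KDL style, the gate is a single binding on the transition (the policy name is illustrative):
+
+```kdl
+transition Resolve
+  from=InProgress
+  to=Resolved
+  verb=Execute
+  policy=AgentMayResolve
+```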
+## Notes
+
+- Compliance should be runtime behavior, not cleanup work.
+- Policy enforcement before action is the key move.
+- This lowers executive risk while lowering operational effort.
+- CFO and CEO: lower risk is a financial outcome, not just a control outcome.
+- CIO: auditability is built in instead of bolted on.
+- Product framing: safety should be native to the product, not external to it.
+
+---
+layout: two-cols
+---
+
+## Every event answers four questions
+
+```kdl
+event "Executed" {
+ at "2026-05-05T14:23:01.847Z"
+ actor "user:njr"
+ target "engineering.Ticket:7f3a-critbug"
+ transition Transition::Resolve
+ from_state Status::Pending_Resolution
+ to_state Status::Resolved
+ policy "ManagerSignoff"
+ correlation_id "saga:9c1e-2f04"
+}
+```
+
+::right::
+
+- What changed, and when?
+- Who changed it?
+- **Which policy allowed it?**
+- What was the workflow state at that microsecond?
+
+Lower IT cost. Lower executive risk.
+
+## Notes
+
+- This is the executive-grade audit atom.
+- Every meaningful action carries identity, policy, and state context with it.
+- The payoff is both stronger accountability and lower cost.
+
+---
+
+```mermaid
+flowchart TB
+ classDef core fill:#1866ee,stroke:#0d1730,color:#ffffff,stroke-width:2px;
+ classDef domain fill:#ffffff,stroke:#0d1730,color:#0d1730,stroke-width:1.5px;
+ classDef surface fill:#ffffff,stroke:#d9dce5,color:#0d1730,stroke-width:1.5px;
+ classDef persona fill:#f1f3f8,stroke:#d9dce5,color:#0d1730,stroke-width:1.5px;
+
+ subgraph Domains[Schema engines]
+ direction LR
+ Sales[Sales]:::domain
+ Support[Support]:::domain
+ Engineering[Engineering]:::domain
+ HR[HR]:::domain
+ Ops[Ops]:::domain
+ end
+
+ Receptionist[Receptionist · routing · auth · sessions · cedar policies · event log]:::core
+
+ subgraph Access[Access surfaces]
+ direction LR
+ Desktop[Desktop browser tab]:::surface
+ Mobile[Mobile PWA]:::surface
+ API[CLI · API scripts & AI agents]:::surface
+ Notify[Email · SMS notifications]:::surface
+ end
+
+ subgraph People[People and teams]
+ direction LR
+ SalesRep[Sales rep · field]:::persona
+ SupportAgent[Support agent · desk]:::persona
+ Engineer[Engineer · on-call]:::persona
+ HRPartner[HR partner · people]:::persona
+ OpsLead[Ops lead · facilities]:::persona
+ end
+
+ Domains --> Receptionist --> Access --> People
+```
+
+## Notes
+
+- This is the owned operational core the company actually needs.
+- Domain engines sit on one receptionist, one policy surface, and one event log.
+- Once this exists, every interface becomes a projection or a channel around the same truth.
+
+---
+layout: three-cols
+---
+
+## univrs runs a schema. **KDL** is the schema
+
+::cols::
+
+### domain
+
+```kdl
+domain engineering {
+ prefix ENG
+
+ component Ticket {
+ field title string required
+ field owner @Email
+ field site @Address
+ field priority string {
+ enum "Low" "Medium" "High"
+ }
+ field estimate int
+
+ check estimate {
+ case "value > 40" warn
+ }
+ }
+}
+```
+
+::col2::
+
+### workflow
+
+```kdl
+workflow Resolve
+ component=engineering.Ticket {
+
+ state Open
+ state InProgress
+ state Resolved
+
+ transition Resolve
+ from=InProgress
+ to=Resolved
+ verb=Execute
+}
+```
+
+::col3::
+
+### screen
+
+```kdl
+screen "Ticket Detail"
+ component=engineering.Ticket
+ kind=form {
+
+ section "Overview" {
+ field title
+ field owner
+ }
+ section "Details" {
+ field priority
+ field estimate
+ field site
+ }
+}
+```
+
+## Notes
+
+- KDL is the control surface for the core.
+- Domain, workflow, and screen stay in one legible system.
+- The purpose is coherence and auditability, not novelty of syntax.
+
+---
+layout: three-cols
+---
+
+## How records compose. **ECS**, not class trees
+
+**E**ntity **C**omponent **S**ystem: an architectural pattern used in massive online games with hard real-time constraints.
+
+::cols::
+
+### Class hierarchy
+
+```ts
+class Asset {
+ id: string
+}
+class Room
+ extends Asset {
+ building: string
+ capacity: number
+}
+class MeetingRoom
+ extends Room {
+ av_kit: string[]
+}
+```
+
+Behavior is _locked into the type_. Marking a room **OutOfService** means a subclass, a flag column, or a code change everywhere `Room` is used.
+
+::col2::
+
+### Tables + joins
+
+```sql
+SELECT r.id, b.name,
+ o.headcount,
+ m.next_due
+FROM rooms r
+JOIN buildings b
+ ON b.id = r.building_id
+JOIN occupancy o
+ ON o.room = r.id
+JOIN maintenance m
+ ON m.room = r.id;
+```
+
+Every query _reassembles_ the room from foreign keys. Adding **OutOfService** means a new table _and_ a new join everywhere it appears.
+
+::col3::
+
+### ECS components
+
+```kdl
+entity Room#HQ-A14 {
+ Location { bldg HQ; floor 3 }
+ Occupancy { cap 12 }
+ OutOfService { until "2026-05-12" }
+}
+
+entity Printer#3F-04 {
+ OutOfService { until "2026-05-10" }
+}
+
+entity Factory#KCMO {
+ OutOfService { until "2026-06-01" }
+}
+```
+
+Entity is just an _id_. **OutOfService** is one row of data — and the _same_ component attaches to a Room, a Printer, or an entire Factory. No subclass, no extra table, no per-type code path.
+
+## Notes
+
+- This shows how the model stays extensible without becoming tangled.
+- Operational concepts compose across domains instead of being rewritten.
+- That matters if the system is meant to run the whole company.
+
+---
+layout: center
+---
+
+## We should own the logic
+
+## Not the box.
+
+SaaS rents us somebody else's workflow. univrs turns our operating model into a **capital asset**.
+
+## Notes
+
+- Now make the strategic jump explicit.
+- We do not just need cheaper software; we need owned operating logic.
+- That is why the answer is to build the core.
+- CEO: this is the strategic asset.
+- CPO: product and operations logic stop drifting apart.
+- CFO: money spent here compounds instead of renewing forever.
+- This is the heart of the pitch: our operating model should become ours.
+
+---
+layout: statement
+---
+
+## _Modest_ hardware.
+
+## _Extreme_ scale.
+
+- Memory-protected
+- Event-sourced
+- Shared-nothing
+- Sharded
+
+The whole company runs on infrastructure cheaper than **one SFDC seat** (~$5K/year).
+
+## Notes
+
+- Reinforce that this is not a giant infrastructure gamble.
+- The economics improve because the architecture is compact and scalable.
+- This makes building the core financially plausible.
+
+---
+
+## univrs changes what growth feels like
+
+```mermaid
+xychart-beta
+ title "Growth cost curve"
+ x-axis [first hire, team, multi-domain, enterprise]
+ y-axis "cost" 0 --> 100
+ line [10, 24, 62, 96]
+ line [10, 14, 18, 24]
+```
+
+A new domain — Logistics, Facilities, Ops — is a **schema promotion**, not a new app project.
+
+## Notes
+
+- This is the economic payoff of the architecture.
+- New domains extend the model instead of spawning new systems.
+- That is why building the core changes the cost curve.
+- CIO: the roadmap becomes additive instead of multiplicative.
+- CFO: new capability stops implying a new vendor category.
+- Head of architecture: extension happens within one model, not across more seams.
+- Product framing: each new domain should make the product stronger, not messier.
+
+---
+
+## Start with one spine. Let the company unfold from it
+
+```mermaid
+flowchart LR
+ classDef core fill:#0d1730,stroke:#0d1730,color:#ffffff,stroke-width:2px;
+ classDef accent fill:#1866ee,stroke:#0d1730,color:#ffffff,stroke-width:1.5px;
+
+ Root[Jan-Dec · Root schema · one event model · one permission model · one source of truth]:::core
+ HR[Mar-Dec HR]:::accent
+ Sales[May-Dec Sales]:::accent
+ Support[Jul-Dec Support]:::accent
+ Engineering[Sep-Dec Engineering]:::accent
+ Finance[Dec Finance stub]:::accent
+
+ Root --> HR --> Sales --> Support --> Engineering --> Finance
+```
+
+Each new domain is a **schema promotion**, not another integration program.
+
+## Notes
+
+- This makes the rollout path concrete.
+- The company does not need to replace everything at once.
+- One root spine can expand domain by domain over time.
+- CEO and CFO: this is phased execution, not a big-bang rewrite.
+- CIO: the migration story is controlled and sequential.
+- Product framing: this is a roadmap that compounds.
+
+---
+layout: hero
+---
+
+## Once the core exists, every channel can feel like the same company
+
+```mermaid
+flowchart LR
+ classDef core fill:#1866ee,stroke:#0d1730,color:#ffffff,stroke-width:2px;
+ classDef channel fill:#ffffff,stroke:#d9dce5,color:#0d1730,stroke-width:1.5px;
+ classDef edge fill:#fff5f6,stroke:#ef223a,color:#0d1730,stroke-width:2px;
+ classDef accent fill:#f1f5fe,stroke:#b9cdfb,color:#1866ee,stroke-width:1.5px;
+
+ c1[SMS · MMS notifications · replies]:::channel
+ c2[Voice IVR · call routing]:::channel
+ c3[WhatsApp customer messaging]:::channel
+
+ Core[univrs core · schema engines · receptionist · event log · cedar policies · system of record · workflow engine · permissions · replay]:::core
+
+ subgraph Reach[Outbound reach]
+ direction TB
+ s1[Email · SendGrid delivery]:::channel
+ s2[Verify · Lookup identity · number trust]:::accent
+ s3[Conversations · Flex omnichannel service edge]:::edge
+ end
+
+ c1 --> Core
+ c2 --> Core
+ c3 --> Core
+
+ Core --> s1
+ Core --> s2
+ Core --> s3
+```
+
+Twilio does not replace the core. It gives the core reach: messaging, voice, email, identity, and contact-center surfaces that all express the same operating model.
+
+## Notes
+
+- This is the culmination, not the premise.
+- First build the owned operational core; then connect it to the world through Twilio.
+- Twilio is the communications edge, while univrs remains the system of record, workflow engine, and policy surface.
+- CEO: this is how the company reaches customers without renting the operating model.
+- CPO: channels become expressions of the core, not separate products.
+- CIO and architecture: Twilio lives at the edge, not in the center of the truth model.
+- Product framing: every touchpoint should feel like one company because it is powered by one core.
+
+---
+layout: two-cols
+---
+
+## Why build it now
+
+- Lower structural spend
+- Faster process changes
+- Company-wide semantic spine
+
+::right::
+
+## Why it can scale safely
+
+- One permission model
+- Durable audit and replay
+- Far less integration surface
+
+## Notes
+
+- This slide is the decision frame.
+- Left side: why this expands what the company can become.
+- Right side: why we can trust it as it grows.
+- This is not only an efficiency move. It is a coherence move.
+- Technical teams should hear that the leverage comes from one model with fewer seams.
+
+---
+layout: center
+---
+
+## The strategic asset is not the UI.
+
+It is the operating model underneath it: one system that can finally describe, guide, and improve how the company actually works.
+
+## Notes
+
+- Slow down here.
+- This is the emotional center of the deck.
+- We are not proposing another interface layer.
+- We are proposing that the company finally own the logic of how it works.
+- Once that is true, better products, better automation, and better execution all follow.
+
+---
+layout: statement
+---
+
+## Build the core once.
+
+## Let every team, agent, and channel run from the same truth.
+
+Why are we still funding translation layers?
+
+## Notes
+
+- End with conviction, not defensiveness.
+- The deck has shown that building the core is both necessary and affordable.
+- The ask is simple: stop funding fragmentation and start funding the core.
+- For a technical company, this is the highest-leverage product we can build.
+- End on possibility: one company, one model, many experiences.
+
+---
+layout: statement
+---
+
+## Appendix:
+
+## Why this stack exists.
+
+Each choice optimizes for durability, auditability, and cost.
+
+## Notes
+
+- Signal that the appendix is about why the stack choices are intentional.
+- None of these picks are for fashion; they are in service of operating economics.
+- If time is short, this section can be skimmed or skipped.
+
+---
+layout: two-cols
+---
+
+## Runtime + UI
+
+### Our choices
+
+- **Rust** - predictable latency, compact footprint, memory safety.
+- **Axum** - small, explicit HTTP layer; easy to keep the surface narrow.
+- **Maud** - HTML stays typed, server-owned, and reviewable.
+- **Datastar** - reactive UX without a client app framework tax.
+- **Server-exclusive state** - one source of truth; no client/server divergence.
+
+::right::
+
+### What we avoid
+
+- **No React** - less duplicated state, hydration, bundling, and app-shell complexity.
+- **No Tailwind** - fewer ad hoc design decisions embedded in markup.
+- **No REST or GraphQL** - the system exposes verbs over one domain model, not data plumbing APIs.
+- **No thick SPA** - faster cold start, easier authz, simpler debugging.
+
+## Notes
+
+- The runtime and UI choices are about keeping the surface narrow and understandable.
+- Server ownership reduces duplicated state and accidental complexity.
+- We are optimizing for long-term operability, not trend alignment.
+
+---
+layout: two-cols
+---
+
+## Schema + behavior
+
+### Our choices
+
+- **KDL** - business logic stays declarative, diffable, and authorable.
+- **CEL for validation** - rich constraints close to the schema, not buried in handlers.
+- **Cedar permissions** - policy is explicit, testable, explainable, and replayable.
+- **Declarative metrics** - reporting logic is versioned with the business model.
+
+::right::
+
+### What we avoid
+
+- We do not want forms, rules, policy, and reporting scattered across five stacks.
+- We want a new domain to be a schema promotion, not a new app project.
+- We want every decision to survive replay and audit.
+
+## Notes
+
+- Schema and behavior belong close together so the system stays legible.
+- Mainstream defaults scatter core business logic across too many layers.
+- We want domain promotion to be declarative, not a bespoke engineering project.
+
+---
+layout: two-cols
+---
+
+## Persistence + execution
+
+### Our choices
+
+- **Event sourcing** - complete history, deterministic replay, cheap audit.
+- **Crypto shredding** - delete sensitive meaning without corrupting the historical ledger.
+- **Metric backfill** - new questions can be answered from old facts.
+- **rkyv** - zero-copy hot state handover and dense in-memory snapshots.
+
+::right::
+
+### What this buys us
+
+- Compliance stops being an export exercise.
+- Analytics does not depend on whether someone modeled the report up front.
+- Schema evolution can be fast without losing operational continuity.
+
+## Notes
+
+- Persistence choices are downstream of the auditability goal.
+- Event history should remain intact even as privacy and schema requirements evolve.
+- This is what lets the system answer new questions from old facts.
+
+---
+layout: two-cols
+---
+
+## Search + read model
+
+### Our choices
+
+- **Tantivy** - local, embeddable full-text search with strong performance.
+- **Shared-nothing + sharded design** - scale-out economics without SaaS-seat pricing.
+- **Single event log** - integration becomes projection, not synchronization.
+
+::right::
+
+### What we avoid
+
+- We prefer one internal truth over many service-specific copies.
+- We prefer embedded infrastructure where it keeps cost and latency down.
+- We prefer generated read models over hand-built integration glue.
+
+## Notes
+
+- Search and read models should be embedded where that improves cost and latency.
+- The shared-nothing shape supports scale without SaaS pricing dynamics.
+- Again, the theme is one truth with many projections.
+
+---
+layout: center
+---
+
+## The pattern is deliberate.
+
+Fewer layers. Fewer copies. Fewer translations.
+
+More history. More leverage. More control.
+
+## Notes
+
+- Summarize the architecture pattern in plain business terms.
+- Fewer layers and copies reduce cost; more history increases control.
+- This is a deliberate trade toward operating leverage.
+
+---
+layout: statement
+---
+
+## The architecture is opinionated
+
+## because the economics are.
+
+The mainstream stack optimizes for shipping apps. univrs optimizes for running the whole company.
+
+## Notes
+
+- Close by restating the thesis at the architecture level.
+- App stacks optimize for local product delivery; this stack optimizes for enterprise operation.
+- The economics force the opinionated design.