
Documentation Best Practices Guide

How to create world-class developer documentation that drives adoption and reduces support burden.

This guide distills best practices from top-tier documentation sites like Redis, Stripe, Auth0, and Vercel to help you create documentation that developers love to use.

Core Principles

1. Start with User Outcomes, Not Features

❌ Don't write: "This guide covers Handit's optimization features"
✅ Do write: "This quickstart shows you how to set up your autonomous engineer in 10 minutes"

Why: Users care about what they can accomplish, not what your tool does. Lead with outcomes.

2. Use Action-Oriented Language

❌ Don't write: "Setting Up Your Database"
✅ Do write: "Set up your database"

Why: Action-oriented headers feel more approachable and task-focused. Users are here to do something, not read about doing something.

3. Provide Clear Learning Outcomes

Always start with: "This guide shows you how to:"

  1. Specific outcome 1
  2. Specific outcome 2
  3. Specific outcome 3

Include: Time estimates and prerequisites upfront.

4. Show, Don't Just Tell

  • Screenshots/videos for every major step
  • Code examples that users can copy-paste
  • Expected outputs so users know they're on track
  • Visual confirmation at each step

Structure Templates

Quickstart Page Template

````markdown
# [Action-Oriented Title]

> **[Value proposition in one sentence]** [Explain what users will accomplish and why it matters]

## What you'll learn

This quickstart shows you how to:

1. **[Specific outcome 1]** - Brief description
2. **[Specific outcome 2]** - Brief description
3. **[Specific outcome 3]** - Brief description

**Time required:** X minutes
**Prerequisites:** [List requirements with links]

<Callout type="info">
  **Already have [prerequisite]?** Skip to [relevant section](#anchor) or [alternative action].
</Callout>

## Set up [main thing]

### Step 1: [Clear action]

```bash
command-here
```

[Explanation of what this does and why]

[Video/screenshot of the process]

### Step 2: [Next clear action]

[Natural explanation of the process]

**What happens:**

- Specific thing 1
- Specific thing 2
- Specific thing 3

[Video/screenshot showing results]

## Verify your setup

✅ **Check [location]:** You should see:

- Specific indicator 1
- Specific indicator 2
- Specific indicator 3

✅ **Confirm [other location]:** Look for:

- Another specific indicator
- Expected behavior

## How [thing] works

[Natural explanation without bullet lists]

[Example or analogy that makes it concrete]

## What you've accomplished

✅ **[Primary achievement]**

You should now see:

- Specific result 1
- Specific result 2
- Specific result 3

## What happens next

**Within [timeframe]:**

- Expected outcome 1
- Expected outcome 2

**Ongoing:**

- Long-term benefit 1
- Long-term benefit 2

## Next steps

**[Immediate actions]**

- [Action 1]: [Link] - [Brief description]
- [Action 2]: [Link] - [Brief description]

**[Learning resources]**

- [Resource 1]: [Link] - [What they'll learn]
- [Resource 2]: [Link] - [What they'll learn]

**[Support]**

- [Support channel 1]: [Link] - [When to use]
- [Support channel 2]: [Link] - [When to use]

## More info

- [Advanced topic 1] - [Link] - [Brief description]
- [Advanced topic 2] - [Link] - [Brief description]

## Troubleshooting

**[Common issue 1]:**

- Solution step 1
- Solution step 2

**[Common issue 2]:**

- Solution step 1
- Solution step 2
````
Overview Page Template

```markdown
# [Clear Product/Feature Name]

> **[Value proposition]** [Explain the core benefit and why it matters]

[Natural paragraph explaining the problem this solves]

## The [problem] problem

[Relatable scenario that users face]

**This is the reality for most teams.** [Expand on the pain points]

[What if scenario showing the solution]

<Callout type="info">
  [How your solution addresses this problem]
</Callout>

## How [solution] works

[Natural explanation without bullet lists, using storytelling]

[Concrete example or analogy]

## Real-world impact

[Stories or examples of how this helps teams]

## Key capabilities

### [Capability 1]
[Natural explanation of what this does and why it matters]

### [Capability 2]  
[Natural explanation focusing on user benefits]

### [Capability 3]
[Natural explanation with concrete examples]

## Getting started

[Clear call-to-action with specific next steps]

<Cards.Card title="[Action-oriented title]" href="/link" arrow />

[Additional resources if relevant]
```

Writing Best Practices

1. Use Natural, Conversational Language

❌ Avoid:

  • "Handit.ai's optimization system provides comprehensive quality assessment"
  • "The system automatically improves your AI while you focus on building your product"
  • "Leverage powerful language models to assess quality"

✅ Use:

  • "Your autonomous engineer needs to know what's broken to fix it"
  • "Stop being your AI's on-call engineer"
  • "Picture this: It's 2 AM and your phone buzzes with an AI failure alert"

2. Lead with Problems, Not Solutions

Structure: Problem → Impact → Solution → Benefit

Example:

  1. Problem: "Manual AI evaluation doesn't scale"
  2. Impact: "You can only check 50 out of 5,000 interactions"
  3. Solution: "AI-powered evaluation assesses every interaction"
  4. Benefit: "Your autonomous engineer can detect issues before users complain"

3. Replace Bullet Lists with Natural Flow

❌ Instead of:

```markdown
Benefits:
- Automated assessment
- Consistent standards
- Real-time feedback
- Focused insights
```

✅ Write: "Automated evaluation removes the inconsistency of human review while providing real-time feedback on every interaction. Instead of generic quality scores, you get focused insights about specific quality dimensions that enable targeted improvements."

4. Use Concrete Examples and Scenarios

❌ Avoid: "The system detects quality issues and generates improvements"

✅ Use: "If your customer service AI starts giving incomplete responses at 2 AM, your autonomous engineer detects this pattern, generates a better system prompt, tests it against real conversations, and creates a PR to replace the problematic prompt—all while you sleep."

5. Provide Multiple Learning Paths

Always include:

  • Fast track: "Already have X? Skip to Y"
  • Deep dive: "Want to understand how this works? Read Z"
  • Alternative approaches: "Prefer manual setup? See advanced guide"

Visual Design Principles

1. Consistent Video/Image Styling

```jsx
<video
  width="100%"
  autoPlay
  loop
  muted
  playsInline
  style={{ borderRadius: '8px' }}
>
  <source src="/assets/path/video.mp4" type="video/mp4" />
  Your browser does not support the video tag.
</video>
```

2. Effective Callout Usage

  • Info callouts: For context or helpful tips
  • Success callouts: For completion confirmations
  • Warning callouts: For critical information or common mistakes
  • Tip callouts: For pro tips and optimizations
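The callout types above can be sketched in MDX. The `type="info"` usage matches the templates earlier in this guide; the other `type` values shown here are assumptions, and the exact set depends on what your callout component actually supports:

```mdx
<Callout type="info">
  **Tip:** You can re-run the setup command safely; it's idempotent.
</Callout>

<Callout type="warning">
  **Common mistake:** Forgetting to set the API key before starting the server.
</Callout>
```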

3. Code Block Best Practices

  • Always include filename: filename="example.py"
  • Use realistic examples: Actual code users would write
  • Include comments: Explain non-obvious parts
  • Show expected output: When relevant
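Put together, a code block that follows these practices might look like this minimal sketch (the filename and function are hypothetical):

```python
# greet.py — realistic and copy-pasteable, with comments and expected output
def greet(name: str) -> str:
    """Return a friendly greeting for `name`."""
    # An f-string keeps the message template readable
    return f"Hello, {name}!"

print(greet("Ada"))  # Expected output: Hello, Ada!
```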

Content Organization

1. Information Hierarchy

  1. Value proposition - Why this matters
  2. Learning outcomes - What users will accomplish
  3. Prerequisites - What they need before starting
  4. Step-by-step setup - How to do it
  5. Verification - How to confirm success
  6. Understanding - How it works conceptually
  7. Next steps - Where to go from here
  8. Troubleshooting - Common issues and solutions

2. Progressive Disclosure

  • Start simple: Basic setup first
  • Add complexity gradually: Advanced features later
  • Provide escape hatches: "Need custom setup? See advanced guide"
  • Multiple paths: Different approaches for different needs

3. Cross-Reference Strategy

  • Hub and spoke: Main quickstart → specialized guides
  • Clear progression: Setup → Understanding → Advanced features
  • Contextual links: Link to relevant sections when mentioned
  • No dead ends: Every page should have clear next steps

Common Mistakes to Avoid

1. Feature-Focused Writing

❌ Don't: List features and capabilities
✅ Do: Explain problems solved and outcomes achieved

2. Excessive Bullet Lists

❌ Don't: Break everything into bullet points
✅ Do: Use natural paragraphs with occasional lists for clarity

3. Technical Jargon Without Context

❌ Don't: "LLM-as-Judge leverages sophisticated language models"
✅ Do: "Use AI to evaluate AI—advanced models assess your AI's quality with human-level understanding"

4. Vague Success Criteria

❌ Don't: "Setup complete! Your system is now configured"
✅ Do: "Setup complete! You should see real-time data in your dashboard at [specific URL]"

5. Missing Context for Existing Users

❌ Don't: Assume everyone starts from scratch
✅ Do: Provide shortcuts for users who already have partial setup

Quality Checklist

Before Publishing Any Page:

Content Quality

  • Clear value proposition in the first sentence
  • Specific learning outcomes listed upfront
  • Action-oriented headers throughout
  • Natural, conversational language instead of technical jargon
  • Concrete examples instead of abstract descriptions
  • Problem-solution narrative that creates emotional connection

User Experience

  • Clear prerequisites with links to dependencies
  • Step-by-step instructions that are easy to follow
  • Visual confirmation for each major step
  • Verification steps so users know they succeeded
  • Multiple learning paths for different user types
  • Clear next steps that guide users forward

Technical Accuracy

  • All commands tested and work as documented
  • Code examples are copy-pasteable and functional
  • Links work and point to current content
  • Screenshots/videos are current and accurate
  • Expected outputs match actual results

Consistency

  • Messaging alignment with overall product positioning
  • Terminology consistency across all pages
  • Visual styling matches design system
  • Cross-references are accurate and helpful
  • Tone and voice consistent throughout

Measuring Documentation Success

Key Metrics to Track

User Behavior

  • Time to first success - How quickly users complete quickstart
  • Completion rates - Percentage who finish setup successfully
  • Drop-off points - Where users abandon the process
  • Return visits - How often users come back to docs
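These behavioral metrics can be computed from step-level analytics events. A minimal sketch, where the step names and session data are hypothetical stand-ins for your own instrumentation:

```python
from collections import Counter

def funnel(sessions, steps):
    """Count how many sessions reached each ordered quickstart step.

    sessions: list of sets of completed step names (hypothetical analytics data).
    steps: the guide's steps, in order.
    """
    reached = Counter()
    for completed in sessions:
        for step in steps:
            if step not in completed:
                break  # the first missing step is where this session dropped off
            reached[step] += 1
    return [(step, reached[step]) for step in steps]

sessions = [
    {"install", "configure", "verify"},  # finished the quickstart
    {"install", "configure"},            # dropped off before verification
    {"install"},                         # dropped off after install
]
print(funnel(sessions, ["install", "configure", "verify"]))
# → [('install', 3), ('configure', 2), ('verify', 1)]
```

Here the completion rate is 1 of 3 sessions, and the counts pinpoint "verify" as the biggest drop-off point.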

Support Impact

  • Support ticket reduction - Fewer questions about covered topics
  • Common questions - What users still struggle with
  • Feature adoption - How quickly new features get adopted
  • User feedback - Direct feedback on documentation quality

Business Impact

  • Faster onboarding - Reduced time from signup to value
  • Higher activation - More users successfully implementing features
  • Better retention - Users who understand the product stick around
  • Community growth - Good docs drive word-of-mouth adoption

Continuous Improvement Process

  1. Monitor user behavior - Use analytics to identify problem areas
  2. Collect feedback - Regular surveys and user interviews
  3. Test with real users - Watch people use your docs
  4. Iterate based on data - Update content based on evidence
  5. Measure impact - Track improvements in key metrics

Implementation Plan

Phase 1: Foundation (Week 1)

  • Audit existing content using the quality checklist
  • Identify top user journeys and ensure they're well-documented
  • Standardize templates for consistency across pages
  • Update main entry points (overview, quickstart) first

Phase 2: Content Transformation (Weeks 2-3)

  • Rewrite feature pages using problem-solution narratives
  • Add verification steps to all setup guides
  • Include visual confirmation for major steps
  • Create clear cross-reference strategy

Phase 3: Enhancement (Week 4)

  • Add videos/screenshots for visual learners
  • Create alternative learning paths for different user types
  • Enhance troubleshooting with specific solutions
  • Add "more info" sections for advanced users

Phase 4: Optimization (Ongoing)

  • Monitor user behavior and identify improvement opportunities
  • Collect user feedback and iterate based on insights
  • A/B test different approaches to see what works best
  • Keep content current as features evolve

Tools and Resources

Documentation Tools

  • Nextra - React-based documentation framework
  • Figma - Design mockups and user flows
  • Loom - Quick video creation for walkthroughs
  • GitHub Issues - Track documentation improvements

Analytics and Feedback

Content Creation

Examples of Excellence

Redis Documentation

What they do well:

  • Clear learning outcomes: "You'll learn how to: 1. Create account 2. Connect to database"
  • Context-aware guidance: "If you already have an account, see..."
  • Visual confirmation: Screenshots for every step
  • Multiple connection methods: Different paths for different needs

Stripe Documentation

What they do well:

  • Immediate code examples that work
  • Progressive complexity (simple → advanced)
  • Real-world scenarios and use cases
  • Excellent error handling documentation

Auth0 Documentation

What they do well:

  • Clear user journey mapping
  • Multiple framework examples
  • Security best practices integrated throughout
  • Excellent troubleshooting sections

Vercel Documentation

What they do well:

  • Deployment-focused outcomes
  • Framework-specific guidance
  • Performance optimization tips
  • Community examples and templates

Red Flags to Avoid

Content Red Flags

  • Feature laundry lists without context or benefits
  • Technical jargon without explanation
  • Vague success criteria like "setup complete"
  • Missing verification steps
  • No clear next steps after completing a guide

Structure Red Flags

  • Too many nested sections that confuse navigation
  • Inconsistent terminology across pages
  • Broken or outdated links
  • Missing prerequisites or assumptions about user knowledge
  • No alternative paths for different user scenarios

User Experience Red Flags

  • Walls of text without visual breaks
  • Code examples that don't work
  • Missing context about when to use different approaches
  • No troubleshooting for common issues
  • Outdated screenshots or videos

Maintenance Strategy

Weekly Reviews

  • Check for broken links across all pages
  • Review user feedback and support tickets
  • Update any outdated information
  • Monitor analytics for drop-off patterns
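The broken-link check above can be partially automated. A minimal sketch, assuming your docs live in a local `docs/` folder of Markdown files; actually verifying each URL resolves would require HTTP requests on top of this:

```python
import re
from pathlib import Path

# Matches inline Markdown links: [text](target)
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)\s]+)\)")

def extract_links(markdown: str) -> list[str]:
    """Return every inline link target found in a Markdown string."""
    return LINK_RE.findall(markdown)

def collect_doc_links(docs_dir: str) -> dict[str, list[str]]:
    """Map each Markdown/MDX file under docs_dir to the links it contains."""
    return {
        str(path): extract_links(path.read_text(encoding="utf-8"))
        for path in sorted(Path(docs_dir).rglob("*.md*"))
    }

print(extract_links("See the [quickstart](/docs/quickstart) or [site](https://example.com)."))
# → ['/docs/quickstart', 'https://example.com']
```

Feeding the collected targets into a link checker each week turns "check for broken links" into a five-minute task instead of a manual crawl.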

Monthly Audits

  • User journey testing - Have someone new try the docs
  • Content freshness review - Update examples and screenshots
  • Cross-reference validation - Ensure all links are accurate
  • Competitive analysis - See what others are doing well

Quarterly Overhauls

  • Major content restructuring based on user behavior data
  • Template updates to incorporate new best practices
  • Video/screenshot refresh to keep visuals current
  • User research to understand changing needs

Success Metrics

Leading Indicators

  • Page completion rates - Users finishing guides successfully
  • Time to first success - Speed of initial value achievement
  • Return usage patterns - Users coming back to reference docs
  • Community engagement - Questions, contributions, discussions

Lagging Indicators

  • Support ticket reduction - Fewer questions about documented topics
  • Feature adoption rates - How quickly new features get used
  • User satisfaction scores - Direct feedback on documentation quality
  • Business metrics - Faster onboarding, higher activation, better retention

Remember: Great documentation is never finished—it's continuously improved based on user feedback and changing needs. Start with these principles and iterate based on what you learn from your users.