
Overview

Intercom developed the RICE prioritization framework to help product teams make better decisions about what to build. The framework scores each initiative on four factors (reach, impact, confidence, and effort) and combines them with a simple formula.

The Four Factors

Reach

How many people will this impact within a given time period?

Measurement:

  • Number of users/customers per quarter
  • Transactions per month
  • Events per time period
  • Concrete numbers, not percentages

Example: "5,000 customers per quarter" or "500 trials per month"

Impact

How much will this impact each person?

Scale:

  • 3 = Massive impact
  • 2 = High impact
  • 1 = Medium impact
  • 0.5 = Low impact
  • 0.25 = Minimal impact

Considerations:

  • How much does it move the needle?
  • Impact on conversion, engagement, or satisfaction
  • Contribution to product goals

Confidence

How confident are you in your estimates?

Scale:

  • 100% = High confidence (strong data)
  • 80% = Medium confidence (some data)
  • 50% = Low confidence (mostly assumptions)

Purpose:

  • Prevents pet projects with inflated estimates
  • Forces honesty about uncertainty
  • Highlights need for more research

Effort

How much time will this require from all team members?

Measurement:

  • Person-months (total team effort)
  • Includes all disciplines: product, design, engineering
  • Realistic estimates, not ideal scenarios

Example: "2 person-months" means two people for one month, or one person for two months

The RICE Formula

RICE Score = (Reach × Impact × Confidence) / Effort

Example Calculation

Feature A:

  • Reach: 2,000 customers per quarter
  • Impact: 2 (high)
  • Confidence: 80%
  • Effort: 4 person-months

RICE Score = (2,000 × 2 × 0.8) / 4 = 800
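The calculation above can be sketched as a small helper function; the function name and signature are illustrative, not part of the framework itself:

```python
def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """RICE score = (Reach * Impact * Confidence) / Effort.

    reach:      people affected per time period (e.g. customers per quarter)
    impact:     0.25 (minimal) to 3 (massive)
    confidence: fraction between 0 and 1 (e.g. 0.8 for 80%)
    effort:     total person-months across all disciplines
    """
    if effort <= 0:
        raise ValueError("effort must be positive")
    return (reach * impact * confidence) / effort

# Feature A from the example above:
print(rice_score(reach=2000, impact=2, confidence=0.8, effort=4))  # → 800.0
```

Note that confidence is entered as a fraction (0.8, not 80), so the resulting score stays in the same units as reach divided by effort.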

When to Use RICE

The RICE Prioritization framework is ideal for:

  • Planning a product roadmap
  • Comparing diverse initiatives
  • Making data-driven decisions
  • Aligning teams on priorities
  • Justifying resource allocation
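Comparing diverse initiatives in practice usually means scoring a whole backlog and sorting by RICE. A minimal sketch, with hypothetical feature names and numbers chosen purely for illustration:

```python
# Hypothetical backlog; all names and estimates are illustrative only.
features = [
    {"name": "Feature A", "reach": 2000, "impact": 2,   "confidence": 0.8, "effort": 4},
    {"name": "Feature B", "reach": 500,  "impact": 3,   "confidence": 1.0, "effort": 1},
    {"name": "Feature C", "reach": 8000, "impact": 0.5, "confidence": 0.5, "effort": 2},
]

def rice(f: dict) -> float:
    """Score one backlog item: (Reach * Impact * Confidence) / Effort."""
    return (f["reach"] * f["impact"] * f["confidence"]) / f["effort"]

# Highest score first: this is the candidate priority order.
for f in sorted(features, key=rice, reverse=True):
    print(f"{f['name']}: {rice(f):.0f}")
# → Feature B: 1500
# → Feature C: 1000
# → Feature A: 800
```

A low-effort, high-confidence item (Feature B) can outrank a broader-reach feature, which is exactly the trade-off the formula is designed to surface.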

Benefits

  • Reduces bias in prioritization
  • Forces quantitative thinking
  • Balances multiple factors
  • Easy to compare disparate projects
  • Encourages data-driven decisions
  • Makes assumptions explicit
  • Scalable across teams

Limitations

  • Requires effort to gather accurate data
  • Can feel overly analytical
  • Impact scoring is somewhat subjective
  • Doesn't account for strategic alignment
  • May not capture intangible benefits

Best Practices

Use Consistent Time Periods

Always measure Reach over the same time period (e.g., per quarter) for fair comparison.

Involve the Team

Get input from product, design, and engineering for more accurate estimates.

Update Regularly

Re-score items as you learn more or circumstances change.

Don't Obsess Over Precision

RICE is meant to guide decisions, not provide absolute truth. Rough estimates are often sufficient.

Consider Strategic Factors

Use RICE alongside strategic considerations, not as the sole decision-making tool.

Comparison with Other Frameworks

RICE vs. MoSCoW: RICE is better for ongoing product roadmap planning with quantitative comparison, while MoSCoW works well for fixed-deadline projects.

RICE vs. Value vs. Effort: RICE is more comprehensive, considering reach and confidence in addition to value and effort.

Best Use Cases

  • Product roadmap planning
  • Feature prioritization
  • Resource allocation decisions
  • Quarterly planning
  • Portfolio management
  • Cross-team priority alignment