🚀 ABM Parameter Calibration Issue: Grid Search and Plausible Ranges
📝 Issue Summary
In our agent-based model (ABM) for simulating behavioral adoption in energy communities, certain parameters—especially those governing commitment evolution—need careful calibration.
The main concern is how to select parameters such as k, L, and N0 so that the model produces realistic adoption dynamics while keeping computational effort manageable.
This document summarizes strategies and considerations for addressing this issue.
🔑 Key Parameters
| Parameter | Meaning | Role in Model |
|---|---|---|
| k | Growth rate of commitment | Controls how fast agents change behavior in response to adoption/incentives |
| L | Maximum commitment saturation | Sets the upper limit of commitment, even if all peers adopt |
| N0 | Inflection point | Determines how many adopters are needed to trigger rapid growth (mass effect) |
| w_C, w_I, w_S | Adoption weights | Influence the contribution of commitment, incentives, and social influence |
| σ² | Noise in adoption | Adds stochasticity to adoption decisions |
⚠️ Issue: Without explicit values or ranges, reviewers may consider the model parameters unjustified, undermining the model's credibility.
🎯 Approach for Parameter Selection
- Define plausible ranges for each parameter based on the literature and the model timescale:
  - k: [0.01, 0.2] – slow to moderate adoption over 100 ticks (≈2 years)
  - L: [0.7, 1.0] – maximum possible commitment
  - N0: [50, 400] – adoption threshold for rapid growth
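These ranges can be kept in one small config mapping so every sampling strategy below draws from a single source of truth (a sketch; the dict name and structure are my own, not taken from the model code):

```python
# Plausible parameter ranges from this issue, keyed by parameter name.
# PARAM_RANGES is a hypothetical name, not from the actual ABM code.
PARAM_RANGES = {
    "k":  (0.01, 0.2),   # commitment growth rate, over 100 ticks (≈2 years)
    "L":  (0.7, 1.0),    # maximum commitment saturation
    "N0": (50, 400),     # adoption inflection point (mass-effect threshold)
}
```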
- Grid search concept:
  - Test all combinations of parameters on a discrete grid
  - For each combination, run the model and compute a loss function comparing simulated adoption vs. empirical targets
  - Select the combination that minimizes the discrepancy
🔹 Problem: With many parameters, full grid search becomes computationally expensive (combinatorial explosion).
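A minimal sketch of the grid-search idea, using a toy logistic curve as a stand-in for the real ABM (run_model, the grid resolutions, and the synthetic target are all illustrative assumptions, not the actual model):

```python
import itertools

import numpy as np

TICKS = np.arange(100)  # ≈2 years of simulation time

def run_model(k, L, N0):
    # Toy stand-in for the ABM: a logistic adoption curve in the spirit
    # of the commitment dynamics above (NOT the real simulation).
    return L / (1 + np.exp(-k * (5 * TICKS - N0)))

def loss(theta, target):
    # RMSE between simulated and empirical adoption curves.
    k, L, N0 = theta
    return np.sqrt(np.mean((run_model(k, L, N0) - target) ** 2))

# Synthetic "empirical" target, generated from known parameters
# purely for illustration.
target = run_model(0.1, 0.9, 200)

# Even a coarse 5 x 4 x 6 grid already means 120 model runs:
# the combinatorial explosion noted above.
grid = itertools.product(
    np.linspace(0.01, 0.2, 5),   # k
    np.linspace(0.7, 1.0, 4),    # L
    np.linspace(50, 400, 6),     # N0
)
best = min(grid, key=lambda th: loss(th, target))
```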
- Practical alternatives:
  - One-at-a-time (OAT): vary one parameter, fix the others. ✅ Simple, ❌ ignores interactions
  - Random search: sample random combinations in the plausible ranges. ✅ Efficient, captures interactions statistically
  - Latin Hypercube Sampling (LHS): ensures uniform coverage of the multi-dimensional space without testing all combinations
  - Bayesian/adaptive optimization: iteratively chooses promising combinations based on previous results. ✅ Most efficient for large parameter spaces
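Of these, LHS is easy to try with SciPy's quasi-Monte Carlo module (a sketch; the sample size and seed are arbitrary choices):

```python
import numpy as np
from scipy.stats import qmc

# Latin Hypercube Sampling over the plausible ranges defined above:
# each dimension is split into n equal-probability strata with exactly
# one sample per stratum, giving even coverage without a full grid.
sampler = qmc.LatinHypercube(d=3, seed=42)
unit = sampler.random(n=200)                    # 200 points in [0, 1]^3
lo, hi = [0.01, 0.7, 50], [0.2, 1.0, 400]
samples = qmc.scale(unit, lo, hi)               # columns: k, L, N0
```

Each row of `samples` is one (k, L, N0) combination to run through the model and score with the loss function.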
- Selection criterion:
  - Loss function example (RMSE between simulated and empirical adoption):

    L(θ) = sqrt( (1/M) ∑_m (A_sim,m(θ) − A_emp,m)² )

    where θ is a parameter combination, A_sim,m is the simulated adoption indicator, and A_emp,m is the empirical adoption indicator for target m
  - Choose the parameters that minimize L(θ)
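This loss translates directly to NumPy (a sketch; A_sim and A_emp stand for the simulated and empirical adoption series):

```python
import numpy as np

def loss(A_sim, A_emp):
    # Root-mean-square error between simulated and empirical
    # adoption indicators: the L(θ) defined above.
    A_sim = np.asarray(A_sim, dtype=float)
    A_emp = np.asarray(A_emp, dtype=float)
    return np.sqrt(np.mean((A_sim - A_emp) ** 2))

loss([0.1, 0.5, 0.9], [0.2, 0.4, 1.0])  # ≈ 0.1
```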
⚡ Key Insights
- k determines the speed of behavioral change
- L sets the maximum achievable commitment
- N0 defines how much adoption is needed to trigger rapid change
- Intervals guide grid/random search, ensuring values remain realistic and computationally feasible
- Use sampling strategies to reduce runtime while preserving model reliability
📌 Next Steps
- Implement random/LHS sampling for main ABM parameters
- Define loss function for adoption targets
- Test calibration with ~500–1000 simulations as a starting point
- Document final calibrated values and ranges in GitHub README or parameter table
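The first three steps can be prototyped with a few lines of random search (everything here — run_model, the synthetic target, the 500-sample budget — is an illustrative assumption matching the numbers above, not the actual ABM):

```python
import numpy as np

rng = np.random.default_rng(0)
N_SAMPLES = 500  # lower end of the suggested starting budget

def run_model(k, L, N0):
    # Toy logistic stand-in for the real ABM over 100 ticks.
    t = np.arange(100)
    return L / (1 + np.exp(-k * (5 * t - N0)))

# Synthetic "empirical" adoption target, for illustration only.
target = run_model(0.1, 0.9, 200)

# Random search within the plausible ranges.
thetas = np.column_stack([
    rng.uniform(0.01, 0.2, N_SAMPLES),   # k
    rng.uniform(0.7, 1.0, N_SAMPLES),    # L
    rng.uniform(50, 400, N_SAMPLES),     # N0
])
losses = np.array([
    np.sqrt(np.mean((run_model(*th) - target) ** 2)) for th in thetas
])
best_theta = thetas[losses.argmin()]  # candidate values to document
```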
📚 References
- Rogers, E.M. (1962). Diffusion of Innovations. Free Press.
- Bass, F.M. (1969). "A New Product Growth Model for Consumer Durables." Management Science, 15(5), 215–227.
- Parvin, A.J. et al. (2021). "Macro Patterns and Trends of U.S. Consumer Technological Innovation Diffusion." Systems, 9(1), 16.