
first hmm example#923

Merged
AlexanderFengler merged 5 commits into main from low-level-to-hmm-example
Mar 12, 2026

Conversation

@AlexanderFengler
Member

First HMM-SSM notebook example.
Needs a bit of refinement, but it's a proof of concept.

@review-notebook-app

Check out this pull request on  ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.


Powered by ReviewNB

digicosmos86 previously approved these changes Mar 11, 2026
Collaborator

@digicosmos86 left a comment


LGTM!

@review-notebook-app

review-notebook-app bot commented Mar 11, 2026

View / edit / reply to this conversation on ReviewNB

krishnbera commented on 2026-03-11T18:30:36Z
----------------------------------------------------------------

minor: consider making the title more informative for a not-so-technical audience who might be interested in latent processes?

E.g., Joint Modeling of Latent Cognitive States and Decision Processes: Utilizing HMMs with DDM Emissions


@review-notebook-app

review-notebook-app bot commented Mar 11, 2026

View / edit / reply to this conversation on ReviewNB

krishnbera commented on 2026-03-11T18:30:37Z
----------------------------------------------------------------

Broken GitHub repo link here.


@review-notebook-app

review-notebook-app bot commented Mar 11, 2026

View / edit / reply to this conversation on ReviewNB

krishnbera commented on 2026-03-11T18:30:38Z
----------------------------------------------------------------

Line #3.    

Can probably introduce a blurb about how the likelihood is computed for such models? Might help people who are not already familiar with it.

See eg. blurb below --

The likelihood of an HMM-DDM is calculated by integrating the continuous DDM densities into a discrete Hidden Markov Model framework using the Forward Algorithm.

The Core Process

  1. Local DDM Densities: For every trial $t$, we calculate the likelihood of the observed response time and choice under each possible hidden state $k$.
  2. State Filtering: We use a "forward pass" to track the probability of being in state $k$ at time $t$, given all previous observations.
  3. Recursive Update: This forward probability is updated trial-by-trial by multiplying the previous trial's beliefs by the transition matrix $P$ and the current trial's DDM density.
  4. Total Likelihood: By summing these probabilities at the final trial $N$, we obtain the marginal likelihood of the entire dataset across all possible state sequences.


@krishnbera
Collaborator

@AlexanderFengler this is already in great shape!
Made a few minor comments. The ability to use HMMs with SSMs in such an easily accessible manner is definitely something a lot of people would find interesting.

krishnbera previously approved these changes Mar 11, 2026
AlexanderFengler and others added 2 commits March 11, 2026 21:22
- Update title to be more informative for broader audience
- Remove broken link to regime-switching-bayesian repo
- Add Forward Algorithm likelihood explanation blurb

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@AlexanderFengler AlexanderFengler merged commit 6f3f20a into main Mar 12, 2026
4 checks passed