feat: implement quad-flow interaction system and major architectural cleanup #1
Merged
BenjaminIsaac0111 merged 4 commits into main on Feb 25, 2026
Implemented a flexible attention masking system (Quad-Flow) that allows granular
configuration of interaction quadrants (p2p, p2h, h2p, h2h), now defaulting to
full interaction mode. This refactor simplifies the core architecture while
introducing robust biological supervision and scalable presets.
Key Changes:
- Model Architecture:
* Replaced binary masking with a composable '--interactions' list.
* Integrated AuxiliaryPathwayLoss for direct biological supervision and
ZINB loss support for raw count modeling.
* Simplified core logic by inlining 'PathwayTokenizer' and removing legacy
5D/2D branches, dead aliases, and deprecated forward paths.
* Added 'return_attention' support for cross-layer attention map extraction.
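
As a rough illustration of the composable '--interactions' idea, a quad-flow
mask can be built by unblocking only the requested quadrants. This is a
hypothetical sketch, not the repository's implementation: it assumes pathway
('p') tokens come first in the sequence, 'h' denotes the second token stream,
and it follows the PyTorch attn_mask convention (True = attention blocked,
rows = queries, columns = keys).

```python
def build_interaction_mask(n_p, n_h, interactions):
    """Boolean attention mask over the four interaction quadrants.

    True means "attention blocked"; tokens 0..n_p-1 are pathway ('p')
    tokens, the remaining n_h tokens are the second stream ('h').
    """
    n = n_p + n_h
    mask = [[True] * n for _ in range(n)]  # start fully blocked
    quads = {
        "p2p": (range(0, n_p), range(0, n_p)),
        "p2h": (range(0, n_p), range(n_p, n)),
        "h2p": (range(n_p, n), range(0, n_p)),
        "h2h": (range(n_p, n), range(n_p, n)),
    }
    for name in interactions:
        rows, cols = quads[name]
        for i in rows:
            for j in cols:
                mask[i][j] = False  # unblock this quadrant
    return mask
```

Passing all four quadrants yields an all-False mask, i.e. the full-interaction
default described above.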
- Diagnostics & Scalability:
* Added 'diagnose_collapse.py' to monitor pathway diversity and analyze
mean attention across the four interaction quadrants.
* Refactored 'run_preset.py' into a unified flag system with new scaled
model variants (L2, L4, L6).
* Standardized environment setup (PS1/SH) on Python 3.9 for CI consistency.
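
The per-quadrant attention diagnostic can be pictured with a small helper like
the following. This is an illustrative sketch only (the actual
'diagnose_collapse.py' likely operates on framework tensors); it assumes a
square row-stochastic attention matrix with pathway tokens first.

```python
def quadrant_mean_attention(attn, n_p):
    """Mean attention weight in each of the four interaction quadrants.

    attn: square attention matrix (list of lists), pathway tokens first;
    n_p: number of pathway tokens.
    """
    buckets = {"p2p": [], "p2h": [], "h2p": [], "h2h": []}
    for i, row in enumerate(attn):
        for j, w in enumerate(row):
            key = ("p" if i < n_p else "h") + "2" + ("p" if j < n_p else "h")
            buckets[key].append(w)
    # Average the weights collected in each quadrant
    return {k: sum(v) / len(v) for k, v in buckets.items()}
```

A strongly skewed quadrant mean (e.g. p2p dominating p2h) is the kind of
signal such a diagnostic would surface when attention collapses.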
- Documentation & Project:
* Rewrote 'MODELS.md' to detail the Quad-Flow Interaction design and
auxiliary loss branch.
* Updated 'README.md' and 'TRAINING_GUIDE.md' with ZINB recipes, training
presets, and feature highlights.
* Introduced 'CONTRIBUTING.md' to define IP rules and dev standards.
- Verification:
* Expanded test suite to verify interaction logic, ZINB/Auxiliary loss
stability, and MSigDB signal detection.
- Update CONTRIBUTING.md to formally document main-branch protection requirements, including mandatory PR reviews, status checks, and linear history.
- Create .github/CODEOWNERS to automatically assign @BenjaminIsaac0111 as a reviewer for all project modules and documentation.
- Align contribution guidelines with the new Repository Ruleset workflow.
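
A CODEOWNERS file that assigns one reviewer across all modules and
documentation, as described above, can be as small as a single catch-all
pattern (the repository's actual file may list paths more granularly):

```
# .github/CODEOWNERS
# The '*' pattern matches every file, so @BenjaminIsaac0111 is
# requested as reviewer on all project modules and documentation.
*   @BenjaminIsaac0111
```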
…orrectly placing enable_nested_tensor=False in the TransformerEncoder constructor.
- Pytest configuration: configured pyproject.toml to suppress common non-critical warnings (Deprecation, Matplotlib, etc.).
- Demonstration test: added tests/test_warnings.py to illustrate how to handle warnings using pytest.warns.
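
The pyproject.toml warning suppression mentioned above typically lives under
pytest's ini-options table. The exact filter patterns below are hypothetical;
the repository's list may differ:

```toml
# pyproject.toml (hypothetical filter list)
[tool.pytest.ini_options]
filterwarnings = [
    "ignore::DeprecationWarning",
    # Ignore UserWarnings raised from matplotlib modules
    "ignore::UserWarning:matplotlib.*",
]
```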