Personal portfolio website for Elliott John Mitchell — neuroinformatics researcher working at the intersection of brain-computer interfaces, neural dynamics, and perceptually-aligned AI. Built with HTML, CSS, and JavaScript, the site presents Elliott's research, publications, and technical work.
Live at: hawksp.github.io
```sh
git clone https://github.com/HawkSP/hawksp.github.io.git
```

Open `index.html` in a browser to view the portfolio locally.
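The site is fully static, so there is no build step. If a browser refuses to load assets over `file://`, one workaround (assuming Python 3 is available; any static file server works just as well) is the standard-library HTTP server, run from the repository root:

```sh
python -m http.server 8000
# then browse to http://localhost:8000
```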
Elliott John Mitchell is a neuroinformatics researcher specialising in brain-computer interfaces, neural dynamics, and perceptually-aligned AI systems. His research bridges computational neuroscience with deep learning to build AI that perceives temporal structure through oscillatory dynamics — the same mechanisms humans use to process rhythm, music, and motion.
He holds a First Class Honours degree in Music: Production, Performance and Enterprise from the University of Westminster (2024), and is applying to Queen Mary University of London's MSc Sound and Music Computing (AI and Music Data Science stream) for September 2026, with the intention of continuing into doctoral research.
The site is organised into the following sections:

**About** — research interests across neural dynamics, brain-computer interfaces, cross-modal perception, and perceptually-aligned AI, with headline metrics from the DedAI-Neurodynamics project.

**Resume** — education, research experience, professional experience, and technical skills.

**Portfolio** — featured projects:
- DedAI-Neurodynamics — Neural music generation framework integrating Neural Resonance Theory (ASHLE, GrFNN) with deep learning
- PyNRT — Open-source Python toolkit for simulating Hopf oscillators, gradient frequency neural networks, and adaptive oscillator models (a minimal oscillator sketch follows this list)
- DMRN+18 Talk — 20-minute invited talk at Queen Mary University of London on EEG-driven music generation
- DedAI Research Paper — Advanced AI-Driven Music Composition Informed by EEG-Based Emotional Analysis
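PyNRT's public API is not documented in this README, so the snippet below is a self-contained NumPy sketch of the canonical Hopf oscillator that toolkits like PyNRT simulate. The function name `simulate_hopf`, its parameters, and the sinusoidal forcing term are illustrative assumptions, not PyNRT's actual interface.

```python
import numpy as np

def simulate_hopf(alpha=1.0, beta=-1.0, f_osc=2.0, f_stim=2.1,
                  forcing=0.5, dt=1e-3, duration=10.0):
    """Forward-Euler integration of a sinusoidally forced Hopf oscillator:

        dz/dt = z * (alpha + i*2*pi*f_osc + beta*|z|^2)
                + forcing * exp(i*2*pi*f_stim*t)

    With alpha > 0 and beta < 0 the unforced oscillator settles onto a
    limit cycle of radius sqrt(-alpha/beta); weak forcing near f_osc
    entrains its phase to the stimulus.
    """
    n = int(duration / dt)
    t = np.arange(n) * dt
    z = np.empty(n, dtype=complex)
    z[0] = 0.1 + 0.0j  # small perturbation off the unstable fixed point at 0
    for k in range(n - 1):
        stim = forcing * np.exp(2j * np.pi * f_stim * t[k])
        dz = z[k] * (alpha + 2j * np.pi * f_osc + beta * abs(z[k]) ** 2) + stim
        z[k + 1] = z[k] + dt * dz
    return t, z

t, z = simulate_hopf()
print(f"steady-state amplitude ~ {abs(z[-1]):.3f}")  # near sqrt(-alpha/beta) = 1
```

A gradient frequency neural network (GrFNN) generalises this idea to a bank of such oscillators whose natural frequencies are spread along a gradient, all driven by a shared input.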
**Blog** — project updates from the DedAI-Neurodynamics platform.
**Contact** — direct contact details and academic references (Hussein Boon and Dr Jasmine Taylor, both University of Westminster).

Headline metrics from the DedAI-Neurodynamics project:
- 94.37% Phase-Locking Value retention in trained TCN surrogate of ASHLE dynamics
- <50 ms real-time latency for closed-loop brain-computer music interfacing (BCMI)
- r = 0.91 cross-modal correlation between visual motion energy and ASHLE rhythm synthesis
- p = 0.0158 statistically significant difference in entrainment between musicians and non-musicians
- 14-channel EEG acquisition and analysis (Emotiv EPOC X)
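For context on the phase-locking value (PLV) figure above: PLV measures the consistency of the phase difference between two signals, from 0 (no consistent relationship) to 1 (perfect locking). Below is a minimal sketch of the standard Hilbert-transform PLV computation on synthetic stand-in signals; it illustrates the metric only and is not the project's evaluation code.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV = |mean(exp(i * (phi_x - phi_y)))|, with instantaneous phases
    taken from the analytic (Hilbert) signal of each input. Returns 1.0
    for perfectly locked phases, ~0.0 for no consistent relationship."""
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Synthetic demo: two 2 Hz oscillations, the second with phase jitter.
fs = 250                       # hypothetical sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 2 * t)
jittered = np.sin(2 * np.pi * 2 * t + 0.3 * rng.standard_normal(t.size))
print(f"PLV = {phase_locking_value(clean, jittered):.3f}")  # high, but < 1
```

In the surrogate evaluation above, the two inputs would presumably be the ASHLE ground-truth oscillation and the TCN's output, with the quoted retention figure comparing the surrogate's phase behaviour against the original dynamics.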
Based on a template by codewithsadee. Heavily customised in content, structure, and styling for Elliott's work.
elliott.mitchell10@gmail.com · github.com/HawkSP · linkedin.com/in/dedeye
MIT — see LICENSE for details.