There could be an Instrument interface that is implemented by instrument types such as PrerenderedSynthInstrument, RealtimeSynthInstrument, and RealInstrument. (There's no great built-in way to do interfaces in JS, so "interface" is purely conceptual here.) These instruments would be managed by the NotePlayer class.
- PrerenderedSynthInstrument would work the way the code currently does: pre-render a scale of notes into an audio buffer, then play slices of that buffer through multiple voices as notes are requested.
- RealtimeSynthInstrument would schedule and play synth notes in real time, the way the Tone.js library works by default.
- RealInstrument would load a set of audio files (or an audio sprite) from URL(s). This could allow a ToneMatrix instrument to be backed by recordings of a real instrument such as a piano.
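As a rough sketch of how this could hang together, the "interface" could be a base class whose methods throw until overridden, with NotePlayer delegating note requests to whichever Instrument is active. The method names (load, playNote) and constructor shape here are assumptions for illustration, not part of the existing code:

```javascript
// Conceptual Instrument "interface" as a base class. Subclasses such as
// PrerenderedSynthInstrument, RealtimeSynthInstrument, and RealInstrument
// would override these methods. Method names are hypothetical.
class Instrument {
  // Prepare audio resources (render buffers, fetch files, etc.).
  async load() {
    throw new Error('load() not implemented');
  }

  // Play a single note (e.g. "C4") at the given audio-context time.
  playNote(note, time) {
    throw new Error('playNote() not implemented');
  }
}

// NotePlayer manages the active Instrument and forwards note requests
// to it, so the rest of the app never cares which implementation is in use.
class NotePlayer {
  constructor(instrument) {
    this.instrument = instrument;
  }

  // Swap instruments, waiting for the new one to finish loading first.
  async setInstrument(instrument) {
    await instrument.load();
    this.instrument = instrument;
  }

  playNote(note, time = 0) {
    this.instrument.playNote(note, time);
  }
}
```

This keeps the swap between synth-backed and sample-backed instruments behind one call site: NotePlayer.playNote works the same regardless of which subclass is loaded.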