Post by feijai on Jan 23, 2024 21:53:48 GMT
Namke and I are submitting a paper to SMC 2024 (Sound and Music Computing conference) entitled "Modular MIDI". It is a technical paper describing advanced ways to use MIDI in a modular system, with the focus on AE Modular but with an eye towards other platforms. It details a specification of conventions for many of these ideas. The goal is a spec with multiple levels: basic levels that are useful NOW for current module developers, and advanced levels that lay out a roadmap (which can change) for more sophisticated applications in the future. We would like your feedback. Our deadline is mid to late February. Please post to the forum and tell us what you think!
Attached to this message are two files:
- midi.pdf: the paper itself. It is a highly technical academic paper.
- introduction.pdf: a short, breezy ELI5 introduction to the spec that you should read FIRST to come up to speed with the ideas we're tossing around. I typed this up fast and now that I read it, man, it's got a lot of typos. :-)
Sound and Music Computing is a moderately-sized research computer science conference whose subject is computer music and electronic music technology research. It is rigorously peer-reviewed. Researchers submit papers to it to be officially published, and then present their papers at the conference. I'm a computer science professor, so this is what I do...
Attachments:midi.pdf (173.61 KB)
introduction.pdf (76.94 KB)
gerif
Junior Member
Posts: 53
Post by gerif on Jan 24, 2024 0:46:39 GMT
Hello!
Lots of ideas! For the moment I have only looked at the introduction, but here is some short feedback for discussion.
Patching MIDI information inside AE Modular is already possible with the Wonkystuff modules: with mb/1, mcc/4, mtr/8 and mdiv we have an interface from MIDI to the analogue world.
The mco/1 is something totally different, because its interface to the analogue world is restricted (no CV output, and no analogue inputs, e.g. for modulation).
So clearly next steps will follow!
Generally I see two different ways:
a) Enhance the MIDI implementation to provide more CV and gate outputs, and keep AE Modular in the analogue world
b) Create a set of modules that use MIDI communication instead of CV and gate signals
I personally would prefer a). With b) the number of parameters would explode and would no longer be accessible haptically. I see such a system as the next generation of modular synthesizer.
The MIDI standard already defines a lot of things, so the first question is compatibility. As soon as we connect controllers/sequencers/DAWs, we have to consider what information they are able to configure and handle. There is a very wide range, from "can only set a MIDI channel" to very flexible configuration of the whole MIDI functionality.
Some of the ideas do not seem to be supported by common hardware (e.g. see the list of CCs): with my AKAI MINIAK I am not able to configure CC 20.
A DAW offers more possibilities, but it is not a simple job to configure what you describe with these ideas. A software configuration tool would be necessary to set up the system.
Concerning the connections between the MIDI modules: the current analogue connections are point to point.
With MIDI it could only be a bus structure, but it would be possible to have more than one "thru" output and one MIDI input per module. With more than one MIDI output, the patching itself would take over the function of addressing (defining the connection) instead of using an additional ID.
Some time ago I thought about a system with digital patching instead of analogue: transfer 32 bits at 192 kHz over fibre-optic wires between the modules (audio, CV, gate, trigger, parameters, etc.), but use encoders to define the values. But that is another story!
Gerhard
Post by gerif on Jan 24, 2024 7:53:20 GMT
A completely digital conception could be a rack with interface modules that look like AE Modular modules but consist only of the pots, switches, sliders, patch points, etc. The electronics connect this user interface to an internal bus, which leads to a communication controller. One or more such racks are connected to a personal computer.
So this is a digital modular synthesizer controller, providing the haptics of an analogue interface!
All processing is done in software and plugins on the computer.
With quite a low number of different hardware modules, a very flexible synth could be created!
Post by feijai on Jan 25, 2024 12:01:32 GMT
It looks like most of the discussion of the paper is going on on Discord, not here, but I'd like to get things going here too. To start, let me respond to a few items.

On the choice between (a) enhancing the MIDI implementation to provide more CV and gate outputs while keeping AE Modular analogue, and (b) creating a set of modules that use MIDI instead of CV and gate: this is up to namke of course. But there's no reason you couldn't have some limited analog and MIDI signals mixed, though notes and gate might be hard.

On compatibility: the goal in the paper was to cover all the bases. If a controller or DAW doesn't support something, we have to have some kind of fallback. For example, we support 14-bit CC and NRPN, but a number of lazy, stupid DAWs don't support either one (this despite the fact that NRPN has been part of MIDI since the late 1980s, and 14-bit CC has ALWAYS been standard in MIDI). We can't do much about this, but at least our 14-bit CC is set up to fall back to 7-bit CC if necessary. I was not aware the MINIAK could send any CC at all: it is a rebadged Alesis Micron, and I had thought it sends but does not accept CCs. We cannot do much about controllers or keyboards that do not fully support MIDI. A good option would be a Novation SL!

On a software configuration tool: that wouldn't be hard to build. But it shouldn't be necessary with a good controller, should it?

On the topology: it's not a bus structure, it's point to point. But you are correct that MIDI does not allow cycles, that is, you can't connect A to B and then B back to A. Also, each node can have at most one MIDI IN (unless the node does MIDI Merge, which is complicated), though you can have as many MIDI THRUs on a module as you like. We do allow a node X to modulate node Y via MIDI while BOTH are being controlled by the musician.

On digital patching: I am not truly a modular synth guy, but it seems to me that this would partially defeat the point of a modular synthesizer. If you're going to go fully that route, you might as well just use a standard synthesizer with a modulation matrix, which does the same thing much more cheaply...
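A minimal sketch of how that 14-bit-CC-with-7-bit-fallback scheme works in standard MIDI (function names here are illustrative, not from the paper): the 14-bit value is split across a CC pair, with the coarse 7 bits on CC n (0-31) and the fine 7 bits on CC n+32, so a receiver that only understands 7-bit CCs can use the coarse message alone.

```python
def cc14_messages(channel, cc_msb, value14):
    """Encode a 14-bit value (0..16383) as a standard 14-bit CC pair.

    Per MIDI convention, the MSB goes on CC cc_msb (0..31) and the
    LSB on CC cc_msb + 32.  A 7-bit-only receiver can ignore the LSB
    message and still get a coarse value: that is the fallback.
    """
    assert 0 <= value14 <= 16383 and 0 <= cc_msb <= 31
    status = 0xB0 | (channel & 0x0F)          # Control Change status byte
    msb = (value14 >> 7) & 0x7F
    lsb = value14 & 0x7F
    return [(status, cc_msb, msb),            # coarse value first
            (status, cc_msb + 32, lsb)]       # fine value second

def decode_7bit_fallback(messages, cc_msb):
    """What a 7-bit-only receiver sees: just the MSB, as a 0..127 value."""
    for status, cc, val in messages:
        if cc == cc_msb:
            return val
    return None
```

For example, `cc14_messages(0, 16, 10000)` yields an MSB of 78 on CC 16 and an LSB of 16 on CC 48; a 7-bit receiver simply reads 78.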
Post by leethargo on Jan 25, 2024 13:32:47 GMT
First, I think both the intro as well as the article are clearly written (although I only skimmed over the later sections dealing with namespace conflicts).
Second, the intro in particular was helpful anti-GAS for me, as I no longer feel like I'm missing out by not jumping on the MIDI-inside-AEM train, since I lack external MIDI gear and don't want to use a DAW anyway. Though, the comments about pitch tracking and microtonal scales are intriguing. Thanks :-)
In terms of the scope of the article, as well as future work, it seems that the outlook is limited to a particular direction of signal flow, where MIDI originates outside the modular system (e.g. DAW or controller) and is then consumed by sound-generating modules inside the system. But I could equally imagine going the other direction: maybe I want to use modules to produce melody lines generatively (e.g. by some combination of sequencers and Turing machines, running at different clock divisions and transposing each other) and then use the resulting pitch/gate events to control some external (non-modular) MIDI synthesizer. I guess converting V/oct to MIDI involves quantizing and might still be ambiguous, but otherwise this scenario would still fit in your protocol, right?
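As an illustration of the quantization involved (a sketch under the usual 1 V/oct assumption, with an arbitrary choice of reference note): each semitone is 1/12 V, so converting CV to a MIDI note means rounding to the nearest semitone, and the rounding error is exactly what gets lost, or what a pitch-bend message would have to carry.

```python
def cv_to_midi_note(cv_volts, ref_note=36):
    """Quantize a 1 V/oct CV to the nearest MIDI note.

    ref_note is the note produced by 0 V (assumed here to be C2 = 36;
    this choice is illustrative).  Returns (note, error_in_semitones):
    the error is what quantization discards, i.e. what a pitch-bend
    message would need to encode to keep the pitch exact.
    """
    semis = cv_volts * 12.0                 # 1 V/oct -> 12 semitones per volt
    note = round(semis) + ref_note
    error = semis - round(semis)            # roughly in [-0.5, 0.5] semitones
    return max(0, min(127, note)), error
```

So 1.00 V maps cleanly to note 48, while 0.51 V maps to note 42 with a residual error of 0.12 semitones.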
namke
wonkystuff
electronics and sound, what's not to like?!
Posts: 654
Post by namke on Jan 25, 2024 16:58:51 GMT
leethargo asked whether the reverse direction (modules generating melody lines inside AE, with the resulting pitch/gate events controlling an external MIDI synthesizer) would still fit the protocol.

Yes, there's nothing preventing MIDI signals from leaving AE, with, as you mentioned, some kind of CV-MIDI bridge. Depending upon the desired signal output (notes, or CCs for example) there might be something like a 'sample and hold' that took a CV input and sent a MIDI message when it received a trigger pulse. @feijai's DAVE firmware for GRAINS has already shown it can send messages to the TRS of the mb/1, so a more 'formalised' version of his adapter could be made…
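A hypothetical sketch of the 'sample and hold' bridge described above (class and parameter names are invented for illustration, not from any firmware): on each rising trigger edge it samples the CV, quantizes it to a note, and emits a note-off for the previous note followed by a note-on for the new one.

```python
class CvToMidiBridge:
    """Sketch of a trigger-driven CV-to-MIDI 'sample and hold' bridge.

    On each rising trigger edge we sample the CV input, quantize it to
    the nearest semitone (1 V/oct, 0 V = note 36 assumed), and emit a
    note-off for the previous note followed by a note-on for the new one.
    """
    def __init__(self, channel=0, ref_note=36):
        self.status_on = 0x90 | channel    # Note On status byte
        self.status_off = 0x80 | channel   # Note Off status byte
        self.ref_note = ref_note
        self.last_note = None
        self.last_trigger = 0

    def tick(self, cv_volts, trigger):
        """Call once per control tick; returns a list of MIDI messages."""
        out = []
        if trigger and not self.last_trigger:          # rising edge only
            note = max(0, min(127, round(cv_volts * 12) + self.ref_note))
            if self.last_note is not None:
                out.append((self.status_off, self.last_note, 0))
            out.append((self.status_on, note, 100))    # fixed velocity
            self.last_note = note
        self.last_trigger = trigger
        return out
```

Feeding it 1 V with a trigger produces a single note-on for note 48; a later trigger at 2 V produces a note-off for 48 and a note-on for 60.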
Post by namke on Jan 25, 2024 17:03:34 GMT
feijai wrote, on the choice between enhancing the MIDI implementation for more CV/gate outputs versus creating modules that communicate over MIDI, that "this is up to namke of course".

Not just me of course! The great thing about AE is that it is DIY friendly, so these ideas might be the things which launch another 3rd party developer!
Post by visuellemusik on Jan 25, 2024 19:51:29 GMT
On leethargo's point about the direction of signal flow: TBH, I also find the use cases (signal flows) that mainly keep things inside the AE Modular (so no MIDI from DAWs, for instance), or that expand it further to places "Where No Man Has Gone Before", the most intriguing. I think it's a great chance to use MIDI in less expected ways while still staying compatible with the protocol. Best case: fewer cables and less complexity, but still as fun and flexible as CV/gate. And of course not as an alternative to CV/gate, but as an extension where it makes sense and you want to use it. Due to the technology involved, some modules could even be smaller and/or cheaper and/or more precise without CV, because you don't need ADCs or DACs when a module (say, a modulator or sequencer) uses only MIDI. Still, regarding MIDI signals from outside the AE Modular: hardware like MIDI controllers is nice, I guess ;-)
In terms of converting V/oct to MIDI without quantizing: typically you could do this the "MPE way" (which is one of many options with the concepts from the paper). You can still have CV with a precision of 14 bits, which really should be enough, and is even more than some digital modules (using 10 or 12 bits, for instance) can process.
How this is done in principle: along with the pitch given by the MIDI note, you obviously can have pitch bend. And because MPE features "voices" per MIDI channel, you can have pitch bend per voice (sometimes called a member channel in MPE). There is one nasty catch with MPE: the standard suggests that after note-off (so typically during the release phase of a voice) a receiver should not process pitch bend anymore, which I find really weird. (The idea behind this is to prevent tonal glitches between several notes of one voice, which normally would not happen if we have one note per MPE voice, but anyway.) As with many other things in MIDI 1.0 (including MPE), I am quite sure we can find ways around that.
Another option for variable pitch with MIDI is pitch maps, but those (making use of SysEx messages) are not intended for real-time use, and are to some extent (at least without the use of pitch bend) a way to quantize to other scales (per note if you want). Still a nice feature we get with MIDI "for free", I'd say.
Post by feijai on Jan 25, 2024 20:08:49 GMT
namke suggested a more 'formalised' CV-MIDI bridge, something like a 'sample and hold' that takes a CV input and sends a MIDI message when it receives a trigger pulse, building on the DAVE firmware's ability to send messages to the TRS of the mb/1.

Hmmm, you got me wondering if I should set up USB MIDI Out on the GRAINS. What would people want to send out? Notes from SEQ16? Triggers?
Post by admin on Jan 25, 2024 20:14:20 GMT
Hmmm, you got me wondering if I should set up USB MIDI Out on the grains. What would people want to send out? Notes from SEQ16? Triggers?
* triggers from Topograf or EuclidGrid
* pitch and gate from RBSS
* CV from LFOs
It would be cool to be able to pipe those into my iPad, into one of the many synths and drum kits I have amassed in there.
Post by feijai on Jan 25, 2024 21:32:13 GMT
On converting V/oct to MIDI without quantizing: the MPE method (note off, then pitch bend, then note on) is probably the only reasonable approach. But you have to know what the pitch bend range is on the follow-on instrument, and/or whether it supports RPN pitch bend range. The default bend range in MPE is crazy: it's +/-4 octaves! Even so, that still gives you about 170 steps per semitone (8192 bend steps spread over 48 semitones).

On the note-off catch: I thought so too, but it makes perfect sense if you want precise pitches. For your first note you set the pitch bend, then do the note on. Fine. But for the second note, you do note off, then set the pitch bend and note on. Uh oh! When you changed the pitch bend, the *first note's pitch changed*, because its release is still sounding. That's why they say pitch bend can't affect the note after note off: so you can be guaranteed that changing the pitch bend to prepare for the next note won't mess up the previous one.

I would be surprised if microtuning maps will work. MIDI has a reasonable microtuning standard, but not enough machines support it, and I've never heard of one which could change it in real time.
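The arithmetic above can be sketched as follows (illustrative code, assuming the +/-48-semitone MPE default bend range): pick the nearest note, then encode the fractional remainder as a 14-bit pitch-bend value centered on 8192.

```python
def mpe_note_and_bend(pitch_semitones, bend_range=48.0):
    """Encode a fractional pitch as (MIDI note, 14-bit pitch bend).

    bend_range is the receiver's bend range in semitones per side;
    MPE's default is +/-48 semitones (4 octaves), which still leaves
    8192 / 48 ~= 170 bend steps per semitone.
    """
    note = int(round(pitch_semitones))
    frac = pitch_semitones - note                  # residual, about +/-0.5
    bend = 8192 + int(round(frac * 8192 / bend_range))
    return note, max(0, min(16383, bend))
```

For example, a pitch of 60.25 semitones becomes note 60 with a bend of 8235 (8192 + 43), since a quarter semitone is about 43 of the roughly 170 bend steps per semitone.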
Post by feijai on Jan 25, 2024 21:34:38 GMT
BTW, it's worth mentioning that, though he's not listed on the draft, visuellemusik will be the third author on the paper.
Post by gerif on Jan 25, 2024 23:15:12 GMT
Hello!
Now I have read the paper too! I understand the idea of the paper this way: find additional conventions on top of the protocol to enable communication between a lot of different modules.
First, I see the equipment around a modular synth in different configurations:
- Many users use only the synth: here MIDI is helpful only to communicate internally between different digital modules.
- Other users have a lot of equipment around the synth (controller, DAW, sequencer, etc.): here it is not enough to provide input for just one device. We want to play the keys to control the synth and at the same time record this to the DAW, including the movements of the knobs on the modular synth! And we want to send it back from the DAW to the equipment (treating the modular synth the way we are used to with other digital sound modules; I am only dreaming here).
The paper describes something between these two poles. It is quite complex, and I am sure there will be some surprises during implementation and integration with existing equipment.
Looking only at the synth, we have mainly three types of modules:
- oscillators
- effects (filters, VCAs, actual effects)
- manipulators (LFOs, envelopes, sequencers)
plus mixes of all of these (sound modules). Manipulators mainly send information to oscillators, effects and other manipulators. All of them should be remote controllable:
- some modules should have remote control from outside (MIDI, CV, gates, triggers)
- most modules have internal inputs (CV, gate, trigger, audio)
- most modules have a user interface to interact with the user
- each module has outputs (CV, gate, trigger, audio), internal or external
If we want to use MIDI internally instead of CV, gates or triggers, the modules need corresponding interfaces. To provide something similar to what is possible with CV, we need inputs that can receive information from different sources and outputs to several different modules. On AE Modular modules, up to 8 different I/Os are used and routed to a lot of other modules; routing is possible to any other module, independent of which voice, etc., it belongs to.
Something similar will not work with only one MIDI IN and one MIDI THRU port per module! A standard bus controller with a bus inside the rack should be considered. I am sure there are cheap chips (e.g. Ethernet?) which could be used on digital modules; each module would have its own dedicated address. The MIDI protocol could still be used, but with multiple instances. With only serial MIDI I/O ports, more than one input and output will usually be needed!
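Purely as a thought experiment on the "dedicated address" idea (this framing is entirely hypothetical and not part of any spec or the paper): MIDI messages on a shared rack bus could be wrapped in a tiny frame carrying a destination module ID, with each module filtering on its own address.

```python
def frame_for_bus(module_id, midi_bytes):
    """Wrap a raw MIDI message for a hypothetical shared rack bus.

    Frame layout (illustrative only): [module_id, length, midi bytes...].
    Every module sees every frame and keeps only those addressed to it.
    """
    assert 0 <= module_id <= 255 and len(midi_bytes) <= 255
    return bytes([module_id, len(midi_bytes)]) + bytes(midi_bytes)

def unframe_for_module(stream, my_id):
    """Return the MIDI messages in 'stream' addressed to module my_id."""
    out, i = [], 0
    while i < len(stream):
        dest, length = stream[i], stream[i + 1]
        if dest == my_id:
            out.append(bytes(stream[i + 2 : i + 2 + length]))
        i += 2 + length
    return out
```

The point of the sketch is the trade-off Gerhard raises: with addressing on a bus, the "connection" is a number in the frame rather than a physical patch cable.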
Finally, I would rather see a modular synth with digital modules built from more complex modules, where complete functionalities (oscillators, effects, manipulation, polyphony, etc.) are integrated in one module. At the MIDI level, each such module could then be seen as one device (similar to stand-alone equipment), at least with the current possibilities!
The paper surely comes at the right time! Perhaps some simplification can be found, e.g. by handling connections and identification outside the MIDI protocol.
Gerhard