What is MIDI 2.0, and what does it mean for musicians and producers?
We chat to Art+Logic’s Brett Porter to find out what makes this new interfacing spec tick
It’s been an essential building block of how music is made for about 35 years now, but would you be surprised to learn that the MIDI standard - the protocol we use every day to play and program music on computers, synths and other electronic gear - is still at version 1.0?
• Also see: MIDI 2.0 spec confirmed: “the biggest advance in music technology in decades”
The fact that things have remained unchanged for so long is a testament to the original work of Yamaha, Sequential Circuits, Roland, Korg and Kawai. But whether you think that MIDI 1.0 is all musicians need or that it’s ripe for a change, change is on the way in the form of MIDI 2.0, an update to the specification first announced in the last couple of years and developed with help from some of today’s biggest software creators.
We should mention at this point that the MIDI 2.0 specification isn’t final at the time of publication. It’s subject to change, but at this late stage, it seems that the vast, vast majority of the work has been done.
(UPDATE: The MIDI Manufacturers Association voted at its annual meeting on the Sunday of NAMM to adopt MIDI 2.0 as an official spec, so there won't be any substantial changes now.)
With the completion of MIDI 2.0 looking imminent (at least when compared to the 35-year lifespan of version 1.0) - and Roland having recently announced the A-88MKII, its first MIDI 2.0-compatible controller - MusicRadar talked to Art+Logic’s Brett Porter, one of the players involved in the crafting and implementation of the standard.
Decoding the future
When we talk to Brett, he’s recently returned from London’s ADC 2019 conference, where he had been presenting on MIDI 2.0.
“That was really the first public-facing presentation of a lot of the material in that kind of detail,” says Brett, whose company, Art+Logic, is an audio software developer for hire. “What we brought to the table was that we didn’t have an agenda. We’re not doing this to try and sell a product - we’re involved because what makes the musical instrument industry bigger is obviously good for us.”
We should be upfront and un-sensational here because, as Brett explains, the implications of MIDI 2.0 are hard to pin down right now. The new protocol offers a whole field of possibilities, but it’s up to manufacturers and developers to make creative use of them.
“It’s kind of like we’ve been designing a box of Lego bricks, and now somebody else has got to make something cool out of it. I think so much of what we’ve seen over the last 35 years, so much of what people are doing with MIDI now, wasn’t imagined when MIDI 1.0 was created in the early 80s.”
The very first MIDI 2.0 instruments
Early 2019 saw the first MIDI 2.0 synth and controller prototypes come together to see if they could communicate effectively based on the draft specification at the time. At Winter NAMM, companies including Ableton, Google, Roland, Steinberg, Yamaha and Native Instruments came together for the MIDI 2.0 ‘Plug Fest’, echoing the similar event that took place during the creation of MIDI 1.0.
“A Plug Fest is exactly what it sounds like - everybody runs around with what they have, connecting stuff together and checking what happens. This was the first time all of us on the prototyping group were in the same room.”
You can find some footage from 2019’s Plug Fest at the MMA website.
“It was very much a historic occasion. My contribution was a debugging and development tool we call MIDI 2.0 Scope. Basically, it allows you to create any MIDI 2.0 message and send it under controlled circumstances so you can be sure that your synth responds correctly. It makes it very obvious if something is being sent in error.”
Higher resolution for velocity and control messages
One of the most obvious upgrades to a modern digital communication system would be increased resolution. What can MIDI 2.0 offer us that MIDI 1.0 couldn’t?
“Velocity is moving up from 7-bit to 16-bit - so a lot more resolution. Most of the controllers are 32-bit now, meaning vastly more resolution.”
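To picture what that jump means in practice, here’s a minimal C++ sketch of one common way to widen a 7-bit MIDI 1.0 velocity into MIDI 2.0’s 16-bit range: a plain left-shift can never reach the new maximum, so the source bits are repeated into the freed-up low bits. The function name is ours, and the official MIDI 2.0 translation rules are more detailed than this.

```cpp
#include <cstdint>

// Hypothetical helper: widen a 7-bit value (0-127) to 16 bits.
// A bare shift left by 9 tops out at 65024, not 65535, so the
// source bits are replicated downwards to fill the new range.
uint16_t upscaleVelocity(uint8_t v7)
{
    uint16_t v = static_cast<uint16_t>(v7 & 0x7F) << 9; // into the top bits
    v |= (v7 & 0x7F) << 2;  // repeat the 7 bits below
    v |= (v7 & 0x7F) >> 5;  // ...and the top two bits at the bottom
    return v;               // 0 -> 0x0000, 127 -> 0xFFFF
}
```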
While some would argue that higher-resolution controls are unnecessary for music-making applications, MIDI has become a lot more than just a musical standard, with fields like lighting and animatronics feeling the benefit of MIDI 1.0 - and now feeling the squeeze of its limited resolution, too.
“Anytime you need data that you can control over time, there are tools that can work with MIDI that can do that. If you’re controlling an animatronic character rather than a DX7, that’s OK.”
“I think there’s going to be a bit of leapfrog going on. I think right now the industry is so used to having 7-bit resolution that there’s been no reason to build higher-res controllers. There should be plenty of uses, but in many ways it’s up to the industry to catch up and make something based on that standard.”
MIDI 2.0 is fully backward-compatible with MIDI 1.0
A determining factor of MIDI 2.0’s potential success is whether a device that supports it can still communicate with older, MIDI 1.0 devices. After all, the entire music-making ecosystem, as it stands, is made up only of those older controllers and instruments.
The solution for MIDI 2.0’s backward compatibility - and for its forward trajectory, too - is known as MIDI CI: Capability Inquiry.
“The big change in MIDI 2.0 is that it has to be a bi-directional communication. Before, you could have a synth that just sent out notes - it had no idea if anyone was at the other end or what was going to happen to it. The first thing that will happen with MIDI 2.0 is that two devices will align and negotiate with each other and work out what they’re going to do. If nothing’s sent back, it just goes back to life as a MIDI 1.0 device.”
MIDI CI isn’t just a catch-all for backward compatibility; it’ll also keep compatibility moving forward as manufacturers release a whole range of new instruments - whatever they may be…
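To make that fallback concrete, here’s a rough C++ sketch of the flow Brett describes. Every name in it is invented for illustration - MIDI-CI defines the real Discovery exchange as a SysEx conversation - but the shape is the point: ask, wait briefly, and if nothing comes back, carry on as MIDI 1.0.

```cpp
#include <chrono>
#include <optional>

// All names here are placeholders, not a real API.
enum class ProtocolMode { Midi1, Midi2 };

struct CiReply { /* negotiated capabilities would live here */ };

void sendCiDiscovery() { /* send the MIDI-CI Discovery message */ }

std::optional<CiReply> awaitCiReply(std::chrono::milliseconds /*timeout*/)
{
    return std::nullopt; // stand-in: nothing answered
}

ProtocolMode negotiate()
{
    sendCiDiscovery();
    if (awaitCiReply(std::chrono::milliseconds{100})) {
        // Both ends speak MIDI-CI: agree protocols, profiles and so on.
        return ProtocolMode::Midi2;
    }
    // No reply - treat the far end as an ordinary MIDI 1.0 device.
    return ProtocolMode::Midi1;
}
```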
MIDI 2.0 devices will talk more to each other
“I got into this biz a long time ago; when you bought a new piece of gear you’d have to use a tiny little screwdriver to set jumpers,” says Brett. “Then there was the jump to USB, where you plug your adapter in and it goes, ‘I know who you are, I know what you can do, let’s go.’ And MIDI 2.0 will give us exactly that kind of negotiation and auto configuration.”
Property Exchange
Building off MIDI CI, Property Exchange lets one device ask another what parameters it has available to access. Parameter lists, controller mappings, synthesis parameters and information about presets are a few of the details that can be communicated - and communicated using meaningful names rather than abstract, generic numbers.
“Anyone who’s used automation on a synth plugin inside a DAW already understands the benefits of being able to see what’s on the other end of the synth. With a hardware synth over MIDI 1.0 it might be ‘Program 32’ or ‘Controller 9’, and you have no idea what that is. It’s much better as a human being to know a name for what you’re working with.”
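Property Exchange carries its data as JSON, so a reply might look something like the invented example below. The field names and structure are purely illustrative, not taken from the spec - the point is that a DAW receives names it can show, not bare numbers.

```cpp
// An imagined Property Exchange reply, embedded as a C++ string literal.
// None of these field names come from the spec; they just show the kind
// of human-readable detail a synth could report about itself.
const char* kExamplePropertyReply = R"json(
{
  "parameters": [
    { "name": "Filter Cutoff", "controller": 74 },
    { "name": "Reverb Send",   "controller": 91 }
  ],
  "currentProgram": { "index": 32, "name": "Warm Pad" }
}
)json";
```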
Profile Configuration
As another result of Capability Inquiry, the world of MIDI controllers - including any non-standard devices like wind controllers or MIDI guitars - will be able to integrate with music setups much more easily.
“The example that’s frequently given here is that there are dozens of drawbar organ controllers on the market and way more synths that respond to those. No two of them use the same MIDI controller numbers for the drawbars. With Profiles, the industry can come together and say, ‘OK, these are the block of controllers that we are going to use for Hammond B3 style drawbars’, and as soon as two MIDI 2.0 devices agree that they both support that profile, they auto-configure, and there’s no more that needs to be done.”
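In code, that agreement could be as simple as checking two lists of supported profile IDs. The sketch below uses the 5-byte ID size from the spec, but the matching helper and the scenario around it are entirely made up.

```cpp
#include <algorithm>
#include <array>
#include <cstdint>
#include <vector>

using ProfileId = std::array<std::uint8_t, 5>; // profiles carry 5-byte IDs

// Hypothetical: if both devices list the same profile, switch it on.
bool enableSharedProfile(const std::vector<ProfileId>& controller,
                         const std::vector<ProfileId>& synth,
                         const ProfileId& wanted)
{
    auto supports = [&wanted](const std::vector<ProfileId>& list) {
        return std::find(list.begin(), list.end(), wanted) != list.end();
    };
    if (supports(controller) && supports(synth)) {
        // A real implementation would now send Set Profile On - after
        // which both ends agree what every drawbar controller means.
        return true;
    }
    return false;
}
```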
MIDI 2.0 meets VST 3
One presentation at this year’s ADC saw Steinberg outlining how VST 3 instruments will fit with MIDI 2.0. Long story short: the VST 3 spec is more than ready to deal with the expanded resolution of MIDI 2.0, having effectively implemented higher resolution already. It’ll be up to DAW developers to properly implement the MIDI 2.0 spec, but VST 3 plugins are ready to deal with everything that’s coming.
16 channels become 256
“One of the cool new things considered for MIDI 2.0 is that 16 channels is not enough for anybody. So every message begins with a message type and a channel group - and there are 16 channel groups. So now we’ve got 16 virtual cables with 16 channels each.”
“This also lets you get a MIDI 1.0 data message onto one channel, so if you have a very simple device that doesn’t need higher resolution data, it can run on one of those 256 channels, alongside other 1.0 and 2.0 devices, with a single physical link.”
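The arithmetic lives in the first 32-bit word of every new-style packet: a 4-bit group alongside the familiar 4-bit channel, giving 16 × 16 = 256 addressable channels on a single link. A small C++ sketch of the published field positions:

```cpp
#include <cstdint>

struct Address { uint8_t group; uint8_t channel; };

// Pull the group and channel out of the first word of a channel voice
// packet. Bits 31-28 hold the message type; these fields sit under it.
Address decodeAddress(uint32_t word0)
{
    Address a;
    a.group   = (word0 >> 24) & 0x0F; // bits 27-24: channel group (0-15)
    a.channel = (word0 >> 16) & 0x0F; // bits 19-16: channel within group
    return a;
}
```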
All hail the Universal MIDI Packet
Anyone who’s ever got into the weeds and learnt how MIDI messages are formed will be familiar with how Note On and Note Off messages have worked until now - three simple bytes per event for most common functions. But as things get more complicated, we’ll have to wave goodbye to our simple .mid files as MIDI 2.0 replaces these messages (although there’ll be backward compatibility, of course).
“The thing that’s in the protocol spec is this idea of what they’re calling Universal MIDI Packet, and there are different kinds of things that can be carried in a universal MIDI packet. And one of those things is that you can just stick a MIDI 1.0 message in and it will be routed under the new MIDI 2.0 rules.
“It was a byte stream, so a MIDI 1.0 message would have been two to three bytes long, depending on context, whereas the new Universal MIDI Packet is always multiples of 32-bit - the simplest message under the UMP is four bytes long.
“A timestamp message is always 32 bits; a MIDI 1.0 message inside of the universal protocol is always 32 bits long; all of the new channel voice messages - what you and I would think of as MIDI data - are all 64 bits, two 32-bit words. And then there are 96- and 128-bit messages as well. For instance, SysEx data is going to be sent out 128 bits at a time.
“There is going to be a new standard MIDI file format. I don’t believe work on that has actually started yet, but because the messages are so different, we’re going to need a new file format.”
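Because a packet’s size follows directly from its 4-bit message type, a parser can walk a MIDI 2.0 stream without understanding every message it meets. A minimal C++ sketch based on the sizes Brett lists, plus the spec’s size table for the remaining (reserved) types:

```cpp
#include <cstdint>

// Size in bits of a Universal MIDI Packet, read from its top nibble.
int packetSizeBits(uint32_t word0)
{
    switch (word0 >> 28) {                       // 4-bit message type
        case 0x0:                                // utility (JR timestamps etc.)
        case 0x1:                                // system common / real time
        case 0x2: return 32;                     // MIDI 1.0 message in UMP
        case 0x3:                                // 7-bit data / SysEx
        case 0x4: return 64;                     // MIDI 2.0 channel voice
        case 0x5: return 128;                    // 128-bit data chunks
        case 0x6: case 0x7: return 32;           // reserved types also have
        case 0x8: case 0x9: case 0xA: return 64; //   fixed sizes, so streams
        case 0xB: case 0xC: return 96;           //   stay parseable
        default:  return 128;                    // 0xD-0xF
    }
}
```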
Tighter timing
“One of the new MIDI 2.0 message types is what they call JR timestamps, which stands for Jitter Reduction. There’s going to be an option to prepend one of these timestamp messages to every outgoing MIDI 2.0 message that a device sends. That allows the sender’s time to be encoded in every message.
“If something gets jumbled and delayed in transmission, the receiver still knows what time it was supposed to arrive rather than just knowing the time it actually arrived.”
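In packet terms, a JR timestamp is one more 32-bit utility message carrying a 16-bit sender-clock value, sent immediately before the message it stamps. A hedged sketch of the encoding, following the published layout with the group/reserved bits left at zero:

```cpp
#include <cstdint>

// Build a JR Timestamp packet: message type 0 (utility), status 0x2,
// and a 16-bit slice of the sender's clock in the low bits.
uint32_t makeJrTimestamp(uint16_t senderTicks)
{
    return (0x0u << 28)   // message type: utility
         | (0x2u << 20)   // status: JR Timestamp
         | senderTicks;   // sender clock time (16 bits)
}
```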
Beyond MPE
Recently, the MIDI Manufacturers Association officially adopted MPE (MIDI Polyphonic Expression) into the specification, meaning that controllers like the Linnstrument, the Haken Continuum and Expressive E’s Osmose can communicate easily with instruments designed to be played with polyphonic pitchbend. MIDI 2.0 brings support for MPE, too, but it could take things even further.
“The exciting thing for me as a composer and musician is the built-in support for per-note events. MPE was awesome, but I think this goes beyond that because it gives you that same extreme polyphony and control but with higher-resolution data messages. I think there’s an opportunity for inventors to come up with new types of controllers that didn’t make sense before.”
The implications: not only can you send messages to perform polyphonic pitchbends, varying the pitch of individual notes separately, but you can send polyphonic CC data per note as well.
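At the packet level, ‘per-note’ looks like the hedged C++ sketch below: a MIDI 2.0 per-note pitch bend is a 64-bit message in which a whole 32-bit bend value is addressed to one sounding note.

```cpp
#include <array>
#include <cstdint>

// Per-note pitch bend: opcode 0x6 within the MIDI 2.0 channel voice
// message type. Two 32-bit words; the second is the bend value itself.
std::array<uint32_t, 2> perNotePitchBend(uint8_t group, uint8_t channel,
                                         uint8_t note, uint32_t bend)
{
    uint32_t word0 = (0x4u << 28)                     // mt 4: MIDI 2.0 voice
                   | (uint32_t{group}   & 0xF) << 24
                   | (0x6u << 20)                     // opcode: per-note bend
                   | (uint32_t{channel} & 0xF) << 16
                   | (uint32_t{note}    & 0x7F) << 8; // which note to bend
    return {word0, bend};
}
```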
No more keyswitching, easier xenharmonic music
“For microtonal composers, you can carry detailed pitch information in every individual note event. When you did microtonal work before, you’d have to set up individual scales within a synth - often using Scala tuning files. Now you can have a Note On event each time you hit middle C that actually sends out a different microtonal pitch - and that’s carried directly in the datastream, it’s not a configuration thing.
“Note On and Note Off events now have a new data field that’s called an attribute, and one of the predefined attributes is pitch. But you can use it for other things.
“A manufacturer can decide to create their own attribute language, so for instance, symphonic sample libraries use the bottom octave of your keyboard for keyswitches that control articulation - but if you use articulation as the Attribute, you could have a higher-res controller where the very front of your key is pizzicato, and as you move towards the back of the key you change the articulation… or any other attribute of the sound.
“You can do all kinds of things with it; we’re just waiting for people to come up with things to do with it.”
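To ground the attribute idea in actual bits, here’s a final sketch: a MIDI 2.0 Note On using attribute type 3 (‘Pitch 7.9’), which packs a note number plus a 9-bit fraction - steps of 1/512th of a semitone - into the second word. The helper name and example values are ours.

```cpp
#include <array>
#include <cstdint>

// MIDI 2.0 Note On (64 bits) with the pitch attribute: the key number
// says which key was struck, while the attribute says what pitch sounds.
std::array<uint32_t, 2> noteOnWithPitch(uint8_t group, uint8_t channel,
                                        uint8_t key, uint16_t velocity,
                                        double semitones)
{
    const uint32_t attrType = 0x03;                   // attribute: Pitch 7.9
    const uint16_t pitch79  =                         // 7.9 fixed point
        static_cast<uint16_t>(semitones * 512.0);

    uint32_t word0 = (0x4u << 28)                     // mt 4: MIDI 2.0 voice
                   | (uint32_t{group}   & 0xF) << 24
                   | (0x9u << 20)                     // opcode: Note On
                   | (uint32_t{channel} & 0xF) << 16
                   | (uint32_t{key}     & 0x7F) << 8
                   | attrType;
    uint32_t word1 = (uint32_t{velocity} << 16)       // 16-bit velocity
                   | pitch79;                         // attribute data
    return {word0, word1};
}

// e.g. noteOnWithPitch(0, 0, 60, 0xFFFF, 60.5): the middle C key,
// sounding a quarter-tone sharp.
```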