Does rtmidi support midi 2.0? #307
Hello,
For some background, I am a member of the MIDI organization and am working on implementing MIDI 2.0 into products that my company (KMI) makes. We use rtmidi for some of this. The answer to your question is complicated.
Currently rtmidi acts as an OS-agnostic framework to receive and send raw MIDI data to/from MIDI devices. This data is accessed through rtmidi as arrays of raw 8-bit data, and the data is not parsed or formatted by rtmidi. It is up to the implementer to do things with the data, i.e. to send/receive note on/off and system exclusive messages.
MIDI 2.0 introduces a lot of functionality that works using System Exclusive data. Key to this is MIDI-CI (capability inquiry), where devices use bidirectional communication to discover each other, exchange MUIDs (like an ip address), and report functions they support. This then allows use of other MIDI 2.0 methods like property exchange and profiles. You can use rtmidi as the underlying framework for sending and receiving MIDI-CI SysEx, but it has no methods for implementing these MIDI 2.0 features. Today, you have to write those methods yourself. That may not be the case next year, depending on the progress being made by other open source projects and in MacOS and Windows support at the OS level.
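As a rough illustration of what "sending MIDI-CI SysEx through rtmidi" looks like, the sketch below frames a MIDI-CI message (Universal Non-Realtime SysEx, sub-ID#1 0x0D) with 28-bit MUIDs encoded as four 7-bit bytes. The field layout follows the published MIDI-CI specification as I understand it, but the constants and version byte are assumptions to be checked against the spec; the resulting byte vector is exactly what you would pass to RtMidi's `sendMessage`:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Sketch of MIDI-CI message framing (check all constants against the
// MIDI-CI specification before relying on this).
std::vector<unsigned char> makeCiHeader(uint8_t subId2, uint32_t srcMuid, uint32_t dstMuid) {
    std::vector<unsigned char> msg = {
        0xF0, 0x7E,  // SysEx start, Universal Non-Realtime
        0x7F,        // device ID: whole MIDI port
        0x0D,        // sub-ID#1: MIDI-CI
        subId2,      // e.g. 0x70 = Discovery (assumed value)
        0x01         // MIDI-CI message version (assumed value)
    };
    for (int i = 0; i < 4; ++i)  // 28-bit source MUID, 4 x 7-bit, LSB first
        msg.push_back((srcMuid >> (7 * i)) & 0x7F);
    for (int i = 0; i < 4; ++i)  // 28-bit destination MUID (0x0FFFFFFF = broadcast)
        msg.push_back((dstMuid >> (7 * i)) & 0x7F);
    // ...message-specific payload (manufacturer ID, capability bits, etc.)...
    msg.push_back(0xF7);  // SysEx end
    return msg;
}
```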
MIDI 2.0 also introduces a new 32-bit data format, called UMP (Universal MIDI Packet). For RtMidi to support this packet, it would need to implement a method to receive and send the data from the operating system. Currently, MacOS is the only OS that supports this data. Microsoft has said that Windows will support it next year. Android already supports it; however, Android is not currently supported by RtMidi. Linux is supposedly also going to support it soon. However, if you take a look at the MacOS CoreMIDI MIDI 2.0 API, you will notice that things are much, much more complicated than you would assume. There are methods to report MIDI-CI, profiles, groups, and more. There are also yet-to-be-announced routing/metadata/protocol features of MIDI 2.0 that will also (eventually) be supported by all of these operating systems. All of this metadata and routing information will likely need to be handled in tandem with the operating system, and software that tries to ignore or circumvent what the OS is doing or how it does it will likely not behave correctly.
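For a feel of the 32-bit format, here is a sketch of packing a MIDI 2.0 channel-voice note-on into UMP words, following the published UMP layout as I read it (message type 0x4 in the top nibble, two 32-bit words, 16-bit velocity; attribute fields left at zero). Treat the bit positions as something to verify against the UMP specification rather than a reference implementation:

```cpp
#include <cassert>
#include <cstdint>
#include <utility>

// Pack a MIDI 2.0 channel-voice note-on into its two UMP words.
std::pair<uint32_t, uint32_t> umpNoteOn(uint8_t group, uint8_t channel,
                                        uint8_t note, uint16_t velocity) {
    uint32_t word0 = (0x4u << 28)                               // MT 0x4: MIDI 2.0 channel voice
                   | (uint32_t(group & 0x0F) << 24)             // UMP group
                   | (uint32_t(0x90 | (channel & 0x0F)) << 16)  // note-on status + channel
                   | (uint32_t(note & 0x7F) << 8);              // note number; attribute type = 0
    uint32_t word1 = uint32_t(velocity) << 16;                  // 16-bit velocity; attribute data = 0
    return {word0, word1};
}
```

Compared with the three-byte MIDI 1.0 note-on, the same event now carries 16-bit velocity resolution, which is exactly the "higher resolution" payoff mentioned later in this thread.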
A framework like RtMIDI will need to become much more complex to properly support MIDI 2.0. It is likely that every OS will have different methods for the new MIDI 2.0 features and metadata, so having a cross platform framework with standardized methods would be very useful, but this is fundamentally different than what rtmidi does today with MIDI 1.0, where it only provides access to midi ports and raw data.
The midi org is currently developing open source tools to implement these features in embedded hardware/firmware, which are available today to members, and eventually these tools will be released to the public. Today in early 2023, this puts software developers at a crossroads. MacOS supports MIDI 2.0 features and metadata like MUIDs, MIDI-CI, and some basic property exchange features. Windows has none of this, and we have no idea how different the two APIs will be (but there is some coordination happening behind the scenes). So unless you only want to support MacOS, you might want to wait until these details get sorted out.
One project that may be of interest is Andrew Mee's MIDI2_CPP; however, take note: the UMP 32-bit methods were likely only tested over the serial port, as USB MIDI 2.0 drivers are in their infancy.
https://github.com/starfishmod/MIDI2_CPP
Happy to answer more questions on this subject.
Cheers,
Eric Bateman
|
The current API transfers a packeted byte stream. This means that you can send a UMP to RtMidi. It will be handed over to Windows or MacOS unchanged. This should work with the WinMM backend, as it has exclusive access to the MIDI devices. The corresponding feature (raw MIDI device access) is not used with ALSA (Linux). Instead the Linux implementation uses the ALSA sequencer, which tries to keep MIDI messages intact even when several streams are merged. If ALSA mixes UMP and MIDI 1.0 messages it is very likely to confuse the receiving device. Other backends like macOS or JACK also allow merging MIDI streams, so they have the same problem. That's why RtMidi has to wait for the backends to support MIDI 2.0. |
People interested in this subject should check out the MIDI 2.0 specification updates announced in Nov 2022. https://www.midi.org/midi-articles/details-about-midi-2-0-midi-ci-profiles-and-property-exchange There are some serious implications as to how operating systems will be managing MIDI 2.0 devices, specifically with regards to "Endpoints" and "Function Blocks". Currently with USB MIDI 1.0, a device may connect with several input and output ports, and all the operating system does is report these ports and names. With USB MIDI 2.0, there is an "endpoint" with input and output ports, and function blocks that describe the bidirectional nature and uses for these ports. |
Thanks, this is all helpful! I need to do some further learning to understand what to do with this. |
FYI - microsoft just opened up the repository for their new open source Windows MIDI Services. In addition to MIDI 2.0 support and backwards compatibility with MIDI 1.0, there will also be support for virtual midi ports and multi-client access. |
For anyone interested I've started trying to migrate step by step https://github.com/jcelerier/libremidi , my fork (pretty much a rewrite by now) of rtmidi to midi 2, first by providing the ability to midi_in and midi_out to speak UMP for when someone simply wants higher resolution CCs aha ; the next step will be to work on an interface that will model the "endpoint" concept of MIDI 2 ; any input and feedback is very welcome! |
Microsoft, Google, Apple, and the lead maintainer of ALSA have been meeting once or twice a month as midi.org members to develop a whitepaper on guidelines for developers and their MIDI 2.0 APIs. The operating systems are going to be managing some/most of the “endpoints” and reporting metadata within the API. Before you go too far down the road of writing an endpoint interface (or even handling UMP) you should take a look at the current implementations. Microsoft is last to have official support, but just opened up their repos for developers. Andrew Mee from Yamaha also has a library out there, but the public version is way behind the fork that the midi.org is currently using in development (the fork is not public yet because it implements spec updates that haven't been released).
https://github.com/microsoft/MIDI (Windows MIDI Services)
https://github.com/starfishmod/MIDI2_CPP
|
What's the reasoning for a MIDI-2.0 protocol when virtually everyone has moved or is moving to OSC as the replacement for MIDI? |
Hi Andrew,
For the last 3 years I've spent an hour a week with the MIDI organization
working to update the MPE spec and write a new MPE MIDI 2.0 Profile that
was recently released. I'm going to do my best to take your question
seriously and answer it thoroughly.
Three years ago when Ableton announced that Live 11 would support MPE, I
was working with an engineer at Sequential Circuits to test their MPE Beta
firmware with one of our MPE controllers. We were also testing with a beta
version of Live 11, and a problem arose - between our controller, Live 11,
and the Sequential synth, there was no standardized way to automatically
configure and then confirm the number of member channels that each
device/software was using. This caused many problems and unexpected
behavior. At the time there was already a registered controller (#6)
defined in the MPE specification to set the number of member channels, but
there was no defined way for two devices to poll each other to not only set
the parameter, but to also confirm that it was set.
So we (the MIDI organization) formed a committee to create a new MIDI 2.0
MPE Profile that standardized the messaging between two MPE capable
devices. The committee included members actively involved in developing MPE
products (KMI, Moog, GeoShred), and we periodically invited folks from
Ableton, Steinberg, Yamaha, and other companies to get their input and buy
in on what we were doing, with assurances that they would adopt the
standard once it was released.
You can view the specification here:
https://midi.org/midi-polyphonic-expression-mpe-specification-adopted
At no point in this process did OSC enter the discussion. Not one
stakeholder involved brought it up, and to the best of my knowledge none of
the companies I've mentioned ship OSC in the products that we were working
together to improve.
I can tell you that KMI has a product that uses OSC over SysEx. It's there
because a contractor we employed (before my time) really loved OSC, and
preferred working with it. I'm sure that OSC can be a great solution in
certain situations, but I have to tell you, OSC has provided zero benefit
to this product, but there have been a TON of headaches. The calibration
software that relies on the OSC communication is written in Max/MSP, with a
ton of node.js dependencies and o.dot objects all built around making OSC
work over sysex. Last year when our contract manufacturer in Shenzhen
pulled out the calibration laptop for this product to build a large order,
something in the OSC node.js dependencies broke, and I had to drop
everything and spend two weeks untangling things to get it working again.
Now if every product we made used OSC, we'd probably have a more reliable
stack to support it, so again this may not be the fault of the OSC
protocol. But we do have very reliable and well maintained SysEx libraries
that don't use OSC, and the follow-up to this specific product that we are
working on now will not have OSC in it. But it will have MIDI 2.0, and the
editor software will use RtMidi, same as most of our products.
This year at NAMM we saw several new products with MIDI 2.0, including from
Korg and Roland. Steinberg and Apple both support MIDI 2.0, and Microsoft
is releasing their update this year. Our first MIDI 2.0 product will ship
later this year as well.
I hope this answers your question.
Cheers,
Eric Bateman
|
TY for the detailed explanation, Eric. |
Sure thing eric atsymbol keithmcmillen dot com |
Regarding "Microsoft, Google, Apple, and the lead maintainer of ALSA have been meeting once or twice a month as midi.org members to develop a whitepaper on guidelines for developers and their midi 2.0 APIs. The operating systems are going to be managing some/most of the “endpoints” and reporting metadata within the api. Before you go too far down the road of writing an endpoint interface (or even handling UMP) you should take a look at the current implementations. Microsoft is last to have official support, but just opened up their repos for developers. Andrew Mee from Yamaha also has a library out there, but the public version is way behind the fork that the midi.org is currently using in development (the fork is not public yet because it implements spec updates that haven't been released)": my implementation is based on the current APIs provided by macOS and Linux for MIDI 2 (the Windows one is still not final, as we know; I have looked at it a few times), which provide access to UMP just like MIDI 1 APIs provide MIDI 1 bytestreams. I really wouldn't want to go through the "ump-over-midi1-sysex" route when OS APIs provide clean and beautiful APIs that return uint32's. |
UMP also provides better methods for SysEx (8-bit data!). A unified cross-platform library that reports back whether the OS supports MIDI 2.0 is the goal. Translating MIDI 1.0 to UMP is pretty much baked into UMP in any case. |
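The "baked-in" MIDI 1.0 translation mentioned above refers to UMP's dedicated MIDI 1.0 channel-voice message type (0x2), which wraps a legacy three-byte message in a single 32-bit word. A minimal sketch of that packing, with bit positions to be verified against the UMP specification:

```cpp
#include <cassert>
#include <cstdint>

// Wrap a MIDI 1.0 channel-voice message (status + two data bytes)
// in a single UMP word, message type 0x2.
uint32_t midi1ToUmp(uint8_t group, uint8_t status, uint8_t data1, uint8_t data2) {
    return (0x2u << 28)                       // MT 0x2: MIDI 1.0 channel voice in UMP
         | (uint32_t(group & 0x0F) << 24)     // UMP group
         | (uint32_t(status) << 16)           // original status byte, e.g. 0x90
         | (uint32_t(data1 & 0x7F) << 8)      // first data byte
         | uint32_t(data2 & 0x7F);            // second data byte
}
```

Because the legacy bytes are carried verbatim inside the word, an OS (or library) can translate an incoming MIDI 1.0 bytestream into UMP mechanically, which is why later comments suggest the UMP path can eventually subsume the MIDI 1 backends.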
For OS support it's not so simple: from what I can see, on macOS it has to be a compile-time choice.
If anyone knows how to do a run-time check instead I'll gladly implement it, but I'm not sure this is possible as it stands, short of having per-backend .dylibs: dlopen will fail for the ones which are too recent for the OS version. For Linux it's already done: I load all the ALSA symbols through dlopen / dlsym and check for the availability of the UMP ones. If they are not available in the library on the user's system, then the UMP backends simply will not be available. |
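The Linux run-time check described here can be sketched as a generic symbol probe. The helper below is illustrative, not libremidi's actual code, and the symbol name `snd_ump_open` is an assumption about alsa-lib's UMP API; substitute whichever entry points your backend actually needs:

```cpp
#include <dlfcn.h>

// Probe an installed shared library for one symbol at run time.
// Returns false if either the library or the symbol is missing,
// which is the "UMP backend unavailable" case described above.
bool hasSymbol(const char* library, const char* symbol) {
    void* handle = dlopen(library, RTLD_NOW | RTLD_LOCAL);
    if (!handle)
        return false;                              // library not installed at all
    bool found = dlsym(handle, symbol) != nullptr; // symbol missing => too old
    dlclose(handle);
    return found;
}

// Usage (assumed symbol name; enable the UMP backend only if this passes):
//   bool umpAvailable = hasSymbol("libasound.so.2", "snd_ump_open");
```

This keeps a single binary working across alsa-lib versions: old systems silently lose only the UMP backends instead of failing to load the whole library.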
And over time for sure I think it'd make sense to just drop the MIDI 1 back-ends and just focus on UMP, as the OS will translate MIDI 1 -> UMP in all cases and UMP is just so much easier to work with. |
Correct, moving forward the OS will handle embedding MIDI 1.0 device messages into their UMP equivalents, so on our end we can stick with the new methods. Part of the midi org work has been to standardize the translation methods between the two protocols.
|
Since I don't see anything explicit about midi 2.0 I am assuming it is unsupported, but wanted to ask.