VST3 and MIDI CC pitfall

Hi,

I'm writing this message to share my impressions and my understanding of how VST3 and MIDI CC have been working over the last few years.

Why does VST3 not provide MIDI CC input and output support?

  • VST3 wants the host to resolve the conflict between concurrent automation and a MIDI message affecting the value of the same parameter.
  • Maybe Steinberg believes that MIDI CC mapping inside the plugin is a legacy feature.
  • VST3 already has its own “MPE”.

How does it work?

  • IMidiMapping lets the plugin map exactly one MIDI CC to one parameter, exclusively.
  • IMidiLearn lets the plugin know that a MIDI CC happened, so that the plugin can implement MIDI learn. (Both are sketched below.)
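
A minimal sketch of both hooks, assuming a hypothetical controller class MyController, a hypothetical parameter ID kVibratoDepthParamId, and a hypothetical lastTouchedCC member (error handling and class declaration omitted):

```cpp
#include "pluginterfaces/vst/ivstmidicontrollers.h"
#include "pluginterfaces/vst/ivstmidilearn.h"

using namespace Steinberg;
using namespace Steinberg::Vst;

// IMidiMapping: the host asks which single parameter a given CC maps to.
tresult PLUGIN_API MyController::getMidiControllerAssignment (int32 busIndex,
    int16 channel, CtrlNumber midiControllerNumber, ParamID& id /*out*/)
{
    // Map CC#1 (mod wheel) on any channel to one parameter - and only one.
    if (busIndex == 0 && midiControllerNumber == kCtrlModWheel)
    {
        id = kVibratoDepthParamId; // hypothetical parameter ID
        return kResultTrue;
    }
    return kResultFalse; // no mapping for this controller
}

// IMidiLearn: the host tells us a CC arrived, so a MIDI-learn UI can react.
tresult PLUGIN_API MyController::onLiveMIDIControllerInput (int32 busIndex,
    int16 channel, CtrlNumber midiCC)
{
    lastTouchedCC = midiCC; // hypothetical member read by a MIDI-learn dialog
    return kResultTrue;
}
```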

What are the problems?

  • It is impossible to map one MIDI CC to multiple parameters.
  • No NRPN support (neither 7-bit nor 14-bit)
  • No 14-bit CC support
  • It is impossible to define a modulation range, curve, or smoothing.
  • To work around the above issues, the plugin has to create “proxy” parameters: virtual (non-automatable) parameters created specially to receive one MIDI CC each, whose values are then dispatched according to the plugin's own rules to other parameters. In that case the plugin is responsible for dealing with conflicts between MIDI CC and automation, which defeats the goal of letting the DAW deal with the conflict.
  • Today we can see many plugins creating 128 * 16 proxy parameters just to receive MIDI CC (see the sketch after this list). This is a waste of resources and introduces unnecessary complexity; it looks like an “anti-pattern”. What happens if, with a future MIDI 3 allowing up to 1024 channels and 1024 CCs, the plugin creates a million parameters? Does the host have to call IMidiMapping::getMidiControllerAssignment() a million times and cache the result for each plugin instance? Because IMidiMapping belongs to the IEditController and should be called from the main thread, right?
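
To make the anti-pattern concrete, here is a minimal sketch of what such plugins do in their edit controller's initialization, assuming a hypothetical kProxyParamStart base ID:

```cpp
// Inside a hypothetical MyController::initialize():
// one hidden, non-automatable "proxy" parameter per (channel, CC) pair,
// i.e. 16 * 128 = 2048 parameters that exist only to receive MIDI CC.
for (int16 channel = 0; channel < 16; ++channel)
{
    for (int16 cc = 0; cc < 128; ++cc)
    {
        ParamID id = kProxyParamStart + channel * 128 + cc; // hypothetical base
        parameters.addParameter (STR16 ("MIDI CC Proxy"), nullptr, 0, 0.,
                                 ParameterInfo::kIsHidden, id);
    }
}
// getMidiControllerAssignment() then hands out these IDs, and the processor
// must re-dispatch their value changes to the real parameters by hand.
```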

Conclusion

To me it is unclear whether VST3, by design and specification, does not want anything related to MIDI CC because it should be done entirely by the DAW and is pure legacy. In that case I'd suggest removing IMidiMapping and IMidiLearn to make it clear that the plugin won't ever get MIDI CC, and to not give the plugin the opportunity to cheat the design. That would also force plugins to use VST3's note expression instead of catching MIDI CC to implement MPE.

Or, VST3 by design wants the plugin to work with MIDI CC. The current solution of having 16 * 128 proxy parameters is effectively the same as having full, raw MIDI CC input. In that case I would introduce a ControlEvent (which need not be bound to MIDI CC limits), and I would finally deprecate IMidiMapping and IMidiLearn because they would become unnecessary.
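
A sketch of what such a ControlEvent could look like - this type does not exist in the VST3 SDK; it is only the proposal above made concrete:

```cpp
// Hypothetical event, delivered sample-accurately in the same input event
// list as note events, instead of being smuggled through proxy parameters.
struct ControlEvent
{
    int32 sampleOffset;  // position in the current processing block
    int32 channel;       // not limited to 16 channels
    int32 controlId;     // not limited to 128 CCs; could carry NRPN/14-bit IDs
    double value;        // full floating-point resolution, normalized 0..1
};
```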

Please remember that everything I said is according to my experience and understanding of VST3. If I missed something important, please let me know.

Regards,
Alexandre

Yes, please give us back standard MIDI support instead of hacks! Having more than 2000 parameters just to support MIDI CC and Program Change messages is ridiculous. And it slows everything down for both hosts and plug-ins!

3 Likes

Agreed, same issue for us; declaring thousands of proxy parameters is not a clean solution from our point of view.

Hi,
from my personal point of view, a plug-in should not handle MIDI CC at all. It is the responsibility of the host to map input events to parameter changes. The host knows what external hardware is connected and which routings are active; a plug-in does not have this knowledge.
As a bonus, the plug-in does not need to make any changes to support the forthcoming MIDI 2.0 spec.

Cheers,
Arne

Unfortunately, it looks like many of our common customers do not share this point of view!

Also, MIDI CC can be used for many things (not only parameter changes), so this is definitely an artificial limitation. VST3 is the only plug-in format that does not support MIDI natively.

3 Likes

What are these many things? Are you sure that these things are not workarounds to do something an audio plug-in is not supposed to do?

To me the main issue is not whether it supports MIDI CC or not; it is the “in between” situation, where it is neither good at supporting MIDI CC nor good at not supporting it. I'd be fine with or without MIDI CC.

I am not sure who can define what a plug-in is or is not supposed to do…

MIDI CC messages are just MIDI messages like any others, so you may want to convert them to other types of messages (for example, CC to Program Change, for controllers that cannot send Program Change), pass them to other plug-ins hosted inside a plug-in, etc.

As you may already know, we have the same problem with Program Change messages. The “program” function provided by VST3 is buggy and incompatible with what program change usually does (mainly in Steinberg hosts, by the way), which makes it completely useless, so people expect to have simple MIDI support.

Musicians are used to dealing with MIDI in general, so there is no reason to remove it (do you plan to remove raw audio data support as well in the future?). How do you explain to customers that they can create MIDI tracks to store MIDI events but cannot process / receive them properly in plug-ins?

2 Likes

Musicians struggle all the time to handle MIDI, especially young ones. Even I struggle sometimes, and I have been speaking MIDI for 30 years.
MIDI-CC is a black-box system that no one can understand without looking into the manual of the product, or of all the products you work with. That was OK back when you only had two or three synthesizers. Today, in a world where you have hundreds of plug-ins on your machine, it is not workable if all these plug-ins have different MIDI-CC handling. Even the MIDI Association is trying to move away from this black box with “MIDI-CI” and “Property Exchange”, because MIDI-CC in its current form is a bad user experience.

Hi Arne!

What I personally think is that MIDI provided two things which are quite valuable:

  1. an ontology of semantic parameter meanings
  2. automatically-linked output parameters

We don’t inherently need MIDI for these, but we are currently falling back to MIDI because there isn’t an alternative.

Semantic ontology

Sure, it’s hacky and defined by convention, and most MIDI controllers lack consistent and widely-respected definitions.

But when supported, things like pitch-bend or sustain-pedal are universal across synths, in terms of their semantic meanings. DAWs and input devices know to make these available, sometimes linked by default to a particular interface (e.g. foot-pedal, or a wheel which snaps back to a neutral position).

VST3 taps into this by using IMidiMapping - but I think that we should acknowledge what this is: VST3 using MIDI as an ontology for parameter meanings, enabling implicit zero-configuration setup for inputs, and hints for interfaces.

Automatically linked outputs

The other thing that MIDI lets us do (which I’m currently struggling with in VST3) is it lets us semantically describe our outputs, so that they are automatically linked up to corresponding inputs on any following effects.

Yes, any decent DAW lets us link parameters between effects, but we have to do it explicitly. MIDI lets us implicitly do some of this routing, using controller numbers (or other MIDI events like pitch-bend or aftertouch).

For example, let’s say I want to write an “explicit vibrato” plugin: for a monophonic instrument, it adds a particular vibrato pattern to each note using pitch-bend. This should be able to work transparently with any synth - and it really should be possible to have that output routed by default to the synth input parameter tagged as “pitch-bend”. Without that, we’re forced to either output legacy MIDI messages, or ask the user to wire us up appropriately.

So, those are the gaps MIDI is currently filling for me.


One option: tag outputs in the same way as inputs

Point (1) above is already covered, because we use IMidiMapping to semantically tag our inputs.

I think the simplest solution would be: an equivalent of IMidiMapping::getMidiControllerAssignment(), but for output parameters.

That is: we continue to use MIDI controllers as the semantic ontology for our inputs. DAWs would use these semantic tags to automatically link outputs to inputs (by default - the user can always un-link them), and they could do it with full floating-point resolution and sample-accurate timing.
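
As a sketch of what I mean - this interface does not exist in the SDK; the name and shape below are purely hypothetical:

```cpp
// Hypothetical mirror of IMidiMapping for outputs: the plugin tags which of
// its output parameters carries the semantics of a given MIDI controller,
// so the host can auto-link it to matching inputs downstream.
class IMidiOutputMapping : public Steinberg::FUnknown
{
public:
    virtual Steinberg::tresult PLUGIN_API getMidiControllerOutputAssignment (
        Steinberg::int32 busIndex, Steinberg::int16 channel,
        Steinberg::Vst::CtrlNumber midiControllerNumber,
        Steinberg::Vst::ParamID& id /*out*/) = 0;

    static const Steinberg::FUID iid; // would need its own interface ID
};
```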

For now, I’m looking at LegacyMIDICCOutEvent, but it does indeed feel like a legacy solution.

End user chiming in here.

I’m doing some complicated routing work with effects layers and transitions, and deep parameter programming as well as integration into video art, I’m in the midst of exploring this and what Cubase is capable of, if I need other software for certain projects, etc, etc.

So IMO, Cubase does not have clear multi-parameter control and is sort of stuck in this weird middle ground of “to Automate or to MIDI? How do I control these 10 different things inserted on these seven separate tracks with this one thing on my desk? Ouch, my brain hurts.”

It’s weird, like, I pretty much always have to use MIDI, but MIDI isn’t fully supported… or it is sometimes, but not always.

I’ve discovered a few different ways to accomplish this and overcome these limitations or say, intuitive lacking , and I’m pretty sure %80 of Cubase users don’t know about this.

I will explain.

So, this is the context of wanting to control multiple parameters across multiple tracks, either with one control or with multiple controls. And let's say we are just talking about VST insert plugins. If they don't support MIDI, this pretty much leaves only one option, and this is what I'm pretty sure 80% of Cubase users don't know about, maybe even more like 98%…
Any Quick Control can be routed to any track or normally controllable parameter in the program. The secret is to Shift+click on a Quick Control, which brings up an alternate menu that lists every controllable parameter in the entire project.

Here is an example.

Create 9 new audio tracks.

Go to the 9th track's Quick Controls and Shift+click in the parameter selector space.

Go to the VST Mixer.

Here you will find the 8 other audio tracks and all their parameters.

On Audio Track 9, assign QC1 to the Audio 1 fader, QC2 to the Audio 2 fader, etc.

You can now control all 8 faders from one audio track's Quick Controls… and those Quick Controls, via ‘Studio Setup’ > ‘Track Quick Controls’, could be receiving MIDI from, say, a keyboard controller that has 8 faders.


Now, if you want to control all 8 Quick Controls from one knob or fader on your keyboard controller, you can go to the ‘Studio Setup’ window and, in the Track Quick Controls section, create Quick Control CC assignment templates (I've created 10 or so, kept in a folder on my desktop). You could create a template where all 8 Track Quick Controls are assigned to the same MIDI CC. Now all 8 audio faders respond to the single fader on your controller.


There’s another way to do this pending the specifications of what you are trying to control.

This will only work for plugins that have MIDI support

If, say, you wanted one controller knob on your keyboard to control multiple parameters on multiple plugins on one audio track AND other plugins' parameters on other audio (or group, or FX, etc.) tracks, you can do this:


Say there are 8 parameters total you want to control. 4 parameters are on 3 different plugins on ONE track - Audio Track 1 - and the other 4 are on separate FX tracks:

Audio Track 1
–Ins1 - FilterPluginA
----Frequency Cutoff
----Resonance
–Ins2 - ChorusPluginA
----Depth
–Ins3 - DelayPluginA
----Feedback

FX Track 1
–parameter
FX Track 2
–parameter
FX Track 3
–parameter
FX Track 4
–parameter

Create 8 MIDI tracks and set all their inputs to the same controller - the controller that has the one knob or fader you will be using. Set the outputs to the specific parameters listed above so that each MIDI track outputs to one parameter.

Select all the MIDI tracks, arm/monitor them, and voilà - it's that simple. Too bad more VST developers don't support MIDI… there should be a marketing push to developers to get them to support MIDI in all their plugins.

So that’s sort of been my workflow to accomplish. Using internal virtual MIDI software like LoopBE30 can help with routing solutions as well… Really, a tool that should be standard in all DAWs.

Either way, it’s a bit of a convoluted process in my opinion just to control multiple things. I wish VST developers were maybe encouraged more to include MIDI in their plugins (Like GRM Tools does), it seems there’s mostly only MIDI support for VSTis.

Not sure if this has to do with what you guys are talking about; hope it helps.

End user and developer here. Steinberg is EXTREMELY short-sighted to remove MIDI CC from VST3. There are many uses for these other MIDI messages besides the limited cases that have been outlined here. They are useful for various reasons, such as articulation management in numerous DAWs and instruments, and many other purposes. It is fine and good that CCs are mapped directly to parameters for those cases where people are using sliders to control parameters - that is the simple, obvious case - but this message, as well as Program Change, Pitch Bend, and Aftertouch, is used for all kinds of purposes to trigger things to happen in both plugins and external devices. It should very well be possible to allow these other kinds of MIDI events to appear in the MIDI buffer of the plugin callback. The fact that Steinberg has removed it is absurd.

I am trying to develop a MIDI plugin with VST3 and it's causing so many headaches because of this.

Also, the ordering of MIDI events has been serialized for decades, so that a sender could send a series of MIDI events in a certain order and be sure the consumer at the other end would receive those events in the same order. Even with the parameter hacks that people are using to get around VST3's severe deficiency in this regard, that hack loses the ordering of events that share the same timestamp. For example, if the event list has a sequence of cc-note-cc-note-cc-note, all on the same timestamp, what shows up inside the VST3 plugin will not have this ordering intact, because the CCs will have been abstracted out of that serial stream into parameters - and that inherent serialized ordering is lost!! Many people have been working with MIDI for decades and relying on that.
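
To make the mechanics concrete, here is a minimal sketch of a processor's process() callback showing the two separate containers involved (hypothetical class name, iteration bodies elided):

```cpp
tresult PLUGIN_API MyProcessor::process (ProcessData& data)
{
    // Stream 1: note events arrive here, in order, with sample offsets.
    if (IEventList* events = data.inputEvents)
    {
        // iterate events: kNoteOnEvent, kNoteOffEvent, ...
    }

    // Stream 2: CCs mapped via IMidiMapping arrive here instead, as one
    // value queue per parameter - not as a merged, ordered event stream.
    if (IParameterChanges* changes = data.inputParameterChanges)
    {
        // iterate per-parameter queues of (sampleOffset, value) points
    }

    // A source sequence cc-note-cc-note at one timestamp is now split across
    // the two containers, so the original interleaving cannot be rebuilt.
    return kResultOk;
}
```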

With articulation management in numerous hosts, including Cubase, it can be beneficial to have a VST plugin assist in ways where Expression Maps fall short, for example. But here again is a situation VST3 cannot handle, because CCs are stripped out of the stream - and even if we hack them back in, the original order is lost. VST3 has cut off many creative uses of MIDI messages from even being possible. This is particularly problematic for articulation management.

1 Like

Why not describe the use cases you want to implement and are struggling with? The one use case I can decipher from your rant is talking to external devices. In that case the answer is: talk to the device directly. Then you can be sure that your messages are sent to the correct device, not filtered or changed in any way by the host.
For articulation: note expression and key switch handling are features natively available in VST3.

Cheers,
Arne

Key switching of sample-based instruments, as of 2020, requires the complete MIDI protocol in serial order. That is just one case. MIDI users have been using this protocol for decades in myriad ways beyond simply adjusting parameters that affect notes. That includes within plugin signal chains, not just externally.

Steinberg's view on the matter is keeping users from what they have been able to do for decades. At this time VST3 cannot be used for MIDI plugin development in some cases because of this limiting point of view about the role of CC messages. For the time being, those people with the old license can continue to provide VST2 solutions, but new developers will be unable to provide certain MIDI plugin solutions due to Steinberg's narrow point of view about what each type of MIDI message can and should be used for.

The main problem is that the VST3 abstraction, though elegant from a computer science point of view, destroys the serial order that has existed for decades. That order held externally from the DAW - in the virtual MIDI device buffer and in the MIDI cable itself - but also in the VST2/AU MIDI buffer, where it was preserved and all MIDI events were delivered in the exact same order from the host's event list to the plugin's callback, and from one plugin to the next in the chain when routed that way, always with that order maintained and every single message delivered as sent.

VST3 abstracts away all but the notes themselves and obliterates the possibility for MIDI users to count on all messages making it to their favorite plugin in the designated order.

Keyswitching of sample instruments with CCs is one place where this has broken functionality entirely.

1 Like

In order to get around these VST3 limitations, some hosts are hacking around it to get MIDI into the plugin. Not all hosts are. Even Cubase is broken: try to create a chord in Cubase on the same timestamp with a different attribute Expression Map articulation attached to each note of the chord, then monitor the output. You will see that only one of the Expression Map articulations wins, and all notes of the chord are treated as if a single Expression Map articulation were in effect for the whole chord. Steinberg could hack around the VST3 limitation too if they wanted, but I doubt that code change would be approved 😉

It's also entirely possible that if you have a sample instrument that requires, say, two CC messages from the same controller number as two switches in a row ahead of the note to switch articulations, this can again be broken if the host decides to only keep the last CC as the parameter value provided to the plugin - the two-CC sequence is lost. It turns out that Cubase makes sure to pass all CC events through IMidiMapping, so the plugin can get them in order, ahead of the note. But it's not guaranteed that all hosts will do that.

I have run tests with Cubase, constructing an event list with cc-note-cc-note-cc-note on the same timestamp. Cubase always sends those events in that order to an external device. However, when they are sent to a VST3 plugin, the order gets scrambled: the interleaving of CCs and notes in the intended order is lost and impossible to reconstruct inside the plugin.

1 Like

Hi,
your frustration comes from the misunderstanding that you can build MIDI plug-ins with VST. VST describes an audio plugin API. That you could misuse version 2 for building MIDI plug-ins was not intended. You would need a MIDI plug-in API, which does not exist across hosts. And for key switching support in VST3, see: https://steinbergmedia.github.io/vst3_doc/vstinterfaces/keyswitch.html
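
As a minimal sketch of that keyswitch route, assuming a hypothetical controller class MyController: the edit controller advertises its keyswitches via IKeyswitchController, and the host can then present and remap them.

```cpp
#include "pluginterfaces/base/ustring.h"
#include "pluginterfaces/vst/ivstnoteexpression.h"

using namespace Steinberg;
using namespace Steinberg::Vst;

// Advertise a single note-on keyswitch (hypothetical plugin class).
int32 PLUGIN_API MyController::getKeyswitchCount (int32 busIndex, int16 channel)
{
    return 1;
}

tresult PLUGIN_API MyController::getKeyswitchInfo (int32 busIndex, int16 channel,
    int32 keySwitchIndex, KeyswitchInfo& info /*out*/)
{
    if (keySwitchIndex != 0)
        return kResultFalse;
    info = {};
    info.typeId = kNoteOnKeyswitchTypeID;       // triggered by a note-on
    UString128 ("Staccato").copyTo (info.title, 128);
    info.keyswitchMin = info.keyswitchMax = 24; // C1 triggers it
    return kResultTrue;
}
```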

Cheers,
Arne

1 Like

Is there a MIDI-related API? I wasn't talking only about MIDI-only plugins. This problem will always exist as long as you have the possibility of using CC switching of instruments. VST3 audio instruments receive MIDI to drive the instrument; that is how things work in the DAW world in 2020. Poly-articulation chords are an impossibility with VST3 plugin instruments today, in the year 2020. Perhaps someday in the future there will be a fully capable API that provides these capabilities, with all DAWs fully using it; but today, in 2020, users need the MIDI to go through to the instrument in the proper order, which in VST3 it does not.

2 Likes

For poly articulation, see https://steinbergmedia.github.io/vst3_doc/vstinterfaces/noteExpression.html.
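
(The idea there, sketched minimally: each note expression event targets a noteId rather than a whole channel, so every note of a chord can carry its own value. The noteId variable below is hypothetical; the event type and fields are real SDK API.)

```cpp
// One per-note expression event: tied to a single note, not a channel-wide CC.
Event e = {};
e.type = Event::kNoteExpressionValueEvent;
e.sampleOffset = 0;
e.noteExpressionValue.typeId = kTuningTypeID; // built-in per-note tuning
e.noteExpressionValue.noteId = someNoteId;    // the one chord note to affect
e.noteExpressionValue.value = 0.75;           // normalized value
```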

I guess as soon as you force all DAW makers to conform to that approach, then maybe it would work; but today, in 2020, that is not the situation, and musicians need solutions that work today.

3 Likes

I would like to say that it is not the first time that musicians “misunderstand” an instrument and get a sound from it that it was not intended to produce. Would you lock the lid of a grand piano so that its strings cannot be plucked and it therefore stays in tune longer?

Here are three examples of misuse of the VST2 API:

I really struggle to understand the point of “deprecating them”. The fact that Logic accommodates them as AU MIDI Effects indicates that not everybody in the industry is against a standard for MIDI effect plugins.

Cubase and Ableton Live chose not to host external AU MIDI effects, but Reaper and Logic did, and Pro Tools also allowed AAX MIDI plugins.

1 Like