
VST3 and MIDI CC pitfall

Posted: Fri Apr 12, 2019 8:54 am
by abique

I am writing this message to share my impressions and understanding of how VST3 and MIDI CC have worked together over the last few years.

Why does VST3 not provide MIDI CC input and output support?
  • VST3 wants the host to resolve conflicts between concurrent automation and a MIDI message affecting the value of the same parameter.
  • Maybe Steinberg believes that MIDI CC mapping inside the plugin is a legacy feature.
  • VST3 already has its own "MPE".
How does it work?
  • IMidiMapping lets the plugin map one MIDI CC exclusively to one parameter.
  • IMidiLearn tells the plugin that a MIDI CC event happened, so that the plugin can implement MIDI learn.
What are the problems?
  • It is impossible to map one MIDI CC to multiple parameters.
  • No NRPN support (neither 7-bit nor 14-bit)
  • No 14-bit CC support
  • It is impossible to define a modulation range, curve, or smoothing.
  • To work around the above issues, the plugin has to create "proxy" parameters: virtual (non-automatable) parameters created specifically to receive one MIDI CC each, whose values the plugin then dispatches to other parameters according to its own rules. In that case the plugin is responsible for resolving conflicts between MIDI CC and automation, which defeats the goal of letting the DAW resolve them.
  • Today we can see many plugins creating 128 * 16 proxy parameters just to receive MIDI CC, which wastes resources and adds unnecessary complexity. It looks like an anti-pattern. What happens if, with MIDI 3, there are up to 1024 channels and 1024 CCs, and the plugin creates a million parameters? Does the host have to call IMidiMapping::getMidiControllerAssignment() a million times and cache the result for each plugin instance? Because IMidiMapping belongs to the IEditController and should be called from the main thread, right?

To me it is unclear whether VST3, by design and specification, wants nothing to do with MIDI CC because it should be handled entirely by the DAW and is pure legacy. If that is the case, I would suggest removing IMidiMapping and IMidiLearn to make it clear that the plugin will never receive MIDI CC, and to remove the opportunity for plugins to cheat the design. That would also force plugins to use VST3's note expression rather than catching MIDI CC to implement MPE.

Or, VST3 by design wants the plugin to work with MIDI CC. In that case, the current solution of 16 * 128 proxy parameters is effectively the same as having full, raw MIDI CC input. As a consequence, I would introduce a ControlEvent (which need not be bound to MIDI CC limits), and I would deprecate IMidiMapping and IMidiLearn because they would be unnecessary.

Please remember that everything I said is based on my experience and understanding of VST3. If I missed something important, please let me know.


Re: VST3 and MIDI CC pitfall

Posted: Wed May 15, 2019 12:52 pm
by bluecataudio
Yes, please give us back standard MIDI support instead of hacks! Having more than 2000 parameters just to support MIDI CC and program change messages is ridiculous, and it slows down everything for both hosts and plug-ins!

Re: VST3 and MIDI CC pitfall

Posted: Thu Jun 13, 2019 2:42 pm
by blegoff
Agreed; we have the same issue. Declaring thousands of proxy parameters is not a clean solution from our point of view.

Re: VST3 and MIDI CC pitfall

Posted: Tue Jun 18, 2019 9:09 am
by Arne Scheffler
From my personal point of view, a plug-in should not handle MIDI-CC at all. It is the responsibility of the host to map input events to parameter changes. The host knows what external hardware is connected and which routings are active; a plug-in has no knowledge of this.
As a bonus, the plug-in does not need to make any changes to support the forthcoming MIDI 2.0 spec.


Re: VST3 and MIDI CC pitfall

Posted: Mon Jun 24, 2019 8:36 am
by bluecataudio
Unfortunately, it looks like many of our common customers do not share this point of view!

Also, MIDI CC can be used for many things (not only parameter changes), so this is definitely an artificial limitation. VST3 is the only plug-in format that does not support MIDI natively.

Re: VST3 and MIDI CC pitfall

Posted: Mon Jun 24, 2019 11:14 am
by Arne Scheffler
bluecataudio wrote:
Mon Jun 24, 2019 8:36 am
Also MIDI CC can be used for many things (not only parameter changes)
What are these many things? Are you sure that these things are not workarounds to do something an audio plug-in is not supposed to do?

Re: VST3 and MIDI CC pitfall

Posted: Tue Jun 25, 2019 9:04 am
by abique
To me the main issue is not whether it supports MIDI CC or not; it is the "in between" situation, where VST3 is neither good at supporting MIDI CC nor good at not supporting it. I'd be fine with or without MIDI CC.

Re: VST3 and MIDI CC pitfall

Posted: Wed Jun 26, 2019 2:56 pm
by bluecataudio
Arne Scheffler wrote:
Mon Jun 24, 2019 11:14 am
What are these many things? Are you sure that these things are not workarounds to do something an audio plug-in is not supposed to do?
I am not sure who can define what a plug-in is or is not supposed to do...

MIDI CC messages are just MIDI messages like any others, so you may want to convert them to other types of messages (for example, CC to Program Change for controllers that cannot send Program Change), pass them to other plug-ins hosted inside a plug-in, etc.

As you may already know, we have the same problem with Program Change messages. The "program" function provided by VST3 is buggy and incompatible with what program change usually does (mainly in Steinberg hosts, by the way), which makes it completely useless, so people expect simple MIDI support instead.

Musicians are used to dealing with MIDI in general, so there is no reason to remove it (do you plan to remove raw audio data support as well in the future?). How do you explain to customers that they can create MIDI tracks to store MIDI events but cannot process or receive them properly in plug-ins?

Re: VST3 and MIDI CC pitfall

Posted: Thu Jun 27, 2019 10:53 am
by Arne Scheffler
Musicians struggle all the time to handle MIDI, especially young ones. Even I struggle sometimes, even though I have spoken MIDI for 30 years.
MIDI-CC is a black-box system that no one can understand without looking in the manual of the product, or of all the products you work with. That was OK in a time when you only had two or three synthesizers. Today, in a world where you have hundreds of plug-ins on your machine, it does not work if all these plug-ins have different MIDI-CC handling. Even the MIDI Association is trying to move away from this black box with "MIDI-CI" and "Property Exchange", as MIDI-CC in its current form is a bad user experience.

Re: VST3 and MIDI CC pitfall

Posted: Fri Jul 05, 2019 7:02 pm
by signalsmith
Arne Scheffler wrote:
Tue Jun 18, 2019 9:09 am
from my personal point of view, a plug-in should not handle MIDI-CC at all. It's the responsibility for the host to map input events to parameter changes.
Hi Arne!

What I personally think is that MIDI provided two things which are quite valuable:
  1. an ontology of semantic parameter meanings
  2. automatically-linked output parameters
We don't inherently need MIDI for these, but we are currently falling back to MIDI because there isn't an alternative.

Semantic ontology

Sure, it's hacky and defined by convention, and most MIDI controllers lack consistent and widely-respected definitions.

But when supported, things like pitch-bend or sustain-pedal are universal across synths, in terms of their semantic meanings. DAWs and input devices know to make these available, sometimes linked by default to a particular interface (e.g. foot-pedal, or a wheel which snaps back to a neutral position).

VST3 taps into this by using IMidiMapping - but I think that we should acknowledge what this is: VST3 using MIDI as an ontology for parameter meanings, enabling implicit zero-configuration setup for inputs, and hints for interfaces.

Automatically linked outputs

The other thing that MIDI lets us do (which I'm currently struggling with in VST3) is semantically describe our outputs, so that they are automatically linked up to corresponding inputs on any following effects.

Yes, any decent DAW lets us link parameters between effects, but we have to do it explicitly. MIDI lets us implicitly do some of this routing, using controller numbers (or other MIDI events like pitch-bend or aftertouch).

For example, let's say I want to write an "explicit vibrato" plugin: for a monophonic instrument, it adds a particular vibrato pattern to each note using pitch-bend. This should be able to work transparently with any synth - and it really should be possible to have that output routed by default to the synth input parameter tagged as "pitch-bend". Without that, we're forced to either output legacy MIDI messages, or ask the user to wire us up appropriately.

So, those are the gaps MIDI is currently filling for me.


One option: tag outputs in the same way as inputs

Point (1) above is already covered, because we use IMidiMapping to semantically tag our inputs.

I think the simplest solution would be an equivalent of IMidiMapping::getMidiControllerAssignment(), but for output parameters.

That is: we continue to use MIDI controllers as the semantic ontology for our inputs. DAWs would use these semantic tags to automatically link outputs to inputs (by default - the user can always un-link them), and they could do it with full floating-point resolution and sample-accurate timing.

For now, I'm looking at LegacyMIDICCOutEvent, but it does indeed feel like a legacy solution.