VST3 and MIDI CC pitfall

ResonantMind
Posts: 4
Joined: Sun Apr 28, 2019 8:31 pm

Re: VST3 and MIDI CC pitfall

Post by ResonantMind » Tue Jul 09, 2019 11:36 am

Arne Scheffler wrote:
Tue Jun 18, 2019 9:09 am
Hi,
from my personal point of view, a plug-in should not handle MIDI-CC at all. It's the responsibility of the host to map input events to parameter changes. The host knows what external hardware is connected and which routings are active. A plug-in does not have this knowledge.
As a plus point, the plug-in does not need to make any changes to support the forthcoming MIDI 2.0 spec.

Cheers,
Arne
signalsmith wrote:
Fri Jul 05, 2019 7:02 pm
Arne Scheffler wrote:
Tue Jun 18, 2019 9:09 am
Hi,
from my personal point of view, a plug-in should not handle MIDI-CC at all. It's the responsibility of the host to map input events to parameter changes.
Hi Arne!

What I personally think is that MIDI provided two things which are quite valuable:
  1. an ontology of semantic parameter meanings
  2. automatically-linked output parameters
We don't inherently need MIDI for these, but we are currently falling back to MIDI because there isn't an alternative.

Semantic ontology

Sure, it's hacky and defined by convention, and most MIDI controllers lack consistent and widely-respected definitions.

But when supported, things like pitch-bend or sustain-pedal are universal across synths, in terms of their semantic meanings. DAWs and input devices know to make these available, sometimes linked by default to a particular interface (e.g. foot-pedal, or a wheel which snaps back to a neutral position).

VST3 taps into this by using IMidiMapping - but I think that we should acknowledge what this is: VST3 using MIDI as an ontology for parameter meanings, enabling implicit zero-configuration setup for inputs, and hints for interfaces.
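
For reference, here's a minimal sketch of how a plug-in taps into that on the input side, mapping pitch-bend and sustain-pedal to its own parameters. The controller class name and the two parameter IDs are placeholders of mine, not SDK names:

```cpp
// Minimal sketch: exposing pitch-bend and sustain-pedal as plug-in parameters
// via IMidiMapping. "MyController", kPitchBendParamId and kSustainParamId are
// placeholder names for illustration only.
#include "pluginterfaces/vst/ivstmidicontrollers.h"

using namespace Steinberg;
using namespace Steinberg::Vst;

tresult PLUGIN_API MyController::getMidiControllerAssignment (
    int32 busIndex, int16 /*channel*/, CtrlNumber midiControllerNumber, ParamID& id)
{
    if (busIndex != 0)
        return kResultFalse;

    switch (midiControllerNumber)
    {
        case kPitchBend:        id = kPitchBendParamId; return kResultTrue; // 14-bit pitch-bend
        case kCtrlSustainOnOff: id = kSustainParamId;   return kResultTrue; // CC 64
    }
    return kResultFalse; // no mapping for any other controller
}
```

The host then drives those parameters itself; the plug-in never has to see raw CC events.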

Automatically linked outputs

The other thing that MIDI lets us do (which I'm currently struggling with in VST3) is to semantically describe our outputs, so that they are automatically linked up to corresponding inputs on any following effects.

Yes, any decent DAW lets us link parameters between effects, but we have to do it explicitly. MIDI lets us implicitly do some of this routing, using controller numbers (or other MIDI events like pitch-bend or aftertouch).

For example, let's say I want to write an "explicit vibrato" plugin: for a monophonic instrument, it adds a particular vibrato pattern to each note using pitch-bend. This should be able to work transparently with any synth - and it really should be possible to have that output routed by default to the synth input parameter tagged as "pitch-bend". Without that, we're forced to either output legacy MIDI messages, or ask the user to wire us up appropriately.
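
To make that concrete, here's a toy sketch of the vibrato pattern itself (not the routing). The rate, depth and assumed bend range are arbitrary illustration values:

```cpp
// Toy sketch: a sine LFO converted to a 14-bit pitch-bend value (8192 = centre).
// rateHz, depthSemitones and bendRangeSemitones are arbitrary illustration values.
#include <algorithm>
#include <cmath>
#include <cstdint>

int16_t vibratoPitchBend (double timeSeconds)
{
    constexpr double pi = 3.14159265358979323846;
    constexpr double rateHz = 5.5;              // vibrato rate
    constexpr double depthSemitones = 0.3;      // vibrato depth
    constexpr double bendRangeSemitones = 2.0;  // assumed bend range of the target synth

    double lfo  = std::sin (2.0 * pi * rateHz * timeSeconds);
    double norm = (depthSemitones / bendRangeSemitones) * lfo;  // -1..+1
    int bend    = 8192 + static_cast<int> (norm * 8191.0);
    return static_cast<int16_t> (std::clamp (bend, 0, 16383));
}
```

The open question is where that value goes once it's computed - which is exactly the routing gap I'm describing.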

So, those are the gaps MIDI is currently filling for me.

-------------------------------------------------

One option: tag outputs in the same way as inputs

Point (1) above is already covered, because we use IMidiMapping to semantically tag our inputs.

I think the simplest solution would be: an equivalent of IMidiMapping::getMidiControllerAssignment(), but for output parameters.

That is: we continue to use MIDI controllers as the semantic ontology for our inputs. DAWs would use these semantic tags to automatically link outputs to inputs (by default - the user can always un-link them), and they could do it with full floating-point resolution and sample-accurate timing.
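
Roughly what I'm imagining, as a sketch only - to be clear, this interface does not exist in the SDK; the name and method are purely hypothetical:

```cpp
// Purely hypothetical - NOT part of the VST3 SDK. An output-side counterpart to
// IMidiMapping: a host could query it to learn that an output parameter carries,
// say, pitch-bend meaning, and auto-link it to the matching input parameter of
// the next plug-in in the chain.
#include "pluginterfaces/base/funknown.h"
#include "pluginterfaces/vst/ivstmidicontrollers.h"
#include "pluginterfaces/vst/vsttypes.h"

class IMidiOutputMapping : public Steinberg::FUnknown
{
public:
    // Same shape as IMidiMapping::getMidiControllerAssignment, but answering
    // "which of my output parameters corresponds to this controller number?"
    virtual Steinberg::tresult PLUGIN_API getMidiControllerAssignmentForOutput (
        Steinberg::int32 busIndex, Steinberg::int16 channel,
        Steinberg::Vst::CtrlNumber midiControllerNumber,
        Steinberg::Vst::ParamID& id /*out*/) = 0;
};
```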

For now, I'm looking at LegacyMIDICCOutEvent, but it does indeed feel like a legacy solution.
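
In case it's useful to anyone else, this is roughly what pushing pitch-bend out that way looks like - a sketch only, where sendPitchBend is a helper name of my own, called from process() with data.outputEvents. The LSB/MSB split follows the usual MIDI pitch-bend byte order:

```cpp
// Sketch: emitting a 14-bit pitch-bend value as a LegacyMIDICCOutEvent.
// sendPitchBend is a placeholder helper; it would be called from
// IAudioProcessor::process with data.outputEvents.
#include "pluginterfaces/vst/ivstevents.h"
#include "pluginterfaces/vst/ivstmidicontrollers.h"

using namespace Steinberg;
using namespace Steinberg::Vst;

void sendPitchBend (IEventList* outputEvents, int32 sampleOffset, int16 bend14bit)
{
    if (!outputEvents)
        return;

    Event e {};
    e.busIndex = 0;
    e.sampleOffset = sampleOffset;
    e.type = Event::kLegacyMIDICCOutEvent;
    e.midiCCOut.channel = 0;
    e.midiCCOut.controlNumber = kPitchBend;        // controller number from ivstmidicontrollers.h
    e.midiCCOut.value  = bend14bit & 0x7F;         // low 7 bits (LSB)
    e.midiCCOut.value2 = (bend14bit >> 7) & 0x7F;  // high 7 bits (MSB)
    outputEvents->addEvent (e);
}
```

It works, but everything still round-trips through MIDI resolution rather than plain floating-point parameter values with a semantic tag.
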
bluecataudio wrote:
Mon Jun 24, 2019 8:36 am
Unfortunately, it looks like many of our common customers do not share this point of view!

Also MIDI CC can be used for many things (not only parameter changes), so this is definitely an artificial limitation. VST3 is the only plug-in format that does not support MIDI natively.
End user chiming in here.

I'm doing some complicated routing work with effect layers and transitions, deep parameter programming, and integration into video art. I'm in the midst of exploring this and what Cubase is capable of, whether I need other software for certain projects, etc.

So imo, Cubase does not have clear multi-parameter control and is sort of stuck in this weird middle ground of "'to automate' or 'to MIDI'? How do I control these 10 different things inserted on these seven separate tracks with this one thing on my desk? Ouch, my brain hurts."

It's weird, like, I pretty much always have to use MIDI, but MIDI isn't fully supported... or it is sometimes, but not always.

I've discovered a few different ways to accomplish this and overcome these limitations (or, say, lack of intuitiveness), and I'm pretty sure 80% of Cubase users don't know about this.

I will explain.

So, this is the context of wanting to control multiple parameters across multiple tracks, either with one control or with multiple controls. And let's say we are just talking about VST insert plugins. If they don't support MIDI, this pretty much leaves only one option, and this is what I'm pretty sure 80% of Cubase users don't know about, maybe even more like 98%...
Any Quick Control can be routed to any track or normally controllable parameter in the program. The secret is to Shift+Click on a Quick Control, which brings up an alternate menu that lists every controllable parameter in the entire project.

Here is an example.

Create 9 new audio tracks.

Go to the 9th track's Quick Controls and Shift+Click in the parameter selector space.

Go to the VST Mixer.

Here you will find the 8 other audio tracks and all their parameters.

On Audio Track 9, assign QC1 to the Audio 1 fader, QC2 to the Audio 2 fader, and so on.

You can now control all 8 faders from one audio track's Quick Controls... and those Quick Controls, via 'Studio Setup' > 'Track Quick Controls', could be receiving MIDI from, say, a keyboard controller that has 8 faders.


Now, if you wanted to control all 8 Quick Controls from one knob or fader on your keyboard controller, you can go to the 'Studio Setup' window and, in the Track Quick Controls section, create Quick Control CC assignment templates (I've created 10 or so, kept on my desktop in a folder). You could create a template where all 8 Track Quick Controls are assigned to the same MIDI CC. Now all 8 audio faders respond to the one single fader on your controller.


There's another way to do this, depending on the specifics of what you are trying to control.

This will only work for plugins that have MIDI support.

If, say, you wanted one controller knob on your keyboard to control multiple parameters on multiple plugins on one audio track AND other plugins' parameters on other audio (or group or FX, etc.) tracks, you can do this:


Say there are 8 parameters total that you want to control. 4 parameters are on 3 different plugins on ONE track - Audio Track 1.

Audio Track 1
--Ins1 - FilterPluginA
----Frequency Cutoff
----Resonance
--Ins2 - ChorusPluginA
----Depth
--Ins3 - DelayPluginA
----Feedback

FX Track 1
--parameter
FX Track 2
--parameter
FX Track 3
--parameter
FX Track 4
--parameter

Create 8 MIDI tracks and set all their inputs to the same controller - the controller that has the one knob or fader you will be using. Set the outputs to the specific parameters listed above, so that each MIDI track is outputting to one parameter.

Select all the MIDI tracks, arm/monitor them, and voila - it's that simple. Too bad more VST developers don't support MIDI... there should be a marketing push to developers to get them to support MIDI in all their plugins.




So that's sort of been my workflow to accomplish this. Using internal virtual MIDI software like LoopBE30 can help with routing solutions as well... really, a tool like that should be standard in all DAWs.

Either way, it's a bit of a convoluted process in my opinion just to control multiple things. I wish VST developers were encouraged more to include MIDI support in their plugins (like GRM Tools does); it seems there's mostly only MIDI support for VSTis.

Not sure if this has to do with what you guys are talking about, hope it helps.
