Display executed MIDI commands in a notification
I'm experimenting with custom MIDI handling using an iRig BlueBoard and Mozaic.
As working with a BlueBoard is a bit finicky, I sometimes press a wrong button or don't press it at all. So sometimes I don't really know whether my intended MIDI message was sent (e.g. switching the input from 1 to 2, which is also not perceivable from the main screen of GTL).
In https://forum.audiob.us/discussion/comment/830057/#Comment_830057 (and the following post), I asked whether it is possible to send commands to a BlueBoard to make its buttons glow, which would be a way to communicate information back to me. For example: when button A is blue, input 1 is active; if B is blue, input 2 is active; if C is blue, both inputs are active.
This doesn't seem to be possible, so I thought that maybe I could send a message to the iOS notification center through Mozaic. That way I could see notifications wherever I currently am, including within GTL. But this doesn't seem to be easily possible either.
So I thought about this: what about displaying a small notification (for a second or maybe two) somewhere on the GTL main screen whenever a MIDI message is received that executes some binding? And an even cooler thing: allow custom messages to be displayed, triggered by any incoming MIDI (so I could display a notification even for a MIDI message that doesn't do anything in GTL, but perhaps does something in AUM)?
Just an idea.
Comments
Short update: it IS possible to make the BB's buttons glow. But it seems to behave buggily:
https://forum.audiob.us/discussion/comment/831852/#Comment_831852
So it would still be very useful to display notifications inside GTL.
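In case anyone wants to experiment with the LED route anyway, this is roughly the kind of message that has to be sent back to the board. It's only a minimal sketch in Python with mido (not my actual Mozaic script), and the CC numbers 20–22 for buttons A–C, the "echo the CC back to light the LED" behaviour, the channel and the port name are all assumptions taken from the thread linked above — check your own BlueBoard mode settings before relying on them:

```python
# Minimal sketch (Python + mido, NOT my Mozaic script): light a BlueBoard
# button by echoing a CC back to the board. CC numbers, channel and port
# name are assumptions -- adjust to your own BlueBoard mode settings.
import mido

BUTTON_CC = {"A": 20, "B": 21, "C": 22}  # assumed CC mapping of buttons A-C

def set_led(port, button, on):
    """Turn one BlueBoard LED on (value 127) or off (value 0)."""
    port.send(mido.Message("control_change",
                           channel=0,
                           control=BUTTON_CC[button],
                           value=127 if on else 0))

# Example: signal "input 1 is active" by lighting A and darkening B and C.
with mido.open_output("iRig BlueBoard Bluetooth") as out:  # port name is a guess
    set_led(out, "A", True)
    set_led(out, "B", False)
    set_led(out, "C", False)
```

The same three calls map directly onto the input-1 / input-2 / both-inputs states from my first post.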
Added to the list, cheers @josh!
Great, thank you.
For the record: I implemented an audio alternative to this suggestion, simply using a custom preset in a sampler where each note holds a specific audio sample that can be triggered through MIDI. For more info see https://github.com/jmuheim/mozaic-blueboard#giving-audio-feedback
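The gist of it (not the actual Mozaic script from the repo, just a Python/mido sketch of the same idea, with a made-up port name, channel and note numbers): each status gets its own note, each note holds its own sample in the sampler preset, and giving feedback simply means sending that note.

```python
# Sketch only: one MIDI note per confirmation sample in the sampler preset.
# Port name, channel and note numbers are invented for illustration.
import time
import mido

FEEDBACK_NOTE = {
    "input 1 active": 60,  # C4 -> sample saying "one"
    "input 2 active": 62,  # D4 -> sample saying "two"
    "both inputs":    64,  # E4 -> sample saying "both"
}

def play_feedback(port, status, channel=15):
    """Trigger the confirmation sample that belongs to the given status."""
    note = FEEDBACK_NOTE[status]
    port.send(mido.Message("note_on", channel=channel, note=note, velocity=100))
    time.sleep(0.1)  # short gate; assumes the sampler plays one-shot samples
    port.send(mido.Message("note_off", channel=channel, note=note, velocity=0))

with mido.open_output("Sampler") as out:  # hypothetical virtual MIDI port
    play_feedback(out, "input 2 active")
```

Keeping the feedback notes on a dedicated channel avoids collisions with the MIDI that does the actual work.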
Take a look to get some inspiration, and also see the “simplified way” that the backing-tracks model has to offer. Not everyone’s cup of tea, I know, but it comes from the “song over performance” mindset I usually point to.
Cheers!
Thanks for sharing this video. I didn't know about arranger apps before. How do you get the instrument tracks into it? Do you just record them somewhere else and then import them?
What I take from this video is that it sends the audio signals intended for the artist to the left channel and the music itself to the right channel. I had this idea myself already, but wasn't sure whether it would work (i.e. whether sending the music to just one channel would produce enough "juice" for the PA system). It seems it does.
Lumbeat apps are the “in the middle” workflow... check Band-in-a-Box, MIDI arrangers, grooveboxes etc.
The Prime app is an evolution from Ableton towards audio arrangers (soltron, keytron), which use audio stems as backing tracks. The digital domain lets the musician cue points with quantization (like GTL), so linear becomes non-linear: you can re-arrange songs on the fly and give the crowd some new approaches that avoid boring karaoke-like gigs, etc.
There are also MIDI arrangers and similar apps (those Lumbeat ones, e.g.), as well as Ableton-style translations such as modstep. More flexibility, but more risky too...
So a wonderful world