Let's Design and Build a (mostly) Digital Theremin!

Posted: 3/13/2021 2:27:46 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

Rhythm Method

I've got an action plan, and not too surprisingly it boils down to more or less a pared-down version of the Open.Theremin approach - so very nice that their code is open source!

It's sort of a 4-state state machine, though only 2 states - the note on state and the note off state - can be entered @ IRQ; the other two states are transitional. 

Starting at OFF we wait until the volume axis exceeds some threshold and then we transition to the OFF_ON state. 

In the OFF_ON state we calculate the EXPR (expression pedal) and PB (pitch bend) values, transmit them if they have changed, transmit a MIDI note-on, and then transition to the ON state - which completes that IRQ's work. 

Entering the ON state on the next IRQ we again calculate EXPR and PB and transmit them if there is any change.  If the volume axis drops below the threshold we transition to the ON_OFF state, whereupon a MIDI note-off is transmitted and we transition to the OFF state, which ends the IRQ.  A second trigger for the ON => ON_OFF transition is the pitch bend distance exceeding some agreed-upon (between MIDI TX and RX - manual settings) fixed amount such as +/-12 notes.  In that case we do ON => ON_OFF => OFF_ON => ON, which turns off the note, sends new EXPR & PB data, and starts a new note, finishing the IRQ.  This entire sequence transmits 15 bytes (I think we want to transmit 14 bit EXPR data rather than 7 bit data, and we can use MIDI "running status" to shave off the second status byte here), which takes 15 * 320us = 4.8ms, or a rate of 208Hz.  Here's the breakdown:

 note-off = 3 bytes
 EXPR14 = 5 bytes
 PB(14)  = 4 bytes
 note-on = 3 bytes
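
Here's the state machine in rough C form, just to pin the flow down.  All the helper names and the threshold are placeholders rather than actual D-Lev code; OFF_ON and ON_OFF are the transitional states, always passed through within a single IRQ:

  /* Per-IRQ MIDI state machine sketch.  Only OFF and ON persist between
     IRQs; OFF_ON and ON_OFF are transitional.  All names and the
     threshold below are placeholders for illustration. */

  #define VOL_ON_THRESH 16          /* placeholder volume-axis note on/off threshold */

  void tx_note_on(int note);        /* assumed MIDI TX helpers */
  void tx_note_off(int note);
  void tx_expr_if_changed(int vol);           /* 14-bit CC, <= 5 bytes */
  void tx_pb_if_changed(int pitch, int note); /* PB relative to note   */
  int  nearest_note(int pitch);
  int  pb_out_of_range(int pitch, int note);  /* PB limit hit? */

  enum { OFF, OFF_ON, ON, ON_OFF } state = OFF;
  int note;                         /* MIDI note currently sounding */

  void midi_irq(int vol, int pitch) /* one call per IRQ */
  {
      for (;;) {
          switch (state) {
          case OFF:                 /* wait for the volume axis to rise */
              if (vol < VOL_ON_THRESH) return;
              state = OFF_ON;
              break;
          case OFF_ON:              /* start a new note */
              note = nearest_note(pitch);
              tx_expr_if_changed(vol);
              tx_pb_if_changed(pitch, note);
              tx_note_on(note);
              state = ON;
              return;               /* IRQ done */
          case ON:                  /* update the sounding note */
              tx_expr_if_changed(vol);
              tx_pb_if_changed(pitch, note);
              if (vol < VOL_ON_THRESH || pb_out_of_range(pitch, note))
                  state = ON_OFF;   /* note off (and maybe retrigger) */
              else
                  return;           /* IRQ done */
              break;
          case ON_OFF:              /* end the current note */
              tx_note_off(note);
              state = (vol < VOL_ON_THRESH) ? OFF : OFF_ON;  /* OFF_ON = retrigger */
              if (state == OFF) return;
              break;
          }
      }
  }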

So I'm wondering aloud what - if anything - the receive side might be doing to smooth out this rather slow / sparse data stream?  Software folks who write synth code probably know enough about DSP to recognize this as a sampling scenario and treat it as such, but what about all the OS code in between "managing" the MIDI stream?  I don't know if I should meter/time the spaces between data bytes, MIDI commands, clumps of MIDI commands, or globally.  Or if I should meter it at all?  If it were me on the other side I think I'd want related clumps coming in at a fixed rate, and always coming in regardless of change, just to dodge a fractional rate resampling type scenario.  But that would generate a lot of MIDI traffic and wouldn't update the parameters as fast as possible.  It's kind of crazy that we have to worry about MIDI bandwidth at all.

Not addressed here is MIDI note-on velocity, which doesn't make a lot of sense for wind or Theremin type controllers, as the new note is selected a significant time before it is played.   I suppose I'll have to come up with something, but it's an ill-fitting parameter in this scenario, and I'm anticipating trouble.  MIDI is massively keyboard-centric; even the breath controller parameter, which I believe was included for the DX7, is generally ignored by synths, adding to the ill-fitting guessing game.  EXPR is an expression pedal input, and the spec says it's for individual note volume, but it seems to be used (here and there!) to control timbre-type dynamics too.  Ah well, gotta start somewhere.

[EDIT] Should have mentioned pitch bend, which is 14 bits of data.  If we set the PB range to +/-1 octave that's a 24 note span, and 2^14 / 24 = 682.  So sub-note resolution is 682 steps per semitone here, while the ear is at most sensitive to about 3 cents, which only needs ~33 steps per semitone.  It seems we could easily do +/-2 octaves and above here and largely avoid weird note retriggering at the PB limits.  A lot of it depends on how many bits are actually utilized (not truncated or ignored) on the RX end and, again, how the data is filtered in time.
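
The arithmetic, for the record (the ~3 cent ear resolution is the assumption here):

  #include <stdio.h>

  /* Pitch bend resolution vs. bend span: 14 bits spread over the full
     +/- span.  ~3 cents is taken as the ear's best-case resolution. */
  int main(void)
  {
      const double ear_cents = 3.0;
      for (int span = 12; span <= 48; span += 12) {     /* +/- span in half steps */
          double steps = 16384.0 / (2 * span);          /* steps per semitone */
          printf("+/-%2d: %6.1f steps/semitone (only ~%.0f needed for %g cents)\n",
                 span, steps, 100.0 / ear_cents, ear_cents);
      }
      return 0;
  }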

Posted: 3/14/2021 6:14:07 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

Plan B

After agonizing over the metering of MIDI data, I decided MIDI flooding is probably OK because the MIDI link is basically point to point with no merging going on (none at least at 31.25k Baud).  And then I hit on the idea of pulling the UART FIFO into this problem in order to make MIDI data clumps self-timing:


The processor MIDI interface is a register in the register set.  The register is a slave to the processor and a master to the write side of the FIFO connected on its other side.  The read side of the FIFO is a slave to the MIDI UART.  Normally the FIFO write-side full indicator is readable via the processor register, so the processor can know when it is / isn't OK to write and no data gets lost to a full buffer (back pressure).  I instead replaced this with the FIFO read-side not-empty indicator.  There's no danger of writing too many bytes to the MIDI FIFO as it is quite deep (1k bytes) and our MIDI data clumps are quite shallow (15 bytes max).  So the new plan is to have the processor wait until the last byte in a clump is taken by the UART before writing another clump.  This required a bit of editing and a new FPGA load, but now the SW MIDI state machine can just hold off until the FIFO is empty, and doesn't require any sort of timer.  You don't want to calculate any MIDI data before you can transmit it; there's no point in a bunch of old data growing even staler in a deep FIFO.
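
In code the hold-off is about as simple as it gets - something like the sketch below.  The register names, addresses, and flag bit are made up for illustration, not the actual register map:

  /* Hold off until the UART has drained the previous clump, then burst the
     next one in.  MIDI_STAT / MIDI_DATA and the flag bit are made-up names
     and addresses for this sketch. */

  #define MIDI_STAT      (*(volatile unsigned *)0x8010)
  #define MIDI_DATA      (*(volatile unsigned *)0x8014)
  #define FIFO_NOT_EMPTY (1u << 0)     /* read-side not-empty indicator */

  int midi_tx_clump(const unsigned char *buf, int len)
  {
      if (MIDI_STAT & FIFO_NOT_EMPTY)  /* previous clump still draining */
          return 0;                    /* hold off; try again next IRQ  */
      for (int i = 0; i < len; i++)    /* 1k FIFO vs. 15 byte clumps: no overflow */
          MIDI_DATA = buf[i];
      return len;
  }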

Coding is going well but I'm quite cramped for bytes, which is making development difficult.  I barely have MIDI doing anything (just playing fixed velocity notes is more fun than it should be - perhaps the novelty will wear off?) and already I've removed the PV_MOD UI page and replaced it with a MIDI page.  PV_MOD is only there for radical filter center frequency shifts, like abrupt vowels and such - and as much as I wanted to like it and employ it, it mostly just sits there gathering dust.  Without it vowels can still be shifted, only over larger ranges and in a less controlled manner, and that's probably enough for most stuff.  A couple of minor presets will bite the dust.  Giving MIDI the flexibility it deserves requires more than one free knob, and PV_MOD didn't take up huge amounts of code or real time, but anything is a help, and the page and knobs will come in handy.

Posted: 3/19/2021 9:03:24 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

Q-tip

Just did a quick experiment measuring antenna voltage droop vs. calculated Q droop.

I was wondering if increasing C as the hand approaches the antenna could account for the quite noticeable antenna voltage swing droop.  Some equations for series LCR:

  F = 1/(2*pi*sqrt(L*C))

  C = 1/((2*pi*F)^2 * L)

  Q = (1/R)*sqrt(L/C)

As usual, the coil was a 3.826mH / 46.5 ohm DCR single layer solenoid driven by the 8 xistor ECL_LC oscillator w/ 47 ohm output, connected to the standard 3/8” antenna and measured with the home made HV probe.

Q is a direct voltage gain figure, but the measured voltage droops much more than the Q back-calculated from the series LCR equations would suggest.
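
For concreteness, here's that back calculation as a throwaway C snippet (690kHz is the frequency from the edits below; R is taken as coil DCR + the 47 ohm drive):

  #include <math.h>
  #include <stdio.h>

  /* Back-calculate tank C and series Q from the equations above.
     690kHz is the observed frequency; R is DCR + drive. */
  int main(void)
  {
      const double pi = 3.141592653589793;
      const double L  = 3.826e-3;        /* coil inductance, H  */
      const double R  = 46.5 + 47.0;     /* DCR + drive, ohms   */
      const double F  = 690e3;           /* observed frequency  */

      double w = 2.0 * pi * F;
      double C = 1.0 / (w * w * L);      /* C = 1/((2*pi*F)^2 * L) -> ~14pF */
      double Q = sqrt(L / C) / R;        /* Q = (1/R)*sqrt(L/C)    -> ~177  */

      printf("C = %.1f pF, Q = %.0f\n", C * 1e12, Q);
      return 0;
  }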

Conclusion: hand/body R must be the dominant damping factor.

[EDIT] Further proof of the conclusion: I draped an insulated ground wire over the antenna and measured 220Vpp @ 690kHz.  With my hand I was getting 162Vpp @ 690kHz here.

[EDIT2] I probably wasn't clear enough in describing the mechanism under test here.  The third equation above shows that Q is proportional to the inverse square root of C, so as the hand approaches the antenna and C increases, we would expect Q to drop.  Which it does, but not at the much higher observed rate, which is due to the extra damping provided by what appears to be the resistance of the human body.  I suppose some of it could be RF emission as well, as the human body would be a better RF radiator than a rod or plate that is small relative to the wavelength.

Posted: 3/19/2021 9:39:01 PM
JPascal

From: Berlin Germany

Joined: 4/27/2016

What if you didn't use your body-C, but a fixed capacitor to ground that has no R?

Posted: 3/19/2021 10:02:46 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

"What if you didn't use your body-C, but a fixed capacitor to ground that has no R?"  - JPascal

That's effectively what I did with the draped ground wire, which exhibited much less droop at the same frequency.  I wonder if anyone has done a backward calculation to determine a rough human body series R (to use e.g. in oscillator Spice simulations)?

[EDIT] This is such an obvious basic experiment, I don't know why I didn't think to do it years ago.

Posted: 3/20/2021 3:50:53 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

Damp Hands

In order to roughly model hand damping voltage droop, I just performed the following rather inexact experiment.  I grounded one test lead and clipped a resistor to it.  Then I clipped a second test lead to the other side of the resistor and draped this second test lead over the antenna.  I changed the resistor value until damping at 700kHz was roughly the same as my hand, which turned out to be 1.5k Ohms.  Testing different frequencies by increasing and decreasing the test lead coupling - and therefore the capacitance - revealed a fairly good correlation with my hand damping.


The above is what I'll be simulating with in the future, where R is the DCR + oscillator drive resistance.
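
To put that model in code form, below is a series-to-parallel transform sketch of the hand branch (the 1.5k through a small coupling capacitance to the antenna node) loading the tank.  The coupling C value is just a guess, so treat the output as ballpark only:

  #include <stdio.h>

  /* Series-to-parallel transform of the hand branch (1.5k through a small
     coupling C to the antenna node) to estimate how much it loads the tank.
     C_hand is a guess; L, R_s, and R_hand are from the posts above. */
  int main(void)
  {
      const double pi     = 3.141592653589793;
      const double L      = 3.826e-3;    /* coil inductance, H           */
      const double R_s    = 46.5 + 47.0; /* series loss: DCR + drive     */
      const double F      = 690e3;       /* operating frequency, Hz      */
      const double R_hand = 1.5e3;       /* equivalent hand R (measured) */
      const double C_hand = 2e-12;       /* assumed hand coupling C      */

      double w   = 2.0 * pi * F;
      double Q_u = (w * L) / R_s;                 /* unloaded tank Q       */

      double Q_b = 1.0 / (w * C_hand * R_hand);   /* hand branch Q         */
      double R_p = R_hand * (1.0 + Q_b * Q_b);    /* equiv. parallel load  */

      double Q_l = R_p / (w * L);                 /* Q due to hand loading */
      double Q_t = 1.0 / (1.0 / Q_u + 1.0 / Q_l); /* combined Q            */

      printf("Q unloaded = %.0f, loaded = %.0f (ratio %.2f)\n",
             Q_u, Q_t, Q_t / Q_u);
      return 0;
  }

With the 2pF guess the ratio comes out around 0.75, which is at least in the same neighborhood as the 162Vpp / 220Vpp droop from the previous post - but don't read too much into that until the coupling C is actually measured.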

Posted: 3/24/2021 6:11:04 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

MIDI Nonstandard (The CC14 Blues)

OK, got basic MIDI and all the UI page knobs up and fully operational yesterday.  Will go into the details in a future post.

Lost maybe 4 hours yesterday to what I thought was a bug in my code, but it seems to be more of a bug in the debugger and/or the MIDI standard itself.  The debugger I'm using is ReceiveMIDI, written by Geert Bevin (the guy who implemented the Linnstrument SW), and it's quite nice!  It's a command line tool: you configure it to listen to a port, apply whatever data filters you like, and it spews out a line for each filtered MIDI command received.  I noticed MIDI CC14 (14 bit controller change) messages weren't monotonically increasing / decreasing with smooth left hand movement - values here and there were off a little, but systematically so.  Examining CC14 as separate CC7 messages made it clear that the LSBs (least significant bytes) weren't always getting combined with the MSBs (most significant bytes). 

This is a major, major flaw in the MIDI standard itself.  To transmit a CC14 you send the MSB to its CC address, then the LSB to the CC address + 32.  But you don't have to send the LSB if you don't want to.  Right here you can immediately see trouble: how does the receiver know when to use the data if the LSB might never arrive?  The MIDI spec says the reception of the MSB zeros out the LSB, but that's terrible if you do send the LSB later, because it causes two actions and a glitch: apply MSB with LSB=0 (not what you want - glitch!), then MSB+LSB (what you want).  You could send the LSB first (like is done for pitch bend - the spec is inconsistent too), but then the MSB that follows zeros out the LSB you just sent, and the receiver will act on the lone LSB immediately anyway!  This is a total mess; I can't believe they didn't think something this fundamental all the way through.  I mean, what were they smoking?
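
For reference, the TX side by itself is simple enough - this is just the 5-byte running-status clump from my earlier estimate (midi_tx_byte() is a stand-in for whatever pushes a byte at the UART FIFO); it's the RX side ordering rules that are the mess:

  /* 14-bit CC per the spec as described above: MSB to the CC number,
     LSB to CC number + 32, with running status so the whole clump is
     5 bytes.  midi_tx_byte() is a stand-in for the UART FIFO write. */

  void midi_tx_byte(unsigned char b);       /* assumed elsewhere */

  void midi_tx_cc14(unsigned char chan, unsigned char cc, unsigned val14)
  {
      unsigned char msb = (val14 >> 7) & 0x7F;
      unsigned char lsb =  val14       & 0x7F;

      midi_tx_byte(0xB0 | (chan & 0x0F));   /* control change status   */
      midi_tx_byte(cc & 0x1F);              /* MSB controller (0..31)  */
      midi_tx_byte(msb);
      midi_tx_byte((cc & 0x1F) + 32);       /* LSB controller (32..63) */
      midi_tx_byte(lsb);                    /* RX glitches if it acted on the MSB alone */
  }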

Here's some folks discussing / crabbing about it and wondering what the heck to do:
https://community.vcvrack.com/t/14-bit-midi-in-1-0/1779

Here's Haken talking about how they do things in light of the rotten MIDI spec:
https://www.hakenaudio.com/mpe

It seems the MPE update could have dealt with this issue - but it didn't!  So Haken had to roll their own MPE+.  Lordy.

I see that Haken always transmits MIDI velocity 127 (max), which is something I was considering, as velocity heavily conflicts with manual envelope modulation and Theremins aren't percussive / don't generally have ADSR handling the attack.  BTW the spec says to send 64 here.

Haken also uses an absolutely enormous 96 note pitchbend range, which is something they can do with the higher resolution of their proprietary standards extension (extra byte of pitch bend).  I agree with everything they say on that page, sharp guys.

OT: Just switched from Google / duckduckgo search engines to Yandex: https://yandex.eu/.  Tired of the other engines taking my search terms only as mild suggestions, presenting me with page after page of vaguely related nonsense to wade through.

[EDIT] Ha! Yandex pointed me to this, first page: https://www.facebook.com/groups/53827181469/permalink/10156550867896470

Posted: 3/25/2021 4:24:19 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

MIDI UI

Just updated the librarian, here's a screen shot:

At the upper right you can see the new MIDI UI page, which replaced the old PV_MOD page. 

- CONTROLS -
cloc[0:63] : CC minimum value location [-48dB:0dB] (volume axis).
ctrl[0:31] : CC control number (modulated by volume axis).
tloc[0:63] : Key-on/off location [-48dB:0dB] (volume axis, with hysteresis).
chan[0:15] : MIDI Channel.
velo[0:31] : Key-on velocity scaling, 0 gives constant 64, 31 gives constant 127, values in between use volume hand velocity.
bend[0:127] : Pitch bend span (+/- half steps, e.g. 24 gives +/-2 octave PB span).
oct[-7:7] : Pitch octave offset.

For Theremin type playing I set ctrl=2 (breath) or ctrl=11 (expression pedal) and tloc < cloc so I don't hear the note-on event.  Also velo=31 to get full volume.  And bend=36 gives plenty of range before retriggering - I'm not hearing much in the way of pitch stepping even with absurdly high settings of bend, though I do hear some volume stepping depending on the preset and ctrl type.

I used my new-ish inverse scaling types for bend and cloc.  For bend it's maybe more obvious why: the larger the pitch bend span, the lower the gain applied to the pitch bend value.  For cloc, the higher the setting, the smaller the range the control operation has to be compressed into.
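
To illustrate the bend relation: the 14-bit value the synth sees scales as offset / span, so doubling the span halves the gain.  A tiny sketch of that (names made up, not the actual D-Lev code):

  /* Pitch offset (semitones from the note center) to a 14-bit PB value:
     gain is proportional to 1/bend, so doubling the span halves it. */

  unsigned pb14(double semis, int bend)     /* bend = +/- span in half steps */
  {
      double pb = 8192.0 + (semis / bend) * 8192.0;   /* center = 0x2000 */
      if (pb < 0.0)     pb = 0.0;
      if (pb > 16383.0) pb = 16383.0;
      return (unsigned)(pb + 0.5);
  }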

This morning I was noticing lots of strange note retriggering at the on/off threshold.  I maxed out the hysteresis but it was still there.  Turns out the MIDI cable was running right by the volume plate and its ribbon cable, inducing fairly heinous interference.  Moved it a few inches away and everything's fine - whew!

I'd give you a recording but I'm having a devil of a time figuring out how to route the Yoshimi soft synth audio to Audacity.

Posted: 3/25/2021 5:33:27 PM
pitts8rh

From: Minnesota USA

Joined: 11/27/2015

I'm excited to try this. 

Sample players sound terrible with pitch wheel bends, but is it going to sound better when you control an actual synthesizer, not a sampler?  Or will it have the same chipmunk effect if you deviate too far from a sine wave?

So if you bring up a note and sweep more than two octaves above it does it retrigger at the two octave point with the bend snapping to zero? Or what does happen?

Posted: 3/25/2021 8:54:42 PM
dewster

From: Northern NJ, USA

Joined: 2/17/2012

"I'm excited to try this."  - pitts8rh

I'm excited for you to try this!

"Sample players sound terrible with pitch wheel bends, but is it going to sound better when you control an actual synthesizer, not a sampler?  Or will it have the same chipmunk effect if you deviate too far from a sine wave?"

It really depends on the patch.  I haven't looked into what's going on inside Yoshimi (it's not bad, but I'm mainly using it as a noise source for dev), but many of the patches do the chipmunk thing with pitch bend, and some don't.  Why one would tie things like formants to the pitch wheel is beyond me; perhaps it's some sort of default one needs to turn off?  A big point of synthesis is to get away from those baked-in resonances.

"So if you bring up a note and sweep more than two octaves above it does it retrigger at the two octave point with the bend snapping to zero? Or what does happen?"

Exactly!  Say you've got bend=24 and start on, say, D3 and sweep up.  Once you hit the center of D5 the old stretched D3 note is turned off, the pitch bend is re-centered, and a new D5 note is played with the original velocity that the old D3 was played at.  The patch really dictates how much of an audible glitch is generated at the transition - if there is significant ADSR decay time then you will hear the pitch wheel reset and perhaps even two notes playing for a while; if not then it's just a short blip.  But you can set bend=36 or bend=48 (or even higher if your synth supports it - Yoshimi can do 64 max) and you'll never hit a retrigger unless you're flailing around like Joe Cocker in subsonic land.  The pitch maxes out at A8 (the D-Lev limit; MIDI note 117).
