" and a very time critical routine (multiplication of the actual wavetable value, a 16bit unsigned integer with the volume value, a 8bit unsigned integer) is even implemented in assembler language." - Thierry
IMO this is one of the problems we engineers suffer from - we get tied to our paradigm and don't think outside our "home" discipline..
IMO, digital multiplication of the waveform by the volume is an example of this - the quality of the waveform degrades as the volume is reduced, and it simultaneously requires fast real-time routines, burdening the processor.
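To put that in concrete terms, here's a minimal C sketch (not Thierry's actual routine, just the kind of multiply being described):

    #include <stdint.h>

    /* Scale a 16-bit unsigned sample by an 8-bit volume (0..255).
       The 24-bit product is shifted right by 8 to fit back into 16 bits. */
    static inline uint16_t scale_sample(uint16_t sample, uint8_t volume)
    {
        return (uint16_t)(((uint32_t)sample * volume) >> 8);
    }

At volume = 1 the output can only take the values 0..255 - effectively an 8-bit waveform - and the multiply has to run at the full sample rate, which is exactly why such routines end up hand-coded in assembler.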
Simply take the volume value to a D/A (PWM is usually OK and can be output from an on-MCU timer - as volume doesn't vary rapidly, a large TC of perhaps 3ms is fine) and use an analogue multiplier (VCA) to control the volume - the 16-bit waveform is then not degraded.
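As a purely illustrative sketch of the PWM side (register names here are for an ATmega328-class AVR - other MCUs differ, and the RC values are just one choice giving a ~3ms time constant):

    #include <avr/io.h>

    /* Timer0 in fast-PWM mode drives the volume level out on OC0A (PD6).
       An external RC filter (e.g. 10k / 330nF, tau ~ 3.3ms) smooths the
       PWM into a DC control voltage for the VCA's control input. */
    void volume_pwm_init(void)
    {
        DDRD  |= (1 << PD6);                                   /* OC0A pin as output */
        TCCR0A = (1 << COM0A1) | (1 << WGM01) | (1 << WGM00);  /* fast PWM, non-inverting */
        TCCR0B = (1 << CS00);                                  /* no prescaler: ~62.5kHz PWM at 16MHz */
    }

    void volume_pwm_set(uint8_t volume)
    {
        OCR0A = volume;  /* duty cycle = volume / 256 */
    }

The MCU then never touches the audio samples for volume at all - the 16-bit waveform goes straight to its own D/A, and the VCA does the multiplication in the analogue domain.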
(my "home" is of course the other extreme - I probably build highly complex analogue waveshapers when a digital one would be more sensible! ;-)
Ha! You're not wrong there, Fred, but I would add that programmers are no better. I studied primarily business computing (when the business mainframe had an escalating demand for COBOL programmers to maintain 20-year-old code and home micros were just 8-bit toys) and the distinction between hardware and software was absolute. It took me a couple of years of hanging out with embedded processor guys who designed complete systems - hardware and software - to realise that this was a misconception and that functionality could migrate from one to the other.
" It took me a couple of years of hanging out with embedded processor guys who designed complete systems - hardware and software - to realise that this was a misconception and that functionality could migrate from one to the other." - GordonC
Yeah - that's where I was in the '80s.. Back then in the embedded field, where we designed equipment using processors, there was no clear delineation between software engineers and hardware engineers - we were all expected to be capable of both.. Some were more efficient at SW and some better at HW, so work did tend to be steered a bit - but everyone could code, everyone knew basic logic at least, and quite a few had a basic analogue understanding as well..
That all changed.. First with C (we all programmed in assembler - I was the first to get into C, with a compiler I bought personally), but the real change was when C++ was adopted - that's when I got out of software; I just couldn't get on with canned objects which hogged resources and never worked.. The paradigm of "software engineers don't need to understand hardware" and "hardware engineers just need to provide what software engineers demand" set in -
It's so damn stupid! I was able to do more with a small 8-bit MCU than the new generation could do with a big expensive MCU and tons of external hardware, because I was still programming in ASM+C and designing my own external HW, while the new generation played with their Lego software and demanded hardware that would have been simple if the code had been written instead of pasted.
Two years after C++, the company (which had been making £M's in profits) went to the wall - the big expensive projects sank it.. Only one product they made got taken off the receiver and is still in production - it uses an 8-bit MCU with code in ASM+C ;-)
My biggest puzzle is understanding how technology is advancing the way it is - it's incredible to me that this division hasn't thwarted progress.. There must still be some seriously capable engineers managing things.
Fred.
PS - Dewster.. Just saw your toast(er) LOL! Brilliant! ;-) ... It's a theremin, Jim, but not as we know it....
"My biggest puzzle is understanding how technology is advancing the way it is - its incredible to me that this division hasnt thwarted progress.. There must still be some seriously capable engineers managing things." - FredM
IMO it's maybe 95% Moore's Law keeping things going. Code and processors can be the most bloated things in the world but no one notices much because speed and memory capacity keep expanding geometrically. Kind of tough on the embedded crowd though.
Personally I'm flabbergasted whenever my dual-core 3GHz PC goes out to lunch for several seconds. Maybe it ran out of zeros and needs to compute more via Euler's Identity? (Though one would think the dedicated zero register in most processors would obviate the need.)
Randy, please don't add another thing to the list of things we're waiting for - try to tell us about your experience with the Theremini. So far you have been the only thereminist to play the instrument, so your opinion is very important.
Hello again, almost done... I've been all kinds of wacky busy in the past two weeks... It will soon be time to dry the baguette and make some croutons. I'm not going to hype the salad that's on its way too much with this post... Thanks to all o' y'all who have been waiting patiently. Writing about the theremin is, for me, definitely on the opposite end of the fun spectrum from actually playing the theremin... But writing whilst being busy with a ton of other things is definitely off the scale.