MIDI to CV converter using PWM?

GroupDIY Audio Forum


Consul

So I got one of those $4.30 MSP Launchpads from TI and I've been thinking about a project for it. It's not a very powerful board, using only the bottom-end chips from the MSP range, but I think it'll be good enough for this.

It has A/D converters (two 10-bit channels, I think), but what it doesn't have is any D/A, which I think is odd since D/A is generally easier to implement. (EDIT: I should point out that it's the two chips I got with the Launchpad that don't have D/A. Other MSP chips do.) So, one day I was sitting in class learning all about motor control via PWM, and how, thanks to the inertia of the motor, varying the pulse width of a signal delivers what can be thought of as an equivalent DC voltage to the motor (Vrms = Va * sqrt(duty), if I recall correctly). Well, when I think "inertia," my mind kinda landed on low-pass filters (which essentially average a signal over time) and I got to thinking about how to encode a DC voltage using PWM. Then I wouldn't need a D/A to get a CV signal out to the world.
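The averaging idea is easy to check numerically: the mean of an ideal PWM waveform over one period is just duty * V_high, which is the value a low-pass filter settles toward. A quick sketch (the 3.3 V rail and sample count are illustrative assumptions, not anything specific to the Launchpad):

```python
# Numerically check that the time-average of an ideal PWM waveform
# equals duty * V_high -- the DC value a low-pass filter recovers.
# V_HIGH and STEPS are placeholder values for illustration.

V_HIGH = 3.3   # supply rail, volts (a typical 3.3 V rail is assumed)
STEPS = 1000   # samples per PWM period

def pwm_average(duty):
    """Mean voltage of one PWM period with the given duty cycle (0..1)."""
    samples = [V_HIGH if i < duty * STEPS else 0.0 for i in range(STEPS)]
    return sum(samples) / STEPS

print(pwm_average(0.25))  # ~0.825 V, i.e. 0.25 * 3.3
```

The RMS value is a separate quantity (V_high * sqrt(duty)); for a CV output it's the average that matters, since the filter reconstructs the mean.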

Obviously, this is nothing new, but I was kinda proud of myself for thinking of it. It turns out that there are many applications, including large medical laser power supplies, that use this exact technique for varying (very large) DC voltages. But in order for me to get my theory straight rather than stumbling around, I have some questions.

My main question has to do with sampling theory, actually. Namely, when you're dealing with what is essentially a 1-bit D/A which is outputting a relatively slow-changing non-periodic signal, how do I figure out what the minimum frequency of the original PWM signal needs to be in order to avoid aliasing? I'm thinking it works pretty much the same, ie, if I want the voltage to be able to change as quickly as 1/100 of a second, I would generate the PWM signal at 200Hz.

As an aside, this also leads to the question about "how many bits per wave cycle" I'll need, which will determine the resolution of the output DC voltage. I actually feel I understand this part pretty well. In other words, referring to the PWM wave as 200Hz is not the same as 200 bits per second. It'll actually be many bits per cycle.

This leads, then, to output ripple, and that has to do with how high of a frequency of PWM signal I need to start with to end up with minimal ripple on the DC output. I'm thinking I can apply what I know about power supply design to this and be pretty much spot-on, but I wouldn't mind some input from people who know much more than I do. :)
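For the ripple question, a first-order RC actually has a closed-form answer: in steady state, the peak-to-peak ripple of an RC-filtered PWM wave is V*(1-a)*(1-b)/(1-ab), where a = exp(-d*T/RC) and b = exp(-(1-d)*T/RC), with T the PWM period and d the duty cycle. A sketch with placeholder component values (the R, C, and rail voltage here are assumptions, not a recommendation):

```python
import math

# Steady-state peak-to-peak ripple of a single-pole RC filter driven
# by ideal 0 V / V_HIGH PWM edges. Component values are placeholders.

V_HIGH = 3.3
R = 10e3   # ohms
C = 1e-6   # farads -> RC time constant of 10 ms

def ripple_pp(f_pwm, duty):
    """Peak-to-peak ripple, from the steady-state charge/discharge balance."""
    T = 1.0 / f_pwm
    tau = R * C
    a = math.exp(-duty * T / tau)        # decay factor during the high phase
    b = math.exp(-(1 - duty) * T / tau)  # decay factor during the low phase
    return V_HIGH * (1 - a) * (1 - b) / (1 - a * b)

# At 20 kHz PWM and 50% duty into a 10 ms RC, ripple is a few millivolts:
print(ripple_pp(20e3, 0.5))
```

For T much smaller than RC this reduces to roughly V * d * (1-d) * T / RC, which is the back-of-envelope form familiar from power supply design.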

So what do you think? Is this doable, or is the usual 12- or 16-bit D/A chip still the only way to go for MIDI to CV? Thanks for your help!
 
dmlandrum said:
My main question has to do with sampling theory, actually. Namely, when you're dealing with what is essentially a 1-bit D/A which is outputting a relatively slow-changing non-periodic signal, how do I figure out what the minimum frequency of the original PWM signal needs to be in order to avoid aliasing? I'm thinking it works pretty much the same, ie, if I want the voltage to be able to change as quickly as 1/100 of a second, I would generate the PWM signal at 200Hz.
Huh? The PWM is not one bit; the dynamic range, or bits of resolution, is set by the number of clock ticks in the base PWM repeat period. The basic trade-off is that you want the PWM rate to be >> the frequency you want to pass, but for more dynamic range you want a lot of clock ticks, so these somewhat conflicting goals depend on the clock rate of the processor and the flexibility of the PWM firmware.
As an aside, this also leads to the question about "how many bits per wave cycle" I'll need, which will determine the resolution of the output DC voltage. I actually feel I understand this part pretty well. In other words, referring to the PWM wave as 200Hz is not the same as 200 bits per second. It'll actually be many bits per cycle.
Sampling theory requires 2 samples per cycle to reconstruct the frequency data, while more than 2x is desirable for high fidelity.

This leads, then, to output ripple, and that has to do with how high of a frequency of PWM signal I need to start with to end up with minimal ripple on the DC output. I'm thinking I can apply what I know about power supply design to this and be pretty much spot-on, but I wouldn't mind some input from people who know much more than I do. :)
Ripple also depends on the reconstruction filter, which defines the response time or HF response. A very slow filter can have no ripple and no HF response.
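That trade-off can be put in numbers: a single-pole RC settles a full-scale step to within 1 LSB of n bits in about n * ln(2) time constants, so the same RC that kills the ripple also sets how fast the CV can move. A sketch (the 10 ms time constant is an assumed placeholder):

```python
import math

# How long a single-pole RC takes to settle a full-scale step
# to within 1 LSB of n-bit resolution: tau * ln(2**n) = n * tau * ln(2).

def settle_time(tau, bits):
    """Settling time (seconds) to 1 LSB for a full-scale step."""
    return tau * math.log(2 ** bits)

tau = 10e-3  # 10 ms RC time constant, placeholder value
print(settle_time(tau, 10))  # ~69 ms to settle a 10-bit step
```

So a filter slow enough for millivolt ripple at audio-rate PWM can still respond well within a typical note-on timescale, but the numbers are worth checking for your own parts.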
So what do you think? Is this doable, or is the usual 12- or 16-bit D/A chip still the only way to go for MIDI to CV? Thanks for your help!

PWM is doable, but you need to spend some time on the bench to see how it acts and if your platform is compatible with your desired outcome.

JR

PS: I actually use a PWM output to make a sine wave signal for my drum tuner... not as clean as an analog sine wave generator or 16 bit D/A, but good enough for the drumheads.

 
What I want to do is take a PWM signal generated in the chip, put it out one of the digital pins, and then run it through a low-pass filter to get a steady DC out that corresponds to the duty cycle of the original signal. What I'm concerned with is how to: a) determine the frequency of PWM signal I need, b) determine how many bits per cycle of that PWM signal I need (I mentioned this is the part I feel most comfortable with), and c) reduce ripple at the output.

Part C will probably need some experimentation, since I'm sure there's a level of ripple that the ear will no longer be able to discern when controlling, say, a VCO. This is for MIDI to CV for a modular synth, after all. (Or to put it another way, how much variance can a control voltage signal have when controlling an oscillator before we can hear the variance?)

Now, this DC voltage will not be a steady DC, but rather it's going to change as new keys are hit, or a wheel is moved, or the like. So that makes it a slow, non-periodic signal. How fast I might want that signal to be able to change is where the sampling theorem question comes in.

I really don't see why this wouldn't work. I just wanted to get some stuff hashed out up front. Thanks for the help.
 
A problem I see is this: if you filter out too much ripple (i.e. a strong low pass) you limit the ability of your CV output to change quickly.  Wouldn't an R2R ladder into a buffer be a better way to do this?
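For comparison, here is what an ideal R-2R ladder buys: n GPIO pins give V_out = V_ref * code / 2^n into a buffer, so the pin count directly sets the step size. The 3.3 V reference and 6-bit width below are assumptions for illustration:

```python
# Ideal R-2R ladder DAC output: V_out = V_ref * code / 2**BITS.
# V_REF and BITS are illustrative assumptions (6 spare pins, 3.3 V rail).

V_REF = 3.3
BITS = 6

def r2r_out(code):
    """Ideal R-2R DAC output voltage for an integer code 0 .. 2**BITS - 1."""
    return V_REF * code / 2 ** BITS

print(r2r_out(32))        # mid-scale: 1.65 V
print(V_REF / 2 ** BITS)  # 1 LSB is ~52 mV -- coarse for 1 V/oct pitch CV
```

At 1 V/octave, a 52 mV step is over half a semitone, which is the resolution concern raised below.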
 
That would limit me to (I think) 6 bits of resolution on the output from these chips, which would mean inaccurate pitches (or a small range) on the DC side. The idea is to pull this off with what I have on hand. If I wanted to buy more stuff, there are of course better ways to do it.
 
dmlandrum said:
That would limit me to (I think) 6 bits of resolution on the output from these chips, which would mean inaccurate pitches (or a small range) on the DC side. The idea is to pull this off with what I have on hand. If I wanted to buy more stuff, there are of course better ways to do it.

You'd only be 20-odd keys short of a piano but if it's not enough it's not enough.  I figured a few resistors and a couple quad opamps or similar wasn't a lot to add.  You'd probably need something buffery anyway because the micro likely won't do more than low tens of mA.  That said, if it's gotta be PWM, there's probably a clever way to do it.  Interesting idea.
 
Without knowing more about your micro.. you need to determine A) what output resolution you need and B) what output settling time or speed you can tolerate.

The faster the processor clock, the smaller the time resolution of the PWM. The bits of output resolution are pretty simple: an 8-bit resolution requires a 256-tick-long PWM period, 9 bits needs 512 ticks, 10 bits 1024, etc. The speed math has to do with the update rate, or PWM period. A 1 MHz processor clock requires over a thousand microseconds to output 10-bit resolution, or roughly a 1 ms / 1 kHz PWM period.
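JR's numbers, as a formula: the PWM repeat rate is the timer clock divided by the 2^bits ticks in one period. A quick check (the clock frequencies are his illustrative figures):

```python
# PWM repeat rate vs. resolution for a given timer clock:
# one period is 2**bits ticks long, so rate = f_clk / 2**bits.

def pwm_rate(f_clk, bits):
    """PWM repeat rate (Hz) when the period is 2**bits timer ticks."""
    return f_clk / 2 ** bits

print(pwm_rate(1e6, 10))   # ~976.6 Hz -- JR's "roughly 1 kHz" case
print(pwm_rate(16e6, 10))  # a 16 MHz clock buys ~15.6 kHz at the same 10 bits
```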

I just pulled some arbitrary numbers from my brown hole, but you get the point: there is a trade-off between resolution, update rate, and aliasing. The bottom line is how much resolution you need, and whether you can live with the PWM update rate.

JR
 
Thank you, John and bahrens! You've confirmed more or less what I was thinking. So it's really just a matter of whether the uC is fast enough and what errors we can tolerate in the output signal.
 
> relatively slow-changing ...signal

To do WHAT?

Amplitude? Lights? Bah, 10% accuracy is a slight glitch; a 32-step counter is plenty good.

Musical Frequency? Argh. Musicians are so fussy. So fussy that pitch CV is usually log-bent to be practical. Usually 1V/octave. Consider: 5 octaves, 12 notes per octave, anybody can hear a matching-error of 1/10th of a note, some musicians will hear 1/100th of a note. 5*12*10 is 600 steps for "eh", and 6,000 for very-good.

BTW, I dunno what the chip is, but this kind of constant work should be offloaded from the main brain onto a side-circuit. Little utility CPUs often have counters which can be set and will count-off solo while the main brain does other (async) work. You probably have a 16,000-tick counter, so counts like 7,000 are not absurd.

Taking 16,000 counts 100 times a second is 1,600,000 Hz. Or the deluxe 32K counter needs about 3 MHz. Can you feed MHz into your counters? If not, can you tolerate larger pitch errors? A 1,024-count would be proof of concept and maybe some techno-tunes, and only needs a 100 kHz clock, but more sure would be better.
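PRR's arithmetic can be sketched directly (the update rate and counter sizes are his illustrative figures, not requirements):

```python
# Steps needed to resolve a fraction of a semitone over a 1 V/octave
# range, and the timer clock that many ticks implies per update.

OCTAVES = 5
NOTES_PER_OCTAVE = 12

def steps_needed(fraction_of_note):
    """Counter steps to resolve the given fraction of a semitone."""
    return OCTAVES * NOTES_PER_OCTAVE * round(1 / fraction_of_note)

def clock_needed(steps, updates_per_sec):
    """Timer clock (Hz) for `steps` ticks per period at this update rate."""
    return steps * updates_per_sec

print(steps_needed(0.1))         # 600 steps for "eh" tuning
print(steps_needed(0.01))        # 6000 steps for fussy ears
print(clock_needed(16000, 100))  # 16k counts at 100 Hz = 1.6 MHz
```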

Back when I went to summer school, in the snow, up-hill, both ways, PAiA did similar stuff with, IIRC, a 4-bit 200 kHz processor. So your "not a very powerful board" should be total over-kill.
 