Consul
So I got one of those $4.30 MSP430 LaunchPads from TI, and I've been thinking about a project for it. It's not a very powerful board, since it ships with only the bottom-end chips from the MSP430 range, but I think it'll be good enough for this.
It has A/D converters (2 x 10-bit, I think), but what it doesn't have is any D/A, which I find odd since D/A is generally easier to implement. (EDIT: I should point out that it's the two chips I got with the LaunchPad that lack D/A. Other MSP430 chips do have it.) So, one day I was sitting in class learning all about motor control via PWM, and how, thanks to the inertia of the motor, you can vary the pulse width of a signal and it delivers what can be thought of as an equivalent DC voltage to the motor (Vrms = Va*sqrt(D) if I recall correctly, though for an averaging filter it's the mean value, Vavg = Va*D, that actually comes out). Well, when I think "inertia" my mind kinda landed on low-pass filters (which essentially average a signal over time), and I got to thinking about how to encode a DC voltage using PWM. Then I wouldn't need a D/A to get a CV signal out to the world.
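For what it's worth, here's the sort of setup I'm picturing on the chip side. This is just an untested sketch assuming one of the G2xx value-line parts where Timer_A's TA0.1 output can be routed to P1.2; the register names and clock speed may differ on other MSP430s:

```c
#include <msp430.h>

// PWM-as-DAC sketch: Timer_A in up mode, output mode 7 (reset/set)
// on P1.2 (TA0.1). Duty cycle = TACCR1 / (TACCR0 + 1); an external
// RC low-pass on P1.2 then recovers roughly Vcc * duty as a DC level.
int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;      // stop the watchdog

    P1DIR |= BIT2;                 // P1.2 as output
    P1SEL |= BIT2;                 // route Timer_A output TA0.1 to P1.2

    TACCR0 = 1023;                 // 1024 counts per period: ~10-bit resolution
    TACCR1 = 512;                  // 50% duty, so Vout should sit near Vcc/2
    TACCTL1 = OUTMOD_7;            // reset/set: high until CCR1, low until CCR0
    TACTL = TASSEL_2 | MC_1;       // clock from SMCLK, count up to TACCR0

    __bis_SR_register(LPM0_bits);  // sleep; the timer keeps running
    return 0;
}
```

Writing a new value into TACCR1 would then move the CV, one update per PWM period.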
Obviously, this is nothing new, but I was kinda proud of myself for thinking of it. It turns out that many applications, including large medical laser power supplies, use this exact technique for varying (very large) DC voltages. But in order to get my theory straight rather than stumbling around, I have some questions.
My main question actually has to do with sampling theory. Namely, when you're dealing with what is essentially a 1-bit D/A outputting a relatively slow-changing, non-periodic signal, how do I figure out the minimum frequency the original PWM signal needs in order to avoid aliasing? I'm thinking it works pretty much the same as ordinary sampling, i.e., if I want the voltage to be able to change as quickly as once per 1/100 of a second, I would need to update the duty cycle, and therefore generate the PWM signal, at 200 Hz or faster.
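To check my own reasoning, here's the arithmetic I'm doing in my head, with hypothetical numbers:

```c
#include <stdio.h>

// Treating each duty-cycle update as one sample of the CV signal,
// the update rate has to be at least twice the fastest frequency
// I want in the output (plain Nyquist).
int main(void)
{
    double cv_bandwidth = 100.0;              // fastest change: 1/100 s
    double min_update   = 2.0 * cv_bandwidth; // Nyquist rate: 200 Hz

    // Since the duty register only takes effect once per PWM period,
    // the carrier frequency is also the maximum update rate, so the
    // PWM itself has to run at 200 Hz or (in practice) much faster.
    printf("minimum duty update rate: %.0f Hz\n", min_update);
    return 0;
}
```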
As an aside, this also leads to the question of how many timer counts per wave cycle I'll need, which will determine the resolution of the output DC voltage. I actually feel I understand this part pretty well. In other words, calling the PWM wave 200 Hz is not the same as saying 200 bits per second; each cycle is divided into many counts, and the resolution in bits is the log base 2 of that count (1024 counts per cycle gives 10 bits, for example).
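Here's the trade-off as I understand it, assuming a hypothetical 16 MHz timer clock: every extra bit of resolution doubles the counts per period and halves the carrier frequency:

```c
#include <stdio.h>

// Resolution vs. carrier frequency for a timer clocked at f_clk:
// with N counts per PWM period you get log2(N) bits of duty
// resolution, but the carrier drops to f_clk / N.
int main(void)
{
    double f_clk = 16000000.0;              // timer clock, Hz (assumed)
    for (unsigned bits = 8; bits <= 16; bits += 2) {
        unsigned long counts = 1UL << bits; // counts per PWM period
        printf("%2u bits -> %6lu counts -> %8.1f Hz carrier\n",
               bits, counts, f_clk / counts);
    }
    return 0;
}
```

If I'm reading that right, 16 bits of PWM at 16 MHz leaves only about a 244 Hz carrier, which is part of why I'm wondering whether this can really compete with a dedicated D/A.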
This leads, then, to output ripple, which comes down to how high a PWM frequency I need to start with to end up with minimal ripple on the DC output. I'm thinking I can apply what I know about power-supply design here and be pretty much spot-on, but I wouldn't mind some input from people who know much more than I do.
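Borrowing the usual first-order approximation from power-supply design (peak-to-peak ripple of roughly Vcc * D * (1 - D) / (f_pwm * R * C) when RC is much longer than one period, worst at 50% duty), here's a quick estimate with made-up component values:

```c
#include <stdio.h>

// Single-pole RC ripple estimate for a PWM-fed low-pass filter.
// All component values are hypothetical, just to get a feel.
int main(void)
{
    double vcc   = 3.6;      // supply rail, volts (assumed)
    double f_pwm = 15625.0;  // carrier: 16 MHz clock / 1024 counts
    double r     = 10e3;     // filter resistor, ohms (assumed)
    double c     = 1e-6;     // filter capacitor, farads (assumed)
    double d     = 0.5;      // worst-case duty cycle

    double ripple = vcc * d * (1.0 - d) / (f_pwm * r * c);
    printf("worst-case ripple: ~%.1f mV peak-to-peak\n", ripple * 1000.0);
    return 0;
}
```

With those numbers it comes out around 5.8 mV peak-to-peak, which is already bigger than a 10-bit LSB at 3.6 V, so I suspect a steeper filter or a faster carrier would be needed to do better than that.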
So what do you think? Is this doable, or is the usual 12- or 16-bit D/A chip still the only way to go for MIDI to CV? Thanks for your help!