DIY success - cybernetic heater control

JohnRoberts

JohnRoberts said:
I'm just now finishing up my DIY project to make a time/temperature controller for my little spot heater so I can turn off the in-wall heat unit back in my bedroom. I had a little trouble calibrating the diodes, but I now have a nice temperature readout, digital clock, and smart thermostat (almost finished).


JR

OK, we have fire in the hole, but under cybernetic control.

Just in time for spring, when I won't need this at all, I can now thermostatically control my small 800W auxiliary heater with different temp targets, changeable every 15 minutes over 24 hours. So now I can turn off my in-wall heater and heat up the bedroom just before I retire, and again just before I get up in the morning. Perhaps I'll set a low but comfortable temp at night, and completely off during the day. This will use less electricity in total than my current in-wall heater running 24x7 at a low but not-off setting. I can now make it warmer when I want it warm, and completely off when I don't care.

I was able to repurpose an old drum tuner proto PCB with a 12-note display as a crude digital clock with 15-minute resolution, using the quarter-note LEDs (green = AM, red = PM). In temp mode, green = actual temp, red = temp target.

I count power supply AC zero crossings for the clock time base.
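A minimal sketch of the idea in C; the ISR name, the variables, and 60 Hz mains are assumptions for illustration, not the actual firmware:

/* Clock time base from mains zero crossings: 60 Hz mains gives 120
   zero crossings per second. All names here are illustrative. */
volatile unsigned char crossings = 0;
volatile unsigned char seconds = 0, minutes = 0, hours = 0;

void zero_cross_isr(void)            /* fires on every AC zero crossing */
{
    if (++crossings >= 120) {        /* 120 crossings = 1 second at 60 Hz */
        crossings = 0;
        if (++seconds >= 60) {
            seconds = 0;
            if (++minutes >= 60) {
                minutes = 0;
                hours = (hours + 1) % 24;
            }
        }
    }
}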

I use 4 diodes in series for my temperature probe, crudely calibrated for a 60-72°F range, but this is arbitrary and flexible; I'll ultimately set it for comfort, not precise temps.
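For what it's worth, a forward-biased silicon diode drops roughly 2 mV less per °C as it warms, so a string of four gives a usable slope into an ADC. A minimal conversion sketch, with made-up calibration constants (the real probe was calibrated empirically):

/* Crude °F readout from an ADC reading of 4 series diodes. The two
   calibration constants below are hypothetical placeholders. */
#define ADC_AT_60F  512              /* ADC counts measured at 60 °F */
#define ADC_AT_72F  470              /* ADC counts measured at 72 °F */

int read_temp_f(int adc_counts)
{
    /* Diode drop falls as it warms, so counts decrease with temperature;
       interpolate linearly between the two calibration points. */
    long span = ADC_AT_60F - ADC_AT_72F;   /* counts across 12 °F */
    return 60 + (int)(((long)(ADC_AT_60F - adc_counts) * 12) / span);
}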

I drive a 16A triac from a small opto-triac for mains supply isolation.

It is a little awkward, and a small LCD display would be hipper if I were making a real product. I save the temperature targets into flash memory, so even if the power blinks off I don't lose settings, just a little time on the clock that is easy to reset.

DIY can make us greener and save us money.

It's really nice when a plan comes together; my out-of-pocket cost was only a few $ for the power triac and opto-triac. But I did spend more time than I'd like to admit writing software. It took me minutes to sketch the overall program design, and multiple hours to actually code it.  :-( I'm easily distracted and my boss tolerates my diversions.

JR

PS: I needed to heat-sink the triac when running my 800W heater, so I couldn't make this too small, though it could be smaller than it is now. The isolated-tab triac makes it relatively easy to heat-sink.
 
Good stuff John.

I recycle everything too. I have tons of microcontroller-based boards left over from various jobs. If/when a one-off job comes up, there is always something to fit it into.

But my fetish is really with the screws. Boxes and boxes that came out of dismantled equipment. Let's put it that way. I have been manufacturing (educational) robot arms since '97, and all the micro screws for the gripper gear assembly came from old compact cameras.
 
sodderboy said:
I cheated and got this for $120.  I turn my heat on and off from the phone. 
It will be a huge saver in the summer with the AC.
Mike
I really like the concept, obviously... my in-wall heater is very old school, but now I guess I just need a bigger triac to control it too.

My air-con/heat-pump has delay functions and a hand-held IR remote for temp/mode/delay, but no actual computer interface. I am half considering jury-rigging an IR link so I can have one of my custom TOD/TEMP controllers actually talk to the air-con/heat-pump too. Of course I need to decode the IR output from the hand-held remote... I think there are computer accessories that can do that. But no need to tie up a computer for such a simple decision maker.

I already turn off the air when I am away, and set it to come back on before I get home.

I was surprised how cool my bedroom got today with the heat completely off. While not very cold outside, the bedroom was within a couple of degrees of the bottom of my display scale. That is actually good news; it means more electricity to save during daytime hours.

JR


 
> fire in the hole, but under cybernetic control.

Cool. (Un-cool?)

I must re-endorse the Honeywell cyber-thermostat.

Especially for older electric baseboard which creaks.

No, it does not have all-day 15-minute settings (but who is that predictable?). Four per day, though I think they can be set to a minute. (You DO have to set ALL four, or when it gets to "undefined" it goes wild.)

The sweetness is a 1-2-3-4 power control. When way below set point, it runs full, of course. But as it gets close to set point, then cycles, it runs the heater at part-power. So instead of HOT!/cold every few minutes, it runs warm and generally pretty constant.

(The modulation will upset a fan-blown heater. There's a setting for that, but then it's just a mediocre standard time-thermostat.)

FWIW: it has held its clock through several power failures. It drifts, but not so you care for heat purpose; anyway you do have to re-set for Daylight Stupid Time twice a year and you can catch the drift at the same time.

And WOW! Mid-winter 2 years ago I paid $50 after seeing it for $60+. Now that the shine is off the product and spring is in sight, $36.

Agree that I have always thought of a PC (now Tablet) controlled 'stat, where everything was visible and hackable. But I could easily squander more seconds programming than a lifetime of getting-up and twisting a switch.

And FWIW: I am hoarding dumb old mercury thermostats for the day when our smart-toys get smart enough to revolt.
 
Yup, it looks like I could have saved myself some grief, but I got to stretch my brain a little, and it's fun to make a few cheap diodes and a micro into a full feature thermostat.

I am a little curious about the heat control on the old school in wall heater. It works remarkably well for a simple mechanical system. No doubt based on a bi-metallic arrangement (like your mercury thermo) but since the mechanism is built into the heater they had to do some engineering to add enough hysteresis for it to work and deliver a range of heat outputs.

I am seeing some things to improve in mine: I have no threshold hysteresis, and I saw it dither a few times at the once-per-second update rate. I may add 1/2 degree of hysteresis, or update less frequently than once every second, which will have a similar effect.
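A minimal sketch of that hysteresis in C, working in tenths of a degree; the names and scaling are illustrative, not the actual code:

/* Thermostat with ~1/2 degree of hysteresis to stop the output dithering
   around the setpoint. heater_on is later used to gate the triac drive. */
#define HYSTERESIS 5                 /* 0.5 °F, expressed in tenths */

static int heater_on = 0;            /* current drive state */

void update_thermostat(int temp_tenths, int target_tenths)
{
    if (!heater_on && temp_tenths <= target_tenths - HYSTERESIS)
        heater_on = 1;               /* below the band: turn heat on */
    else if (heater_on && temp_tenths >= target_tenths)
        heater_on = 0;               /* back up to target: turn heat off */
    /* otherwise hold the previous state (inside the hysteresis band) */
}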

While 4x a day seems too few, 4 times an hour is probably too many temp targets...

It will take a while to get this dialed in, and the small heater makes a lot more noise creaking and humming than the old in-wall unit did. At this point I'm just excited because it actually works.

JR

PS: This modern triac is much more painless than some of my earlier experiences using them. At Peavey I inherited one power amp on-switch design where a triac was used to handle the high inrush current, allowing a cheaper low-current power switch to trigger the gate of the higher-current triac. I had the bad fortune to run into a number of triacs that would arbitrarily refuse to switch on, months later, when the spirit didn't move them. When that occurred, the flame-proof resistor in their gate drive would over-dissipate and release its smoke (but not flame). The larger problem was that these amps were used in fixed installs, so any service call, no matter how trivial, made the job a profit loser. After going down a long and winding road, returning misbehaving triacs to the manufacturer to dissect, they were unable to explain the behavior: namely, drawing enough gate current to fry the gate resistor while still not switching the triac on. Since I couldn't tolerate that many field failures for that market, I bit the bullet and paid up for a heavier-current old-school mechanical power switch.

 
> heater makes a lot more noise creaking and humming

10 second updates are MORE than enough (yes, instant after manual change).

If error less than 1 degree, run 5 seconds on 5 seconds off. (Optional: if 2 deg low run 8 on 2 off, if 2 deg high....)

That may kill the creak once at setpoint.
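A minimal sketch of that time-proportioning idea in C; the duty numbers come from the text above, everything else (function names, the once-per-second tick) is illustrative:

#include <stdbool.h>

/* Slow time-proportioning: within each 10-second window, run the heater for
   a number of seconds set by how far below setpoint we are. */
static int on_seconds_for_error(int error_deg)   /* error = setpoint - actual */
{
    if (error_deg >= 2)  return 8;   /* 2 deg or more low: 8 s on / 2 s off */
    if (error_deg <= -2) return 0;   /* 2 deg or more high: stay off        */
    return 5;                        /* near setpoint: 5 s on / 5 s off     */
}

/* Call once per second; returns true if the heater should be on this second. */
bool heater_demand(int error_deg, int second_in_window /* 0..9 */)
{
    return second_in_window < on_seconds_for_error(error_deg);
}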

The hum.... get a different heater. My Sengoku SP-160 has a just-audible buzz if I put my ear right on the panel (which is not comfortable). My 240V baseboard fed 120V can just-barely be heard a foot away on a silent night.

Or of-course a FWB and a BFC. Though then you have to allow for the 120V-168V change.

Farm-store has 250W brooder lamps.
 
PRR said:
> heater makes a lot more noise creaking and humming

10 second updates are MORE than enough (yes, instant after manual change).

I added about a 1/2 degree of hysteresis... the 1 second dithering was only bothering me because I could see it. (I light an LED to show status)

If error less than 1 degree, run 5 seconds on 5 seconds off. (Optional: if 2 deg low run 8 on 2 off, if 2 deg high....)

That may kill the creak once at setpoint.
I can smack it once on the flat top and that quiets it down for a while. Can't do that while I'm asleep (but don't need to).
The hum.... get a different heater. My Sengoku SP-160 has a just-audible buzz if I put my ear right on the panel (which is not comfortable). My 240V baseboard fed 120V can just-barely be heard a foot away on a silent night.

Or of-course a FWB and a BFC. Though then you have to allow for the 120V-168V change.

Farm-store has 250W brooder lamps.
Even the electric blanket clicks, but you can get used to anything.

It's 9pm, so time to start ramping up the bedroom temps. With the heat completely off during the day it got colder than my temp scale, so 60°F or less.

JR
 
Fixed one software bug that was preventing the flash memory reads from working reliably, but the heater-thermostat was happy all night with 1/2 degree of hysteresis.

I did notice one disturbing thing... during the initial warm-up, with the heater running full on (7A?) to heat the room from 60°F or so, I noticed the old repurposed extension cord end was too hot to touch... Not a good thing. I cleaned that up with a cheap duplex outlet mounted into what was the speaker hole in some old proto metal.

I decided to dissect the several-decades-old extension cord, just to see why it was getting so hot. After cutting away a bunch of molded rubber, I found the actual metal contacting the plug was using maybe a 1/8 to 3/16 inch deep mating contact area. It didn't get "burst into flames" hot, but it was too hot to hold onto with bare hands. Scary.

JR

PS: Of course the $0.59 duplex outlet was made in China, but the $0.44 plastic cover plate was made in the good old US of A.
 
> extension cord end was too hot to touch...  Not a good thing.

That's how a LOT of fires start.

Not "too many loads".

Often not "one large load".

But an "acceptable" load over a long period of time in a low-low-lowest-bid connector.

Not aided by being "several decades old". Rust never sleeps; neither does copper-tarnish.

When I ran a 10A heater, I had to discard the factory plugs on heater and extender cord and replace them with heavy-duty; and replace those again every couple years. Ritual of handling them every few cold nights to stay aware of their condition.

Be aware of the wall-outlets. It has been common to back-stab instead of using screws. A good back-stab is a fine contact, but they exist only to save (labor) cost, so are never too good. They will sometimes serve well for decades, or burn inside the wall, possibly at much lower than rated current. I need to re-do the whole upstairs.
 
PRR said:
> extension cord end was too hot to touch...  Not a good thing.

That's how a LOT of fires start.
Yup, my bad... I figured a too-long, too-skinny line cord would just heat the room a little more evenly...  8) I didn't think I'd need to heat-sink a plug-in connection...  :eek: The wire didn't even get warm, but the whole risky business has been properly disposed of now.
Not "too many loads".

Often not "one large load".

But an "acceptable" load over a long period of time in a low-low-lowest-bid connector.
In hindsight I'm not sure that extension cord was ever rated for 7A continuous (my bad again).
Not aided by being "several decades old". Rust never sleeps; neither does copper-tarnish.
Actually after digging pretty deep, the too-small metal contacts I found were clean and shiny. It must have been protected from the environment by being so far from the openings.
When I ran a 10A heater, I had to discard the factory plugs on heater and extender cord and replace them with heavy-duty; and replace those again every couple years. Ritual of handling them every few cold nights to stay aware of their condition.
I recall replacing the heater on/off switch a few years ago when it devolved into an off-off switch.
Be aware of the wall-outlets. It has been common to back-stab instead of using screws. A good back-stab is a fine contact, but they exist only to save (labor) cost, so are never too good. They will sometimes serve well for decades, or burn inside the wall, possibly at much lower than rated current. I need to re-do the whole upstairs.
I am optimistic that my house predates aluminum wiring and back-stab outlets. The subject bedroom outlet is even a 3-prong grounded type; while I wouldn't bet money on the ground connection or proper hot/neutral polarity, it should handle 7A without melting.

It's after 9pm so the heater is on now...  ;D I programmed temperature steps to bring the room up to temp in stages. It was below 60°F (or below my approximately calibrated 60°F) earlier today. I can set thresholds above and below the temp display range, and I have a secondary 5-LED relative temp display (±2°F), so earlier I just dialed in the thermostat for 1°F colder than the room got down to.

This is like having a new toy to play with... 8) Realistically I should pick up a couple of those pretty programmable thermostats and rig them up to my in-wall heaters, but where's the fun in that?



JR
 
> I figured a little too long, too skinny, line cords would just heat the room a little more evenly..  I didn't think I'd need to heat sink a plug in connection...

The wire is rarely a problem. Connectors are.

Maybe you've lost the hindsight you must have gained working in the Big Amp factory? Or maybe these problems don't happen until after warranty. In the far field, we never see much burnt wire, often find bad contacts with heat stains.

I doubt you had less than #18. As you say, that won't get hot at 7A. And the connectors are sample tested at high current by UL. But reality is different.

Backstab and Al are different. If you have 3-hole outlets, it's certainly in the backstab era. You can poke #14 or #12 into a hole and a spring grips it. Kinda. These gave SO much trouble they were banned. And then the mass-housing lobby rose up and got a #14-only exemption.

The push-in connector *can* be good. There's a whole line of standalone push connectors which compete with WireNuts, and are supplied on Halo recess fixtures which are very fire-conscious. But that's not what you get in outlets.

There's also Back Wire, hole in the back, but then you must screw it. Just a captive screw-clamp instead of screw-wrap. Excellent connection.

But when you get the connector to the wire, then you face the crappy connector design. "Design" may be too strong a word. The origins of the 2-blade plug are murky. There were several competing products, rarely used at first, and NEC/UL declined to get involved picking a Standard. When appliances were few, OK, but the market exploded. And price pressure cheapened the sockets and plugs. There IS a UL standard, hanging a weight on the cord. IMHO most connectors fail such a test after a few insertions. You can change a plug (who does?) but nobody changes wall-outlets until they HAVE to.

Extension cord outlets are often worse.

"Electrical Fires" happen a lot. CPSC asked for suggestions. The industry bonded behind "Arc Fault". Though it is not clear that arcing causes many fires. Or how you could tell a steady connector arc from a vacuum cleaner. Or that the AFCI standard we got is actually effective against anything except an artificial UL test.

The more likely fault is "glowing contact". With poor pressure and the least oxide, plus a few jiggles, our plugs are likely to make a microscopic contact. The current density, and oxides, can become a hot negative resistance. Because the load is positive resistance, this is stable. And the glowing contact can dissipate near constant power over a range like 1A to 10A. Since it is thermal, you can't see it in the load waveshape (what AFCIs monitor). Some AFCIs also have GFI action, and it turns out that some thermoplastics leak more when hot, and *this* may break the glowing contact. But by the time this happens the socket is all out of shape. And it's the wrong way to detect heat. And the better outlets use a thermoset which doesn't break-down and leak, but crumbles instead.

You don't see such problems in rationally designed connectors. The Schuko is better, perhaps with a little hindsight from US plugs. The BS(UK) connector grips much better. (Also generally lower current and enough voltage to punch oxide.)

 
PRR said:
> I figured a little too long, too skinny, line cords would just heat the room a little more evenly..  I didn't think I'd need to heat sink a plug in connection...

The wire is rarely a problem. Connectors are.

Maybe you've lost the hindsight you must have gained working in the Big Amp factory? Or maybe these problems don't happen until after warranty. In the far field, we never see much burnt wire, often find bad contacts with heat stains.
Perhaps I've been lucky to avoid experiencing many seriously bad mains contacts.

I did experience a miswired extension cord (borrowed from a neighbor?) with hot and ground reversed... Yup, no ground fault interrupter to prevent me getting a serious shock out in my wet yard. (The shock could have been worse. I was putting a sump pump into a water-filled hole. I generally respect mains electricity, so I wasn't in the hole, but my feet were wet enough to get shocked.)  :p

My contact/current experience in amplifiers was mostly things like trying to push a couple of kW out of a 1/4" T-S speaker jack. Special 1/4" jacks with two tip contacts and higher contact force make that work (mostly). Some out-of-tolerance offshore speaker plugs could get hot, but I don't know of any fires.

Also dealt with line cord/outlet limits in the context of how much amp power UL would let us get from a basic line cord/outlet. 
I doubt you had less than #18. As you say, that won't get hot at 7A. And the connectors are sample tested at high current by UL. But reality is different.

Backstab and Al are different.
indeed
If you have 3-hole outlets, it's certainly in the backstab era.
Thanks, I was guessing, since grounded outlets have been around for a while.

I have seen back-stab when replacing light switches and ignored it; when I replaced the ballast in my fluorescent fixture (with a higher-efficiency version) that was back-stab only, and the fixture for my DIY UV-C lamps was stab-only too.
You can poke #14 or #12 into a hole and a spring grips it. Kinda. These gave SO much trouble they were banned. And then the mass-housing lobby rose up and got a #14-only exemption.

The push-in connector *can* be good. There's a whole line of standalone push connectors which compete with WireNuts, and are supplied on Halo recess fixtures which are very fire-conscious. But that's not what you get in outlets.

There's also Back Wire, hole in the back, but then you must screw it. Just a captive screw-clamp instead of screw-wrap. Excellent connection.
I think I've seen that but don't recall where, perhaps not mains wiring.
But when you get the connector to the wire, then you face the crappy connector design. "Design" may be too strong a word. The origins of the 2-blade plug are murky. There were several competing products, rarely used at first, and NEC/UL declined to get involved picking a Standard. When appliances were few, OK, but the market exploded. And price pressure cheapened the sockets and plugs. There IS a UL standard, hanging a weight on the cord. IMHO most connectors fail such a test after a few insertions. You can change a plug (who does?) but nobody changes wall-outlets until they HAVE to.
Yup, I have only dealt with this in passing, in the context of how much current. FWIW, outlet and plug current ratings are continuous, and many loads are not (well, my heater and incandescent light bulbs may be continuous, but amps and the like are not). As modern high-current consumer products move to use more power factor correction, the load current is more evenly spread out, more average. That's a good thing for high-current applications because there is less voltage drop due to wiring. Back in the early days of high-power audio amps, UL limited amp size as if amps ran at full rated power 24x7, but they shifted gears back in the early '90s to allow higher power from Edison outlets.
Extension cord outlets are often worse.

"Electrical Fires" happen a lot. CPSC asked for suggestions. The industry bonded behind "Arc Fault". Though it is not clear that arcing causes many fires. Or how you could tell a steady connector arc from a vacuum cleaner. Or that the AFCI standard we got is actually effective against anything except an artificial UL test.
I have mixed feelings about UL, but I warmed up to them a little more when they stood with us in court when Peavey got sued because some muso got killed by a miswired hot outlet ground while playing two Peavey guitar amps. Our products were correct, the outlet was the killer, but it was appreciated that UL had our back in court.
The more likely fault is "glowing contact". With poor pressure and the least oxide, plus a few jiggles, our plugs are likely to make a microscopic contact. The current density, and oxides, can become a hot negative resistance. Because the load is positive resistance, this is stable. And the glowing contact can dissipate near constant power over a range like 1A to 10A. Since it is thermal, you can't see it in the load waveshape (what AFCIs monitor). Some AFCIs also have GFI action, and it turns out that some thermoplastics leak more when hot, and *this* may break the glowing contact. But by the time this happens the socket is all out of shape. And it's the wrong way to detect heat. And the better outlets use a thermoset which doesn't break-down and leak, but crumbles instead.
I've experienced the occasional warm plug.. Again a little surprising they work as well as they do.
You don't see such problems in rationally designed connectors. The Schuko is better, perhaps with a little hindsight from US plugs. The BS(UK) connector grips much better. (Also generally lower current and enough voltage to punch oxide.)

Not to change the subject back to my heater, but last night it worked as designed and programmed. I need to tweak the temperature targets. It was too warm after I was in bed, and I need more time to ramp up from 60°F to a target of 66°F or so with only a small 800W heater. It's more than enough to maintain a constant temp, but it struggles to heat up a cold room.

I will try to resist the temptation to apply windage. I know what the room temp is in real time, and know when and what I want the temp to be in the future, so I could do a simple approximation for when to start heating adaptively based on current temps.

I am trying to talk myself out of doing this (I need more data), but it would also track with the seasonal changes outside, since the cold room doesn't get as cold. The heat required to change the room temp will be roughly linear, but the heat to maintain temp will vary with the delta between inside and outside temps. Of course the software doesn't even need to be linear; I can just do a stepwise approximation, and even a crude linear approximation is better than nothing. The trade-off for algorithm errors is that the room heats to target temp perhaps a little too soon, vs. not getting warm enough until later, so it's an easy choice. If I had more heat power available, I could make that another adaptive variable, but I observed that it took hours to get up to temp last night, so I don't have too much heat power (only 800W). Another thought that might be simpler to program: just learn how long it takes to heat up, and use that data the next night. I could even measure the on vs. off time after it's at temp, to literally measure steady-state heat loss (which, for a given start temp, imputes the outdoor temp). Interesting... somebody stop me...  8)

For a sophisticated full-house heat pump controller, I am tempted to detect outside temp too, so I could grab and bank a little more heat while the outdoor temp is higher and the heat pump is most efficient, but this is contrary to optimal interior comfort. The in-wall heat pump unit already seems to heat less as outdoor temps rise, but this works OK for general comfort. I don't need to detect outdoor temp for the small bedroom application; I can impute outdoor temp roughly from how cold the room gets with the heater off.

JR

[edit- well nobody stopped me...  I figure I can do this simpler. I can start with a guess at the capability to heat the room in degrees/time, and use this to work backwards to start heating the room early. Then I can literally measure how much the room heats up vs. the actual true time the heater is on... This way I can update the actual heating rate on the fly, say every 15 minutes. If I miss and don't reach the temp target in time on a given night, the next night I probably won't miss, as it learns the capability over time and can adjust the rate slowly as the seasons change. Damn, I've got real work I should be doing, but this is more fun.  :-*  [/edit]
 
I was gonna say: I've seen smart-stats that guess a 15 minute warm-up, note if they fall short at the appointed time/temp, and try a 20 minute pre-heat the next night. (That's more for 20KW house-burners than for a 0.8KW room warmer; an hour or so may be correct.)

How much accuracy is justifiable? Say that after a solid 800W pre-heat, it holds with 200W. And say it hits set-point an hour early. 200WH is three cents on the electric bill, 3 bucks over a southern heat-season. If your programming is only worth a buck an hour, go for it, but 10-minute anticipation error is silly. If you can bill $50/hr, forget it, burn the juice.

I just realized that I don't try for better than 30 minute "error" on my programmable. I'm erratic, and crampy, and another penny of juice is better than a cold bed.

Reading outside temps is an old-old optimization. You sometimes see the can and capillary tube on old buildings. Usually to increase heat when temperatures fall quickly; counter-act thermal inertia of boiler and building. The idea to pump more heat when it is NOT cold is thought-provoking.
 
PRR said:
I was gonna say: I've seen smart-stats that guess a 15 minute warm-up, note if they fall short at the appointed time/temp, and try a 20 minute pre-heat the next night. (That's more for 20KW house-burners than for a 0.8KW room warmer; an hour or so may be correct.)

How much accuracy is justifiable? Say that after a solid 800W pre-heat, it holds with 200W. And say it hits set-point an hour early. 200WH is three cents on the electric bill, 3 bucks over a southern heat-season. If your programming is only worth a buck an hour, go for it, but 10-minute anticipation error is silly. If you can bill $50/hr, forget it, burn the juice.

I just realized that I don't try for better than 30 minute "error" on my programmable. I'm erratic, and crampy, and another penny of juice is better than a cold bed.

Reading outside temps is an old-old optimization. You sometimes see the can and capillary tube on old buildings. Usually to increase heat when temperatures fall quickly; counter-act thermal inertia of boiler and building. The idea to pump more heat when it is NOT cold is thought-provoking.

I couldn't help myself and this seems so well suited to a micro.

Since I am running all the time I can learn what I need to know on the fly. 

The magic data that isn't obvious is how much I need to anticipate. Whenever the heat is actually on for a significant solid block of time, I can literally note the starting temp and the ending temp and divide the difference by the on time to calculate my ability to raise the room temperature.

Once I have this magic number, I can look forward to the next higher temperature target, then work backwards from there with my known ability to heat up the room per unit time, and arrive at a temp target for the current time that will get me to the final target in time.

To be conservative I can reduce the measured ability to heat up the room by 25% so I should always get there a little early. By updating the rate of heating as I go, even if I start too fast or too slow I can adjust.
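A minimal sketch of that anticipation logic in C. The 25% derating and the idea of learning from solid on-time blocks come from the description above; the names, the fixed-point scaling, and the 15-minute minimum block are illustrative assumptions:

/* Adaptive warm-up anticipation. Temperatures in tenths of °F, times in
   minutes; everything named here is illustrative, not the actual firmware. */
static long heat_rate = 20;   /* learned warm-up rate, tenths of °F per hour */

/* Update the learned rate from a solid block of heater-on time. */
void learn_heat_rate(int start_tenths, int end_tenths, int on_minutes)
{
    if (on_minutes >= 15 && end_tenths > start_tenths) {
        long new_rate = ((long)(end_tenths - start_tenths) * 60) / on_minutes;
        heat_rate = (heat_rate + new_rate) / 2;   /* gentle running average */
    }
}

/* How many minutes before the scheduled time should heating start?
   Derate the learned rate by 25% so we tend to arrive a little early. */
int anticipation_minutes(int current_tenths, int target_tenths)
{
    long derated = (heat_rate * 3) / 4;           /* 75% of learned rate */
    if (derated < 1 || target_tenths <= current_tenths)
        return 0;
    return (int)(((long)(target_tenths - current_tenths) * 60) / derated);
}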

=====
Conversely, if the next temp target is lower than the present one, I want to program in a straight-line decay, where the target temp drops an incremental fraction of the distance each interval to arrive at the terminal temp. Of course, if the room cools more slowly than the target, it doesn't much matter.

Now instead of programming in a bunch of different changes, I tell it when I want it to be how warm, and when I want it to be cool again; if I want some intermediate slower cooling I can program that in too, so it slopes down slowly between warm and less warm, then slopes down faster to cool again.
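A minimal sketch of that straight-line segment between two scheduled setpoints, in C; names and the tenths-of-a-degree scaling are illustrative:

/* Straight-line interpolation of the target temperature between two
   scheduled setpoints, evaluated per 15-minute slot. */
int interp_target(int temp_from,   /* setpoint at the start of the segment   */
                  int temp_to,     /* setpoint at the end of the segment     */
                  int slot,        /* 15-minute slots elapsed in the segment */
                  int slots_total) /* total 15-minute slots in the segment   */
{
    if (slots_total <= 0 || slot >= slots_total)
        return temp_to;
    return temp_from + (int)(((long)(temp_to - temp_from) * slot) / slots_total);
}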

Now all I have to do is finish coding it; it's easier to conceptualize than to grunt it out, but I'm more than half done. I have all the temp targeting done; now I need to measure the heat rise and average that over several samples.

It will get smarter as it runs, but I will make the adaptive parts relatively quick, since heat rise depends somewhat on outdoor temps too. Too long an averaging time means it will always be lagging reality. I will probably ignore temp-rise data if the heater isn't on for at least 80-90% of the time interval, and perhaps ignore it if the actual temp isn't a couple of degrees below target.

JR
 
 
> program in a straight line decay

Will work, but of course the curve is more exponential. No matter how lame the CPU is, you have plenty of CPU cycles to do exponential approximation. You need never do any real work again, with a project like this.

Happy hacking!
 
PRR said:
> program in a straight line decay

Will work, but of course the curve is more exponential. No matter how lame the CPU is, you have plenty of CPU cycles to do exponential approximation. You need never do any real work again, with a project like this.

Happy hacking!
Yup, a 16-bit, 20 MIPS micro is enough computing power to make this do anything I can imagine for it...

I like the straight line for decay because I can't imagine how or why curving it would be better. I can program in multiple straight-line segments at 15-minute intervals, so I can make it pretty much whatever I want, whenever I want, while actively heating the room.

For predicting how much lead I need to get up to temp in time from cold, without wasting energy by heating up too soon, or missing and not getting warm soon enough, I have been thinking about this a little wrong: trying to capture the data on the fly as a background task. I have so much processor power, and I know what I want to end up with, so I can use a slightly more complicated algorithm to predict with more certainty as a primary task.

I can historically remember my worst case, so I know pretty much when to start sampling. I start off by probing: directly measuring how long it takes to heat up the room today by x degrees. Then I can see how much it cools off in another fixed time period, or what duty cycle holds it at that temp, which tells me the heat loss. With this data, perhaps hours before I need to end up at temp, I can more precisely calculate when I need to turn up the heat today to get where I want to end up. I can even repeat the probe to fine-tune my estimate.

I can make this more than adequately precise at hitting temp targets going up. Going back down, I don't much care if it misses.

I noticed a weird bug where, after switching between the time and temp displays a few times, it got stupid and jumped to the wrong time, but I don't see what caused it yet. I need more evidence to figure it out, or to stop pushing buttons in the middle of the night.

The weather report says 20s tonight, so it should be a good test. Last night was more comfortable than the night before, and tonight should be better. It's already ramping up to temp using the previous algorithm, so not as smart as it will finally be, but on the job and working.

I need for spring to hurry up and spring so I can stop messing with this and turn off the heat.

JR 


 
After fixing a few obscure bugs, like one occurring exactly at midnight where it would grab some bogus, way-too-hot temperature target, it has settled down and is behaving as programmed. I noticed that the outlet I added to my Rube Goldberg thermostat runs ice cold in use, while the plug going into my wall outlet was getting hot.

I bought a new outlet to replace that sometime, but after unplugging and re-plugging it a few times the temperature of the plug cooled off.

Since spring is breaking out, I am almost ready to stick a fork in this until next winter.

JR
 
Well it's another winter heating season so I revisited my Rube Goldberg cybernetic heater.

My biggest complaint about it was the old heater shaking and making a rattling noise as the wires vibrated when first heating up. Not very loud, but loud enough in a quiet bedroom in the middle of the night. I decided to try duty-cycling the amount of power I fed the heater, so it only turns on at full strength when first heating up the cold room, or perhaps during the next ice age.

I didn't want to generate massive amounts of RF, or test how inductive my heater wires are, so I limited myself to switching only at zero crossings. I found that using too many cycles (actually half-cycles) as a PWM time base led to audible pulses of energy. I finally compromised on a 5-half-cycle PWM time base. I selected the odd number for the time base so any DC content from an odd number of on half-cycles would alternate polarity every other period and cancel out. The 1/5, 2/5... power-level resolution was fine enough.
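A minimal sketch of that zero-crossing burst control in C; the function name and the 0-5 power scale are illustrative, not the actual firmware:

#include <stdbool.h>

/* Burst control: within each window of 5 mains half-cycles, fire the triac
   for the first N half-cycles (N = 0..5). The odd window length means any
   net DC from an odd on-count alternates polarity from one window to the
   next and averages out. */
#define WINDOW_HALF_CYCLES 5

/* Call once per zero crossing; returns true if the opto-triac should be
   fired for this half-cycle. power_level is 0..5 (fifths of full power). */
bool fire_this_half_cycle(unsigned power_level)
{
    static unsigned half_cycle = 0;        /* position within the window */
    bool fire = (half_cycle < power_level);
    if (++half_cycle >= WINDOW_HALF_CYCLES)
        half_cycle = 0;
    return fire;
}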

The plan mostly worked: the noise level dropped proportionately, so it's typically 1/5 or 2/5 as loud as before. Strangely similar sounding, just less loud.

This week I decided to investigate how a newer heater would act. The cheapest auxiliary heater I could find at Walmart (for $19.95) was more than 2x the power and had an internal fan and a thermostat. Actually pretty remarkable for the price.

I don't remember how big of a triac I used last year when I made this but apparently large enough. Plugging the new heater in worked pretty much as expected, but now at full heat there was fan noise, and at either 1/5 or 2/5 power, the fan didn't like being hit with pulses and made an audible thunk thunk thunk, but much faster (say around 30 Hz).

So I took it apart, and I just happened to have two 75-ohm power resistors back in my lab. So 150 ohms in series with the fan quieted down the thunk-thunk-thunk and reduced the fan speed at max heat to a moderate level. The fan actually starts and turns slowly at 1/5 power, so it's all good, and quiet.

If anything, it looks like I need to remove a bunch of the extra software I wrote so the heater would anticipate future temperature changes. I was apprehensive that the small heater could not keep up with temperature changes, but it turns out the older 650W heater could easily overheat the room, and now, with more than twice that heat available, I can stop worrying about keeping up.

In hindsight, the 15-minute resolution for time-of-day temperature changes is finer than necessary. Changing thermostat settings on the hour is adequate.

Now it's quiet and I rarely see this new heater running on higher than 2/5, and that 2/5 may be related to the threshold hysteresis I have coded in so it doesn't dither around when close to temperature.

Bring on the cold..... I'm ready

JR
 
> heater shaking and making a rattling noise as the wires vibrated from first heating up.
>  I decided to try duty cycling


PRR said:
I must re-endorse the Honeywell cyber-thermostat.

Especially for older electric baseboard which creaks.

...as it gets close to set point, then cycles, it runs the heater at part-power.

Oh.... and I'm running 240V baseboard at 120V. 4 feet of baseboard costs a lot more than you paid, and at 120V outputs a lot less, so may not be for you.

I'm uncertain of your fan hack. It seems that if the triac sticks ON and the heater goes to the full 1500 watts, it needs "full fan" to stay safe.
 
