At first glance this control system would seem even more primitive than a conventional thermostat. It works only if the initial error has the right sign, because it cuts on only at the moment the output reaches the reference temp, i.e., crosses it from below (or above).
For example, in the morning you set the AC to 80 degrees when the room was 75 degrees. At noon the temperature reaches 80 degrees and the AC automatically cuts in as it should. The room cools down a degree or two and then the AC shuts off as it should. The room warms up again and the AC cuts on again.
The controller seems to be working.
If you waited until the afternoon, however, when it was already 85 degrees in the room, and then set the AC to 80 degrees, the AC would never cut on: the controller cannot tell whether the room is above or below the reference temp, and the temperature never crosses 80 from below.
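The failure mode can be sketched in a few lines. This is a toy simulation (the dynamics, rates, and function name are all made up for illustration): the controller only reacts to an upward crossing of the setpoint, so starting above it means the AC never engages.

```python
def run(start_temp, setpoint, steps=100):
    """Crossing-only controller: the AC engages only when the room
    temperature *reaches* the setpoint from below. It has no notion
    of the sign of the error. Toy dynamics: room warms 0.5 deg/step
    from heat load, AC cools 1 deg/step."""
    temp = start_temp
    prev = temp
    ac_on = False
    for _ in range(steps):
        # AC cuts in only at the instant temp crosses up through the setpoint
        if not ac_on and prev < setpoint <= temp:
            ac_on = True
        # AC cuts out once the room has cooled a couple degrees
        if ac_on and temp <= setpoint - 2:
            ac_on = False
        prev = temp
        temp += -1.0 if ac_on else 0.5
    return ac_on, temp

# Morning case: start below setpoint -> controller cycles around 78-80.
# Afternoon case: start at 85 with setpoint 80 -> no upward crossing ever
# occurs, the AC never runs, and the room just keeps warming.
```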
One less than satisfying solution might be for the controller to hunt for the output. It could only do this by _changing_ the output until it reaches the reference. In the thermostat example the controller would wait a set amount of time, and if the output hasn't reached the reference point, turn on the AC for x minutes. If that doesn't find the output, it shuts the AC off and runs the heat pump for 2x minutes. If that doesn't work, back to the AC for 4x minutes, and so on.
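The hunting scheme above amounts to a doubling search when the sign of the error is unknown. A minimal sketch, again with made-up names and degree-per-minute rates:

```python
def hunt(start_temp, setpoint, x=1, cool_rate=1.0, heat_rate=1.0,
         max_rounds=8):
    """Doubling search for the setpoint: run the AC for x minutes;
    if the output never reaches the reference, switch to the heat
    pump for 2x minutes, back to the AC for 4x, and so on. The
    overshoot grows each round, so one direction eventually sweeps
    through the setpoint regardless of the initial error's sign."""
    temp = start_temp
    duration = x
    use_ac = True
    for _ in range(max_rounds):
        for _ in range(duration):
            temp += -cool_rate if use_ac else heat_rate
            if abs(temp - setpoint) < 0.5:  # close enough: found it
                return temp
        use_ac = not use_ac   # alternate AC and heat pump
        duration *= 2         # x, 2x, 4x, 8x, ...
    return temp
```

With these rates the search converges whether the room starts above or below the setpoint, but the detours get longer each round, which is exactly why it's a less than satisfying answer for a real thermostat.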
Obviously this isn't going to revolutionize the thermostatic controls industry, but has anyone heard of any solutions that can end-run sign determination yet still be reliable?
Bret Cahill