Hi,
I'm hoping someone can point out problems with my thinking here:
I'm working on a microcontroller project (5v PIC), and I'm going to add a low-voltage detector using the built-in ADC.
What I want to do is monitor the voltage of the battery powering the circuit (9v), and when it drops below a threshold (for example 8.3v), notify myself somehow (a blinking led).
I'm also trying to conserve current when the circuit isn't in use by putting the voltage regulator into standby mode (it supposedly draws 1uA in sleep).
So the first idea for the voltage detector is to simply drop the voltage from the 9v source with a voltage divider (ballpark: 1Mohm SMD resistors). Feed that into the PIC and bam, I can relate the ADC input voltage to a real-world battery voltage.
              |------|
    9v---+----| Vreg |----
         |    |------|
         >
         < r1
         >
         |    |-----------|    |---|
         +----| ADC (pic) |----|led|
         |    |-----------|    |---|
         >
         < r2
         >
         |
        gnd

Then I realized that because the resistors bypass the sleep mode on the voltage regulator, the divider is always going to drain current (roughly 9uA if the divider totals 1Mohm, or 4.5uA with two 1Mohm resistors in series).
Then I was thinking maybe I could add a p-channel mosfet in line with the
9v battery and R1. As long as I can maintain the bias on the gate, this is a more current-friendly solution (perhaps use a depletion-mode p-channel, I think, with the gate pulled up to 9v through a resistor). Does this seem like a bad idea?
Thanks for any insights, Andrew (to email me, remove all the underscores in my email address)