Recently a question came up on my group's project.
We're using a temp sensor chip with a max output current in the micro-amp range, and we're sending its output down a long run of wire (probably 20 AWG).
I suggested a unity-gain buffer, because the long wire will pick up noise, but more importantly because the chip can't drive a signal through that much wire.
Unfortunately, I only know that a buffer should be used; I don't know how to calculate (in theory) how long the wire can be before the chip can't drive it.
If the wire is 2 Ω per 100 ft and the chip has a 100 µA max output current, is the voltage drop across the wire 200 µV (2 Ω × 100 µA)?
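
To check my arithmetic, here's the quick Ohm's-law sanity check I've been doing (values assumed from above; I'm also not sure whether I need to count the return wire, which would double the loop resistance):

```python
# Quick Ohm's-law sanity check with the numbers from the question.
i_out = 100e-6        # chip's max output current, 100 uA (assumed)
r_per_100ft = 2.0     # assumed wire resistance, ohms per 100 ft
length_ft = 100       # one-way run length

r_wire = r_per_100ft * (length_ft / 100)  # one-way wire resistance
r_loop = 2 * r_wire                       # signal + return conductor

print(f"one-way drop:    {i_out * r_wire * 1e6:.0f} uV")  # 200 uV
print(f"round-trip drop: {i_out * r_loop * 1e6:.0f} uV")  # 400 uV
```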
I found a chart of voltage drop per 100 ft of wire, but it only listed specific current values. Is there a better chart, or an explanation of how to calculate everything?
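
In case it helps show where I'm at, here's my attempt at computing the wire resistance directly from the standard AWG geometry (diameter in mils = 5 × 92^((36−n)/39), annealed copper at roughly 10.4 Ω·cmil/ft at 20 °C) instead of reading it off a chart; the function names are just mine:

```python
def awg_ohms_per_1000ft(gauge: int) -> float:
    """Approximate DC resistance of solid copper wire at 20 C."""
    d_mils = 5 * 92 ** ((36 - gauge) / 39)  # standard AWG diameter formula
    area_cmil = d_mils ** 2                 # circular mils = (mils)^2
    return 10.4 / area_cmil * 1000          # ~10.4 ohm-cmil/ft for copper

def drop_volts(gauge: int, length_ft: float, current_a: float) -> float:
    """Round-trip voltage drop across a run of length_ft feet."""
    r_loop = 2 * length_ft * awg_ohms_per_1000ft(gauge) / 1000
    return current_a * r_loop

print(awg_ohms_per_1000ft(20))      # ~10.2 ohm per 1000 ft of 20 AWG
print(drop_volts(20, 100, 100e-6))  # ~0.0002 V (200 uV) at 100 ft, 100 uA
```

That seems to roughly match the 2 Ω / 100 ft figure I was using, but I'd appreciate a sanity check on the approach.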
Thanks in advance!