I read in a book that when a circuit has an AC voltage source as
the input, then its input resistance is
R_i = u_i / i_i,
where u_i is the instantaneous input voltage and i_i is the
instantaneous input current.
Isn't this false?
It's not a definition that I've seen. It doesn't strike me as a good
definition in general because the math would get a little dicey at the
zero crossings. Otherwise I guess it would be basically correct for a
resistive network, even if resistances changed with time.
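To make the zero-crossing trouble concrete, here's a quick numerical
sketch (Python, my own illustration with arbitrary component values):
for a pure resistor the instantaneous ratio u(t)/i(t) is constant, but
for a capacitor it varies with time and grows without bound near the
current's zero crossings.

    import numpy as np

    f = 60.0                       # drive frequency in Hz (arbitrary)
    w = 2 * np.pi * f              # angular frequency, rad/s
    t = np.arange(1, 8) / (8 * f)  # samples inside one period (skip t = 0)
    u = np.sin(w * t)              # input voltage, 1 V amplitude

    # Pure resistor: i = u/R, so the instantaneous ratio is constant.
    R = 100.0                      # ohms
    i_R = u / R
    print(u / i_R)                 # 100.0 at every sample

    # Pure capacitor: i = C du/dt, so the ratio is tan(wt)/(wC); it
    # varies with time and blows up where the current crosses zero.
    C = 1e-6                       # farads
    i_C = C * w * np.cos(w * t)
    print(u / i_C)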
--------------------
The input resistance implied is not that of the source but that of the
circuit being supplied, as seen by the source, and it is independent of
the source.
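As a concrete illustration (my own example, with made-up resistor
values), the resistance seen looking into a series resistor feeding a
loaded divider depends only on the network, not on whatever source
drives it:

    # Resistance seen by the source: R1 in series with (R2 || R_load).
    # All values hypothetical.
    def parallel(a, b):
        return a * b / (a + b)

    R1, R2, R_load = 1000.0, 2200.0, 470.0   # ohms
    R_in = R1 + parallel(R2, R_load)
    print(f"R_in = {R_in:.1f} ohm")          # ~1387.3 ohm; no source term anywhere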
In any case, an ideal voltage source is a convenient fiction, useful in
circuit analysis but hard to find in the real world (though with fast
enough feedback one can come close).
As for the definition: it simply notes that v(t) = R*i(t), rather than
v(t) = L*(di(t)/dt) or i(t) = C*(dv(t)/dt).
That is correct.
If R is constant over the range of possible v(t) and i(t), then be
happy: Ohm's Law is valid and life is simple(?)
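For example (a hypothetical check of my own, not anything from the
book), one can test whether a sampled v(t), i(t) record is consistent
with a constant R by a least-squares fit:

    import numpy as np

    # Fit v = R*i through the origin by least squares, inspect the residual.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1 / 60, 200)
    i = 0.5 * np.sin(2 * np.pi * 60 * t)           # sampled current, A
    v = 100.0 * i + rng.normal(0.0, 0.01, t.size)  # resistive port plus noise, V

    R_fit = np.dot(v, i) / np.dot(i, i)            # least-squares slope
    rms = (v - R_fit * i).std()
    print(f"R = {R_fit:.2f} ohm, rms residual = {rms:.4f} V")
    # A small residual means a constant R (Ohm's Law) describes the port;
    # a reactive port would leave a large quadrature (90-degree) residual.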
--
Don Kelly snipped-for-privacy@shawcross.ca
remove the X to answer