Question re: AC/DC Adapters?

Hi, I have a Linksys ADSL Internet gateway whose AC/DC adapter decided to die. The old adapter was a 12V, 1.0A device.

I bought a multi-voltage adapter from a shop and set it to 12V. According to its specs it gives out 0.9A. Would this damage the gateway? It is working absolutely fine, I'm just worried it could shorten its lifespan.

In a similar way I have an external USB hub that has the option to be powered. On the back it states: Input Voltage range: 5V, 2500mA MAX.

Is it the case with these DC adapters that, so long as the current is /LESS/ than the amount stated as required, and so long as the device works, it's fine to use a lower-powered adapter?

Mark.

Reply to
markus4412

The old adapter is rated to deliver up to 1 ampere of current. The Linksys device may not need that much.

Sounds like you have been successful; i.e., 0.9 amperes is enough to power it.

Actually it's the other way around: the power supply should have a current rating at least as large as the load requires. Your USB hub should use a power supply rated at 5 V, 2500 mA or higher current.

Reply to
Roby

In all your devices, the current is a dependent variable and the voltage is an independent variable (in your case, set at 12 V).

In other words, for a given voltage, the current depends on how much the device draws, not how big the power supply is.

That being said, for a given voltage, you don't want to oversize the current capacity of the power supply too much. It should pretty much match, or be slightly over, the maximum current draw of the device.

Why?

  1. Properly sized supplies are more efficient and less wasteful of power.

  2. Let's say you have a 50A, 12V supply for a device that takes 12V at just 50 mA. The device would work fine when connected to the power supply, but it could be dangerous if there was an internal short (or you accidentally shorted out the power leads). What you have then is an arc welder with a big fat hot spark and a likely meltdown.

This explains why big power supplies are sub-divided with fuses and circuit breakers, just like the AC panel in your house.
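The sizing advice above can be sketched as a quick check. This is a hypothetical Python snippet; the 2x headroom figure is an assumed rule-of-thumb for illustration, not a number from the thread:

```python
# Hypothetical sizing check: pick a supply whose current rating
# meets or modestly exceeds the device's maximum draw.
def supply_ok(device_max_a, supply_rated_a, headroom=2.0):
    """True if the supply covers the load without being wildly
    oversized (headroom is an assumed 2x guideline)."""
    return device_max_a <= supply_rated_a <= device_max_a * headroom

print(supply_ok(1.0, 1.2))    # 12 V gateway drawing up to 1 A -> True
print(supply_ok(0.05, 50.0))  # 50 mA device on a 50 A supply -> False
```

The second case is Beachcomber's arc-welder scenario: the supply works, but fails the sanity check.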

Beachcomber

Reply to
Beachcomber

Rule of thumb: if the supply gets hot in operation, it isn't big enough.

the "MAX" refers to the maximum load of the device.. the power supply rating should be larger.

Reply to
Tim Perry

So why does it say 5v, 2500mA MAX, as in maximum input voltage? Or am I reading it incorrectly?

Reply to
markus4412

2500 mA MAX as in output current
Reply to
Roby

MAX is the maximum input *current*. It tells you that the hub will not draw more than 2500 mA from the supply, regardless of how many devices it supports. The more devices working from the hub, the more current is drawn, up to a maximum of 2500 mA. The input voltage is fixed at 5 volts.
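A minimal sketch of that idea, with assumed per-device draws (the numbers are hypothetical, not from the hub's spec):

```python
# The hub's input current is roughly the sum of what its downstream
# devices draw, and the label promises it stays under 2500 mA.
loads_ma = [100, 480, 500]   # hypothetical draws: mouse, drive, phone
total_ma = sum(loads_ma)

print(total_ma, total_ma <= 2500)  # 1080 True
```

Add more devices and the total creeps up, but by the rating it never exceeds 2500 mA.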

Ed

Reply to
ehsjr

Ahh, it all makes sense now. So as long as I use a 5 volt adapter, it really wouldn't matter even if it supplied a 5 amp current, since the hub will only draw up to 2.5 amps, and excess amperage won't be a problem? (So long as I don't connect it up to something supplying 50 amps, which could then cause problems if the hub DID have a short circuit or something.)
Reply to
markus4412

Yes. Even though you are talking about AC adapters, you can still apply Ohm's law here.

E (voltage) = I (current) * R (resistance)

or specific to this case

I = E/R

In the case of your load, the resistance (or impedance) is fixed, no matter how big or small the current-supplying capacity of your 5 volt adapter.

The current draw will always be the voltage divided by the resistance.

There is normally more to it with AC circuits, but this is a very basic example.
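The calculation above can be worked through numerically. The 2-ohm load value below is an assumed figure chosen so the numbers match the hub's rating:

```python
# Ohm's law: the load's resistance fixes the current draw,
# regardless of how much current the supply *could* deliver.
def current_draw(volts, ohms):
    """I = E / R"""
    return volts / ohms

# Assumed example: a 5 V load that looks like 2 ohms draws 2.5 A,
# whether the adapter is rated for 2.5 A or 50 A.
print(current_draw(5.0, 2.0))  # 2.5
```

The adapter's rating never appears in the formula; only the voltage and the load do.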

Beachcomber

Reply to
Beachcomber

Exactly. As an example, the battery in your car is capable of supplying hundreds of amps, yet the little lights in things like your radio dial, speedometer, gas gauge - whatever gets illuminated on the dashboard, use way below 1 amp. If your car radio has a digital clock in it, it may use less than one thousandth of an amp (1 mA), but is not harmed by the car battery.

Ed

Reply to
ehsjr

On the other hand, it could well mean that the maximum allowable load is 2500 mA measured at the input, so that the number of devices is limited by this. If too many devices are being used, the input current could be over 2500 mA, exceeding the rating. Don't expect it to supply a 5A load (at 5V) and still have an input of 2.5A.
Reply to
Don Kelly

That would be a completely bass-ackward and meaningless input rating. If rated that way, it would in effect say "don't load the hub so much that it draws more than 2500 mA". Where have you seen it used like that?

Ed

Reply to
ehsjr

I haven't checked the ratings on any hubs and may not be thinking of the same kind that you are. If they are USB hubs, then the input current will be the sum of the load currents. If not, and there is some control in the device, then the input current will likely exceed the sum of load currents. In either case, the output is limited.

Yes, I am implying "don't load the hub to the point where the input is greater than 2500 mA" - it saves summing the individual loads. That is the only way an input current rating makes sense. To me, it doesn't imply that the hub limits the current to 2500 mA, although, for a price, that could be done.

Are you suggesting that you can have 5 V, 5000 mA out with a 5 V, 2500 mA input?

Reply to
Don Kelly

The rating tells you that you do not need to provide a power supply capable of over 2500 mA. Without that rating, how does the user know how big a supply to provide? It *absolutely* makes sense. Otherwise, the user doesn't know whether to provide a supply at 2500 mA, or 5000 mA, or 10000 mA, or whatever.

> To me, it doesn't imply that

It is not implied - it is stated. Whether it is active limiting, or passive limiting, or intrinsic to the components within the hub is not germane, if that's what you have in mind.

> You seem to be saying that the input current rating is there to protect the hub and/or the devices connected to it. Is that what you have in mind?

Hell no. I'm suggesting that the *input* rating is just that - an *input* rating. Input ratings tell you how much current the device will draw (or power it will consume), not the current or power the input must be limited to.

Output power will always equal input power minus losses ... but you already know that.
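That relationship can be made concrete with a small sketch. The 85% efficiency figure is an assumed value for illustration; real converter losses vary:

```python
# Output power = input power - losses, expressed here as an
# assumed converter efficiency of 85%.
def output_power_w(v_in, i_in, efficiency=0.85):
    """Power delivered downstream from a given input, at an
    assumed efficiency."""
    return v_in * i_in * efficiency

# A hub drawing its full rated input of 5 V at 2.5 A (12.5 W in)
# can deliver at most about 10.6 W downstream -- never more.
print(output_power_w(5.0, 2.5))  # 10.625
```

This is why a 5 V, 2500 mA input can never support a 5 V, 5000 mA output: the output side always gets less than the input side supplies.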

Ed

Reply to
ehsjr
