Choosing a power supply: how do I get the voltage and current ratings?
Voltage Rating
If a device says it needs a particular voltage, then you have to assume it needs that voltage. Both lower and higher could be bad.
At best, with lower voltage the device will not operate correctly in an obvious way. However, some devices might appear to operate correctly, then fail in unexpected ways under just the right circumstances. When you violate required specs, you don’t know what might happen. Some devices can even be damaged by too low a voltage for extended periods of time. If the device has a motor, for example, then the motor might not be able to develop enough torque to turn, so it just sits there getting hot. Some devices might draw more current to compensate for the lower voltage, but the higher-than-intended current can damage something. Most of the time, lower voltage will just make a device not work, but damage can’t be ruled out unless you know something about the device.
Higher than specified voltage is definitely bad. Electrical components all have voltages above which they fail. Components rated for higher voltage generally cost more or have less desirable characteristics, so picking the right voltage tolerance for the components in the device probably got significant design attention. Applying too much voltage violates the design assumptions. Some level of too much voltage will damage something, but you don’t know where that level is. Take what a device says on its nameplate seriously and don’t give it more voltage than that.
Current Rating
Current is a bit different. A constant-voltage supply doesn’t determine the current: the load, which in this case is the device, does. If Johnny wants to eat two apples, he’s only going to eat two whether you put 2, 3, 5, or 20 apples on the table. A device that wants 2 A of current works the same way. It will draw 2 A whether the power supply can only provide the 2 A, or whether it could have supplied 3, 5, or 20 A. The current rating of a supply is what it can deliver, not what it will always force thru the load somehow. In that sense, unlike with voltage, the current rating of a power supply must be at least what the device wants but there is no harm in it being higher. A 9 volt 5 amp supply is a superset of a 9 volt 2 amp supply, for example.
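If it helps to see the apple analogy in numbers, here is a minimal Python sketch assuming a simple resistive load (the 4.5 ohm value and the supply ratings are made up for illustration): the load sets the current via Ohm’s law, and the supply’s current rating is only a ceiling.

    # Illustrative only: a resistive load that wants 2 A at 9 V draws the
    # same 2 A no matter how much current the supply could deliver.
    def load_current(supply_voltage_v, load_resistance_ohm):
        """Ohm's law: the load, not the supply rating, sets the current."""
        return supply_voltage_v / load_resistance_ohm

    LOAD_OHMS = 4.5  # hypothetical load that draws 2 A at 9 V
    for supply_rating_a in (2, 3, 5, 20):
        drawn = load_current(9.0, LOAD_OHMS)
        status = "fine" if drawn <= supply_rating_a else "supply overloaded"
        print(f"9 V {supply_rating_a} A supply: load draws {drawn:.1f} A ({status})")

Every rating prints the same 2.0 A; only a supply rated below 2 A would be a problem.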
Replacing Existing Supply
If you are replacing a previous power supply and don’t know the device’s requirements, then consider that power supply’s rating to be the device’s requirements. For example, if an unlabeled device was powered from a 9 V, 1 A supply, you can replace it with a 9 V supply rated for 1 A or more.
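As a rough sketch of that replacement rule (the function name and numbers are mine, not anything standard): the voltage must match, and the new current rating must be at least the old one.

    def is_ok_replacement(old_volts, old_amps, new_volts, new_amps):
        # Voltage must match; current rating may be equal or higher.
        return new_volts == old_volts and new_amps >= old_amps

    print(is_ok_replacement(9, 1, 9, 2))    # True: 9 V 2 A can replace 9 V 1 A
    print(is_ok_replacement(9, 1, 12, 2))   # False: wrong voltage
    print(is_ok_replacement(9, 1, 9, 0.5))  # False: can't deliver enough current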
Advanced Concepts
The above gives the basics of how to pick a power supply for some device. In most cases that is all you need to know to go to a store or online and buy a power supply. If you’re still a bit hazy on what exactly voltage and current are, it’s probably better to quit now. This section goes into more power supply details that generally don’t matter at the consumer level, and it assumes some basic understanding of electronics.
Regulated versus Unregulated
Unregulated
Very basic DC power supplies, called unregulated, just step down the input AC (generally the DC you want is at a much lower voltage than the wall power you plug the supply into), rectify it to produce DC, add an output cap to reduce ripple, and call it a day. Years ago, many power supplies were like that. They were little more than a transformer, four diodes making a full wave bridge (takes the absolute value of voltage electronically), and the filter cap. In these kinds of supplies, the output voltage is dictated by the turns ratio of the transformer. That ratio is fixed, so instead of making a fixed output voltage, their output is mostly proportional to the input AC voltage. For example, such a “12 V” DC supply might make 12 V at 110 VAC in, but then would make over 13 V at 120 VAC in.
Another issue with unregulated supplies is that the output voltage is not only a function of the input voltage, but will also fluctuate with how much current is being drawn from the supply. An unregulated “12 volt 1 amp” supply is probably designed to provide the rated 12 V at full output current and the lowest valid AC input voltage, like 110 V. At no load (0 A out) it could be over 13 V even at 110 V in, and higher yet at higher input voltage. Such a supply could easily put out 15 V, for example, under some conditions. Devices that needed the “12 V” were designed to handle that, so that was fine.
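To put rough numbers on both effects, here is a crude Python model of an unregulated wall wart. The no-load rise of 1.5 V and the other figures are made up for illustration; real supplies differ.

    NOMINAL_VAC = 110.0   # input at which it makes its rated 12 V at full load
    NOMINAL_VOUT = 12.0
    NO_LOAD_RISE = 1.5    # assumed extra volts at 0 A out versus full load

    def unregulated_output(vac_in, load_fraction):
        # Output tracks the input voltage and rises as the load drops.
        scaled = NOMINAL_VOUT * vac_in / NOMINAL_VAC
        return scaled + NO_LOAD_RISE * (1.0 - load_fraction)

    print(f"{unregulated_output(110.0, 1.0):.1f} V")  # 12.0 V: full load, low line
    print(f"{unregulated_output(110.0, 0.0):.1f} V")  # 13.5 V: no load
    print(f"{unregulated_output(120.0, 0.0):.1f} V")  # about 14.6 V: no load, high line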
Regulated
Modern power supplies don’t work that way anymore. Pretty much anything you can buy as consumer electronics will be a regulated power supply. You can still get unregulated supplies from more specialized electronics suppliers aimed at manufacturers, professionals, or at least hobbyists who should know the difference. For example, Jameco has a wide selection of power supplies. Their wall warts are specifically divided into regulated and unregulated types. However, unless you go poking around where the average consumer shouldn’t be, you won’t likely run into unregulated supplies. Try asking for an unregulated wall wart at a consumer store that sells other stuff too, and they probably won’t even know what you’re talking about.
A regulated supply actively controls its output voltage. These contain additional circuitry that can tweak the output voltage up and down. This is done continuously to compensate for input voltage variations and variations in the current the load is drawing. A regulated 1 amp 12 volt power supply, for example, is going to put out pretty close to 12 V over its full AC input voltage range and as long as you don’t draw more than 1 A from it.
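By contrast, a toy model of a regulated 12 V 1 A supply is almost boring (the 90 to 240 V input range here is just the universal-input range discussed next; a real supply’s limits come from its datasheet):

    def regulated_output(vac_in, load_current_a,
                         v_out=12.0, i_rated=1.0, vac_min=90.0, vac_max=240.0):
        # Within its valid input range and current rating, the output is
        # held close to the nameplate voltage.
        if not (vac_min <= vac_in <= vac_max):
            return None  # outside the valid input range
        if load_current_a > i_rated:
            return None  # beyond the rating; see Current Limit below
        return v_out

    print(regulated_output(110.0, 0.2))  # 12.0
    print(regulated_output(230.0, 0.9))  # 12.0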
Universal input
Since there is circuitry in the supply to tolerate some input voltage fluctuations, it’s not much harder to make the valid input voltage range wider and cover any valid wall power found anywhere in the world. More and more supplies are being made like that, and are called universal input. This generally means they can run from 90-240 V AC, and that can be 50 or 60 Hz.
Minimum Load
Some power supplies, generally older switchers, have a minimum load requirement. This is usually 10% of full rated output current. For example, a 12 volt 2 amp supply with a minimum load requirement of 10% isn’t guaranteed to work right unless you load it with at least 200 mA. This restriction is something you’re only going to find in OEM models, meaning the supply is designed and sold to be embedded into someone else’s equipment where the right kind of engineer will consider this issue carefully. I won’t go into this more since this isn’t going to come up on a consumer power supply.
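The arithmetic behind that example is trivial, but here it is spelled out (the 10% figure is only the usual rule of thumb; a real supply’s datasheet gives the actual number):

    RATED_CURRENT_A = 2.0
    MIN_LOAD_FRACTION = 0.10   # typical 10% rule of thumb, not a universal spec

    min_load_a = RATED_CURRENT_A * MIN_LOAD_FRACTION
    print(f"Minimum load: {min_load_a * 1000:.0f} mA")   # 200 mA for a 12 V 2 A supply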
Current Limit
All supplies have some maximum current they can provide and still stick to the remaining specs. For a “12 volt 1 amp” supply, that means all is fine as long as you don’t try to draw more than the rated 1 A.
There are various things a supply can do if you try to exceed the 1 A rating. It could simply blow a fuse. Specialty OEM supplies that are stripped down for cost could catch fire or vanish into a greasy cloud of black smoke. However, nowadays, the most likely response is that the supply will drop its output voltage to whatever is necessary to not exceed the output current. This is called current limiting. Often the current limit is set a little higher than the rating to provide some margin. The “12 V 1 A” supply might limit the current to 1.1 A, for example.
A device that is trying to draw the excessive current probably won’t function correctly, but everything should stay safe, not catch fire, and recover nicely once the excessive load is removed.
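Here is a crude sketch of that fold-back behavior for a hypothetical “12 V 1 A” supply that limits at 1.1 A, assuming a purely resistive load:

    V_NOMINAL = 12.0
    I_LIMIT = 1.1   # assumed limit, a little above the 1 A rating

    def output_voltage(load_resistance_ohm):
        demanded = V_NOMINAL / load_resistance_ohm
        if demanded <= I_LIMIT:
            return V_NOMINAL                      # normal regulation
        return I_LIMIT * load_resistance_ohm      # voltage drops so current stays at the limit

    print(f"{output_voltage(24.0):.1f} V")  # 12.0 V, load draws 0.5 A
    print(f"{output_voltage(6.0):.1f} V")   # 6.6 V, current held at 1.1 A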
Ripple
No supply, even a regulated one, can keep its output voltage exactly at the rating. Usually due to the way the supply works, there will be some frequency at which the output oscillates a little, or ripples. With unregulated supplies, the ripple is a direct function of the input AC. Basic transformer unregulated supplies fed from 60 Hz AC will generally ripple at 120 Hz, for example. The ripple of unregulated supplies can be fairly large. To abuse the 12 volt 1 amp example again, the ripple could easily be a volt or two at full load (1 A output current). Regulated supplies are usually switchers and therefore ripple at the switching frequency. A regulated 12 V 1 A switcher might ripple ±50 mV at 250 kHz, for example. The maximum ripple might not be at maximum output current.
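To put the two ripple examples above side by side (the peak-to-peak figures are taken from the text, treating ±50 mV as 100 mV peak to peak):

    V_OUT = 12.0
    examples = {
        "unregulated, 120 Hz": 2.0,          # a volt or two peak to peak at full load
        "regulated switcher, 250 kHz": 0.1,  # +/-50 mV
    }
    for name, vpp in examples.items():
        print(f"{name}: {vpp} Vpp = {100 * vpp / V_OUT:.2f}% of 12 V")

Roughly 17% versus under 1% of the nominal output, which is why regulated switchers are usually the better-behaved choice.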