Amperage question

MikeinDestin

Member
Joined
Nov 21, 2024
Messages
66
Location
Destin, FL
We all know that if you run a linear amplifier on 125 volts, the current will be much higher than if you ran it on 240+ volts. Why is it the opposite with DC, at least when using batteries? When the voltage goes down, so does the current. Is it because batteries present more resistance as their voltage decreases? More resistance would certainly decrease the current. Would a DC power supply produce different results? Thanks for any responses.
 

G7RUX

Active Member
Joined
Jul 14, 2021
Messages
656
We all know that if you run a linear amplifier on 125 volts, the current will be much higher than if you ran it on 240+ volts. Why is it the opposite with DC, at least when using batteries? When the voltage goes down, so does the current. Is it because batteries present more resistance as their voltage decreases? More resistance would certainly decrease the current. Would a DC power supply produce different results? Thanks for any responses.
This is a straightforward application of Ohm’s law: double the volts across a resistor and you double the current. In the amplifier example you’re using Ohm’s law in a different way: for a given amount of power delivered, the current goes up as the applied voltage goes down.
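A quick numeric sketch of the difference between the two cases (the resistance and power figures here are illustrative only, not taken from any particular amplifier):

```python
# Two ways Ohm's law shows up in this thread. The resistance and power
# values below are illustrative only, not from any particular amplifier.

def current_fixed_resistance(volts, ohms):
    # Fixed load resistance: I = V / R, so current rises with voltage.
    return volts / ohms

def current_fixed_power(watts, volts):
    # Fixed power drawn: I = P / V, so current rises as voltage falls.
    return watts / volts

# A 10-ohm resistor draws more current at the higher voltage...
print(current_fixed_resistance(240, 10))  # 24.0 A
print(current_fixed_resistance(125, 10))  # 12.5 A

# ...but a load that needs a fixed 1200 W draws more current at the lower voltage.
print(current_fixed_power(1200, 240))     # 5.0 A
print(current_fixed_power(1200, 125))     # 9.6 A
```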
 

prcguy

Member
Premium Subscriber
Joined
Jun 30, 2006
Messages
16,960
Location
So Cal - Richardson, TX - Tewksbury, MA
The gain of a tube or transistor will go down when the tube plate voltage or transistor supply voltage goes down. So if you feed 4 watts of drive into an amplifier running on 14 volts and it puts out 100 W, it might draw about 20 amps of current. Turn the voltage on the amplifier down to 12 volts and the gain will go down, so the same amplifier fed 4 watts might only put out 75 watts, and the current draw will also go down to maybe 17 amps. Every amp will react a little differently to a change in voltage depending on the design, whether the bias is affected, etc.

It’s a little different from comparing a 100 W light bulb running off 14 volts and then 12 volts, as that is a simple Ohm’s law calculation, whereas the amplifier is an active device with more things changing with voltage.
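For what it’s worth, here is a rough back-of-the-envelope check of those numbers; the output powers are from the example above, but the roughly 36-37% overall efficiency is just an assumed figure to make the arithmetic line up:

```python
# Supply current from output power and an assumed overall efficiency.
# Output powers (100 W / 75 W) come from the example above; the efficiency
# figures are assumptions chosen to roughly match the quoted current draws.

def supply_current(p_out_watts, supply_volts, efficiency):
    # I = P_in / V, where P_in = P_out / efficiency.
    p_in = p_out_watts / efficiency
    return p_in / supply_volts

print(round(supply_current(100, 14, 0.36), 1))  # ~19.8 A at 14 V
print(round(supply_current(75, 12, 0.37), 1))   # ~16.9 A at 12 V
```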
 

G7RUX

Active Member
Joined
Jul 14, 2021
Messages
656
That is quite possible, yes. If you were to run a 100 W amplifier from 12 V, 24 V or 48 V (of course as long as the design permitted it) then you would expect around 8.3 A, 4.2 A and 2.1 A in each case, assuming 100 % efficiency.

This is hugely simplified because equipment designs generally don’t behave exactly like this and the interplay is much more complicated. (PRCguy posted pretty much that comment just as I pressed “go”!)
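A quick check of those figures, straight I = P / V at the assumed 100 % efficiency:

```python
# I = P / V for a 100 W draw at each supply voltage (100% efficiency assumed).
for volts in (12, 24, 48):
    print(volts, "V ->", round(100 / volts, 1), "A")
# 12 V -> 8.3 A, 24 V -> 4.2 A, 48 V -> 2.1 A
```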
 

MikeinDestin

Member
Joined
Nov 21, 2024
Messages
66
Location
Destin, FL
The gain of a tube or transistor will go down when the tube plate voltage or transistor supply voltage goes down. So if you feed 4 watts of drive into an amplifier running on 14 volts and it puts out 100 W, it might draw about 20 amps of current. Turn the voltage on the amplifier down to 12 volts and the gain will go down, so the same amplifier fed 4 watts might only put out 75 watts, and the current draw will also go down to maybe 17 amps. Every amp will react a little differently to a change in voltage depending on the design, whether the bias is affected, etc.

It’s a little different from comparing a 100 W light bulb running off 14 volts and then 12 volts, as that is a simple Ohm’s law calculation, whereas the amplifier is an active device with more things changing with voltage.
That's basically what I was trying to figure out (the second paragraph). Thanks!
 