For a 100 W bulb on a 120 V supply, the answer is (100 W)/(120 V) ≈ 0.833 A, assuming that, as an essentially resistive load, the bulb has a power factor close to 1.0.
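For a resistive load the arithmetic is simply I = P / V. A minimal Python sketch of that calculation (the variable names are illustrative only):

```python
# Current drawn by a resistive load: I = P / V, assuming power factor ~1.0.
power_w = 100.0      # rated power of the bulb, watts
voltage_v = 120.0    # supply voltage, volts

current_a = power_w / voltage_v
print(f"Current drawn: {current_a:.4f} A")   # -> 0.8333 A
```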
I = E ÷ R = 120 V ÷ (60 Ω + 40 Ω + 20 Ω) = 120 V ÷ 120 Ω = 1 A
A 120V power supply connected to a 30 Ohm resistor will produce 120/30 or 4 amps of current.
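Both of the Ohm's-law answers above follow the same pattern, I = E ÷ R. A short Python sketch using the values given in those answers (the variable names are mine, not part of the questions):

```python
# Ohm's law for the two cases above: three resistors in series across 120 V,
# and a single 30 ohm resistor across 120 V.
supply_v = 120.0

series_resistance = 60.0 + 40.0 + 20.0                            # ohms; series resistances add
print(f"Series circuit:  {supply_v / series_resistance:.1f} A")   # -> 1.0 A

single_resistor = 30.0                                             # ohms
print(f"Single resistor: {supply_v / single_resistor:.1f} A")      # -> 4.0 A
```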
Since power is volts times amps, the current in a 60 W lamp connected to 120 V is 0.5 A. Because a lamp is a resistive load, there is no need to consider power factor and phase angle, which simplifies the explanation.

Assuming this is an incandescent or halogen lamp (one that makes light with a filament), there is a catch: the resistance of a filament varies strongly with temperature, so the lamp does not behave like a fixed resistor. The resistance is much lower, and the current therefore much higher, when the filament is cold, that is, when the lamp is first connected. As the filament heats up, the resistance rises until the lamp settles at its steady operating current of 0.5 A.

A halogen lamp runs at a filament temperature of roughly 2800–3400 K, so its resistance at room temperature is about 16 times lower than when hot. At switch-on the current is therefore around 8 A, though it drops rapidly; it can be even higher if the lamp is in a cold environment. Non-halogen incandescent lamps operate at a lower filament temperature and have a lower initial current, roughly 5 A.

All of this assumes the lamp is rated for 120 V. If it is a 12 V / 60 W lamp, the filament will probably break and create an arc, which may draw a very large current.
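As a rough illustration of the inrush effect described above, here is a Python sketch that back-calculates the hot filament resistance from the 60 W / 120 V rating and then applies the answer's approximate 16:1 cold-to-hot resistance ratio (that ratio is a rule of thumb quoted in the answer, not a measured value):

```python
# Illustrative estimate of inrush current for an incandescent/halogen lamp.
power_w = 60.0
voltage_v = 120.0

steady_current_a = power_w / voltage_v             # 0.5 A once the filament is hot
hot_resistance = voltage_v / steady_current_a      # 240 ohms at operating temperature

cold_hot_ratio = 16.0                              # assumed cold-to-hot resistance ratio
cold_resistance = hot_resistance / cold_hot_ratio  # ~15 ohms at room temperature

inrush_current_a = voltage_v / cold_resistance     # ~8 A at the moment of switch-on
print(f"Steady-state current:        {steady_current_a:.1f} A")
print(f"Approximate inrush current:  {inrush_current_a:.1f} A")
```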
Because they are in phase. To get 240 V, you need two 120 V alternating-current lines that are 180° out of phase, that is, of opposite phase. Only when one line is at +120 V while the other is at −120 V will you see 240 V between the wires.
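To see the same point numerically, here is a small Python sketch that computes the RMS voltage between two 120 V sine waves, first in phase and then 180° apart (the sampling approach and the function name are just for illustration):

```python
import math

def rms_difference(phase_shift_deg, v_rms=120.0, samples=1000):
    """RMS of the voltage between two sine waves separated by the given phase offset."""
    peak = v_rms * math.sqrt(2)
    shift = math.radians(phase_shift_deg)
    total = 0.0
    for n in range(samples):
        t = 2 * math.pi * n / samples
        diff = peak * math.sin(t) - peak * math.sin(t + shift)
        total += diff * diff
    return math.sqrt(total / samples)

print(f"In phase:      {rms_difference(0):.0f} V")    # -> 0 V between the wires
print(f"180 deg apart: {rms_difference(180):.0f} V")  # -> 240 V between the wires
```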
Because the white wire on a 120-volt circuit is the neutral wire: it is the conductor that lands on the silver screw of outlets and on the neutral bar in the service panel.