200 mA is 0.200 amps, or 0.2 amps.
No. The adaptor will overheat.
Yes. The current rating should be the same as or greater than the original. This means the adapter can supply up to 500 mA; in your case it only needs to supply 200 mA, so it is more than up to the job.
There are 2000/1000 = 2 amps in 2000 milliamps. For the math-challenged, that is 2 amps.
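The division above is easy to sketch in a couple of lines of Python (the function name is just illustrative):

```python
def milliamps_to_amps(milliamps: float) -> float:
    """Convert a current in milliamps to amps: divide by 1000."""
    return milliamps / 1000.0

print(milliamps_to_amps(2000))  # 2.0 amps
print(milliamps_to_amps(200))   # 0.2 amps
```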
18 Volts.
No, 200 mA is not the same as kilohms. Amperes is a unit of current flow; ohms is a unit of resistance. Other than being related by Ohm's law (voltage = amperes × ohms), the two units are not the same.
Yes. The maximum that the adapter can deliver is 1300 mA, or 1.3 amps. The maximum that the device will draw is 200 mA, or 0.2 of an amp.
YES! If you have a TV antenna amplifier rated at 12 volts and 200 milliamps, you can use any power supply that will deliver at least 200 milliamps at 12 volts. The important thing is to keep the 12 volts at 12 volts. Note: 200 milliamps is 0.2 amps. Even if you had a power supply that could deliver 2000 amps at 12 volts, you would be OK, as the amplifier will only draw the 200 mA that it needs.
Yes, you can replace a transformer with one that has a higher current rating. The load on the transformer should be less than 200 mA, because presumably that is what the circuit was designed for. Since the current through the transformer should be less than 200 mA, the 500 mA transformer will not be damaged. The opposite is not true: you should not replace a 200 mA-rated transformer with a 100 mA transformer, for example. If the current exceeds 100 mA, the transformer could fry.
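The replacement rule described above (voltage must match, current rating must be at least the device's draw) can be sketched as a small check. This is a simplified illustration with made-up function and parameter names, not a substitute for reading the ratings on real equipment:

```python
def replacement_ok(device_volts: float, device_ma: float,
                   adapter_volts: float, adapter_ma: float) -> bool:
    """Rough rule of thumb: the voltage must match exactly, and the
    adapter's current rating must equal or exceed the device's draw."""
    return adapter_volts == device_volts and adapter_ma >= device_ma

print(replacement_ok(12, 200, 12, 500))  # True: 500 mA rating covers 200 mA draw
print(replacement_ok(12, 200, 12, 100))  # False: a 100 mA rating could fry
```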
No, it will not harm the plugged-in device. The 1 amp equates to 1000 mA; in other words, it has five times the capacity of the 200 mA adapter.
Convert the current to amperes, then (using Ohm's law) divide voltage by current.
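The two steps above can be written out in Python (the function name is illustrative; the 12 V / 200 mA figures are just example values from this page):

```python
def resistance_ohms(volts: float, milliamps: float) -> float:
    """R = V / I (Ohm's law), with the current first converted to amperes."""
    amps = milliamps / 1000.0  # step 1: convert current to amperes
    return volts / amps        # step 2: divide voltage by current

print(resistance_ohms(12, 200))  # about 60 ohms
```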
INPUT: AC 120V 60Hz OUTPUT: DC 12V 200mA