As households and businesses began hooking up to brand new electrical lines in the late 19th century, utility companies needed a way to calculate how much electricity their customers were using. The first electricity meter was invented in 1872. Some 14 years later, Thomas Edison developed a model that used electrolysis to measure electricity usage. Its set-up resembled the one demonstrated in our tutorial on the simple electrical cell: the amount of zinc eaten away indicated how much electricity had been consumed. This was soon replaced by meters that used induction. The induction-type meter pictured below dates to around 1930, and is similar to many in use today.
Electromechanical induction makes this device work. Three magnetic fields are in play: one generated by the permanent magnet (the poles of which are positioned around a metal disc), one proportional to the voltage, and a third proportional to the current. Because voltage (in volts) multiplied by current (in amps) equals power (in watts), these fields act on the disc in such a way as to make it turn at a speed proportional to the power being used; the total number of revolutions therefore tracks the energy consumed, expressed in kilowatt-hours (kWh).
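The arithmetic behind the spinning disc can be sketched in a few lines. This is my own illustration, not part of the tutorial; the function names and example figures (a 120 V appliance drawing 10 A) are assumptions chosen for clarity.

```python
def power_watts(volts: float, amps: float) -> float:
    """Power in watts = voltage (V) x current (A)."""
    return volts * amps

def energy_kwh(watts: float, hours: float) -> float:
    """Energy in kilowatt-hours = power (kW) x time (h)."""
    return watts / 1000 * hours

# A hypothetical 120 V appliance drawing 10 A for 3 hours:
p = power_watts(120, 10)   # 1200 W -- sets how fast the disc spins
e = energy_kwh(p, 3)       # 3.6 kWh -- what the dials accumulate
```

The disc's speed corresponds to `power_watts`; the dial readings grow like `energy_kwh`.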
The amount of electricity used can be read from the dials. This tutorial begins with 4500.00 kWh already racked up on the meter, a figure obtained by reading the numbers on the dials from left to right (the red dials record the digits that follow the decimal point).
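Reading the dials amounts to concatenating their digits, whole kWh first, then the red decimal dials. A minimal sketch, assuming four whole-number dials and two red dials as on the meter described above (the function name is mine):

```python
def meter_reading(whole_dials, decimal_dials):
    """Combine dial digits, read left to right, into a kWh figure.

    whole_dials   -- digits of the whole-kWh dials
    decimal_dials -- digits of the red dials (after the decimal point)
    """
    whole = int("".join(str(d) for d in whole_dials))
    frac = int("".join(str(d) for d in decimal_dials))
    return whole + frac / 10 ** len(decimal_dials)

# The tutorial's starting figure:
meter_reading([4, 5, 0, 0], [0, 0])   # 4500.0 kWh
```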
Experiment with this tutorial to see how electricity usage is recorded by the meter. Select as many household appliances as you like to increase or decrease the electricity consumed, and observe how the disc speeds up or slows down in response. Observe the dials as they record the usage.
What does this mean for your utility bill? Use the dropdown menus to change the hours per day that the appliances are used and the unit price of electricity. The projected monthly cost will be calculated.
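The projected cost works out to daily energy use times days in the month times the unit price. A sketch of that calculation, with a hypothetical 1500 W appliance and a $0.12/kWh rate as assumed example inputs (the tutorial lets you pick your own):

```python
def monthly_cost(watts, hours_per_day, price_per_kwh, days=30):
    """Projected monthly cost: daily kWh x days x unit price."""
    kwh_per_day = watts / 1000 * hours_per_day
    return kwh_per_day * days * price_per_kwh

# e.g. a 1500 W heater run 4 hours a day at $0.12 per kWh
# comes to about $21.60 for a 30-day month:
monthly_cost(1500, 4, 0.12)
```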