NotTaR of Television Sets : How did the (vertical) frame rate get chosen
 Copyright © 1994-2007, Samuel M. Goldwasser. All Rights Reserved. Reproduction of this document in whole or in part is permitted if both of the following conditions are satisfied: 1. This notice is included in its entirety at the beginning. 2. There is no charge except to cover the costs of copying. I may be contacted via the Sci.Electronics.Repair FAQ (www.repairfaq.org) Email Links Page.


How did the (vertical) frame rate get chosen

Some people think that TVs are synchronized to the local power line since the vertical scan rate is around 60 Hz (or 50 Hz). This is not correct.

No TV ever used the power line for synchronization, at least not once the broadcast standards were defined (some earlier experimental schemes did). However, older TVs had line frequency power transformers (not SMPSs) whose stray magnetic fields could slightly affect the CRT deflection. So it made sense (well, this is one justification at least) to make the vertical scan rate (field rate) equal to the power line frequency. Otherwise, the stray magnetic field acting on the beam inside the CRT would produce a jiggle or wiggle in the picture. Since it was also thought at the time (for other reasons as well, like cost) that 60 Hz was adequate to keep flicker at an acceptable level, this all fit together nicely.

In the good old days before color TV, the frame/field rate was exactly 30/60 Hz (or 25/50 Hz). With color, it had to be changed slightly (see the section: Why is the NTSC color subcarrier such a weird frequency?). But since TVs no longer use line power transformers, there isn't even a slow position shift (with a period of several seconds) due to the small difference, so it didn't matter.
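To see just how slight that change was, here is a quick numeric sketch (in Python, purely illustrative) of the standard NTSC color timing relationships: the color subcarrier is defined as exactly 315/88 MHz, the horizontal line rate is the subcarrier times 2/455, and with 525 lines per frame (262.5 per field) the field rate works out to just under 60 Hz.

```python
# NTSC color timing: the subcarrier ties all the scan rates together.
# Using exact fractions so no floating point rounding sneaks in.
from fractions import Fraction

fsc = Fraction(315_000_000, 88)            # color subcarrier: exactly 315/88 MHz
line_rate = fsc * 2 / 455                  # horizontal rate: fsc = 455/2 * line rate
field_rate = line_rate / Fraction(525, 2)  # 262.5 lines per field
frame_rate = field_rate / 2                # 2 interlaced fields per frame

print(f"subcarrier: {float(fsc):.4f} Hz")    # about 3579545.45 Hz
print(f"line rate:  {float(line_rate):.4f} Hz")
print(f"field rate: {float(field_rate):.6f} Hz")  # just under 60 Hz
print(f"frame rate: {float(frame_rate):.6f} Hz")  # exactly 30/1.001
```

The field rate comes out to about 59.94 Hz rather than 60 Hz, a difference of only about 0.1 percent, which is the "slight change" referred to above.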