Can someone please explain to me the difference between 29.97 & 30 fps?
Does it make much difference?
And is one better than the other?
Thanks...
29.97 vs 30 f.p.s.
Moderator: Ken Berry
A somewhat more geeky answer is that the US National Television System Committee (NTSC) was established as long ago as 1940, and the black-and-white TV standard it adopted (varying an earlier 1936 standard) specified an interlaced frame rate of 30 frames per second: 60 fields per second, chosen to match the 60 Hz frequency of the US mains power supply.
In 1950, the Committee started looking at a broadcasting standard for colour TV, which was finally approved in December 1953. While this new standard was fully backward compatible with existing black-and-white TV sets, extra colour information had to be carried somehow, and this was done by adding a colour subcarrier of 4.5 × 455/572 MHz (approximately 3.58 MHz) to the video signal. Reducing interference between the chrominance signal and the FM sound carrier required slightly lowering the frame rate from 30 frames per second to 30/1.001 (very close to 29.97) frames per second, and changing the line frequency from 15,750 Hz to 15,734.26 Hz. And that is the official explanation for the slight variation between the two speeds!
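For anyone who wants to check the arithmetic, here is a quick Python sketch (my own illustration, not from any standards document, assuming NTSC's 525 lines per frame and the 4.5 MHz sound carrier mentioned above) that reproduces the 3.58 MHz subcarrier, the 15,734.26 Hz line frequency, and the 29.97 fps frame rate:

```python
from fractions import Fraction

# Back-of-the-envelope check of the NTSC numbers quoted above.
SOUND_CARRIER_HZ = Fraction(4_500_000)   # 4.5 MHz FM sound carrier
LINES_PER_FRAME = 525                    # NTSC scan lines per frame

# Colour subcarrier: 4.5 MHz x 455/572 (approximately 3.58 MHz).
subcarrier_hz = SOUND_CARRIER_HZ * Fraction(455, 572)

# The subcarrier sits at 227.5x (i.e. 455/2 times) the line frequency,
# so the line frequency and frame rate follow directly from it.
line_hz = subcarrier_hz / Fraction(455, 2)
fps = line_hz / LINES_PER_FRAME

print(f"colour subcarrier: {float(subcarrier_hz):,.2f} Hz")  # 3,579,545.45
print(f"line frequency:    {float(line_hz):,.2f} Hz")        # 15,734.27
print(f"frame rate:        {float(fps):.5f} fps")            # 29.97003

# The result is exactly 30/1.001, i.e. 30000/1001 frames per second.
assert fps == Fraction(30_000, 1_001)
```

Working in exact fractions rather than floats makes the point nicely: "29.97" is really the exact ratio 30000/1001, which is why professional gear talks about 30/1.001 rather than a rounded decimal.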

Ken Berry
Black Lab
Of course not. Ken reads Techno Geek Weekly for pleasure. 
Jeff
Dentler's Dog Training, LLC
http://www.dentlersdogtraining.com
http://www.facebook.com/dentlersdogtraining
