4 Baffling Parts of TV That Actually Used to Make Sense
Picture a child today who streams a TV show from 20 years ago. They never watched the show when it originally ran on ABC and in fact have never watched any show on an over-the-air network. They notice that, every 10 minutes or so, trombones play to mark a suspenseful moment. Then the show cuts to black, but it immediately resumes because the episode isn’t over.
You explain to this child that these act breaks were originally commercial breaks, during which the action had to pause so the network could show ads. “Wow, like on YouTube?” says the child, and you nod. The ads are gone from the show now, but their legacy remains. And if we look further back into the history of television, we’ll see a bunch of other forgotten quirks that are responsible for how TV is today.
Multi-Camera Sitcoms Were Invented Because Mics Used to Suck
The earliest attempts at filmmaking were like plays. A director plonked a camera down and shot actors on a set that was just like a theater’s stage. Very quickly after this, directors discovered they could get more creative, building different kinds of stages, moving the camera around, or even shooting outdoors. But then came a speedbump: talkies. When movies got sound, the first microphones were stationary, and the actors who spoke into them had to stay stationary as well, sending filmmaking back to square one.
To give the impression of action, even though the actors stayed still, directors invented what's called the multiple-camera setup. Several cameras shot the scene from different angles. Even if a scene consisted of just two actors standing at opposite ends of the room as they talked, the footage could alternate between close-ups of each character to make the scene feel dynamic. In the 1940s and 1950s, multiple-camera setups became standard for television.
Once we invented other kinds of microphones (boom mics and wireless mics), actors could move around again. Films dropped the multi-camera setup as soon as possible, and most scripted TV shows eventually did as well. Sitcoms were the last holdout, and though most sitcoms today use a single-camera setup, some, like The Conners and Frasier, stick with multiple cameras. Today, if you refer to a “multi-camera sitcom,” you’re talking about a show filmed on an obvious set in front of a live studio audience.
But if a show wants to shoot in a studio in front of an audience (and there are quite a few reasons a show might like to do this), that doesn’t mean it needs a multiple-camera setup. A show could, if it chose, keep all those trappings of the classic sitcom but film with one single fixed camera, taking in the entire scene with every shot. Remember: Actors can move, so the original reason they needed multiple cameras is gone now.
“The same way they know where to look in real life.”
Of course, there are plenty of artistic reasons for sitcoms to zoom in on different elements. But there was also another technical reason that’s no longer an issue: resolution.
For most of the 20th century, a television picture consisted of 525 horizontal lines. Not all of those lines made up the image, so a TV shot was the equivalent of an image 480 pixels high. That’s not a big picture. If sitcoms simply showed one wide shot for an entire scene, no one would ever get a good look at the characters’ faces. An HD picture today, however, has roughly six times as many pixels. That means a face in one fraction of the screen looks as clear as a face taking up the entire screen did in the 1990s. A 4K picture, meanwhile, has about 24 times as many pixels, so you could see even a small face in a corner as clearly as people used to see a close-up.
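To put rough numbers on that comparison, here's a quick back-of-the-envelope calculation in Python. It assumes a 720-by-480 active picture for standard definition, which is one common convention; other SD conventions shift the ratios slightly.

```python
# Rough pixel-count comparison between SD, HD and 4K pictures.
# Assumes a 720x480 active picture for standard definition; other SD
# conventions (e.g., 640x480) shift the ratios slightly.
resolutions = {
    "SD (480 lines)": (720, 480),
    "HD (1080p)": (1920, 1080),
    "4K (2160p)": (3840, 2160),
}

sd_width, sd_height = resolutions["SD (480 lines)"]
sd_pixels = sd_width * sd_height

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / sd_pixels:.0f}x SD)")

# SD (480 lines): 345,600 pixels (1x SD)
# HD (1080p): 2,073,600 pixels (6x SD)
# 4K (2160p): 8,294,400 pixels (24x SD)
```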
Studio sitcoms could, if they chose, discard a century of techniques and film each scene just like a static play. That might actually improve them, because plays are pretty fun to watch. You know how you’re impressed every time a movie has characters interact for a long take, without any cuts? Sitcom actors do that every single episode, and the only reason we don’t realize it is that we keep switching between cameras. If you watch a scene without all those cuts and close-ups, you’ll get to experience it like the studio audience does.
And based on all those hoots and cheers, they seem to enjoy it.
TV Seasons Were Based Around When New Cars Came Out
Lately, we’ve all become very frustrated by how years pass between each season of certain prestigious shows. Over on broadcast TV, however, networks continue to reliably drop new seasons every fall, as they have for the past 70 years. They also launch a few shows in January, to replace ones they’ve canceled, but by and large, fall is when new seasons come out.
This raises two questions. First: Isn’t it kind of weird that we call these “seasons” when they release annually? A year has multiple seasons (winter, summer, maybe others depending on where you live). “In which season do new TV seasons debut” should not be a valid query, but it is one, and the answer is “fall.”
The other question is: Why fall? You might guess it’s because that’s when the school year begins, or when people have abandoned fun summer activities and are back to staying inside. The real reason is that fall was when automakers unveiled their new models.
In car companies, networks saw lucrative sponsorship opportunities. They invented the fall schedule so they’d have exciting new TV and high ratings exactly when car companies most wanted to advertise, letting the networks charge them the highest fees.
If networks instead wanted to sustain audience excitement throughout the year, rather than catering to this one specific kind of sponsor, they’d be better off debuting new shows every month, year-round. That’s the route streaming services eventually took, as well as the strategy of ad-free premium cable channels.
A Quirk of Math Means TV Has a Weird Frame Rate
A film works by flashing several still images in front of you every second, and you interpret this as constant motion. Films originally rolled 12 frames by you every second, then raised that to 24, and the industry has stuck with 24 pretty much ever since. But choosing a frame rate for television was more complicated.
Networks each had a fixed slice of spectrum for broadcasting. This was 4.5 megahertz, or 4,500,000 hertz. Due to the way they encoded sound and color, this figure had to be exactly divisible by the number of lines drawn on the screen every second. That number of lines would be the frame rate multiplied by the number of lines per frame, which was 525. Dividing 4,500,000 by 525 gives about 8,571.43. So TV engineers needed to pick a frame rate that divided into 8,571.43 a whole number of times.
Networks were hoping for a frame rate of 30, since its two interlaced fields per frame neatly matched the 60-hertz frequency of the alternating current that runs through American appliances. Divide 8,571.43 by 30, though, and you sadly don’t get a whole number. But tweak 30 just a bit, down to 29.97, and you do get a whole number (286, or close enough anyway). So, 29.97 was the frame rate that TV networks adopted.
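If you want to check that arithmetic yourself, here's a small Python sketch of the division. The 286 is simply the whole number that falls out of it; working backwards from 286 gives the exact broadcast rate.

```python
# The NTSC frame-rate arithmetic described above, step by step.
SPECTRUM_HZ = 4_500_000      # the 4.5 MHz figure the networks had to fit
LINES_PER_FRAME = 525

budget = SPECTRUM_HZ / LINES_PER_FRAME
print(budget)                # 8571.428571... lines' worth per second

print(budget / 30)           # 285.714... -- not a whole number, so 30 fps fails
print(budget / 29.97)        # 286.0009... -- a whole number, close enough

# Working backwards from that whole number gives the exact broadcast rate:
exact_fps = SPECTRUM_HZ / (286 * LINES_PER_FRAME)
print(exact_fps)             # 29.97002997... -- the famous "29.97"
```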
Now, when you watch a show on Netflix or something, they don’t broadcast over the airwaves, so they don't use a frame rate of 29.97. That would be ridiculous, needlessly picking a decimal number like that, right? No, they don’t use 29.97. They use a frame rate of 23.976.
Like we said, films used 24 frames per second, and for years, that footage had to go through a conversion process before it could air on TV. If you throw a 24 frames-per-second film out at 29.97 frames per second, the two don’t sync up, and the result is a mess. But if you change a film to 23.976 frames per second, that’s exactly 80 percent of the TV frame rate, so you can drop it in your magic box, and TV magic (a process called 3:2 pulldown, which repeats frames in a fixed pattern to spread four film frames across five video frames) will shift the frames to the right spots to make it look fine.
Companies converted films from 24 to 23.976 by slightly slowing them down. A movie viewed on television ran a few seconds longer than that same movie on the big screen, though the slowdown was too subtle for anyone to notice. Digital video cameras later made things a little easier by simply recording at 23.976 frames per second, sparing people one step of the conversion process.
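To give a sense of how small that slowdown is, here's a rough Python sketch. The exact telecine workflow varied, but the speed change is just the ratio of the two frame rates, and the 120-minute runtime is a hypothetical example.

```python
# How much the 24 -> 23.976 fps conversion stretches a film's runtime.
FILM_FPS = 24
NTSC_FILM_FPS = 24 * 1000 / 1001   # 23.976..., exactly 80% of 29.97

slowdown = FILM_FPS / NTSC_FILM_FPS          # 1.001, i.e., 0.1% slower
print(f"speed factor: {slowdown:.4f}")

runtime_minutes = 120                        # hypothetical two-hour movie
extra_seconds = runtime_minutes * 60 * (slowdown - 1)
print(f"extra runtime: about {extra_seconds:.1f} seconds")  # ~7.2 seconds
```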
Today, if you watch a show on Disney+, it won’t ever have to be converted to 29.97 frames-per-second. No one will watch it on a TV with 525 scanlines, and it never needs to fit into 4.5 MHz. Nevertheless, it’ll have a frame rate of 23.976. Not 24, no. It’ll be 23.976. That is the number we are stuck with.
The Remote Control Was Made to Eliminate Commercials
The first television remote controls were made by Zenith in the 1950s. They had marvelous names like the Flashmatic and the Space Commander 400, and they had just four buttons. The first turned the set on or off. The second changed the channel, going one channel up. The third also changed the channel, going one channel down. The fourth was the mute button. The remote control was the idea of Zenith founder Eugene F. McDonald, and every part of it was designed for one purpose: skipping ads.
He wanted channel buttons not to search for alternative programs when one got boring but to switch channels whenever a sponsor message came up. In the 1950s, you see, channels didn’t yet have synchronized commercial breaks. As for the mute button, Zenith’s own ads explained that you could “just touch a button to shut off the sound of long, annoying commercials while the picture remains on the screen.”
McDonald’s ambitions went beyond merely sparing you, the individual viewer, the pain of listening to ads, like modern adblockers do. He hoped to make ad skipping/silencing universal, which would make advertising unprofitable and ultimately kill the TV advertising industry. That was a wild idea from a company that made TVs, since TV programs were entirely funded by advertising and would continue to be for decades.
You sometimes find yourself in the world McDonald envisioned, streaming shows ad-free or watching your own personal copies of movies. You still use the remote, for the other obvious advertised benefit: the convenience of controlling the TV from across the room. Plus, many TVs today barely have buttons of their own, so the remote is the only way to operate them. But when you don’t have any ads to skip, one remnant of the original remote serves no function yet remains. We’re talking about the mute button.
Zenith invented muting — it didn’t exist on TVs till they added it to their remote. Now, mute buttons are everywhere, including on every video you scroll past on your phone. But on a TV, without ads? How often do you ever want to mute what you’re watching? Anytime you need to turn the volume all the way down, you really want to pause the show so you can resume watching when you’re ready to listen again. Or you possibly want to skip forward (something impossible when Zenith first invented muting). But when you’re watching ad-free TV, rather than TikToks with captions and superfluous music, you never want to mute it and simply wait as the picture goes on playing.
Seriously, when was the last time you hit mute on your TV? Was it because you were watching porn? Don’t say that; the conversation there is the best part.
Follow Ryan Menezes on Twitter for more stuff no one should see.