The royal wedding in London of Catherine Middleton and Prince William was recently televised and shown all over the world in real time ("live"). This was not always possible. When Queen Elizabeth II (who is still queen as I write this in 2011) was crowned in London in 1953, it was not possible to send live televised images of the event across the Atlantic. The two largest US television networks at the time raced to be the first to cover Elizabeth's coronation on the air. Film of the ceremony was flown across the ocean by the Canadian Broadcasting Corporation. Arriving in Goose Bay, Newfoundland, the film was transferred to a Royal Canadian Air Force CF-100 jet fighter and taken to Montreal. By tapping into the Canadian broadcast, NBC beat CBS to the air by thirteen minutes. Note 1
In 2011, I didn't bother to wake up at five AM to watch the royal wedding live. But in 1953, I did watch the coronation films in my home, which got its first television set in 1952, when I was 10 years old. My father had purchased a set that looked sort of like the one to the left, a bare chassis, without a case. The picture shows it from the back. The large gray object in the center is the back of the Cathode Ray Tube ("CRT"). Inside that completely evacuated glass enclosure, an "electron gun" fired a beam of electrons at a phosphor coating on the face of the tube in order to paint the television picture. Anyone who is curious about how this worked can find explanations on the internet.
The set was installed behind a cut-out in a custom-built cabinet inside a closet in our family room, so our television was embedded in the wall. It had a large oval loudspeaker above it, hidden behind a fabric grill. The picture was of course black and white; color would not be available for years. There was no remote control, that being also a technology that was yet to come. You changed the channel by walking up to the set and rotating a large knob. Behind that knob, large electronic components (such as coils and capacitors) were rotated into position to make contact and thus change the resonant frequency of the electronic filters that served to select the desired channel. When these contacts got dirty, the tuner would malfunction. Pressurized cans of volatile cleaning solution were sold, which could be sprayed into the tuner to clean the contacts.
Early television stations didn't manage to fill all their airtime, so when you tuned in, you might see a test pattern something like the one shown to the right. They would frequently go off the air at midnight or thereabouts, sometimes following the old radio station convention of first playing the Star-Spangled Banner. Many years later, in the late 60s, the PDP-6 mainframe computer in MIT's Artificial Intelligence Group also played the Star-Spangled Banner before ITS (the "Incompatible Time-Sharing system") shut down at the end of the day. Note 2
Early television sets had adjustment knobs which are not required on modern sets. These included knobs for brightness, contrast, vertical hold, and horizontal hold. If the vertical hold was not properly adjusted, the image on the screen would slowly scroll up or down, being replaced by a second identical image separated from the first by a black horizontal bar. Misadjustment of the horizontal hold would cause the entire picture to break up into slanted slices. Note 3
There was no way to record a television broadcast in your own home, and in fact, recording was difficult even for the broadcasters. Practical video tape recorders were not developed until the mid-1950s (Ampex introduced the first commercially successful one in 1956), and for many years they were reel-to-reel devices, much too expensive for home use. Prior to their arrival, television shows were recorded on a device called a kinescope. This was basically a motion picture camera, with its frame rate adjusted to match the television frame rate of 30 frames per second. The film then had to be chemically developed, and when played back, the pictures were of rather low quality. Note 4
Eventually, the Radio Corporation of America ("RCA") developed a system for broadcasting in color. One of the hurdles that had to be overcome in this development was making it "compatible" with all of the black and white sets that had already been sold. That is, the color information had to be added to the signal in such a way as to be ignored by the black-and-white receivers. This meant that the black-and-white sets could continue to receive color signals as if they had been sent out in black and white, while color receivers would properly process the added color information. RCA repeatedly used the term "compatible" in talking about their system, to stress this advantage. I recall first seeing color television displayed in the window of an RCA building in Manhattan, not far from Rockefeller Plaza. It attracted large crowds of people.
RCA had a great deal of difficulty selling color television sets when they were first introduced. They faced a classic "chicken and egg" problem. Nobody wanted to buy the extremely expensive color television sets because very few color programs were being broadcast, so there was little to watch. And broadcasters didn't want to incur the high expense of airing their programs in color, because almost nobody owned television sets capable of viewing them. I think RCA got over this by pouring a very large sum of money into putting lots of color programs on the air, even though they initially had tiny audiences. This jump-started the process, making it worthwhile for consumers to buy the color receivers.
The quality of the pictures on the original color receivers was not very good. The sets had particular problems with skin tones, which are of course very important. Color introduced additional controls which had to be properly adjusted. Otherwise, the colors on the screen would skew too far towards red or too far towards green, as illustrated on my face below:
Color did not arrive all at once. Even after color was available, many shows were still broadcast in black and white. On NBC in the early sixties, at the start of every show broadcast in color, an animation was shown culminating in the peacock shown to the right. This was accompanied by the narration, "The following program is brought to you in living color, on NBC." Click here or on the picture to see the video (requires a Quicktime browser plug-in).
NBC started broadcasting exclusively in color in the mid sixties. But even as late as the seventies, one could purchase a black and white television receiver for less money than a color receiver. I think that now, it's impossible to find one at all.
You don't need to understand all the details to appreciate the following: nowadays, everything has gone digital. Television images are broadcast in a digital format, and generally displayed on flat screens with discrete pixels (a "pixel" is a "picture element"). A so-called "1080p" high definition television display is an array of 1,920 X 1,080 pixels, each of which comprises three colors, red, green, and blue. That's a total of 1,920 X 1,080 X 3 = 6,220,800 elements, the brightness of each being adjusted at least 60 times per second (minimally twice the frame rate, and often 120 or 240 times per second). I'm an electrical engineer (although I've been retired since October, 2003), yet I find it difficult to imagine how all those pixels are addressed and adjusted at that rate.
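The arithmetic above can be checked with a few lines of Python (just a sketch of the counting; the 60, 120, and 240 Hz refresh rates are the typical figures mentioned above, not properties of any particular set):

```python
# A 1080p display: each pixel comprises red, green, and blue subpixels.
width, height, subpixels_per_pixel = 1920, 1080, 3
elements = width * height * subpixels_per_pixel
print(elements)  # the 6,220,800 elements mentioned in the text

# Brightness updates per second at common refresh rates.
for refresh_hz in (60, 120, 240):
    print(refresh_hz, "Hz:", elements * refresh_hz, "updates per second")
```

At 240 Hz, that works out to nearly one and a half billion individual brightness adjustments every second.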
And consider the manufacturing of large LCD ("Liquid Crystal Display") screens. All 6,220,800 picture elements, each with an associated Field Effect Transistor and capacitor, have to be fabricated flawlessly (you don't need to know exactly what these components are in order to appreciate the difficulty). The largest LCD screen being manufactured today is made by Sharp, with an astounding diagonal measurement of almost two and three quarter meters. Specifically, it has a 274 cm. diagonal, meaning it's 239 cm. wide and 134 cm. high (9 feet diagonally, 94 X 53 inches), and if one pixel is defective after manufacture, the entire unit has to be discarded.
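The width and height quoted above determine the diagonal via the Pythagorean theorem; a quick check in Python (using only the figures from the text):

```python
import math

# Screen dimensions quoted in the text, in centimeters.
width_cm, height_cm = 239, 134

# The diagonal of a rectangle is the hypotenuse of its half-triangle.
diagonal_cm = math.hypot(width_cm, height_cm)
print(round(diagonal_cm))          # about 274 cm
print(round(diagonal_cm / 2.54))   # about 108 inches, i.e. roughly 9 feet
```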
We've come a long way from that first small black and white set we used to watch "I Love Lucy".
Note 1: For a very detailed account of the battle between NBC and CBS to be the first on the air with coronation coverage, see The Great Coronation War, on AmericanHeritage.com. Logan Airport plays a part! [return to text]
Note 2: One row of 36 lights on our PDP-6 console could be set to display the contents of any desired memory location. The computer played music by turning the low-order six lights of memory location zero on and off at audio frequencies, so it could play in six part harmony. The small incandescent lamps couldn't actually turn on and off at that speed, but that was not important, as only the signals feeding the lamps mattered. These were summed and fed into an amplifier driving a loudspeaker. Actually, I think three of the lamps fed a left-channel speaker, and three fed a right-channel speaker, but I don't remember the details.
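The scheme described above amounts to summing several on/off (square-wave) signals, one per lamp. Here is a rough Python sketch of the idea; the frequencies and sample rate are arbitrary choices for illustration, and no claim is made that this resembles the PDP-6's actual code:

```python
SAMPLE_RATE = 8000  # samples per second (arbitrary for this sketch)

def square_wave(freq_hz, t):
    """An on/off lamp signal: 1.0 for the first half of each cycle, else 0.0."""
    return 1.0 if (t * freq_hz) % 1.0 < 0.5 else 0.0

def mixed_sample(freqs, t):
    """One audio sample: the sum of the lamp signals, as the summing network did."""
    return sum(square_wave(f, t) for f in freqs)

# Three made-up voices (roughly G3, D4, G4); the real machine could drive six.
chord = (196.0, 294.0, 392.0)
samples = [mixed_sample(chord, n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]
```

Feeding such a summed signal to an amplifier and loudspeaker produces the buzzy, harmonically rich sound characteristic of square waves.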
The computer had been programmed to play various selections in addition to the Star-Spangled Banner. My favorite was Johann Sebastian Bach's "Little Fugue" (the Fugue in G minor, BWV 578). Click on that link to hear what it sounded like (actually, played on a PDP-1). It was played at a very fast tempo, and has the characteristic "duck call" sound resulting from "square wave" (on or off) signals. Most early music coding at MIT was done by Peter Samson. (Click the next link to see a remarkable "animated score" version of the Little Fugue on YouTube.)
On one occasion, someone removed the loudspeakers temporarily to use them in some other project. At system shutdown, I mentioned to one of the programmers (I think it was Bill Gosper), that I missed the playing of the Star-Spangled Banner. He said, "Oh, it's still playing". "No it isn't," I said, "the speakers have been taken away." He replied, "You can't hear it, but it's still playing - take a look." I looked at the console lights, and indeed you could see them blinking in the rhythm of the Star-Spangled Banner. So to my programmer friend, it was still being played, whether we could hear it or not.
The name of our time-sharing system, ITS, the "Incompatible Time-Sharing system", was a play on the name of the primary time-sharing system used by the MIT computer center, which was called CTSS (Compatible Time-Sharing System). The "Compatible" in the name meant it was compatible with the standard batch processing operating system for the IBM 7094. The name of the Artificial Intelligence Lab's system stressed that it was unique, and not compatible with anything else. It was mostly written by Richard Greenblatt, Stewart Nelson, and Tom Knight (who gave it its name). [return to text]
Note 3: My friend and MIT classmate Martin Henry Schrage recalled some of the other adjustments provided on early television sets. In addition to those I've already mentioned, there were knobs for fine tuning, vertical position, vertical width, horizontal position, horizontal width, and focus. Although Martin's father refused for quite a while to buy a television himself, Martin's ability to properly adjust these settings was appreciated by his neighbors. I wonder if this experience had something to do with his later study of electrical engineering at MIT. [return to text]
The TV frame rate of 30 frames per second is in contrast to the commercial motion picture rate of 24 frames per second, and the amateur (silent) motion picture rate of 16 frames per second. Television in the United States uses 30 frames per second because it is one half the US alternating current frequency of 60 Hz. This rate was chosen to reduce the visual squirming that could have resulted from electromagnetic interference, had the frame rate not been synchronized to the power line. [return to text]