Saturday, August 23, 2014

Glum and Twinkley

I went to college to become David Brinkley.

Television news evolved from the Movietone newsreel, beginning with John Cameron Swayze’s Camel Newsreel Theater in 1948.  All the world’s news in less than fifteen minutes, delivered by a histrionic announcer nearly out of breath with excitement and featuring, whenever possible, animals and women in bathing suits. It was “news” as spectacle, something distant and, ultimately, irrelevant—certainly not to be interpreted or understood.

But Swayze had a powerful rival, one of the “Murrow gang” at CBS (and, of course, there was Saint Edward R. himself taking down McCarthy):  Douglas Edwards.  The tone of CBS was serious and sober, intellectual and erudite.  “Newsreel” news started sounding as dated as Dixieland jazz in a bebop era.  And people were beginning to pay attention to television news.  An astounding number of people, in fact; within a decade, network television had become the primary news source of most Americans and newspapers had already begun their long decline.

How do you compete with a living legend and his band of apostles, people like Douglas Edwards, Eric Sevareid, Mike Wallace and an up-and-comer named Walter Cronkite?  NBC had a news broadcaster from the same mold named Chet Huntley, serious, credible and capable.  But where was the difference that would vault NBC’s newscast into the ratings lead?

Why not a team?  Comedy teams—Burns & Allen, Martin & Lewis, Laurel & Hardy—had been a mainstay of radio and translated well into early television, so why not try the formula with news?  Chet Huntley would be the perfect Murrow-esque straight man, and chubby-cheeked David Brinkley the wisecracking relief.
Brinkley injected irony into the news.  He was a solid journalist; his credentials were impeccable, so there was no question of his credibility.  But through his tone and occasional sardonic comments (like saying that Senate Minority Leader Everett Dirksen appeared to comb his hair with an eggbeater), he relieved the impact of the events he reported.  After an initial drop, the ratings started climbing, ultimately surpassing CBS’s.  A real comedy team rewrote the song Love and Marriage in their “honor”:
Huntley Brinkley, Huntley Brinkley, one is glum and the other’s twinkly …

Throughout the crises of the ’60s, as I was coming of age—the Freedom Riders and the Selma March, the near-nuclear war of the Cuban Missile Crisis, the Kennedy assassination—they were my guides, especially Brinkley.  There was something about his sense of irony that was reassuring; the events were terrible, yes, and of great import—but we’d survive them.  Don’t take it all too seriously; that was Chet’s job.  When I majored in broadcast journalism at the University of Washington, it was primarily because of David Brinkley.

And that, I realize now, was the beginning of the most important change television brought about regarding news.  The messenger became as important as, if not more important than, the message.  Walter Cronkite became “the most trusted man in America” according to repeated polls, a televised personality with the authority of a revered spiritual leader.  Huntley retired in 1970 and faded from the scene rapidly, but Brinkley went on for decades more, always a star, always an authoritative entertainer (or perhaps an entertaining authority).  Through Watergate and the Reagan revolution and the Gulf War, he explained and reassured—and he provided the model of the personality journalist.

When cable news became a force in the 1990s, it wasn’t to the Murrow tradition that CNN and Fox and MSNBC turned; it was to the Brinkley manner.  He’d probably appreciate the irony that his legacy has become the Bill O’Reillys and Rachel Maddows and Anderson Coopers. 


Tuesday, August 19, 2014

This Just In

Happy Birthday Philo T. Farnsworth!

The Boobs Tube

Who owns the air?

Despite its simplicity, this is a question that many of my students got wrong.  Whenever I’d ask it in my Mass Media class, a few brave souls would confidently call out “the government.”

Wrong.  It’s mine.  Mine, mine, mine.

The problem (or the virtue, depending on one’s perspective) is that it’s also yours.  The air, or more accurately the electromagnetic spectrum that passes through it, is owned by the American public (at least within the United States).  If anybody wants to use it for private purposes, they have to have my permission.

Yours, too. 

And everybody else’s.

It wasn’t always that way.  In the infancy of radio, anybody with the simplest of equipment could start broadcasting anything they wanted.  After all, the First Amendment guarantees freedom of speech and of the press, doesn’t it?  We’re a free people, aren’t we?  So all across the country aspiring broadcasters—bakers, car dealers, entertainers, charlatans, religious leaders, even kids in their basements with tubes affixed to mom’s breadboard—started filling the airwaves.

And, quite quickly, they made an unfortunate discovery:  the electromagnetic spectrum, and particularly the part of it devoted to radio and television signals, is a severely limited resource.  There’s only a certain amount of “water” that can fit through this “hose.”  Signals started crossing.  Those with greater power drowned out those with less.  Distant stations interfered with local ones broadcasting on the same frequency.  Everybody was talking at the same time. The entire medium faced a collapse into chaos.

Somebody needed to be a traffic cop, and since radio waves don’t respect local boundaries, that cop had to be the federal government.  Congress established that regulator first in 1927, as the Federal Radio Commission, and then with expanded powers in 1934 as the Federal Communications Commission.  And ever since, broadcasting, the most powerful medium in human history, the most influential force in American life of the 20th Century, has been censored.  (Yes, I know:  even the FCC claims that it lacks “censorship” power, and technically that’s true; nobody is jailed for violating its regulations.  But I’m thinking of “censorship” in a broader sense, as I’ll try to explain below.)

“Wait,” you’re thinking about now.  “Where are the boobs you promised?  Why are we getting a lecture on media law?”  Don’t worry:  here they, or at least one of them, are.

During the halftime show of the Super Bowl in 2004, singers Janet Jackson and Justin Timberlake “accidentally” exposed, for a microsecond, one of her breasts.  Writer Marin Cogan analyzes the “event” and its impact in far more detail than I could, but the upshot is that this one microsecond of partial nudity exposed (pun intended) the glaring hypocrisy of television regulation.  Led by an organized campaign, half a million people complained to the FCC about this violation of “decency” standards, and the FCC fined 20 Viacom-owned stations a total of $550,000 (a penalty later reversed by the courts).

Just the next month, viewers so inclined could have watched Season 5 of The Sopranos and enjoyed multiple blood-spattered murders, graphic sex and nudity (one frequent setting was a strip club) and profanity-saturated dialogue, as well as a story line which, some might argue, made mobsters into sympathetic characters.  The FCC acted on no complaints about The Sopranos; no fines were imposed or even contemplated.

The difference?  The Sopranos was produced for Home Box Office (HBO), carried by cable and satellite and available only to those willing to pay a rather hefty premium for it.  HBO did not need to seek public permission to use the medium, and the FCC has no jurisdiction over it.

And here’s the paradox of television:  programming (or at least some of it) is much better now than ever before, of a quality that often exceeds even the best movies.  Series such as The Sopranos, The Wire, House of Cards and Mad Men exemplify the most complex and challenging writing, acting and production available in American culture today.  And yet they’re only possible because television has transformed from a democratic medium to one for the elite.  Ironically, the First Amendment now protects programming that is not available to large numbers of Americans.

Broadcast television, using the public airwaves, was democratic; it was required by law to offer something for everybody—all the owners of the airwaves.  Of course, it never really did this; it always catered to a white, middle-class audience, the target of mass advertisers.  In fact, this kind of “democracy” actually led to a kind of tyranny—the tyranny of the conventional.  Since programming was ratings-driven, television was dictated by majority rule—and the majority, in this case, wasn’t just ethnic, but also economic.  The result was a fairy-tale view of American life, a fantasy of family and manners that never existed, and never could have.  Lucy and Ricky slept in separate beds, even when she was pregnant (a word she couldn’t use).  Captain Kirk and Lieutenant Uhura exchanged an interracial kiss—but only under compulsion by aliens.  Western heroes contested hundreds of gunfights, but the bullets never drew blood from the bodies they punctured.  And the language television characters spoke was decorous and “moral”, as far removed from real American English as any Shakespeare production.  Good triumphed and evil was punished, and even the news edited out graphic content to avoid disturbing complacent viewers.  And half-time shows featured marching bands and clean-cut, “Up With People”-style choral groups.  The standard for decades, agreed upon by the FCC and the broadcasters themselves, was the “least common denominator”:  that programming is best which offends the fewest.

Pay television has radically changed that.  Some might argue that The Sopranos is no more “realistic”, in the long run, than Gunsmoke; fair enough.  Any dramatic structure with a beginning, a middle and an end is going to distort the chaos of real life.  But at least there’s more variety, more complexity of characterization and plot development, more willingness to explore uncomfortable issues.  And, yes, more boobs.
But what of a medium for a democracy?  What are the consequences of segregating the best programming, the best sporting events, the concerts and topical comedy, behind an increasingly burdensome pay wall?  What is the morality of leaving the “leftovers” for the poor or those without cable or satellite service while the elite get to enjoy the “good stuff”?  Granted, increasingly such programming is available directly from internet sources such as Netflix or Hulu.  But these still require disposable income—lots of it. 

We used to be able, at least, to discuss the latest episode of M*A*S*H the next morning at work.  Have we lost even that basic bond?


Monday, August 18, 2014

King of the Wild Suburbs

Yes.  I had a coonskin cap.

I also had a Davy Crockett toy flintlock pistol, and bubblegum cards with Fess Parker’s and Buddy Ebsen’s images on them, and an imitation leather jacket with fringes.  I scouted the wild vacant lots of Seattle’s Lake City neighborhood seeking a bear to shoot, and I watched all five episodes of the series, possibly the first made-for-TV mini-series, as well as the 1955 movie version compiled from it.

I was, in short, the target demographic for one of the earliest, and most successful, marketing campaigns of the television era.

Davy Crockett was the 1950s equivalent of Star Wars.  It hadn’t been heavily promoted; the first three episodes were intended for Disneyland’s “Frontierland” segment, which alternated with themes from other sections of the park.  The stars, Fess Parker as Davy and Buddy Ebsen as George, weren’t well known (this was well before Ebsen’s Beverly Hillbillies days).  And although it was technically a Western, still a popular genre at the time, its period (the early 19th Century) and locales (Tennessee, Washington, D.C. and Texas) were a long way from Utah’s Monument Valley.

But something in it appealed to the zeitgeist, and the new medium of television spread it like a virus.  Across the country, boys like me started showing up in elementary school wearing dyed rabbit-fur “coonskin” caps (over 100 million dollars’ worth were reportedly sold) and playing frontiersman and Indian.  Nearly sixty years later, the theme song (“Born on a mountaintop in Tennessee/Greenest state in the land of the free …”) can still be a powerful earworm (in fact, I just did it to myself).  The only other theme with that effect is “Gilligan’s Island” (Oh, shit.)

The real Davy Crockett was a soldier, land speculator, politician, and adventurer.  Today, his politics would place him in the Tea Party:  he served in Congress as an advocate for tax relief for low-income farmers in his state and died fighting for Texas’ secession (although in that case from Mexico).  He was a shameless self-promoter who wrote his own mythic autobiography.  And his death at the Alamo was the stuff of legend.  To his credit, he DID lose his seat in Congress for his lonely opposition to the Indian Removal Act that initiated the Trail of Tears.  The TV show missed no opportunity to voice his famous maxim:  “Be sure you’re right, then go ahead.”  In short, he was the epitome of American exceptionalism.  For a generation, the first TV generation, Fess Parker’s version of him—witty, resourceful, personally courageous and independent—provided a powerful role model.  Recently, I was considering a visit to Santa Barbara and scouting hotels, and such was the lingering influence that I was immediately drawn to the Fess Parker Hotel.

Yesterday was Davy Crockett’s 228th birthday, and we’re approaching the 60th anniversary of the broadcast of the first episode.  Today, we’re awash in HBO and Showtime series, Marvel Comics movies, and product tie-in merchandise.  But this, I’d argue, is where all that started. 

I wonder if Amazon.com sells coonskin caps?


Saturday, August 16, 2014

A Home With Five Windows



There was, of course, one in the living room, with all the furniture arranged around it.  There was one in my bedroom.  Mom and Dad had one in their bedroom, on a shelf near the ceiling so they could see it easily in bed.  There was one in the kitchen.  And, for awhile at least, there was one in the bathroom.

I grew up in a home with five television sets.
   
Television was supposed to strengthen the American family.  Hell, visionaries like Marshall McLuhan even believed it would organize the worldwide human family into one big, harmonious “global village.”  Television advertisements projected an image of social bonding—Father, Mother, two happy children (all white and middle class, of course) sharing high-quality drama, or cocktail parties with all the friends and neighbors (Mother in an evening dress serving drinks).

Cecilia Tichi, in her 1992 book The Electronic Hearth, showed how television manufacturers consciously co-opted the imagery of the colonial fireplace hearth and transferred the warmth and fellow-feeling associated with it to this new technology.  Television was to keep the children home; it would even keep (or get) the husband home to be attended to by his loving wife.  And at the same time, it would be, in a phrase popular at the time, a “window on the world”, bringing news and sports from far away straight into the family room for all to experience.  We’d even eat our TV dinners together off aluminum and plastic TV trays!

Much of that did indeed happen.  I was “nourished” by my share of TV dinners and turkey pot pies.  But increasingly, I ate them in my own room, on my own TV tray, watching my own programs on my own television set.  And so did many others.  The great unifier proved, in fact, to be one of the great dividers.

One of the most important trends in current media is fragmentation and the dissolution of the mass audience.  Marketers have learned to target narrow demographic audiences with ever-increasing precision so that they don’t waste any effort trying to reach people who aren’t already inclined to buy, and this has driven most media, but especially television, to tailor offerings to niche viewers.  Cable and satellite viewers can spend 24 hours watching golf or cooking or home improvement or news and entertainment in a variety of other languages.  Five hundred channels and plenty on, for every conceivable (at least legal) taste and inclination.  And one result is that, as a society, we are fractured and polarized as never before during the Age of Media.  We simply have less and less to talk to each other about—and when we do talk, we have increasingly polarized points of view.

Despite the sexist, racist and classist images used to promote it, the electronic hearth wasn’t such a bad idea.  But as soon as it became possible to buy cheap sets, it became an impossible dream.  We could all gaze out of our own windows on our own worlds.
The TV I had in my bedroom.



Thursday, August 14, 2014

The Candle and the Kid

Bright, sunny Florida:  the rocket jutted from the launch pad at Cape Canaveral like a saber unsheathed for battle.  In a way, that’s exactly what it was:  a slender, three-stage saber carrying a basketball-sized message to the Soviet Union.  Although it was developed and operated by the U.S. Navy, the rocket, Vanguard, was ostensibly on a peaceful scientific exploration (“… to seek out strange new worlds and civilizations, to boldly go …”) as part of the International Geophysical Year.  But we all knew the real mission:  to show the Russkies, and the rest of the world, that American ingenuity (actually, the ingenuity of captured Germans) could create the first artificial moon.
For the first time, Americans watching on live television could hear a countdown and feel the rising tension as it approached zero.  Wisps of condensing vapor floated off the silver-and-white shell and finally, as the countdown ended, roiling clouds of smoke and jets of flame shot out of the bottom.  Vanguard paused, the clouds billowed, and it started to move slowly upward with tremendous grace and power.
And then, a few feet off the ground, it sank back down and exploded.
It was humiliating.  Even climbing into the sky and exploding there would have been better than this (which is actually what happened with the second Vanguard attempt).  Even more humiliating:  it was to have been the first, but the Russians had already stunned the world by launching their Sputnik two months before.  Not only couldn’t we beat them; we couldn’t even match them!  And their space adventures were launched in secrecy, while we made our blunders in full televised view.
It probably altered my entire future.  The American reaction, after the initial panic, was to pour resources into a new emphasis on science and math education.  Who needed literature and art any longer?  The Russians could launch satellites!  I suddenly found myself in a new, experimental math class, so new that the textbooks were bound in construction paper and printed, it seemed, with a mimeograph.  I was baffled by it; if it hadn’t been for Rich, who was to become a lifelong best friend and who let me copy his answers on exercises, I would have failed.  SMSG Math made it clear that I had no future in science or technology, however urgent those fields might have been.  I ended up an English major.
And the Russians had even orbited a dog!  True, it died up there after barking and whimpering its way around a few orbits because they had no way to bring it back, but still … Clearly, a manned flight would come soon, and we couldn’t even launch a damn metal basketball.
The Army got its Explorer satellite into orbit, and the Navy finally launched a Vanguard successfully (it’s still up there today), but the resources and talent went to the new civilian agency, NASA, and sending a man into space first was a national imperative.  By 1961, the Mercury program and its first seven astronauts were ready.  The plan was to send Alan Shepard essentially straight up on the nose of an Army Redstone missile and then fall straight back to earth—fifteen minutes of a roller-coaster ride.  The launch was scheduled for May 2, 1961.
The Commies did it again.  April 12:  Yuri Gagarin not only went into space, but orbited the earth.  In a special affront to American rivals, he reported that he “didn’t see any angels.”
So NASA had to aim for second place, and with a different set of sub-orbital rules.  But it was still a matter of national pride and high drama.  After all:  Russian rockets worked, and American ones seemed to explode.  There was a certain macabre fascination to watching Shepard emerge from the ready room.
Actually, it got progressively more macabre, because the first two times he climbed into the Mercury capsule, the countdown was cancelled by bad weather after all the build-up. The omens didn’t look good. 
Finally, on May 5, the weather report looked satisfactory.  Because the launch was to occur shortly after dawn in Florida, we West Coasters had to be up and watching at about 4:00 a.m.  It was a heady experience for a 13-year-old on a school day.  After more delays, the countdown reached its climax at about 6:30 Pacific Time and Shepard, with characteristic American bravado, told Mission Control to “light this candle.”  Thanks to television and emerging national networks, an entire nation was able to share the event as it happened.  Thanks to the advent of videotape, you still can, so instead of trying to describe it, let me direct you to NBC’s coverage of the first American manned mission.





Wednesday, August 13, 2014

My (Short) Career in Television

            It’s hard to believe in this time of massive corporate conglomeration, but television once was an integral part of the local community, just as radio had been before it.  KING was owned by an heir to the Stimson lumber fortune, Dorothy Stimson Bullitt.  KOMO, the second station on the air, was an enterprise of a local flour mill, Fisher.  KTNT, the first CBS affiliate, was started by the Tacoma News Tribune.  And the first non-commercial station in the Northwest, KCTS, was the home of my brief career in television production (KCTS even reflected these local roots in its name—it stood for “Community Television Service”).
            That, of course, meant that it was perpetually broke.  KCTS started in 1954 with organizational support of the University of Washington and donated (i.e., already-obsolete) cameras and equipment from Mrs. Bullitt.  University students staffed the operation most of the time, but occasionally community volunteers did some of the work.  That’s where I come in.
            In the late fifties and early sixties, I was a Boy Scout, and, when I turned teen, an Explorer.  Other scouts went camping, canoeing, hiking … they learned how to whittle and put up shelters and identify edible plants and all kinds of cool quasi-military stuff.
            Mine volunteered at KCTS.  I was a teen-aged camera operator.
            It was a decidedly low-budget operation.  Virtually all the programming was local and, frankly, talky and less-than-compelling.  The studio smelled almost like a lumber yard because the sets were plywood and 2X4’s. The cameras smelled electronically hot; transistors were still cutting-edge, and these cameras used tubes.  They were bulky and heavy, mounted on dollies.  To operate them required a delicate balance between stasis and movement; the goal was, of course, to be invisible to the viewer, so the camera operator had to refrain from any motion and never sneeze or cough.  But the vidicon tube that captured the image had to move frequently or the image would permanently “burn”.  It was the job of the director to shift between the two cameras frequently enough to allow each operator to reframe the image.  All broadcasting was, of course, in black and white.
            So every week, my small troop of Explorers would go to the University of Washington campus and spend a couple of hours running the cameras, patiently standing stock-still and stifling body noises.  It was the genesis of my original major when I went a few years later to the U.W. as a student:  Broadcast Journalism.  You might have seen me today on one of the networks (hopefully not Fox News) but for one unfortunate outcome.
            I was fired.
            I didn’t make any untoward noises or burn the vidicon.  It was purely a matter of genetics.  You see, even as a teen I was abnormally short, just barely five feet tall.  Even beggars can be choosers under extreme circumstances, and the KCTS directors finally had enough of camera angles that highlighted the underside of speakers’ chins.

            

Death in the Afternoon




It was a Hindenburg moment.  Two men had just died in a tower of water and flying debris, on live television.  The announcer, Bill O’Mara, had no script and no precedent.  It was 1951, and viewers only six years removed from the war were plenty familiar with death—but it was something distant and hidden whenever possible.  When it covered the D-Day invasion, Life magazine had agreed to a War Department request not to show any bodies, and that was the general rule.  But here it was, in the middle of a civic festival and sporting event:  unavoidable, graphic, and immediate.
Bill O’Mara led his invisible audience in the Lord’s Prayer.
That’s my first memory of television.
I was nine months old when the first television broadcast was made in Seattle.  A struggling station owned by an appliance store, KRSC, broadcast a high-school football game from Memorial Stadium (well, most of it, anyway; the equipment shorted out from rain) live and in grainy black-and-white, accessible to only a small handful of privately-owned sets.  Shortly afterward, it was bought by the heir of a lumber fortune, Dorothy Bullitt, and eventually transformed into the regional powerhouse KING Broadcasting.
In the summer of 1951, when I was three, Seattle hosted its first and, for quite some time, only major-league sporting event, the Gold Cup unlimited boat race.  A local car dealer, Stanley Sayres, had commissioned a revolutionary boat from local builder Ted Jones, the Slo-Mo-Shun IV, and with it had won a stunning victory in Detroit in the 1950 Gold Cup, giving him the right to host the race in Seattle.  Already, only three years after that first broadcast, nearly 39,000 television sets were in use in the region, most of them in Seattle.  KING TV was the only station licensed in the area at the time, and the event provided the perfect opportunity to demonstrate the capabilities of the new medium.  With bulky telephoto-equipped cameras set up on towers, the station could cover, at least from a distance, the entire two-mile course on Lake Washington.
Unlimited boat racing was dominated by wealthy dilettantes from the East, primarily Detroit, and no event had ever been held west of the Mississippi, so this Gold Cup brought a form of national recognition to an attention-starved upstart city.  And, of course, it brought a contingent of chastened boat owners determined to win back their accustomed trophy.  Sayres was prepared with a new, second boat, the Slo-Mo-Shun V, to meet the challenge of Detroit’s powerhouses and the Lake Tahoe-based Hurricane IV.  And an entrant from Portland, Orth Mathiot, built a new boat named Quicksilver, which was finished late and largely untested. 
And, in the final heat, he and his riding mechanic, Thomas Whittaker, died.  The boat nosed into the water and disintegrated, live on television.   
And that was the first of many deaths I witnessed as I grew up with television:  the on-air murder of Lee Harvey Oswald, the deaths of Apollo 1 astronauts Roger Chaffee, Ed White and Gus Grissom (and the later explosion of the space shuttle Challenger), the deaths of racing legends Ayrton Senna and Dale Earnhardt, and so many others.  I witnessed war, murder, suicide, and accident.  A Japanese performance art group in Seattle dangled upside down from a skyscraper and a rope broke, sending one of them headfirst to the sidewalk.  I watched thousands die in the World Trade Center, and a Viet Cong prisoner shot in the head by his captor.  I’ve seen children die of starvation and disease, and soldiers die in battle, and, from helicopter vantage points, live shoot-outs between criminals and police.

It’s fashionable these days to disparage “reality television.”  And it’s been a critic’s commonplace since its inception to identify television as a source of distorted, unreal images of American life.  All that’s legitimate.  But it’s also worth remembering that no medium in history has so consistently confronted consumers with the most basic of realities—the one most of us go to great lengths to avoid facing.  If nothing else, television has been the machine of mortality.