Wednesday, December 17, 2014

Evolution of Television Photo Essay by CBS

This was so relevant to what I've been doing that I just have to post the link to it.  Thanks, CBS.  Evolution of Television

Wednesday, December 10, 2014

Dogies and Deuces

What the hell is a “dogie”?  And why do you have to keep them moving?

Frankie Laine had the answer:  your true love is waiting at the end of the line.  Each week, a young and devastatingly handsome Clint Eastwood and the rest of the Rawhide cast drove “disapproving” cattle through the perils of outlaws, Indians, the wilderness and each other, and provided moral guidance to a generation of American boys (myself, of course, included).  They were ubiquitous:  in 1959, 26 Western series competed for airtime, and eight of the top ten ratings went to Westerns. 
And I can still sing (privately, of course) many of the theme songs, complete with all the lyrics; so powerful was the influence of Westerns in the 50s and 60s that their songs and stories are indelibly embedded in my mind.  I almost bought an expensive new hat the other day mainly because it was a Stetson.

A killer’s business card read “Have Gun Will Travel.”  Johnny Yuma was a rebel (and sounded an awful lot like Johnny Cash!)  I was introduced to classical music (the “William Tell Overture”) by Clayton Moore as The Lone Ranger, and to symphonic splendor by Wagon Train and Bonanza.
And, of course, who was the tall, dark stranger there?  “Maverick is his name.”

These were the male role models of the postwar generation—and a strong link back to the role models of our fathers, as well.  Westerns had, of course, been a staple of “B” movies of the 1920s, 1930s and 1940s (and of radio as well).   For an entire generation who’d witnessed the Depression and impending war, they promoted the virtues of independence, self-reliance, physical power and gunfighting skill—virtues that in real life had withered away along with the frontier that bred them.
   
The sudden emergence of a new medium, television, demanded a vast supply of stories, and demanded it immediately; Hollywood was able to recycle ready-made matinee movies—The Lone Ranger, Gunsmoke, The Cisco Kid—and “C”-list stars like James Arness, William Boyd and Ronald Reagan (who, among other accomplishments, hosted Death Valley Days).  The “Greatest Generation” thus passed along their myths and values to the succeeding baby boomers.

Or so they thought.  But something happened in the transmission; somehow, some subversive force crept in.

Nobody apparently thought much of it at the time, but a surprising number of these “heroes” were former Confederates (a conceit resurrected in Joss Whedon’s late, lamented Firefly):   Johnny Yuma, Rowdy Yates, The Lone Ranger, Paladin, Josh Randall (Steve McQueen’s character in Wanted: Dead or Alive) and, of course, the Maverick brothers.

They were conveniently scrubbed of overt racism (at least as it was understood at the time), but they exemplified plenty of other anti-authoritarian, “rebel” values.  I even learned to play poker from watching Maverick—hardly a prevailing establishment skill.  They drank, smoked, and exhibited an evident contempt for corrupt or inept representatives of “respectable” society, including law enforcement.  Having lost a war, they were perpetual outsiders, true to their own internal codes of conduct but hostile to those imposed upon them.

Just saying: is it just coincidence that in the wake of this wave of Westerns, my generation elected two Californians (one a certified cowboy actor) and two Texans?  Or that, buried deep within my liberal consciousness, plays the soundtrack of rebellion?

Burn the land and boil the sea.  You can’t take the sky from me.

P.S.:  here, courtesy of Wikipedia, is a list of TV Westerns.



Tuesday, November 4, 2014

A Feast For the Eyes

Big slices of turkey (white meat, of course).  Mashed potatoes.  Cornbread stuffing.  And a helping of green peas.  The classic Thanksgiving dinner, the traditional meal of family gatherings around the ancestral dinner table (with the extra leaves inserted for more room and a smaller table nearby for the children.)

Except it wasn't Thanksgiving, and there was no family gathering, nor was there an ancestral table.  In fact, it helped tear families apart and change the dietary habits of Americans for generations.
It was the 1954 Swanson TV Dinner.


Bear with me for a momentary digression.  In 1992, Cecilia Tichi published The Electronic Hearth, an examination of a particular strain of early television advertising.  Her thesis was that television was introduced into the American home by a bit of clever and sustained subterfuge—marketers co-opted a traditional symbol of family and warmth and reassigned it to television:  the colonial hearth.  From before the Revolution, the fireplace hearth had been the center of the home, the source of physical heat and, often, the source of cooked food as well.  It was the focal point of the room, and all who lived there gathered around it for physical and emotional comfort.

But by the 1950s, with central heating, there was little use for the hearth except as a nostalgic entertainment on cold evenings.  Television, its promoters promised, could take its place.  


This substitution overlooked one fundamental problem, though:  television demands to be watched.  The glow from the screen with constantly shifting lights and shadows oozes into our consciousness even if we’re not looking directly at it.  Characters speak dialogue to each other (emerging out of rather tiny, imprecise speakers), and if you care about the program you’re watching, you need to pay attention.  And, since television was a “one-time-only” event in the early days, if you missed a plot point, you missed it forever (or at least until re-run season).

Leave the TV off and engage in a traditional family meal, with conversation and interchange?  Try to treat it as background and pay divided attention, both to it and to each other?  Or give in and eat meals in front of the TV, each in a separate attention bubble?

Swanson and others capitalized upon and encouraged the latter.  The war had inspired incredible advances in television technology (it was basically the same technology as radar) that transformed it from an expensive, pre-war novelty into an affordable device for the masses.  Advances in freezing and food preservation, plus the emergence of industrial agriculture in the wake of the Depression, had made the nationwide distribution of pre-packaged food practical.  Even the aluminum tray in which the food was set benefited from war production of aircraft.  And women, clinging to new-found freedoms and interests outside the home, no longer felt as compelled to spend hours preparing meals; 25 minutes at 425 and it was all ready.  It was the perfect marriage of two emerging postwar mega-industries:  Big Food and Big Telecom.  



But one more element was necessary.  You can’t easily watch television at the same time you’re eating off of a heavy oak table—too much neck craning, turning, and spilling involved.  It’s far more convenient to face the TV directly, with the food at stomach level while sitting.  Thus, another product of wartime advances in aluminum and fiberglass:  the TV tray.


So it was that, growing up, I ate most of my meals in my own room, watching my own programs (The Mickey Mouse Club, of course, Combat with Vic Morrow, The Twilight Zone, etc.) while my parents and sister did the same in other parts of the house.  


So before you complain next time you see a group of friends sitting together, but each immersed in their own smartphone world, remember:  it’s a process that started over 60 years ago.  

Wednesday, October 29, 2014

The Wild Rainiers

Why was Washington one of the first two states to legalize marijuana?  Those like me who came of age with Rainier Beer and Ivar's Acres of Clams commercials understand.






Monday, October 27, 2014

The End of Broadcast Television

There's a terrific analysis of the societal changes since World War II that are dooming broadcast television in the October 25 online Newsweek.  Check it out at http://a.msn.com/r/2/BBb9Lb4?a=1&m=en-us.

Saturday, October 25, 2014

Weather as a Graphic Novel

“You don’t need a weatherman to know which way the wind blows …”
                        Bob Dylan, Subterranean Homesick Blues

            There’s a storm blowing in from the southwest, drawing moisture all the way from the remnants of a hurricane near Hawaii.  It will arrive in my neighborhood in about three hours.  Winds will be blustery, and I can expect at least a half inch of rain this evening.

            I know all of this because I have access, whether on broadcast TV or my computer, to the most sophisticated, high-tech meteorological equipment on the face of the earth.  I have multiple radar stations lighting up my screen with shades of green, yellow, orange and, occasionally and worryingly, red; I have satellite images of cloud patterns far out into the Pacific; I have live weather cameras scattered around the state; I have graphs and charts from Federal agencies and the University of Washington; I have an app from the Weather Channel that tells me, minute by minute, what’s happening and what will ensue in the immediate future.  I have video and GoPro pseudo-experiences of tornadoes, hurricanes and blizzards.  Weather is a certifiable obsession, not just for me, but for many of us who live in the Northwest.  I used to do content analyses of local television news broadcasts for my Mass Media class.  In every broadcast, around 11% of the time was devoted to weather (25% was commercials, another 11% was sports—that doesn’t leave much time for actual “news”).
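            For the curious, here’s a minimal sketch of the kind of tally those content analyses amounted to.  The segment log and category labels below are invented for illustration, not data from any real newscast:

```python
# A minimal sketch of a news-broadcast content analysis.
# The segment log is invented for illustration, not from a real newscast.

segments = [  # (category, minutes)
    ("news", 8.5), ("commercials", 3.0), ("weather", 3.5),
    ("news", 7.0), ("commercials", 4.5), ("sports", 3.5),
]

total_minutes = sum(minutes for _, minutes in segments)

# Add up the minutes spent on each category.
totals = {}
for category, minutes in segments:
    totals[category] = totals.get(category, 0.0) + minutes

# Report each category's share of the broadcast.
for category, minutes in sorted(totals.items()):
    print(f"{category:12s} {minutes / total_minutes:6.1%}")
```

            Run against a real broadcast log, that is all the exercise was:  time the segments with a stopwatch, sort them into categories, and divide.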

            It’s colorful, scientific, entertaining—such weathercasters as Jeff Renner and Steve Pool are local icons—but it’s not fun.

            It used to be.  I grew up with weather presented as a graphic novel.

            Early television news was, quite frankly, pretty boring.  A reconstructed radio or newspaper reporter read copy from a paper script, often with a backdrop of analog clocks set to different time zones.  “Video” was really film clips (usually days out of date) from movie newsreels. The tone was strait-laced and somber, and the coverage (local broadcasts lasted fifteen minutes) superficial.  But television, as early producers were discovering, is fundamentally more than “radio with pictures”—it’s a visual medium.  You have to see things, not hear about them.

            But just how, in a pre-satellite, pre-weather radar, pre-remote video world, was one to show a low-pressure area?

            For KING TV in Seattle, the answer was cartoons.  In 1951, when they were still the only station in the Northwest, they hired a local commercial cartoonist, Bob Hale, and commissioned him to illustrate the weather.  Although his was hardly an effervescent personality, Bob Hale quickly became a star.  He had little scientific or meteorological expertise, but that wasn’t necessary; the U.S. Weather Bureau supplied the forecasts.  Hale’s task was to make them comprehensible to the rest of us, and he did so admirably, with a happy sun (“Old Sol”), grim, glowering clouds and puffy wind gusts.  It was like a high-school magic show every night—a bespectacled, balding, geeky guy (reportedly not infrequently fortified by liquids from the nearby Doghouse bar) with an easel and grease pens.  His segment often drew more viewers than the news itself, and his cheery, simple style inspired some of us to try our hands at cartooning. 

Bob Hale
           
  I was one of them.  At the height of his popularity, Bob Hale offered a correspondence course in cartooning, and I prevailed upon my parents to pay for my tuition.  Each week I’d receive a lesson and send my “homework” back to be evaluated.  Unfortunately, my evaluations revealed that I had little talent and nothing in particular to say in graphic art, and a future career in cartoon weathercasting was dashed. 

            Hale’s wasn’t.  He attained national notice and, in the early ‘60s, was lured away to San Diego, to be replaced by another cartoonist, Bob Cram.   Cram was actually funnier and had a more complex style, and he created a cast of recurring characters like Onshore Flo and Milli Bar to dramatize upcoming events.  For nearly another decade he carried on the cartooning tradition on KING, even after Hale returned for a brief comeback attempt.


            But it was a new age, a Space Age, and new technologies left little room for folksy guys with grease pens.  Sexy weathergirls in front of green screens, yes.  But cartooning weathermen…not so much.  By the early ‘70s, even KING became serious and scientific, and the weather map became increasingly colorful and graphic in a far different way.  Weathercasters increasingly possessed advanced degrees and scientific credentials and actually prepared their own forecasts from a growing array of tools and data.  And weather became a serious, sober subject.

            Is it just coincidence that in this new age we have a flourishing culture of Know-Nothing climate change deniers?  Maybe they’d benefit from a cartoonist to explain it all to them.
           


Friday, October 17, 2014

The Umbilical


            Know what this is?


            If you’re of the television generation, you’ll recognize it immediately:  a Television Signal Enhancement Device (TSED).
 
            The other day, my wife and I contemplated a $4000, 4K, Ultra High-Definition, curved-screen TV.  The image was incredible:  nearly 3D without dork glasses, sharp and precise like the finest studio still photography, color of purest hue … 
 
            Impressive.  Mesmerizing, in fact.  It’s not anything like what we grew up with.
 
            It’s not just that, for nearly a decade and a half, television was black and white.  It’s that, most often, the image was nearly indecipherable and required constant attention and adjustment.  Even in the cities, not far from the transmitters, the picture could be wavy, filled with electronic “snow” or horizontal bars.  It could start “flipping” vertically, out of control.  And the signal was subject to weather conditions or time of day.  Sometimes, we could quite clearly watch programs from KTNT in Tacoma or even KVOS in Bellingham, 85 miles away.  Other times, we couldn’t even pick up KIRO in Seattle itself.  Hence the coathanger, often supplemented with flags of aluminum foil.  It was all part of the delightful and frustrating mystery of television.
 
            Ironically, the solution was already at hand, coincident with the very birth of television in the Pacific Northwest.
 
            Astoria, Oregon, at the mouth of the Columbia River, is the oldest white settlement in the Northwest, dating back to the Fur Trade; it’s an unlikely place for technology revolutions.  But the wife of an engineer (and owner of a small radio station) named Ed Parsons had seen a demonstration of television in 1947 in Chicago, and she wanted her own set.  She must have been very persuasive, because not only was there no broadcaster in the region but, even if there had been, Astoria is behind a wall of mountains that blocked broadcast signals from either Seattle or Portland.  With a population of around 10,000 at the time, it was also far too small to support a TV station of its own. 
 
            In time for the first broadcast from Seattle on Thanksgiving, 1948, Parsons put an antenna on the roof of the Astoria Hotel and strung a coaxial cable from it into his apartment.  The result:  happy wife—and the birth of cable television.  You can read his own account here.
 
            In a way, this genesis was unfortunate, because it created a perceptual block to the potential of cable.  This type of service went by the acronym CATV:  Community Antenna Television.  It was seen as a means to improve what already existed—a (much) better coathanger, if you will.  CATV could bring in distant stations, all right, but only if they were technically within the broadcast range.  The FCC, under pressure from broadcasters and their supporting advertisers, decreed that all local stations HAD to be carried and no distant stations could be offered if they duplicated local programming.  For three decades, this service lay dormant.  Most of us didn’t need it:  coathangers were far cheaper, and we were used to squiggly, snowy images.
 
            But in the mid-seventies, things started changing.  First came Home Box Office (HBO), offering (for an additional price) recent-run, uncut movies.  Then the cable systems started offering WTBS from Atlanta.  Atlanta, Georgia!  Cool!  There was something especially exotic about watching old movies and second-run programs from the opposite coast.  There was even something exotic about watching Canadian news from Vancouver B.C.  And MTV.  Little melodramatic musicals!  Video Killed the Radio Star.  And ESPN.  My wife and I became, for a time, big fans of Australian Rules Football (although I suspect she was mostly drawn to the tight shorts the players wear.)  
 


            And from there it was but a short sprint to The Sopranos and Mad Men and home renovation programs and 24-hour golf and clear, sharp pictures—and 4K, $4000 screens.
 
            And now it’s all changing again.  Just in the past few days, HBO and CBS both announced that they will make their programming available through internet streaming without a cable or satellite subscription.  More and more people, rebelling against mandatory “bundling” of programming and outrageous cable subscription fees, have been “cord-cutting” and relying upon such services as Netflix to enjoy television. 
 
            Cable, if you’ll pardon me, may already be reaching the end of the line.
           


Tuesday, September 30, 2014

Going GoPro



It’s all a matter of perspective.

Despite the promise of the ad above, the television I grew up with was a third-person medium. In drama and comedy, the preferred shot was a medium-length two-shot, because the camera was bulky and mounted on a rolling tripod. For sports, the distance was even greater; cameras were often mounted high up, far from the playing field, using long telephoto lenses. The viewer was, in fact, a spectator, and even more isolated from the action than if he or she were present at the event. At least at the event one can choose where to look; on television, one saw only what the director chose.

From this point of view, we saw many spectacular things, of course. One of my earliest experiences occurred in 1955, during qualifying for the Gold Cup hydroplane race in Seattle.


The pride of Seattle, Slo-Mo-Shun V, was near the end of its qualifying run when its bow lifted, floated higher, soared into the sky, and then crashed back down onto the concrete-hard water.  Spectacular, unforgettable … but distant, out “there.”  We could never know what driver Lou Fageol felt.  In some ways, we might as well have been reading about it.

So it was with most sports; viewers were “outside” the action, distanced and divorced from the players on the field or racecourse. No matter how much we might identify with a jersey, no matter how loudly we might shout at the screen during a touchdown or a homerun, television kept us away.

No longer. Television technology has evolved along a steady path taking us from passivity to participation. Want to know what driving an unlimited hydroplane feels like? Watch this:



 From first-person shooter gaming to soldiers in combat, we’ve transformed our experience into something more closely resembling real life. Instead of being told about the experience by a narrator, we’re, as much as possible, sharing it (without the sometimes painful consequences). And instead of consumers of video, we have increasingly become producers of it, documenting our lives as they happen in a way never before possible.

And what’s really remarkable is how quickly this has all happened and how quickly we’ve internalized it as “normal”. YouTube was only founded in early 2005, nine years ago. GoPro, which manufactures the cameras that have made first-person video so popular, was founded in 2002. The kinds of stunning action sequences that were once available only to moviemakers like Steven Spielberg, sequences so technically revolutionary at the time (like the opening of Saving Private Ryan in 1998), are now available to us all. In terms of living vicariously, of “participating” in lives and events we could never know in “real” life, television has finally, after some 500 years, pulled even with reading—maybe even ahead.

Friday, September 12, 2014

It Was (Is?) A White, White World

            When I first encountered television, the “people” who populated it were surprisingly diverse (kinda).  Yes, they were stereotypes, but at least characters from different races, ethnicities and classes were present.  Amos and Andy made a brief transition from radio (and demonstrated that blatant stereotypes that had been relatively inoffensive on radio couldn’t survive the visibility of television), and Eddie “Rochester” Anderson was the comic foil to Jack Benny (again, using a time-worn vaudeville stereotype, but at least visible).  The Cisco Kid chased outlaws in the Old West.  Ricky Ricardo spoke English with a Cuban accent and even, as her on-air pregnancy made apparent, slept with red-haired Lucy.  The Lone Ranger had Tonto to back him up, even if Tonto did speak an invented “Indian” language.  I Remember Mama celebrated an immigrant Norwegian family, and, of course, The Honeymooners honored the work of a bus driver and a sewer maintenance worker.  The characters were exaggerated and irredeemably offensive to modern-day audiences—but they at least gave an impression of an America that was composed of many streams.
  
            Then it virtually disappeared.  The screen went white.  It matched the rest of my world.

            Seattle in the 1950s was a segregated city, not by law but by bank policy and social practice. Before the war the population had been overwhelmingly Nordic.  While wartime airplane production at Boeing had brought an influx of African-American workers, primarily from the South, bank redlining and restrictive covenants had created the “CD” (Central District); the Lake Washington Ship Canal created a northern boundary as effective as a moat.  The Lake City neighborhood I grew up in remained entirely white—with new homes for white Boeing workers entering the middle class. 

            Radio had been relatively cheap and accessible to all segments of American culture.  But television was expensive, and advertisers wanted to attract those with the money.  So the dramatic and comedic world of television quickly came to look like my community—not the one south of the Ship Canal.  And, symbiotically, my mental universe came to look increasingly like the world of television.

            Starting in the late 1960s, a Hungarian immigrant named Dr. George Gerbner began formulating a theory of media influence called cultivation.  I think it explains a lot.

            Gerbner started with an observation.  We have historically learned about ourselves through stories.  Who are the good people?  Who are the bad?  How’d we come to be here?  What’s expected of us?  What happens if we make bad decisions?  How, in short, are we supposed to behave as decent, civilized human beings?  Yes, we can observe those around us, and we do—but our personal experience is limited, often extremely limited, and distorted.  Storytellers supplement our experience.  They shape our view of the world, both within and without our village.

            Storytellers captivated us around the campfires and the Homeric halls.  They told the tales of prophets and the exploits of gods.  They wrote the scripts that were acted out on the Elizabethan stages.  And they, in the guise of scholars, wrote our histories.

            Gerbner realized that, in America in the 1960s, the storytellers were on television, and they were promoting the myths and values of the corporate titans who fed and sustained them. 

            The measure of a story’s value was the number of people (or, more precisely, the number of the “right kind” of people) who listened to it, and to maximize that, the storytellers told stories that comforted rather than challenged.  They created a televised world of stability, predictability, and familiarity—a mirror that showed viewers not who and what they were, but rather who and what they thought they were. 

            It was a world of befuddled white men who conversed with palomino horses.  White patriarchs ran households of sons from My Three Sons to Bonanza.  White men tamed the frontiers, caught the criminals and solved the crimes, and lived with sexy witches and genies.

            The problem with this world wasn’t just that it told patriarchal stories; it was that those stories were ubiquitous, across every genre, available at all hours.  In Nazi Germany, Joseph Goebbels had demonstrated, with his “big lie” theory, that even the most preposterous tales could be accepted as truth if they were repeated often enough and allowed to go unchallenged by conflicting stories.  Something of the sort happened with American television.

            Gerbner called the phenomenon “cultivation.”  An individual stereotype here or there, he discovered, was unlikely to change most peoples’ perceptions.  But a relentless barrage of such stereotypes, day after day, could eventually come to be more “real” than reality.  Especially for heavy television viewers, the television world could become the real world, especially the more it reinforced (or “cultivated”) pre-existing attitudes.

            So, for a child growing up in an all-white, middle-class suburb populated by commuting dads and stay-at-home moms, a televised world of white, middle-class commuting dads and stay-at-home moms became the norm, the default.  In the world of television, those who conformed to this norm succeeded; those who did not either failed or, if their racial, ethnic or sexual characteristics made it clear that they could never fit in, simply disappeared.

            For over three decades, Gerbner and his research assistants did massive studies of the characters who populated the television world, watching hundreds of hours of entertainment programs and methodically cataloguing the demographics of the casts.  The numbers alone were startling enough, and they remained remarkably stable over the years.  For every woman character, there were three men.  And women didn’t age as well as men, either; as they got older, they became evil—or they disappeared altogether.  Poor people disappeared, too, appearing only about one percent of the time, and then most likely as criminals.  African-Americans appeared in roughly the same proportion as their actual number in the “real” America—but almost always as secondary characters, seldom as leaders or successful professionals.  Asian-Americans and Native Americans were virtually invisible.

            Professor Gerbner died in 2005, and his studies would seem to be out of date.  Our television screens seem to be full, now, of successful people of color and women, even characters who are identifiably gay.  Mad Men has revealed to millions how artificial and stilted that image was.  We even have a black President.  The “default” has surely changed, hasn’t it?  

            It’s a work in progress.  Two years ago, Cheerios produced a commercial featuring an unbelievably cute little girl asking her mom if it were true that Cheerios were good for the heart.  Assured that they were, she then covered her sleeping dad with them.  It was a warm and funny family moment.


            The mom is white.  The dad is black.  The reaction, from some quarters, was vicious.  Little Gracie’s family may be reality, but for many, it still hasn’t displaced 50 years of television “reality.”

Saturday, September 6, 2014

The Remote Revolution

            On the last day of February, 1983, CBS broadcast the final two-hour episode of M*A*S*H.  The doleful guitar, the opening strains of “Suicide Is Painless”, the Bell helicopter landing in a dusty field—we gathered around to see what would become of Hawkeye, Margaret, Radar, B.J., and the rest of a company of battlefield surgeons we’d come to know over the last decade.  And America, or at least a good portion of it, came to a stop.

            That episode, “Goodbye, Farewell and Amen”, was seen by over 125 million people—all at the same time.  Its rating was 60.2, which means that over 60% of all televisions in the country were tuned to it.  Even more astounding was its share, the percentage of televisions actually in use at the time that were tuned to it; that was 77.  For two hours that night, three-quarters of all television viewers shared the same experience.
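            For anyone curious about the arithmetic behind those two numbers, here is a rough sketch of how a rating differs from a share.  The household counts below are illustrative assumptions rather than Nielsen’s actual 1983 figures:

```python
# Rough sketch of the "rating" vs. "share" arithmetic.
# All three household counts are assumptions for illustration, not real 1983 data.

total_tv_households = 83_300_000      # assumed: every U.S. household owning a TV set
households_with_tv_on = 65_000_000    # assumed: sets actually in use that night
households_watching = 50_150_000      # assumed: sets tuned to the finale

rating = households_watching / total_tv_households * 100   # % of ALL TV households
share = households_watching / households_with_tv_on * 100  # % of sets IN USE

print(f"Rating: {rating:.1f}")  # about 60.2 with these assumed numbers
print(f"Share:  {share:.1f}")   # about 77 with these assumed numbers
```

            The point of the distinction:  a rating measures reach against everyone who owns a set, while a share measures it only against the people actually watching something at that hour, which is why the share is always the larger number.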

            For years, stories had been told of streets being emptied of cars and crime rates plummeting during such program finales.  One popular report (probably an urban myth) said that water consumption in major cities spiked enormously when people rushed to the bathroom during commercials.  But this was the high-point of the phenomenon.  Huge audiences still watch certain programs—the Super Bowl for one obvious example—but television no longer has the power to hold us in thrall, to command an entire culture, or at least a major portion of it, to sit down and watch when network executives dictate.

            It’s ironic.  We viewers overthrew the corporate titans by adopting a technology that helps us be lazy:  the remote control.


            Oddly enough, M*A*S*H, the program that eventually earned the largest audience in television history, was almost cancelled for poor ratings its first season.  It was saved by being moved in the network schedule to follow the highly popular All in the Family, where it finally took off on its own.  The strategy was common in network television at the time (and still is today):  pair a ratings success with a new or struggling program to follow it.  The tactic was rooted in a simple assumption:  most people would rather sit passively and watch whatever comes next rather than get up, walk a few steps, and manually change the channel.  It worked.

            It also worked for the economics of network television.  Blocks of commercials could precede and follow programs and, of course, interrupt them at predictable intervals.  The audience would wait patiently and absorb the commercials’ messages.  It was an efficient, tidy scheme, all based on passivity.

            The very first remote control, a wired device marketed by Zenith in 1950, was called the “Lazy Bones”.  A classic example of the Law of Unintended Consequences:  the remote was conceived as a device to enhance our passivity.  In practice, it had a startlingly different result, one whose consequences are still reverberating. 

            Look at the ad above.  One of the advertised benefits of “Flash-matic” is that it allows the viewer to mute “long, annoying commercials.”  It also, of course, allows the viewer to change the channel without getting up.  Sounds simple enough—but it was the storming of the network’s Bastille.

            If viewers don’t have to listen to commercials, then what’s the point of paying for them?  And if viewers have more choice over what to watch (disregarding the fact that they always had the choice to watch nothing at all), then what’s the point of devising elaborate schedules?  This wasn’t a major problem in the 1950s, when there were only three networks and a handful of local stations to choose from and all the networks ran commercials at the same time.  But what if … ?

            What if, instead of merely flipping channels and muting commercials, viewers could flip time itself?  What if they could decide when to watch a particular program?  What if they could alter the speed of commercials so that they could speed through them and not even have to watch silent images?  What if they could pause to go to the bathroom or make a meal or go back and watch something they’d missed?  Again, a new technology and the law of unintended consequences raised—and answered—these questions.

            Video tape was, in 1956, a powerful problem-solver for national networks; it allowed them to avoid repeating programs for different time zones and to store them much more easily than with the film they’d used before.  But the machines were complicated and expensive, far too much for home users.  And besides, who’d want to record TV programs, anyway?  That would take even more effort than standing up to change the channel.  Maybe the movie studios could sell some old movies to play on them … that appeared to be the best future use.

            Yes, a few companies, including Sony and RCA, sold a few videotape machines for home users in the 1960s.  But they had limited capacities and reel-to-reel mechanisms; only the geekiest were attracted by them. As a community organizer in the early 1970s, I used one, with a black and white camera, but it was bulky, complicated, impossible to edit, and more of an annoyance than a useful tool.

            Sony, though, struck again in 1975 with the first Betamax, using a cassette tape that was easy to insert and eject.  And  everything changed.  We at home could actually use these things, even though the “flashing 12:00” (from the user’s bafflement over how to program it to record) became iconic.  With competition and the VHS format came affordable prices.  With a reason to own one came a consumer revolution.

            We still have at home boxes full of old VHS tapes:  tapes of movies recorded off the air, complete with commercials.  Tapes of all available early episodes of “Dr. Who” (which my wife discovered on late-night TV after getting home from a night job).  Tapes of sequential episodes of favorite programs, which could have been the material for primordial binge watching.  And, yes, pre-recorded tapes of movies.  Granted, we were a bit atypical perhaps (OK, obsessive)—but we weren’t alone.  And the corporate giants finally recognized the monster they’d created and went all the way to the Supreme Court to try to kill it.  Why, recording programs was theft!  They owned television, not us!  Jack Valenti, representing the film industry, put it this way, testifying before Congress:  "I say to you that the VCR is to the American film producer and the American public as the Boston strangler is to the woman home alone."

            They lost; we won the right to watch whatever they offered, whenever and however we wanted.  And, with the growth of cable in the 1980s (which the networks also did everything in their power to stifle), we had far more to choose from as well.


            Think of it as the Berlin Wall of broadcasting.  With the “Flash-matic” and the Betamax, the crumbling of corporate power began.

Wednesday, September 3, 2014

Zero Dockus ...

Zero dockus, mucho crockus, hallabolooza bub,
That’s the secret password that we use down at the club.

            Almost every afternoon after I walked home from school I’d rush to my bedroom, turn the portable TV knob on, wait for a tiny point of light to expand into a full-screen image, and sing along with a local icon, Stan Boreson.  And then I’d settle in for an afternoon of stupid puns, parody songs, old movies and ethnic stereotyping. 

            In the 1950s, before the advent of telecommunication satellites and continental microwave relays, much of television was local.  KING, the first and, for four years, only station on the air in Seattle, was, at that time, an ABC affiliate, but that didn’t mean an awful lot.  Network programming had to be physically delivered by messenger from the production centers in New York and Hollywood, and ABC itself was a struggling, nearly bankrupt network without much to offer, at least until Walt Disney got involved.

            With a growing demand for more hours of programming, stations like KING enlisted local performers to host daytime chat shows for housewives and “educational” variety shows for children.  The sets were cardboard (on the same level of sophistication as 1920s German Expressionist films or early Dr. Who), and the performers came cheap—and the audience, entranced by the very existence of television, wasn’t very critical.  By the mid-fifties, almost all stations had hours of programming for children:  Wunda Wunda, a woman in a clown get-up who read stories; Captain Puget, a seafarer who showed old movies and told stories; Brakeman Bill, a railroad engineer who showed old movies and told stories; and, at the top, J.P. Patches, an improvisational clown, and King’s Klubhouse with Stan Boreson, a former radio performer of some renown.

            There were others as well, some supplied by the networks:  Kukla, Fran and Ollie, Howdy Doody,  Shari Lewis and Lamb Chop, and, of course, Mickey Mouse Club.  But, at least in the first years, it was the locals who “hooked” us on television, who gave us hours of entertainment, and who implanted corny skits and silly songs we can still recite verbatim after half a century.  And they gave us more:  these were the creators of an identity—the postwar, baby boom generation that first experienced the world through a medium totally foreign to anything their parents had known.  And they taught many of us what it meant to come from the Pacific Northwest.

            Stan Boreson, in particular, created the “Scandahoovian.”  Seattle, before the war, had been heavily dominated by immigrants from, primarily, Sweden and Norway, and Boreson developed an exaggerated Scandinavian persona, complete with thick accent, ever-present accordion for accompaniment, and a plethora of “Scandahoovian” folk songs like The Lutefisk Song, Catch a Pickled Herring, I Just Go Nuts at Christmas, and Valking in My Vinter Undervear.  Even for those of us who were not Scandinavian, the Northwest became a place of immigrants who spoke something other than Harvard English.  His constant companion was a nearly inert basset hound, No-Mo (even that name, a play on the Unlimited Hydroplane Slo-Mo-Shun IV, had regional meaning).  His humor was broad, irreverent but gentle, and full of puns, and there was utterly no didactic content.

            Even with cardboard and plywood sets, it couldn’t last forever, and Boreson went off the air in 1967, replaced by nationally syndicated and network shows and Hanna-Barbera cartoons.  So, too, disappeared all of the local contemporaries, including, last of all, J.P. Patches.  There was still children’s programming, of course (at least while the F.C.C. still required it of licensees), but it was slicker, and the identity more urban and national.  As much as Sesame Street and Mr. Rogers’ Neighborhood have contributed, they could never have given a Seattle kid as strong a sense of regional identity as Stan Boreson and No-Mo did.

            And zero dockus, mucho crockus, hallabolloza ban,
Means now you are a member of King’s TV club with Stan.

           



Saturday, August 23, 2014

Glum and Twinkley

I went to college to become David Brinkley.

Television news evolved from the Movietone newsreel, beginning with John Cameron Swayze’s Camel Newsreel Theater in 1948.  All the world’s news in less than fifteen minutes, delivered by a histrionic announcer nearly out of breath with excitement and featuring, whenever possible, animals and women in bathing suits. It was “news” as spectacle, something distant and, ultimately, irrelevant—certainly not to be interpreted or understood.

But Swayze had a powerful rival, one of the “Murrow gang” at CBS (and, of course, there was Saint Edward R. himself taking down McCarthy):  Douglas Edwards.  The tone of CBS was serious and sober, intellectual and erudite.  “Newsreel” news started sounding as dated as Dixieland jazz in a bebop era.  And people were beginning to pay attention to television news.  An astounding number of people, in fact; within a decade, network television had become the primary news source of most Americans and newspapers had already begun their long decline.

How do you compete with a living legend and his band of apostles, people like Douglas Edwards, Eric Sevareid, Mike Wallace and an up-and-comer named Walter Cronkite?  NBC had a news broadcaster from the same mold named Chet Huntley, serious, credible and capable.  But where was the difference that would vault NBC’s newscast into the ratings lead?

Why not a team?  Comedy teams—Burns & Allen, Martin & Lewis, Laurel & Hardy—had been a mainstay of radio and translated well into early television, so why not try the formula with news?  Chet Huntley would be the perfect Murrow-esque straight man, and chubby-cheeked David Brinkley the wisecracking relief.
Brinkley injected irony into the news.  He was a solid journalist; his credentials were impeccable, so there was no question of his credibility.  But through his tone and occasional sardonic comments (like saying that Senate Minority Leader Everett Dirksen appeared to comb his hair with an eggbeater), he relieved the impact of the events he reported.  After an initial drop, the ratings started climbing, ultimately surpassing CBS’s.  A real comedy team rewrote the song Love and Marriage in their “honor”: 
Huntley Brinkley, Huntley Brinkley, one is glum and the other’s twinkly …

Throughout the crises of the early ‘60s, as I was coming of age—the Selma March and the Freedom Riders, the near-nuclear war of the Cuban Missile Crisis, the Kennedy Assassination—they were my guides, especially Brinkley.  There was something about his sense of irony that was reassuring; the events were terrible, yes, and of great import—but we’d survive them.  Don’t take it all too seriously; that was Chet’s job.  When I majored in broadcast journalism at the University of Washington, it was because, primarily, of David Brinkley.

And that, I realize now, was the beginning of the most important change television brought about regarding news.  The messenger became as important as, if not more important than, the message.  Walter Cronkite became “the most trusted man in America” according to repeated polls, a televised personality with the authority of a revered spiritual leader.  Huntley retired in 1970 and faded from the scene rapidly, but Brinkley went on for decades more, always a star, always an authoritative entertainer (or perhaps an entertaining authority).  Through Watergate and the Reagan revolution and the Gulf War, he explained and reassured—and he provided the model of the personality journalist.

When cable news became a force in the 1990s, it wasn’t to the Murrow tradition that CNN and Fox and MSNBC turned; it was to the Brinkley manner.  He’d probably appreciate the irony that his legacy has become the Bill O’Reillys and Rachel Maddows and Anderson Coopers. 


Tuesday, August 19, 2014

This Just In

Happy Birthday Philo T. Farnsworth!

The Boobs Tube

Who owns the air?

Despite its simplicity, this is a question that many of my students got wrong.  Whenever I’d ask it in my Mass Media class, a few brave souls would confidently call out “the government.”

Wrong.  It’s mine.  Mine, mine, mine.

The problem (or the virtue, depending on one’s perspective) is that it’s also yours.  The air, or more accurately the electromagnetic spectrum that passes through it, is owned by the American public (at least within the United States.)  If anybody wants to use it for private purposes, they have to have my permission. 

Yours, too. 

And everybody else’s.

It wasn’t always that way.  In the infancy of radio, anybody with the simplest of equipment could start broadcasting anything they wanted.  After all, the First Amendment guarantees freedom of speech and of the press, doesn’t it?  We’re a free people, aren’t we?  So all across the country aspiring broadcasters—bakers, car dealers, entertainers, charlatans, religious leaders, even kids in their basements with tubes affixed to mom’s breadboard—started filling the airwaves.

And, quite quickly, they made an unfortunate discovery:  the electromagnetic spectrum, and particularly the part of it devoted to radio and television signals, is a severely limited resource.  There’s only a certain amount of “water” that can fit through this “hose.”  Signals started crossing.  Those with greater power drowned out those with less.  Distant stations interfered with local ones broadcasting on the same frequency.  Everybody was talking at the same time. The entire medium faced a collapse into chaos.

Somebody needed to be a traffic cop, and since radio waves don’t respect local boundaries, that cop had to be the federal government.  First in 1927, as the Federal Radio Commission, and with expanded powers in 1934 as the Federal Communications Commission, that regulator was established by Congress.  And broadcasting, the most powerful medium in human history, the most influential force in American life of the 20th Century, has been censored. (Yes, I know:  even the FCC claims that they lack “censorship” power, and technically that’s true; nobody is jailed for violating their regulations.  But I’m thinking of “censorship” in a broader sense, as I’ll try to explain below.)

“Wait,” you’re thinking about now.  “Where are the boobs you promised?  Why are we getting a lecture on media law?”  Don’t worry:  here they, or at least one of them, are.

During the halftime show of the Super Bowl in 2004, singers Janet Jackson and Justin Timberlake “accidentally” exposed, for a microsecond, one of her breasts.  Writer Marin Cogan analyzes the “event” and its impact in far more detail than I could, but the upshot is that this one microsecond of partial nudity exposed (pun intended) the glaring hypocrisy of television regulation.  Led by an organized campaign, half a million people complained to the FCC for this violation of “decency” standards, and the FCC fined 20 Viacom-owned stations $550,000 (later reversed by the courts).

Just the next month, viewers so inclined could have watched Season 5 of The Sopranos and enjoyed multiple blood-spattered murders, graphic sex and nudity (one frequent setting was a strip club) and profanity-saturated dialogue, as well as a story line which, some might argue, made mobsters into sympathetic characters.  The FCC acted on no complaints about The Sopranos; no fines were imposed or even contemplated.

The difference?  The Sopranos was produced for Home Box Office (HBO), carried by cable and satellite and available only to those willing to pay a rather hefty premium for it.  HBO did not need to seek public permission to use the medium, and the FCC has no jurisdiction over it.

And here’s the paradox of television:  programming (or at least some of it) is much better now than ever before, of a quality that often exceeds even the best movies.  Such series as The Sopranos, The Wire, House of Cards, Mad Men, etc. exemplify the most complex and challenging writing, acting and production available in American culture today.  And yet they’re only possible because television has transformed from a democratic medium to one for the elite.  Ironically, the First Amendment now protects programming that is not available to large numbers of Americans.

Broadcast television, using the public airwaves, was democratic; it was required by law to offer something for everybody—all the owners of the airwaves.  Of course, it never really did this; it always catered to a white, middle-class audience, the target of mass advertisers.  In fact, this kind of “democracy” actually led to a kind of tyranny—the tyranny of the conventional.  Since programming was ratings-driven, television was dictated by majority rule—and the majority, in this case, wasn’t just ethnic, but also economic.  The result was a fairy-tale view of American life, a fantasy of family and manners that never existed, and never could have.  Lucy and Ricky slept in separate beds, even when she was pregnant (a word she couldn’t use).  Captain Kirk and Lieutenant Uhura exchanged an interracial kiss—but only under compulsion by aliens.  Western heroes contested hundreds of gunfights, but the bullets never drew blood from the bodies they punctured. And the language television characters spoke was decorous and “moral”, as far removed from real American English as any Shakespeare production.  Good triumphed and evil was punished, and even the news edited out graphic content to avoid disturbing complacent viewers.  And half-time shows featured marching bands and clean-cut, “Up With People”-style choral groups.  The standard for decades, agreed upon by the FCC and the broadcasters themselves, was the “least common denominator”; that programming is best which offends the fewest.

Pay television has radically changed that.  Some might argue that The Sopranos is no more “realistic”, in the long run, than Gunsmoke; fair enough.  Any dramatic structure with a beginning, a middle and an end is going to distort the chaos of real life.  But at least there’s more variety, more complexity of characterization and plot development, more willingness to explore uncomfortable issues.  And, yes, more boobs.
But what of a medium for a democracy?  What are the consequences of segregating the best programming, the best sporting events, the concerts and topical comedy, behind an increasingly burdensome pay wall?  What is the morality of leaving the “leftovers” for the poor or those without cable or satellite service while the elite get to enjoy the “good stuff”?  Granted, increasingly such programming is available directly from internet sources such as Netflix or Hulu.  But these still require disposable income—lots of it. 

We used to be able, at least, to discuss the latest episode of M*A*S*H the next morning at work.  Have we lost even that basic bond?


Monday, August 18, 2014

King of the Wild Suburbs

Yes.  I had a coonskin cap.

I also had a Davy Crockett toy flintlock pistol, and bubblegum cards with Fess Parker’s and Buddy Ebsen’s images on them, and an imitation leather jacket with fringes.  I scouted the wild vacant lots of Seattle’s Lake City neighborhood seeking a bear to shoot, and I watched all five episodes of the series, possibly the first made-for-TV mini-series, as well as the 1955 movie version compiled from it.

I was, in short, the target demographic for one of the earliest, and most successful, marketing campaigns of the television era.

Davy Crockett was the 1950s equivalent of Star Wars.  It hadn’t been heavily promoted; the first three episodes were intended for Disneyland’s “Frontierland” segment, which alternated with other themes from sections of the park.  The stars, Fess Parker as Davy and Buddy Ebsen as George, weren’t well-known (this was well before Ebsen’s Beverly Hillbillies days.)  Although technically a Western, which was still a popular genre, the period (the early 19th Century) and the locales (Tennessee, Washington D.C. and Texas) weren’t exactly Utah’s Monument Valley. 

But something in it appealed to the zeitgeist, and the new medium of television spread it like a virus.  Across the country, boys like me started showing up in elementary school wearing dyed rabbit-fur “coonskin” caps (over 100 million dollars’ worth were reportedly sold) and playing Frontiersman and Indian.  Nearly sixty years later, the theme song (“Born on a mountaintop in Tennessee/Greenest land in the land of the free …”) can still be a powerful earworm (in fact, I just did it to myself).  The only other theme with that effect is “Gilligan’s Island” (Oh, shit.)

The real Davy Crockett was a soldier, land speculator, politician, and adventurer.  Today, his politics would place him in the Tea Party:  he served in Congress as an advocate for tax relief for low-income farmers in his state and died fighting for Texas’ secession (although in that case from Mexico.)  He was a shameless self-promoter who wrote his own mythic autobiography. And his death at The Alamo was the stuff of legend. To his credit, he DID lose his seat in Congress for his lonely opposition to the Indian Removal Act that initiated the Trail of Tears. The TV show missed no opportunity to voice his famous maxim:  “Be sure you’re right, then go ahead.”  In short, he was the epitome of American exceptionalism.  For a generation, the first TV generation, Fess Parker’s version of him—witty, resourceful, personally courageous and independent—provided a powerful role model.  Recently, I was considering a visit to Santa Barbara and scouting hotels, and such was the lingering influence that I was immediately drawn to the Fess Parker Hotel.

Yesterday was Davy Crockett’s 228th birthday, and we’re approaching the 60th anniversary of the broadcast of the first episode.  Today, we’re awash in HBO and Showtime series, Marvel Comics movies, and product tie-in merchandise.  But this, I’d argue, is where all that started. 

I wonder if Amazon.com sells coonskin caps?


Saturday, August 16, 2014

A Home With Five Windows



There was, of course, one in the living room, with all the furniture arranged around it.  There was one in my bedroom.  Mom and Dad had one in their bedroom, on a shelf near the ceiling so they could see it easily in bed.  There was one in the kitchen.  And, for awhile at least, there was one in the bathroom.

I grew up in a home with five television sets.
   
Television was supposed to strengthen the American family.  Hell, visionaries like Marshall McLuhan even believed it would organize the worldwide human family into one big, harmonious “global village.”  Television advertisements projected an image of social bonding—Father, Mother, two happy children (all white and middle class, of course) happily sharing high-quality drama, or cocktail parties with all the friends and neighbors (Mother in an evening dress serving drinks). 

Cecilia Tichi, in her 1992 book The Electronic Hearth, showed how television manufacturers consciously co-opted the imagery of the colonial fireplace hearth and transferred the warmth and fellow-feeling associated with it to this new technology.  Television was to keep the children home; it would even keep (or get) the husband home to be attended to by his loving wife.  And at the same time, it would be, in a phrase popular at the time, a “window on the world”, bringing news and sports from far away straight into the family room for all to experience.  We’d even eat our TV dinners together off aluminum and plastic TV trays!

Much of that did indeed happen.  I was “nourished” by my share of TV dinners and turkey pot pies.  But increasingly, I ate them in my own room, on my own TV tray, watching my own programs on my own television set.  And so did many others.  The great unifier proved, in fact, to be one of the great dividers.

One of the most important trends in current media is fragmentation and the dissolution of the mass audience.  Marketers have learned to target narrow demographic audiences with ever-increasing precision so that they don’t waste any effort trying to reach people who aren’t already inclined to buy, and this has driven most media, but especially television, to tailor offerings to niche viewers.  Cable and satellite viewers can spend 24 hours watching golf or cooking or home improvement or news and entertainment in a variety of other languages.  Five hundred channels and plenty on, for every conceivable (at least legal) taste and inclination.  And one result is that, as a society, we are fractured and polarized as never before during the Age of Media.  We simply have less and less to talk to each other about—and when we do talk, we have increasingly polarized points of view.

Despite the sexist, racist and classist images used to promote it, the electronic hearth wasn’t such a bad idea.  But as soon as it became possible to buy cheap sets, it became an impossible dream.  We could all gaze out of our own windows on our own worlds.
The TV I had in my bedroom.



Thursday, August 14, 2014

The Candle and the Kid

Bright, sunny Florida:  the rocket jutted from the launch pad like a saber unsheathed for battle.  In a way, that’s exactly what it was:  a slender, three-stage saber carrying a basketball-sized message to the Soviet Union.  Although it was developed and operated by the U.S. Navy, the rocket, Vanguard, was ostensibly on a peaceful scientific exploration (“… to seek out strange new worlds and civilizations, to boldly go …”) as part of International Geophysical Year.  But we all knew the real mission:  to show the Russkies, and the rest of the world, that American ingenuity (actually, the ingenuity of captured Germans) could create the first artificial moon.
For the first time, Americans watching on live television could hear a countdown and feel the rising tension as it approached zero.  Wisps of condensing vapor floated off the silver-and-white shell and finally, as the countdown ended, roiling clouds of smoke and jets of flame shot out of the bottom.  Vanguard paused, the clouds billowed, and it started to move slowly upward with tremendous grace and power.
And then, a few feet off the ground, it sank back down and exploded.
It was humiliating.  Even climbing into the sky and exploding there would have been better than this (which is actually what happened with the second Vanguard attempt.)  Even more humiliating:  it was to have been the first, but the Russians had already stunned the world by launching their Sputnik two months before.  Not only couldn’t we beat them; we couldn’t even match them!  And their space adventures were launched in secrecy, while we made our blunders in full televised view.
It probably altered my entire future.  The American reaction, after the initial panic, was to pour resources into a new emphasis on science and math education.  Who needed literature and art any longer?  The Russians could launch satellites!  I suddenly found myself in a new, experimental math class, so new that the textbooks were bound in construction paper and printed, it seemed, with a mimeograph.  I was baffled by it; if it hadn’t been for Rich, who was to become a lifelong best friend and who let me copy his answers on exercises, I would have failed.  SMSG Math made it clear that I had no future in science or technology, however urgent those fields might have been.  I ended up an English major.
Then the Russians orbited a dog!  True, it died up there after barking and whimpering its way around a few orbits because they had no way to bring it back, but still … Clearly, a manned flight would come soon, and we couldn’t even launch a damn metal grapefruit.
The Army orbited its own satellite, Explorer 1, and the Navy finally got a Vanguard up (it’s still there today), but the resources and talent went to the new civilian agency, NASA, and sending a man into space first was a national imperative.  By 1961, the Mercury program and its first seven astronauts were ready.  The plan was to send Alan Shepard essentially straight up on the nose of an Army Redstone missile and then let him fall straight back to earth:  fifteen minutes of a roller-coaster ride.  The launch was scheduled for May 2, 1961.
The Commies did it again.  April 12:  Yuri Gagarin not only went into space, but orbited the earth.  In a special affront to his American rivals, he reported that he “didn’t see any angels.”
So NASA had to aim for second place, and with a different set of sub-orbital rules.  But it was still a matter of national pride and high drama.  After all:  Russian rockets worked, and American ones seemed to explode.  There was a certain macabre fascination to watching Shepard emerge from the ready room.
Actually, it got progressively more macabre, because the first two times he climbed into the Mercury capsule, the countdown was cancelled by bad weather after all the build-up. The omens didn’t look good. 
Finally, on May 5, the weather report looked satisfactory.  Because the launch was to occur shortly after dawn in Florida, we West Coasters had to be up and watching at about 4:00 a.m.  It was a heady experience for a 13-year-old on a school day.  After more delays, the countdown reached its climax at 6:30 Pacific Time and Shepard, with characteristic American bravado, told Mission Control to “light this candle.”  Thanks to television and the emerging national networks, an entire nation was able to share the event as it happened.  Thanks to the advent of videotape, you still can, so instead of trying to describe it, let me direct you to NBC’s coverage of the first American manned mission.





Wednesday, August 13, 2014

My (Short) Career in Television

            It’s hard to believe in this time of massive corporate conglomeration, but television once was an integral part of the local community, just as radio had been before it.  KING was owned by an heir to the Stimson lumber fortune, Dorothy Stimson Bullitt.  KOMO, the second station on the air, was an enterprise of a local flour mill, Fisher.  KTNT, the first CBS affiliate, was started by the Tacoma News Tribune.  And the first non-commercial station in the Northwest, KCTS, was the home of my brief career in television production (KCTS even reflected these local roots in its name—it stood for “Community Television Service”).
            That, of course, meant that it was perpetually broke.  KCTS started in 1954 with the organizational support of the University of Washington and donated (i.e., already-obsolete) cameras and equipment from Mrs. Bullitt.  University students staffed the operation most of the time, but occasionally community volunteers did some of the work.  That’s where I come in.
            In the late fifties and early sixties, I was a Boy Scout, and, when I hit my teens, an Explorer.  Other scouts went camping, canoeing, hiking … they learned how to whittle and put up shelters and identify edible plants and all kinds of cool quasi-military stuff.
            Mine volunteered at KCTS.  I was a teen-aged camera operator.
            It was a decidedly low-budget operation.  Virtually all the programming was local and, frankly, talky and less-than-compelling.  The studio smelled almost like a lumber yard because the sets were plywood and 2x4s.  The cameras smelled electronically hot; transistors were still cutting-edge, and these cameras used tubes.  They were bulky and heavy, mounted on dollies.  Operating them required a delicate balance between stasis and movement:  the goal was, of course, to be invisible to the viewer, so the camera operator had to refrain from any motion and never sneeze or cough.  But the image falling on the vidicon tube had to change frequently, or it would permanently “burn” into the tube’s face.  It was the director’s job to switch between the two cameras often enough to let each operator reframe the shot.  All broadcasting was, of course, in black and white.
            So every week, my small troop of Explorers would go to the University of Washington campus and spend a couple of hours running the cameras, patiently standing stock-still and stifling body noises.  It was the genesis of my original major when I enrolled at the U.W. a few years later:  Broadcast Journalism.  You might have seen me today on one of the networks (hopefully not Fox News) but for one unfortunate outcome.
            I was fired.
            I didn’t make any untoward noises or burn the vidicon.  It was purely a matter of genetics.  You see, even as a teen I was abnormally short, just barely five feet tall.  Even beggars can be choosers under extreme circumstances, and the KCTS directors finally had enough of camera angles that highlighted the underside of speakers’ chins.