History on Film
What makes a film a classic? A better question to ask is: what makes a film epic a classic film epic? Without boring readers to tears with dry, statistical analysis — and for the sake of argument — let’s say that Sir David Lean’s 1962 desert epic Lawrence of Arabia conveniently and consistently fits both bills.
At roughly four hours in length, including overture, intermission, and exit music (in Robert Harris’ exemplary restoration effort), it’s every critic’s Exhibit A in the “classic film epic” department, no contest about it. But why is that? Well, it’s got style to burn. It’s got wit; it’s got taste; it’s got sweeping romantic vistas and magnificent location scenery. And it has an enigmatic title character in T.E. Lawrence, deftly handled by the young and nearly unknown Peter O’Toole in a wide-ranging (and incredibly revelatory) performance of the first order.
Viewers were equally divided as to whether Lawrence was any more knowable at the end of the saga than at the beginning of it. Certainly the way the character’s been written (Sir Robert Bolt and Michael Wilson contributed to the Academy Award-nominated screenplay, with Bolt taking most of the credit since Wilson had been blacklisted at the time of the movie’s release) makes Lawrence out to be more of a warmongering adventure seeker and less of a stand-up-and-get-shot-at hero — an anti-hero, if you prefer.
Still, all glory and honor are due O’Toole for what must have been an impossible acting assignment. He had to capture Lawrence’s softer “feminine” side, so to speak — his latent homosexuality could only be hinted at in 1962 — without a) giving away the game, b) making him into a limp-wristed caricature, or c) giving up any of the manly heroics associated with the historical figure per se.
In addition, O’Toole had to reveal Lawrence’s exceptionally volatile nature, as well as his high tolerance for pain — the torture scene featuring the sadistic Turkish Bey with the troublesome cough (played by José Ferrer) is a good case in point. Although such an electrically charged term as “male rape” could never have been uttered in an early 1960s feature, that is exactly what occurred here and what director Lean was going after. Lawrence’s refusal to go into specifics about his manhandling is an indication of how far he was willing to go to prove his manly (that is, heroic) nature to his Arab friends.
The plot, in brief, concerns a misfit British officer, Lieutenant Lawrence, and his involvement with the Hashemite Prince Faisal (Alec Guinness, in a false beard and an even more faux accent). His orders are to keep a close watch on those Arab beggars (“They’re a nation of sheep stealers,” comments the bigoted General Murray) and report his findings to British High Command in Cairo. Instead, Lawrence takes the bull by the horns by throwing himself headlong into an ad hoc campaign of his own devising. “I’ve got orders to obey, thank God,” exclaims Murray’s replacement, General Allenby. “Not like that poor devil. He’s riding the whirlwind.” Indeed, a whirlwind that lands him in hot water.
Lawrence’s goal is to oust the stubborn Turks from the gulf port of Aqaba by using a ragtag army of Bedouin tribesmen, the only force available to him. As fate (and luck) would have it, his plan works brilliantly — too brilliantly, one might add — and rather too easily for Lawrence’s future benefit. Sadly, it’s all downhill from there for the heavily burdened “El Aurens,” as the natives now refer to him. A legend of his own making, helped along by the cynical American reporter Jackson Bentley, Lawrence learns that he’s human after all and prone to all-too-human failings — among them, a built-in self-loathing for what he’s become, i.e., a masochist as well as a sadist. His unraveling at the Arab council and the realization that he’s been a pawn in the hands of both Faisal and Allenby leads to his abandoning the desert for anonymity in the British countryside.
In his international film debut, Egyptian-born Omar Sharif contributes class, charm, and good looks (along with a sizzling screen presence) as Lawrence’s sympathetic Arab companion, Sherif Ali. Ali and Lawrence share a unique love/hate relationship. Today, our so-called modern sensibilities might admit to a probable “bromance” between these two politically and culturally distinctive individuals. This opens up the issue of whether Lawrence, as depicted in the film, had homosexual tendencies. The introduction of two servant boys, Daud (John Dimech) and Farraj (Michel Ray, an Anglo-Brazilian), may even have allowed for such an indulgence. Historically, however, Lawrence was said to have been asexual, so that ends that query.
Others in the all-male cast include Anthony Quinn (with an immensely prominent, hooked proboscis) as warrior chieftain Auda Abu-Tayi, his Arab “ally” in arms; Jack Hawkins as a remarkably convincing General Allenby; Claude Rains as Mr. Dryden, head of the Arab Bureau; Anthony Quayle as Colonel Brighton; Arthur Kennedy as Jackson Bentley, the Lowell Thomas doppelganger; and bushy-browed Sir Donald Wolfit as the short-sighted General Murray.
The film is divided into two parts, with the second half dragging slightly. The downbeat ending is, as expected, just that. But there’s no overlooking the award-winning desert cinematography by Freddie Young, or Maurice Jarre’s flavorful and much admired (by this author, anyway) film score, another Oscar® winner. Despite many months in the desert (Lawrence of Arabia was filmed partially in Jordan and along the southern coast of Spain), director Lean held it together, in the process showing how to keep the focus on the human element amid the bloody spectacle of war. The film was produced by movie mogul Sam Spiegel, whose crowning achievement this undoubtedly was. All that’s left to say is: “Hear, hear!” Ω
It’s hard to fathom even today that Margaret Mitchell’s best-selling 1936 novel, Gone With the Wind, was practically an unwanted property in Hollywood. No studio head would get near a Civil War story, let alone adapt one for the silver screen.
For years Tinsel Town touted the widely-held belief (perpetuated by Metro-Goldwyn-Mayer’s head of production, the “boy genius” Irving Thalberg) that “no Civil War picture ever made a nickel!” This was only partially true, of course: in its day, D.W. Griffith’s three-hour 1915 silent epic, The Birth of a Nation, not only set attendance records whenever and wherever it was shown, but revolutionized the way motion pictures would be marketed and made for all time.
Still, Thalberg’s boast would forever be put to rest when producer David O. Selznick, who was a son-in-law to Louis B. Mayer (one of the M’s in M-G-M), purchased the rights to Atlanta native Mitchell’s thousand-page tome. The result was a box-office juggernaut, the likes of which went on to break all existing records for decades to come.
As heavy as Webster’s Unabridged Dictionary but not nearly as densely worded, the book version of GWTW (as it is customarily abbreviated) can be described (with tongue planted firmly in cheek) as the American variant of Leo Tolstoy’s massive historical epic War and Peace, but without the Russian author’s literary acumen or extraordinarily philosophical insight into the human condition.
The comparison is not at all a stretch, for both works take place during intensely turbulent times of immensely significant change for their respective eras. For starters, Ms. Mitchell (who was known in her native Atlanta as Peggy Marsh, after marriage to her second husband) concentrated on the character of Katie Scarlett (originally Pansy) O’Hara.
A lively spitfire of a Southern belle, Scarlett uses large dollops of girlish allure, feminine guile, and willful behavior, along with a ruthless capacity for survival at any cost, to overcome any number of obstacles, both to her person and to her beloved Tara, the land her father, Irish plantation owner Gerald O’Hara, insisted was “the only thing worth fightin’ for, worth dyin’ for!”
But what relation does Scarlett O’Hara have to Natasha Rostova, the youthful heroine of Tolstoy’s massive novel? Quite a lot and more than meets the eye!
First of all, there are several pairs of individuals intimately detailed and observed in both works — Scarlett with Rhett Butler and Ashley Wilkes with his cousin, Melanie Hamilton, juxtaposed against Natasha and Prince Andrei Bolkonsky, as well as Pierre Bezukhov and his wife, Helene Kuragina, among numerous others. It was as if GWTW’s author had merged the personalities of Natasha and her own cousin, the mild-mannered Sonya (a mirror image of the sweetness-and-light personified by Melanie), with that of Scarlett herself; then had her pine away for the cerebral Pierre (standing in for the poetic dreamer Ashley), while spending the bulk of the story’s plot on the sordid lives of the buxom Helene (another side of Scarlett’s capricious persona) and her dashing lover Dolokhov, who safely incorporates multiple facets of that lovable rogue, Rhett.
We may add another viable if all-too-obvious connection: that of the invading Emperor Napoleon Bonaparte with the Union general William Tecumseh Sherman, whose physical presence is never shown but whose name is blazoned across the screen in one of those telling intertitles familiar to followers of silent cinema.
These contrasts may one day serve as the basis for a more extensive study along the same academic lines as I’ve outlined above. But for now, let it suffice that the three-hour-and-forty-minute screen adaptation of Gone With the Wind is itself a masterpiece of narrative filmmaking. Setting aside the dramatic merits and deficits of its screenplay (credited to Sidney Howard, who died before the film was released), the cavalier treatment of slavery, its muddled political views, and its skirting of the larger racism issue, GWTW represents the high point of Hollywood storytelling at its starriest.
One major reason for the film’s popularity at the time of its release was the coincidental element of a country (the U.S., in this instance), on the brink of war, sending its menfolk off to battle while the women stayed put, waging their own fight to keep home and hearth intact. Scarlett O’Hara epitomized that daily struggle in her gutsy determination to hold on to her memories of the past, along with what remained of her family and property.
That the women of 1930s America related to Scarlett’s predicament and saw themselves in her heroic defense of the home front rightly bolstered box-office receipts to unheard-of levels. They loved the fact that Scarlett was a smart, and sometimes cold-hearted, small-business owner: a Rosie the Riveter before her time whom no man could tame.
And speaking of taming men, contrary to commonly held wisdom, wise-cracking Clark Gable, in the role of a lifetime, was not exactly a shoo-in for the rugged Rhett Butler. Also considered were such marquee items as Ronald Colman, Gary Cooper, Basil Rathbone (author Mitchell’s personal choice), and Errol Flynn. Selznick knew that Gable was right for the part, but he was loath to haggle with his wily father-in-law over the star’s employment. Mayer drove a hard bargain in allowing Gable, then under contract to M-G-M, the opportunity to star in Selznick International’s mammoth production. A deal was finally struck between the two moguls whereby Selznick would secure Gable’s services in exchange for M-G-M’s obtaining the distribution rights — a win-win situation for both studios.
As the nefarious Captain Butler, Gable delivers his lines — replete with double entendres and humorous asides for all occasions — with easy affability, abundant charm, and finesse, even though his Southern drawl comes and goes with equal ease. It’s one of the actor’s best roles, and a shame that he didn’t win an Oscar for it (he lost out to Robert Donat for Goodbye, Mr. Chips, the sentimental favorite of that year).
GWTW had literally a cast of thousands at its disposal. Among the other key participants were two British subjects, Leslie Howard as Ashley and Olivia de Havilland as Melanie, in addition to Laura Hope Crews as Aunt Pittypat, Hattie McDaniel (the first African-American to win an Oscar for Best Supporting Actress) as Mammy, Butterfly McQueen as whiny housemaid Prissy, Thomas Mitchell as Gerald O’Hara, Harry Davenport as Dr. Meade, Ona Munson as Belle Watling, and Victor Jory, Isabel Jewell, Rand Brooks, Carroll Nye, Oscar Polk, Eddie “Rochester” Anderson, Ward Bond, Paul Hurst, Cammie King Conlon, Ann Rutherford, Evelyn Keyes, Barbara O’Neil, George Reeves, Fred Crane, Everett Brown, Howard Hickman, Leona Roberts, Jane Darwell, J.M. Kerrigan, William Bakewell, Irving Bacon, Louis Jean Heydt, and many other walk-ons, cameos and bit parts, including stuntman Yakima Canutt.
Directed initially by George Cukor, who was fired and replaced by Victor Fleming (Captains Courageous, The Wizard of Oz), with some scenes, quite possibly, helmed by director Sam Wood and even Selznick himself, the film keeps all focus and attention on Vivien Leigh as the feisty Miss Scarlett. The celebrated and well-publicized search for the elusive Scarlett is the stuff of movie legend, leading up to the unique choice by Selznick and his brother Myron of Ms. Leigh (born in Darjeeling, British India) for the challenging assignment.
Among the vast field of contenders and aspirants vying for the coveted part were Bette Davis, Paulette Goddard, Susan Hayward, Miriam Hopkins, Jean Arthur, Joan Bennett, Lana Turner, Tallulah Bankhead, Alicia Rhett (who appeared in the picture as Ashley’s sister, India), and Lucille Ball (!). In hindsight, of those mentioned Leigh was the only actress who measured up to Mitchell’s vivid description of the green-eyed, sweet-faced, yet “lusty with life” protagonist, copping an Academy Award (the first of two for the unstable performer) as Best Actress for her extraordinary efforts. With few exceptions, Scarlett is on-screen for roughly the entire length of the picture, and Leigh keeps her frivolous nature front and center throughout.
Puzzlingly, about the only thing that wasn’t transferred to the screen from the novel was the war itself. Look again at the restored Blu-ray/DVD editions of the movie: you will search in vain for any of the most famous battles being depicted. What there is involves the citizens of Atlanta running for their lives to escape the advancing Union Army. There’s plenty of shelling and noise, and runaway carriages with galloping horses and men, as well as pandemonium and voluntary evacuations (for example, the hustle and bustle of the flighty Aunt Pittypat); and, of course, that stomach-churning scene at the “hospital” where Scarlett witnesses a Confederate soldier’s leg being amputated.
Beyond that, about the only sequence where viewers actually experience the consequences of a war-ravaged South takes place near the Atlanta train depot, i.e., that spectacular crane shot of thousands upon thousands of the dead and dying, lying wounded and waiting for medical attention, while the camera slowly pulls back to reveal the tattered flag of Dixie flapping helplessly in the breeze — a visual metaphor for the movie’s title.
The score by Viennese-born composer Max Steiner, one of the longest written up to that time, is a certifiable classic among movie-music buffs. His instantly recognizable main Tara theme practically screams Hollywood to any and all comers. The production was designed by William Cameron Menzies, with art direction by Lyle Wheeler and costume designs by Walter Plunkett.
If this isn’t the greatest epic Hollywood’s Dream Factory has ever produced (in the final analysis, it’s all a matter of personal taste), then Gone With the Wind absolutely lives up to its reputation as a certifiable crowd-pleaser without equal.
Copyright © 2015 by Josmar F. Lopes
There was a time in Denzel Washington’s young life when he had entertained notions of becoming a preacher. After all, his father, the Reverend Denzel Hayes Washington Sr. (Denzel was named after his dad), was an ordained minister in the Pentecostal Church. And wouldn’t it have been nice if the son had followed in the father’s footsteps?
But by age 14, Denzel’s parents had split up and the younger Washington was sent off to a private prep school, namely Oakland Military Academy in New Windsor, New York. Although by the time Denzel studied there the military curriculum had long since been discontinued, it was still a forlorn environment for the impressionable inner-city youth from Mount Vernon.
Years later, the actor would recall that the decision to send him to Oakland Military Academy had profound ramifications for his personal life. “I wouldn’t have survived in the direction that I was going,” Denzel stated. “The guys I was hanging out with at the time, my running buddies, have now done 40 years combined in the penitentiary. They were nice guys, but the streets got them.”
And Tinsel Town got nice guy Denzel, a fair trade at best. A little over 20 years later, Washington, now a major force on the Hollywood scene after glowing reviews in several big-screen features, was signed to appear in the Civil War epic Glory (1989). He played the part of the taciturn Private Silas Trip, a former slave fighting for the North who also fought for the freedom of his people.
“I wanted to do something different,” Denzel indicated at the time, “and to feel removed from the present time. It’s difficult to do a period piece and to give yourself as an actor a different feeling, as though you’re in a different time.”
“He really defined that character,” commented film critic Julian Roman, “to the point of someone who became a part of the war … but beyond that became a comrade to his friends, became a loyal soldier to his regiment commander, and that’s a transcendent performance.”
“I didn’t even know that blacks fought in the Civil War,” the actor told the Associated Press. “The American history classes that I took didn’t seem to dwell on that at all. It was inspiring for me; it gave me a lot of energy to continue research and get further and further into it. Although the character I play isn’t based on a real person, I kind of put ideas together that I found from reading slave narratives and things like that.”
Battle Cry of Freedom
Glory was directed by Harvard graduate Edward Zwick. The letters of another Harvard alumnus, Colonel Robert Gould Shaw (Matthew Broderick, who also provides the voiceover), a young white Union commander of the 54th Massachusetts Volunteer Infantry Regiment, written to his northern abolitionist mother (Jane Alexander, unbilled), formed the basis for this inspiring portrait of gallantry and racism during the American Civil War.
Other relevant sources included the novel One Gallant Rush: Robert Gould Shaw and His Brave Black Regiment by Peter Burchard and Lincoln Kirstein’s photographic compilation, Lay This Laurel.
Unlike the real-life 54th, which was made up mostly of free black men from the North, the screen regiment is composed almost entirely of ex-slaves. Except for the presence of Col. Shaw, his parents, and the imposing figure of author, abolitionist, editor and speaker Frederick Douglass (Raymond St. Jacques) — two of whose sons actually signed up with and fought for the 54th — the principal participants depicted in the drama are purely fictitious.
One of these fictitious creations, Pvt. Trip, is flogged for having deserted his regiment in the midst of training. As it turns out, Trip was only looking for a decent pair of shoes, which the troops had been denied due to the racist tendencies of the quartermaster in charge of their supplies. Denzel’s tearful acquiescence in full view of his fellow troopers, and before his commanding officer, is one of the most powerful sequences in the movie.
Trip would rather take the punishment than show weakness by backing down from a beating. In his own words, Denzel put the case before us: “Basically what I did was, got on my knees and sort of communicated with the spirits of those who had been enslaved, who had been whipped. And when I came out I was in charge. I said ‘Trip was in charge. If this is what you men, which is what you call yourselves, want to do to Trip, then come with it.’ ”
He and the other volunteers eventually get to display their fighting spirit and worth as Union soldiers in a futile and vividly realistic suicidal attack on an impregnable beach fortress off the coast of South Carolina.
“These men were looking for an opportunity to prove themselves,” Denzel continued. “The battle was no more dangerous than their day-to-day lives with the constant threat of slavery and slave masters with their mentality over their heads. They were looking for the opportunity to have a fair fight and to have a rifle as well, regardless of the odds.”
Subsequently channeling Rev. Denzel Washington Sr., Denzel Jr. sounds distinctly like a man preaching to the choir. And in a rousing scene that takes place the night before the final battle, Denzel (in his guise as Trip) gets to clap and sing along with his fellow soldiers in a spontaneous revival meeting. Do I hear an “Amen” out there?
The hardships these men experience along the way frame the main part of the story behind the unsuccessful charge at Fort Wagner where, historically, the 54th Massachusetts Volunteer Infantry lost half their men. Pride, courage, bravery, dignity and sacrifice are all touched upon in this potent war drama, a fitting tribute to the soldiers who fought and died in that vicious battle, which occurred almost simultaneously with a similar confrontation on the wide-open fields of Gettysburg, PA.
After several nominations wherein he came up empty-handed, in 1990 Denzel finally won a well-deserved Academy Award for Best Supporting Actor for his personification of an angry black man railing against social injustice. For me, the most poignant portion of the entire film comes when the lifeless body of Col. Shaw is unceremoniously thrown into a huge ditch alongside the corpse of Pvt. Trip and others of their regiment, with gulls and sea birds squealing and squawking noisily overhead. Their bodies come together in an involuntary “embrace,” which symbolizes the union of each man’s spirit in brotherly love and understanding — if not in life, then in the after-life.
However, the real-life tragedy of what actually took place after the battle had been lost was mercifully omitted. In the book, Past Imperfect: History According to the Movies, published by the Society of American Historians, Pulitzer Prize-winning author James M. McPherson, in the chapter on the movie Glory, describes the outcome in distressing terms:
“The Confederate defenders of Fort Wagner stripped Shaw’s corpse and dumped it into an unmarked mass grave with the bodies of his black soldiers. When the Union commander sent a flag of truce across the lines a day later to request the return of Shaw’s body (a customary practice for high-ranking officers killed in the Civil War), a Confederate officer [General Johnson Hagood] replied contemptuously, ‘We have buried him with his niggers.’”
Interestingly, Col. Shaw’s father had quite a different reaction to his son’s “dishonorable” burial: “We would not have his body removed from where it lies surrounded by his brave and devoted soldiers … We can imagine no holier place than that in which he lies, among his brave and devoted followers, nor wish for him better company — what a body-guard he has!”
With a screenplay by Kevin Jarre and striking photography by the veteran British cinematographer Freddie Francis, Glory also featured excellent performances from Morgan Freeman as Sgt. Major Rawlins, Cary Elwes as Major Cabot Forbes, Andre Braugher as Thomas Searles, and Jihmi Kennedy as Jupiter Sharts, with Alan North, Bob Gunton, John Finn, Jay O. Sanders and Cliff De Young in other roles.
The exceptionally fine and moving musical score by James Horner, with the welcome participation of the Boys Choir of Harlem, is one of this composer’s best-remembered pieces. It’s a favorite of record collectors and sound buffs (Shawn Murphy is the sound engineer), with more than a hint of Carl Orff’s secular cantata Carmina Burana in its sweeping choral passages and ethereal, otherworldly tonalities.
(End of Part Three – To be continued…)
Copyright © 2014 by Josmar F. Lopes
When the Legend Becomes Fact — Hollywood and the Historical Film (Part Two): Oliver Stone’s ‘JFK’ and the Lone Gunman Theory
“Let There Be Light” – And Let Us Be Illuminated By It
Continuing my rumination on a course I once developed concerning Hollywood and the Historical Film: exactly how much history and how much fiction does one include in such an undertaking? On the flip side of the issue, is there anything we may wish to exclude?
Questions of this nature pose a perplexing problem for the instructor, in that the focus of the course is placed exclusively on the limitations and uses of available sources. And a lot is riding on those same sources!
For example, one can turn to the Bible, a primary source for many people’s moral and ethical guidance, and ask the obvious question: “Is the Bible history, and can it be used to teach history?” First and foremost, such a query must take into account matters of fact, faith and fiction, in addition to myths, legends and the all-important religious interpretation of events.
This is a delicate subject to broach with students because it goes to the very core of their belief system and upbringing. Inside an academic setting, it’s a perfectly valid form of inquiry and well within reason. But outside academia’s hallowed halls, one must tread lightly so as not to offend those same beliefs. Therefore, let us proceed with caution.
To begin our analysis, what should one make of the frequent parables present throughout the Biblical narrative? For one thing, we can say that parables, as told by various individuals — Christ primarily — in both the Old and New Testament, serve the purpose of putting a potentially difficult topic or principle into simple, everyday terms. This was done so that the average layperson might understand and absorb their lessons.
Are there ways we can tell how much of what is being conveyed via parables is truth, exaggeration, verbal embellishment or other such extravagance? If by that question one is referring to “fact checking,” that would be a physical impossibility, considering that, for one, we still have the aforementioned distance problem to deal with, as well as the time factor involved in retracing the steps of who said what, where and when so many eons ago.
What about the problem of errors, mistakes or liberties taken with the known (or generally acknowledged) facts? Do the facts found in the Bible, such as they are, coincide with or run counter to the veracity of events as described elsewhere in the historical record? This is the crux of the problem. For if the historical record — those so-called “known facts” — is found not to coincide with the Biblical explanation of events, do we then discard the historical record, or do we drop the Biblical sources as unreliable?
Here’s another interesting case in point, drawn from the Gospels: we know from history that the Roman governor of Judea — the province where the historical Jesus both lived and died — ruled with an iron hand. The reason for this attitude was both practical and plain: to put down rebellion at the first sign of trouble.
How, then, do we explain Pontius Pilate’s reluctance to carry out that part of Roman justice demanded of his office, i.e., to execute a potential “rabble-rouser” such as Jesus swiftly and at the first sign of trouble? Wouldn’t we expect Pilate to act as any Roman governor would and take matters into his own hands, or would his behavior depart from the norm simply because of his proximity to Christ?
Depending on who you ask, the Biblical narrative would “seem” to indicate the latter, which somewhat contradicts what scholars, historians and other learned individuals know of the historical Roman governor’s role in Christ’s Crucifixion, or for that matter any crucifixion.
This takes us to the next topic up for discussion: is history truth? Or, to put it another way, is there such a thing as historical truth? If there is, how does it compare to, say, Biblical truth? You will notice the paraphrasing of Pilate’s own rhetorical query, “What is truth?”
We have seen that history can be subjective — that is, one’s view of a subject is always taken from the person viewing it (thus referring back to the old issue of history as being written by the victors), what tends to be called the “subjective vantage point.” Can this view encompass other vantage points — in other words, a more objective one, whereby a topic, matter or person is interpreted in a less opinionated fashion, thereby refraining from pontificating on its substance? Of course it can! But it’s not that easy, is it?
Again, we come to what I describe as the “invariable variable,” also known as the distance problem rearing its ugly head. By that I mean to ask: are we so far removed from the Biblical (or prehistorical) context of past events as to be irretrievably separated from them?
The answer to that is: it all depends. Different events in the past can have any number of differing, even multiple, interpretations or meanings, whether they are viewed from a subjective or an objective angle.
The Kennedy Case
Let’s take one such event from the recent past and examine it from both the subjective and objective vantage points: the assassination of President John F. Kennedy by Lee Harvey Oswald, certainly one of the most photographed and investigated murder cases of our time, as interpreted by filmmaker, producer and screenwriter Oliver Stone in the movie JFK (1991).
Stone’s film charts a familiar course set forth 15 years earlier by director Alan J. Pakula in All the President’s Men, a movie about the Watergate break-in and the subsequent investigation of the scandal that brought down President Richard M. Nixon (a subject Mr. Stone tackled separately).
In Pakula’s picture, there are two crusading reporters, Bob Woodward (Robert Redford) and Carl Bernstein (Dustin Hoffman), who write for the Washington Post, headed by Chief Editor Ben Bradlee (Jason Robards). In Stone’s reworking of Kennedy’s untimely death and the ensuing investigation of same, Kevin Costner plays crusading New Orleans District Attorney Jim Garrison, who, as portrayed by Costner, is as far from the real-life fanatical, self-righteous, wrong-headed prosecutor as New York is from Los Angeles.
JFK follows Garrison as he leads his team of investigators on a wild goose chase over unsupported terrain: there are charges and counter-charges, dirty dealings and underhanded activities, clandestine meetings, supposed conspiracy theories, angry Cubans, ex-military types, contrived or fabricated evidence, numerous blind alleys, red herrings, dead or disappearing witnesses, and whatever else the D.A.’s illogical mind conjures up.
Now let us juxtapose Stone’s operatically conceived opus with an actual piece of research material: the 1989 documentary Who Shot President Kennedy?, written, directed and produced by Robert Richter and narrated by anchorman and reporter Walter Cronkite. The level of investigative journalism demonstrated in this 57-minute feature — which takes into account every known facet of the assassination, from footage of the Dallas physicians who tried to save Kennedy’s life and computerized 3-D images of Dealey Plaza to a frame-by-frame analysis of the Abraham Zapruder footage and the views of leading critics of the Warren Commission’s findings — puts to shame many of JFK’s most far-fetched conclusions.
To begin with, what did the city of New Orleans have to do with Kennedy’s murder in Dallas, Texas? Quite a lot, as the film would have us believe. In the first place, Lee Harvey Oswald (Gary Oldman, in an uncanny personification), the so-called Lone Gunman (a designation made years after the fact), was born there; and in the second, Garrison’s bogus criminal case was aimed squarely at a New Orleans businessman named Clay Shaw (a fey Tommy Lee Jones) and his sometime partner, cross-dresser David Ferrie (a peculiarly manic and foul-mouthed Joe Pesci), scapegoats both.
The result is Rashomon run amok. In the end, one has no idea whom to believe or how to separate the “good” guys from the “bad” guys (there are no black hats here, only varying shades of gray). In reality, Garrison tried his best to sway an incredulous court to convict Clay Shaw on flimsy, unsubstantiated evidence. If the film had stayed in the Big Easy, it might ultimately have made more sense. As it turned out, though, Stone had his fictional Garrison go in every direction at once, all the while trying his best to keep up appearances as the dedicated D.A. and devoted family man and husband.
What were those directions? Among the various corners turned, the director had his cast and crew look at the case against Oswald in much the same manner as the above-mentioned documentary, which included the single bullet theory (the timing problem, the angle of trajectory, the type of weapon fired, and other incongruous issues), the possibility of a Grassy Knoll assassin (or lack thereof), the photographic and acoustic evidence from Dealey Plaza, Oswald’s alleged ties to Cuba and the Soviet Union, likewise his FBI and CIA connections, the president’s body, the supposed botched autopsy (or “altered wounds” theory), and so on. Whew, that’s a whole lot of fat to chew on for three hours of movie time!
As we know from past experience, the longer a specific case is investigated, the more it will reveal about itself. In this instance, however, the more the JFK assassination is probed and poked at, the more speculation surrounds it, which only leads to more unanswered questions and crackpot “theories” — some of which belong to the realm of fantasy and the bizarre, not to mention the harebrained.
Still, does the fact that this is the most investigated and photographed case in modern history make the resultant inquiry any less meaningful, or the findings any easier to accept? We know there were many problems with the Warren Commission’s Report, but after watching JFK one is forced to admit that Oliver Stone’s version of events is not without glitches of its own. Bravura film-making, which the director’s motion picture undeniably encompasses, does not a true picture make!
Additional problems are presented or addressed, along with newer and ever bolder hypotheses about who killed Kennedy, including blatant, out-and-out inventions. One gets the feeling that Stone is constantly grasping for a definitive answer, which remains stubbornly out of his reach. The question at this point becomes: has Stone taken undue liberties with the facts? Can he beg our indulgence over their use by employing the oft-quoted “poetic license” excuse?
We may even put forth a few theories of our own, such as: doesn’t a film’s director have a responsibility — moral, ethical or otherwise — to present the facts as they are? The “truth,” if it indeed exists, is out there (at least, according to The X-Files’ Fox Mulder), so why can’t he see it?
Do directors, by their very nature, have their own agendas to pursue, arrived at before filming even begins? And if so, does this soil whatever believability has been attained, burying it under layer upon layer of unproven allegations?
Are they not attempting to fit pieces of gathered evidence, conveniently labeled “the facts,” into a previously developed, predetermined script? And isn’t this another form of manipulation of past events, a parable to end all parables, the cinematic Gospel according to Stone?
All of the above certainly merits our attention, which may warrant further inquiry at a later time.
(To be continued…)
Copyright © 2014 by Josmar F. Lopes
Come On, People Now
That’s a great title for an article about the music of the Swinging Sixties. And with so much happening right here, right now, in the good ole USA — from the fiftieth anniversary of the assassination of President John F. Kennedy to the passing of folk legend and peace activist Pete Seeger and the upcoming half-century celebration of the Beatles’ landmark invasion of our shores — there’s no time like the present to rekindle one’s association with that long-ago period, from about 1962 up through 1971, when popular songs and colorful individuals formed the backbone of various movements.
The songs and individuals I had in mind, however, were ones I personally remember listening to on the radio and/or watching on TV. What’s more, I recall hearing a handful of these tracks in my school’s English and Social Studies classrooms — in some cases, within a few months of their release. How many of us can say we experienced that sense of having belonged to a tiny part of history in the making?
Today, I am grateful to have lived through those turbulent times. Granted, the impetus for posting this piece carries more than a hint of nostalgia for songs that actually meant something. Besides the obvious sentimental value, I wanted to make the case for the enduring efficacy of these unforgettable artworks, as well as pay belated tribute to their creators.
Now that I’ve reached a point in life where maturity and understanding have merged with a writer’s ability to come to grips with these matters, I felt compelled to pursue the mystery of why these songs still haunt our memories after so many years in circulation.
Maybe it was my disgust at the poor quality of this year’s Grammy nominees. Maybe it was my disappointment at seeing how worn and jowly ex-Beatle Paul McCartney had gotten in that spiritless duet with drummer Ringo Starr — and how unremarkable Sir Paul’s output has become of late (“bland” is the word I would use).
Whatever the reason, I needed little motivation to remind readers of what true folk, pop and rock once sounded like to a generation that learned to appreciate song lyrics that were as dense and meaningful as they were occasionally diffuse, set to instantly recognizable tunes that, despite the passage of time, have continued to celebrate a momentous era in America.
If I have left a favorite singer or two out, please accept my apologies. The ones I’ve chosen reflect my own preferences and are, in no way, a commentary on the abilities (good or bad) of those artists excluded from this list. To paraphrase a line from Spencer Tracy in Pat and Mike: “Not much meat, but what there is, is ‘cherce.’”
It’s fair to say that Dylan ushered in the times, and from there went on to inspire an entire generation of like-minded artists. Born Robert Allen Zimmerman in Duluth, Minnesota, on May 24, 1941, musician, performer and songwriter Bob Dylan (he took his surname from Welsh poet Dylan Thomas, whose dictum, “Do not go gentle into that good night,” he took to heart) rose to fame in the Sixties as the unofficial, if habitually unwilling, spokesperson for social and civil causes (“Don’t follow leaders!” he famously insisted in 1965).
Influenced early on by Woody Guthrie, the father and pioneer of folk and protest songs, along with rocker Little Richard and Country & Western star Hank Williams, Dylan used the power and substance of language (drawing from the likes of Walt Whitman, French Symbolism, and the Beat poets) to venture forth on his own as the voice and conscience of America’s disheartened youth.
With such classics as “Blowin’ in the Wind,” made popular by the trio of Peter, Paul and Mary (who smoothed over the song’s edges with the pristine purity of their vocals), and the droning, prophetic “The Times They Are A-Changin’,” Dylan sang with the stridency of a picketing union worker, the immediacy of a Baptist preacher, and the disarming yet wise-beyond-his-years boyishness that captivated audiences used to less offensive material.
“Blowin’ in the Wind,” the first item on our list, betrays strong African-American spiritual roots. In the rhetorical form of a question and answer — a mini sermon, if you will — it’s a give-and-take lifted in part from the Old Testament Book of Ezekiel. The words are simple and direct, the instrumentation (acoustic guitar with intermittent bursts from Dylan’s harmonica) Spartan and lean, the voice solemn and sincere, all persuasively arrayed to point up man’s longing for freedom and dignity in his continuing struggles against injustice:
How many roads must a man walk down before you call him a man…?
The lyrics have something to say as well about outlawing armed conflict long before our country’s involvement in Southeast Asia took hold:
Yes, ‘n’ how many times must the cannon balls fly before they’re forever banned…?
A year or more before President Kennedy was killed, Dylan chanted this prescient verse:
Yes, ‘n’ how many deaths will it take till he knows that too many people have died?
And what’s the sought-after solution to these problems? It’s simple, really:
The answer, my friend, is blowin’ in the wind
The answer is blowin’ in the wind.
Dylan himself has clarified the meaning: “Too many of these hip people are telling me where the answer is but oh I won’t believe that. I still say it’s in the wind and just like a restless piece of paper it’s got to come down some … But the only trouble is that no one picks up the answer when it comes down so not too many people get to see and know … and then it flies away. I still say that some of the biggest criminals are those that turn their heads away when they see wrong and know it’s wrong.”
Yes, ‘n’ how many times can a man turn his head and pretend he just doesn’t see…?
Yes, ‘n’ how many years can some people exist before they’re allowed to be free?
Yes, ‘n’ how many times must a man look up before he can see the sky?
Yes, ‘n’ how many ears must one man have before he can hear people cry?
If there is any way out of these intractable conditions, it can be found in a later musical number — a suitably spiritual one, we should add — written by our friend Mr. McCartney in 1969, after a dream he had involving his long departed mom, Mary:
When I find myself in times of trouble Mother Mary comes to me
Speaking words of wisdom: let it be
And in my hour of darkness She is standing right in front of me
Speaking words of wisdom: let it be
Let it be, let it be,
Let it be, yeah, let it be
There will be an answer: let it be.
His song offered a slightly more consoling message “in times of trouble” than, say, the lyrical fist-shaking that Mr. Dylan previously propounded. Still, Paul’s late-in-the-day composition, “Let It Be,” came at the tail end of the decade and was the last single the Beatles released before they disbanded.
Better Times Ahead?
One of Dylan’s most challenging outpourings, an oracular expression of holy-rolling writ large (and a jeremiad standard in its day), is his “The Times They Are A-Changin’” from 1964. At the time, his vision of the coming inundation, of “wars and rumors of war,” of political turmoil, of parents forced to give way to their offspring, of generational divide and quasi-scriptural proclamations that the “first shall be last” — compounded by his mumbling vocals — smacked of the ravings of a street-corner lunatic on the fringe of society.
Sadly, most if not all of Dylan’s apocalyptic imagery would in fact come to pass with the outbreak of the Vietnam War. Indeed, it was exactly this kind of verbal warning shot, cloaked in the formal structure of popular song (shades of composer Kurt Weill), that so enraged the senior members of “society,” i.e., the “establishment,” as it was known back then. At the risk of making it sound like a lengthy diatribe, I print the song’s thought-provoking lyrics in full:
Come gather ‘round people
Wherever you roam
And admit that the waters
Around you have grown
And accept it that soon
You’ll be drenched to the bone
If your time to you
Is worth savin’
Then you better start swimmin’
Or you’ll sink like a stone
For the times they are a-changin’
Come writers and critics
Who prophesize with your pen
And keep your eyes wide
The chance won’t come again
And don’t speak too soon
For the wheel’s still in spin
And there’s no tellin’ who
That it’s namin’
For the loser now
Will be later to win
For the times they are a-changin’
Come senators, congressmen
Please heed the call
Don’t stand in the doorway
Don’t block up the hall
For he that gets hurt
Will be he who has stalled
There’s a battle outside
And it is ragin’
It’ll soon shake your windows
And rattle your walls
For the times they are a-changin’
Come mothers and fathers
Throughout the land
And don’t criticize
What you can’t understand
Your sons and your daughters
Are beyond your command
Your old road is
Rapidly agin’
Please get out of the new one
If you can’t lend your hand
For the times they are a-changin’
The line it is drawn
The curse it is cast
The slow one now
Will later be fast
As the present now
Will later be past
The order is
Rapidly fadin’
And the first one now
Will later be last
For the times they are a-changin’
His namesake, poet Dylan Thomas, once wrote that, “Old age should burn and rave at close of day.” Not only that, but it should “Rage, rage against the dying of the light.” Bob Dylan, who raged and fumed so early on in his career, crashed and burned much sooner than most — and long before the dying of his light.
To many of his diehard fans, Dylan had betrayed the folkie “cause” by going all-out electric at the 1965 Newport Folk Festival. And his lyrical wordplay, by turns virulent and elegiac, witty and bizarre, was more oblique than ever on the corresponding Bringing It All Back Home and Highway 61 Revisited releases, as well as the classic double album Blonde on Blonde.
On the morning of July 29, 1966, shortly after returning from an exhausting nine-month world tour, Dylan was involved in a life-changing motorcycle crash near his home in Woodstock, New York, which led to his withdrawal from performing. His forty days and forty nights in the wilderness stretched into a year and a half of self-imposed isolation.
“When I had that motorcycle accident,” Dylan told a reporter in 1984, “I woke up and caught my senses. I realized that I was just workin’ for all these leeches. And I really didn’t want to do that … I was pretty wound up before that accident happened. I probably would have died if I had kept on going the way I had been.” This raises the question of whether Dylan had also been dabbling in booze and drugs, thereby using the extended “timeout” to undergo detoxification. His absence from the scene has never been fully explained.
Emerging from the dark, Dylan released two back-to-back albums of new material: the introspective John Wesley Harding in 1968, and the country-flavored Nashville Skyline in 1969. The public later learned that he and his backing group the Hawks (soon to be known simply as The Band) had been busy documenting their latest efforts in the experimental recordings dubbed The Basement Tapes (released in 1975), which confirmed the singer-songwriter’s growing obsession with Country & Western themes fused with rural rock.
He would not perform live again until a 1974 concert tour. Five years later, Dylan, who was born into the Jewish faith, would formally convert to Christianity. He was no longer the proverbial “Mad Prophet of the Airwaves” (that honor would go to the fictional Howard Beale from the movie Network), but a man trying to confront the expected norms of artistic life. He would celebrate his conversion with the launch of Slow Train Coming (1979).
Bob Dylan’s abandonment of live performing, and the acid-tripped rock-n-roll lifestyle that went with it and that he formerly espoused, had a heavy impact on other bands and individuals, as we shall see.
(End of Part One – To Be Continued…)
Copyright © 2014 by Josmar F. Lopes
Introduction to “Reel” Life
The fall 2013 issue of Cineaste includes a feature-length article by the magazine’s consulting editor, Dan Georgakas, about the ten-part documentary series entitled “Oliver Stone’s Untold History of the United States: The Course of Empire.”
Known for his faux-biographical depictions of Presidents John F. Kennedy (JFK), Richard M. Nixon (Nixon) and George W. Bush (W), as well as imaginative recreations of events and personages in such films as Salvador, Platoon, Wall Street, The Doors, Born on the Fourth of July and World Trade Center, screenwriter, producer, director and lecturer Oliver Stone has also coauthored a 750-page companion book to the documentary with Peter Kuznick, professor of history and director of the Nuclear Studies Institute at American University, who contributed to the series’ script.
In the book and documentary (to be issued on DVD in March 2014), Vietnam veteran Mr. Stone attempts to correct the numerous inaccuracies that he believes have been foisted upon Americans with regard to their own history. Among the themes associated with his five-year project is Stone’s contention that we have been thoroughly misled about U.S. involvement in a variety of international conflicts, beginning with (but not limited to) the Second World War — hence the Untold History aspect of the title.
From such seminal ideas as Manifest Destiny and American exceptionalism to this country’s later foreign policy with respect to Korea, the Cold War, Vietnam, Iraq and other trouble spots around the globe, the idea of a secret basis to, or “between the lines” reading of, American history is challenged and refuted by Mr. Georgakas. To begin with, he charges the narrative of Stone’s documentary with, among other things, “a penchant for interpreting historical decisions as dependent on personalities,” as if all it took to plunge America into all-out war were the bull-headed decisions of a few charismatic leaders acting on gut feeling.
Georgakas then takes the director to task, mostly over his use of Hollywood fiction films (“A troubling and surprising aspect”), which serve as visual manifestations of many of the events discussed and analyzed in Stone’s multi-part series. When viewing these movie clips, Georgakas contends, viewers might mistake them for the unvarnished truth — or worse, for indisputable evidence of the validity of Stone’s claims. Further, he goes on to cite an intrinsic problem that exists in this country, in that many people tend to get their history (along with their facts) from movies, television and online news services — which, as many of us know, aren’t always the most dependable and, more often than not, have agendas of their own to push.
This argument raises the whole issue, then, of whether anyone — be they American or German, British or Chinese, Russian or Lithuanian — has the temerity to portray history, or past historical events, in forms (that is to say, Hollywood films) that are fundamentally at odds with legitimate or traditionally accepted means, thereby making said forms subject to re-interpretation by a single individual, if not a whole host of them — in Stone’s case, by a radical filmmaker with his own agenda to pursue.
To my understanding, this defeats the purpose of having historians, i.e., persons trained and experienced in recognizing the differences between fiction and fact, act as custodians of the past. As we are keenly aware, it’s a widely held notion that “history is written by the victors.” What this statement ultimately reveals, however, is that events leading up to those victories possess a built-in degree of ambiguity. In other words, they are dependent exclusively on the writer’s limitations as an educator or historian, along with that individual’s choice of material from among a wealth or lack of available sources, as well as specific knowledge of events.
More to the point, an element of trust must exist between the reader and the writer in accepting this individual’s finished output. If that trust is broken or disturbed, or never existed in the first place, then the fault lies with the writer and his work. But if that trust can be established at the outset and kept intact throughout, only then can we be assured of a fairly objective and reliable reading of the past — given the nature and type of quantifiable evidence relied upon.
An excellent example of this can be found in Miranda Carter’s richly detailed book, George, Nicholas and Wilhelm: Three Royal Cousins and the Road to World War I, wherein extensive correspondence between the three titular heads of state, their personal recollections and individual diaries and memoirs, in addition to historical records, documentation, memoranda, obscure notations, newspaper accounts, period writings, and other primary-source material helped to elucidate the topic in a thoroughly satisfying manner.
This is where reader, writer and editor Mr. Georgakas, and screenwriter, director and lecturer Mr. Stone, part company, in that the main bone of contention is the latter’s use of Hollywood fiction films as stand-ins for the requisite evidential source-work; or, to put it brusquely, the introduction and incorporation of non-traditional (read: illegitimate) forms that are hardly the last word in authenticity or accuracy.
By that reckoning, Stone’s past record of cinematic accomplishments is not exactly what Georgakas, or anybody else for that matter, would term a fitting background for this kind of “complex social, economic, and political” endeavor, thus squelching the needed trust factor from the start.
When in the “Course” of Human Events
I have always been fascinated by history. I did, in fact, major in the subject at Fordham University, while I continue to espouse a thoughtful and constantly evolving interest in Hollywood films with stories about individuals, personalities and themes related to the past (Lawrence of Arabia, Lincoln, Saving Private Ryan, Glory, Patton, The Aviator, The King’s Speech, El Cid, and numerous others). This is what attracted me to Cineaste’s piece and director Stone’s prospective thesis.
The magazine itself has even devoted whole issues to the subject. In fact, their spring 2004 edition included a “Film and History Supplement” that was published with “special support provided by the Academy Foundation of the Academy of Motion Picture Arts and Sciences.” The editors of Cineaste, as well as this author, concur with and categorically accept the notion that film can bring clarity and purpose to historical subject matter in more entertaining ways than traditional methodologies can.
When I was a teacher of English as a Foreign Language in Brazil, I developed a course entitled “American History and Culture through Film.” The course’s aim was to chart the path the country took to become an independent nation and world power, by linking this objective to various Hollywood films that dealt with the same concerns. Commentaries from historians and movie critics, in conjunction with film reviews, critiques, pictures and clips, would be shown in support of the assigned reading material.
The text that was used, An Illustrated History of the USA by Bryn O’Callaghan (published by Addison Wesley Longman, Ltd., 1990), explored the “development of the United States from its origins as a land inhabited by scattered Amerindian tribes to the culturally diverse but united country that we see today.”
Reflecting back on my course as it relates to Georgakas’ article, I realized, to my surprise, that perhaps I had been performing the same function back in my pedagogical days that Stone was attempting to perform today — and via similar methodology. The difference, however, was that back then I made no claim to historical accuracy in my use of Hollywood fiction films. Where needed, I provided the appropriate explanations and clarifications to what my students were viewing, so that the assigned readings on culture and history worked hand-in-hand with the images I intended to show. Still, numerous questions came to mind as I was planning and preparing my course for presentation.
As an indication of this thought process — and, to be perfectly honest, the thought processes of my students — I have listed many of the questions below. The answers to these questions have been provided where feasible, although for the most part they remain open-ended, which, for all curious and supportive teachers everywhere, is as it should be.
To begin with, what is history? What is the difference between what we call “history” and a simple “story” — a word derived from the same Latin and Greek roots? Why do we study history? How is history different from, say, myth and legend? For one thing, history is the study of past events, or events known to have occurred in the past. We may also define it as a search for the truth. The reason we study it is self-evident: to understand how and why these events occurred and, if possible, avoid a repetition of those mistakes that led to their occurrence.
Discounting the influences of religion, how is history conveyed and preserved? One of the ways that history can be conveyed is by oral means, which is not the most practical or reliable. One of the ways it can be preserved is by the written word, which turns out to be quite practical, but can also become unreliable. Other methods of conveying and preserving history are visual, i.e., through pictures, photographs, film, TV, video, and digital, electronic or hard-copy formats.
If we look at one of these methods — film — we can see that film is a combination of many forms of preservation. We know that film is a visual record of an event (for example, Abraham Zapruder’s 8mm footage of the Kennedy assassination). The event can be current or one that took place in the past.
A documentary, then, is a real-life record of an event or occurrence, either currently or in the past. We also have fictional records of an event, which can be defined as a recreation of the past, albeit one where the narrative is subject to embellishment so as to incorporate a specific story line or plot. Representations of future events, or events yet to have occurred, are labeled science fiction. To this we add a level of speculation about the future and what that future might hold.
When filmmakers decide to recreate the past on screen, it’s instructive to ask how one can transform an event that has already taken place into one that has yet to occur. To put it another way, when we see a cinematic representation of a past historical event, do we ever wonder how the past could suddenly have become the present? Upon completion of our viewing of a film, how often do we notice that the present has now become the past? Why is that important? What influence does the past have on current events? How about on future events?
As we ponder the range of possibilities implicit in the above queries, keep in mind George Orwell’s famous warning from his novel 1984: “Who controls the past controls the future. Who controls the present controls the past.” The implication here is that by controlling the past one can also control both the present and the future.
Film, as far as we know, is the end-product of a vision of many people, a collective vision of a talented and diverse group of individual mind-sets. Among the individuals involved with that collective vision are the film’s director, producer and screenwriter, the production designer and art director, the costume designer and cinematographer, the soundtrack and Foley artists, the composer and special-effects artists, and dozens upon dozens more. How do we bring all these disparate elements together into a coherent whole in order to tell a viable story? For that matter, what story do we want to tell? How does one choose a story from the past from among so many stories available to us?
Here’s a little exercise you can do in the privacy of your own home or apartment. Take an incident from your childhood, say, your first day at school. Now think about the main characters involved in that incident.
Next, take the main events of your story and telescope them by mapping out a timeline of events. Narrow the focus down to the essential ingredients; try concentrating on one specific event at a time, one major highlight of your tale that will help tell the story visually. What events do you use? Which characters do you include? Which ones do you reject?
Think about the time-lapse of events, what we call foreshortening, as a way of telling your story. Do you want a visual representation of these events (“plain vanilla”), or a verbal and visual one (“mixed bag”)? How would you present them and in what order?
The next part is more thought provoking. From the above mental exercise, determine if your final product will be a “true” representation of the past or a fictionalized account. Could actors really take the place of real people in your story? Who would you hire to play the major roles? Who would be the lead? Who would direct the film version? Who would write the script? Compose the score? Shoot the footage? Decorate the set? So much to think about, so little time to spare. This is the dilemma of all those professional story-tellers out there we call filmmakers — welcome to the club!
Revisionist History and the Distance Problem
What do we mean by revisionist history? Theoretically, revisionist history (also known as historical revisionism) is the process of finding inaccuracies or fallacies in the historical narrative or record and making corrections to it. Hand-in-hand with correcting the narrative comes the added difficulty of having to challenge people’s long-held views of the past and their reluctance, as we perceive it, to accept any revision.
Is this what is meant by a “modern interpretation” of past events, for example, the aforementioned Kennedy assassination and the supposed “lone gunman” theory (JFK)? What about the plot to kill Adolf Hitler (Valkyrie), or the Watergate scandal (All the President’s Men), or Iran-Contra (Clear and Present Danger), or the hunt for Bin Laden (Zero Dark Thirty), or any number of past occurrences?
As a prospective movie-maker, a potential Oliver Stone in the making, can you get away with revising the past — and to what degree? What’s to be gained by doing so, and is it right to engage in revisionism for artistic purposes, the so-called “art for art’s sake” excuse? How can we escape the dangers of historical revisionism? Shouldn’t we present these events as they really were? That’s the province of investigative journalism, isn’t it, of the kind that figured prominently in the movie All the President’s Men?
Let’s take the stories of individuals from the past. Some recent film subjects include King George VI, Abraham Lincoln, Franklin Delano Roosevelt, and Nelson Mandela. These are all famous subjects who made their lasting mark in the past. Distance from a subject can bring with it a kind of physical as well as mental distortion.
Try this experiment: take a small object — your smart phone, for instance — and place it in front of you. You can still read the phone’s contents, can’t you? Now place the phone farther and farther away from you. It becomes progressively more difficult to read the contents, doesn’t it? Bring it closer to you, and it becomes clear again.
Let’s look at a postcard, any postcard, of your favorite haunt or vacation spot. Postcards have a sharper focus “close up” than they have when held at a distance from one’s prying eyes. One can’t read the inscription on the back when the postcard is away from your field of vision. One must rely on one’s memory of what the inscription actually says. But memory is a fragile thing, as we know, and many times the memory of what was said or written fades into obscurity. Now you can understand and appreciate what distance can do to history. This is what we call the distance problem.
This experiment can be applied to politics as well as to history. Do politicians really believe that their constituents will remember what they promised to do for them once they get elected? With the advent of the Internet, YouTube, and other electronic means of preservation, we now have the ability to instantly fact-check what those same politicians have promised and thus correct the historical narrative, i.e., those inaccuracies, fallacies, and distortions of the recent past that were once taken for granted. It’s a dream come true for historians.
To Film or Not to Film
Which films can be used to describe the historical narrative? While my course was basically concerned with films on or about American history, any halfway decent representation of the past can be utilized, as long as one prefaces its use and content with the following caveat: “This is only a fictional reenactment of the events or persons depicted.” That is, “It ain’t necessarily so, folks.”
Some subjects have an enormous quantity of available footage to select from (Westerns, Lincoln, the Civil War, and Vietnam). Others have a very limited field from which to choose (the American Revolution, FDR, and Nixon). While most of the films deal with actual historical events or situations from the past, some are purely fictional representations (Gone with the Wind, The Manchurian Candidate) with factual aspects thrown in. Nevertheless, I attempted to highlight as much of American history and culture as possible in my choice of movies, without boring the students. Since I was dealing with a Brazilian mind-set, I chose popular films about subjects and personalities that Brazilians had a general knowledge of and curiosity about.
Along the same lines, what is a film genre? Why do we classify movies by subject? Is it easier or more difficult to place a film story in a genre? What are the conventions of a genre? Let’s have a look at a typical American genre, the Western. What are the conventions of a Western? Well, there’s a good guy, a bad guy, the chase or pursuit of one party after the other, the posse (the ones who do the pursuing), and the conflict. There’s also the resolution of the conflict, known as the duel or shootout; the scenery, the horses, the Indians, the girl, and the reward.
Most people would be surprised to learn that most of the above conventions never took place, or if they did occur it was not in the manner represented on film. An illustration of this point is the story of Wyatt Earp.
Before 1900, Earp was an almost totally unknown figure. Then a storyteller took his tale and transformed it into a dime-novel dreadful called Frontier Marshal, which made Earp’s name a legend. Afterward, the legend became myth, and the myth became one of the most famous Wild West stories of all time. This bred other tall tales, including those of Jesse and Frank James, Billy the Kid, Annie Oakley, Buffalo Bill, and a myriad of others.
There are numerous examples of films that dramatize how fictitious events become accepted fact. The “facts,” such as they are, get converted and distorted into legend, which later becomes myth. John Ford’s Fort Apache, and especially his The Man Who Shot Liberty Valance, bring this facet of myth-making to vivid and disturbing life. A modern interpretation of this phenomenon is present in Clint Eastwood’s unforgettable film Unforgiven, in which the lead character, William Munny, is faced with having to live down his murderous reputation while simultaneously being challenged to live up to that same reputation in order to collect a monetary reward at the end.
This brings us back to the essential problem of history and the historical film, which can be encapsulated in the famous line spoken by one of the reporters covering the funeral of the John Wayne character, Tom Doniphon, in The Man Who Shot Liberty Valance. After listening to the real version of events surrounding the shooting of the notorious killer Liberty Valance by U.S. Senator and former state governor Ransom Stoddard (James Stewart), the newspaperman unhesitatingly burns his notes and declares to Stoddard, “This is the West, sir. When the legend becomes fact, print the legend” — which is exactly what many historical films do.
And no history remains “untold” for very long, as we will see in Part Two.
(End of Part One)
Copyright © 2014 by Josmar F. Lopes
Moviegoers may fondly recall one of the finest costume dramas of years past, The Madness of King George. Directed by Nicholas Hytner (The Crucible), the 1994 film version of Alan Bennett’s play The Madness of George III (which was also directed by Hytner) dramatized the plight of England’s eighteenth-century sovereign George III, magnificently portrayed by the late Nigel Hawthorne (“What, what!”), who repeated his successful stage role for the screen.
In particular, we remember how that same monarch, who lost the American colonies to a bold grassroots movement, was suddenly struck down in the prime of his life by a mysterious ailment; how he behaved rather “irrationally,” to put it mildly, toward his wife, Queen Charlotte (a steadfast Helen Mirren), and her ladies-in-waiting, spewing forth all manner of childish prattle; and how he was ultimately “cured” by minister-turned-medic Dr. Willis (the no-nonsense Ian Holm, in his pre-Bilbo Baggins days), who used the most unconventional methods then available to science in a sort of physician’s mind-over-monarch approach.
In our more “enlightened” times, King George’s peculiarities may turn out to have been symptoms of a rare disease we know today as porphyria, an incurable disorder that attacks the body’s nervous system. It manifests itself in various forms, including abdominal pain and tachycardia, as well as unreasonable behavior toward others (boorish shouting, constant interruptions, and lewd remarks, of which there is plenty in the film) and portentously dark urinary output. In the movie, the color of the king’s “water” is a rather unsettling shade of blue.
While all this is transpiring behind the scenes, the king’s indolent eldest son (also named George, and played in appropriately stiff-upper-lip fashion by a lazy-eyed Rupert Everett) is goaded into taking over his father’s throne, first by having His Majesty declared mentally incompetent (good luck with that!), then by attempting to assume the role of a “democratically” appointed regent in alliance with his father’s opponents. Now, George! Behave thyself!
When The Madness of King George was first released, it was considered a fairly scathing commentary on Queen Elizabeth II and her dysfunctional royal family. Today, it bears closer scrutiny, for it depicts not only the disintegration from within of a fragile form of government (shades of the fractious U.S. Congress), but one that was entirely dependent upon the force and personality of a solitary, charismatic leader, i.e., the king himself (paging Donald Trump).
Nevertheless, the latest research, from a study at St George’s, University of London, suggests that the king may indeed have been suffering from mental illness all along. After a lengthy examination of the hundreds upon hundreds of letters the prolific George III sent to various and sundry individuals throughout his long life, the research team concluded that, given the sovereign’s wordiness and use of complex language and vocabulary, in addition to his overly colorful epithets and breathless loquaciousness, he may have experienced one of the earliest documented cases of bipolar disorder.
As for the blue-tinted urine, that was explained by the royal physicians’ administering of a medication derived from the gentian plant. With its deep blue petals, the plant is still in use today as a mild sedative or tonic, and it could well have turned one’s urinary output blue (What, what?).
With the king’s return to “normally accepted” court behavior, a parliamentary crisis is averted and the English monarchy resumes its steady-as-she-goes course for a grand total of sixty years under His Majesty’s rule. The newly put-forth diagnosis of mental illness, combined with dual personality issues, appears finally to address the notion that the madness of King George had a psychological as well as a neurological basis.
That George III eventually recovered his wits, and came back, full-throttle, to lord it over his wayward son — and to put down those poor unfortunates who came ever so close to fomenting out-and-out rebellion against his realm — was not lost on modern movie audiences.
Considering how our own politicians have been faring of late, maybe they should have their own urinary output examined. You know, just in case …
The Madness of King George (1994)
Produced by Stephen Evans and David Parfitt; directed by Nicholas Hytner; screenplay written by Alan Bennett, from his play The Madness of George III; music by George Fenton and George Frideric Handel; cinematography by Andrew Dunn; edited by Tariq Anwar; starring Nigel Hawthorne, Helen Mirren, Ian Holm, Rupert Everett, Amanda Donohoe, John Wood, and Rupert Graves. 107 min. Distributed by Samuel Goldwyn Company.