The Oscars vs. the public

Although it’s still one of the highest-rated shows on television, last week’s Academy Awards garnered its lowest ratings in six years, down about 16% from last year alone. Some (including myself) see this as a reflection of how terrible the show is — with monotonous acceptance speeches for little-understood categories (like Best Sound Mixing), tacky musical numbers that seem to have arrived via a time warp from another century and an emcee whose attempts at humor fall embarrassingly flat.

But hey, the Oscars (with few exceptions) have always been like that — even when they got much higher ratings. We watched anyway — because we wanted to gawk at the stars, learn the winners of the “big” awards as soon as they were announced and perhaps get to savor an unexpected memorable moment or two.

Lastly, we hoped to see our favorite films get rewarded with a statuette. And herein lies a critical “problem” with the Oscars, at least as cited by many pundits. As noted by Michael Cieply and Brooks Barnes in a New York Times article, the Best Picture winner, Birdman, collected only about $11 million in ticket sales between the time it was nominated and the day of the awards. In contrast, American Sniper, another Best Picture nominee, took in $317 million over the same period — almost as much as the other seven nominees combined. Yet Sniper went home with only one award — a minor one for sound editing.

The Oscars vs. movie-goers disparity is actually worse than the Sniper example. If you look at the top eight highest-grossing movies of 2014, American Sniper is the only film on the list to have gotten a Best Picture nomination. This is not a new phenomenon; it is a trend that has been developing for years. Indeed, it is precisely what led the Academy to expand the potential list of nominees beyond its prior five-movie limit. The goal was to increase the number of popular films that got nominated. It hasn’t quite worked out that way. Chances are most viewers won’t see their favorite films win any awards — because their favorite films aren’t even nominated.

Cieply and Barnes’ explanation is that the Oscars “have become hopelessly detached from movie viewers…Both the Academy and the echo chamber of Hollywood’s awards-system machinery have nearly broken their connection with the movies that many millions of people buy tickets to watch.” The Academy voters have become “elitist” and “not in step with anything that is actually popular. No one really believes anymore that the films they chose are the ones that are going to last over time.”

I beg to differ.

If the Academy is “elitist,” then so is almost every other institution that gives out film awards. According to IMDb, Birdman received 170 wins and 152 nominations, including several Best Picture wins and a spot on almost every critic’s Top Ten list. The only other movie with comparable credentials was Boyhood, another low-grossing Best Picture nominee. In contrast, The Hunger Games: Mockingjay – Part 1, the highest-grossing film of the year, received only 1 win and 9 nominations — none of which were for Best Picture. Even American Sniper had only 8 wins and 25 nominations overall.

As I see it, the reason for this is that the Oscars, as well as most of the other awards cited by IMDb, are not based on popularity, at least not as measured by box office success. Ideally, they are determined primarily by artistic merit. These two criteria show very little overlap, especially these days. This disconnect is by no means unique to movies. You see the same thing in literature. The National Book Award, the Pulitzer Prize and the Nobel Prize are rarely given to books that top the New York Times Best Sellers list.

I’m not naive. I know that politics, among other non-quality factors, contribute to an Oscar win. But it’s still fair to say that sheer popularity is not the determining factor — probably less determining now than ever before. And this is as it should be. I view the trend of recent years as a positive one. Would we rather return to a time when movies like Oliver! or Driving Miss Daisy won Best Picture? I hope not.

Birdman may or may not be your choice for Best Picture. But it is undeniably a great film. It had terrific acting by the entire cast, creative cinematography and an inventive percussive soundtrack, topped off by an original, thought-provoking screenplay. I contend that this is a film that will “last over time” — certainly much more so than most of the films that make up the top box-office winners — populated with big budget, special-effects-laden sequels and franchises like The Hunger Games, Captain America, The Hobbit and Transformers. Which film do you consider more likely to join Citizen Kane, The Godfather, or Casablanca on the AFI’s list of 100 greatest films of all time: Birdman or Transformers?

Despite all of this, I do concede there is an uncomfortable disconnect between the awards and the public — one that has grown larger over the years.

The disconnect is partly attributable to the rise of the blockbuster movie — which has divided the year into summer action movies vs. fall “serious” movies. The result is that the most popular films come out in the summer and the most Oscar-nominated films in the fall.

This is assisted by the fact that, as has always been true, money talks loudest in Hollywood. Ask a studio head whether he would rather produce a crappy film that makes huge amounts of money or a great film that barely ekes out a profit. Almost always (maybe always), the answer will be the former. So action movie crap too often gets a green light.

The disconnect is also partly attributable to a change in viewing habits. More and more, people are content to view movies at home on their large-screen televisions (or even their small mobile devices), rather than in theaters. This especially hurts theater ticket sales of smaller independent character-driven films — ones that skew toward an older, less theater-going audience and do not benefit much from being seen in a theater anyway.

There was a time when some of the best and most memorable movies of the year were also among the most popular. The peak for this was probably the 1970’s — when movies like The Godfather, The French Connection, The Sting and Annie Hall won Best Picture. No more. We live in a time when, with few exceptions, the most popular and money-making films are the ones that most appeal to teenagers seeking the film equivalent of a comic book or young-adult novel. This is not the best criterion for a great film. And these films rarely get many award nominations.

So, yes, the Academy Awards are detached from the mainstream of movie-goers these days. While there is still the potential to make movies (such as American Sniper) that achieve both box office and critical success, it has become increasingly difficult to do so. But the solution is not to turn the Oscars into the People’s Choice Awards. Hollywood should continue to strive to give Oscars to what it perceives as the best films, regardless of box office receipts. If that means a decline in the popularity of the Oscar telecast, so be it.

However, I believe Hollywood should be able to figure out how to make the Oscar ceremony a much more entertaining event. That could go a long way to improving the ratings. I have ideas about this, starting with focusing on movies rather than dumb musical numbers…but that’s a subject for another column.

Addendum: I just finished reading Richard Corliss’ Time magazine article [subscription required] covering this same territory. I agree with his contention that a couple of the top grossing movies were well-received and could have qualified for a Best Picture nomination (especially “The Lego Movie” and “Guardians of the Galaxy”) — although I can’t see any of them winning. However, I would point out that equal Rotten Tomatoes ratings are not necessarily equivalent; two movies could both get a 90% approval from critics yet these same critics could agree that only one of the movies deserves consideration for Best Picture. Corliss also makes a good point that the subject matter of most of the nominated movies appeals more to an older audience — which likely reflects the 60-ish average age of Academy members. This could use some fixing. Beyond that, the Time column did not lead me to modify the views I expressed here.

Posted in Entertainment, Movies, Television

The Ku Klux Klan vs. Muslim extremists

In a recent column for Time (These Terrorist Attacks Are Not About Religion), Kareem Abdul-Jabbar put it bluntly:

“When the Ku Klux Klan burns a cross in a black family’s yard, Christians aren’t required to explain how these aren’t really Christian acts.

Most people already realize that the KKK doesn’t represent Christian teachings. That’s what I and other Muslims long for—the day when these terrorists praising Mohammed or Allah’s name as they debase their actual teachings are instantly recognized as thugs disguising themselves as Muslims.”

At first glance, I find Abdul-Jabbar’s analogy to be compelling. Comparing extremist Muslims to the Ku Klux Klan makes a lot of sense. They are both hate-filled, violence-prone minorities. However, on closer examination, the analogy begins to fall apart.

For one thing, by whose authority does Abdul-Jabbar assert that the terrorists are “disguising themselves as Muslims” — as opposed to being true Muslims? I assume that members of Al-Qaeda would make the same accusation about Abdul-Jabbar. As I have previously asserted, there are minority segments of all religions. Being a minority, even a violent minority, does not mean you cannot also be a legitimate member of a religion. There are certainly those who would claim that advocating violence is as much a part of religious teachings, both Muslim and Christian, as advocating peace.

As for the terrorists who gunned down the staff of Charlie Hebdo — it is true that they are small in number. However, these terrorists were not just a bunch of thugs acting in isolation. They are not, as Abdul-Jabbar suggests, the equivalent of “bank robbers wearing masks of presidents.”

Rather, the terrorists were trained and backed by Al-Qaeda in Yemen. And Al-Qaeda does not exist in a vacuum. It survives in part because of support from the population and authorities in the countries where it operates. Many Muslims in these countries offer tacit approval of such acts, even if they assert that they would never carry out such acts themselves.

Here is where I believe that Abdul-Jabbar’s Ku Klux Klan analogy is at its most accurate, although not in the way he intended. We shouldn’t look at the analogy from the point of view of a comfortable American living in 2015. Rather, look at it from the perspective of an African-American living in the deep South in the 1950’s.

Here you are, a black person at the time when the Ku Klux Klan’s power and influence were at their height. The Klan may represent only a tiny minority of the Christian population around you. They may represent a distorted view of Christianity, one that Christ himself would reject. Indeed, as a black person, you likely attend a Christian church that holds very different views.

Regardless, you know that none of this really matters. The larger truth is that the Klan survives because it is tolerated by the rest of the community. More than that, much of the community quietly approves of what the Klan is doing, even if they would never participate in its actions.

Indeed, the majority population of the Southern states was overtly racist. As a black person in the South in the 1950’s, you see this every time you are humiliated by the institutionalized racism that surrounds you. You have to go to the back of the bus. You can’t use the “whites only” water fountain. Schools are completely segregated. You can’t buy a house in most neighborhoods of a city. You can’t even vote. And you risk getting beaten by the police for challenging any of these restrictions. This racism is sanctioned by the government, all the way from the local councilman to the governor of the state.

This is the full picture of the time of the Ku Klux Klan. With this full picture in mind, we see that the analogy to the Muslim situation today is apt, but differently than the way Abdul-Jabbar asserts.

Today, we see a Muslim world in the Middle East where, like the deep South decades ago, the population is unwilling to speak out against the actions of the extremists. Too often, the silence masks a disturbing approval of these actions. The supporters may not represent the majority — but they are far from a trivial component. In many instances, discrimination is institutionalized — even towards other members of the Muslim faith — as seen in the gross inequality toward women and harsh penalties (including death) for those who rebel against the faith. And, of course, anti-Semitism is rampant everywhere.

The Ku Klux Klan was an extreme manifestation of racism in the South, but not the exclusive or even primary proponent of it. I believe the same is true today for the Muslim extremists in the Middle East.

If and when the day ever comes that the views of Abdul-Jabbar are representative of all parts of the Muslim world, I will happily join Abdul-Jabbar in what he “longs for.” Until then, I contend that these terrorist attacks are about religion — not the religion as Abdul-Jabbar practices it, but religion nonetheless.

Just saw this today: Egypt student gets 3-year jail term for atheism.

Posted in Media, Politics

Smart device overkill

I own a smart TV. Among other things, I can use it to connect to Netflix, with no other device needed.

I also have a smart Blu-ray player. It too includes an option to select Netflix, as well as a small assortment of other “channels.”

Lastly, I have an Apple TV. As you probably already know, I can choose to watch Netflix from this device as well.

I have absolutely no need for three different ways to stream video from Netflix. One is definitely sufficient. [I’m not even going to go into the fact that I can also watch Netflix on my Mac, iPad and iPhone.]

Currently, the Apple TV is my preferred choice. This is because, of the three devices, it has the most feature-filled and easiest-to-navigate interface. I also stay exclusively with Apple TV because it is the device I use for channels, such as HBO GO, that are not available on the other two devices. Apple TV is also the only device that gives me access to my iTunes library and offers AirPlay. Case closed.

Essentially, if my television and Blu-ray player magically became dumb devices overnight, it would not matter to me one whit.

This is the dilemma that is facing the makers of these smart devices. The market is currently suffering from an overdose of overlapping devices. It’s especially tricky for television makers (see this Macworld article for related insight). No matter how smart televisions become, it won’t matter to their sales if people like me still prefer to use Apple TV instead. At the same time, Apple needs to worry that, if they don’t update the Apple TV sufficiently, people like me may yet abandon it in favor of improved and expanded features on televisions.

In the end, there may remain room for more than one of these devices to stay profitable. For example, those on a tighter budget might stick with their television alone (as this doesn’t require an additional purchase) while those with more disposable income go for an Apple TV or Roku.

Regardless, the current mishmash is not sustainable. There will be winners and losers. The losers will gradually vanish from the landscape. I already anticipate this happening with smart Blu-ray players, maybe even with optical disc players altogether. Who will emerge as dominant in the battle between televisions and Apple TV/Roku devices remains to be seen. However, I expect that new hardware coming later this year will go a long way to determining which way the ball will bounce. Personally, I’m still hoping for a much improved Apple TV to win the day. But it’s far from certain that this will happen. Game on.

Posted in Apple Inc, Entertainment, Movies, Technology, Television

Giving religion the respect it deserves

If a religion finds a particular action offensive to its beliefs, shouldn’t we at least attempt to avoid the action, if only out of a show of respect?

In the wake of yesterday’s massacre at a Paris newspaper, the public’s answer tilts clearly towards “no” — at least for those who have adopted “Je suis Charlie” as a rallying cry. More precisely, people are proclaiming that potentially offensive free speech and free expression should not be censored — certainly not by the violent acts of a few. The people at Charlie Hebdo had every right to publish what they did — even if, by depicting satirical images of Muhammad, they were offending many Muslims.

Given the violence that occurred, this is a relatively easy call to make. If the alternative is to defend the terrorists, there isn’t much room for debate.

The question often becomes more nuanced, however, if you remove the violence and simply ask the question I posed at the top of this article.

For me, however, the answer remains the same: No.

Let me back up a bit. I am not advocating being gratuitously insulting to a religion. Nor am I in any way supporting behavior that could be viewed as discriminatory or racist. I also believe that unqualified respect should be expected in certain cases. For example, no matter how much you disagree with a particular religion, I believe you should be respectful when on their turf. In other words, if you are inside a synagogue, church or mosque, you should observe the customs of the institution, even if you disagree with them.

Beyond that, we should give and expect to receive respect in most interactions. But there are limits. In the context of public discussion, for example, we should be as free to be critical of religion — even to the point of being insulting or offensive — as we would be for any other entity. In the op-ed pages of a newspaper, it is acceptable to be hypercritical of politicians — or political groups as a whole. Similarly, movie reviewers are permitted (some might say encouraged) to say extremely negative things about a film, even things that will undoubtedly be hurtful to the people who made the movie. No one claims such writing should be off-limits, out of respect to the people who might otherwise be offended. Even if you believe a writer has gone beyond the limits of decency and good taste, you would still defend his right to state his opinions. At least I hope so. I see no reason why critical writing about religion should be an exception. Religion deserves no more or less respect than these other institutions.

More generally, you can be offensive to a religion even without the intention of being critical. Depicting (non-satirical) images of Muhammad potentially falls into this category. Attempting to avoid such actions is an especially slippery slope on which to embark. For example, suppose I told you that there is a religion that believes all paintings hung in public places, such as museums, should be hung upside down. This is out of respect to God, as it allows him to see the paintings properly oriented when he looks down on them from heaven. If you were the curator of a museum, would this knowledge lead you to rehang all your paintings? I would hope not. Would you change your mind if I told you that there were more than 10 million members of this religion and they all found your behavior to be extremely offensive? Again, I would hope not.

There is a limit to what we will or should do to accommodate others’ religions. We cannot allow free expression to be held hostage by the myriad odd beliefs of the hundreds of religions that exist in the world. I’m not advocating unnecessarily going out of your way to be provocative. But neither should you be fearful of being provocative if you feel it is justified. That is why I believe it is acceptable for publications to include images of Muhammad, whether or not members of the Muslim religion object. In Paris yesterday, we saw one horrific consequence of believing otherwise.

[For related coverage, see yesterday’s column.]

Posted in Media, Politics