Thursday, March 31, 2011

As April 15th approaches, and Uncle Sam takes his pound of flesh from you, consider this excerpt from The New York Times:



G.E. Avoids Taxes Altogether

By DAVID KOCIENIEWSKI
MARCH 25, 2011

The company reported worldwide profits of $14.2 billion, and said $5.1 billion of the total came from its operations in the United States.
Its American tax bill? None. In fact, G.E. claimed a tax benefit of $3.2 billion.
Although the top corporate tax rate in the United States is 35 percent, one of the highest in the world, companies have been increasingly using a maze of shelters, tax credits and subsidies to pay far less. Such strategies have pushed down the corporate share of the nation’s tax receipts — from 30 percent of all federal revenue in the mid-1950s to 6.6 percent in 2009.
Even as the government faces a mounting budget deficit, President Obama has designated G.E.’s chief executive, Jeffrey R. Immelt, as his liaison to the business community and as the chairman of the President’s Council on Jobs and Competitiveness. 
The assortment of tax breaks G.E. has won in Washington has provided a significant short-term gain for the company’s executives and shareholders. While the financial crisis led G.E. to post a loss in the United States in 2009, regulatory filings show that in the last five years, G.E. has accumulated $26 billion in American profits, and received a net tax benefit from the I.R.S. of $4.1 billion.
In the mid-1980s, President Ronald Reagan overhauled the tax system after learning that G.E. — a company for which he had once worked as a commercial pitchman — was among dozens of corporations that had used accounting gamesmanship to avoid paying any taxes.
“I didn’t realize things had gotten that far out of line,” Mr. Reagan told the Treasury secretary, Donald T. Regan, according to Mr. Regan’s 1988 memoir. The president supported a change that closed loopholes and required G.E. to pay a far higher effective rate, up to 32.5 percent.
That pendulum began to swing back in the late 1990s. G.E. and other financial services firms won a change in tax law that would allow multinationals to avoid taxes on some kinds of banking and insurance income. The change meant that if G.E. financed the sale of a jet engine or generator in Ireland, for example, the company would no longer have to pay American tax on the interest income as long as the profits remained offshore.
The American Jobs Creation Act — signed into law by President George W. Bush in 2004 — contained more than $13 billion a year in tax breaks for corporations, many very beneficial to G.E. One provision allowed companies to defer taxes on overseas profits from leasing planes to airlines. It was so generous — and so tailored to G.E. and a handful of other companies — that staff members on the House Ways and Means Committee publicly complained that G.E. would reap “an overwhelming percentage” of the estimated $100 million in annual tax savings.
According to its 2007 regulatory filing, the company saved more than a billion dollars because of that law in the three years after it was enacted.
Some tax experts question what taxpayers are getting in return. Since 2002, the company has eliminated a fifth of its work force in the United States while increasing overseas employment. In that time, G.E.’s accumulated offshore profits have risen to $92 billion from $15 billion.
Representative Lloyd Doggett, Democrat of Texas, who has proposed closing many corporate tax shelters, said: “Our tax system should encourage job creation and investment in America and end these tax incentives for exporting jobs and dodging responsibility for the cost of securing our country.”
As the Obama administration and leaders in Congress consider proposals to revamp the corporate tax code, G.E. is well prepared to defend its interests. The company spent $4.1 million on outside lobbyists last year, including four boutique firms that specialize in tax policy.
“We are a diverse company, so there are a lot of issues that the government considers, that Congress considers, that affect our shareholders,” said Gary Sheffer, a G.E. spokesman. “So we want to be sure our voice is heard.”






Saturday, March 26, 2011

THE FIGHTER is now out on DVD.

Got 20 minutes?
Listen to Terry Gross' interview with Mark Wahlberg and director David O. Russell. Or, if you haven't seen it, check out the CBS 60 MINUTES profile of Wahlberg (below), which aired on November 21, 2010:

Wednesday, March 23, 2011

The best piece on Hollywood I've read in a long time (highlights are mine):



The Day the Movies Died
No, Hollywood films aren't going to get better anytime soon. Mark Harris on the (potential) death of the great American art form.

February 2011
If you want to understand how bad things are in Hollywood right now—how stifling and airless and cautious the atmosphere is, how little nourishment or encouragement a good new idea receives, and how devoid of ambition the horizon currently appears—it helps to start with a success story.
Consider: Years ago, an ace filmmaker, the man who happened to direct the third-highest-grossing movie in U.S. history, The Dark Knight, came up with an idea for a big summer movie. It's a story he loved—in fact, he wrote it himself—and it belonged to a genre, the sci-fi action thriller, that zipped right down the center lane of American popular taste. He cast as his leading man a handsome actor, Leonardo DiCaprio, who happened to star in the second-highest-grossing movie in history. Finally, to cover his bet even more, he hired half a dozen Oscar nominees and winners for supporting roles.
Sounds like a sure thing, right? Exactly the kind of movie that a studio would die to have and an audience would kill to see? Well, it was. That film, Christopher Nolan's Inception, received admiring reviews, became last summer's most discussed movie, and has grossed, as of this writing, more than three-quarters of a billion dollars worldwide.
And now the twist: The studios are trying very hard not to notice its success, or to care. Before anybody saw the movie, the buzz within the industry was: It's just a favor Warner Bros. is doing for Nolan because the studio needs him to make Batman 3. After it started to screen, the party line changed: It's too smart for the room, too smart for the summer, too smart for the audience. Just before it opened, it shifted again: Nolan is only a brand-name director to Web geeks, and his drawing power is being wildly overestimated. After it grossed $62 million on its first weekend, the word was: Yeah, that's pretty good, but it just means all the Nolan groupies came out early—now watch it drop like a stone.
And here was the buzz three months later, after Inception became the only release of 2010 to log eleven consecutive weeks in the top ten: Huh. Well, you never know.
"Huh. Well, you never know" is an admission that, put simply, things have never been worse.
It has always been disheartening when good movies flop; it gives endless comfort to those who would rather not have to try to make them and can happily take cover behind a shield labeled "The people have spoken." But it's really bad news when the industry essentially rejects a success, when a movie that should have spawned two dozen taste-based gambles on passion projects is instead greeted as an unanswerable anomaly. That kind of thinking is why Hollywood studio filmmaking, as 2010 came to its end, was at an all-time low—by which I don't mean that there are fewer really good movies than ever before (last year had its share, and so will 2011) but that it has never been harder for an intelligent, moderately budgeted, original movie aimed at adults to get onto movie screens nationwide. "It's true at every studio," says producer Dan Jinks, whose credits include the Oscar winners American Beauty and Milk. "Everyone has cut back on not just 'Oscar-worthy' movies, but on dramas, period. Caution has made them pull away. It's infected the entire business."
For the studios, a good new idea has become just too scary a road to travel. Inception, they will tell you, is an exceptional movie. And movies that need to be exceptional to succeed are bad business. "The scab you're picking at is called execution," says legendary producer Scott Rudin (The Social Network, True Grit). "Studios are hardwired not to bet on execution, and the terrible thing is, they're right. Because in terms of execution, most movies disappoint."
With that in mind, let's look ahead to what's on the menu for this year: four adaptations of comic books. One prequel to an adaptation of a comic book. One sequel to a sequel to a movie based on a toy. One sequel to a sequel to a sequel to a movie based on an amusement-park ride. One prequel to a remake. Two sequels to cartoons. One sequel to a comedy. An adaptation of a children's book. An adaptation of a Saturday-morning cartoon. One sequel with a 4 in the title. Two sequels with a 5 in the title. One sequel that, if it were inclined to use numbers, would have to have a 7 1/2 in the title.
And no Inception. Now, to be fair, in modern Hollywood, it usually takes two years, not one, for an idea to make its way through the alimentary canal of the system and onto multiplex screens, so we should really be looking at summer 2012 to see the fruit of Nolan's success. So here's what's on tap two summers from now: an adaptation of a comic book. A reboot of an adaptation of a comic book. A sequel to a sequel to an adaptation of a comic book. A sequel to a reboot of an adaptation of a TV show. A sequel to a sequel to a reboot of an adaptation of a comic book. A sequel to a cartoon. A sequel to a sequel to a cartoon. A sequel to a sequel to a sequel to a cartoon. A sequel to a sequel to a sequel to a sequel to a movie based on a young-adult novel. And soon after: Stretch Armstrong. You remember Stretch Armstrong, right? That rubberized doll you could stretch and then stretch again, at least until the sludge inside the doll would dry up and he would become Osteoporosis Armstrong? A toy that offered less narrative interest than bingo?
Let me stipulate that we will probably come out of three or four of the movies categorized above saying "That rocked!" (One of them is even being directed by Nolan.) And yes, it is technically possible that some years hence, a magazine article will begin with the sentence, "Stretch Armstrong's surprising journey to a Best Picture nomination began when..." But for now, let's just admit it: Hollywood has become an institution that is more interested in launching the next rubberized action figure than in making the next interesting movie.
At this moment of awards-giving and back-patting, however, we can all agree to love movies again, for a little while, because we're living within a mirage that exists for only about six or eight weeks around the end of each year. Right now, we can argue that any system that allows David Fincher to plumb the invention of Facebook and the Coen brothers to visit the old West, that lets us spend the holidays gorging on new work by Darren Aronofsky and David O. Russell, has got to mean that American filmmaking is in reasonably good health. But the truth is that we'll be back to summer—which seems to come sooner every year—in a heartbeat. And it's hard to hold out much hope when you hear the words that one studio executive, who could have been speaking for all her kin, is ready to chisel onto Hollywood's tombstone: "We don't tell stories anymore."
How did Hollywood get here? There's no overarching theory, no readily identifiable villain, no single moment to which the current combination of caution, despair, and underachievement that defines studio thinking can be traced. But let's pick one anyway: Top Gun.
It's now a movie-history commonplace that the late-'60s-to-mid-'70s creative resurgence of American moviemaking—the Coppola-Altman-Penn-Nichols-Bogdanovich-Ashby decade—was cut short by two movies, Jaws in 1975 and Star Wars in 1977, that lit the fuse for the summer-blockbuster era. But good summer blockbusters never hurt anyone, and in the decade that followed, the notion of "summer movie season" entered the pop-culture lexicon, but the definition of "summer movie" was far more diverse than it is today. The label could encompass a science fiction film as hushed and somber as Alien, a two-and-a-half-hour horror movie like The Shining, a directorial vision as singular as Blade Runner, an adult film noir like Body Heat, a small-scale (yes, it was) movie like E.T. The Extra-Terrestrial, a frankly erotic romantic drama like An Officer and a Gentleman. Sex was okay—so was an R rating. Adults were treated as adults rather than as overgrown children hell-bent on enshrining their own arrested development.
Then came Top Gun. The man calling the shots may have been Tony Scott, but the film's real auteurs were producers Don Simpson and Jerry Bruckheimer, two men who pioneered the "high-concept" blockbuster—films for which the trailer or even the tagline told the story instantly. At their most basic, their movies weren't movies; they were pure product—stitched-together amalgams of amphetamine action beats, star casting, music videos, and a diamond-hard laminate of technological adrenaline all designed to distract you from their lack of internal coherence, narrative credibility, or recognizable human qualities. They were rails of celluloid cocaine with only one goal: the transient heightening of sensation.
Top Gun landed directly in the cortexes of a generation of young moviegoers whose attention spans and narrative tastes were already being recalibrated by MTV and video games. That generation of 16-to-24-year-olds—the guys who felt the rush of Top Gun because it was custom-built to excite them—is now in its forties, exactly the age of many mid- and upper-midrange studio executives. And increasingly, it is their taste, their appetite, and the aesthetic of their late-'80s postadolescence that is shaping moviemaking. Which may be a brutally unfair generalization, but also leads to a legitimate question: Who would you rather have in charge—someone whose definition of a classic is Jaws or someone whose definition of a classic is Top Gun?
The Top Gun era sent the ambitions of those who wanted to break into the biz spiraling in a new direction. Fifteen years earlier, scores of young people headed to film schools to become directors. With the advent of the Reagan years, a more bottom-line-oriented cadre of would-be studio players was born, with an MBA as the new Hollywood calling card. The Top Gun era shifted that paradigm again—this time toward marketing. Which was only natural: If movies were now seen as packages, then the new kings of the business would be marketers, who could make the wrapping on that package look spectacular even if the contents were deficient.
In some ways, the ascent of the marketer was inevitable: Now that would-be blockbusters often open on more than 4,000 screens, the cost of selling a movie has skyrocketed toward—and sometimes past—$40 million to $50 million per film, which is often more than the movie itself cost to make. According to the Los Angeles Times, the studios spent $1 billion just to market the movies that were released in the summer of 2009. "Opening a movie everywhere at once is a very, very expensive proposition," says Jinks, who points out that ten years ago American Beauty could open slowly and become "ridiculously profitable without ever being the number one movie. But today, if you're opening, you're inevitably going to overspend in order to try to buy that first-place finish."
With so much money at stake, the marketer's voice at the studio table is now pivotal from the day a studio decides whether to make a movie—and usually what that voice expresses is trepidation. Their first question is not "Will the movie be good?" but "Can it be sold?" And by "sold," what they mean is "sold on the first weekend." Good movies aimed at adults tend to make their money more slowly than kid stuff, and they're helped by good reviews and word of mouth, which, from a marketing standpoint, are impossible to engineer. That's one reason studios would rather spend $100 million on a franchise film than a fraction of that on an original idea. When Rudin first got hold of The Social Network, he says, "I would get calls from people at other studios saying, 'Is that movie going? We'd love to do it. How much do you need to make it?' And I'd say, 'Somewhere between $35 and $40 million.' And they'd say, 'Oh, well, we were thinking $15.' The days of having five companies chase you for a movie that needs to be good in order to work are over."
"Fear has descended," says James Schamus, the screenwriter-producer who also heads the profitable indie company Focus Features, "and nobody in Hollywood wants to be the person who green-lit a movie that not only crashes but about which you can't protect yourself by saying, 'But at least it was based on a comic book!' "

Such an unrelenting focus on the sell rather than the goods may be why so many of the dispiritingly awful movies that studios throw at us look as if they were planned from the poster backward rather than from the good idea forward. Marketers revere the idea of brands, because a brand means that somebody, somewhere, once bought the thing they're now trying to sell. The Magic 8 Ball (tragically, yes, there is going to be a Magic 8 Ball movie) is a brand because it was a toy. Pirates of the Caribbean is a brand because it was a ride. Harry Potter is a brand because it was a series of books. Jonah Hex is a brand because it was a comic book. (Here lies one fallacy of putting marketers in charge of everything: Sometimes they forget to ask if it's a good brand.) Sequels are brands. Remakes are brands. For a good long stretch, movie stars were considered brands; this was the era in which magazines like Premiere attempted to quantify the waxing or waning clout of actors and actresses from year to year because, to the industry, having the right star seemed to be the ultimate hedge against failure.
But after three or four hundred cases in which that didn't prove out, Hollywood's obsession with star power has started to erode. In the last several years, a new rule of operation has taken over: The movie itself has to be the brand. And because a brand is, by definition, familiar, a brand is also, by definition, not original. The fear of nonbranded movies can occasionally approach the ridiculous, as it did in 2006 when Martin Scorsese's The Departed was widely viewed within the industry as a "surprise" hit, primarily because of its R rating and unfamiliar source material. It may not have been a brand, but, says its producer Graham King, "Risky? With the guy I think is the greatest living director and Nicholson, Matt Damon, Wahlberg, and Leo? If you're at a studio and you can't market that movie, then you shouldn't be in business."
Inception was not a brand, which is why nobody with a marketing background is too eager to go find the next Inception—although ironically, any studio in town would eagerly green-light Inception 2. On the other hand, as you read this, the person who gave the go-ahead to Fast Five, the (I hate to prejudge, but...) utterly unnecessary fifth installment in the Vin Diesel–Paul Walker epic The Fast and the Furious, is sleeping soundly right now, possibly even at his desk. On June 10, 2011, he will bestow on several thousand screens a product that people have already purchased four times before. How can it miss?
Of course, it can miss; can't-miss movies miss all the time. But when a movie that everyone agrees is pre-sold falls on its face, the dullness of the idea itself never gets the blame. Because the idea that familiarity might actually work against a movie, were it to take hold in Hollywood, would be so annihilating to the studio ecosystem that it would have to be rebuilt from the ground up. Give the people what they don't know they want yet is a recipe for more terror than Hollywood can accommodate.
And while that bland assembly-line ethos hasn't affected the small handful of terrific American movies that reach screens every year, it's been absolutely devastating for the stuff in the middle—that whole tier of movies that used to reside in quality somewhere below, say, There Will Be Blood but well north of Tyler Perry's Why Did I Get Married Too? It's your run-of-the-mill hey-what's-playing-tonight movie—the kind of film about which you should be able to say, "That was nothing special, but it was okay"—that has suffered most from Hollywood's collective inattention/indifference to the basic virtues of story development. If films like The Bounty Hunter and Prince of Persia: The Sands of Time define the new "okay," then the system is, not to put too fine a point on it, in very deep shit.
Fixing that, however, is nobody's current priority—because fixing it would require an admission that something's broken. Marketing isn't about that; it's about looking at what's selling and then selling more of it. These days, Hollywood's most popular commodity is the concept of endless summer. Back in the mid-'80s, the season adhered at least somewhat to actual summer: It'd launch around the last weekend of May and peter out by mid-August. That left eight months for movies that did not involve CGI or spandex. But the last decade has seen an extraordinary degree of calendar creep: In 2010, "summer" arguably began on April 30, with a remake of A Nightmare on Elm Street, and was still going pretty strong on September 10, with the opening of Resident Evil: Afterlife. And since summer's so popular, why not have more than one of them? In Hollywood thinking, what we used to call "spring" is now "Summer: The Prequel," festooned with movies like Clash of the Titans that are designed to slake your blockbuster thirst as early as the first weeks of spring. And once you go that far, it's not much of a leap to reimagine Thanksgiving as "WinterSummer." And then Christmas—which, last year, offered a new Narnia movie as well as reboots of Yogi Bear, Gulliver's Travels, and Tron—sort of becomes "WinterSummer II: The Return of WinterSummer!"
The rise of marketers has also brought on an obsession with demographics. As anyone in Hollywood will tell you, the American filmgoing populace is divided two ways: by gender and by age. Gender is self-explanatory (usually); the over-under dividing line for age is 25. Naturally, every studio chief dreams of finding a movie like Avatar that reaches all four "quadrants" of the audience: male and female, young and not. But if it can be made for the right price, a two- or even one-quadrant film can be a viable business proposition.
In Hollywood, though, not all quadrants are created equal. If you, for instance, have a vagina, you're pretty much out of luck, because women, in studio thinking, are considered a niche audience that, except when Sandra Bullock reads a script or Nicholas Sparks writes a novel, generally isn't worth taking the time to figure out. And if you were born before 1985... well, it is my sad duty to inform you that in the eyes of Hollywood, you are one of what the kids on the Internet call "the olds." I know—you thought you were one of the kids on the Internet. Not to the studios, which have realized that the closer you get to (or the farther you get from) your thirtieth birthday, the more likely you are to develop things like taste and discernment, which render you such an exhausting proposition in terms of selling a movie that, well, you might as well have a vagina.
That leaves one quadrant—men under 25—at whom the majority of studio movies are aimed, the thinking being that they'll eat just about anything that's put in front of them as long as it's spiked with the proper set of stimulants. That's why, when you look at the genres that currently dominate Hollywood—action, raunchy comedy, game/toy/ride/comic-book adaptations, horror, and, to add an extra jolt of Red Bull to all of the preceding categories, 3-D—they're all aimed at the same ADD-addled, short-term-memory-lacking, easily excitable testosterone junkie. In a world dominated by marketing, it was inevitable that the single quadrant that would come to matter most is the quadrant that's most willing to buy product even if it's mediocre.
"It's a chicken-versus-egg thing," says writer-producer Vince Gilligan, the creator of the why-aren't-there-movies-this-good cable hit Breaking Bad. "The studios say, 'Well, no one else is coming to movies reliably these days except for young males, so we'll make our movies for them.' And yet if you make movies simply for young males, nobody else is going to want to go. So Hollywood has become like Logan's Run: You turn 30, and they kill you."
The good news is that the four-quadrant theory of marketing may now be eroding. The bad news is that it's giving way to something worse—a new classification that encompasses all ages and both genders: the "I won't grow up" demographic. As recently as 1993, three kid-oriented genres—animated movies, movies based on comic books, and movies based on children's books—represented a relatively small percentage of the overall film marketplace; that year they grossed about $400 million combined (thanks mostly to Mrs. Doubtfire) and owned just a single spot in the year's top ten. In 2010, those same three genres took in more than $3 billion and by December represented eight of the year's top nine grossers.
Let me posit something: That's bad. We can all acknowledge that the world of American movies is an infinitely richer place because of Pixar and that the very best comic-book movies, from Iron Man to The Dark Knight, are pretty terrific, but the degree to which children's genres have colonized the entire movie industry goes beyond overkill. More often than not, these collectively infantilizing movies are breeding an audience—not to mention a generation of future filmmakers and studio executives—who will grow up believing that movies aimed at adults should be considered a peculiar and antique art. Like books. Or plays.
In a way, that kind of thinking is just the terminus of a decades-long marginalization of the very notion of creative ambition by the studios. If in the 1970s making good original movies was a central goal of the men who ran the studios, by the 1980s that goal had devolved to making good original movies to release at the end of the year, for Oscar season. In the 1990s, as the boom in American independent filmmaking began, the idea of a "good" movie, as New York Times critic Manohla Dargis has pointed out, eventually became a niche that could be outsourced—first to self-made moguls like Harvey Weinstein and then to boutique divisions of the studios themselves. "There was a moment a few years ago," says Schamus, "when studios said, 'Hey, all of these specialty companies seem to be taking up all the seats in the front row at the Oscars, so if they can do it, we can do it—we'll just throw money at them!' And the results, financially, ranged from mildly catastrophic to ridiculously catastrophic."
That boom went bust. Several of the studio-owned boutique divisions overspent insanely, often on weak material (and we can all agree that a really bad small movie is every bit as wretched as a really bad big movie). They muscled into the marketplace, laying waste to smaller indie companies in the process; then they collapsed of their own weight and left the field of dramatic filmmaking devastated. "And for all those people who spent years trying to get movies made at all the companies that are now gone, there's now one place to work where you can get respectfully treated and fairly judged," says Rudin. "It's HBO."
So cable has become the custodian of the "good" niche; entities like HBO, Showtime, and AMC have found a business model with which they can satisfy a deep public appetite for long-form drama. Their original series don't need to attract huge audiences, and as a result, any number of ambitious writers, directors, and producers who might long ago have pitched their best stuff to studios now turn to the small screen, because one thing nobody in cable television will ever say to them is "We don't tell stories anymore."
"The sad thing," says HBO programming chief Michael Lombardo, "is that a world has closed to a group of serious storytellers—and there are some stories that should be told in a two-hour format. Our success is a sort of silver lining in a story that's economically driven by what studios are doing to try to survive in a complicated international market. But the losers ultimately are people who are looking to appreciate serious work in film."
So who's the bad guy here? It must be said that studio executives and marketers make such tempting bad guys that it becomes too easy to assume that the problem could be fixed by nothing more than a changing of the guard. In public esteem, they stand somewhere alongside congressmen, bankers, and health insurers. They're philistines, foes of art, craven bottom-liners, vulgarians. It's a nice theory, but it ignores the fact that each studio has its own culture. And after all, somebody green-lit The Social Network and True Grit and The Town—a modestly budgeted movie that surprised even its own director-star by opening in a robust first place and then racking up strong grosses week after week. "The business has been bifurcated into big tentpole movies and dramas that have become more and more marginalized," says Ben Affleck. "I understand that the kind of movie I made is hard to sell, so even though it was probably the least expensive movie Warner Bros. will make all year, it still represented a risk. And it'd be nice to imagine that it's a viable business to make twelve or fifteen of those movies a year at a studio, but it's just not."
The economic pressures the studios are facing aren't just an excuse—they're real. Movie-ticket sales may be reasonably strong, but any number of economic forces are conspiring against the production of adult dramas. They don't generally have the kind of repeat-viewing appeal that would make them DVD smashes. They often end up with an R rating, which puts a ceiling on their earning capacity and makes a modest budget absolutely essential. Oscar nominations or even wins can no longer be relied upon to goose a quality film's revenues. (Last spring's top honoree, The Hurt Locker, ended up as the lowest-grossing Best Picture winner since the 1950s.) And overseas markets are becoming less predictable and more insular—Schamus points out that Japan and Italy have taken a pronounced turn away from Hollywood films and toward homegrown fare, a trend that's likely to spread around the globe. (And adult dramas play particularly poorly abroad.)
"Listen, the obligation of anyone in those studio jobs is to help their company make a profit," says Scott Stuber, who served as Universal's president of production before leaving in 2005 to become a producer. "When things are going well, sometimes you're willing to reach a little bit more; you'll say, once in a while, 'We're just going to do this movie because we believe in it.' But when they're not going so well... it gets difficult. There's just not as much money out there as there used to be, and we're all inundated with so much noise now that it's hard to cut through every weekend for consumers' attention."
Which brings us to the embarrassing part. Blaming the studios for everything lets another culprit off too easily: us. We can complain until we're hoarse that Hollywood abandoned us by ceasing to make the kinds of movies we want to see, but it's just as true that we abandoned Hollywood. Studios make movies for people who go to the movies, and the fact is, we don't go anymore—and by we, I mean the complaining class, of which, if you've read this far, you are absolutely a member. We stay home, and we do it for countless reasons: A trip to the multiplex means paying for parking, a babysitter, and overpriced unhealthy food in order to be trapped in a room with people who refuse to pay for a babysitter, as well as psychos, talkers, line repeaters, texters, cell-phone users, and bedbugs. We can see the movie later, and "later" is pretty soon—on a customized home-theater system or, forget that, just a nice big wide-screen TV, via Netflix, or Amazon streaming, or on-demand, or iPad. The urgency of seeing movies the way they're presumably intended to be seen has given way to the primacy of privacy and the security of knowing that there's really almost no risk of missing a movie you want to see and never having another opportunity to see it. Put simply, we'd rather stay home, and movies are made for people who'd rather go out.
"Remember when a video didn't come out until ten months after the movie opened, so you really had to go see it?" says Graham King. "Gone now. It's a vicious circle, because audiences are saying—or we're guessing they're saying—that they want these movies, but it's so easy to say, 'I'm going to wait,' and it's not a cheap night out to go to the movies anymore. So it's not surprising that the studios aren't willing to risk much money on the chance that this time, the audience is going to say, 'Let's actually go see this on the big screen.' "
Still, sometimes we do actually show up. Moviegoing is, after all, a lifelong habit, and we don't need all that much encouragement to keep trying. During one remarkable stretch last fall, the box office was dominated, on successive weekends, by The Town, Wall Street: Money Never Sleeps, and The Social Network, and as the studios suddenly seemed to reassert that they didn't intend to give up on us completely, we fulfilled our half of the bargain by buying tickets. Now it's your turn again, Hollywood. Because somewhere out there, somebody has a pitch as good as Inception. There will no doubt be a dozen reasons not to green-light it. But say yes and we just might give you another $800 million out of gratitude.
Make no mistake: Hollywood wants that $800 million. And in fact, they may have figured out the perfect way to extract it from our wallets. It took twenty-four years to get here, but it's finally happening: Top Gun 2.


Saturday, March 19, 2011

NETFLIX VENTURES INTO ORIGINAL PROGRAMMING


Netflix locks up rights to its first TV series

By Associated Press

LOS GATOS, Calif. — Netflix’s Internet video streaming service will get first-run episodes of the upcoming TV series “House of Cards,” starring Academy Award-winning actor Kevin Spacey.
The deal announced Friday illustrates Netflix Inc.’s growing power in Hollywood as it mines the revenue from its 20 million subscribers to create new home entertainment options.
Netflix didn’t disclose how much it is paying Media Rights Capital, the studio behind “House of Cards.”
Buying the rights to 26 episodes of “House of Cards” before the series is in production represents a dramatic departure for Netflix. Its online video streaming library boasts more than 20,000 titles, but most of them are older movies and previously shown TV series.
“House of Cards” is expected to debut on Netflix late next year.

Tuesday, March 15, 2011

Pro-banker pols get two thumbs down:



Another Inside Job

By PAUL KRUGMAN
March 13, 2011

Count me among those who were glad to see the documentary “Inside Job” win an Oscar. The film reminded us that the financial crisis of 2008, whose aftereffects are still blighting the lives of millions of Americans, didn’t just happen — it was made possible by bad behavior on the part of bankers, regulators and, yes, economists.
What the film didn’t point out, however, is that the crisis has spawned a whole new set of abuses, many of them illegal as well as immoral. And leading political figures are, at long last, showing some outrage. Unfortunately, this outrage is directed, not at banking abuses, but at those trying to hold banks accountable for these abuses.
The immediate flashpoint is a proposed settlement between state attorneys general and the mortgage servicing industry. That settlement is a “shakedown,” says Senator Richard Shelby of Alabama. The money banks would be required to allot to mortgage modification would be “extorted,” declares The Wall Street Journal. And the bankers themselves warn that any action against them would place economic recovery at risk.
All of which goes to confirm that the rich are different from you and me: when they break the law, it’s the prosecutors who find themselves on trial.
To get an idea of what we’re talking about here, look at the complaint filed by Nevada’s attorney general against Bank of America. The complaint charges the bank with luring families into its loan-modification program — supposedly to help them keep their homes — under false pretenses; with giving false information about the program’s requirements (for example, telling them that they had to default on their mortgages before receiving a modification); with stringing families along with promises of action, then “sending foreclosure notices, scheduling auction dates, and even selling consumers’ homes while they waited for decisions”; and, in general, with exploiting the program to enrich itself at those families’ expense.
The end result, the complaint charges, was that “many Nevada consumers continued to make mortgage payments they could not afford, running through their savings, their retirement funds, or their children’s education funds. Additionally, due to Bank of America’s misleading assurances, consumers deferred short-sales and passed on other attempts to mitigate their losses. And they waited anxiously, month after month, calling Bank of America and submitting their paperwork again and again, not knowing whether or when they would lose their homes.”
Still, things like this only happen to losers who can’t keep up their mortgage payments, right? Wrong. Recently Dana Milbank, the Washington Post columnist, wrote about his own experience: a routine mortgage refinance with Citibank somehow turned into a nightmare of misquoted rates, improper interest charges, and frozen bank accounts. And all the evidence suggests that Mr. Milbank’s experience wasn’t unusual.
Notice, by the way, that we’re not talking about the business practices of fly-by-night operators; we’re talking about two of our three largest financial companies, with roughly $2 trillion each in assets. Yet politicians would have you believe that any attempt to get these abusive banking giants to make modest restitution is a “shakedown.” The only real question is whether the proposed settlement lets them off far too lightly.
What about the argument that placing any demand on the banks would endanger the recovery? There’s a lot to be said about that argument, none of it good. But let me emphasize two points.
First, the proposed settlement only calls for loan modifications that would produce a greater “net present value” than foreclosure — that is, for offering deals that are in the interest of both homeowners and investors. The outrageous truth is that in many cases banks are blocking such mutually beneficial deals, so that they can continue to extract fees. How could ending this highway robbery be bad for the economy?
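[A quick aside on “net present value,” since the term is doing real work here. What follows is a minimal sketch in my own notation, not Krugman’s or the settlement’s: the loan servicer compares what the modified loan is worth today against what foreclosure is expected to net,

\[
\mathrm{NPV}_{\text{mod}} = \sum_{t=1}^{T} \frac{p_t}{(1+r)^t}
\qquad \text{vs.} \qquad
\mathrm{NPV}_{\text{fc}} = \frac{V_{\text{sale}} - C_{\text{fc}}}{(1+r)^{t_{\text{fc}}}}
\]

where \(p_t\) is the expected payment in month \(t\) under the modified terms, \(r\) is a discount rate, \(V_{\text{sale}}\) is the expected foreclosure sale price, \(C_{\text{fc}}\) is the cost of foreclosing, and \(t_{\text{fc}}\) is the time until the sale. On this reading, the settlement asks for a modification only when \(\mathrm{NPV}_{\text{mod}} > \mathrm{NPV}_{\text{fc}}\), which is exactly the case in which investors as well as homeowners come out ahead.]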
Second, the biggest obstacle to recovery isn’t the financial condition of major banks, which were bailed out once and are now profiting from the widespread perception that they’ll be bailed out again if anything goes wrong. It is, instead, the overhang of household debt combined with paralysis in the housing market. Getting banks to clear up mortgage debts — instead of stringing families along to extract a few more dollars — would help, not hurt, the economy.
In the days and weeks ahead, we’ll see pro-banker politicians denounce the proposed settlement, asserting that it’s all about defending the rule of law. But what they’re actually defending is the exact opposite — a system in which only the little people have to obey the law, while the rich, and bankers especially, can cheat and defraud without consequences.


Wednesday, March 9, 2011

BATMAN MEETS THE SOCIAL NETWORK:


Warner bringing films to Facebook

Associated Press

NEW YORK — Movie rentals are coming to Facebook. Warner Bros. says it is testing a service that will offer select movies for purchase or rental on the largest online social network.
Facebook users who visit the official page of “The Dark Knight” will be able to click a “watch” icon and pay 30 Facebook credits, or $3, to watch the movie, streamed through a Facebook application.
Renters will have access to a movie for 48 hours. Warner Bros. will add other movies in coming months.
The page had nearly 4 million “likes” as of yesterday afternoon.
Warner Bros. did not say how much it will cost to buy movies.
“Making our films available through Facebook is a natural extension of our digital distribution efforts,” said Thomas Gewecke, president of Warner Bros. Digital Distribution.
The offering is open only to people in the United States.
“While officially dubbed a test, we expect to see more studios get behind the effort given the large platform and higher price point,” Jefferies analyst Youssef H. Squali wrote to investors. He sees the Facebook-Warner Bros. deal as “yet another caution sign” that Netflix Inc.’s stock value is too high. Netflix shares lost $11.95, or 5.8 percent, to close at $195.45.
“The competitive playing field is getting crowded,” the analyst said. “We expect this competition to curtail Netflix’s subscriber growth and drive higher content costs, impacting revenue growth and [profit] margins over time.”
Netflix, which rents movies online and on DVDs, had no comment.

Tuesday, March 8, 2011

BROADWAY LOVES BOSTON TOO: David Lindsay-Abaire's new play about Southie earns critical raves:


THEATER REVIEW | 'GOOD PEOPLE'
Been Back to the Old Neighborhood?
From left, Becky Ann Baker as Jean, Frances McDormand as Margie and Estelle Parsons as Dottie in the Broadway play “Good People,” set mostly in the Southie area of Boston.

March 3, 2011
Don’t make the mistake of thinking you understand Margaret Walsh from the get-go, because she’s not an easy gal to get a fix on. Not at first, anyway.

Embodied with an ideal balance of expertise and empathy by Frances McDormand, Margie (as her friends call her, using a hard “g”) is the not-quite heroine of David Lindsay-Abaire’s “Good People,” the very fine new play that opened Thursday night at the Samuel J. Friedman Theater. And discovering how Margie operates — and where she’s coming from — is one of the more subtly surprising treats of this theater season.

Where Margie comes from is, on one level, a no-brainer. She’s from South Boston, or Southie, and her most basic notions of herself are tied up in her identification with that neighborhood. She was born there, and the odds are she’ll die there, never having escaped its particular culture of poverty and loyalty. Why this has to be is the gentle mystery that propels “Good People,” a Manhattan Theater Club production that also stars Tate Donovan and is directed with a skillfully slow hand by Daniel Sullivan. Other people escape from Southie, including a guy Margie dated in high school. Why can’t she?
For whatever reasons, the seedier milieus of Boston have of late become favorite haunts for popular crime novelists and filmmakers like Dennis Lehane (“Mystic River”) and Ben Affleck (“The Town”), a fashionable place to uncover dark secrets on mean streets. Steeped in mouthy Southie lingo and hard-knocks sociology, “Good People” has a few secrets of its own, but their revelations are unlikely to startle you. In this play character really is fate. And when you look back, everyone has behaved exactly in character, without any plot-bending, credibility-stretching manipulations on the part of its author.
Such integrity is less common than you might think in theater. How often do you leave a play thinking, “There’s no way she would have done that,” or “He wouldn’t talk like that”? But as in his previous Broadway drama, the Pulitzer Prize-winning “Rabbit Hole” (2006), Mr. Lindsay-Abaire (who grew up in South Boston) is scrupulous here about presenting people who are consistent even in their inconsistencies.
You get the feeling that this writer — who made his name with fantasy-tinged, diabolically whimsical works like “Fuddy Meers” and “Kimberly Akimbo” — feels a special obligation not to cheat or take shortcuts when working in naturalism. This means not only that “Good People” refuses to shy from the clichés that its characters would normally use, but also that there is little that’s dramatically flashy or high impact.
True, Margie has been given a couple of winningly foul-mouthed, bingo-playing eccentric cronies (deliciously portrayed by Estelle Parsons and Becky Ann Baker). But the production — designed with unobtrusively heightened, class-defining detail by John Lee Beatty (sets) and David Zinn (costumes) — is achingly aware that what doesn’t happen in people’s lives counts for as much as, if not more than, what does.
There’s not a moment when Ms. McDormand is onstage that you don’t feel Margie is assessing the absences in her existence as a single, working mother of a grown-up daughter with a child’s mind. This is particularly true once she makes contact with Mike Dillon (Mr. Donovan, excellent), a guy she dated toward the end of high school (but hasn’t seen in 30 years), who is now a successful fertility doctor.
Margie’s reason for looking up Mike is an all-too-familiar one in the current economy. She needs a job. In the play’s first scene — which lays the groundwork for everything that follows in ways you don’t fully appreciate at the time — Margie is fired by her boss, Stevie (Patrick Carroll), for excessive tardiness.
Margie knew Stevie when he was a boy, she knew his mother; she knows whom he’s dating now and what people say about him behind his back. She is not above using this knowledge, easily accumulated and stored up in an insular neighborhood like Southie, for emotional blackmail. The thing about Margie, though, is that while she starts off tough, she’s not so good on follow-through. And there comes a very specific moment — it happens when Ms. McDormand bends her knees just a bit, as if the breath has gone out of her — that you realize that not only is Margie conceding defeat here but also that she always knew she was going to.
This central paradox in Margie’s character — what might be described as a feisty defeatism — is beautifully conveyed by Ms. McDormand, who won an Oscar for playing a far more assured figure, a tenacious Minnesota police officer in the 1996 film “Fargo.” In dealing with others, Margie is combative, sly, nasty and tricky. But in the very same breath she is doubtful, reluctant and self-sabotagingly kind.
This combination of elements leads Mike to describe Margie as a master of passive aggression when she shows up, uninvited, at his blandly expensive-looking office.
Certainly Margie knows what buttons to push (including by using the term “lace-curtain Irish”) to trigger waves of guilt in her ex-boyfriend, who has lost nearly all contact with the old neighborhood. She wrangles an invitation to a party at Mike’s home that weekend, given by his wife, Kate (Renée Elise Goldsberry in a spot-on performance as a bourgeois princess).
Mike’s home is the setting for most of the second act, and I don’t want to tell you too much about what happens there. If you’re a New York theater addict, you probably know that Ms. Goldsberry (who appeared in the musical “The Color Purple”) is African-American. But just because “Good People” is about South Boston, a site notorious for racial strife, doesn’t mean that Mr. Lindsay-Abaire is going to set a match to working-class prejudices. He’s more clever than that. He plays the race card here only to suggest that for his purposes it’s irrelevant.
The social dichotomy being explored isn’t a matter of black and white. It’s who does and doesn’t escape from where he or she comes from. Perversely, it seems, it’s the person with the thickest skin who has the best chances of rising above. Mr. Donovan makes Mike an artful study in willed amnesia, and the pain that surprises him when Margie summons the ghosts of their shared past is all the more palpable by not being directly expressed.
In the arguments that erupt throughout “Good People” — and they range from comradely sniping to the tossing of whatnots (a googly-eyed rabbit toy, as it happens) — characters often accuse one another of being too mean or too nice, too soft or too hard. They’re right on all counts.
Whether it’s Ms. Parsons’s amiably avaricious landlady or Ms. Goldsberry’s reflexively compassionate suburbanite, there’s nothing pure about the goodness or badness of the folks who inhabit this play. This makes them among the most fully human residents of Broadway these days.
GOOD PEOPLE
By David Lindsay-Abaire; directed by Daniel Sullivan; sets by John Lee Beatty; costumes by David Zinn; lighting by Pat Collins; sound by Jill B C DuBoff; dialect coach, Charlotte Fleck; production stage manager, Roy Harris; artistic producer, Mandy Greenfield; general manager, Florie Seery. Presented by the Manhattan Theater Club, Lynne Meadow, artistic director; Barry Grove, executive producer. At the Samuel J. Friedman Theater, 261 West 47th Street, Manhattan; (212) 239-6200; telecharge.com. Through May 8. Running time: 1 hour 55 minutes.
WITH: Becky Ann Baker (Jean), Patrick Carroll (Stevie), Tate Donovan (Mike), Renée Elise Goldsberry (Kate), Frances McDormand (Margaret) and Estelle Parsons (Dottie).

Southie native David Lindsay-Abaire at the Samuel J. Friedman Theatre in New York, where his play “Good People” opened last week.
(Photo by Peter Foley for The Boston Globe)


Monday, March 7, 2011

While we're on the subject of the declining "middle class," a word about college education:



Degrees and Dollars
By PAUL KRUGMAN
March 6, 2011

It is a truth universally acknowledged that education is the key to economic success. Everyone knows that the jobs of the future will require ever higher levels of skill. That’s why, in an appearance Friday with former Florida Gov. Jeb Bush, President Obama declared that “If we want more good news on the jobs front then we’ve got to make more investments in education.”

But what everyone knows is wrong.
The day after the Obama-Bush event, The Times published an article about the growing use of software to perform legal research. Computers, it turns out, can quickly analyze millions of documents, cheaply performing a task that used to require armies of lawyers and paralegals. In this case, then, technological progress is actually reducing the demand for highly educated workers.
And legal research isn’t an isolated example. As the article points out, software has also been replacing engineers in such tasks as chip design. More broadly, the idea that modern technology eliminates only menial jobs, that well-educated workers are clear winners, may dominate popular discussion, but it’s actually decades out of date.
The fact is that since 1990 or so the U.S. job market has been characterized not by a general rise in the demand for skill, but by “hollowing out”: both high-wage and low-wage employment have grown rapidly, but medium-wage jobs — the kinds of jobs we count on to support a strong middle class — have lagged behind. And the hole in the middle has been getting wider: many of the high-wage occupations that grew rapidly in the 1990s have seen much slower growth recently, even as growth in low-wage employment has accelerated.
Why is this happening? The belief that education is becoming ever more important rests on the plausible-sounding notion that advances in technology increase job opportunities for those who work with information — loosely speaking, that computers help those who work with their minds, while hurting those who work with their hands.
Some years ago, however, the economists David Autor, Frank Levy and Richard Murnane argued that this was the wrong way to think about it. Computers, they pointed out, excel at routine tasks, “cognitive and manual tasks that can be accomplished by following explicit rules.” Therefore, any routine task — a category that includes many white-collar, nonmanual jobs — is in the firing line. Conversely, jobs that can’t be carried out by following explicit rules — a category that includes many kinds of manual labor, from truck drivers to janitors — will tend to grow even in the face of technological progress.
And here’s the thing: Most of the manual labor still being done in our economy seems to be of the kind that’s hard to automate. Notably, with production workers in manufacturing down to about 6 percent of U.S. employment, there aren’t many assembly-line jobs left to lose. Meanwhile, quite a lot of white-collar work currently carried out by well-educated, relatively well-paid workers may soon be computerized. Roombas are cute, but robot janitors are a long way off; computerized legal research and computer-aided medical diagnosis are already here.
And then there’s globalization. Once, only manufacturing workers needed to worry about competition from overseas, but the combination of computers and telecommunications has made it possible to provide many services at long range. And research by my Princeton colleagues Alan Blinder and Alan Krueger suggests that high-wage jobs performed by highly educated workers are, if anything, more “offshorable” than jobs done by low-paid, less-educated workers. If they’re right, growing international trade in services will further hollow out the U.S. job market.
So what does all this say about policy?
Yes, we need to fix American education. In particular, the inequalities Americans face at the starting line — bright children from poor families are less likely to finish college than much less able children of the affluent — aren’t just an outrage; they represent a huge waste of the nation’s human potential.
But there are things education can’t do. In particular, the notion that putting more kids through college can restore the middle-class society we used to have is wishful thinking. It’s no longer true that having a college degree guarantees that you’ll get a good job, and it’s becoming less true with each passing decade.
So if we want a society of broadly shared prosperity, education isn’t the answer — we’ll have to go about building that society directly. We need to restore the bargaining power that labor has lost over the last 30 years, so that ordinary workers as well as superstars have the power to bargain for good wages. We need to guarantee the essentials, above all health care, to every citizen.
What we can’t do is get where we need to go just by giving workers college degrees, which may be no more than tickets to jobs that don’t exist or don’t pay middle-class wages.