Aught Lang Syne: Trends of the Decade

In our latest installment of Aught Lang Syne, NPI is going to look at the best and worst trends in our culture from this decade. John S will tackle movies and TV, while Tim will explain sports and, of course, fashion.


The Best Trend in Television: The Demise of Laugh Tracks

The slow, steady, not-yet-completed demise of the laugh track is probably the best overall trend of the decade in television. In the 1990s and the early part of the Aughts, nearly every big, successful sitcom was accompanied by a laugh track: Seinfeld, Cheers, Frasier, Friends, Will & Grace, Becker, The King of Queens, Everybody Loves Raymond, etc. Currently, though, most new sitcoms air without laugh tracks. Even among existing shows, the comedies that are critically praised tend to go without canned laughter. The entirety of NBC’s Thursday night lineup—the traditional home of the most popular sitcoms—is laugh track-free. ABC comedies Modern Family, Scrubs, and Cougar Town are also not accompanied by disembodied laughter. None of HBO’s comedies—Curb Your Enthusiasm, Flight of the Conchords, Extras—have laugh tracks, either.

There are, of course, some holdouts: FOX’s only two non-animated comedies—Brothers and ‘Til Death—do have laugh tracks, but neither has much of an audience. CBS, though, has five sitcoms with laugh tracks (Gary Unmarried, Two and a Half Men, The Big Bang Theory, Accidentally on Purpose, and How I Met Your Mother), and all have some degree of success. In general, though, the trend is certainly waning.

The arguments against the laugh track are obvious. If they aren’t to you, then go read Chuck Klosterman’s essay “‘Ha Ha,’ He Said, ‘Ha Ha.’,” from his latest book, in which he calls the laugh track what it is: The sound of dead people laughing. More important, though, is that the trend away from laugh tracks is part of the trend away from traditional, multi-camera sitcoms, and toward more original, innovative ways of filming comedy. This isn’t to say that all traditional sitcoms are bad—How I Met Your Mother, for example, is a good show—but that the trend away from laugh tracks represents the breaking of the sitcom mold. The new, original sitcoms that we’ve seen this decade—Curb, Arrested Development, The Office—simply would not work with canned laughter.

The Worst Trend in Television: The Demise of Opening Credits

Where have all the credit sequences gone? I’m not entirely sure who to blame for this one, but it seems that every new drama that airs on TV forgoes a traditional credit sequence for a simple title card. I believe it was 24 that dumped them first, but Lost, Heroes, FlashForward, Gossip Girl, The Vampire Diaries, and The Mentalist are all current network dramas that have ditched any extended opening sequence.

Now, I get why they do it; it saves time. It made sense for 24 to ditch credits (although they did have a brief “Jack Bauer Voiceover Introduction” during the first season), since each hour of the show covered an hour of “real time.” Each episode had to squeeze as much action into the allotted time as possible.

What’s so dispiriting, though, is how quickly this trend caught on. Every show realized it could gain slightly more time by eliminating credits. It almost became a way for shows to signal that they had a big story to tell*—it seems that the only shows on TV with credits now are sitcoms and procedurals.

*Correlated with this trend is the trend of doing “previously on” before every episode; this used to be reserved for special two-part episodes, before serialization became the norm. This trend is neither good nor bad.

So what’s so bad about this trend? Well, for one, it now means that the credits extend into the actual show, often pretty much until the first commercial break. Even more upsetting, though, is that opening credits are now a lost art. It was ten years ago, now, that The Sopranos premiered, with an utterly transfixing opening. This credit sequence is over 90 seconds long, but I almost never skip through it on the DVD or DVR. Even intros that aren’t as blindingly cool as The Sopranos’ can be integral to setting the tone of the show. Dexter’s theme, for example, is a playful blend of the mundane and the macabre, just like the show itself.

This trend is also related to network television’s trend of scheduling shows to end at 10:07 or 9:03, instead of the hour; in the current television climate, networks need to squeeze in as much advertising time as possible. Only shows on basic or premium cable can really afford credits now. Luckily, though, the best trend of the next decade is probably going to be the end of networks altogether.


The Best Trend in Movies: The Superfluity of Theaters

We’ll cover this more in-depth later in the month, but it really hasn’t been such a great decade for movies. More important than any creative or artistic trend, then, has been the trend in how people watch movies in the Aughts, namely that movie theaters are now entirely superfluous.

Now, I don’t want to dismiss the sanctity of the theater experience, or the importance of your local movie theater to your childhood: Some movies ought to be watched in big groups, with big screens. And having a personal attachment to a local theater is fine.

But come on. First of all, it seems like 95% of all theaters now are owned by Loews—not exactly a local flavor. More importantly, though, no movie theater, even a huge multiplex, can show every movie. In the past, then, movie distributors essentially controlled which movies people saw, particularly in the more remote parts of the country, which may have only one or two local screens.

By the end of the Aughts, though, pretty much anyone can watch pretty much any movie, more or less as soon as it comes out. The proliferation of Netflix and other online rental companies, as well as the ease with which films can be downloaded, both legally and illegally (not that NPI condones piracy), means that viewers are no longer beholden to the movie theaters’ decisions on which films to show.

There is, also, the convenience factor. Home watching has a myriad of benefits in this respect: You can watch any time, you can pause at any point, you can rewind if you fall asleep, you don’t have to wear pants, you can watch in bed, etc. DVDs, DVR, and OnDemand have also benefited TV, but they haven’t fundamentally altered it the way they have movies: People always watched TV at home.

Of course, movie theaters are still important—few people want to watch Live Free or Die Hard on a 13’’ computer screen. And in the coming decade, standard movie theaters may be as common as drive-ins are now (Raise your hand if you’re reading this and you’ve ever been to a drive-in. Or even seen one. That’s what I thought.). But right now, at the end of the decade, we have the best of both worlds. If I want to watch Home Alone 2 right now, I’m probably about six clicks away. But if I want to watch Sherlock Holmes on a big screen (you know, so I can capture the full impact of the action and explosions, the way (Sir) Arthur Conan Doyle intended), then I can go to my local Loews.

The Worst Trend in Movies: The End of Original Plots

When I was a kid, I used to wonder what would happen when movies ran out of plots. I figured that there were only a finite number of stories to tell, and eventually people would just have to stop making movies.

Of course, I was a kid, so I didn’t think about sequels, remakes, and parodies. Sequels, of course, aren’t new, and arguably the greatest movie of all time was a sequel. What’s really been different during the Aughts has been the proliferation of franchises. It seems like every blockbuster is made with the potential of a sequel in mind. Superhero franchises abounded during the decade, but there were also sequels to Transformers, Charlie’s Angels, Shrek (two), Saw (five), Final Destination (three), Bad Boys, The Matrix (two), Pirates of the Caribbean (two), Pitch Black, Austin Powers, Mission: Impossible, etc. And that’s not even counting the Harry Potter, Lord of the Rings, and Twilight franchises—which get a pass, since they were based on books—even though each was really one story spread over several films.

It’s not that sequels are inherently bad (I, for one, cannot wait until Paul Blart 2), but the fact that films are now planned with sequels in mind affects their plots: Oh wait, we can’t wrap up all those loose ends; we’ve gotta save some stuff in case this movie does well at the box office. And the fact that franchises are now how studios make most of their money means that studios look for things they can franchise when they release blockbusters. A popular book series? A new toy craze? A Disney ride? You can bet you’ll see some movies based on it.

Sequels themselves wouldn’t be that bad, but they have been accompanied by remakes of old TV shows and old movies—some of which weren’t particularly good or memorable the first time around. When Hollywood does decide to remake something memorable, it generally finds a way to alienate the original’s most loyal fans or at least leave people wondering what the point was. Like sequels, remakes are not inherently bad, but more often than not, they are a symptom of a lack of creativity, and not a new look at an old idea.

Hollywood even seems to be making jokes about its own lack of ideas: Scary Movie, Not Another Teen Movie, Date Movie, Epic Movie, Disaster Movie, Superhero Movie, and Meet the Spartans were all released this decade as “spoofs” of tired genres, only to become a tired genre themselves. The first two probably qualify as legitimate, if not exactly groundbreaking, satires, but the last five are more like a loose string of pop culture gags and catchphrases. These movies, from the depraved minds of Jason Friedberg and Aaron Seltzer, are to humor what Hitler was to the Jews. Josh Levin of Slate probably put it best in his review of Meet the Spartans:

This was the worst movie I’ve ever seen, so bad that I hesitate to label it a “movie” and thus reflect shame upon the entire medium of film. Friedberg and Seltzer do not practice the same craft as P.T. Anderson, David Cronenberg, Michael Bay, Kevin Costner, the Zucker Brothers, the Wayans Brothers, Uwe Boll, any dad who takes shaky home movies on a camping trip, or a bear who turns on a video camera by accident while trying to eat it. They are not filmmakers. They are evildoers, charlatans, symbols of Western civilization’s decline under the weight of too many pop culture references.

Somehow, these movies, along with their affiliated sequels of course, have combined for revenues of $1.24 billion (that’s right, billion) during the Aughts. For movies in the Aughts, a lack of ideas often translates to success.


Best Trend in Sports: The Widespread Acceptance of Instant Replay

I suppose it’s the skeptic in me that made picking out a “best trend in sports” a lot harder than picking out a worst; we tend to overlook the things that slowly improve the way we experience life, and in this case, sports.

I really wanted to go with the introduction of the first-down line, but that happened in 1998.* My second choice—the adoption of instant replay in the NFL—occurred before the 1999 season. So we’ve got to go with a bit of a cop-out here and focus on the acceptance of instant replay in the sporting world.

*Can you imagine watching a football game these days without the first-down line AND the Fox Box? The field would look so bare!

Remember when replay was adopted in the NFL? Remember how it was against popular opinion? Remember how when officials ruled that Vinny Testaverde had gotten the ball across the goal line when it was just his helmet, most people were like, “Well, sucks to be a Seahawks fan”?* Remember how most people harrumphed at the idea of reinstituting the late-80s version of replay, how they said that it would just slow down the game and that human error was an important part of sports and that the world was unfair so you should just deal with it? How ridiculous does that all sound now?

*A truth, of course, especially in 1999, but an irrelevant one.

A decade later, the tenor of the debate has shifted dramatically. Now, we argue about how Major League Baseball and FIFA, in their respective neglect of instant replay, have done their fans a disservice. Far from crying about time delays, we’re debating how to eliminate umpires altogether.

This is how well the Instant Replay Experiment has worked in the NFL. And not to sound too much like a technophile, but this is a very good thing. The people who believe that human error is an important thing in sports should familiarize themselves with the term “necessary evil.” Human error is never a good thing.

We can’t really condemn Tiger Woods, honey. I mean, every marriage needs a little human error.

Malpractice suit? Come on, Bill, it was just a little human error—part of medicine, that’s all.

You know, I felt that way about the recession, too, Abe, but then I realized that Wall Street just had some human error. It happens.

Instant replay doesn’t eliminate human error from sports, but it limits it. And that might be the best thing that’s happened in sports this last decade.

Worst Trend in Sports: Monochrome Uniforms

Now, this is the section where I’m encouraged, if not obliged, to get a little preachy. And I can go so many different ways here: I can attack steroid usage, but that was really a product of earlier times. I can talk about the officiating scandal in the NBA, but that’s not really a “trend,” is it? I can go on about how football players are dying from football-related trauma, but that, too, has been happening for decades.

So I’m gonna go with an issue that really strikes at the heart of how I enjoy my American football. I’m gonna go with monochrome uniforms.

I don’t think I should have to defend why monochrome uniforms are a travesty to the aesthetically inclined. I acknowledge that my attention to uniforms is unusual—especially for a male who also cares deeply about the intricacies of sports—but seriously, how can it not infuriate you to see professional football teams don monochrome monstrosities week in and week out? Football is the one sport—again, we’re not counting hockey or soccer here*—where teams can easily integrate multiple colors into a coherent uniform. Perhaps it’s just tradition, but sporting a jersey and pants of complementary but not matching colors works a hell of a lot better on the football field than it does on the baseball diamond or, God help us, the basketball court.

*Less for athletic purposes than for uniform ones. Hockey uniforms are so strangely configured—shorts, really?—as to be irrelevant in uniform discussion. Soccer uniforms place more focus on ads than on team, which, in my book, is enough for soccer to be expelled from “sporthood.”

Most people date the birth of monochrome uniforms to 2002, when the Seattle Seahawks, upon moving from the AFC to the NFC, ditched their traditional royal blue and kelly green color scheme for one of navy and “storm” blue, completing the transformation with blue helmets, blue jerseys, blue pants, and blue socks. The Seahawks were indeed the first team in the modern NFL to make a monochrome look the norm; they’ve probably worn their blue jerseys with white pants at some point, but I can’t find any evidence of it on Google.

But I’m going to go further back and, somewhat unfairly, blame the Denver Broncos. In 1997, the Broncos introduced their now famous “Modern” uniforms:* a new, more aggressive logo with a navy jersey, white pants, and a thick orange stripe—more like a very slight parabola—along the side. The Broncos themselves did not take the monochrome route until fairly recently, and they still only do it occasionally. But Denver’s template was imitated ad nauseam throughout college football, often by teams with lighter accent colors, such as yellow or silver, that didn’t stand out on white pants, thus leading to a monochrome look at schools such as California, West Virginia, Boise State,** and Washington State. The Denver Broncos thus unwittingly introduced the concept of the monochrome, and the Seattle Seahawks cemented it in professional football.

*This is how they are identified on Madden video games.

**Boise State does not have the “lighter accent color” problem the other schools on this list do. I postulate that the Broncos went monochrome to completely blend into their smurf turf. I’m not kidding.

In the last seven years, over half of the NFL’s teams have sported a monochrome look at least once, and several have done it frequently. The Buffalo Bills also went monochrome in 2002 and have sported the NFL’s worst uniforms ever since. Both the Jacksonville Jaguars and the Baltimore Ravens—younger franchises that had appeared to carve out a distinct niche with more feminine colors—went all-black, as did the New Orleans Saints.* The Arizona Cardinals, Atlanta Falcons, and Minnesota Vikings all updated their uniforms this decade to more stylish, stripe-infused looks that generally work well, except when they go monochrome. The two teams that should be paying homage to the best uniforms in NFL history—the Tennessee Titans and Houston Texans—are the only two in the league** that have gone monochrome with two different colors, which is especially upsetting since their “normal” uniforms look so much better. Of course, this is a bit of a theme here: Teams with respectable uniforms need to slum for a week or two in the Whitechapel of ugly monochromes. The Rams—whose navy and gold update at the beginning of the decade made for perhaps the best look in football—decided to screw contrast (before bizarrely deciding to wear white pants all the time for the last two years, which in my opinion is almost as bad). The Eagles and Patriots have each dabbled in monochrome, although thankfully only once apiece; the Jets have done it more than they’d like to admit.

*Note how all of these uniforms are worsened even more by the dreaded “leotard effect,” wherein a team’s socks match the color of their pants, making it difficult to tell where the pants end and the socks begin. Since the Jaguars and Ravens have black helmets, as well, their all-black uniforms look especially hideous.

**It’s possible the Dolphins have worn all-orange in addition to all-teal. I seem to remember this happening, but can’t find any proof online, which is probably a good thing.

The nadirs of the Monochrome Movement came when two of the league’s most traditional franchises—and two of its best dressed—each jumped the monochrome shark: the Redskins have done it twice in the last two years, and the Bears took the bait in 2002 and again in 2006.

Pretty much the only teams to resist going monochrome are the ones that can’t, because their pants don’t match their jerseys’ colors (the Steelers wear yellow pants, the Bucs pewter, etc.). Or maybe these teams are the last ones with a sense of propriety. And that’s why I’m prouder than ever to root for the Giants, who have never lived up to their nickname of “Big Blue” on the field.


Best Fashion Trend: Stylish Glasses*

*Throughout this section, “stylish” generally means “black, relatively thick-rimmed.”

Look, I wanted to write about the worst fashion trend, so I had to come up with a best one, too. And I know I’m not qualified to write about fashion: I like T-shirts, jeans, and track jackets. And I don’t even wear glasses.

But there was a time when people who needed glasses were resigned to the fact that they were going to look terrible. Remember She’s All That and how Rachael Leigh Cook was ugly and unwanted and all, and then Freddie Prinze, Jr. took off her glasses and we realized, Wait, Rachael Leigh Cook is actually attractive? That’s how we used to feel about glasses.

And that’s not true anymore. Now, you take a girl (or guy, but I’m gonna stick with girl for my purposes) who looks good without glasses, add stylish glasses, and she looks even better. Why? Because glasses still connote intelligence along with the added element of attainability, which means that attractive girl just transformed into an intelligent attractive girl who’s back in your league.

Who doesn’t love a librarian?

Worst Trend in Fashion: Everything That’s Happened with Hats

And I mean everything.

You have to understand where I’m coming from here: When I was a kid who went to a Catholic school where hats were strictly prohibited,* I couldn’t wait until I got older and started doing things socially—you know, having one of those social lives—so I’d have an outlet for all the hat-wearing I wanted to do.

*Even on dress down days you had to pay extra to wear a hat.**

**Are “dress down days” known nationally? Do kids who went to public school understand the concept? If not, here goes: On a roughly monthly basis, we were allowed to pay money to wear regular clothing. That money went to charity. Forgetting when it was a “dress down day” was a humiliating oversight that lingered for days—the athletic analogue would be airballing a free throw.

But by the time I reached that point in life, nobody wore hats like they used to. I’m not talking about wearing hats backwards, which is fine even if having a forehead like a billboard prevents me from indulging in the movement, or sideways, which nobody ever really thought was cool. I’m talking about two particular changes in how hats were worn. The first is that people no longer pulled hats down so that the apex of the brim was roughly parallel to the ground. Instead, people quit about halfway through putting a hat on, and just left the brim pointing upward at about a 45-degree angle. Everyone I knew at college did this when they wore a hat. But this prevents the hat from performing at least two of its three main functions, which are 1. to block the sun; 2. to block rain; 3. to look cool. In my book, it violates all three.

The second trend is that the people who did pull their brims down to roughly parallel with the ground no longer curved the brim at all. They kept it flat, as evidenced by Joba Chamberlain. This not only looks ridiculous—as evidenced by Joba Chamberlain—but it’s very uncomfortable. Flattening the brim applies extra pressure to the front of the head.

And then, this whole style was aggravated by the fact that it became trendy to leave the New Era sticker on top of the brim. Now, for the life of me, I can’t understand who’s looking for that in a hat. The only way I can conceptualize the genesis of this trend is that someone bought a new hat, forgot to take the sticker off, wore it to some social occasion, had his faux pas pointed out by friends hoping to mock him, and then shrugged it off by explaining it was all on purpose—which I admit is a far smarter way to handle the situation than I could have instinctively come up with.*

*And I know this because one day, I bought a new pair of jeans, forgot to take the size sticker off, wore them to school on a dress down day, and had my faux pas pointed out by a friend who used it to mock me for like a week.

There’s no rational reason for leaving the sticker on the hat, just like there’s no rational reason to leave on your jeans the sticker that repeats your waist size and length over and over again. It’s mindless conformity to an arbitrary standard of what is cool right now that makes no statement whatsoever about fashion, culture, or society. And it’s moronic.

All I want is to wear a hat like a normal human being. But first, I need normal human beings to start wearing a hat like I do.

The Second-Best and Second-Worst Trend in Fashion: The Rise of the Ironic/Culturally Allusive T-shirt

I love wearing T-shirts that I find funny, but there are few things that upset me more than someone else wearing a T-shirt that I know they think is funny but which actually isn’t funny.


Worst Miscellaneous Trend: The Decline of American Gum

There was a time when someone asking me if I wanted a piece of gum could make my day. Gum—yet another prohibited substance of a Catholic schoolboy’s upbringing—was a way of tasting a candy-like substance without all of the negative health consequences of actually digesting candy. And even as I matured from Fruit Stripe and Bubble Tape to Big League Chew and Bubble Yum to Bubblicious and finally Doublemint, I enjoyed gum for both its taste and its being an outlet for my nervous energy. It does, after all, taste better than a toothpick.

Over the course of the Aughts, though, the quality of gum in this country plummeted. Traditional brands like Doublemint and Juicy Fruit and Big Red were no longer good enough; gum had to do something else besides taste good and be an outlet for nervous energy. It had to give you alarmingly fresh breath or whiten your teeth or, according to most gum advertising campaigns, have some sort of sex appeal. This has led to gum like Dentyne Ice and Orbit and Trident White and Stride and Eclipse and 5; in other words, this has led to gum that appears to care much more about nifty packaging* than how it actually tastes; in fewer words, this has led to gum that sucks. Have you ever had a piece of Orbit and enjoyed the experience? For how long? Five seconds? The best thing about Orbit is that it loses its flavor almost immediately; the worst thing is that it has an aftertaste with more endurance than Messalina.

*Seriously, Cobalt? Cobalt is an element! It’s on the periodic table. How can it be a flavor of gum? And a good one at that?**

**Other flavors of 5 include Elixir, Solstice, Zing, Flare, and Rain. 5’s policy on nomenclature is apparently to use common nouns, basic interjections, or meteorological events.

Doublemint? That’s a gum you could take some pride in. You know what flavor Doublemint comes in? It doesn’t even name one. It doesn’t care if it doesn’t whiten my teeth or only freshens my breath a moderate amount or rips out my molar fillings; it only cares about tasting good for more than 15 seconds.
