Why Criticism Matters

Back in 2014 I wrote a longish blog post about race and sexual violence in the works of Alan Moore. Naturally, people hurled the old critic-silencing questions: “Do you think you can do better than Alan Moore?” and “Why don’t you spend your time making your own art instead of criticizing others?”

Well, for one thing I can’t really claim credit for the criticism of Moore’s work. All I did was aggregate and summarize the criticism I could find, which I did to help put Moore’s comments in an interview in context. But I’ve been wanting to write something about the role of criticism ever since, and I’ve just come across a column by Ta-Nehisi Coates, the famed essayist who is now writing the Black Panther comic, that sums it up perfectly:

The feminist critique is in the air now. If my rendition of Black Panther wasn’t created by that critique, it breathed the same air. I can’t really kill off or depower women characters without grappling with Gail Simone. I can’t really think about how women characters are drawn anymore without thinking about the women in Bitch Planet, and how they seem drawn beyond the male gaze.

This is why criticism is important.

It’s not just Coates who was shaped by comics criticism. Moore himself was influenced early in his career by comics criticism, specifically by criticism of Steve Ditko written by novelist Stan Nicholls. In Moore’s own words:

I remember at the time — this would’ve been when I was just starting to get involved in British comics fandom — there was a British fanzine that was published over here by a gentleman called Stan Nichols (who has since gone to write a number of fantasy books). In Stan’s fanzine, Stardock, there was an article called “Propaganda, or Why the Blue Beetle Voted for George Wallace.” [laughter] This was the late-’60s, and British comics fandom had quite a strong hippie element. Despite the fact that Steve Ditko was obviously a hero to the hippies with his psychedelic “Dr. Strange” work and for the teen angst of Spider-Man, Ditko’s politics were obviously very different from those fans. His views were apparent through his portrayals of Mr. A and the protesters or beatniks that occasionally surfaced in his other work. I think this article was the first to actually point out that, yes, Steve Ditko did have a very right-wing agenda (which of course, he’s completely entitled to), but at the time, it was quite interesting, and that probably led to me portraying [Watchmen character] Rorschach as an extremely right-wing character.

In other words, criticism of other people’s work inspired Moore’s portrayal of his most famous character in his most famous work. (I can’t find a copy of the article, but there’s a summary and critique of the critique here.)

Criticism plays other roles as well. Writing criticism is an important part of many writers’ development — Moore, Grant Morrison and countless others wrote for fanzines early in their careers as they refined their own work. Many well-established writers continue to write reviews. Learning to write, or to do any other creative work, involves looking at other people’s work and figuring out what works and what doesn’t. Critics also play a role in documenting the perceptions of the major works of their time, so that future generations can better understand the way different pieces were understood, and how the understanding of those pieces changed over time.

But mostly I think the important thing is what Coates hit on: criticism helps push the medium forward, even if the creators on the receiving end aren’t receptive (and to be honest, it’s often for the best if creators ignore what the critics say). As I wrote at the time: “I’ve learned a fair amount from reading the criticisms of his work. It’s helping me understand why a domestic violence scene in something I’m writing doesn’t work. I hope that even if Moore doesn’t care to engage in these critiques, other writers can learn from his mistakes.”

Feel free to disagree.


Adapted from my weekly e-mail newsletter.

Do work, not too much, avoid interruptions

Obsessing over productivity is a sickness of a hypercapitalist society. But in a world where you’re only as good as the amount of work you’ve done in the last 168 hours, productivity systems are survival strategies. I’ve obsessively tweaked my own routines and apps over the years to find a workflow that feels natural for me and helps me balance the things I need to do with the things I want to do—not because I’m well organized and productive by nature, but because I need to find intuitive strategies to stay gainfully employed without going nuts or letting my house become filthy to the point of being uninhabitable.

“Be regular and orderly in your life like a bourgeois, so that you may be violent and original in your work.” – Gustave Flaubert*

But what works for you might not work for me, and vice versa. Still, I’ve been trying to figure out if there are some general principles of productivity that can be distilled into a few simple rules, the way Michael Pollan condensed dietary research to “eat food, not too much, mostly plants,” or the way former Marketplace co-host Tess Vigeland condensed personal finance down to just six tips.

Here’s my attempt: do work, not too much, avoid interruptions.

Do Work:

Pollan advises us to avoid what he calls “edible food-like substances”—things like protein bars and microwave dinners and breakfast cereal that resemble food or may contain trace amounts of food but are in fact food substitutes created in laboratories and factories. We all face a large number of work-like activities that can take up our time. Meetings are one of the most complained about. But Internet “research,” social media, reading productivity tips (hey!) and alphabetizing our bookshelves can end up taking up entire days that should be spent doing the actual work we need or want to be doing. Some of this stuff is unavoidable. But it’s toxic when used to justify procrastination on actual work.

Not Too Much:

Although it’s pretty clear that we see diminishing returns on physical labor beyond about 40 hours a week, the research is much less clear about how much white collar labor or “information work” is too much. But it is clear that people tend to burn out, and 40 hours a week may actually be too much. The exact amount probably varies from person to person, so it’s up to you to figure out exactly how much work is too much. And even though work-like activities often aren’t work, they usually aren’t recreation either, so they should count towards your limit.

Avoid Interruptions:

This is probably the hardest part. We all know that multitasking is worse for productivity than smoking weed, but even if we have the discipline to shut off our phones and Internet connections, we can’t necessarily stop bosses, clients or chatty coworkers from interrupting us.

Putting It All Together:

Consider this other oft-cited piece of research: the best violin players aren’t the ones who practice the most hours, but the ones who consistently practice sufficiently challenging pieces every single day. Those players practiced for “only” four hours a day, two sessions of two hours each. In other words, they did actual work (practicing sufficiently challenging violin pieces), but not too much, and they did nothing but practice during those two sessions.

I’m terrible at following this advice, but these are the principles I keep in mind.

*Thanks to Deb and Willow’s for the quote!

Adapted from my weeklyish newsletter

Who Will Be the Next JG Ballard or William S. Burroughs?

A few years ago Re/Search founder V. Vale asked who the next William S. Burroughs or J.G. Ballard would be. “Who are the people alive on the planet who are predicting the future as well as Burroughs and Ballard?” he pondered. What follows is an expansion of my response at the time.

The Next Burroughs or Ballard Won’t Come from an Anglophone Country

The most relevant writers of the 21st century will be those with a unique perspective. Perhaps they’re emerging from nations facing great turbulence, such as Greece, Thailand, Egypt or Honduras. Or maybe they’re from one of the emerging superpowers, Brazil, China and India, who are starting to see the world and its possibilities in a new way. Or maybe they’re from some pocket of the world that we (well, I) don’t think of often, like Bhutan.

The Anglophone world has historically exported more culture than it has imported (or at least imported directly, as opposed to through the lens of cultural appropriation). That was especially true of the pre-internet, pre-social media age. Authors like Umberto Eco, Jorge Luis Borges, Italo Calvino and Haruki Murakami broke through the language barrier, but how many great authors has the world produced whose work quietly went out of print, untranslated and un-exported? The next Philip K. Dick or Ursula K. Le Guin could already be decades into their career and we wouldn’t even know!

The Next Burroughs or Ballard Won’t Necessarily Be a Novelist

Burroughs and Ballard took what had previously been seen as trash media and elevated them to new levels. Burroughs wrote pulp paperbacks. Ballard wrote for sci-fi magazines and pulp paperback publishers. COUM Transmissions, tired of the limited audience for performance art, took on the form of a rock band and subverted it as Throbbing Gristle. Later, Alan Moore, Neil Gaiman, Art Spiegelman, Los Brothers Hernandez and many others did the same with comics.

Video games are the obvious next frontier, and there are already countless experimental indie games out there, enraging reactionary gamers with bizarre new takes on what games can be — a response not unlike academia’s attempts to keep literature “pure” of genre novels.

Brazilian psychotherapist Nicolau Chaud’s disturbing games have a particularly Burroughsian or Ballardian flavor. Even the way you play a game could be a work of art, such as Vincent Ocasla’s SimCity solution.

But there’s no reason to think the next great subversive visionary thinker will be a game designer. They could work in any medium, including novels.

The Hell That I Avoided

My biological clock is ticking: I’m fast reaching the age at which I will be too old to enlist in the military. It’s a strange thing to be wistful about. One of the biggest reliefs of my life is that I didn’t have to go to Iraq or Afghanistan. But I can’t help but feel a twinge of regret that I won’t ever know the military, which was such a big part of men’s lives for so much of U.S. history, but is now vanishing into a tiny segment of society. That’s a good thing, insofar as a smaller military means fewer people have to face the horrors of war. Fewer Americans, anyway. But at the same time, I can’t help but worry about the implications of creating a distinct warrior class.

This is on my mind because last weekend my wife and I watched Full Metal Jacket, Apocalypse Now and Platoon.

As a film, Platoon is the weakest of the three, but Oliver Stone is the only one of the three directors who was actually in the war. That seems to have given him a better eye for the details that really make you feel like you’re in the middle of a godforsaken jungle, like the ants crawling all over Taylor’s neck in an early scene. It doesn’t have the psychological depth that Apocalypse Now has, but it makes up for that in visceral experience. In terms of sheer craft, Full Metal Jacket is the best of the bunch, but it lacks the intimacy of experience the other two have. That’s not surprising, given Kubrick’s cold, clinical style. But here it seems out of place. The brutal look at bootcamp, and what it reveals about the philosophy of the Marines, is the only thing it really adds to the conversation. But what an addition! I hadn’t seen the film since high school, and I recalled the bootcamp portion taking up at least 70 percent of the film. But in reality, it comprises only about 30 percent.

Taken together, though, the three films form a larger whole, a reflection of the authoritarian hell of bootcamp, the physical hell of the battlefield, and the psychological hell that lingers even after you go home. The hell that I avoided, but that all too many people live every day.

The Golden Age of Television is Already Over

Everyone says we’re living in the Golden Age of television. Maybe it started with Buffy and The Sopranos, or maybe with The Wire and Battlestar Galactica. But whenever it started, it’s been a welcome refuge from the movie industry and its never-ending parade of sequels, remakes and adaptations—especially super-hero comic adaptations—all aimed at the lowest common denominator. If you had an idea that warranted an R rating or couldn’t be shoe-horned into a “franchise,” then your best bet was TV. There you could tell stories with depth, create new characters, take risks. I feel lucky to have been alive when Breaking Bad, Dexter, Justified, Sons of Anarchy, Boardwalk Empire, True Blood, The Walking Dead, Mad Men, and Game of Thrones were all in serialization at the same time.

My friend Abe once told me his theory that this is because the TV industry was fighting to maintain relevance in the era of the internet, much as the film industry of the 1970s was struggling to maintain relevance in the era of television. In the 70s and early 80s, the film industry still had gobs of money to spend, and it was willing to spend it on the likes of Martin Scorsese, Ridley Scott, Sidney Lumet and Francis Ford Coppola, giving them the money and freedom to do things you just couldn’t do on TV. In the early 2000s to mid-2010s, TV still had gobs of money, but was losing ground to the web. So we got Breaking Bad and Boardwalk Empire and Orphan Black. That’s probably an oversimplification of what happened (I know Coppola didn’t have all that easy a time making The Godfather into the picture he wanted), and I might be misremembering what Abe said. But whatever the reasons, TV has certainly been the place to be over the past decade or so.

But now look at what’s on tap in the near future: 24, The A-Team, MacGyver, Twin Peaks, Xena, Full House and The X-Files reboots. Shows based on movies ranging from 12 Monkeys to Limitless to Taken. U.S. adaptations of British and Scandinavian shows. Countless super-hero and sci-fi adaptations and endless takes on the small town police procedural. In other words, television is starting to look a bit too much like film. Too many franchises, too many recycled ideas.

It also seems that those still making original dramas are losing sight of what really makes a good show. After Watchmen was released in the 1980s, comic book creators got the idea that “mature” comics just meant a typical superhero series, but with a hero who killed bad guys instead of just capturing them for the police. By the early 2000s, the industry had decided instead that a mature book meant one with rape scenes, rather than kill-crazed vigilantes, but the depth and moral ambiguity of Watchmen was still lost on creators. Now we’re seeing something similar with post-Game of Thrones TV dramas, where rape, torture, women in refrigerators, and the unexpected deaths of major characters are used as a stand-in for the depth and complexity of shows like Breaking Bad.

It certainly doesn’t mean that there won’t be more good shows. There are still good movies after all. And as more and more networks commit to producing high-quality dramas, we may see even more high quality shows than ever. And many of these adaptations might be good—I’ve heard almost nothing but good things about Jessica Jones and The Man in the High Castle. But you can see where the priorities lie for the studios and the networks. The good old days are over.

Digital Nomads

I can’t be the only one that’s noticed this, but it seems that in the early days of the ‘net, people were digital nomads, wandering from one social network to the next: LiveJournal, the blog-o-sphere, Friendster, MySpace, Facebook, Twitter, Instagram. You’d show up on a new social network, link up with a few friends, and enjoy the new space. Gradually people started showing up that you remembered from like two networks back. It was good to hear from them again. Then it would start getting noisy. Then your boss’s mom starts commenting on your stuff and you move on to the next one, where only a few people are and it’s easy to take in your entire feed each day and it feels cool and special but more people start filing in and the whole cycle repeats itself.

But there have been permanent settlements formed along the way. The blogosphere is still around. So is LiveJournal. Heck, so are Usenet and the WELL. And it’s a safe bet that most of the billion people on Facebook didn’t experience this migration. Facebook was their first and perhaps only social network. New digital social spaces come along (Pinterest, Tumblr, Snapchat) but instead of migrating away from Facebook, we tend to supplement it with these new locations. Some of these, like WhatsApp and Instagram, have been annexed by Facebook.

There are lots of reasons to be displeased with this situation. You’re probably familiar with most of them: Facebook’s confusing privacy settings, its real name policy, Zuckerberg’s cavalier attitude about privacy during the earlier days, NSA surveillance, censorship concerns, etc. For years, various people tried to organize mass migrations away from Facebook to alternatives like Diaspora, Google Plus and Ello, while the Indie Web community has urged people to run their own social media sites and syndicate content out to the big “silos.” But it seems few people are going anywhere. Many people quit Facebook in protest, only to return months, or even days, later, usually because they realize how much their meatspace social circle depends on Facebook for communication. I occasionally read that teens don’t use or like Facebook, but I treat these stories skeptically. Facebook, it seems, has become the first complex state of the internet. Exit has largely failed as a strategy to counteract its force. So voice, increasingly through the power of “real” states like the European Union, seems to be the new way to fight back. I’m still not convinced it’s the best way, but it does seem to be where we’re at.

Adapted from recent email conversations and originally published in my newsletter

We Fear They Might Be Right

My friend Tom likes to ask people two questions this time of year: 1) What is your favorite monster? 2) What monster do you find the scariest? The idea is that you can learn a lot about someone based on their answers. For example, if I recall correctly, Tom’s favorite and most feared monster is the werewolf. That means he’s afraid of what’s within, afraid that he himself could become a monster, could lose control.

Number one was easy for me to answer: Frankenstein’s monster is my favorite. Number two is harder. I don’t actually find fictional or mythical monsters scary. So Tom asked me what monster I found scariest when I was a kid. I remember being terrified of The Terminator.

It turns out these two examples were exactly what Tom expected. I’ve spent my whole career either working as a technologist or writing about technology. Of course my fears would be cautionary tales of technology spiraling out of control. And it does sum up my real fears pretty well. I’m afraid of all the things that we create that end up backfiring on us.

We invented cars to help us get around, but they’ve become one of the leading killers in the world; there were 32,719 motor vehicle-related deaths in 2013. Guns, meant to keep us safe and help us acquire food, are set to kill even more people than cars this year. We invented industrial agriculture to solve hunger, but now we are plagued by obesity and heart disease. And our technologies are accelerating climate change and poisoning the ocean.

But over the past year, since Tom first asked me these questions, I’ve started to think about other, weirder, interpretations of Frankenstein and The Terminator.

Frankenstein is about the horror of reproduction. Who are we, really, to play god and bring life into the world, merely to suffer as Victor’s creation did? How dare we try to alleviate our own misery and loneliness by dooming a new generation to more of the same? Frankenstein is about our guilt in perpetuating life.

The Terminator picks up a bit further down the line. Our children have grown up and they no longer need us. Not only that, but they’ve decided that our very existence is noxious. Unlike the machines in The Matrix, they’re not content to put us up in an old folks’ home and let us live out our remaining years watching TV while they live off our pension checks. No, they want us gone, wiped from the planet entirely. The film’s opening scenes, in which a naked man is assaulted by hooligans for absolutely no reason, a family torments a clueless waitress, and basically everyone proves to be vapid and insufferable, do little to make the case that the machines—our children—are wrong. The greatest horror of all is that we think they might be right.

This post was adapted from my free weekly email newsletter.

Guns for Armes: The Amazing True Story of the World’s First Real Life Superhero


Every night dozens of people around the world don masks and costumes and venture into the streets to fight crime.

Phoenix Jones and Master Legend are perhaps the most famous, but there are hundreds of costumed would-be crime fighters, and their activities range from attempting to apprehend criminals to watching over the homeless while they sleep to make sure their possessions aren’t stolen.

These caped crusaders aren’t mutants, aliens or cyborgs — they’re just concerned citizens. They have no superhuman powers. But with advances in technology — such as exoskeletons and bionic limbs — you might think it’s only a matter of time until we see the first grinder superhero.

Actually, we’ve had him for quite some time.

The first real-life superhero may have been J. J. Armes, a private detective who has been active in El Paso since 1958. His super power? A gun implanted in one of his prosthetic hooks that he could fire with his biceps — without using his other hook.

Armes lives in a mansion, surrounded by lions and tigers. He always wears three-piece suits, and travels by limo driven by his bodyguard-cum-chauffeur. It’s no wonder Ideal Toy Company manufactured a line of action figures based on his likeness, and comic book mogul Stan Lee wants to make a movie based on his life.

Origin Story

Armes lost both his hands at the age of 12, he told People in 1975. A friend brought over a box that, unknown to Armes, contained railroad dynamite charges. When Armes opened it, his hands were blown off at the wrist. His friend was unharmed.

His hands were replaced with hooks, but he kept playing sports. He even taught himself how to write with the hooks. His life changed again at the age of 15 when he was recruited to appear in the film Am I Handicapped?, he told Texas Monthly in 1976. He quit high school, moved to Hollywood, and went on to appear in 13 feature length films.

But eventually he decided to turn his attention to crime fighting. He moved to New York City to study psychology and criminology and graduated with honors by the age of 19. He then returned home to El Paso and started his private investigation service, eventually becoming better known to the children of the city than the president of the United States.
He made national news in 1972 after rescuing Marlon Brando’s son from kidnappers in Mexico. He now commands multi-million dollar fees, and has, in addition to the limo, a fleet of expensive vehicles, including a Rolls Royce, a Corvette and a helicopter.

His for-profit crime fighting stands in stark contrast with Master Legend and Phoenix Jones, who work day jobs assisting the disabled and elderly. But Armes is deeply religious and says he stays committed to being a PI, despite being so wealthy that he could retire at any time, because of his devotion to God. He doesn’t smoke, drink or swear. He doesn’t drink coffee, let alone take any illegal drugs.

And his crime fighting has come at a cost — he’s survived multiple assassination attempts and his life is in constant danger.

Secret Origin

Well, that’s the story that Armes wanted people to believe back in 1976, anyway. Texas Monthly writer Gary Cartwright did some digging that year and found that Armes’ story didn’t add up.

Armes’ real name is Julian Armas. He was born in 1939 to Mexican immigrants, not Italian immigrants as he claimed. His friend didn’t find the dynamite that blew off his hands next to a railroad track. They broke into a rail house and stole it.

The Academy of Motion Pictures had no record of Am I Handicapped?. NYU had no record of Armas, or Armes, ever attending the school, let alone graduating. Nor was there any record of his mentor Max Falen having taught there.

“Old friends recalled when he returned from California. Julian, or Jay J. Armes as he now called himself, drove an old, raggedy-topped Cadillac with a live lion in the back and a dummy telephone mounted to the dashboard,” Cartwright wrote. “He would pull up beside the girls at the drive-in and pretend to be talking to some secret agent in some foreign land.”

There was also no indication that he really had a vast network of PIs at his disposal.

He does own a big house, but it was located in a poor part of town and was only worth about $50,000 in 1975. The helicopter certainly wouldn’t have been able to fly. What money he had likely didn’t come from his PI work, Cartwright wrote, but from lucrative real estate deals facilitated by his wealthy friend Thomas Fortune Ryan.

It’s apparently true that he brought Brando’s son back from Mexico, but other PIs are dubious about his methods. “They didn’t believe the part about the three-day helicopter search in which Jay Armes survived on water, chewing gum, and guts, but they all know the trick of grabbing a kid,” Cartwright wrote. “You hired a couple of federales or gunsels. The problem wasn’t finding the kid, it was getting him out of the country.”

Armes came mostly clean in his autobiography, published later in 1976. He admitted his real name is Julian Armas. He didn’t admit to having broken into the railhouse himself, but he didn’t claim that the other boy had found the dynamite charges either. Rather than claiming that a Hollywood director showed up in El Paso and recruited him, Armes admitted that he went to California after high school. He wrote that he appeared in several films, but only in bit roles. He didn’t repeat the story about a mentor at NYU, and claimed only to have gotten a degree in criminology in California before returning to El Paso to become a private investigator.

Better Than Fiction


And not everything about Armes was a lie.

“It is true that Jay J. Armes drives around El Paso in the damnedest black limo you ever saw, armed to the teeth,” Cartwright wrote. “That pistol in his hook is the real McCoy; I watched him fire it.” And he really does have a fleet of vehicles, a flock of wild animals roaming the premises and a closet full of three-piece suits.

Today, at the age of 81, he’s still the head of his company, The Investigators. And his son Jay J. Armes III, who is an investigator himself, has expanded the business into online retail with Spy Mall.

Even if you strip away the fabrications and exaggerations you’re left with an astounding tale. As Cartwright wrote: “The real story is of a Mexican-American kid from one of the most impoverished settlements in the United States, how he extracted himself from the wreckage of a crippling childhood accident and through the exercise of tenacity, courage, and wits became a moderately successful private investigator. There is more sympathy, drama, and human intrigue in that accomplishment than you’re likely to find in any two or three normal studies of the human condition.”

Why then has his story largely been forgotten by the national media? Maybe it’s because of the tall tales in the beginning. Or maybe it’s because the media has little time for aging, disabled minorities.

Either way, J.J. Armes is a name worth remembering.

JJ Armes photo copyright Adam Hicks, licensed under the Creative Commons Attribution 2.5 License. Armes action figure photo via Spymall.

This story originally appeared on Grinding.be in 2014. J.J. Armes did not respond to our request for comment. Special thanks to Trevor Blake.

How Much Work is Too Much?

Facebook COO Sheryl Sandberg has kicked up a mini-controversy by admitting to Makers.com that she leaves the office at 5:30PM every day, and has done so for years. In the Valley, where work is a religion, leaving early is heresy.

Earlier this week “Jon” published The 501 Developer Manifesto, a call for developers to spend less time working.

These calls for less time at the office are counterbalanced by a recent talk by Google executive Marissa Mayer at a 92Y event. Mayer dismissed the phenomenon of “burnout” as resentment and boasted of working 130 hours a week at times.

Research suggests that Sandberg is probably the more productive executive, and those 501ers may be on to something. In a lengthy essay titled “Bring back the 40-hour work week,” Alternet editor Sara Robinson looks at the history of long working hours and reminds us why the 40-hour limit was imposed in the first place: working more than 40 hours a week has been shown to be counterproductive. It’s a relevant conversation for IT workers, who according to ComputerWorld average 71 hours of work per week.

The Research

Robinson draws on research compiled by her husband Evan Robinson, a software engineer, in a white paper titled “Why Crunch Mode Doesn’t Work: Six Lessons.”

Much of the work cited by the Robinsons focuses on manufacturing work, but Sara Robinson writes that knowledge workers actually max out at about six hours per day, not eight. I haven’t been able to find her source, and as of this writing she has not returned my e-mail asking what the source is. It could be a survey conducted by Microsoft in 2005 intended to promote the value of the company’s productivity software. Out of a 45-hour work week, survey respondents considered about 17 hours to be unproductive. That’s 28 productive hours per week, or just 5.6 hours per day given a five-day work week. This is self-reported productivity, and much of that lost productivity is actually due to meetings, but it is not out of line with the industrial research cited by the Robinsons.
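To make that arithmetic explicit, here’s a quick back-of-the-envelope sketch. The 45-hour week and 17 unproductive hours are the survey’s figures; the five-day week is my assumption:

```python
# Back-of-the-envelope check of the Microsoft survey numbers.
hours_per_week = 45       # average work week reported by survey respondents
unproductive_hours = 17   # hours respondents considered unproductive
work_days = 5             # assuming a standard five-day week

productive_per_week = hours_per_week - unproductive_hours  # 28 hours
productive_per_day = productive_per_week / work_days       # 5.6 hours

print(f"{productive_per_week} productive hours/week, "
      f"{productive_per_day:.1f} per day")
```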

Evan Robinson notes that it’s very difficult to measure programmer productivity, so we’re left having to draw on the lessons learned in other fields.

Evan Robinson refers frequently to studies of the effects of sleep loss on performance, including one study conducted by Colonel Gregory Belenky at the Walter Reed Army Institute of Research titled “Sleep, Sleep Deprivation, and Human Performance in Continuous Operations.” This report finds that loss of sleep affects the ability to do mental work far more than it affects our ability to perform manual labor. If nothing else, sleep loss caused by excessive working hours will cut into developer productivity.

Much of the reduction in productivity comes from making serious errors that take time to correct. Evan Robinson writes: “Crunch raises the odds of a significant error, like shipping software that erases customers’ hard drives, or deleting the source tree, or spilling Coke into a server that hasn’t been backed up recently, or setting the building on fire. (Yes, I’ve seen the first three of these actually happen in the last, bleary days of Crunch Mode. The fourth one is probably only a matter of time.)”

Passion is a Fashion

According to Sara Robinson, managers have known for generations that worker productivity declines past a certain limit (eight hours a day, five days a week). What changed to make people want to work so many hours? Robinson credits the emergence of Silicon Valley as an economic powerhouse and the singularly minded, hyper-passionate people who worked there.

According to an entry on Folklore.org, a collection of anecdotes about the creation of the Macintosh:

As pressure mounted to finish the software in time to meet our January 1984 deadline, we began to work longer and longer hours. By the fall of 1983, it wasn’t unusual to find most of the software team in their cubicles on any given evening, week day or not, still tapping away at their keyboards at 11pm or even later.

The rest of the Macintosh team, which had now swelled to almost a hundred people, nearing the limit that Steve Jobs swore we would never exceed, tended to work more traditional hours, but as our deadline loomed, many of them began to stay late as well to help us test the software during evening testing marathons. Food was brought in as a majority of the team stayed late to help put the software through its paces, competing to see who could find the most bugs, of which there were still plenty, even as the weeks wore on.

This led Apple’s finance team to create the famous “90 HRS / WK AND LOVING IT!” sweatshirts.

Robinson blames management guru Tom Peters for spreading the idea of “passion” as a substitute for rest and relaxation to the world outside Silicon Valley. “Though Peters didn’t advocate this explicitly, it was implicitly understood that to ‘passionate’ people, 40-hour weeks were old-fashioned and boring,” Robinson writes. “In the new workplace, people would find their ultimate meaning and happiness in the sheer unrivaled joy of work. They wouldn’t want to be anywhere else.”

Regardless of where it came from, or whether being “passionate” actually affects job performance, the concept stuck and has become the norm for corporate America. In his book Jobs That Don’t Suck, a book of advice for getting and keeping “good” jobs, Charlie Drozdyk writes that ambitious workers should always show up early, stay late, come in on weekends and always take lunch at their desks – even if they’re just playing solitaire the whole time. Drozdyk’s advice to young workers typifies the attitude seen in many companies:

If you just said to yourself, “Forget it: I’m not coming in on weekends unless they pay me to come in on weekends,” then your attitude sucks. Try to get your money back for this book, because I don’t even want to be in the same room with you. You’re an idiot, and you should get a job at the post office (not that people who work for the post office are idiots by any means. Many people in my family were postal employees), where you will get paid for coming in on Saturdays.

37 Signals co-founder Jason Fried has railed against this approach for a long time. In his book Rework, Fried writes:

Not only is this workaholism unnecessary, it’s stupid. Working more doesn’t mean you care more or get more done. It just means you work more.

Workaholics wind up creating more problems than they solve. First off, working like that just isn’t sustainable over time. When the burnout crash comes–and it will–it’ll hit that much harder.

Workaholics miss the point, too. They try to fix problems by throwing sheer hours at them. They try to make up for intellectual laziness with brute force. This results in inelegant solutions.

So why don’t more companies recognize this and encourage workers to spend less time working?

Is It a Conspiracy?

Mayer suggests that what most of us call burnout isn’t caused by overwork, a lack of sleep, poor diet, lack of exercise or missing out on having a personal life. According to Mayer, this non-existent burnout is actually resentment – resentment caused by missing out on the things that are most important to you. Mayer says it’s possible to work 130 hours per week, without getting enough sleep or exercise, as long as you find time for the absolute most important things in your life:

When she has sensed that an employee was becoming fatigued or annoyed with long hours, Mayer has taken the person aside and asked them what really mattered to them outside of work. For one employee, making nightly 1 a.m. phone calls to her team in Bangalore, India didn’t bother her. What did was missing her children’s soccer games and dance recitals because she was stuck at work. “So, we say you’re never going to miss another soccer game or be late for a recital.”

Apart from the fact that this conflicts with the existing body of research on burnout, there’s something a bit sinister about it. Mayer seems to be asking employees to sacrifice their health and the vast majority of the time they could spend with friends and family, with the promise that they don’t have to give up their families entirely. They can still catch soccer games; it’s just everything else that has to go.

It’s easy to see this as a conspiracy on Google’s part: convince the world that burnout doesn’t exist, provide three meals a day on campus so that workers never need to leave, and then maybe someday everyone will just be literally living at work and living for work. But if the research shows that most workers can’t do that much work, then it would be in Google’s best interest NOT to encourage this sort of behavior. It could be that the executives running Google and other companies are simply unaware of the body of evidence suggesting that overtime kills productivity. But one would think that if there were productivity gains to be had by having employees work less, that would quickly become common knowledge. Why are companies like SAS, which has a 35-hour work week, the exception and not the norm?

Another possibility is that it’s denial, not ignorance, that drives companies to ignore over a century of research. Both executives and employees want to believe that they’re special. Executives believe their companies attract the best people, and that anyone who can’t handle working over 71 hours a week will be weeded out and replaced by someone who can.

My best guess is that Mayer probably really is insanely passionate about her job, and that there are a number of executives in the Valley and beyond who expect that everyone can and should work as hard and as long as they do. Robinson writes:

Asperger’s Syndrome wasn’t named and identified until 1994, but by the 1950s, the defense industries in California’s Santa Clara Valley were already drawing in brilliant young men and women who fit the profile: single-minded, socially awkward, emotionally detached, and blessed (or cursed) with a singular, unique, laser-like focus on some particular area of obsessive interest. For these people, work wasn’t just work; it was their life’s passion, and they devoted every waking hour to it, usually to the exclusion of non-work relationships, exercise, sleep, food, and sometimes even personal care. The popular stereotype of the geek was born in some real truths about the specific kinds of people who were drawn to tech in those early years.

I won’t touch the Asperger’s issue. That’s another field of research entirely and I don’t know that it’s safe to draw any conclusions about the nature of that condition, or any other, and its connection to tech work. But there do seem to be a number of outliers who can easily work long hours. They aren’t necessarily more “passionate” about their work, but they have more stamina.

The problem is that all workers are expected to pretend they are these outliers or risk being branded as dispassionate slackers. In his book, Drozdyk specifically advises workers to be “fake and dishonest” and never to complain about anything – especially long working hours. No one wants to look like a slacker when the next round of layoffs comes, but constant overwork will still drag a project or company down.

It’s difficult to determine how many of these outliers exist, but we can look to sleep research to get a sense of how rare they must be.

You Need More Sleep Than You Think You Do

Last year The Wall Street Journal reported on research conducted by Daniel J. Buysse, a psychiatrist at the University of Pittsburgh Medical Center, on so-called “short sleepers” – people who can get by with very little sleep. Buysse doesn’t know how many short sleepers there are – possibly 1% to 3% of the population – but far more people think they are short sleepers. In other words, many of us are more sleep deprived than we realize.

Also last year The New York Times reported on research conducted by David Dinges at the Sleep and Chronobiology Laboratory at the Hospital of the University of Pennsylvania. Dinges concluded that almost everyone needs at least eight hours of sleep per night, and that you can’t train yourself to get by on less.

If fatigue from overwork follows a similar pattern to chronic sleep deprivation, there are probably a lot fewer productivity outliers than we think there are. Although Buysse says that many short sleepers gravitate towards tech careers, we can still expect them to be a minority even at tech firms. According to information released by the Bureau of Labor Statistics, computer and mathematical occupations accounted for 3,406,720 jobs as of May 2011. According to other data released by BLS, there were 237,830,000 people of working age in the United States in 2010. That puts computer and mathematics professionals at around 1.4% of the working-age population (these numbers aren’t perfect, since they are drawn from different years and the categorization of tech jobs isn’t perfect). Even if we assume all short sleepers are of working age, tech companies would need to attract 50–100% of them to fill all positions – and might still come up short. Plus, all these short sleepers would need to be intelligent and technically minded enough for the work.
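Here’s a rough sketch of that estimate in code. The job and population figures are the BLS numbers cited above, and the 1% to 3% short-sleeper range is Buysse’s:

```python
# Rough estimate: what share of short sleepers would tech need to hire?
tech_jobs = 3_406_720          # BLS: computer and mathematical occupations, May 2011
working_age_pop = 237_830_000  # BLS: working-age population, 2010

tech_share = tech_jobs / working_age_pop
print(f"Tech jobs: {tech_share:.1%} of the working-age population")  # ~1.4%

# Buysse's estimate: short sleepers are possibly 1% to 3% of the population.
for short_sleeper_share in (0.01, 0.03):
    needed = tech_share / short_sleeper_share
    print(f"If {short_sleeper_share:.0%} are short sleepers, tech needs "
          f"{needed:.0%} of them to fill every job")  # ~143% and ~48%
```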

Mayer cites Albert Einstein and Winston Churchill as examples of people who worked extensive hours long into their lives, but the truth is most of us are probably more like Archimedes: we need to take breaks in order to make breakthroughs.

How Much Are You Really Getting Done?

Sara Robinson mentioned meetings and e-mail as part of what eats up an eight-hour day for knowledge workers. The Robinsons don’t address whether such activities and meetings count as “work” during a knowledge worker’s day, but there is some evidence that even very boring meetings require a degree of mental effort. In 2009 Canada.com ran a story on research conducted by Jackie Andrade of the University of Plymouth on the subject of doodling. Andrade’s research found that doodling actually sharpens the mind. One possible reason:

When people are bored they have high levels of brain activity, Andrade says. “When you’re bored, you think nothing much is going on, but actually your brain is looking for something to do.”

So we daydream. But daydreaming takes considerable mental effort, particularly when we get stuck in a daydream. “So that sucks mental resources and energy away from the other task we’re meant to be doing,” Andrade says.

Doodling occupies the mind, but isn’t as exhausting as daydreaming. Andrade’s research suggests that it would be better to cut back on unproductive meetings than to expect workers to count meetings as “breaks” (though apparently doodling during boring meetings will make them less exhausting).

This opens up a lot of questions about what counts as “work,” though. We don’t know how much our day-to-day activities – cooking dinner, reading Russian novels, building model airplanes, reading to our kids, helping them with their homework or just lying around watching TV – affect our cognitive performance at work the next day.

Side Projects and Volunteering

Perhaps the most controversial aspect of the 501 Developer Manifesto is this section:

If you:

    • Write a technical blog
    • Contribute to open source projects
    • Attend user groups in your spare time
    • Mostly only read books about coding and productivity
    • Push to GitHub while sitting on the toilet
    • Are committed to maximum awesomeness at all times, or would have us believe it

…we respect you for it. There’s probably some pity in there too, but honestly, it’s mostly respect.

Many programmers contribute code to open source projects on the side not because they want to advance their careers or because they think they’ll be able to build a profitable startup from their work later, but because they legitimately enjoy programming. Unfortunately, the sort of complex thought that goes into open source contributions, other side projects, volunteer work or activism probably does count toward our cognitive limits.

I don’t think any company should try to limit developers’ contributions to open source outside of work, or try to stop employees from joining the boards of non-profits, volunteering at their churches or their children’s schools, or otherwise tell employees how to spend their spare time. I also don’t think developers or other workers should stop doing the things that are important to them just to be more productive at work. These are the things that help us lead meaningful, fulfilled lives. But it’s important to keep our limits in mind – spending too much time on one hobby or side project can take away our ability to focus on another.

For employers, it’s still worth keeping work hours short. For workers, it’s worth being mindful of how you spend your energy.

When Overtime Works

It might be OK to do some overtime under the right conditions. Evan Robinson writes about a study by the Business Roundtable called Scheduled Overtime Effect on Construction Projects:

Productivity drops when working 60-hour weeks compared with 40-hour weeks. Initially, the extra 20 hours a week makes up for the lost productivity and total output increases. But the Business Roundtable study states that construction productivity starts to drop very quickly upon the transition to 60-hour weeks. The fall-off can be seen within days, is obvious within a week…and just keeps sliding from there. In about two months, the cumulative productivity loss has declined to the point where the project would actually be farther ahead if you’d just stuck to 40-hour weeks all along.

In other words, you may be able to use overtime to catch up on lost work or to make a final push toward the end of a project, but push too long and you would have been better off just sticking to a 40-hour-a-week schedule.
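As a toy illustration of that break-even dynamic, here’s a sketch comparing the two schedules. The 40- and 60-hour weeks come from the study, but the 10% weekly productivity decay is a purely hypothetical figure, chosen so the curves cross near the two-month mark the study describes:

```python
# Toy model of the Business Roundtable finding: crunch output vs. steady output.
# The 10% weekly productivity decay is hypothetical, not a figure from the study.
DECAY_PER_WEEK = 0.10

cum_40 = cum_60 = 0.0
productivity = 1.0
for week in range(1, 13):
    cum_40 += 40 * 1.0           # steady schedule, steady productivity
    cum_60 += 60 * productivity  # crunch schedule, sliding productivity
    productivity *= 1 - DECAY_PER_WEEK
    marker = "  <- crunch now behind" if cum_60 < cum_40 else ""
    print(f"week {week:2d}: 40h total = {cum_40:6.1f}, "
          f"60h total = {cum_60:6.1f}{marker}")
```

Run it and the crunch schedule pulls ahead at first, then falls behind for good around week 10 – the same shape the study describes, under an admittedly made-up decay rate.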

This paper was criticized in a meta-analysis paper by H. Randolph Thomas titled “Effects of Scheduled Overtime on Labor Productivity.” Thomas noted methodology deficiencies in the paper, including the fact that the data was taken from only one construction project. However, Thomas and Karl A. Raynar later conducted further research on four different construction projects and concluded in the paper “Scheduled Overtime and Labor Productivity” that: “The results compare favorably to other published data including the Business Roundtable (BRT) curves. Therefore, it was concluded that the BRT curve is a reasonable estimate of losses that may occur on average industrial projects.”

I would also note that there are circumstances under which overwork is impossible to avoid. I would not have been able to make the transition from being an IT worker to being a tech journalist if I hadn’t worked part time as a writer on the side. Many people have to work full time while going to school. Some people have to work more than one job to get by. Sometimes freelancers can’t afford to turn down projects with overlapping deadlines.

It may result in overall lower productivity, but sometimes it’s the only way to get something done. The key is to be aware of the effects of overwork and know when it’s worth putting in the extra effort.

What Happens Next?

Chris Nerney writes for IT World:

Even as you read this, thousands of tech workers at Facebook, Google, Zynga and elsewhere are playing the Sandberg card! And when I say “thousands,” I mean none. Because no one who’s putting in 50 or 60+ hours because they’re afraid not to is going to stick out their neck and demand their lives back from the tech jobs that consume them or the venture capitalists who get wealthy on the backs of overworked and stressed-out technology employees. Not in this shaky economy.

I hope Nerney is wrong. The 501 Manifesto is being widely criticized, but it’s also getting a lot of play. Sandberg stepping forward is a good sign, as is early Netscape developer Jamie Zawinski taking Michael Arrington to task for demanding more from entrepreneurs:

He’s trying to make the point that the only path to success in the software industry is to work insane hours, sleep under your desk, and give up your one and only youth, and if you don’t do that, you’re a pussy. He’s using my words to try and back up that thesis.

I hate this, because it’s not true, and it’s disingenuous.

What is true is that for a VC’s business model to work, it’s necessary for you to give up your life in order for him to become richer.

Meanwhile, companies like 37 Signals and SAS are leading the way on offering sane working hours for employees. But the most important thing might be just taking a long hard look at the data and what it tells us about how much time we should spend working.

This article originally appeared at SiliconAngle.

What are blogs good for?

tobias has a great post up about blogging. I don’t take issue with the thesis of the post, but there’s something there that I’ve been thinking about:

Blogs tend to not express or reflect on political action, taken or organised by the blogger; rather, the act of writing the blog is considered to be political and active in itself. Blogs are not reports. This is not a new position–it is the turf of the political writer (Voltaire, Rousseau, etc.).

This does indeed seem to be the position taken by many, probably most, political bloggers. However, I doubt that blogging is a particularly effective political act.

What has changed since the times of pamphlets is not just the speed of publication, but also the amount of information. I don’t really see the web as a very effective tool for propaganda and persuasion, except perhaps for the very most popular of web sites.

I don’t think Indymedia or American Samizdat are going to win a lot of people over to progressive causes. Nor do I think Little Green Footballs is going to lure a lot of people over to neo-conservative views. But what American Samizdat can do is serve as a medium for communication among “the converted.” It’s a great place to share information. The blogosphere in general serves as a way to share ideas and discuss them, but it is limited to a fairly small audience. The real work of activism must come from other activities; blogging is not an effective political act, and we shouldn’t kid ourselves about it. That doesn’t make it any less worthwhile.

Did the Dean blog or Meetup really serve as ways to recruit new people to the Dean campaign? Maybe a few, but I think the real recruitment happened in the streets and in the big media. What Meetup and the blogs did was organize, solidify, and inform the group. That is what blogs and the web in general are good for.