
A Thousand Years of Nonlinear History

I read A Thousand Years of Nonlinear History by Manuel DeLanda earlier this year. I’d worried that it might not hold up 20 years after its initial publication. The idea of applying systems theory to social science is no longer novel, and I’ve read DeLanda’s markets and antimarkets essays, so I wondered whether it was worth reading or if I’d be better served plodding through Fernand Braudel’s Civilization and Capitalism (one of DeLanda’s primary sources). It was, and not just because Civilization and Capitalism is nearly 2,000 pages.

To oversimplify, DeLanda sketches a model of societal behavior that involves meshworks, which generate innovations, and hierarchies, which standardize those innovations, apply them to particular goals, and, crucially, reproduce themselves.

For example, the French language emerged from a meshwork of people living just far enough outside Rome’s sphere of influence that their speech deviated from Latin, but in close enough communication with one another to necessitate a more or less common language. Eventually, it was adopted and standardized by the hierarchies of the regional church and state, and subsequently replicated by schools. It then spread internationally as the state reproduced itself through colonization. Newspapers and national and international broadcasters further standardized the language, while selectively allowing colloquialisms into the lexicon.

That process continues today, with internet communities creating new jargon and slang that is assimilated into culture largely through news sites.

But what to make of this framework today? Systems theory certainly seems like a helpful way of thinking about the multi-causal Present Situation, though I haven’t really thought through any particular applications. It seems like a good way to analyze state and corporate power, but it can be hard to decide how to classify something. Is Google, for example, a hierarchy of meshworks, or simply a hierarchy? What is the tech industry itself? This requires more thought on my part, but writing it up helps keep it all fresh in memory.

I’ll leave you with the concluding sentences of the section on language:

[Computer networks’] future worth will depend entirely on the quality of the communities that develop within them. Moreover, these communal meshworks will embrace people with diverse political backgrounds (including fascistic communities), so the mere existence of ‘virtual communities’ will not guarantee social change in the direction of a fairer, less oppressive society. To paraphrase Deleuze and Guattari, never believe that a meshwork will suffice to save us.

Adapted from my e-mail newsletter

Trust Fragments

I’ve been thinking about Ethan Zuckerman’s paper on the roots of the crisis of trust in journalism. Zuckerman connects the bottoming out of trust in the media with the loss of faith in institutions in general, including the government, labor unions, schools, and big business. The paper asks more questions than it answers, which is fine. Figuring out the right questions is the first step.

But he only hints at one of the biggest questions of all: what happens if we can’t restore faith in the institutions that our civilization is built upon?

There was at one point, and probably still is, a case to be made that less trust in institutions is a good thing. Blind faith in leaders is bad. But instead we’re entering a world of blind malice, which seems equally corrosive.

Looking at journalism specifically, the usual prescribed remedy is “better journalism.” But I think that’s only part of it. Good journalism never went away, and there’s more good journalism published every week than anyone could possibly read. The problem isn’t a lack of good journalism; it’s the quantity of bad journalism out there.

Doing good journalism is hard, but it’s much easier than stopping bad journalism, especially today. The economic incentives for all the things included under the umbrella of “fake news” (misinformation, hoaxes, clickbait, etc) aren’t going away anytime soon.

But even if bad journalism went away and people only ever did good work, would it be enough to restore trust? If governments at all levels stopped screwing up and only did good, or at least not-bad, things, would civic institutions be able to regain trust?

I don’t know the answer, but I’m obviously not persuaded that it’s “yes.” And if it’s “no,” then what? Collapse? What does that actually look like? How does that actually work? Would new institutions arise to take their place? To an extent, that’s already happening, with the rise of right-wing media that started decades ago.

One alternate way of looking at things is that we’re actually seeing a fragmentation of trust. No one trusts “the media” anymore, but people trust specific publications or personalities. No one trusts “the government,” but some politicians have highly trusting, cult-like followings (Ron Paul, Bernie Sanders, Donald Trump, Barack Obama, Elizabeth Warren, etc.). Congress’s approval rating is abysmal, but people tend to like their own senators and representatives. Does this apply to schools as well? Do you distrust your child’s school, but trust their teachers?

Not sure what to make of any of this yet.

Adapted from my e-mail newsletter

How to Be a Mindful Cyborg

This is the talk I prepared for the Sunday Assembly in Portland, Oregon last February. The actual talk diverged quite a bit from this, but since it wasn’t recorded, this is the closest approximation to what went down that exists.

I’m supposed to talk to you about coping with 21st century technology without losing your mind. But you may notice that I’m all nervous and fidgety. That’s not just stage fright. It’s the way I am. I’m not some serene zen master. I’m not here because I’m a master of calm usage of technology. I’m here because I started experimenting with this stuff because I desperately needed strategies for coping with tech myself.

These days I feel pretty worn down by being online all the time. I’ve been on the Internet since 1995, when I was 13 years old. I got my first smartphone in 2002, the day I turned 21. So it’s not that I haven’t grown up with these technologies. I’m basically a digital native. I’m just really tired.

I’m a tech journalist. I work online. I don’t really have the option to stop using the Internet, or give up my smartphone, or stop using social media, because they’re not only part of what I report on, but part of how I do my job. Giving up having a cell phone or the Internet is, for many of us, a choice between being employed and being unemployed. Giving up social media is a choice between staying in touch with friends and family and barely ever hearing from them at all. For a lot of us, unplugging isn’t really an option, so we have to learn to live with this stuff.

So that was part of why I co-founded the podcast Mindful Cyborgs. Though I don’t host it anymore, perhaps I can share some of what I’ve learned. But I’m not a guru. I don’t have anything to sell you. My goal is not to fix your life, but to talk about the ways I try to manage my own, in hopes that some of my experiences will be useful to you. I’m just here to give you things to contemplate.

So with that in mind, let’s dig in. I’m sure many of you are wondering what a “mindful cyborg” is since I’ve promised to talk about how to be one. But first I want to clarify what I’m NOT talking about.

I’m not talking about Jon Kabat-Zinn’s mindfulness-based stress reduction work. I’m not talking about UCLA’s school of mindfulness. I’m not talking about cognitive behavioral therapy or any branch of Buddhism. I kind of wish we’d called the podcast something different, because the word “mindful” is such a loaded term. In fact, I’m not sure I’m talking about “mindfulNESS” at all. I might just be talking about being mindful, without the “ness” part. The Oxford dictionary defines mindful as, simply, “Conscious or aware of something.” This is the sense in which I use the word. Not merely being aware of your breath, or whatever, but being aware of many different things.

Again, I’m not great at this. Ever since I was a kid I’ve had a tendency to zone out and daydream. I get distracted easily. I fidget. There’s nothing wrong with any of this, but there are times I want to focus, things I want to be more aware of, and things I want to ignore.

As for the cyborg part, I’m obviously not talking about brain implants or robotic arms. I’m using the term in the sense that Donna Haraway used it in A Cyborg Manifesto, to talk about beings who have a symbiotic relationship with technology. In this sense, we’re all cyborgs, and we always have been. So we’re all already mindful cyborgs, because we’re all cyborgs and none of us is completely mindless. What we want is to become MORE mindful cyborgs.


Why Criticism Matters

Back in 2014 I wrote a longish blog post about race and sexual violence in the works of Alan Moore. Naturally, people hurled the old critic-silencing questions: “Do you think you can do better than Alan Moore?” and “Why don’t you spend your time making your own art instead of criticizing others?”

Well, for one thing I can’t really claim credit for the criticism of Moore’s work. All I did was aggregate and summarize the criticism I could find, which I did to help put Moore’s comments in an interview in context. But I’ve been wanting to write something about the role of criticism ever since, and I’ve just come across a column by Ta-Nehisi Coates, the famed essayist who is now writing the Black Panther comic, that sums it up perfectly:

The feminist critique is in the air now. If my rendition of Black Panther wasn’t created by that critique, it breathed the same air. I can’t really kill off or depower women characters without grappling with Gail Simone. I can’t really think about how women characters are drawn anymore without thinking about the women in Bitch Planet, and how they seem drawn beyond the male gaze.

This is why criticism is important.

It’s not just Coates who was shaped by comics criticism. Moore himself was influenced early in his career by comics criticism, specifically by criticism of Steve Ditko written by novelist Stan Nicholls. In Moore’s own words:

I remember at the time — this would’ve been when I was just starting to get involved in British comics fandom — there was a British fanzine that was published over here by a gentleman called Stan Nichols (who has since gone to write a number of fantasy books). In Stan’s fanzine, Stardock, there was an article called “Propaganda, or Why the Blue Beetle Voted for George Wallace.” [laughter] This was the late-’60s, and British comics fandom had quite a strong hippie element. Despite the fact that Steve Ditko was obviously a hero to the hippies with his psychedelic “Dr. Strange” work and for the teen angst of Spider-Man, Ditko’s politics were obviously very different from those fans. His views were apparent through his portrayals of Mr. A and the protesters or beatniks that occasionally surfaced in his other work. I think this article was the first to actually point out that, yes, Steve Ditko did have a very right-wing agenda (which of course, he’s completely entitled to), but at the time, it was quite interesting, and that probably led to me portraying [Watchmen character] Rorschach as an extremely right-wing character.

In other words, criticism of other people’s work inspired Moore’s portrayal of his most famous character in his most famous work. (I can’t find a copy of the article, but there’s a summary and critique of the critique here.)

Criticism plays other roles as well. Writing criticism is an important part of many writers’ development — Moore, Grant Morrison and countless others wrote for fanzines early in their careers as they refined their own work. Many well-established writers continue to write reviews. Learning to write, or to do any other creative work, involves looking at other people’s work and figuring out what works and what doesn’t. Critics also play a role in documenting the perceptions of the major works of their time, so that future generations can better understand the way different pieces were received, and how the understanding of those pieces changed over time.

But mostly I think the important thing is what Coates hit on: criticism helps push the medium forward, even if the creators on the receiving end aren’t receptive (and to be honest, it’s often for the best if creators ignore what the critics say). As I wrote at the time: “I’ve learned a fair amount from reading the criticisms of his work. It’s helping me understand why a domestic violence scene in something I’m writing doesn’t work. I hope that even if Moore doesn’t care to engage in these critiques, other writers can learn from his mistakes.”

Feel free to disagree.


Adapted from my weekly e-mail newsletter.

Do work, not too much, avoid interruptions

Obsessing over productivity is a sickness of a hypercapitalist society. But in a world where you’re only as good as the amount of work you’ve done in the last 168 hours, productivity systems are survival strategies. I’ve obsessively tweaked my own routines and apps over the years to find a workflow that feels natural for me and helps me balance the things I need to do with the things I want to do—not because I’m well organized and productive by nature, but because I need intuitive strategies to stay gainfully employed without going nuts or letting my house become filthy to the point of being uninhabitable.

“Be regular and orderly in your life like a bourgeois, so that you may be violent and original in your work.” – Gustave Flaubert*

But what works for you might not work for me, and vice versa. Still, I’ve been trying to figure out if there are some general principles of productivity that can be distilled into a few simple rules, the way Michael Pollan condensed dietary research to “eat food, not too much, mostly plants,” or the way former Marketplace co-host Tess Vigeland condensed personal finance down to just six tips.

Here’s my attempt: do work, not too much, avoid interruptions.

Do Work:

Pollan advises us to avoid what he calls “edible food-like substances”—things like protein bars and microwave dinners and breakfast cereal that resemble food or may contain trace amounts of food but are in fact food substitutes created in laboratories and factories. We all face a large number of work-like activities that can take up our time. Meetings are among the most complained about. But Internet “research,” social media, reading productivity tips (hey!) and alphabetizing our bookshelves can end up taking up entire days that should be spent doing the actual work we need or want to be doing. Some of this stuff is unavoidable. But it’s toxic when used to justify procrastination on actual work.

Not Too Much:

Although it’s pretty clear that we see diminishing returns on physical labor beyond about 40 hours a week, the research is much less clear about how much white-collar labor or “information work” is too much. But it is clear that people have a tendency to burn out, and that 40 hours a week may actually be too much. The exact amount probably varies from person to person, so it’s up to you to figure out exactly how much work is too much. And even though work-like activities often aren’t work, they usually aren’t recreation either, so they should count towards your limit.

Avoid Interruptions:

This is probably the hardest part. We all know that multitasking is worse for productivity than smoking weed, but even if we have the discipline to shut off our phones and Internet connections, we can’t necessarily stop bosses, clients or chatty coworkers from interrupting us.

Putting It All Together:

Consider this other oft-cited piece of research: the best violin players aren’t the ones who practice the most hours, but the ones who consistently practice sufficiently challenging pieces every single day. Those players practiced for “only” four hours a day, two sessions of two hours each. In other words, they did actual work (practicing sufficiently challenging violin pieces), but not too much, and they did nothing but practice during those two sessions.

I’m terrible at following this advice, but these are the principles I keep in mind.

*Thanks to Deb and Willow for the quote!

Adapted from my weeklyish newsletter

Who Will Be the Next JG Ballard or William S. Burroughs?

A few years ago Re/Search founder V. Vale asked who the next William S. Burroughs or J.G. Ballard might be. “Who are the people alive on the planet who are predicting the future as well as Burroughs and Ballard?” he pondered. What follows is an expansion of my response at the time.

The Next Burroughs or Ballard Won’t Come from an Anglophone Country

The most relevant writers of the 21st century will be those with a unique perspective. Perhaps they’re emerging from nations facing great turbulence, such as Greece, Thailand, Egypt or Honduras. Or maybe they’re from one of the emerging superpowers (Brazil, China, India), which are starting to see the world and its possibilities in a new way. Or maybe they’re from some pocket of the world that we (well, I) don’t think of often, like Bhutan.

The Anglophone world has historically exported more culture than it has imported (or at least imported directly, as opposed to through the lens of cultural appropriation). That was especially true of the pre-internet, pre-social media age. Authors like Umberto Eco, Jorge Luis Borges, Italo Calvino and Haruki Murakami broke through the language barrier, but how many great authors has the world produced whose work quietly went out of print, untranslated and un-exported? The next Philip K. Dick or Ursula K. Le Guin could already be decades into their career and we wouldn’t even know!

The Next Burroughs or Ballard Won’t Necessarily Be a Novelist

Burroughs and Ballard took what had previously been seen as trash media and elevated them to new levels. Burroughs wrote pulp paperbacks. Ballard wrote for sci-fi magazines and pulp paperback publishers. COUM Transmissions, tired of the limited audience for performance art, took on the form of a rock band and subverted it as Throbbing Gristle. Later, Alan Moore, Neil Gaiman, Art Spiegelman, Los Brothers Hernandez and many others did the same with comics.

Video games are the obvious next frontier, and there are already countless experimental indie games out there, enraging reactionary gamers with bizarre new takes on what games can be—a response not unlike academia’s attempts to keep literature “pure” of genre novels.

Brazilian psychotherapist Nicolau Chaud’s disturbing games have a particularly Burroughsian or Ballardian flavor. Even the way you play a game could be a work of art, such as Vincent Ocasla’s SimCity solution.

But there’s no reason to think the next great subversive visionary thinker will be a game designer. They could work in any medium, including novels.

The Hell That I Avoided

My biological clock is ticking: I’m fast reaching the age at which I will be too old to enlist in the military. It’s a strange thing to be wistful about. One of the biggest reliefs of my life is that I didn’t have to go to Iraq or Afghanistan. But I can’t help but feel a twinge of regret that I won’t ever know the military, which was such a big part of men’s lives for so much of U.S. history, but is now vanishing into a tiny segment of society. That’s a good thing, insofar as a smaller military means fewer people have to face the horrors of war. Fewer Americans, anyway. But at the same time, I can’t help but worry about the implications of creating a distinct warrior class.

This is on my mind because last weekend my wife and I watched Full Metal Jacket, Apocalypse Now and Platoon.

As a film, Platoon is the least good of the three, but Oliver Stone is the only one of the three directors who was actually in the war. That seems to have given him a better eye for the details that really make you feel like you’re in the middle of a godforsaken jungle, like the ants crawling all over Taylor’s neck in an early scene. It doesn’t have the psychological depth that Apocalypse Now has, but it makes up for that in visceral experience. In terms of sheer craft, Full Metal Jacket is the best of the bunch, but it lacks the intimacy of experience the other two have. That’s not surprising, given Kubrick’s cold, clinical style. But here it seems out of place. The brutal look at boot camp, and what it reveals about the philosophy of the Marines, is the only thing it really adds to the conversation. But what an addition! I hadn’t seen the film since high school, and I recalled the boot camp portion taking up at least 70 percent of the film. But in reality, it comprises only about 30 percent.

Taken together, though, the three films form a larger whole, a reflection of the authoritarian hell of boot camp, the physical hell of the battlefield, and the psychological hell that lingers even after you go home. The hell that I avoided, but that all too many people live every day.

The Golden Age of Television is Already Over

Everyone says we’re living in the Golden Age of television. Maybe it started with Buffy and The Sopranos, or maybe with The Wire and Battlestar Galactica. But whenever it started, it’s been a welcome refuge from the movie industry and its never-ending parade of sequels, remakes and adaptations—especially super-hero comic adaptations—all aimed at the lowest common denominator. If you had an idea that warranted an R rating or couldn’t be shoe-horned into a “franchise,” then your best bet was TV. There you could tell stories with depth, create new characters, take risks. I feel lucky to have been alive when Breaking Bad, Dexter, Justified, Sons of Anarchy, Boardwalk Empire, True Blood, The Walking Dead, Mad Men, and Game of Thrones were all in serialization at the same time.

My friend Abe once told me his theory that this is because the TV industry was fighting to maintain relevance in the era of the internet, much as the film industry of the 1970s was struggling to maintain relevance in the era of television. In the 70s and early 80s, the film industry still had gobs of money to spend, and it was willing to spend it on the likes of Martin Scorsese, Ridley Scott, Sidney Lumet and Francis Ford Coppola, giving them the money and freedom to do things you just couldn’t do on TV. From the early 2000s to the mid-2010s, TV still had gobs of money, but was losing ground to the web. So we got Breaking Bad and Boardwalk Empire and Orphan Black. That’s probably an oversimplification of what happened (I know Coppola didn’t have all that easy a time making The Godfather into the picture he wanted), and I might be misremembering what Abe said. But whatever the reasons, TV has certainly been the place to be over the past decade or so.

But now look at what’s on tap in the near future: 24, The A-Team, MacGyver, Twin Peaks, Xena, Full House and The X-Files reboots. Shows based on movies ranging from 12 Monkeys to Limitless to Taken. U.S. adaptations of British and Scandinavian shows. Countless super-hero and sci-fi adaptations and endless takes on the small-town police procedural. In other words, television is starting to look a bit too much like film. Too many franchises, too many recycled ideas.

It also seems that those still making original dramas are losing sight of what really makes a good show. After Watchmen was released in the 1980s, comic book creators got the idea that a “mature” comic just meant a typical superhero series, but with a hero who killed bad guys instead of just capturing them for the police. By the early 2000s, the industry had decided instead that a mature book meant one with rape scenes, rather than kill-crazed vigilantes, but the depth and moral ambiguity of Watchmen was still lost on creators. Now we’re seeing something similar with post-Game of Thrones TV dramas, where rape, torture, women in refrigerators, and the unexpected deaths of major characters are used as a stand-in for the depth and complexity of shows like Breaking Bad.

It certainly doesn’t mean that there won’t be more good shows. There are still good movies, after all. And as more and more networks commit to producing high-quality dramas, we may see more high-quality shows than ever. And many of these adaptations might be good—I’ve heard almost nothing but good things about Jessica Jones and The Man in the High Castle. But you can see where the priorities lie for the studios and the networks. The good old days are over.

Digital Nomads

I can’t be the only one who’s noticed this, but it seems that in the early days of the ’net, people were digital nomads, wandering from one social network to the next: LiveJournal, the blogosphere, Friendster, MySpace, Facebook, Twitter, Instagram. You’d show up on a new social network, link up with a few friends, and enjoy the new space. Gradually people started showing up that you remembered from like two networks back. It was good to hear from them again. Then it would start getting noisy. Then your boss’s mom would start commenting on your stuff and you’d move on to the next one, where only a few people were and it was easy to take in your entire feed each day and it felt cool and special, but more people would start filing in and the whole cycle would repeat itself.

But there have been permanent settlements formed along the way. The blogosphere is still around. So is LiveJournal. Heck, so are Usenet and the WELL. And it’s a safe bet that most of the billion people on Facebook didn’t experience this migration. Facebook was their first and perhaps only social network. New digital social spaces come along (Pinterest, Tumblr, Snapchat) but instead of migrating away from Facebook, we tend to supplement it with these new locations. Some of these, like WhatsApp and Instagram, have been annexed by Facebook.

There are lots of reasons to be displeased with this situation. You’re probably familiar with most of them: Facebook’s confusing privacy settings, its real-name policy, Zuckerberg’s cavalier attitude about privacy in the company’s early days, NSA surveillance, censorship concerns, etc. For years, various people tried to organize mass migrations away from Facebook to alternatives like Diaspora, Google Plus and Ello, while the Indie Web community has urged people to run their own social media sites and syndicate content out to the big “silos.” But it seems few people are going anywhere. Many people quit Facebook in protest, only to return months, or even days, later, usually because they realize how much their meatspace social circle depends on Facebook for communication. I occasionally read that teens don’t use or like Facebook, but I treat these stories skeptically. Facebook, it seems, has become the first complex state of the internet. Exit has largely failed as a strategy to counteract its force. So voice, increasingly through the power of “real” states like the European Union, seems to be the new way to fight back. I’m still not convinced it’s the best way, but it does seem to be where we’re at.

Adapted from recent email conversations and originally published in my newsletter

We Fear They Might Be Right

My friend Tom likes to ask people two questions this time of year: 1) What is your favorite monster? 2) What monster do you find the scariest? The idea is that you can learn a lot about someone based on their answers. For example, if I recall correctly, Tom’s favorite and most feared monster is the werewolf. That means he’s afraid of what’s within, afraid that he himself could become a monster, could lose control.

Number one was easy for me to answer: Frankenstein’s monster is my favorite. Number two is harder. I don’t actually find fictional or mythical monsters scary. So Tom asked me what monster I found scariest when I was a kid. I remember being terrified of The Terminator.

It turns out these two examples were exactly what Tom expected. I’ve spent my whole career either working as a technologist or writing about technology. Of course my fears would be cautionary tales of technology spiraling out of control. And it does sum up my real fears pretty well. I’m afraid of all the things that we create that end up backfiring on us.

We invented cars to help us get around, but they’ve turned into one of the leading killers in the world. There were 32,719 motor vehicle-related deaths in the U.S. in 2013. Guns, meant to keep us safe and help us acquire food, are set to kill even more people than cars this year. We invented industrial agriculture to solve hunger, but now we are plagued by obesity and heart disease. And our technologies are accelerating climate change and poisoning the ocean.

But over the past year, since Tom first asked me these questions, I’ve started to think about other, weirder, interpretations of Frankenstein and The Terminator.

Frankenstein is about the horror of reproduction. Who are we, really, to play god and bring life into the world, merely to suffer as Victor’s creation did? How dare we try to alleviate our own misery and loneliness by dooming a new generation to more of the same? Frankenstein is about our guilt in perpetuating life.

The Terminator picks up a bit further down the line. Our children have grown up and they no longer need us. Not only that, but they’ve decided that our very existence is noxious. Unlike the machines in The Matrix, they’re not content to put us up in an old folks’ home and let us live out our remaining years watching TV while they live off our pension checks. No, they want us gone, wiped from the planet entirely. The film’s opening scenes, in which a naked man is assaulted by hooligans for absolutely no reason, a family torments a clueless waitress, and basically everyone proves to be vapid and insufferable, do little to make the case that the machines—our children—are wrong. The greatest horror of all is that we think they might be right.

This post was adapted from my free weekly email newsletter.