
    The machineries of joy

    Posted by Sean at 05:00, January 13th, 2010

    Sarah Hoyt is one of those blog friends who’ve become rare and valued friends offline. Like many other first-generation Americans, she has a special appreciation for our freedoms borne of having grown up elsewhere. She can get good and cranky about the current state of things, but she’s inquisitive and hopeful about the future. And she’s wicked, witty fun even before the Scotch starts flowing.

    Her day job (and, judging from the hour at which an e-mail sometimes arrives from her, her night job, too) is as a sci-fi and fantasy writer, and her optimism is evident in her writing—there’s mischievous good humor on nearly every page. Like real life, the worlds Sarah creates come with evil forces that persons of good will and fortitude have to fight. And they win. Her most recent book, the first in a space-opera series, is Darkship Thieves, and the following is a sort of companion piece she kindly offered to let me publish here about why being optimistic, and not just cranky, about technology is the right attitude.

    *******

    Death. Turmoil. Despair. Disaster surrounds us on every side, a precipice threatening to swallow us with its dark and relentless horrors.

    Just talk to anyone. Ask them what they expect of the future and you’ll get Ecclesiastes. No one wants to go around bouncing and smiling and telling everyone to cheer up because the future is almost certainly better than the present.

    No one goes around saying that, but it is true nonetheless. Sure, of course, some periods in history in some places on Earth have been terrible or disgusting, or sometimes both. But taken as a continuum, the human condition of the majority of normal people on Earth has been a slowly ascending line. So that we are now, in aggregate, the best fed, healthiest, wealthiest humans in history. Which is why expecting the future to bring us horrors untold is a little daft. Or an attempt to sound interesting and thoughtful and get tenure and grants and big book advances.

    I suspect the other reason for it is a real sense of angst due to accelerating change. I can’t even imagine how my children will live when they are my age. To be honest, I’m not too sure about how I’ll live in ten years. We haven’t even yet seen all the results of the innovations of the last ten years.

    Take the Internet. Ten years ago it was already here, but as one resource among many and worse than a lot of them. Even I, who was an early adopter—my husband being a techno-geek—did only a few things online. I got email—mostly from editors and family. I read a few message boards. And I looked at my library catalogue. Today I read all my news online; I read a good deal of my fiction online too. Via email and chat programs, I have as much—occasionally more—of a social life online as face to face. I do my preliminary research for work online. I shop online for specialty items—such as older son’s size 17 shoes—which would otherwise be impossible to find in a medium-size town in Colorado. The list goes on, and I’m sure you know it as well as I do.

    So—answer me this—what social changes will result from the Internet? Look beyond what it was designed to do and at what it is obviously doing—or could do.

    First, the obvious—the Internet allows one to pick friends who have the same interests and opinions, something those of us in smaller towns might not find in our immediate neighborhood or in our professional lives. Given a few more years, it is quite possible it will allow us to seal ourselves hermetically in our comfort zones and never stray out of them. A few more years still, and I can see several cultures with their own lingo and beliefs so distinct that they will necessitate interpretation for outsiders. (The gentleman in the third row—yes, you, with the glasses—I heard that “But we already have computer professionals!” Be nice. They do try to translate. It’s not their fault our eyes glaze over.)

    Then the slightly less obvious—what effect will the Internet have on future generations? In past centuries, given the limited selection, outliers often married people who were not outliers, or didn’t marry at all. This meant a certain genetic reinforcement toward what was considered normal in the given society, physically and—more importantly—mentally. Now I know several couples who met, dated over the Internet and married across the world. About half of them have children. Will the continuation of this trend cause humanity to speciate? Will certain odd characteristics accumulate in a sub-group till it’s no longer part of the human race?

    I don’t know. What I know for sure is one thing: that the rate of acceleration of technology is increasing, as future discoveries build on the past ones. It’s no use at all my worrying about the future impact of the Internet, as though it were the only thing that will be different in the future, because other innovations I can foresee are just about to hit—life extension; gene modification; easier acquisition of knowledge. And then there are myriad other innovations I can’t foresee but that are coming as surely as the ones I mentioned.

    People sense this at some level. It is a great part—I think—of what fuels the widespread panic about the future. In his small portion of the world, just about every one of us finds himself unable to predict where he’ll be in the future. By which I mean the very near future. Within the next ten years or less. My own profession is beset by the ease of e-publishing, the collapse of traditional bookstores (and that’s more the effect of Amazon and online used bookstores than of ebook readers, whose effect won’t be seen for some years yet), the collapse of the traditional power structure from publishers to distributors to bookstores. Everything is in flux. No one knows what the winning model for the future is. My dentist—talking about something else—mentioned he doesn’t know what he’ll be doing in ten years. You see, turns out you can implant a tooth-bud in the gum, and the tooth will grow and replace the one that has a cavity or is decaying. The technology, he says, will be available for human use within ten years. A friend of mine is married to a man who studies how to clone eyes, so he can replace the eyes of middle-aged people. I wonder how that makes my eye-doctor feel.

    Our brain is simply not adapted to these conditions. I recently watched a Terry Pratchett movie and heard humans described as the place where rising ape met falling angel. I don’t think he meant it theologically, since Pratchett is not religious. It doesn’t need to be religious to make sense. We are all of us made of the rational part of our brain, which looks head on at developments and decides what they mean. And then we’re composed, also, of instincts, impulses, and tendencies that we carry around because they were useful to our ancestors.

    One of my friends has a sign on her wall that I can’t quote verbatim but that says something about her being a gatherer while her husband is a hunter, which explains their different approaches to buying shoes. It’s funny, but it’s probably also true, in that their ancestors developed different approaches to dealing with life. Take women’s tendency to congregate in vast social women-only groups—please, do take it; I don’t want it—that enforce internal conformity and mutual “defense.” It was probably born of something we see in the most primitive tribes, where women do their gathering in vast groups and watch the kids collectively. Be the pissy woman—hi, everyone—who will not conform and wear her bone on her nose just the way every other woman in the tribe does, and the other ones are liable to neglect to tell you when your toddler is wandering off into the forest while you are busy picking berries.

    Or take socialism—I’m quite done with it—which is a system that makes perfect internal sense to us because it works in very small groups and in an economy of extreme scarcity, which is what our ancestors lived in. The groups that insisted the haunch of mammoth be divided among everyone in the tribe, even the pregnant women and the children, left more descendants than the others, and so we are running around with the idea of a fixed economic pie in our minds, and the suspicion that if someone has more than we do, they must have stolen it. These built-in assumptions are probably partly created by the way our brains are formed (no, I don’t understand it either, but one of my friends is a biologist, and he seems to believe it might be so) and partly by deep-set culture: ideas so old and unexamined that they are passed in language and in gesture. The fact that our thinking brain has plenty of examples of redistribution and zero sum economics creating misery and death doesn’t seem to make any difference, against that kind of programming.

    The ape is what is screaming, right now, shaking his club in the face of the approaching future, running into the cave and trying exorcism rituals as the ground trembles and shifts beneath his hairy feet.

    Poor ape. He doesn’t know it, but he’s destined to lose. Not that he’ll go away. At least I hope no one finds the technology to dispose of him. Without him, humans wouldn’t be humans. (There is a version of transhumanism that seems to expect just that and always strikes me as being as repugnant as the attempt to change humans by political means. The Soviet Man is no more appalling than the poreless god-like man of the future who lives entirely in VR.) But try as he might, he can’t stop innovation and the ever-cascading change pouring down on him. And truly, he should embrace it, because technology will afford him the chance to make his peace with the angel, and to stride forward into the future, if not a more coherent being, at least a happier one.

    Societies based on scarcity; societies in which everyone has to live a certain way in order to survive; societies led by a strong leader—all those are trends of the past, trends of the ape-brain. The trends of the future are abundance; societies where you can live your life the way you want, because there is a gadget, a pill, a technique to do just what you wish to without negatively affecting your neighbors or peers; and distributed knowledge and power.

    Will it proceed without a glitch? Of course not. Our ape will fight and scream and fling poo. For some time and in some places, he will perhaps manage to hold progress at bay, and probably create quite a dark and dank lair for himself.

    But technology will leak even in there. Things will change. The future will march in. In lurches and sideways dodges. In bobs and weaves. The gates are open and it’s too late to shut them. The future is coming. And it’s very bright indeed.

    *Sarah A. Hoyt is a writer of science fiction (and fantasy and mystery) which, while giving her a certain interest in trends and effects of technology, does not, by any means, make her a prophet or even a Cassandra. Take her predictions as you might. But she would place a strong bet on these.*


    The best that has been thought and written

    Posted by Sean at 21:57, January 10th, 2010

    BTW, if you’re interested in reading about problems in higher education, rather than just in rock-throwing at perceived elites, Joanne Jacobs and Erin O’Connor both post frequently about them. Jacobs’s blog is one of the first I ever began to read nine or ten years ago, after Virginia Postrel linked to her several times. O’Connor is a former professor in the English Department at Penn (I’d been graduated by the time she was hired) who left for other work and writes a lot about how well our colleges and universities are doing at nurturing the life of the mind. One of her most recent posts is particularly interesting. Citing a Time interview about university accountability, she writes:

    I take [interviewee Kevin] Carey’s point that right now you see too many colleges and universities admitting people they know aren’t ready—and not taking responsibility for their atrocious attrition rates. There is a betrayal of youth happening there—false promises attached to a lot of money and also to a vital period in someone’s life. At the same time, I think fewer people should be going to college, that college should be harder, and that means people are going to flunk out. We need to take on reforms that have that in mind—and that means, among other things, valuing vocational training much more, taking the trades seriously as viable career plans, and making the high school diploma mean something.

    The phenomenon O’Connor notes—the idea that everyone must Go to College in the first place—is, to my knowledge, as common in the heartland as on the coasts. Strengthening the non-academic tracks of the educational system might or might not drain away some of the prestige (in the ambivalent original sense) that accrues to private colleges, but it would probably make students and their parents less likely to waste obscene amounts of money on four years in which they’re not really learning anything they’re going to use.

    Added on 11 January: Greg Lukianoff of FIRE has a column on Reason.com about the persistence of campus speech codes that’s worth reading:

    For many, the topic of political correctness feels oddly dated, like a debate over the best Nirvana album. There is a popular perception that P.C. was a battle fought and won in the 1990s. Campus P.C. was a hot new thing in the late 1980s and early ’90s, but by now the media have come to accept it as a more or less harmless, if unfortunate, byproduct of higher education.

    But it is not harmless. With so many examples of censorship and administrative bullying, a generation of students is getting four years of dangerously wrongheaded lessons about both their own rights and the importance of respecting the rights of others. Diligently applying the lessons they are taught, students are increasingly turning on each other, and trying to silence fellow students who offend them. With schools bulldozing free speech in brazen defiance of legal precedent, and with authoritarian restrictions surrounding students from kindergarten through graduate school, how can we expect them to learn anything else?


    School’s out

    Posted by Sean at 14:09, January 7th, 2010

    Okay, so I’m as against statism as any Tea Party attendee, but can people please knock it off with the coarse, blanket hating on “Ivy Leaguers”? I know what you’re trying to say, and there’s plenty to it. Scratch the CV of a high-handed technocrat, and you’ll frequently find the letters y, a, l, and e in sequence somewhere. A lot of people dismissed Sarah Palin out of hand because her degree was from a school in Idaho. (It hardly mattered which one; no real person, this line of thinking runs, goes to Idaho for school. Unless maybe it’s ski school.) We’re all sick and tired of hearing how Obama’s Higher Being-ness is related to his bachelor’s from Columbia and law degree from Harvard. Fine. Points taken.

    But “Ivy League” refers to a specific athletic conference of eight specific schools. The University of Chicago is not Ivy League. Stanford is not Ivy League. Duke is not Ivy League. Neither is Johns Hopkins or Georgetown or Amherst or Swarthmore or any of the Seven Sisters, each of which is at least in the right region. The left-leaning big-government types churned out by those schools are every bit as ideologically committed as those churned out by Brown, but they are not Ivy League.

    Does that really matter? For the purposes of the arguments being made, not really. Nanny-statism sucks on its own terms. And yes, I went to Penn (the universally recognized safety Ivy in my day, but an Ivy nonetheless) and Columbia, and that may be the major reason I’m noticing; at the same time, I notice a lot of things that don’t particularly bother me because they don’t seem to represent any troubling tendencies. This, I think, in its own small way, does. These rants tend to come from the sort of people who get testy when you refer to Texans as “Southerners,” or mainstream Protestants as “Evangelicals,” or gun-rights supporters as militia members, and I don’t see why imprecision matters in one direction and not in the other. If sloppy name-calling is wrong, it’s wrong for everyone, and it’s good for all of us to try, even in the details, to be as scrupulously accurate as we can and not to throw terms around when we’re not reasonably sure we know what they mean.

    If you’re starting to feel as if I were stomping all over your fun, let me hasten to remind you that Barney Frank, Chuck Schumer, Al Gore, and a host of other hopeless ninnies with earl-churl complexes really did go to Harvard et al., which means you can, with perfect justification, rail at them as being Ivy League ninnies with earl-churl complexes if you like. Nancy Pelosi (Trinity), Rahm Emanuel (Sarah Lawrence [!]), Robert Gibbs (North Carolina State), and others, you will have to be content with damning for their beliefs. And really, aren’t they enough?


    You got me feeling crazy ’bout my body

    Posted by Sean at 21:57, January 5th, 2010

    E.J. Dionne is such a reliable defender of interventionist government that it’s hardly a surprise that he cares more about whether a health-care bill is passed than about trivialities such as what it will force us to do, but it’s still astonishing for him to say so quite this directly (via Hit and Run):

    The whole plan got discredited in the, in the minds of some people because the legislative process looks really awful. And the more the focus was on the legislative process, the more people said, “What’s going on here?” Once they pass a plan, you can actually talk about a plan.

    I guess it’s kind of the reverse-Foucault model of legislation: through coming into being, it allows you to start articulating it in language.

    Seriously, what next? Does the United States Congress propose to start passing bills that consist of nothing but titles identifying what they’ll be regulating, with all actual stipulations to be worked out at leisure when things aren’t so busy? One wouldn’t want to (ahem) tax such self-abnegating public servants, who are already gracious enough to boss us around for our own good in so many other areas of American life, with the necessity of being transparent about what the new health-care machine consists of before they vote on it. As Reason‘s Nick Gillespie says:

    The legislative process in this case “looks really awful” because it is really awful, filled with special deals, obfuscatory language, shady cost estimates, and worse. Barack Obama, fer chrissakes, gave a whole big talk about the need for reform a few months back, where he resolutely refused to make anything clear other than whatever happens will both taste great and be less filling.

    Gillespie snidely notes that “celebrity plagiarist and historian” Doris Kearns Goodwin was on the panel also, but I think his tone is altogether too unsympathetic: when a woman professes to be unable to figure out which note cards were her own lightning bolts of inspiration and which were copied from published sources, you can’t exactly blame her for scrupling to criticize a bill consisting of a bunch of crap of unknown provenance.

    Added after a restful night’s sleep: Here‘s the title reference, to put us all in a better mood:


    核廃絶 (Nuclear abolition)

    Posted by Sean at 21:36, January 4th, 2010

    In the Mainichi, a typically confused argument for the global abolishment of nuclear weapons (Japanese original here, though the English version is well done):

    “Every American should visit Hiroshima,” said Balbina Hwang, a Northeast Asia specialist and a senior policy advisor during the administration of President George W. Bush. She was addressing reporters after giving a lecture in Tokyo in November following a visit to the Peace Memorial Museum and other locations in Hiroshima. “I was overwhelmed by a sense of humility,” she continued. “I was struck speechless upon seeing (what happened) with my own eyes.”

    Still, what Hwang saw was not the actual reality of the atomic bomb. Witnessing an exhibit — which can only offer a hint of the actual barbarity of the bomb — with one’s own eyes will shake any American’s soul to the core. Even today, the starting line toward the elimination of nuclear weapons lies in Hiroshima and Nagasaki.

    I don’t know. I didn’t find that Hiroshima shook my soul to the core. It was very sobering. I was sorry it had to exist. I hope it’s never necessary to deploy nuclear warheads again.

    But when you read opinion pieces like this one, it’s easy to forget that when we bombed Hiroshima and Nagasaki, we were not, say, retaliating against Japan for dumping cheap Sony electronics on our markets. World War II is called that for a reason. Japan thought it could play the USSR against the other Allies and capitalize on our exhaustion. It was wrong. America decided that it was not worth sacrificing more of our people and materiel waiting for Japan’s military command to figure out in its own time that surrender not only was inevitable but had to be immediate.

    Besides, the Mainichi editors make an interesting exception to their call for everyone to start beating their nuclear warheads into ploughshares:

    A basic international roadmap should be laid out at this year’s conference. Obama has already publicly announced his goal of eliminating nuclear weapons. Though the road to that finish line may be a long one, small, visible steps are necessary along the way. If the countries that possess nuclear weapons fail to demonstrate goodwill in the upcoming meeting, discontentment toward the NPT will be further exacerbated, getting in the way of international cooperation that is crucial to the prevention of nuclear terrorism. What is most important now is the reconstruction of an international consensus toward the goal of nuclear abolishment.

    Hwang, whose area of expertise includes North Korea, remains doubtful that the rogue nation’s nuclear arsenal will be completely eliminated. She says this is because North Korea is overwhelmed by insecurity. [!] As a result, says Hwang, North Korea is demanding not only the withdrawal of all U.S. troops from South Korea, but wants the entire U.S. nuclear umbrella to be removed from East Asia, including Japan.

    We seek the complete abolition of nuclear weapons from the world, and support President Obama’s efforts toward nuclear disarmament. At the same time, however, we do not see the protection we receive from the U.S. nuclear umbrella as unreasonable as long as the North Korean threat exists, and we will not accept our allies’ admonishments to give up on North Korean nuclear disarmament.

    So using nuclear weapons for protection—in fact, outsourcing nuclear protection to another country—is okay if you’re Japan, with North Korea a stone’s throw across the sea.

    But only temporarily, you understand.

    Just until everyone agrees to disarm.

    What’s never explained is how this is supposed to happen. “International consensus” sounds great, but I don’t recall one on any issue that involved agreement by every country on the entire planet. The international community can’t even stamp out age-old rogue behaviors such as piracy in shipping lanes, drug trafficking, and currency counterfeiting. In today’s world of decentralization and advanced telecom and transport technologies, it seems quixotic at best to believe that we can ever return to a state in which we can reliably say that no malefactor has nuclear weapons. It’s a bummer to have to think that way, but as long as human beings are born with human nature, there will be some bummers we have to stare in the face and prepare to deal with forcefully. In the universe we actually inhabit, the weaned child who puts his hand on the cockatrice’s den is going to get bitten.


    Yrs. faithfully

    Posted by Sean at 13:59, January 2nd, 2010

    Tim Cavanaugh at Reason.com has been really, really, really big on sliding in the gay jokes these last few months. Not that I mind; he’s exceptionally handsome, and it’s an odd fact of life that, while gay jokes told by unattractive straight men are lame, offensive, retrograde manifestations of deep-seated sexual insecurity, gay jokes told by exceptionally handsome straight men are witty, bravely edgy, and charming. It’s interesting, though, that when the subject is Barney Frank—which is to say, when there’s a gigantic “INSERT MOST OBNOXIOUS POSSIBLE GAY JOKE HERE” sign flashing—Cavanaugh lets the opportunity pass right by, the PC coward!

    What was I saying?

    Oh. Right.

    Cavanaugh posts, not for the first time, about an article in the genre I love to hate: Technology is ruining our human relationships in ways only your humble, soulful Cassandra of a correspondent is aware of. The specimen in question, by Rachel Marsden, ends the way they all do:

    When I set up a meeting with someone, they’re the only person in the room. My friends are few and dear. I refuse to sign anything “xoxo” or “love” unless I mean it.

    Too many people seem to be grasping for ways to connect with others while rarely actually connecting in a way that has true value or significance. What so many people end up with is something that looks like a connection from the outside as they text each other a million times a day, or sign notes with “much love.” Sadly, that’s the new standard of personal value in this technological era.

    You can imagine what came before: Cell phones? Bad. Facebook? Bad. Twitter? Bad. True Intimacy with Rachel Marsden? Good.

    Singing: “Dialed about a thousand numbers lately / Almost rang the phone off the wall….”

    Besides the smugness of tone, what drives me berserk about these diatribes is the way they put the moral agency on the gadgets themselves. One of Marsden’s first sentences is “Does anyone care that technology is destroying social graces and turning people into rude jerks?” I am, I believe, of a somewhat older vintage than the lady, so maybe she doesn’t know this from first-hand experience, but there were a lot of rude jerks before cell phones and the Internet. Indeed, you can argue that technology has made it easier to escape from them. In the pre-cellular era, if some bore chatted you up in the airport departure lounge about the results of his latest colonoscopy, it was nearly impossible to shake him off without moving. You could not, after all, pretend that your magazine had suddenly started vibrating with what might be an important call from Mom.

    As for meetings, has there ever been an era in which at least half of those parked around the table in any conference room on the planet weren’t desperately angling for ways to get the hell out of there? I’m not willing to defend the constant diverting of attention away from people in the room, who have first claim on it, toward taking any and every stray call that comes in. That’s rude. It should Not be Done. However, the not-nice part of me can’t help wondering whether Marsden is the sort who loves the sound of her own voice over clicking PowerPoint slides and insists on convening a meeting when a phone call, hallway discussion, or e-mail exchange would do. Arriving late and not paying attention are ill-bred behaviors, but they long predate the gizmos in question, and sometimes there’s a useful message beneath the rudeness: there’s not enough content here to be worth my time. There are few tactful ways of saying, “This gathering is pointless—can we wrap up so everyone can go back to getting some real work done?”

    I wish that, just once, these people who bitch about how technology is wrecking everything would pin the blame squarely where it belongs: on the users. There are obligations connected to work and family that have changed in texture with the introduction of some communications technologies, sure; but otherwise, no one is forced to accept Facebook friend requests, respond to every text in three nanoseconds, or keep his cell ringer on even while asleep. The kind of person who badgers casual acquaintances on Facebook for attention is the kind of person who would have badgered casual acquaintances at a dinner party for attention in eras past, and at least now his victims can use the expedient of making his updates invisible, rather than switching to double shots of whiskey and trying desperately to look interested. Human interaction is a wonderful thing, but it’s hardly the unalloyed good Marsden and others want it to be.

    Besides, would we really want to go back to a time when you couldn’t be at the supermarket, press a button, and say, “Hi, darling. I know you’re still at the office, but can you please just remind me which brand of marinara sauce you wanted me to get before you kill me for grabbing the wrong one again?” Properly used, the communications technology we now enjoy makes a whole lot of things easier and less time-consuming so that we can actually spend more time and energy on what’s really important.

    What to do when people don’t use it properly? If they’re clearly irredeemable, shun them. If they might listen to hints, say gently that you’ll be happy to resume the conversation when they’re not so tied up with other things. If they’re irredeemable but you can’t avoid them anyway, be frostily polite to them to take the edge off your irritation and maximize the probability that they’ll pick up on the fact that something’s wrong. In my experience, these things generally work. And even when they don’t, they have to be better than working yourself into a reductive, sanctimonious froth about what a nasty, vulgarizing force Technology is.

    Added on 5 January: Thanks to Eric and to Donna B., who I believe comments at his place frequently, for linking back. Eric sticks with the do-cute-guys-say-more-interesting-things? topic, and Donna sticks with the doesn’t-technology-enable-valuable-new-types-of-communication part of the topic. Or maybe they’re talking about two entirely different topics, and this post is just incoherent. In any case, thanks to them for the links.