The Voice of Google published announcements in recent months that functionality in Google Groups would be pared back radically come January. This is not important enough to make any newspaper's front page, but I do think it's a signal of the company's impending moves in the social media space -- of the next battle in the search giant's war with Facebook.
Warning: this post starts with a shaggy dog story, but I've marked it off with a sub-heading so you can skip down to the rumor and speculation bits if you'd rather...
Shaggy Dog Story: functionality stripped from Google Groups
My on-line writer's coven uses Google Groups as a platform for discussion, critique, and other sorts of feedback on the work-in-progress of its members. Google's offering meets our needs pretty well. It allows us to restrict content-access to our group's members, upload files (usually the pages offered up for critique), and link out to other useful Google apps, like Calendar (where we maintain our schedule) and Docs (where we evolve and publish the group's guidelines).
So I was kind of irritated when Google threw an announcement over the transom in late September notifying us that file upload would no longer be supported. My irritation didn't last long, because there was a workable, if somewhat awkward fix: we created a Google Site, and -- in advance of the scheduled January shutoff of Google Groups' support for files -- began to upload to the new Site instead. (We're not uploading to Google Docs because these are not, for the most part, files that we edit collaboratively: they're pages-in-progress that writers share with other group members.)
The "Welcome Message" feature that permits linking to our posting schedule on Google Calendar easily accommodates another link from Google Groups to the new file upload URL. Groups remains the core of our on-line activity, from which we link to the ancillary stuff. Problem solved.
Only it isn't.
A few weeks after the first pare-back bulletin, Google announced that the Google Groups "Welcome Message" would go away too: "Google Groups will no longer be supporting the Welcome Message feature. Starting January 13, you won't be able to edit your welcome messages, but you will still be able to view and download the existing content."
The "Welcome Message" is a feature that allows a group to customize its Google Groups home page. We use it in a very thin, but essential way, as I mentioned above: to link to all the other Googlicious products and features we use, like Calendar, Sites, and Docs.
So Google Groups is losing the ability to upload files, create web pages, and even to have a "home" page? It's being stripped down to a bare-bones discussion forum, supporting only member management & mailing list / forum functionality? Okay, I thought. Such is the price of using services offered for free. If you get more than you pay for, you can't say much about the nature of the gift. So, I figured, we can adjust to this change too. We'll just move the center of gravity of our group from Google Groups to Google Sites.
But wait! Turns out there's no supported gadget for integrating discussion forums from Google Groups into a Google Site. Not good.
Why hadn't Google provided integration, I wondered? Looking at the Google Sites forums, it was clear that a number of people had asked for it, and had even fashioned some clumsy workarounds. I had a sneaking suspicion what was up, but I thought I'd ask ... and complain a bit ... by starting a new thread in the Google Sites user forum. In "Groups discussion widget for Sites integration" I asked for real, supported, unclumsy integration -- which, despite the workarounds and unsupported gadgets other Sites users have suggested or provided, remains an ugly hole in the Google appscape.
The thread drew some posts from other complainants, but, to date, no response from the Google-plex itself.
Birth pangs of Google's latest foray into social-media?
I suspect that stripping functionality from Google Groups is part of a repositioning of Google's offerings. That repositioning, I'd say, is part of the search giant's next steps in the ongoing battle against Facebook for dominance of eyeballs-on-the-intertubes.
In late June of this year, rumors began to appear of a new offering from the biggest search engine on Earth. "Google Me" was the rumored offering's name (which many found pretty silly-sounding, as do I). To many technology-watchers, it sounded like a Facebook rival in the making. This fall the rumor mill perked up again, with hints that "Google Me" might make its debut before the year was out.
The nature of this new offering began to emerge from the fog a couple months ago, as PC World reported, when CEO Eric Schmidt announced in mid-September that "Google will be adding 'a social layer' into its suite of search, video and mapping products."
A couple days later, the same PC World journalist, Brennon Slattery, cited unnamed sources who described that layered approach to Google's social media strategy -- something quite different from a one-stop social media destination like Facebook: "Google Me will produce an activity stream generated by all Google products. Google Buzz has been rewritten to be the host of it all. And the reason Google Buzz isn't currently working in Google Apps is because they'll use the latest Buzz to support the activity stream in Apps... All Google products have been refactored to be part of the activity stream, including Google Docs, etc. They'll build their social graph around the stream."
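That description is vague, but the shape of the rumored architecture -- many products publishing into one shared "activity stream," filtered through a social graph -- is easy to sketch. Here's a toy model in Python; every name in it is invented, and it illustrates the idea only, not any actual Google API:

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Event:
    product: str  # which product generated the event, e.g. "Docs"
    actor: str    # who did it
    action: str   # what they did

@dataclass
class ActivityStream:
    """One shared stream that every product writes into."""
    events: List[Event] = field(default_factory=list)

    def publish(self, event: Event) -> None:
        # Every product, "refactored" to emit into the same stream.
        self.events.append(event)

    def for_user(self, follows: Set[str]) -> List[Event]:
        # The social-graph part: filter the firehose down to
        # events generated by people this user follows.
        return [e for e in self.events if e.actor in follows]

stream = ActivityStream()
stream.publish(Event("Docs", "alice", "edited a document"))
stream.publish(Event("Calendar", "bob", "created an event"))
stream.publish(Event("Groups", "carol", "posted to a forum"))

# A user who follows alice and carol sees two of the three events.
for event in stream.for_user({"alice", "carol"}):
    print(event.product, "-", event.actor, event.action)
```

The point of the sketch is the inversion it makes visible: the products are peers feeding a stream, rather than one destination site hosting everything -- "an ingredient rather than a vertical platform," as the quote below puts it.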
These hints became more solid earlier this month when, as the U.K.'s Telegraph reported, Hugo Barra -- Google's director of mobile product management -- said: "We are not working on building a traditional social network platform. We do think 'social' is a key ingredient ... but we think of it more broadly. We think of social as an ingredient rather than a vertical platform."
And hence: an explanation for what's happening to Google Groups this fall. Stripping away functionality is, I'd wager, part of what Slattery's unnamed sources described as a refactoring of "all Google products" to fit, as an "ingredient," into the ill-named "Google Me."
If Google Groups delivers only discussion forum functionality, it is likely to fit more seamlessly into Google's new modes for digital interaction -- as part of a larger constellation of engagement between Google users.
I'm guessing that Groups' stripped down functionality isn't well-supported in Google Sites for a reason. Google's plan, I think, is to steer users of current Google offerings (like my writing group) into "activity streams" that are part of its new social-media strategy ... not into that flat old static web page thing that Sites enables.
Will it work out for the better as far as my writing group is concerned?
Time will tell. It's clear that Google wants to do better than Orkut in the social-media space that Facebook currently dominates. How that will fall out for mere mortals is anybody's guess.
Monday, November 29, 2010
Google signals its next social media move
Labels: Google, technology, writing
Thursday, November 25, 2010
Eating insects
(or: Does entomophagy bug you?)
A week ago today I had a solo evening free in Cambridge, Massachusetts; I'd come east for a series of work-related meetings.
I took the T from my hotel to Harvard Station, walked around the campus a bit, and then around Harvard Square. I picked up a free newspaper and found a decent-looking Thai restaurant. The chicken curry with mango looked pretty good, and when the server brought it out it tasted pretty good too. One of a pair of young Japanese women sitting at the next table turned my way to ask what my dish was called. She spoke English with a thick accent, and I'd heard her order plates of drunken noodles for both herself and her friend. The rest of their conversation had been in Japanese. I wondered whether drunken noodles was the only dish she recognized on the English-language menu, and whether she was planning to order whatever I was having next time around.
The newspaper I'd picked up to keep me company over dinner was a weekly called The Phoenix, one of many freebies in stands along the streets surrounding the campus. An article featured on the cover had caught my eye: Eat me: Delicious insects will save us all. The lead paragraph: "Insects are a more sustainable protein source than cows or pigs, they're more nutritious, and they're being taken seriously. The United Nations has thrown its weight behind insect consumption, and more and more people are recognizing that bugs could be a solution to a host of emerging problems, including world hunger and environmental woes."
It's Thanksgiving today, so why not share some of what I learned from The Phoenix on this topic and, after returning to my high-speed-internet-equipped hotel room, on the intertubes too.
Fascinating facts:
Arnold van Huis, the entomologist quoted above, really gets around. The UN's FAO site links to his research group in Holland. The U.K.'s Guardian published an article in August titled Insects could be the key to meeting food needs of growing global population and van Huis seemed to be the expert behind the curtain in that article too. But this isn't just some Dutch scientist's fetish. Follow the Guardian link, and check out the photo in that article of skewered scorpions waiting for hungry customers at a food stall in Beijing. I saw virtually the same scene when I visited Wangfujing market in that city about five years ago. This insect-eating business is for real. No, I didn't sample any myself ... in Beijing I stuck to the bin tang hu lu, skewers of candied hawthorn fruits dipped in a sugar syrup that hardens to a sweet, crackly carapace. Delicious.
The article in The Phoenix gives recipes for Roasted snack crickets à la carte, Mealworm Chocolate Chip Cookies, and a Mealworm Stir-Fry. The cricket recipe is very simple. There are only two ingredients: live crickets and salt. The Mealworm Stir-Fry was pictured, in color, in the print copy of the newspaper. Honestly? My mango curry was easier on the eyes.
Happy Thanksgiving!!
Thanks to Xosé Castro for the image of deep-fried crickets from a market near Chiang Mai, Thailand.
- The world's total meat supply quadrupled between 1961 and 2007, during which time per capita consumption more than doubled, according to the New York Times in 2008.
- "An estimated 30 percent of the earth’s ice-free land is directly or indirectly involved in livestock production, according to the United Nation’s Food and Agriculture Organization, which also estimates that livestock production generates nearly a fifth of the world’s greenhouse gases — more than transportation," according to that same NYT article.
- "To produce one kilogram of meat, a cricket needs 1.7 kilogram of feed -- significantly less than a chicken (2.2), pig (3.6), sheep (6.3), and cow (7.7)." This according to Arnold van Huis, an entomologist based at Wageningen University in the Netherlands, in a recent opinion-piece in The Scientist, to which the article in The Phoenix called my attention.
- "Additionally, the edible proportion after processing is much higher for insects -- it's 80 percent in crickets -- than for pork (70 percent), chicken (65 percent), beef (55 percent), and lamb (35 percent)." Ibid.
- The UN is really into this insects-as-food thing. Check out the Edible forest insects page, complete with video, on the site of the Food and Agricultural Organization (FAO) of the United Nations.
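Putting the last two sets of figures together gives the comparison that matters most for the sustainability argument: feed required per kilogram of *edible* meat. A quick back-of-the-envelope calculation, pairing pork with pig, beef with cow, and lamb with sheep (my reading of the two quotes, not a pairing van Huis makes explicitly):

```python
# Feed (kg) needed per kg of live-weight gain, per van Huis
feed_per_kg_live = {"cricket": 1.7, "chicken": 2.2, "pig": 3.6, "sheep": 6.3, "cow": 7.7}
# Edible proportion after processing, same source
edible_fraction = {"cricket": 0.80, "chicken": 0.65, "pig": 0.70, "sheep": 0.35, "cow": 0.55}

for animal in feed_per_kg_live:
    # Dividing live-weight feed conversion by the edible fraction
    # gives feed per kg of meat you can actually eat.
    ratio = feed_per_kg_live[animal] / edible_fraction[animal]
    print(f"{animal:8s} {ratio:5.1f} kg feed per kg edible meat")
```

Crickets come out at roughly 2.1 kg of feed per edible kilogram; cows at about 14 and sheep at 18. The gap between insects and conventional livestock is even wider than the live-weight numbers alone suggest.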
Monday, November 22, 2010
Happiness is a warm focus
Sometimes I apologize to friends or colleagues for being unable to do more than one thing at a time. As if it's a deficiency, and proves I'm stupid. In my heart-of-hearts, though, I don't feel sorry. I like focusing. And last week, on a flight to Boston, as I took off my noise-canceling headphones and came up for air from a focused edit of a report my project is preparing for a funding agency, our project director (who was sitting across the aisle) passed over a section of that morning's New York Times. He was showing me an article titled When the mind wanders, happiness also strays.
The article riffs off a study done by Harvard psychologists Matthew A. Killingsworth and Daniel T. Gilbert, published in Science magazine. The study showed that when people's minds wander, they tend to wander into unhappy territory. Therefore, people are happier if they stay focused on what's before them. They proved this with an iPhone app, naturally, because there's an iPhone app for everything.
But, hey -- sarcasm aside, really -- that means I'm not stupid, right? I'm happy!
Here's my favorite paragraph from the NYT article. It's about what people who aren't psychologists have to say on the topic of focus: "What psychologists call 'flow' -- immersing your mind fully in activity -- has long been advocated by nonpsychologists. 'Life is not long,' Samuel Johnson said, 'and too much of it must not pass in idle deliberation how it shall be spent.' Henry Ford was more blunt: 'Idleness warps the mind.' The iPhone results jibe nicely with one of the favorite sayings of William F. Buckley Jr.: 'Industry is the enemy of melancholy.'"
Perhaps you're familiar with the concept of multitasking. I don't mean the kind computers do; I mean the kind humans do, or are supposed to do according to certain management consultants. This is one of those brain-burps with staying power that purport to be about getting things done. The kind of thing that seems to ripple regularly out of business schools and into the management 'culture' of unsuspecting organizations, never mind that pretty much everybody else seems to know better.
Here's the three-sentence Wikipedia definition: "Human multitasking is the performance by an individual of appearing to handle more than one task at the same time. The term is derived from computer multitasking. An example of multitasking is listening to a radio interview while typing an email."
Later in the Wikipedia article, author and psychiatrist Edward Hallowell is quoted describing multitasking as a "mythical activity in which people believe they can perform two or more tasks simultaneously."
I'm with Dr. Hallowell on that one. I know it's mythical to me. The only 'multitasking' I'm good at is listening to music while I cook, clean, or drive. I can do those tasks while distracted because I've been doing them for so many years that they no longer involve much cognition: they're mostly reflex, muscle memory, sense-and-response. Still, I'd bet that if there were a reliable way to measure, somebody could prove that I don't cook, clean, or drive as well while listening to music as I do when the radio's off.
Managers seem to be hardwired to respond to the myth of multitasking. There was a time in the mid-1990s when every job listing I saw called out multitasking as a desirable trait in prospective employees (and I saw a lot of job listings then, because I was working in UC Berkeley's Human Resources department). Answer e-mail, juggle phone calls, edit a report, manage a few student interns, balance a budget, design a web page, analyze a contract, draft a meeting agenda, normalize a database. A day in the life...
It's a great fantasy if you're a manager: hire one worker, get multiple workers' productivity. The fantasy gets even sillier when it involves a worker whose value is directly tied to her ability to focus and think deeply about a problem. Imagining that such a worker can have her day sliced into modular chunks that can be plugged into any number of projects and problems is ... well, it's just not real.
There's all kinds of mythbusting out on the intertubes about multitasking, so it's hardly necessary for me to do a literature review on the topic. But I will give a shout-out to one of the most practical and thoughtful computer programmers who Writes About Stuff on the intertubes, Joel Spolsky, of "Joel on Software" fame. Spolsky wrote -- nearly ten years ago, in an article titled Human Task Switches Considered Harmful -- about the friction inherent in giving computer programmers more than one project to work on at a time. Spolsky knows a lot about this, because giving programmers tasks to work on is his job. I believe what he writes in this article because it is exactly aligned with my own experience (as a computer programmer, as a colleague of computer programmers, and otherwise). To wit:
"The trick here is that when you manage programmers, specifically, task switches take a really, really, really long time. That's because programming is the kind of task where you have to keep a lot of things in your head at once. The more things you remember at once, the more productive you are at programming. A programmer coding at full throttle is keeping zillions of things in their head at once: everything from names of variables, data structures, important APIs, the names of utility functions that they wrote and call a lot, even the name of the subdirectory where they store their source code. If you send that programmer to Crete for a three week vacation, they will forget it all. The human brain seems to move it out of short-term RAM and swaps it out onto a backup tape where it takes forever to retrieve."
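Spolsky's article makes the argument concrete with a scheduling example, which reduces to a small model. Here's my own sketch of it (the numbers are made up, in the spirit of his worked example): two ten-day tasks done back-to-back deliver the first result on day 10, while interleaving them day-by-day delivers nothing until day 19 -- and that's before you charge anything for the switches themselves.

```python
def finish_times(tasks, interleave, switch_cost=0.0):
    """Given task durations in days, return the day each task finishes.

    Sequential mode does tasks one after another, paying switch_cost
    once between tasks. Interleaved mode round-robins one day at a
    time, paying switch_cost on every swap.
    """
    if not interleave:
        t, finish = 0.0, []
        for i, duration in enumerate(tasks):
            t += duration + (switch_cost if i else 0)
            finish.append(t)
        return finish
    remaining = list(tasks)
    finish = [None] * len(tasks)
    t = 0.0
    while any(r > 0 for r in remaining):
        for i, r in enumerate(remaining):
            if r <= 0:
                continue
            t += 1 + switch_cost  # one day of work, plus the swap overhead
            remaining[i] -= 1
            if remaining[i] == 0:
                finish[i] = t
    return finish

print(finish_times([10, 10], interleave=False))  # [10.0, 20.0]
print(finish_times([10, 10], interleave=True))   # [19.0, 20.0] -- nothing ships until day 19
```

Nonzero switch cost makes interleaving strictly worse: sequencing pays it once, round-robin pays it every day. For programmers, Spolsky argues, that per-switch cost is enormous, which is the whole point.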
And now, according to a bunch of Harvard psychologists and their iPhone app, that same human brain, deprived of the focus it really really wants, is in a bad mood.
Who needs it? Don't multitask ... be happy.
Thanks to Andrew McMillain & Wikimedia Commons for the image of The Thinker at the San Francisco Legion of Honor.
Labels: culture, technology
Thursday, November 18, 2010
Drafting vs. editing
I'm in a transition mode. Until a couple of weeks ago I'd been revising (and revising and revising and revising) my novel project, Consequence, for ... well, let's just say it's been a few years since I "finished" a first complete draft of the ms. Because it's how I work, I was also editing as I drafted (sometimes going back over a chapter or two, sometimes taking a deep breath and running through from the beginning to wherever I was). Because it's how editing goes, there was always a bit of drafting -- new chapters, new scenes -- during the editing phase. Red pen, black pen.
This month I launched into an agent-querying phase. Now, there's no guarantee I'm done with Consequence. In fact, the best I can hope for is that an agent will take a shine to the manuscript and ask me to make changes. Then an editor will take a shine to the project and ... ask me to make more changes. The worst I can expect? That nobody in the precarious world of 21st century publishing will bite, and I'll have to decide whether and how to have at it again.
In the meantime, I'm taking index cards and post-its and journal entries I've written over the past ten or so months about the next novel project I'm contemplating, and putting flesh on what are currently some fairly skimpy bones. Do I have a name for this next project? Well, yes, sort of, but it's so provisional that I'm not going to say. Not yet. Wouldn't want to jinx it.
Drafting vs. editing: what I can tell you is that what I'm doing with index cards and post-its and journal entries now feels really different from what I was doing with manuscript pages in October.
I do like the sudden wealth of possibility -- I mean, it's fiction, anything could happen -- but drafting does feel oddly free-form after having a built structure to work in for quite a long time. I've certainly found another thing to appreciate about blogging, in that facing the blank blog-post page a couple of times a week for most of this year means that this free-form business isn't altogether foreign as I circle back to it.
But it feels soooooooooooooooooooooo slow! Thinking my way slowly slowly slowly into a new world, one detail at a time, one character's features, one scene's outline or tone, one turn of the plot, a bit of backstory ... the sudden change of tempo foregrounds the easily forgotten truth that Consequence was built up out of tiny accretions over years. Did I slip into a fantasy that it sprang up overnight, fully marshaled to face the red pen?
I canvassed my writing group last week about whether they like drafting or editing better, and in the course of some lively exchange got a mix of responses. One of our group said unequivocally that he always prefers drafting to editing, that editing is "a slow slog" for him. Others riffed on the way the process of building a work of long-form fiction involves a lot of back and forth: drafting, editing what you've drafted, letting the work settle for a bit, thinking of a brilliant new twist on the bus or in the shower, finding new scenes that need to be written, finding that pages on end were actually about building background for one's self, the writer, and don't need to be in the novel at all.
And then there's the part where the inmates take over the asylum, the characters assert themselves and determine what's next ... whether the author planned it that way or not. That, actually, can be the most fun, and the most fluid part of a story. Of course, once the characters settle back onto the page the author still has to revise. And revise again. And -- well, you get it.
At the Galleria dell'Accademia in Firenze (a.k.a. Florence, Italy), Michelangelo's four unfinished sculptures, titled "Prisoners," are displayed on either side of what amounts to the path to a soaring, domed space where the great artist's "David" is exhibited. Those rooms hold my touchstone images about what it feels like to write, no matter that Michelangelo worked in stone and I work in words, and never mind that what he achieved is infinitely greater than I can even dream of inventing. The visceral sense of emergence one has looking at these massive blocks from which the sculptor began to chip away the spalls that obscured his vision, the power rearing out of the marble as the figures are 'revealed' in stone that has itself imprisoned them, stone from which they are being freed by the sculptor's chisel ... and then the breathtaking perfection of the finished David, towering over the next room. On a very good day, editing feels like it's a part of that same game.
(If only one's prose read a fraction as heroically and beautifully even as Michelangelo's unfinished work ...)
Related posts on One Finger Typing:
Mental floss
Craft and art: erasure and accent
Aleksandar Hemon on Narrative, Biography, Language
Does a writer need a writers' group?
This month I launched into an agent-querying phase. Now, there's no guarantee I'm done with Consequence. In fact, the best I can hope for is that an agent will take a shine to the manuscript and ask me to make changes. Then an editor will take a shine to the project and ... ask me to make more changes. The worst I can expect? That nobody in the precarious world of 21st century publishing will bite, and I'll have to decide whether and how to have at it again.
In the meantime, I'm taking index cards and post-its and journal entries I've written over the past ten or so months about the next novel project I'm contemplating, and putting flesh on what are currently some fairly skimpy bones. Do I have a name for this next project? Well, yes, sort of, but it's so provisional that I'm not going to say. Not yet. Wouldn't want to jinx it.
Drafting vs. editing: what I can tell you is that what I'm doing with index cards and post-its and journal entries now feels really different from what I was doing with manuscript pages in October.
I do like the sudden wealth of possibility -- I mean, it's fiction, anything could happen -- but drafting does feel oddly free-form after having a built structure to work in for quite a long time. I've certainly found another thing to appreciate about blogging: facing the blank blog-post page a couple of times a week for most of this year means that this free-form business isn't altogether foreign as I circle back to it.
But it feels soooooooooooooooooooooo slow! Thinking my way slowly slowly slowly into a new world, one detail at a time, one character's features, one scene's outline or tone, one turn of the plot, a bit of backstory ... the sudden change of tempo foregrounds the easily forgotten truth that Consequence was built up out of tiny accretions over years. Did I slip into a fantasy that it sprang up overnight, fully marshaled to face the red pen?
I canvassed my writing group last week about whether they like drafting or editing better, and in the course of some lively exchange got a mix of responses. One of our group said unequivocally that he always prefers drafting to editing, that editing is "a slow slog" for him. Others riffed on the way the process of building a work of long-form fiction involves a lot of back and forth: drafting, editing what you've drafted, letting the work settle for a bit, thinking of a brilliant new twist on the bus or in the shower, finding new scenes that need to be written, finding that pages on end were actually about building background for one's self, the writer, and don't need to be in the novel at all.
And then there's the part where the inmates take over the asylum, the characters assert themselves and determine what's next ... whether the author planned it that way or not. That, actually, can be the most fun, and the most fluid part of a story. Of course, once the characters settle back onto the page the author still has to revise. And revise again. And -- well, you get it.
At the Galleria dell'Accademia in Firenze (a.k.a. Florence, Italy), Michelangelo's four unfinished sculptures, titled "Prisoners," are displayed on either side of what amounts to the path to a soaring, domed space where the great artist's "David" is exhibited. Those rooms hold my touchstone images about what it feels like to write, no matter that Michelangelo worked in stone and I work in words, and never mind that what he achieved is infinitely greater than I can even dream of inventing. The visceral sense of emergence one has looking at these massive blocks from which the sculptor began to chip away the spalls that obscured his vision, the power rearing out of the marble as the figures are 'revealed' in stone that has itself imprisoned them, stone from which they are being freed by the sculptor's chisel ... and then the breathtaking perfection of the finished David, towering over the next room. On a very good day, editing feels like it's a part of that same game.
(If only one's prose read a fraction as heroically and beautifully even as Michelangelo's unfinished work ...)
Related posts on One Finger Typing:
Mental floss
Craft and art: erasure and accent
Aleksandar Hemon on Narrative, Biography, Language
Does a writer need a writers' group?
Monday, November 15, 2010
Matrixed Higher Education
Henry Kissinger famously noted (even though the fame doesn't justifiably belong to him) that "academic politics are so vicious precisely because the stakes are so small." I don't think that's what's going on in the on-line education kerfuffle.
In June I blogged about the effort UC Berkeley law school dean Christopher Edley has been leading to "pilot" on-line learning dispensed by a leading research university. That effort is moving ahead on schedule, as reported by the Chronicle of Higher Education in May, with a call for proposals for up to 25 pilot courses, according to a UC-published puff piece earlier this month. But, bowing to the kinds of faculty balkiness that Dean Edley warned of when he worried that "the coalition of the willing among frontline faculty who would like to pursue this idea will be stopped dead in their tracks by the bureaucracy," this month's announcement from UC's central "Office of the President" devoted over 25% of its word count to a section titled "Faculty concerns to be studied."
That was enough backpedaling for Nanette Asimov to write an article for the San Francisco Chronicle two days later, titled UC leaders downplay plans for online courses. As she spins it, "University of California students will be able to enroll in the schools' first top-tier, UC-quality online courses by January 2012, but UC officials have strongly scaled back their expectations about what such courses can achieve. [...] the new effort was intended to see whether UC might pull off a fully online degree program as rigorous as what the selective university offers in its classrooms. For now, that plan is on hold."
In a related item, did you catch Bill Gates on the topic of technology replacing higher ed, at the Techonomy Conference 2010, in August? "The self motivated learner," Gates predicted over the summer, "will be on the web, and there will be far less place-based [education]. [...] College -- except for the parties -- needs to be less place-based. Place-based activity in that college thing [sic] will be five times less important than it is today."
Sharp-eyed monitors of zeitgeist-past will recall that in the mid-1970s Bill Gates dropped out of a small, well-endowed institution of higher education called Harvard, based in a place known as Cambridge, Massachusetts.
A statistical aside. Census figures for October 2008 show 11,378,000 students enrolled as undergraduates in institutions of higher education. Harvard College currently enrolls 6,655 undergraduates. Dividing the oranges by the apples, we can guesstimate that Harvard enrolls roughly one out of every 1,710 college students in the United States. The fact that Gates was admitted to Harvard in the first place means -- wait for it -- he was never your typical student. That he and Facebook founder Mark Zuckerberg both dropped out of Harvard and still became billionaires doesn't predict much about education as it applies to the rest of us. I will say straight out that I am grateful for my place-based education, and grateful that nobody tried to take it away from me, stick me in front of a One Laptop Per Child screen, and call it an 'equivalently' rigorous experience.
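For anyone who wants to check the oranges-divided-by-apples guesstimate above, here is the whole calculation in a few lines of Python (a sketch only; the enrollment figures are the ones quoted in the paragraph, not independently verified):

```python
# Back-of-the-envelope ratio: US undergraduates (Census, Oct. 2008)
# divided by Harvard College undergraduates (figure quoted above).
us_undergrads = 11_378_000
harvard_undergrads = 6_655

ratio = us_undergrads / harvard_undergrads
print(f"Harvard enrolls roughly one out of every {round(ratio)} US undergraduates")
```

The point survives any rounding choice: whatever the exact quotient, Harvard undergraduates are a vanishingly small slice of American college students.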
See, here's the thing.
The CHE reports that Suzanne Guerlac, a UC Berkeley faculty member in French, believes that "[o]ffering full online degrees would undermine the quality of undergraduate instruction [...] by reducing the opportunity for students to learn directly from research faculty members. 'It's access to what?' asked Ms. Guerlac. 'It's not access to UC, and that's got to be made clear.'" I'd like to second that opinion.
Look at and listen to the Bill Gates video embedded above. All three minutes of it. Listen to the part where he talks about the value of a "full immersion environment" to education. He's careful to speak of this value only in the context of elementary and secondary schooling, which he likens to baby-sitting in one breath, before turning around in the next and saying that the "full immersion environment" is what enables students to learn successfully. Wha??? It's babysitting and it's the most effective form of education?
Notice that he doesn't actually explain why he thinks this full immersion stuff works for younger students but suddenly loses its applicability when kids reach college level. Pay close attention to the special hand-waving transitions at 1:14-1:16 ("That's very different than saying, okay, for college courses...") and 1:34-1:36 ("The self-motivated learner will be on the web..."). Why is it different? What about the learner (I hate that word) who is not self-motivated? Later in the video he's making a case for on-line education because it's cheaper, not because it is an effective mode of education.
It would be an ideologue's mistake to say there's nowhere along the Spectrum of What's Good that on-line education fits. The thing to remember, though, is that on-line learning is not the same thing as face-to-face, interactive experience with peers and teachers. Sure, it may be good for teaching some things. Maybe even some useful things, probably better in some areas than others. But what of the way students learn from the other students in a seminar or discussion section, and from the interactions those students have with a professor or a graduate student instructor?
On-line education isn't visceral, and it lacks the enormous incentives to learning inherent in synergies, competitiveness, sympathy and other forms of interpersonal zing that crop up in engaged rooms full of students and teachers. Ever wonder why people pay $40 or $80 or $120 for two or three hours of live theatre (or sports, or music), then complain about a cable TV bill that costs that much per month? It's the value, people.
And therein lies the issue behind the issue.
Across the United States, public funding for higher education is being pared back. In many states, this is true for elementary and secondary school education as well. In September of last year, UC Berkeley Chancellor Robert J. Birgeneau and Vice Chancellor Frank D. Yeary wrote an op-ed piece for the Washington Post. They argued that "Public universities by definition teach large numbers of students and substantially help shape our nation," describing how the top ten public universities enroll 350,000 undergraduates compared to a sixth as many for the eight Ivy League schools. They argued that strong public support for public higher ed means that public universities "have an admirable cross-section of ethnically and economically diverse students. In essence, their student bodies look like America. They are the conduits into mainstream society for a huge number of highly talented people from financially disadvantaged backgrounds, as well as the key to the American dream of an increasingly better life for the middle class."
Then they describe how, "over several decades there has been a material and progressive disinvestment by states in higher education. The economic crisis has made this a countrywide phenomenon, with devastating cuts in some states, including California. Historically acclaimed public institutions are struggling to remain true to their mission as tuitions rise and in-state students from middle- and low-income families are displaced by out-of-state students from higher socioeconomic brackets who pay steeper fees."
Public funding for higher ed is being pared back internationally too. The government of the U.K., as part of a drastic package of austerity measures outlined last month, has proposed "to cut education spending and steeply increase tuition for students," drawing tens of thousands of protesters to London, as reported by the NY Times on 10 Nov.
Why is this happening?
- Is it to hurry along the well-documented shift in wealth from working- and middle-classes to an economic elite?
- Does it follow from that shift in wealth that a smaller cohort of educated workers is required to drive the economy?
- Is public divestment from higher ed an effect of economic globalization, in which the lion's share of 'the smarts' doesn't need to come from the United States and Europe any more because nations like China and India are supposed to be doing such a bang-up job educating their vast populations, and business owners can pay lower wages for their labor?
- Is it an effect of 'higher productivity,' in which fewer workers plus better technology produce greater wealth -- and -- I can never get this straight what with the competing narratives -- does this mean that there are fewer jobs to go around or that this magically-increasing wealth is somehow supposed to lift all boats?
- Is diminishing public support for higher ed an effect of fetishizing "the free market," an ideology that has been applied with an overly broad and absolutist brush?
I don't know the answers to these questions, but I'm inclined to believe they're what's driving the stampede to 'on-line education' in the U.S. and abroad ... not because it's better, but because it's cheaper, and because some groups of policymakers imagine that diminishing the quality of your average citizen's education is, well, not a big deal. Not to them.
It's not the availability of technology -- it's the ends that social, economic, and political forces are driving toward, using technology as a means. You can't build a case on a three minute video of Bill Gates talking at a conference up at Lake Tahoe, but the brand of "vision" without substance that he was espousing earlier this year does raise my antennae.
There's something about on-line education -- and the agendas that may be driving it -- that makes me nervous. I don't know about you, but I'm not so keen on living in a world engineered and operated by people whose educational background and trained compliance could have been lifted straight out of The Matrix. Let alone governed by some shadowy Architect!
Labels:
higher ed,
politics,
technology
Thursday, November 11, 2010
Dystopias in fiction
A couple of weekends ago I saw Mark Romanek's film adaptation of Never Let Me Go, based on the novel by Kazuo Ishiguro (2005). I hadn't read the book before, but I have since.
Both the novel and film are a haunting take on a recurring trope in fiction: portrayal of an imagined, dystopian present or future. Examples abound. I'd say the most prominent among 20th century novels in English are Aldous Huxley's Brave New World (1932), and George Orwell's Animal Farm (1945) & 1984 (1949). On the Beach (1957) by Nevil Shute is another that most of my contemporaries read in high school. Margaret Atwood's The Handmaid's Tale (1985) and Oryx and Crake (2003), in a category the author calls "social science fiction," fit the dystopian bill; I still have Atwood's dystopian follow-up to Oryx and Crake, The Year of the Flood (2009) on my own to-read list. Harlan Ellison's A Boy and His Dog (1969), José Saramago's Blindness (1995), Cormac McCarthy's The Road (2006) ... the list goes on and on.
The premise of Ishiguro's novel, and of the film that hews closely to its dramatic arc and dry-eyed affect, is that human cloning became possible soon after World War II, and that human clones -- including the narrator, Kathy H., and her closest friends, Tommy and Ruth -- are raised for the exclusive purpose of providing "donations" of vital organs upon reaching maturity. After some number of "donations," two or three or four for most, the clones "complete." Of course the "donations" are not donations at all, as there is no free will whatsoever involved in the "donating" ... the cloned characters receive notices from some vaguely off-stage "they" and proceed to "centers" where their bodies are harvested. "Complete" is whitewash for "slaughter."
I found Never Let Me Go haunting for a couple of reasons. First, there's the present-timeness of the story. The science is fictional of course -- it will be the 21st century, not the 20th, that sees human cloning -- but by constraining point of view to the world as it appears from the narrator's isolated circle, the details are conveniently and convincingly allowed to remain fuzzy. The setting is England in the late 1990s, and readers and viewers are inexorably drawn into the conceit that this is the world we currently inhabit. Second, and for me most chilling, there's the utter lack of resistance to the fate that has been ordained for these people. (They are people. The novel is significantly concerned with the non-cloned world's exploration of whether these "creatures" are human, whether they have souls; but the reader is not left to wonder at all, as the view through Kathy H.'s eyes is as particular and humane as the view through our own.)
Tommy falls into inchoate rages. That's as close to railing against their inevitable slaughter as these poor "creatures" get ... Ishiguro gives us not even a whiff of organized resistance. It doesn't occur to the clones that resistance is possible. The characters in this novel are raised as farm animals, destined for an antiseptic abattoir. They accept their fate as if they were sheep or cows. I can't be the only person who "completed" this story filled with a queasy uncertainty about my future as a carnivore.
Ishiguro's fellow Booker Prize winner Margaret Atwood reviewed Never Let Me Go for Slate soon after the novel was published; her review was titled Brave New World: Kazuo Ishiguro's novel really is chilling. She wrote: "It's a thoughtful, crafty, and finally very disquieting look at the effects of dehumanization on any group that's subject to it. In Ishiguro's subtle hands, these effects are far from obvious. There's no Them-Bad, Us-Good preaching; rather there's the feeling that as the expectations of such a group are diminished, so is its ability to think outside the box it has been shut up in. The reader reaches the end of the book wondering exactly where the walls of his or her own invisible box begin and end."
Indeed.
At the time I saw and read Never Let Me Go I was in the middle of a polishing edit of my own current fiction project, Consequence, prior to sending out queries to literary agents. It was therefore natural to think about Ishiguro's work in relation to mine, as well as to the dystopian novels that I listed at the start of this post.
Consequence is also concerned about collateral damage that has been and will be caused by genetic engineering. Two differences between Consequence and the world Ishiguro portrays have a lot to do with why I found his work so haunting and unsettling. First, Consequence is set in the early 21st century and the deep moral and ecological compromise with which cloning (humans, other animals, bacteria, and plants) will taint our world has yet to be realized, or realized fully. There's still some hope that we won't catastrophically fumble this one. Second, the characters in Consequence are all about preventing that dystopian taint. My novel portrays activists resisting a fate that vast, impersonal forces are trying to impose on all of us. Ambling peacefully into the slaughterhouse -- as cows, sheep, and Ishiguro's characters do -- is a dystopia to which my characters are certain they will never submit.
At the end of the film Never Let Me Go -- but not of Ishiguro's novel -- the narrator Kathy H. is thinking in voice-over: "What I'm not sure about is whether our lives have been so different from the lives of the people we save. We all end up dying. And none of us really understand what we've lived through. Or feel we've had enough time."
It's a Hollywood ending, telling the viewer so directly what to think, how to package the preceding 103 minutes, how to shrug off any implicit call to active engagement in political culture (which wouldn't leave nearly enough time to consume film and television industry product, now, would it?). Ishiguro was an Executive Producer of the film, so it would be pretty risky to guess that he had nothing to do with the wrap-up.
The novel ends more ambiguously, and thoughtfully. It is evocative of what the "creatures" who inhabit this story have been denied. It left this reader straining against the limits and fate that Kathy H., and Tommy, and Ruth placidly accept. Ishiguro's novel absolutely does not make it all okay. That fits my sensibility much better than the film's ending -- to which, I admit, I responded with tears, but only because I'm constitutionally a sentimental fool, not because I prefer or permit myself to create sentimental art.
I recommend Never Let Me Go very highly. I also recommend that you do not make the mistake I did: you should read the book first.
I hope it will stir you to make the world you want to live in, rather than accept a predetermined fate. I hope too that it will predispose you to pick up Consequence in your local bookstore ... someday.
Thanks to hugefluffy for the photo of Kazuo Ishiguro signing his novel, Never Let Me Go in March 2005. Yeah, I know, it's out of focus & a bit shaky besides.
Both the novel and film are a haunting take on a recurring trope in fiction: portrayal of an imagined, dystopian present or future. Examples abound. I'd say the most prominent among 20th century novels in English are Aldous Huxley's Brave New World (1932), and George Orwell's Animal Farm (1946) & 1984 (1949). On The Beach (1957) by Nevil Shute is another that most of my contemporaries read in high school. Margaret Atwood's The Handmaid's Tale (1985) and Oryx and Crake (2003), in a category the author calls "social science fiction," fit the dystopian bill; I still have Atwood's dystopian follow-up to Oryx and Crake, The Year of The Flood (2009) on my own to-read list. Harlan Ellison's A Boy and His Dog (1969), Jose Saramago's Blindness (1999), Cormac McCarthy's The Road (2006) ... the list goes on and on.
The premise of Ishiguro's novel, and of the film that hews closely to its dramatic arc and dry-eyed affect, is that human cloning became possible soon after World War II, and that human clones -- including the narrator, Kathy H., and her closest friends, Tommy and Ruth -- are raised for the exclusive purpose of providing "donations" of vital organs upon reaching maturity. After some number of "donations," two or three or four for most, the clones "complete." Of course the "donations" are not donations at all, as there is no free will whatsoever involved in the "donating" ... the cloned characters receive notices from some vaguely off-stage "they" and proceed to "centers" where their bodies are harvested. "Complete" is whitewash for "slaughter."
I found Never Let Me Go haunting for a couple of reasons. First, there's the present-timeness of the story. The science is fictional of course -- it will be the 21st century that sees human cloning, it was not the 20th -- but by constraining point of view to the world as it appears from the narrator's isolated circle the details are conveniently and convincingly allowed to remain fuzzy. The setting is England in the late 1990s, and readers and viewers are inexorably drawn into the conceit that this is the world we currently inhabit. Second, and for me most chilling, there's the utter lack of resistance to the fate that has been ordained for these people. (They are people. The novel is significantly concerned with the non-cloned world's exploration of whether these "creatures" are human, whether they have souls; but the reader is not left to wonder at all, as the view through Kathy H.'s eyes is as particular and humane as the view through our own.)
Tommy falls into inchoate rages. That's as close to railing against their inevitable slaughter as these poor "creatures" get ... Ishiguro gives us not even a whiff of organized resistance. It doesn't occur to the clones that resistance is possible. The characters in this novel are raised as farm animals, destined for an antiseptic abattoir. They accept their fate as if they were sheep or cows. I can't be the only person who "completed" this story filled with a queasy uncertainty about my future as a carnivore.
Ishiguro's fellow Booker Prize winner Margaret Atwood reviewed Never Let Me Go for Slate soon after the novel was published; her review was titled Brave New World: Kazuo Ishiguro's novel really is chilling. She wrote: "It's a thoughtful, crafty, and finally very disquieting look at the effects of dehumanization on any group that's subject to it. In Ishiguro's subtle hands, these effects are far from obvious. There's no Them-Bad, Us-Good preaching; rather there's the feeling that as the expectations of such a group are diminished, so is its ability to think outside the box it has been shut up in. The reader reaches the end of the book wondering exactly where the walls of his or her own invisible box begin and end."
Indeed.
At the time I saw and read Never Let Me Go I was in the middle of a polishing edit of my own current fiction project, Consequence, prior to sending out queries to literary agents. It was therefore natural to think about Ishiguro's work in relation to mine, as well as to the dystopian novels that I listed at the start of this post.
Consequence is also concerned with the collateral damage that has been and will be caused by genetic engineering. Two differences between Consequence and the world Ishiguro portrays have a lot to do with why I found his work so haunting and unsettling. First, Consequence is set in the early 21st century, when the deep moral and ecological compromise with which cloning (of humans, other animals, bacteria, and plants) will taint our world has yet to be realized, or realized fully. There's still some hope that we won't catastrophically fumble this one. Second, the characters in Consequence are all about preventing that dystopian taint. My novel portrays activists resisting a fate that vast, impersonal forces are trying to impose on all of us. Ambling peacefully into the slaughterhouse -- as cows, sheep, and Ishiguro's characters do -- is a fate to which my characters are certain they will never submit.
At the end of the film Never Let Me Go -- but not of Ishiguro's novel -- the narrator Kathy H. is thinking in voice-over: "What I'm not sure about is whether our lives have been so different from the lives of the people we save. We all end up dying. And none of us really understand what we've lived through. Or feel we've had enough time."
It's a Hollywood ending, telling the viewer so directly what to think, how to package the preceding 103 minutes, how to shrug off any implicit call to active engagement in political culture (which wouldn't leave nearly enough time to consume film and television industry product, now, would it?). Ishiguro was an Executive Producer of the film, so it would be pretty risky to guess that he had nothing to do with the wrap-up.
The novel ends more ambiguously, and more thoughtfully. It is evocative of what the "creatures" who inhabit this story have been denied. It left this reader straining against the limits and the fate that Kathy H., Tommy, and Ruth placidly accept. Ishiguro's novel absolutely does not make it all okay. That fits my sensibility much better than the film's ending -- to which, I admit, I responded with tears, but only because I'm constitutionally a sentimental fool, not because I prefer or permit myself to create sentimental art.
I recommend Never Let Me Go very highly. I also recommend that you do not make the mistake I did: you should read the book first.
I hope it will stir you to make the world you want to live in, rather than accept a predetermined fate. I hope too that it will predispose you to pick up Consequence in your local bookstore ... someday.
Thanks to hugefluffy for the photo of Kazuo Ishiguro signing his novel Never Let Me Go in March 2005. Yeah, I know, it's out of focus & a bit shaky besides.
Monday, November 8, 2010
Making a world where queer kids thrive
A lot of words have been written since Justin Aaberg, Billy Lucas, Asher Brown, Seth Walsh, and Tyler Clementi took their own lives after being subjected to callous humiliation and, in most cases, outright hostility by their peers. These young men ranged in age from 13 to 18, and lived in Minnesota, Indiana, Texas, California, and New Jersey, respectively. Justin Aaberg took his own life in July; the other young men committed suicide in September. Each of them identified himself as or was perceived by others to be gay.
I haven't attempted a real analysis, but I'd say most of what I saw and read about this harrowing series of suicides focused on the bad behavior of those who humiliated and tormented these five young men.
Let's not beat around the bush: that behavior was bad. It was awful. Those who humiliated and tortured Justin, Billy, Asher, Seth, and Tyler will have blood on their hands for the rest of their lives. Should they be legally punished for this bad behavior? I don't know, and I haven't got much riding on cookie-cutter answers to such a crude (as in not-nuanced) question. Whether or not or how severely they are punished doesn't change the fact that they must now and for decades to come bear responsibility for the fatal consequences of their actions, which cannot be undone.
But bad behavior is not what I want to focus on. I want to focus on three good things.
First good thing: cultural change can strengthen kids against the inevitable predation of bullies. Second: parents can help to better proof their kids against a culture in which those changes have not yet been realized. And third: kids can and do actively claim their right to be who they are.
None of the ideas in this blog post are original. In fact, I'm going to render what I think needs to be said in other people's words. This post is not about being the first to come up with an idea; it's about recognizing what's essential amid all the hullabaloo -- about separating the wheat from the chaff.
Cultural change
Richard Kim blogged on The Nation's site on October 6th. He eloquently separated wheat from chaff and needed fewer than 1500 words to do the job.
The chaff, with respect to Rutgers University students Dharun Ravi and Molly Wei (who used a webcam to spy on Tyler Clementi having sex with another man days before he jumped from the George Washington Bridge): "What Ravi and Wei did was immature, prurient and thoughtless; it undoubtedly played some role in what became an awful, awful tragedy. That they acted with homophobic malice, that they understood what the consequences of their actions might be, or that their prank alone, or even chiefly, triggered Clementi's suicide is far less clear."
And the wheat: "So when faced with something so painful and complicated as gay teen suicide, it's easier to go down the familiar path, to invoke the wrath of law and order, to create scapegoats out of child bullies who ape the denials and anxieties of adults, to blame it on technology or to pare down homophobia into a social menace called "anti-gay bullying" and then confine it to the borders of the schoolyard. It's tougher, more uncertain work creating a world that loves queer kids, that wants them to live and thrive. But try -- try as if someone's life depended on it. Imagine saying I really wish my son turns out to be gay. Imagine hoping that your 2-year-old daughter grows up to be transgendered. Imagine not assuming the gender of your child's future prom date or spouse; imagine keeping that space blank or occupied by boys and girls of all types. Imagine petitioning your local board of education to hire more gay elementary school teachers."
There will always be people -- kids and adults -- who commit cruel acts against children and youth. Some of these acts will be deliberate and conscious. Others will be thoughtless, callow, inadvertent, stupid, or fueled by antediluvian cultural values. What can be done? To the extent possible, we can each help to build a world in which kids understand that cruel acts committed by cruel and callow people are exceptional and wrong. That these acts are not the whole of the known universe. That the world is better than that, and they can be part of it. That's what Dan Savage's It Gets Better Project has been about. If you haven't heard of it, follow the link (and ask yourself: where have I been?). By speaking person-to-person across the internet through the medium of videos, the project does an end run around isolationists, reaching kids who need to know that queer people grow up to live satisfied, productive, emotionally rich lives in communities of their own choosing.
If there's a heaven, Dan Savage has secured a place in it, make no mistake.
Speaking of heaven, let's not pretend for a moment that cultural change will go down easy. This weekend, the NY Times published an article In Efforts to End Bullying, Some See Agenda. As they have since time immemorial, social conservatives -- a.k.a. people who try to isolate kids from truths they need to know -- are whipping out scripture and ferreting out the hateful passages. In Helena, Montana, school officials proposed "new guidelines for teaching about sexuality and tolerance. They proposed teaching first graders that 'human beings can love people of the same gender,' and fifth graders that sexual intercourse can involve 'vaginal, oral or anal penetration.'" In the actual real world that human beings inhabit, these are facts. The guidelines propose teaching about things that happen. Indeed, these are things that happen with significant frequency, often between people who make considered moral choices about them.
So what does one mother quoted in the NY Times story think about teaching facts to children? "'Anyone who reads this document can see that it promotes acceptance of the homosexual lifestyle,' one mother said at a six-hour school board meeting in late September." And Pastor Rick DeMato explains: "Of course we’re all against bullying. But the Bible says very clearly that homosexuality is wrong, and Christians don’t want the schools to teach subjects that are repulsive to their values."
Well -- leaving aside the further evidence that socially conservative Christians have a deceitful habit of pretending they speak for all Christians, which is absurd on its face but not the topic of this post -- there's the rub.
Anybody can say they're against bullying. But in communities that actively refuse to teach or accept that it takes all kinds to make a world -- communities that refuse to pull difference out from under what Richard Kim called "the denials and anxieties of adults" -- humiliation and torment breed like sewer rats. History has taught that lesson over and over again.
Being a good parent
A week ago today, a Midwestern mother of three who blogs as "Nerdy Apple Bottom" posted a piece titled My son is gay. If I were put in charge of assigning required reading for parents of young children, I would assign this blog post.
The post begins, starting with the title: "My son is gay. Or he’s not. I don’t care. He is still my son. And he is 5. And I am his mother. And if you have a problem with anything mentioned above, I don’t want to know you."
In short, an adorable boy, age five, chose to dress up as a cartoon character for Halloween this year. The cartoon character is female. When the blogger escorted her adorable five-year-old to preschool on Halloween, dressed as he chose, a number of other mothers whose children attend the preschool tried to make the boy and his mother feel very, very icky because they are so positively certain it's wrong for a boy to choose to dress as a girl. On Halloween, mind you.
(Is it worth noting, as the blogger does, that her child's preschool is a church-hosted program? I'm not sure. As implied above, I am not one of those who thinks all religion is tainted with hatred and conservatism, or that all religious people are, at bottom, haters. I think ideas like that are ridiculous and patently false. And, indeed, Nerdy Apple Bottom proves this in her post: both she, who loves her child as he is, and the mothers who spew denial and anxiety as though possessed all send their kids to the same church preschool. It ain't about religion. It's about bigotry.)
Here's how Nerdy Apple Bottom ended her blog post: "If he wants to carry a purse, or marry a man, or paint fingernails with his best girlfriend, then ok. My job as his mother is not to stifle that man that he will be, but to help him along his way. Mine is not to dictate what is 'normal' and what is not, but to help him become a good person. I hope I am doing that. And my little man worked that costume like no other. He rocked that wig, and I wouldn't want it any other way."
'Nuff said.
Claiming a place in the world
Sayre Quevedo is a 17-year-old who lives in the same city I do: Berkeley, California. (I do not know him.) He's a reporter with Youth Radio, and he published an op-ed piece in the San Francisco Chronicle on October 26th. Sayre was nine years old when he came out to his mother. She asked him if he was sure, Sayre wrote, then said, "Well, I'm happy for you, honey. Now get ready for bed."
(Word to the wise: that's not the way it goes down for most kids who come out to their parents. Heck, I waited until I was 23 years old and it wasn't that way for me.)
Sayre wrote about supportive experience in his life, and also about tough times, when other kids screamed at him that he was "sick" and was "going to hell." He wrote that he likes Dan Savage's "It Gets Better" campaign, because it's something that can make a difference during those tough times. Most importantly, he wrote about what he has found and made to claim his place in the world: "But what made the difference was the support system that had my back. My friends, my family, my Gay-Straight Alliance and the staff at my school who I knew would enforce rules regarding homophobia."
Sayre Quevedo doesn't live in a perfect world. Kids in his school scream hate and scorn in his face. But he's got the support at home (see previous section) and in his community (see the section before that) to understand that hate and scorn do not define the borders of his world. In fact, while Sayre has to wade through hate and scorn and other forms of human idiocy -- because he's human like the rest of us -- he does not accept that hate and scorn have any rightful place at all in his world.
This is a lesson to all of us.
Related posts on One Finger Typing:
Toxic fundamentalism here at home
Thanks to jglsongs for the photo of the Gay-Straight Alliance school bus from Seattle Pride, 2008.
Thursday, November 4, 2010
Evolution of the college textbook, redux
There was a lot of buzz in late February about a publishing platform called Dynamic Books that Macmillan was rolling out, timed to launch with about 100 titles students would be able to purchase for Fall term of this year, according to a Publishers Weekly article.
If you go to the Dynamic Books catalog page today, you'll find two Economics textbooks listed, one each in Life Sciences and Math/Statistics, and a dozen in Physical Sciences. That's a total of sixteen. And guess what? Well into Fall term 2010, only one of these books is for sale -- the rest are "coming soon."
No surprise, right? Things take longer than Marketing VPs might hope before they actually gain traction.
Since last week the blogosphere has been lighting up over a new development in the world of wishing universities and the students who are their raison d'être would help e-books make inroads into a market worth something on the order of $7-12 billion (depending whose numbers you trust). The brave new idea? How 'bout if "colleges require students to pay a course-materials fee, which would be used to buy e-books for all of them" -- to quote a 24 October article in The Chronicle of Higher Education. And why would such a measure be necessary? Because "students tend to be more conservative when choosing required materials for their studies. For a real disruption in the textbook market, students may have to be forced to change."
Wow.
I work as a staff member at UC Berkeley, and I have a cousin who is currently an undergraduate at Cal. That means I have an up-close view of the pain students and their families are feeling due to an unrelenting series of tuition increases over the past several years (32% last year alone), as state support for higher ed has dwindled. The California State University system was in the news (again) just last week: its trustees are considering increases of 15.5% this year and next, just as they warned was likely when the last increase was announced this past June.
Part of my up-close view of students' pain comes via Facebook's Questions feature. Because I'm FB-friends with my Cal-enrolled cousin, I see questions stream by from her friends who are desperately seeking copies of a textbook someone else might have from a prior semester so that they can share or buy or rent or something as they study for exams in a class whose books they can't afford to purchase. Ouch.
I think it's pretty certain that students who are "forced to change" to a system that charges mandatory course-materials fees on top of rapidly escalating tuition costs are going to be ... unhappy, let's say. I think this is probably so even though the administrators who are experimenting with such models are aiming at lowering textbook costs for students, which is a very good thing.
It'll be interesting watching this one play out.
E-textbooks are certainly coming. Whether it's textbook publishing platforms that make it easier for professors to contribute to and pick-and-choose from available textbook material; or professors who choose books students can opt to purchase in electronic form; or administrators who ram e-texts down impoverished students' throats, ready or not ... one way or another, the future is nigh.
But university administrators are going to have to make a very clear case -- to students -- for near-term savings for students who are subjected to course-materials fees or similar mandatory schemes. Otherwise you can bet the building take-overs won't be reported from the Berkeley campus alone...
Thanks to Janne Huttunen for making her photo of the November 2009 occupation of Berkeley's Wheeler Hall available on Flickr under a Creative Commons Attribution-NonCommercial license.
Monday, November 1, 2010
Californians aren't whores
I'm going to risk it. I know, it got the Chicago Tribune in trouble back in 1948, when they called the presidential election for Dewey over Truman (Dewey who?). But sometimes you have to step up, throw caution to the winds, and hope like hell the pollsters got this one right. If the Goddess of Democracy smiles on California, Meg Whitman will be a has-been by the end of the week. She'll be the candidate who dug herself the most expensive political grave in history.
Fox News was so proud last month to trumpet a garbled recording that proved somebody in the office of once-and-future governor Jerry Brown called his super-rich, opportunist opponent "a whore." In all the brouhaha, though, both the crass Brown campaign and the reactionary media got it backward. Meg Whitman's not a whore. If the concept applies at all, she's a john. The thing is, Meg Whitman's campaign -- into which she has poured $141 million of her personal wealth -- has been organized around the principle that California voters, as a group, are all whores. And that we're cheap whores at that: Meg's spent eight bucks and change on each of California's registered voters (figuring from the Secretary of State's stats as of 18 October). Even if she spent judiciously, targeting only the half plus one of Californians whose vote she's, um, soliciting -- we're still talking less than a Jackson.
Meg ... I'm sorry. Even in loose and licentious California you can't bid that low and expect to win the auction.
I'm generally pretty cynical about elections -- though turnout figures prove that more than 60% of eligible voters are even more cynical than I am in midterm election years this far into the century: they don't even show up at the polls. Still, it would be nice to think that Meg Whitman's polling makes it safe to infer that there are limits to the insult that citizens will swallow.
No, Jerry Brown isn't many people's hero. Not by a long shot. But when some rich tyrant stirs from her lethargy as a citizen and attempts to buy her way into the CEO position of the 8th largest economy in the world, it's comforting to see that most people get that a lifelong politician, wrung out and flip-floppy as he may be, is going to be a lot more competent at actual governance than someone who couldn't be bothered to cast a vote for most of her adult life.
I voted by mail last week. But I'm looking forward to tomorrow night, when we all get to see what a $141 million concession speech looks like.
Thanks to Wikimedia Commons for the photo.
Labels:
politics
Subscribe to:
Posts (Atom)