Here are a couple of unrelated Amazon stories, neither of which really seems to merit a full post by itself. First, Betanews reports that it is indeed possible to root the Kindle Fire HD so it can run unrestricted Android, using an Android 4.0 exploit that Amazon forgot to fix. Enjoy it while you can; now that it’s widely known, Amazon will probably close it in the next patch. It’s worth remembering that a locked bootloader doesn’t necessarily mean rooting is impossible.
And it turns out that Amazon’s restrictions on installing home screens on the Kindle Fire HD aren’t quite as strict as Nate had previously reported. There is at least one home screen app which can bypass Amazon’s restrictions; unfortunately it costs around $4 and cannot be bought in the Amazon Appstore.
In other news, as part of the discovery process for the defense it is putting together for its agency pricing trial next year, Apple is trying to subpoena records from Amazon relating to the interviews Amazon had with the Department of Justice concerning its e-book pricing practices during the agency pricing investigation. Amazon has filed in Seattle federal court attempting to quash the subpoena, and Apple is trying to get the motion transferred to New York so Judge Cote can rule on it.
People sure are curious to see what Amazon had to say to the Department of Justice, aren’t they? Bob Kohn was going on about this, too, in his court filings and his amicus curiae comic book. Perhaps Amazon is so close-mouthed about its practices in general that the chance to find out what it said seems irresistible. It will be interesting to see if Apple succeeds where Kohn failed.
Entertainment moguls Scott Rudin (a film and theater producer) and Barry Diller (chairman of IAC/InterActiveCorp and former CEO of Paramount Pictures, Fox, etc.) are getting together with publishing executive Frances Coady to form a new electronic and eventually paper book publisher called Brightline. The publisher will work in partnership with existing Brooklyn e-publisher Atavist.
The idea for the publisher came out of talks Rudin had with various high-profile authors during the process of turning their books into films. Rudin said he heard many complaints about the publishing industry’s rigidity and slowness to adapt to change. Diller and Rudin hope that a new business, launched without the confining legacy structure of existing publishers, will have the flexibility needed to compete in the current electronic publishing world.
I’ve been saying for some time that we need new publishers who aren’t tied to the legacy anchor and can innovate faster, and the fact that the major backers of this publisher come from outside the world of publishing could be a good thing—when you get people who don’t know that something is impossible, sometimes they manage to do it.
Another interesting thing to note is that this seems to lend additional support to the Department of Justice’s position in the agency pricing settlement talks—that Amazon’s “monopoly” power isn’t keeping potential new players out of the market. After all, here’s one coming in with the stated intent to compete with Amazon right off the bat.
As a writer who works in serial format myself (my co-writers and I just posted part 14 of a story in a science-fiction universe we’re creating to a free on-line fiction site), I was interested to see this piece from PaidContent that looks in depth at Amazon’s new serial format. The business turns out to be unexpectedly lucrative; Plympton, the startup that licensed the first few Kindle Serials, went from paying its writers $500 per episode plus a bonus to paying them five-figure fees.
But writing serials isn’t for everyone. It can be very tricky, as I can attest from my own experience. You’re working on a tightrope: it’s hard to write and keep posting serial fiction because it’s all too easy to hit a snag in mid-stream.
Plympton’s Goldstein Love echoed [long-running online serial writer Claudia Hall] Christian’s comments that writing serials is hard. “We have really come across a lot of what we’re calling the third episode problem,” she told me. “It’s a lot easier to write a brilliant first episode of something. In your second episode, you’re continuing that. In the third episode, you realize you have no idea where this is going. It’s a real danger with writing serially. We won’t sign anyone on fully until we see how the first three [episodes] go.”
That is absolutely right. I’ve seen many attempted serials flame out in places like the old Superguy listserv as writers had trouble figuring out where to take them once they’d started them. Or simply trouble finding the time to get there once they had. (I’ve left more than one series unfinished for that reason myself.) And this problem has struck even professionally published writers, as with Diane Duane’s long-delayed The Big Meow Storyteller’s Bowl project, which went for years without seeing an update.
The article goes on to note that serials could be a challenge for Amazon because they’re basically charging the same amount for a serial that they charge for a “Kindle Single”—and only requiring the reader to pay that amount once to gain access to the whole thing—but requiring authors to do many times the work.
As someone who’s worked in serial (albeit admittedly unpaid) online fiction myself, I find Amazon’s attention to serial e-books to be very intriguing, and I will be interested to see how this new format goes. Is it going to fizzle out? Will it set the world on fire? One thing’s for sure, it hasn’t done that badly for a lot of the free online and fanfic sites that post writers’ serialized works.
Heck, even chartbuster Fifty Shades of Grey was originally published serialized as a free Twilight fanfic, building up its huge original audience over time as more readers spread the word and became invested in the story. (Was that fact more of an inspiration to Amazon than Dickens, I wonder?)
For any website that gets as much traffic as Wikipedia, the temptation is great to use it for unscrupulous purposes, such as search-engine optimization or even simply plugging things for money. The community is generally pretty good about noticing people trying it and shutting them down—witness the Philip Roth affair I wrote about, in which someone claiming to be his biographer tried to change something at Roth’s request and was politely but firmly shown the door, until Roth himself penned an article to rebut it.
But what if the people doing it are high-level Wikipedia administrators? CNet reports on a case where two such Wikipedians, a trustee and a Wikipedian in Residence, appear to have used Wikipedia to advance professional interests. Roger Bamkin, a trustee of Wikimedia UK, has reportedly used Wikipedia to plug his current PR client, the British territory of Gibraltar—featuring it in Wikipedia’s “Did You Know” front-page feature 17 times over the month of August, and putting it forward as the target site for a project using QR codes to link places and things in a town to articles on Wikipedia.
Meanwhile, Wikipedian in Residence Max Klein, one of a number of highly esteemed Wikipedia editors who liaise with galleries, libraries, archives, and museums, has been found to operate a consulting agency that bills itself as a way to “navigate the complex maze surrounding ‘conflict of interest’ editing on Wikipedia” to make use of the “invaluable SEO” from a positive Wikipedia write-up.
Jimmy Wales isn’t exactly pleased by all this, and has said in the past that paid editing is against what Wikipedia stands for. But on the other hand, there isn’t any specific Wikimedia policy, pro or con, about the practice either—which also leaves PR firms that want to correct actual errors about their clients, rather than simply promote them, up a creek as well.
Wikipedia is basically the free e-encyclopedia, and it has been amazingly successful since its inception, managing to outlast attempted replacements on the strength of its sheer community-forming collaboration. Its firm commitment to being as free from bias as possible has helped, too. But if its own administrators are going to take advantage of it to further their own personal interests, that’s a problem. Probably not one that will sink the encyclopedia, but it certainly won’t help things.
At first glance, Harvard Business Review’s decision to start selling DRM-free e-books seems like an interesting and wise one. After all, Baen has been doing this for over a decade, O’Reilly has been DRM-free forever, and Tor just started recently. But an interesting quirk of the model is that HBR is taking a sort of “hybrid” approach. The cheaper editions it sells via Amazon and iTunes will still have DRM, and you’ll pay almost twice as much ($18 vs. about $10, in one example) for a DRM-free version direct from HBR’s website. (Though professors who assign the books as course material can get their students a 50% discount, and organizations buying e-books in “bulk” can get a discount starting at just 10 copies.)
Will this strategy work for HBR? It just might. The sorts of textbooks and nonfiction that HBR publishes have a different market with different needs from mass-market fiction. And purchasers of such books might just be willing to pay extra for an e-book they can use anywhere they need without having to crack the DRM first.
The really interesting thing, I think, is not so much how many different publishers are starting to try DRM-free approaches as just how many different ways they’re going about it. The more things like this publishers try, the more we all learn about what will work and what won’t. And so e-book publishing goes DRM-free, a little bit at a time.
PaidContent has updates on a couple of e-book-related lawsuit stories today. First, Judge Denise Cote has approved the $69 million everyone-but-Minnesota lawsuit settlement with Hachette, HarperCollins, and Simon & Schuster. The payments won’t happen until next year because of a “fairness hearing” to be held on February 8th to allow those opposed to have their say against it.
Meanwhile, an appeals court has issued a stay in the Authors Guild vs. Google Books copyright case until it can review Judge Denny Chin’s decision to grant the Guild class-action status. This places the already long and drawn-out proceedings on hold for another few months. But the separate but related Authors Guild suit against HathiTrust proceeds apace, and it could end up deciding the eventual fate of Google Books if it gets decided first.
The annoying thing about these courtroom battles is that they take so long to play out. But I’m sure there will be some eventual satisfaction when they do.
In the US, booksellers like Barnes & Noble consider Amazon to be The Great Satan. Look at how vehemently they have fought against the Department of Justice’s settlement that would enable Amazon to start discounting e-books again. But it seems as though the major bookstore chain on the other side of the Atlantic has just made a deal with the devil.
At a recent Independent Publishers Guild meeting, Waterstones CEO James Daunt admitted that a recent deal the chain signed to start selling Kindles in its stores is fraught with difficulty, but that it managed to avoid “some of the most significant bear traps” and stands to gain considerable benefit from it.
The store will get a portion of Amazon sales, but only those that take place through its in-store networks. It will also get the benefit of some Special Offers on the ad-supported Kindles, and have some kind of Read For Free plan in which, presumably, Kindle owners can read certain Waterstones e-books free in the store.
Amazon already sells its Kindles through a number of other UK bookstores, so perhaps Waterstones figures that if you can’t beat them, join them. According to FutureBook, Daunt made a number of other remarks at the meeting: he didn’t think most Waterstones customers would actually care that the store sold Kindles, for one thing, and he said the partnership would continue for as long as it was beneficial to Waterstones.
At any rate, businesses will happily make deals with other businesses if they think there are more benefits than drawbacks. If Waterstones thinks Kindles can draw more people to its stores, more power to it. It remains to be seen whether or not that will actually be the case.
I learned about an interesting project from The Guardian the other day. Fantasy author Silvia Hartmann is writing her latest novel in Google Docs, the on-line collaborative word processor—and inviting the world to watch. The document is open to the whole web: would-be readers can simply click the link and read what’s been written so far, or indeed watch her write more in real time during the hours when she’s there and writing.
This is a pretty clever idea. Though, as Alison Flood points out in the Guardian, this sort of thing has been done before, it’s been done rarely enough that it still seems new and clever whenever someone does it again. And it’s a great way to build interest in the finished book while it’s still being written—if, like Hartmann, you don’t mind exposing your process to all and sundry. Not only does it show people what the story is becoming, but it builds interest in the story the same way serializing it does: you watch it take shape before your very eyes.
Some writers, such as Sharon Lee and Steve Miller, or Diane Duane, have done something similar in that they would post sections of a completed draft in return for specific levels of donations under the Storyteller’s Bowl model, but even they didn’t get so granular as to let other people watch what they were doing. Basically it’s inviting people to watch your every little stumble or mistake, to see every typo you make and every word you misspell, every blind alley you go down before deleting and rewriting. Small wonder Hartmann refers to this as “The Naked Writer Project”!
I have been using Google Docs myself a lot lately, for collaborating with a number of other writers in a shared-world science-fiction setting we’re creating together. Before that, I used to use MoonEdit, then EtherPad. They were all on-line editors, which allowed multiple people to write at one time. Google Docs used to be pretty clunky for that kind of thing, only updating every thirty seconds or so. Now it updates in real time. It also has a chat sidebar and commenting system to make working with other people easier. And, of course, it saves what you write in the cloud so you’re not hostage to your own hard drive. And while I’ve lost documents I’ve written in the EtherPad cloud before, Google is a bit more reliable than that.
I’ve been having a lot of fun in my collaborations, and I find writing this way with other people has kind of the same appeal readers might find in watching Hartmann work. It’s fun to watch my writing partners bang out their own paragraphs in our stories, or in their own stories set in the same universe as ours, in real time, because I often don’t have any idea what’s going to happen next.
And a lot of readers apparently do find that sort of appeal in watching Hartmann. I popped into the doc to read what was there so far, and found about fifty people hanging out in the chatroom (all anonymously named, since it’s a public document) even though Hartmann wasn’t actively writing at the moment. A few of them responded to conversation, but most seem to be like one person who said they log in when they get to work and then leave it up all day. And each one of those dedicated watchers is a person who at least might buy the book when it’s finished.
And not only does Hartmann get readers who watch her write, she could get instantaneous feedback as well—at least, if she watches the chat window at the side and doesn’t do her best to ignore it as a distraction. (One of the other chatters said that she does say things there at least occasionally.) This means there’s at least the chance that she might make decisions or change the way the story is crafted based on what her readers say as she’s writing it. Of course, not all writers would seek or welcome that kind of feedback, but for those who would, this could be a great way to get it.
Google Docs does have one limitation that might curb the effectiveness of such a promotion—it seems to be limited to 50 people in a room at one time. This means that only 49 other people can watch Hartmann do her realtime writing at once. Still, not everyone has to be able to follow the book from start to finish to enjoy it enough to want to buy it.
So is the book any good? I’m not going to offer an opinion. It’s light fantasy style writing, in draft quality. It may look very different when it reaches final publication. In any event, the quality or lack thereof may not be the point. People who like the way the story starts out will stick with it; people who don’t will shrug and go away, and they’re less likely to complain about it in online reviews than they would if they’d paid real money for it and then found they didn’t like it. So it’s win-win.
About 90% of the press attention today is going to Apple’s new iPhones. And those are nice and all, but the more relevant gadget to e-book readers is actually the iPod Touch. As I’ve said a number of times before, mostly over on TeleRead, the iPod Touch has long had the potential to be the preferred pocketable for people who didn’t want to have to pay through the nose for phone service. Pair it with a prepaid MiFi for net access and you’re good to go whenever and wherever, with no need to fret about contracts or paying for minutes you never use. (And that’s how I use my old 1st-gen 8 gig even now.) With today’s model, it looks like it’s finally got the chance to live up to that potential.
The biggest new feature from an e-book lover’s point of view is the new 4” screen, the same as the iPhone 5’s, which means it can fit a couple more lines of e-book text on the screen at one time (at least for reader apps adapted to take advantage of the new, taller screen shape), and at retina resolution, making it one of the clearest LCD screens ever, inch for inch. It’s also finally getting a real 5-megapixel camera that can shoot 1080p video, rather than the insult of a low-resolution camera the last iteration got. (That should be more than good enough for portable OCR.) It’ll have Siri, too, and the ability to stream 1080p video over AirPlay.
It’ll have a few sillier features, too, like the multiple colors and the wrist straps you can use to keep from dropping it (while looking like a total dork). But perhaps the most impressive is that it’s only 6.1 millimeters thick, making it the thinnest iPod Touch ever. It really does put me in mind of one of those plastic-pane PADDs they used to use on Star Trek.
Possibly one of the even more useful things about this for e-book readers, though, is what it will mean in terms of the pricing on older refurbished models of iPod Touch. If you don’t care so much about the 4” screen or the 5 megapixel camera, you could get the 3.5” 2010 model with retina display pretty cheaply now.
The Apple web store lists a refurbished 64GB 2010 version for $279, or a refurb of the 64 GB 2009 version (no retina display) for $249. The 32GB is $209 for the 2010 or $179 for the 2009. You could even get an 8GB 2010 model for $149, which is not a bad price at all for a pocketable retina display that can hold hundreds of e-books for on-the-go reading, even if it does have a pretty lousy camera.
I’m not sure whether those prices are lower today than they were yesterday, since I didn’t check them then, or whether they will only drop further when the new generation actually goes on sale (the store still refers to the 2010 model as the “current” generation, which adds to the confusion), but either way they’ll undoubtedly fall even further in another few months.
In a way, I feel a bit bad about plugging Apple devices like this, given the company’s many examples of reprehensible behavior over the last few years. But for all that the company has been total jerks, it is still a lot closer to owning the handheld mini-tablet niche than it is to owning the larger tablet niche. A year and a half ago I went looking for a good Android equivalent to the iPod Touch and couldn’t find one.
The closest equivalent at the time seemed to be the Galaxy Player, and it looks like that device has been getting some decent reviews lately on Amazon, but it hasn’t had the time to build the reputation or get the polish of the iPod Touch. The Touch has had years of development, giving it plenty of time to work out the worst kinks and add great features. I use mine every day and love it, and am seriously drooling over the prospect of getting a newer one—either a last-generation refurb or the new super-thin thing. Because evil company or not, Apple sure does know how to make a handheld tablet worth drooling over.
It’s not exactly a secret that I’ve been somewhat skeptical of Unglue.it’s selections and its model of soliciting crowdfunding to unlock various e-book titles into free Creative Commons release. Most of its titles didn’t seem well-enough known to attract many people, and they were generally already available inexpensively enough as e-books that most people would be more likely to buy the e-book to read for themselves than to kick in so everyone could read it free. And the recent cancellation of its payment processing by Amazon has put its plans on hold for a while.
That being said, Unglue.it did hit one big success early on, when it was able to crowdfund the release of a 1970 non-fiction book, Oral Literature in Africa by Ruth H. Finnegan. This book had a funding goal of $7,500, which it met with the help of 291 donors, and work proceeded apace to “unglue” the book into free public release under a Creative Commons Attribution (CC BY) license. This means that the work may be used for any purpose, including commercially, as long as the new use is careful to make clear where it got the original material. Today, Unglue.it made that e-book available for free download in Mobipocket (Kindle), EPUB, and PDF formats.
Needless to say, this is quite an accomplishment. The book had been out of print for decades and had been fetching high prices used on Amazon. The sorts of people who would have wanted it available had quite a lot of incentive to donate—for example, college professors who wanted to use it in their courses, or academics and students of Africa, oral traditions, and related matters. And people are accustomed to paying more for scholarly nonfiction works like this anyway—just look at how much college textbooks cost these days.
All in all, it’s not too surprising that the sort of people who would have wanted it free would have wanted it enough to make it happen. I would go so far as to say that even if Unglue.it never “unglued” another book, it will still be fondly remembered for this—at least in certain academic circles.
The only puzzling thing to me is why Unglue.it didn’t try to do more projects like this one. I’m sure there must be other academic books out there that are just as highly sought-after, and whose authors might be just as amenable to taking part in such a project. There might even have been some donors who crossed over from one project to another, deeming both academic nonfiction books to be equally worth supporting.
But apart from this one, all or most of Unglue.it’s books (in the original batch, anyway) tended to be self-published or indie-press published fiction—with completely different demand profiles. Not only were these not in as much demand as an out-of-print academic work, but there wouldn’t have been much crossover between people wanting to support a scholarly treatise on African folklore and people who would be inclined to support freeing some random self-published fantasy novel.
Anyway, I’ve downloaded the book and glanced through it. It’s not exactly the sort of thing I’d choose for light reading, but it seems to be very professionally formatted, with footnotes, black and white and color photos (lots of photos—the EPUB file clocks in at 5 megabytes), and block quotes. If I were to want to read a book about the oral literature of Africa, this is exactly the sort of book I would want to read. If this is a sample of the quality Unglue.it will bring to the table, then I certainly hope that Unglue.it can get past its payment processor snafus and is able to unglue many more books.
If Unglue.it is able to get back on track, I’d suggest it might want to learn from its success as well as its failure. If it wants to build a name for itself based on successful projects, it ought to choose more of the kinds of projects that are likely to be successful. Once it gets a reputation for succeeding, then is the time to branch out and take chances.
Here’s a cute little Kickstarter, found via Techdirt. Ross Pruden has launched a Kickstarter called “Dimeword” to fund one hundred 100-word short stories to be released into the public domain. Everyone who donates $10 gets a 100-word story written just for them, and everyone who donates at least $1 gets every story emailed to them a week before the book’s official release. Pruden offers a number of perks at levels ranging from $1 to $5,000, though the highest-level perk he’s actually sold is the $500 one.
In the description of the Kickstarter, Pruden casts it as his attempt to show how authors can still make money without copyright. He explains that content producers should “use the abundant to sell the scarce” by building a relationship with fans and using the network effects of those fans to sell to other fans—then setting the work free at the same time as selling it.
The Kickstarter has cleared over $2,000 of its $1,000 goal, with two days to go, which makes it look highly successful at first glance. But when you look a little closer, you notice that it only has 70 backers (at the time of this writing), and only 28 out of the 100 stories have actually been paid for. (There are actually 300 $10 donation slots available; there’s no explanation anywhere I can find of what the other two hundred people would get after the 100-story goal was reached—not that that is going to happen at this point.)
Furthermore, almost half of the total money the project has raised can be laid at the feet of only six backers—one $500 and five $100 donations. And eight more backers account for almost $400 more among them. I suppose from the point of view of earning Pruden the money he wants to earn without invoking copyright, the project could be called successful, but if he was aiming at democratizing content production by appealing to fans and getting them to spread the word to others, I’m not so sure he’s really succeeded—he didn’t even get enough people to fill a bus. Instead of taking in a little money each from a lot of people, he’s taking in the bulk of his money from just a few.
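A quick back-of-the-envelope tally bears out how top-heavy the funding was. This is just an illustrative sketch: the tier amounts come from the figures quoted above, and the “over $2,000” total is rounded down to $2,000 for simplicity.

```python
# Tally of the largest Dimeword pledges cited above (illustrative;
# total is approximated as $2,000).
big_pledges = [500] + [100] * 5   # the six largest backers
total_raised = 2000               # approximate total at time of writing

big_sum = sum(big_pledges)
share = big_sum / total_raised
print(f"Top six backers: ${big_sum} of ${total_raised} ({share:.0%})")
# → Top six backers: $1000 of $2000 (50%)
```

Six backers out of seventy supplying half the money is the opposite of the broad fan-driven funding the project set out to demonstrate.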
Is this the real future of copyright-free content—authors finding a few well-heeled angels willing to pay them for giving stuff away? The problem is, there aren’t all that many angels, and there are a lot more content creators who want to be paid than they’ll have the budget or the patience to fund.
On the other hand, Pruden did at least keep his goal modest—barely even four digits. Which means he had a better chance of success from the start than other would-be public-domain Kickstarter projects I’ve noticed. Perhaps that’s the real lesson for would-be public-domain start kickers: aim low and you’re more likely to succeed.
Well, this isn’t a surprise. The coalition of publishers who sued Georgia State University over copyright violations allegedly committed in online course packs, and had most of the suit resoundingly rejected, are filing an appeal.
On the conference call, Sage CEO Blaise Simqu suggested that the decision to appeal was made in consultation with a group of Sage’s “textbook” authors, asserting that the authors had overwhelmingly supported Sage’s pursuit of the appeal. OUP president Niko Pfund, meanwhile, acknowledged that the litigation puts the press in the delicate spot of suing its own customers, but said the decision was flawed and, if left to stand, would threaten publishers who operate on “razor thin” margins.
Not all textbook authors are in such firm support, however. Publishers Weekly quotes University of Virginia professor Siva Vaidhyanathan as being afraid he would lose his job if the publishers prevailed, since a publisher victory would throw into doubt the legality of professors providing course material packets to students.
Various organizations are lining up on either side of the case. The Copyright Clearance Center, which has helped fund the publishers’ side, is going on about “clarifying the balance embodied in fair use,” disregarding the fact that the judge’s ruling was pretty clear as it was—just clearly not what publishers wanted it to be. The Association of American Publishers, also supporting the plaintiffs, complains that the ruling promotes digital copyright infringement. On the other side, the Association of Research Libraries criticizes the publishers and those organizations for lacking “understanding or respect for the fair use rights” of academics.
This comes at a time when a Bookboon.com survey of 10,000 college students reports that over 75% of students in the United States don’t buy their college textbooks new because they’re too expensive and only needed for a few chapters. (In the UK, the number is 83.3%.) 60% of students buy second-hand, according to the survey, and 58% of students in the US prefer digital textbooks. (There wasn’t any mention of what percentage admitted to downloading illegally.)
So it’s not surprising that academic publishers are trying to hold onto every last bit of copyright power that they possibly can. Indeed, as thin as textbook margins are getting, it would almost be surprising if they didn’t. It’s a bit worrisome, though, to reflect on what might happen if the margins get too thin to support the publishers. What will happen to our education industry if the textbook industry collapses?
Well, presumably some new industry would arise from the ashes. We’re always going to need some way to educate the next generation, and if one fails we’ll end up finding another.
Say what you will about Bob Kohn, he’s certainly persistent. PaidContent reports that he’s filed a last-minute request for a stay of Judge Cote’s ruling that the Department of Justice’s agency pricing settlement can go into effect. He wants the imposition of the settlement suspended until the Second Circuit appeals court can rule on his appeal. If the stay request is rejected—as it probably will be—then it will go to the Second Circuit. And if it strikes out there, too, then the fat lady will have officially sung, because by the time the appeals court gets around to ruling on the settlement itself, the market will already have been changed by months of new e-book prices.
In his most recent blog post, Mike Shatzkin gives the devil his due, noting that Amazon has done a remarkably good job of synergizing a lot of its different properties and capabilities together for new features in its new family of Kindle products. Of particular note are the abilities to synchronize or pair the e-book and audiobook versions of a book—which will not only make both more useful, but will also induce consumers to make two purchases instead of one.
He also thinks Amazon will use “bots and algorithms” to aggressively discount those e-books covered by the settlement—and that Apple lacks the expertise to match it at that game. (This is a bit interesting given that Nate thinks that, with its newly cheap e-reader hardware, Amazon won’t be doing that sort of aggressive discounting. I wonder who’s right?)
At any rate, our times just got more interesting. I can’t wait to see what Amazon does next.
Wikipedia isn’t perfect. Sometimes the mechanisms meant to protect the credibility of its information can actually get in the way when the people who know the most about a subject try to correct things they know to be mistakes. There are good reasons for this, of course: people who are directly connected with a subject might have ulterior motives for making changes. For example, a well-known PR firm caught flak last December for editing its clients’ articles, which led the Chartered Institute of Public Relations to issue guidelines in June requiring its members to seek changes through Wikipedia’s editors.
Of course, not all such changes are as dodgy as paid flunkies nipping in to make their clients look better. Case in point: author Philip Roth. In the Wikipedia article for one of his books, The Human Stain, he found a notation that the story was based on the life of writer Anatole Broyard, when in actuality it was inspired by an incident in the life of his friend, Princeton professor Melvin Tumin. When he tried to correct this error, via the person the novel’s talk page calls his “self-proclaimed biographer,” he was told by a Wikipedia editor that he, the actual author of the book, “was not a credible source.” I’m not sure whether it was intentional or not, but Roth found a rather clever way around this restriction in the end.
As silly as the idea that an author might not be a “credible” source about his own book might at first appear, there are good reasons behind it when you think about it. The conflict-of-interest issue is the obvious one: if someone is being paid to alter a writeup, how can you be sure he’s serving the truth rather than the interests of his employer? And if someone is altering a writeup about something he himself did, how can you know he isn’t doing it out of his own interests?
But consider, also, that authors can say anything they want about their books after the fact. More importantly, they can change their minds about what they said, and they can be rather hard to get hold of if you want to verify what they actually said (especially if they have since died). Consider the late Ray Bradbury’s 2007 insistence that Fahrenheit 451, understood for decades as a salvo against censorship, was actually meant as one against television and other new media. If that was the case, why didn’t he speak up about it before? It’s not as if critics made any great secret of their interpretations.
So Wikipedia relies on published secondary sources: things that don’t necessarily reflect what someone is saying now, but represent a snapshot of what he did say at a particular moment in time. So when Roth published a 2,635-word open letter to Wikipedia in The New Yorker, he was not only complaining about Wikipedia’s policies; he was creating exactly the kind of published source that Wikipedia editors could cite in getting the error corrected.
Some bloggers, such as Gizmodo’s Mario Aguilar, are happy to wax irascible at Wikipedia’s “idiotic rules” about sources, and to cackle with glee at how Roth “outfoxed” them. But really, he didn’t. Instead, he did exactly what they wanted, and produced a fixed, published secondary source that could be cited and researched to any Wikipedia editor’s or reader’s content. It would have worked exactly the same way if he’d tacked it on as an afterword to his next book; it would just have taken longer and been less visible.
Neatly summarizing the Wikipedia policy side of things, Wikipedia editor Emergentchaos wrote on the article’s talk page:
I agree with everyone who’s saying that this policy looks silly, but on reflection, think it is the right thing. Let’s agree that Roth is the best source on his own motivations. Given that there are other sources, it is helpful as an encyclopedia to get him to commit his perspective to a citable source. Imagine that this was a more contentious issue; would we accept the author’s opinion that this is an excellent book when it’s panned by the critics? We’d use the preponderance of evidence rule to push back on such a thing. So while this looks as silly as a skilled author like Roth can make it look, I’m not convinced that that’s an argument to revise wikipedia policies.
The talk page entries present a slightly different side of the story than a lot of the media paint. The original changes were made by someone who merely claimed to be Roth’s biographer and to be making the changes at his request, but who didn’t provide any proof to back up that contention. After the changes were reverted, the article was bolstered with 6,000 words, including many references to sources supporting the Broyard contention, which is undoubtedly why Roth was moved to declaim in such detail that he did not know Broyard.
Regardless, it seems to me that in this case Wikipedia’s policies worked perfectly well, in prompting Roth to set down his arguments in a more concrete form, one that will outlast a few edits to Wikipedia. Of course, it’s unlikely that anyone who is not a famous novelist could get a major media outlet like The New Yorker to publish such a letter, but you never know. There might be other, equally effective ways.
Are you going to be in the vicinity of Urbana, IL this weekend? There’s a little paper book estate sale you might want to stop by. Now, normally we probably wouldn’t make such a big deal about paper book sales, but this happens to be the estate sale of Michael Hart, the founder of Project Gutenberg and one of the principal reasons we can read e-books on handheld devices today.
The sale site is posted in the form of an amusingly cheeky WordPress blog, with pictures of some of the books that will be offered for sale. The ground rules include the stipulation that books with cover prices will be sold at those prices, including old pre-inflation paperbacks. Also, the sale will start promptly at 9, with no early birds permitted, and the organizers express a hope that most books will go to individual readers, not dealers.
Ordinarily, I’d think it was kind of sad to see a man’s collection of books parceled out after he passed away—but Michael Hart dedicated his life to promoting reading. I’d like to think he would approve of seeing his books go to other people who would like to read them in turn.