Thursday, December 23, 2010

Bruce Sterling in 1992

In thinking about Bruce Sterling after reading his brilliant tour de force on Julian Assange and Wikileaks, I dug up this old clip I'd made of a speech he gave in 1992. He was right then and sounds even more visionary now:

From a speech by Bruce Sterling 
The Library Information Technology Association 
June 1992 • San Francisco CA

“What’s information really about?
 
“It seems to me there’s something direly wrong with the ‘Information Economy.’ It’s not about data, it’s about attention.
 
“In a few years you may be able to carry the Library of Congress around in your hip pocket. So? You’re never gonna read the Library of Congress. You’ll die long before you access one tenth of one percent of it.
  
“What’s important—increasingly important—is the process by which you figure out what to look at.  This is the beginning of the real and true economics of information.  Not who owns the books, who prints the books, who has the holdings.
 
“The crux here is access, not holdings. And not even access itself, but the signposts that tell you what to access—what to pay attention to. 

“In the Information Economy everything is plentiful—except attention.”

Posted via email from edge & flow

Tuesday, November 23, 2010

Rare, great and philosophical qualities ...

“… to acknowledge the error he shall discover in his own argument, though only found out by himself, is an effect of judgment and sincerity, which are the principal things he is to seek after; that obstinacy and contention are common qualities, most appearing in mean souls; that to revise and correct himself, to forsake an unjust argument in the height and heat of dispute, are rare, great, and philosophical qualities.”

—Michel de Montaigne
Essays: Of the Education of Children

Posted via email from edge & flow

Wednesday, September 22, 2010

‘Objectivity’ vs POV: Sorry, it’s more complicated than you think

Scott Rosenberg’s recent post (Journalists follow their voices, vote with their feet) introduces an important, oft-neglected perspective to the continuing debate about journalistic credibility and accountability. And as he suggests, it isn’t really a binary question of “dull and mushy” versus “opinionated and exciting.”

Journalists have always longed for greater freedom, and the great ones have gotten it. You can read journalism from the Vietnam War, the Civil Rights struggle or the Nixon era that proves it. Jay Rosen's aptly named "view from nowhere" represents laziness more than imposed restriction. It's simply easier to adopt the gambit Peter Goodman admits to: call and quote an expert you know will say what you want. (There are many variations on this technique; this is only one.)

I am absolutely all for transparency and clear voice. I advocated it as the “manifesto” for a weekly I helped start in 1976 and I have known ever since that it’s essential. But the cure for the "view from nowhere" syndrome isn't as simple as declaring “here’s where I'm coming from.” The test of journalism’s worth and value must be, "OK, that's where you're coming from — but what did you bring me?" Simply having and declaring bias is itself mildly useful but it’s hardly determinative. If proffered as a license for dishonest reporting (“If I hadn’t believed this with my own mind, I never would have seen it”), then it’s worse than doing nothing.

Passionate advocacy journalism is altogether consistent with the accuracy, honesty and fairness the discredited “objectivity” standard claimed to advance. In simplest terms, the question is whether the writer starts by searching fearlessly for all the evidence and then comes to a conclusion, or sets out with a conclusion and gathers selected facts to advance it.

Many good journalists can honestly say that where they're coming from is a disciplined, ethical posture that tries to build truth out of evidence, regardless of the outcome. That's a POV, but not an opinion.

Likewise, nobody is coming from just one place, either. I have shades and degrees of passion and opinion and expertise on different subjects; my reporting on them will have to be filtered through several lenses. We do no good by pretending that this is an easy journey from “nowhere” to nirvana.

Albert Einstein famously advised that everything should be made as simple as possible — but not simpler. 

Just so.

Posted via email from edge & flow

Monday, September 13, 2010

Two simple tools to make online reading a pleasure

Paul Simon thought we lived in “days of miracle and wonder” back in 1986. We can only imagine what he thinks now.

Yet even amidst contemporary marvels — my iPad, my featherweight mifi hub, an HD video camera that’s incidental to my phone — I’m finding a rather low-tech marriage of two distinctly Gutenbergish web services is among the info-systems I use most.

To wit: the indispensable Instapaper and longform.org.

Probably most folks who read this blog are already established Instapaper devotees; if you by chance are not, stop now and go sign up. You’ll start using it almost immediately, and soon find you can’t live without it. What started as a simple web bookmarking service (the main command — “Read Later” — is also its raison d’être) has evolved into an information tool so sophisticated that it still seems simple and transparent.

In its initial iteration, Instapaper served to organize links to web articles you wanted to read later (typically, for me, because they were long and I needed to keep moving at the time). After free registration, you’ll have a spot online to save the reminders and links you used to email to yourself or write down on scratch paper to lose later. Now it’s all available on a personalized, elegant page of headlines and links.

Instapaper reaches its apex these days as an iPad app. Many apps for Twitter and other services now include “Send to Instapaper” as an option alongside “copy,” “email” or what-have-you, and the original bookmarklet is also usable. Once an article is saved to Instapaper, the iPad app offers the best interface there is right now for reading long articles found online.

Which brings us to the aptly named longform.org, a curation service that both selects suggested articles and makes it trivially easy to add them to your Instapaper queue. It’s only as good as the quality of the recommendations, of course, but for me the signal-to-noise ratio is high — I’m interested enough to try perhaps a quarter of what they recommend. I assume other recommendation engines will soon provide similar service from different points of view.

As a result, I’m reading more from the web now than ever: all the sites and stories I would have flagged for myself before, in addition to an eclectic, serendipitous selection drawn from longform.

There’s nothing very flashy about either service (thank goodness) but the combination is potent. You owe it to yourself to give this a try.

Posted via email from edge & flow

Tuesday, September 07, 2010

'Thank God for Title IX' — my American River Adventure


Since nearly all human experience is disintermediated nowadays, there is of course a YouTube video of just the site of my adventure on Labor Day. 
As nearly as I can reconstruct, we crashed into the large rock at the end of the Parallel Parking chute; the rock is very briefly visible in the near right foreground at 1:27 of this video. Barb and I were thrown out the left side of the raft, which then bumped the Parking Garage wall and continued through Catapult Rapid, without us.
Barb surfaced and swam to the left bank, away from Catapult. I am not nearly as strong a swimmer and so was washed through Catapult, roughly down the chute seen at 1:42. I was underwater for some time (surely less than it seemed) and distinctly remember the strange quality of the green, filtered light and the cloud of bubbles that surrounded us. I tried to push Barb upward (she remembers only being touched, or restrained) but we quickly separated. I broke the surface for one breath but was then caught in the waves and couldn't predict when my head would surface. It did now and then and I gulped air (and some water). I was trying, with limited success, to float as we'd been instructed, feet downstream and head up. Judging from the large contusion on my upper left thigh and various other scrapes, I did encounter some rocks, though I don't remember that. 
Somewhere at the end of Catapult Rapids (after the spot where the crew "high fives" in this video) I swept past our number two raft, unable to reach them. Shortly thereafter I caught up with my own raft, which was waiting. I missed the first paddle extended toward me but was able to grab Maureen's longer guide's paddle. She pulled me up to the boat, grabbed my life jacket and yanked me into the raft like I was a little kid.
She's not a big girl. I'd peg her in her late 20s, a classic California blonde with braided pigtails and a frequent smile. 
You go, Mo. She flexed her biceps for me later on the bus back; thank God for Title IX. 


Tuesday, August 31, 2010

News vs coffee: It’s still about value, not price

We often confuse cost with value when thinking about what people will pay for. There’s a crucial, oft-neglected distinction, and recognizing that reality is central to determining the future of paid media.

Years ago we made a useful analogy about paying for a cup of coffee vs paying for a newspaper. I used it first, as nearly as I can tell, in a 1996 strategic plan for digital publishing at McClatchy:

... at the end of WWII a cup of coffee and a newspaper each cost about the same -- let’s say 10 or 15 cents; today a cup of coffee can fetch $3.00, while a newspaper at most costs 50 cents. The essential difference is that the coffee sellers learned to give their customers choices --  you can go to any espresso stand in any airport in America and order a double tall decaf skinny latté. They added value to their basic product.

The distinction between pricey coffee and cheap news is more nuanced than that, of course. In addition to offering choices, coffee houses created a cultural dimension to enjoying their product; think about the way Starbucks adapted the concept of the “third place” as a profitable niche between your home and office. 

Coffee itself is a commodity, and nearly any cup will serve the utilitarian purpose of spreading caffeine through your system. The same is often true of news: pretty much any source can give you basic facts and headlines about most things. Coffee houses overcome the commodity problem by offering not only many choices but also sociability, relaxation and a sense of community. They’ve created coffee snobs who can discern the difference between the tang of Kona roast and the smoothness of a nice Sumatran. (I know this; I am one). They’ve made coffee a broader and richer experience than simply an injection of caffeine. 

News companies haven’t done nearly as well, although there are some important exceptions. Almost any news consumer can tell the difference between the products of the New York Times and those of Fox News, and many choose which to consume precisely because of that. For a Fox News customer, the NYT is *not* a commodity replacement, and vice versa.

Others are trying hard. I’ve worked some with the Civil Beat organization in Honolulu, a membership (subscription) digital news service. They’ve picked their target carefully: in-depth, local civic news. They foster community through f2f meetings for members only, by focusing on coverage others aren’t providing (especially investigations and analysis), by maintaining easily accessible archives to help bring context to coverage, and much more.

Huffington Post is recognized as a leader in applying “virtual goods” theory to its community, drawing on lessons learned from online gaming. You can earn digital merit badges based on levels of participation that in turn confer some status benefits on the HuffPo site. I’m sure others are doing similar (and I hope more meaningful) experiments. I wrote about the possibilities for virtual goods in news communities a while back; see Why is a Facebook beer worth more than your news story?

I was reminded to revisit all these ideas this morning when I saw The Oatmeal’s illustrated post about people who pay for all kinds of things but flinch at 99¢ apps for their phone. The same was true about shareware, a system where you downloaded and used software for free but were asked to voluntarily pay something to defray costs and support the programmers. (The general failure of the shareware system helped ruin my faith in people’s inherent willingness to do the right thing.)

Have a look at The Oatmeal’s commentary, one panel of which is used above. Then think about the same argument relating to news, and whether there are ways to make it something other than a free commodity, something with enough value to attract paying customers.

Posted via email from edge & flow

Wednesday, August 25, 2010

Like books of old, today's web is in its swaddling clothes

Excerpt from an essay I wrote in 1999 about — you guessed it — "the future of newspapers":


"While we know the impacts brought by new digital media will be profound, none among us is able to divine their precise shape. The pace of change in the networked era is such that we are denied the luxury of extended study and careful reflection. The product development strategy of the digital age, it is said, is: Ready, Fire, Aim.

"Our uncertainty should come as no surprise. We know from history that it took more than 50 years for Gutenberg's invention of moveable type to result in the creation of anything that would be recognized today as a book. After Gutenberg, somebody else had to discover the form that best took advantage of his technology -- things like legible typefaces, numbered pages arranged in chapters, hard covers to bind the work together coherently in a convenient, portable size. Indeed, books printed between the invention of Gutenberg's press in 1455 and about 1501 are known to collectors today as incunabula -- taken from the Latin for 'swaddling clothes, indicative of a technology in its infancy.

"Similarly, the invention of moving pictures  in the 1890s did not immediately result in what we know today as movies. Here, too, was a technology in search of a format. Motion pictures initially were simply films of stage plays. It took time to discover the elements of cinema we all take for granted at the movies today: close-ups, flashbacks, shifting focus and so forth.

"The parallel between incunabula  books and those early moving pictures and what is happening today on the World Wide Web is inescapable. The technology has been discovered; we are searching for the format."

Posted via email from edge & flow

Monday, August 23, 2010

Turtles all the way down

I gave up on ‘truth’ a long time ago, 

but I’m having a hard time letting go of facts


I know folks have been fighting about “What is truth?” since way before Aristotle, and I’m comfortable leaving that debate to poets and philosophers. Obviously, it’s a topic way above my pay grade.


Until recently, however, I still placed a lot of faith in facts.


I cautioned journalists over several decades to remember that “truth is a plural noun ... there’s a lot of truth out there.” Instead of concentrating on somebody’s protean definition of truth, we tried to focus on things we could measure: accuracy, fairness, accountability, documentation.


Lately I’ve come to a deeper realization of how the nature of fact and authority themselves have fundamentally changed. I haven’t yet come to grips with what this means.


In the old days (five years ago) one poster child for this debate was Wikipedia vs Britannica. If a crowd-sourced, infinitely editable, volunteer encyclopedia could trump the venerable text with all its credentialed authority, we’d know some kind of milestone had been reached.


In many ways, the battle was already over by then. Wikipedia was quickly craigslisting Britannica’s business model and seemed sure to outlast it. We were losing Britannica, so the question was mainly about what that was going to mean. Was Wikipedia really as good?


Web triumphalists were certain it was. To confirm that, they delighted in citing a study published in the journal Nature, usually asserting that it had found Wikipedia was as good. Inconveniently, the study didn’t actually say that.


Nature’s press release was more qualified than that, saying that “Wikipedia comes close” to matching Britannica’s accuracy. Even that, it turned out, was considerable overstatement.


What the study actually found was that Wikipedia had roughly 100 errors for every 75 in Britannica. The actual numbers were that the Britannica articles studied averaged 2.9 errors apiece while Wikipedia averaged 3.9. Now, I was schooled on newsroom math, but my calculation says that roughly four errors instead of three per article amounts to about a third more errors. That’s hardly trivial.
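For anyone who wants to run the numbers, here is a minimal sketch of that arithmetic in Python (the 2.9 and 3.9 per-article averages are the figures cited above; the variable names are mine):

    # Quick check of the error-rate arithmetic from the Nature comparison
    britannica_avg = 2.9   # average errors per Britannica article in the study
    wikipedia_avg = 3.9    # average errors per Wikipedia article in the study

    # Relative increase: how many more errors Wikipedia averaged per article
    extra = (wikipedia_avg - britannica_avg) / britannica_avg
    print(f"Wikipedia averaged about {extra:.0%} more errors per article")  # ~34%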


I made that point in the comments section every time I saw somebody cite the study as “proof” of Wikipedia’s success. Almost nobody even acknowledged the challenge, much less bothered to consider it.


I was frustrated by that at the time. This was arithmetic, damn it. How could people just ignore it?


I understand better now. The real debate was over the nature of facts, and I was trying to assert one particular kind of fact (math) to make my point — but I was already too late. The post-modern fact train had already left the station. If enough people wanted to think Wikipedia was equivalent, nothing as small as mere arithmetic was going to change it. 


Now I realize that’s okay. I'd still argue some things require precision and hard facts — chemical reactions come to mind, or surgical procedures — but for many debates these days, “fact” is itself debatable. 


At one level this only recognizes a reality we long ignored. Few of the “facts” that were the foundation of my education were absolute. History books are written by fallible, often prejudiced people. Scientific theories, once memorized, will summarily be eclipsed by new data or another discovery. The scholars upon whose authority Britannica was founded knew this, and in the academy they’d argue endlessly about the stuff that later became typographic fact in their encyclopedia.


At some level, the experts in any field are telling us we weren’t smart enough to follow all the details, so we’d have to take their word for it.


And of course that’s largely true. You and I are never going to know enough about the history of Islam or the probability of Iran making a nuke to render reliable independent judgments. We have to rely on expert knowledge, and that means trusting somebody.


That trust is shifting. The catchphrase today is “algorithmic authority,” and the most illuminating and articulate spokesman I’ve encountered is Clay Shirky. It’s a gross over-simplification to put it this way, but you can think of this as a shift toward “trusting everybody.”


This turns Thoreau and Andrew Jackson and a lot of Enlightenment philosophy upside down, of course. (Thoreau: “Any man more right than his neighbors constitutes a majority of one.” Jackson: “One man with courage is a majority.”) Philosophers and polemicists will continue to insist they’re right — that there is a singular truth out there somewhere, independent of what our collective wisdom says about the subject. But in practical terms, this game is over.


News media and politics provide clear examples. It no longer matters if I think there is documentary evidence of Obama’s birth. Huge percentages of American voters think otherwise — for whatever reason — and they will act on what they believe. “Authority is as authority does,” Shirky has noted, and we must now learn to deal with the consequences.


The “wisdom of crowds” is both broad and deep, but it is not ubiquitous. James Surowiecki, who popularized the concept, was careful in his book to point out that crowds are wise only in certain, constrained circumstances: for example, when they are diverse, when they allow every voice to be heard, and so forth. Most folks who cite the wisdom of crowds don’t know that, and may end up putting trust in conclusions drawn by crowds that are anything but wise.


Algorithms, likewise, are subject to distortions of their own. Google’s PageRank algorithm is often cited as a prime argument for crowd-based authority. It is useful and hugely successful, to be sure, but it is also the product of decisions made by a select few individuals in total secrecy. Few of the people who simply “Google it” to answer questions understand that.


Making these kinds of cautionary points to algorithmic advocates isn’t easy. Too often, their faith is built on unexamined assumptions that aren’t as firm as they imagine.


Shirky, characteristically, faces these issues head on. In acknowledging the lack of “root authority” for any definition of fact, he cites the old tale of “turtles all the way down”:


A well-known scientist once gave a public lecture on astronomy. He described how the earth orbits around the sun and how the sun, in turn, orbits around the center of a vast collection of stars called our galaxy. At the end of the lecture, a little old lady at the back of the room got up and said: "What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise." The scientist gave a superior smile before replying, "What is the tortoise standing on?" "You're very clever, young man, very clever", said the old lady. "But it's turtles all the way down!"


Later, Shirky amended his conclusion: “I lied before. It’s not turtles all the way down; it's a network of inter-referential turtles.”

Posted via email from edge & flow

Monday, July 26, 2010

Our first fruit harvest


We picked a few plums (on the right) and nectarines this morning, the first significant fruit of the season for us. We lost the cherries to the birds, and most of the other fruit isn't ready.
The trees are heavily fruited this season and we'll have a lot more than in recent years if it all comes to harvest: many more plums, peaches, pears, apples. Figs and jujube look good too (just one tree each).

Sunday, July 25, 2010

Summertime at Redwing

After an unusually cool spring and early summer, we have some heat at Redwing — not often the triple digits we see this time of year, but lots of bright, hot, mid-90s days.

Most of the garden crops and fruit trees look good, especially melons, tomatoes, plums and nectarines.

Posted via email from Wine country crops

Saturday, July 24, 2010

Meaning must be made, not discovered

He knew that there could be no meaning to someone who was dead. Meaning came out of living. Meaning could come only from his choices and actions. Meaning was made, not discovered ... The things he’d wanted before — power, prestige — now seemed empty, and their pursuit endless. What he did and thought in the present would give him the answer, so he did not look for answers in the past or future. Painful events would always be painful. The dead are dead, forever.


Karl Marlantes

Matterhorn (Chapter 23)

Posted via email from edge & flow

Thursday, July 08, 2010

Worth remembering

The rules of conduct, the maxims of actions, and the tactical instincts that serve to gain small victories may always be expanded into the winning of great ones with suitable opportunity; because in human affairs the sources of success are ever to be found in the fountains of quick resolve and swift stroke; and it seems to be a law inflexible and inexorable that he who will not risk cannot win.


—John Paul Jones

Posted via email from edge & flow

Sunday, May 30, 2010

Thoughts about 'writing for the web'

Believe it or not, I’ve never really thought specifically about writing for the web.
I think that’s because I don’t believe there’s really any difference between writing online and writing elsewhere.
This is not to say that all online writing follows the same pattern or that analog writing is perfectly adaptable to the online world. Tweets (of which I have now posted 1,739) and blogs (I have several, and have been doing it since 2002) are obvious examples.
Writers have always tuned their voice to the medium in question. News articles aren’t magazine articles aren’t novels. Television news isn’t television comedy. Movies aren’t theater. This, I think, is the key insight in understanding how to communicate best in whatever forum.

Thus, Rule One: Know thy forum.
You can’t write successfully online if you aren’t immersed online. You learn blog conventions by reading blogs. You learn Twitter by tweeting (and following good tweeters).
If you haven’t spent hours on end grazing across the web, you aren’t going to understand what it is that causes a reader to give up and click elsewhere. You won’t realize that if your blog looks the same every time a reader clicks (even during the same day), she’ll soon look elsewhere. You won’t know that referring readers to choice items from others — retweets, we call that — is an excellent way to keep them following you.

Rule Two: Escape is only a click away.
Online readers have no patience. I read somewhere that an episode of The West Wing had about twice as many words as a television drama 10 years earlier. Newspaper research proves conclusively that a high percentage of readers falls away every time you ask them to turn the page. Cable news crawls and split-screen pictures are evidence of diminished attention span.

Rule Three: The biggest turn-off is an unanswered question.
Patience is tested (more accurately, demolished) by writing that leaves readers confused rather than enlightened. You must know why they’ve come to read your piece, and make sure that impulse is satisfied. Afterward, you may be able to entice them to spend time with other things, but they won’t hang around long enough to find them if their initial itch isn’t scratched.

Rule Four: Reading online rots your brain.
It’s apparently true that spending time online does indeed diminish thoughtfulness, concentration and patience. Nicholas Carr writes in the current issue of Wired, “Even as the Internet grants us easy access to vast amounts of information, it is turning us into shallower thinkers, literally changing the structure of our brain.”
I’m not sure what a writer is supposed to do with this knowledge, but it seems imperative to know it.

Rule Five: These rules don’t actually mean much.
As I suggested in the second paragraph, there really isn’t any singular kind of writing for “the web.” There isn’t even any singular “web” at all. It exists only in the intentions of its users and is as varied and differentiated as they are.
For instance, I suppose the most important writing on a porn site might be the single word MORE, with a hyperlink. On the other hand, substantial journalism can and does flourish online; in the right environment, readers will navigate through a great deal of complex and challenging prose to get to the nuggets inside.
Take a look at longform.org, a site designed to highlight long narrative journalism. It’s particularly tuned to serve the site Instapaper, which makes it easy to archive a long online article for later reading. (It is a particular delight on the iPad, which itself is a delight for reading when compared to a VDT screen).
Nobody knows just yet how profound the iPad effect may be, but it seems likely to me that App-based reading will be vastly different than web-based reading. Apps offer a curated, bounded exposure, and can be tailored to combine typography and  illustration to augment and amplify the words in ways websites (with the ubiquitous exit just a click away) never can. If I click on my “NPR addict” app or my BBC app, I have made a decision to seek a specific information experience, and I will approach what I find there differently.
John McPhee once told me his years of apprenticeship at Time Magazine — where he wrote countless short entertainment articles — was valuable mainly because it taught him that every story has to “stand up, move, and find a place to sit down.” Beginning, middle, end. 
Online writing is not so different, I suggest.

Sunday, April 04, 2010

Salvation? Forget devices and work on 'modeless innovation'


Ethan Kaplan (blackrimglasses) has more precisely said what I’ve been trying to say about the tiresome question, “Will the iPad save [insert medium here]?” He said, “Newspapers ... should think more about modeless innovation and less about atomic device specific innovation. One device does not beget a reinvention of a dying model.”

Modeless innovation. Yes, that’s it.

To me, that means “create something valuable to share.” Screw the delivery media, the “atomic specific devices.” All of them. Instead, learn how to enrich users’ lives. Help promote and serve community. Become indispensable by making some part of their lives better.

When I started using the internet, Gopher was the best information retrieval system around. FTP was a killer app. Email was magic (except none of your friends used it yet).

One day, perhaps soon, the web will seem as antiquated and clumsy as those. The iPad will collect dust. Who knows what the hell the technology will be? I have near-boundless confidence in teenagers and garages. Quantum computing, anyone? Mindcasting?

No matter. Have something valuable to share. Have something worthwhile to say. Contribute something to the common good.

Modeless innovation, baby.

I’ve been thinking hard lately about how virtual goods and social media relate to what we call news. I wrote a little, and hope to write a lot more. But pretty much everything I’ve been dreaming comes under the heading “modeless innovation.”

Friday, April 02, 2010

Have iPad critics fallen into an echo-chambered trap like the news industry?

The self-referential nature of most “future of journalism” discussions is evidence of the very problems they lament. Reading much of the pre-iPad criticism, I’ve realized that’s often true of the open-as-religion crowd, as well.

The iPad is bad, they say, because it’s closed. There are things you can’t do on it that they sometimes want to do. It’s not a “real computer” because you can’t open it up (actually or metaphorically, I presume). Cory Doctorow’s jeremiad is a particularly pointed example, much praised and linked to by CPU curmudgeons.

“If you can’t open it, you don’t own it,” he declares, certainly implying that he doesn’t own a modern car, which no shade-tree mechanic or home hobbyist can open up and tinker with any more. In fact, many of the arguments about the iPad are well mirrored in that analogy.

"Without the open Apple II or Commodore or whatever, I’d have never learned to program," some complain. Well, doubtful. The world saw its last generation of tinkerable autos some time ago, yet today’s cars are safer, more reliable and less environmentally destructive than their predecessors. They’re better for everybody, really, except the guys who want to tinker.

Yes, there are plenty of folks who wish they could still pop the hood and fix things. As the owner of a 1967 Jeepster, I understand. But "closed" automobiles didn't signal the end of mechanics or car designers.

BTW, the iPad isn’t going to take away your Dell or MacBook, is it? Go ahead, program there, all you like. Let a thousand flowers bloom.

The other consistent criticism is that the iPad is designed mainly for consuming content, not creating it. Even if I thought that was fair (I don’t), I’d have to ask, “So what?”

People who protest that it’s simply a beautiful device for reading/viewing/playing prepackaged media must think there’s something wrong with enjoying a beautiful packaged book or song or video.

Cory recalls that he gave away or traded comic books he bought as a kid, and that such fan interaction helped create an appetite and market for comics. I was a comics fan and trader as well, but mentioning that by way of criticizing Marvel because you can’t freely give away its iPad comic is preposterous. As a kid, Cory could give away only the comic books he purchased, and he could give each away once. Online we can (and do) give away hundreds of thousands of individual content units: books, songs, comics, whatever. (This is usually called “sharing” rather than "giving away,” but whatever). I fully understand the digital duplication jinni is long since out of the bottle and I’m not asking to have that debate again. But why insult our intelligence by comparing comix-as-atoms with comix-as-bits?

Cory also sets up a preposterous straw man by defining those who like prepackaged media as one of Wm Gibson’s consumers: “something the size of a baby hippo, the color of a week-old boiled potato, that lives by itself, in the dark, in a double-wide on the outskirts of Topeka. It's covered with eyes and it sweats constantly.”

This is the place where this brand of critics falls most deeply into the same kind of echo-chambered trap as the news industry — by thinking that most people should think and behave like themselves. But most people are not and will not ever become creators of sophisticated media. Instead they’re working in bakeries and insurance offices and having babies and teaching people to play the fiddle.

They don’t want to make a lot of “content” and they don’t need to. They do want to “consume media.”

The People Formerly Known As the Audience — as Jay Rosen brilliantly characterized them — have a lot more clout and choices and opportunity nowadays. Yes, hierarchical relationships like the one between editors and readers have changed forever. That's a good thing.

But you know what? We’re still audiences a lot of the time — by choice. Cory & Co. think that makes us sweaty, colorless and covered with eyes. I don’t.

—30—

Caveat: I have reservations and questions myself about the “iPad ecology” about which Cory and others complain. Reflecting my particular bias, I’m worried about the relationship of news publishers and the iPad application gatekeepers. I’m guessing it will end up working something like a bookstore — owners decide which books to carry, but don’t edit the ones they sell. They can decline pornography or other subjects if they want to, but they will pay a heavy price for every subject they choose to exclude. But I don’t know that, and it’s a central question news companies need to answer.

Friday, March 05, 2010

Yes, most of us are still media consumers, and that's iPad's appeal


John Battelle and my many friends who can’t wait to disparage the iPad as “passive,” “old-school” or proprietary need to take a deep breath and calm down.
None of them had such harsh criticisms about the Kindle, a far more restrictive and decidedly more passive device. Most of them carry netbooks that are underpowered, cramped little boxes with almost no aesthetic appeal. Why are they so bothered about the fact that the iPad will (reportedly) be a gorgeous canvas on which all kinds of media — yes, including interactive media — can be enjoyed?
Yeah, you won’t use the iPad to create the rich media. So what? 
Many of them criticize me every time I mention “consumers.” Yet the fact remains that most people — overwhelmingly so — consume most of their media diet; they are not co-creating much of it at all. The stuff they do create — social media, mainly — will work fine on iPad. 
These critics might consider that a third of the country doesn’t have broadband access. They should look at the statistics that show content creation vs consumption is more lopsided than the standard 80/20 rule of thumb.
Until the next best device comes along, most of us will enjoy having a well-designed, easy-to-use, highly portable device for reading the new enhanced Penguin books or Condé Nast magazines or whatever other wonderful new media will populate the new ecology.

Thursday, February 11, 2010

Why is a Facebook beer worth more than your news story?

As Chris O’Brien asks at the PBS IdeaLab, “Why will people spend $1 to send you a virtual beer on Facebook, but not to read a news story online?”

Good question — and there’s a lot more than a dollar at stake. Americans are spending something like $1.6 billion a year for “virtual goods” — that is, things that exist only in cyberspace, like that Facebook beer, or status upgrades in a game — but we’re told they won’t spend squat on news. What’s up with that?

If you’re a journalist, your first impulse might be to ask “What’s wrong with them?” But a far more useful question is to ask “What’s wrong with us?”

There’s only one reasonable explanation: people spend discretionary money on the things that matter most to them. If it turns out that buying intangibles to enhance experience in a virtual (non-physical) world is worth more than consuming another isolated, incremental news fact, that’s where their dollars will flow.

These distinctions aren’t all black-and-white, of course. (Honestly, these days, what is?) Some news consumption is related to the immersive, satisfying virtual experiences people pay for; by and large, organizations that provide some of that experience — not just a collection of individual factoids written in a peculiar news dialect — tend to be doing better than those that don’t.

But news-as-social-community happens by accident nowadays. What would happen if a news organization set out to make its product immersive and satisfying on purpose?

Writing in TechCrunch, Susan Wu said, “Virtual objects aren’t really objects – they are graphical metaphors for packaging up behaviors that people are already engaging in.” That sounds like something that could very well apply to an online community defined by common interest in civic affairs, doesn’t it?

Sharing and caring about news is an inherently social activity.  “Everybody who is interested in Ahmadinejad” or “People worried about a property tax increase” certainly comprise communities. The problem is, news organizations don’t treat them like communities — don’t feed and nurture and satisfy them — and so they fragment and drift apart. Much of their value drifts away with them.

Why would somebody spend real money on a virtual rose or make-believe beer on Facebook? For the reason Susan Wu gave: because it’s a graphical metaphor. It stands for something, and it’s part of an integrated system that rewards participation.

An individual news story is itself a virtual good. What’s missing is the community environment in which it is recognized as valuable, an ecology where caring about the news becomes satisfying and rewarding social behavior. Instead of becoming an integral part of a social community experience, consuming news stories remains an isolated individual act.

When somebody creates a social ecology around news, I’m willing to bet they’ll also create a place where the virtual goods we know as “news stories” become valuable for their creators.

Saturday, January 30, 2010

iPad will help us most when it disappears

Both in anticipation of and reaction to the Apple iPad, people playing the Future of News Game tended toward superlatives. It would save traditional models, some said, by making Plain Old Newsprint pretty and shiny and worth charging for. Others looked at the missing Flash plugin and multitasking capability and dismissed the device as irrelevant.

They’re both wrong—and the truth isn’t somewhere halfway in-between them, either.

Here’s the most important thing about the iPad: it can be one of the biggest steps yet toward taking the technology out of our way and letting human beings get on with communicating, creating and consuming news. In much the same way the desktop metaphor and mouse made computer power more accessible than the command line, iPad’s touchscreen, instant-on availability, intuitive interface and extreme portability promise still greater opportunity.

If the Macintosh was “the computer for the rest of us” (and it was), maybe the iPad will be “networks for the rest of us.” If it’s easy, intuitive and relatively cheap to experience constantly updated Facebook and Twitter and the New York Times on a bright, colorful screen, doesn’t it make sense that more people will do so?

The technoids who instantly set upon the iPad for what’s missing — Flash, total multitasking, no camera, no SD slot, yada, yada — don’t get it. Apple didn’t build the iPad for them (although I’ll bet most will end up owning one). They built it for the people who love it when technology “just works.” (It’s also illuminating to see what these critics had to say about the iPhone in version 1.0; they look silly now. By the time iPad cycles through a few software and firmware updates, today’s arguments will be even more hollow.)

It’s also obvious that expecting a miracle cure for what ails newspapers and magazines is deeply stupid. The fact that the iPad’s roughly the shape of a published page, or that it will be used primarily by holding it in your hands, doesn’t offer any new hope for content created by hierarchical, top-down newsrooms that haven’t figured out consumers are in control. People will get news about subjects they want, when they want it—and many will be creating it, as well. What the iPad’s likely to mean for them is that they’ll get what they want more easily and consume it more pleasurably—but it will be what they value, not what a gatekeeper decides to give them.

Here’s what I think—and devoutly hope—will happen: the iPad (and even better devices sure to follow) will enrich human beings by removing technological barriers.

For all their failings, newspapers were equally accessible to everybody who could read: cheap, portable, intuitive, ubiquitous. Poor boys had about the same chance as bankers to keep up with the news. Good newspapers worked to shape content to meet a wide range of interests—football scores and shipping schedules and how-they-voted charts—because they knew a lot of different people would be looking through the window those pages opened.

Alan Kay, the computer visionary who famously declared Macintosh “the first computer worth criticizing,” hasn’t weighed in directly on the iPad as far as I know. It’s hard to imagine that he won’t see it as a significant realization of his Dynabook dream, a tool that makes information and communication ubiquitous and makes devices disappear.

In the middle 1980s, Kay visited Alaska for a lecture and was interviewed in the Anchorage Daily News, articulating intoxicating ideas that helped awaken me to the brewing information revolution. He was careful even then to caution against focusing too much on devices. “The music’s not in the piano,” he said. “If it was, we’d have to let it vote.”

When iPads start arriving two months from now, we’ll be a lot closer to realizing his long-time vision.

The device is becoming as simple as a newspaper—and infinitely more capable. It’s now up to producers to be sure what they offer thrives in a world where accessing their work (or a competitor’s) is as easy as picking up a book, or the newspaper.
 