Thursday, December 23, 2010
Bruce Sterling in 1992
From a speech by Bruce Sterling
The Library Information Technology Association
June 1992 • San Francisco CA
“What’s information really about?
“It seems to me there’s something direly wrong with the ‘Information Economy.’ It’s not about data, it’s about attention.
“In a few years you may be able to carry the Library of Congress around in your hip pocket. So? You’re never gonna read the Library of Congress. You’ll die long before you access one tenth of one percent of it.
“What’s important—increasingly important—is the process by which you figure out what to look at. This is the beginning of the real and true economics of information. Not who owns the books, who prints the books, who has the holdings.
“The crux here is access, not holdings. And not even access itself, but the signposts that tell you what to access—what to pay attention to.
“In the Information Economy everything is plentiful—except attention.”
Tuesday, November 23, 2010
Rare, great and philosophical qualities ...
Wednesday, September 22, 2010
‘Objectivity’ vs POV: Sorry, it’s more complicated than you think
Monday, September 13, 2010
Two simple tools to make online reading a pleasure
Tuesday, September 07, 2010
'Thank God for Title IX' — my American River Adventure
Tuesday, August 31, 2010
News vs coffee: It’s still about value, not price
... at the end of WWII a cup of coffee and a newspaper each cost about the same -- let’s say 10 or 15 cents; today a cup of coffee can fetch $3.00, while a newspaper at most costs 50 cents. The essential difference is that the coffee sellers learned to give their customers choices -- you can go to any espresso stand in any airport in America and order a double tall decaf skinny latte. They added value to their basic product.
Wednesday, August 25, 2010
Like books of old, today's web is in its swaddling clothes
Excerpt from an essay I wrote in 1999 about — you guessed it — "the future of newspapers":
"While we know the impacts brought by new digital media will be profound, none among us is able to divine their precise shape. The pace of change in the networked era is such that we are denied the luxury of extended study and careful reflection. The product development strategy of the digital age, it is said, is: Ready, Fire, Aim.
"Our uncertainty should come as no surprise. We know from history that it took more than 50 years for Gutenberg's invention of moveable type to result in the creation of anything that would be recognized today as a book. After Gutenberg, somebody else had to discover the form that best took advantage of his technology -- things like legible typefaces, numbered pages arranged in chapters, hard covers to bind the work together coherently in a convenient, portable size. Indeed, books printed between the invention of Gutenberg's press in 1455 and about 1501 are known to collectors today as incunabula -- taken from the Latin for 'swaddling clothes, indicative of a technology in its infancy.
"Similarly, the invention of moving pictures in the 1890s did not immediately result in what we know today as movies. Here, too, was a technology in search of a format. Motion pictures initially were simply films of stage plays. It took time to discover the elements of cinema we all take for granted at the movies today: close-ups, flashbacks, shifting focus and so forth.
"The parallel between incunabula books and those early moving pictures and what is happening today on the World Wide Web is inescapable. The technology has been discovered; we are searching for the format."
Monday, August 23, 2010
Turtles all the way down
I gave up on ‘truth’ a long time ago,
but I’m having a hard time letting go of facts
I know folks have been fighting about “What is truth?” since way before Aristotle, and I’m comfortable leaving that debate to poets and philosophers. Obviously, it’s a topic way above my pay grade.
Until recently, however, I still placed a lot of faith in facts.
I cautioned journalists over several decades to remember that “truth is a plural noun ... there’s a lot of truth out there.” Instead of concentrating on somebody’s protean definition of truth, we tried to focus on things we could measure: accuracy, fairness, accountability, documentation.
Lately I’ve come to a deeper realization of how the nature of fact and authority has fundamentally changed. I haven’t yet come to grips with what this means.
In the old days (five years ago) one poster child for this debate was Wikipedia vs Britannica. If a crowd-sourced, infinitely editable, volunteer encyclopedia could trump the venerable text with all its credentialed authority, we’d know some kind of milestone had been reached.
In many ways, the battle was already over by then. Wikipedia was quickly craigslisting Britannica’s business model and seemed sure to outlast it. We were losing Britannica, so the question was mainly about what that was going to mean. Was Wikipedia really as good?
Web triumphalists were certain it was. To confirm that, they delighted in citing a study published in the journal Nature, usually asserting that it had found Wikipedia was as good. Inconveniently, the study didn’t actually say that.
Nature’s press release was more qualified than that, saying that “Wikipedia comes close” to matching Britannica’s accuracy. Even that, it turned out, was considerable overstatement.
What the study actually found was that Wikipedia had 100 errors for every 75 in Britannica. The actual numbers were that the Britannica articles studied averaged 2.9 errors apiece while Wikipedia’s averaged 3.9. Now, I was schooled on newsroom math, but my calculation says that four errors instead of three per article amounts to about a third more errors. That’s hardly trivial.
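Don’t take my word for it: the whole calculation fits in a few lines of Python, using the per-article averages the study reported.

    # Per-article error averages from the Nature study, as cited above
    britannica = 2.9
    wikipedia = 3.9

    extra = (wikipedia - britannica) / britannica
    print(f"Wikipedia averaged {extra:.0%} more errors per article")
    # prints: Wikipedia averaged 34% more errors per article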
I made that point in the comments section every time I saw somebody cite the study as “proof” of Wikipedia’s success. Almost nobody even acknowledged the challenge, much less bothered to consider it.
I was frustrated by that at the time. This was arithmetic, damn it. How could people just ignore it?
I understand better now. The real debate was over the nature of facts, and I was trying to assert one particular kind of fact (math) to make my point — but I was already too late. The post-modern fact train had already left the station. If enough people wanted to think Wikipedia was equivalent, nothing as small as mere arithmetic was going to change it.
Now I realize that’s okay. I'd still argue some things require precision and hard facts — chemical reactions come to mind, or surgical procedures — but for many debates these days, “fact” is itself debatable.
At one level this only recognizes a reality we long ignored. Few of the “facts” that were the foundation of my education were absolute. History books are written by fallible, often prejudiced people. Scientific theories, once memorized, will summarily be eclipsed by new data or another discovery. The scholars upon whose authority Britannica was founded knew this, and in the academy they’d argue endlessly about the stuff that later became typographic fact in their encyclopedia.
At some level, the experts in any field are telling us we weren’t smart enough to follow all the details, so we’d have to take their word for it.
And of course that’s largely true. You and I are never going to know enough about the history of Islam or the probability of Iran making a nuke to render reliable independent judgments. We have to rely on expert knowledge, and that means trusting somebody.
That trust is shifting. The catchphrase today is “algorithmic authority,” and the most illuminating and articulate spokesman I’ve encountered is Clay Shirky. It’s a gross over-simplification to put it this way, but you can think of this as a shift toward “trusting everybody.”
This turns Thoreau and Andrew Jackson and a lot of Enlightenment philosophy upside down, of course. (Thoreau: “Any man more right than his neighbors constitutes a majority of one.” Jackson: “One man with courage is a majority.”) Philosophers and polemicists will continue to insist they’re right — that there is a singular truth out there somewhere, independent of what our collective wisdom says about the subject. But in practical terms, this game is over.
News media and politics provide clear examples. It no longer matters if I think there is documentary evidence of Obama’s birth. Huge percentages of American voters think otherwise — for whatever reason — and they will act on what they believe. “Authority is as authority does,” Shirky has noted, and we must now learn to deal with the consequences.
The “wisdom of crowds” is both broad and deep, but it is not ubiquitous. James Surowiecki, who popularized the concept, was careful in his book to point out that crowds are wise only in certain, constrained circumstances: for example, when they are diverse, when they allow every voice to be heard, and so forth. Most folks who cite the wisdom of crowds don’t know that, and may end up putting trust in conclusions drawn by crowds that are anything but wise.
Algorithms, likewise, are subject to distortions of their own. Google’s “page rank” algorithm is often cited as a prime argument for crowd-based authority. It is useful and hugely successful, to be sure, but it is also the product of decisions made by a select few individuals in total secrecy. Few of the people who simply “Google it” to answer questions understand that.
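For the curious: the core idea, as published in Brin and Page’s original paper, is simple enough to sketch in a few lines of Python. What Google actually runs today is vastly more elaborate, and undisclosed, so treat this as a toy illustration, not Google’s code. Even here, notice how many human judgment calls are baked in: the damping factor, the iteration count, what to do with pages that link to nothing.

    # A toy sketch of the published PageRank idea: a page's score is a
    # weighted vote from the pages that link to it. Not Google's code.
    def pagerank(links, d=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - d) / n for p in pages}
            for page, outlinks in links.items():
                targets = outlinks or pages  # a dangling page votes for everybody
                share = d * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            rank = new_rank
        return rank

    web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
    print(pagerank(web))  # C, the most linked-to page, scores highest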
Making these kinds of cautionary points to algorithmic advocates isn’t easy. Too often, their faith is built on unexamined assumptions that aren’t as firm as they imagine.
Shirky, characteristically, faces these issues head on. In acknowledging the lack of “root authority” for any definition of fact, he cites the old tale of “turtles all the way down”:
A well-known scientist once gave a public lecture on astronomy. He described how the earth orbits around the sun and how the sun, in turn, orbits around the center of a vast collection of stars called our galaxy. At the end of the lecture, a little old lady at the back of the room got up and said: "What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise." The scientist gave a superior smile before replying, "What is the tortoise standing on?" "You're very clever, young man, very clever," said the old lady. "But it's turtles all the way down!"
Later, Shirky amended his conclusion: “I lied before. It’s not turtles all the way down; it's a network of inter-referential turtles.”
Monday, July 26, 2010
Our first fruit harvest
We picked a few plums (on the right) and nectarines this morning, the first significant fruit of the season for us. We lost the cherries to the birds, and most of the other fruit isn't ready.
Sunday, July 25, 2010
Summertime at Redwing
Saturday, July 24, 2010
Meaning must be made, not discovered
He knew that there could be no meaning to someone who was dead. Meaning came out of living. Meaning could come only from his choices and actions. Meaning was made, not discovered ... The things he’d wanted before — power, prestige — now seemed empty, and their pursuit endless. What he did and thought in the present would give him the answer, so he did not look for answers in the past or future. Painful events would always be painful. The dead are dead, forever.
Karl Marlantes
Matterhorn (Chapter 23)
Thursday, July 08, 2010
Worth remembering
The rules of conduct, the maxims of actions, and the tactical instincts that serve to gain small victories may always be expanded into the winning of great ones with suitable opportunity; because in human affairs the sources of success are ever to be found in the fountains of quick resolve and swift stroke; and it seems to be a law inflexible and inexorable that he who will not risk cannot win.
—John Paul Jones
Sunday, May 30, 2010
Thoughts about 'writing for the web'
Sunday, April 04, 2010
Salvation? Forget devices and work on 'modeless innovation'
Friday, April 02, 2010
Have iPad critics fallen into an echo-chambered trap like the news industry?
The iPad is bad, they say, because it’s closed. There are things you can’t do on it that they sometimes want to do. It’s not a “real computer” because you can’t open it up (actually or metaphorically, I presume). Cory Doctorow’s jeremiad is a particularly pointed example, much praised and linked to by CPU curmudgeons.
“If you can’t open it, you don’t own it,” he declares, certainly implying that he doesn’t own a modern car, which no shade-tree mechanic or home hobbyist can open up and tinker with any more. In fact, many of the arguments about the iPad are well mirrored in that analogy.
"Without the open Apple II or Commodore or whatever, I’d have never learned to program," some complain. Well, doubtful. The world saw its last generation of tinkerable autos some time ago, yet today’s cars are safer, more reliable and less environmentally destructive than their predecessors. They’re better for everybody, really, except the guys who want to tinker.
Yes, there are plenty of folks who wish they could still pop the hood and fix things. As the owner of a 1967 Jeepster, I understand. But "closed" automobiles didn't signal the end of mechanics or car designers.
BTW, the iPad isn’t going to take away your Dell or MacBook, is it? Go ahead, program there, all you like. Let a thousand flowers bloom.
The other consistent criticism is that the iPad is designed mainly for consuming content, not creating it. Even if I thought that was fair (I don’t), I’d have to ask, “So what?”
People who protest that it’s simply a beautiful device for reading/viewing/playing prepackaged media must think there’s something wrong with enjoying a beautiful packaged book or song or video.
Cory recalls that he gave away or traded comic books he bought as a kid, and that such fan interaction helped create an appetite and market for comics. I was a comics fan and trader as well, but mentioning that by way of criticizing Marvel because you can’t freely give away its iPad comic is preposterous. As a kid, Cory could give away only the comic books he purchased, and he could give each away once. Online we can (and do) give away hundreds of thousands of individual content units: books, songs, comics, whatever. (This is usually called “sharing” rather than “giving away,” but whatever.) I fully understand the digital duplication jinni is long since out of the bottle and I’m not asking to have that debate again. But why insult our intelligence by comparing comix-as-atoms with comix-as-bits?
Cory also sets up a preposterous straw man by defining those who like prepackaged media as one of Wm Gibson’s consumers: “something the size of a baby hippo, the color of a week-old boiled potato, that lives by itself, in the dark, in a double-wide on the outskirts of Topeka. It's covered with eyes and it sweats constantly.”
This is where this brand of critic falls most deeply into the same kind of echo-chambered trap as the news industry — thinking that most people should think and behave like them. But most people are not and will not ever become creators of sophisticated media. Instead they’re working in bakeries and insurance offices and having babies and teaching people to play the fiddle.
They don’t want to make a lot of “content” and they don’t need to. They do want to “consume media.”
The People Formerly Known As the Audience — as Jay Rosen brilliantly characterized them — have a lot more clout and choices and opportunity nowadays. Yes, hierarchical relationships like the one between editors and readers have changed forever. That's a good thing.
But you know what? We’re still audiences a lot of the time — by choice. Cory & Co. think that makes us sweaty, colorless and covered with eyes. I don’t.
—30—
Caveat: I have reservations and questions myself about the “iPad ecology” about which Cory and others complain. Reflecting my particular bias, I’m worried about the relationship of news publishers and the iPad application gatekeepers. I’m guessing it will end up working something like a bookstore — owners decide which books to carry, but don’t edit the ones they sell. They can decline pornography if they want to, or other subjects, but they will pay a heavy price for every subject they choose to exclude. But I don’t know that, and it’s a central question news companies need to answer.
Friday, March 05, 2010
Yes, most of us are still media consumers, and that's iPad's appeal
Thursday, February 11, 2010
Why is a Facebook beer worth more than your news story?
Good question — and there’s a lot more than a dollar at stake. Americans are spending something like $1.6 billion a year for “virtual goods” — that is, things that exist only in cyberspace, like that Facebook beer, or status upgrades in a game — but we’re told they won’t spend squat on news. What’s up with that?
If you’re a journalist, your first impulse might be to ask “What’s wrong with them?” But a far more useful question is to ask “What’s wrong with us?”
There’s only one reasonable explanation: people spend discretionary money on the things that matter most to them. If it turns out that buying intangibles to enhance experience in a virtual (non-physical) world is worth more than consuming another isolated, incremental news fact, that’s where their dollars will flow.
These distinctions aren’t all black-and-white, of course. (Honestly, these days, what is?) Some news consumption is related to the immersive, satisfying virtual experiences people pay for; by and large, organizations that provide some of that experience — not just a collection of individual factoids written in a peculiar news dialect — tend to be doing better than those that don’t.
But news-as-social-community happens by accident nowadays. What would happen if a news organization set out to make its product immersive and satisfying on purpose?
Writing in TechCrunch, Susan Wu said, “Virtual objects aren’t really objects – they are graphical metaphors for packaging up behaviors that people are already engaging in.” That sounds like something that could very well apply to an online community defined by common interest in civic affairs, doesn’t it?
Sharing and caring about news is an inherently social activity. “Everybody who is interested in Ahmadinejad” or “People worried about a property tax increase” certainly comprise communities. The problem is, news organizations don’t treat them like communities — don’t feed and nurture and satisfy them — and so they fragment and drift apart. Much of their value drifts away with them.
Why would somebody spend real money on a virtual rose or make-believe beer on Facebook? What Susan Wu said: because it’s a graphical metaphor: it stands for something, it’s part of an integrated system that rewards participation.
An individual news story is itself a virtual good. What’s missing is the community environment in which it is recognized as valuable, an ecology where caring about the news becomes satisfying and rewarding social behavior. Instead of becoming an integral part of a social community experience, consuming news stories remains an isolated individual act.
When somebody creates a social ecology around news, I’m willing to bet they’ll also create a place where the virtual goods we know as “news stories” become valuable for their creators.
Saturday, January 30, 2010
iPad will help us most when it disappears
The iPad’s champions and its critics are both wrong—and the truth isn’t somewhere halfway in between, either.
Here’s the most important thing about the iPad: it can be one of the biggest steps yet toward taking the technology out of our way and letting human beings get on with communicating, creating and consuming news. In much the same way the desktop metaphor and mouse made computer power more accessible than the command line, iPad’s touchscreen, instant-on availability, intuitive interface and extreme portability promise still greater opportunity.
If the Macintosh was “the computer for the rest of us” (and it was), maybe the iPad will be “networks for the rest of us.” If it’s easy, intuitive and relatively cheap to experience constantly updated Facebook and Twitter and the New York Times on a bright, colorful screen, doesn’t it make sense that more people will do so?
The technoids who instantly set upon the iPad for what’s missing — Flash, total multitasking, no camera, no SD slot, yada, yada — don’t get it. Apple didn’t build the iPad for them (although I’ll bet most will end up owning one). They built it for the people who love it when technology “just works.” (It’s also illuminating to see what these critics had to say about the iPhone in version 1.0; they look silly now. By the time iPad cycles through a few software and firmware updates, today’s arguments will be even more hollow.)
It’s also obvious that expecting a miracle cure for what ails newspapers and magazines is deeply stupid. The fact that the iPad’s roughly the shape of a published page, or that it will be used primarily by holding it in your hands, doesn’t offer any new hope for content created by hierarchical, top-down newsrooms that haven’t figured out consumers are in control. People will get news about subjects they want, when they want it—and many will be creating it, as well. What the iPad’s likely to mean for them is that they’ll get what they want more easily and consume it more pleasurably—but it will be what they value, not what a gatekeeper decides to give them.
Here’s what I think—and devoutly hope—will happen: the iPad (and even better devices sure to follow) will enrich human beings by removing technological barriers.
For all their failings, newspapers were equally accessible to everybody who could read: cheap, portable, intuitive, ubiquitous. Poor boys had about the same chance as bankers to keep up with the news. Good newspapers worked to shape content to meet a wide range of interests—football scores and shipping schedules and how-they-voted charts—because they knew a lot of different people would be looking through the window those pages opened.
Alan Kay, the computer visionary who famously declared Macintosh “the first computer worth criticizing,” hasn’t weighed in directly on the iPad as far as I know. It’s hard to imagine that he won’t see it as a significant realization of his Dynabook dream, a tool that makes information and communication ubiquitous and makes devices disappear.
In the mid-1980s, Kay visited Alaska for a lecture and was interviewed in the Anchorage Daily News, articulating intoxicating ideas that helped awaken me to the brewing information revolution. He was careful even then to caution against focusing too much on devices. “The music’s not in the piano,” he said. “If it was, we’d have to let it vote.”
When iPads start arriving two months from now, we’ll be a lot closer to realizing his long-time vision.
The device is becoming as simple as a newspaper—and infinitely more capable. It’s now up to producers to be sure what they offer thrives in a world where accessing their work (or a competitor’s) is as easy as picking up a book, or the newspaper.