This post was originally published in The Photon Fantastic on May 5th, 2010. It's the companion piece to The Relentless Commoditization of Content. It appears here in slightly modified form.
In May, 2010, I traveled to Tofino, a small fishing and whale watching town on the west coast of Vancouver Island. Along the way, I took photographs of sea and mountains and forests, and had a lot of time to chat with my good friend and guide, Kirk, about the business of photography and writing. (All of the photographs shown in this post were taken on the trip.) When I returned to Vancouver and an internet connection, I was particularly struck by a poignant blog post by late film critic Roger Ebert. He wrote:
This is a golden age for film criticism. Never before have more critics written more or better words for more readers about more films … Twenty years ago a good-sized city might have contained a dozen people making a living from writing about films, and for half of them the salary might have been adequate to raise a family. Today that city might contain hundreds, although … not more than one or two are making a living. Film criticism is still a profession, but it’s no longer an occupation. You can’t make any money at it.
Now go back and read that again, but this time, replace every instance of the word “film” with “photography.” The statistics Ebert guesses at probably aren’t the same for the two pursuits — I surmise that fewer people have made a living writing about still photography than about moving pictures — but the general argument holds for both: the internet has ushered in a great democratization of information generation — not just about film and photography, but about every conceivable topic. Anyone with an interest in a subject and access to the internet can use free tools to publish free information for a potentially enormous audience.
Wikipedia defines a commodity as a good for which there is demand, but which is supplied without qualitative differentiation across a market. Typical commodities are products like copper and wheat. Because a commodity is the same no matter who produces it, it is “fungible,” or interchangeable with another product of the same type. To a first approximation, copper is copper is copper, no matter who refined it; and wheat is wheat is wheat, no matter who grew it. Therefore, the price of a commodity is insensitive to brand — it’s based solely on supply and demand: the higher the demand, or lower the supply, the higher the price.
If you squint a little, there isn’t much difference between copper and text on a computer screen, or between wheat and images on a web page. The internet has made information — words and pictures — as much a commodity as copper, or wheat. There are huge numbers of people writing on an enormous variety of topics — so much so that even the best, expert writers are largely undifferentiated and therefore fungible. In the context of film criticism, it won’t matter much whose reviews you read, even if you are picky about quality: the web ensures that you have a large number of excellent, reliable, and entertaining sources, all of which are equally satisfactory.
In economic terms, information on the web has been “commoditized,” a word used to describe the transformation of a market from one in which products are differentiated into one in which they are undifferentiated through increased competition. This typically results in lower prices. It is for this reason that companies are always adding new features to their products or services. Only by differentiating its goods from its competitors’ can a company lift them out of the morass of commoditization and therefore command a higher price. Once a feature has been adopted across the board by all manufacturers, it ceases to differentiate one brand from another, and the financial benefit brought by that feature drops towards zero. (This doesn’t mean that manufacturers can cease adding the feature; it means that the feature has become essential and taken for granted by the consumer. The manufacturer would do well to find ways to decrease the cost of the feature commensurate with its declining value.)
In practical terms, the price of most information on the internet has already been driven to zero by commoditization. This has caused a lot of pain for a lot of people who formerly relied for their livelihoods on access to privileged information. Consider, for example, what the internet has done to middle men like travel agents or stock brokers. In Canada, real estate agents are now also in the crosshairs (but are fighting back vigorously). Writers, the people who provide the very stuff that makes the web valuable, have themselves become valueless at the individual level. (And so have photographers.)
To explain this from a different perspective, we turn to Joel Spolsky, a software developer based in NYC. He writes:
Every product in the marketplace has substitutes and complements. A substitute is another product you might buy if the first product is too expensive. Chicken is a substitute for beef. If you’re a chicken farmer and the price of beef goes up, the people will want more chicken, and you will sell more.
A complement is a product that you usually buy together with another product. Gas and cars are complements. Computer hardware is a classic complement of computer operating systems. And babysitters are a complement of dinner at fine restaurants. In a small town, when the local five star restaurant has a two-for-one Valentine’s day special, the local babysitters double their rates.
All else being equal, demand for a product increases when the prices of its complements decrease.
Free content is the fuel (complement) that makes the web valuable. In order to provide free content, websites will do everything possible to commoditize the writers, photographers, videographers, musicians, and animators who provide this content. Why do you think that the idea of user-generated content caused such a stir among mainstream media in 2005? Was it because the BBC and Fox News embraced the rise of the “citizen journalist” as a good idea in principle? Or was it because it made cold, hard economic sense to shift from the expensive process of creating new and original content to providing facilities for amateurs to publish their own? (Think of the difference between scripted drama and reality TV — which is less expensive and more prevalent?) Social networking sites take user-generated content to the extreme. In March 2010, Facebook surpassed Google for the first time as the most visited site in the US. Approximately 200 million of its 400 million account holders log in to Facebook each day to spend an average of one hour consuming content that was created not at great expense by the proprietors of the site, but for free by its end users.
And let’s not forget Facebook’s terms of service, which includes this rather questionable bit about the content you made and received no payment for: “For content that is covered by intellectual property rights, like photos and videos (‘IP content’), you specifically give us the following permission, subject to your privacy and application settings: you grant us a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Facebook (‘IP License’). This IP License ends when you delete your IP content or your account unless your content has been shared with others, and they have not deleted it.” (1)
These generalities hold true because the typical entrepreneur is spurred into action by a low barrier to entry. If you like, cheap set-up and going into business for yourself are complements: the lower the cost of set-up, the greater the demand for entrepreneurship. Why else do you think this world is replete with massage therapists, hairdressers, and wedding photographers?
And bloggers like me. Let me repeat: Anyone with an interest in a subject and access to the internet can use free tools to publish free information for a potentially enormous audience. Which is why there are so many bloggers “in business.” Please note, however, that although blogs may be free to read, they are not free to write. There is an enormous opportunity cost to writing a great blog on a daily basis. Roger Ebert tells the story of James Berardinelli, who is “among the half dozen most-read critics in the world.” Berardinelli holds a day job as an engineer because his site, reelviews.net, popular though it is, does not support him. “The studios and other industry advertisers don’t give a damn about film criticism, preferring to direct most of their online ad budgets to celebrity and gossip sites.” Berardinelli told Ebert that “his Amazon resale commissions helped to offset his out-of-pocket costs,” nothing more. The prognosis is not good for those of us wanting to turn blogging into a living. If Facebook, with 200 million visits daily, has trouble coming up with a reasonable business model (2), what chance do we have?
For writers of any persuasion, myself included, now is a precarious moment. It may indeed be, as Roger Ebert contended, the golden age of film criticism, but it is far from the golden age for the critic herself.
(1) These terms were current in May, 2010. I have not looked at them since.
(2) As of May, 2010. It appears now that Facebook does indeed have a viable and very profitable business model, one predicated on your personal information!
This piece was originally published in The Photon Fantastic on June 10, 2010. It appears here in slightly modified form.
Remember the glorious days when record company executives and their rock bands du jour sailed languidly around the Mediterranean in multimillion dollar yachts surrounded by supple, sunbathing, Scandinavian supermodels? Don’t you long to return to the time when CDs cost upwards of $20 — $2 or more per track — that rose-tinted time when Lars Ulrich’s eyes bugged out more than usual as he flew into hysterics over Napster on the Charlie Rose show? Didn’t you love paying for an entire album when all you wanted were two tracks, which made it effectively $10 per song? Ah! The good old days! How fondly we remember them. And we have a personal computer company to thank for taking them away. Who would have predicted that?
Apple singlehandedly reinvented the music business by embedding in the popular consciousness the idea that songs should cost only 99¢, that using simple software you should be able to select single tracks from an album to download, on-demand, from anywhere in the world, and that you should be able to carry “1,000 songs in your pocket.” It was an idea that had a powerful and instant appeal, and almost overnight the former iTunes Music Store (later renamed the iTunes Store) became the most influential force in music retail, and, within a few years, the largest. What was Apple’s motivation for doing this? Was it to make money from music sales? Was it to corner the world market in supple and suntanned Scandinavian supermodels?
Before we address those questions, let’s take a seat in the front row of Apple’s 2010 Worldwide Developers Conference. During his keynote presentation, Steve Jobs boasted that the App Store had served 5 billion downloads. That’s a big number, and certainly worth bragging about. It should make a software developer’s eyes turn to white saucers with big fat dollar signs on them and play ka-ching sounds in her ears. With the next slide, Jobs proudly proclaimed that Apple had paid developers $1 billion in revenue from their apps (i.e., 70% of the take — Apple keeps the other 30%). Again, it sounds like a big number — until we stop to consider that it amounts to only 20¢ per download. Time to put away those saucery eyes with big fat dollar signs on them, and to consider the implications of writing software for the iPhone.
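The keynote arithmetic is worth checking for yourself. Here is a back-of-the-envelope sketch (the figures and the 70/30 split are from the keynote; the variable names are mine):

```python
# Figures from the WWDC 2010 keynote
downloads = 5_000_000_000            # total App Store downloads served
paid_to_developers = 1_000_000_000   # dollars paid out: the developers' 70% share

# Average revenue reaching a developer per download
per_download = paid_to_developers / downloads
print(per_download)  # 0.2 dollars, i.e., 20 cents

# Average gross sale price per download, since the payout is 70% of gross.
# Note this average includes the many downloads that were free apps.
gross_per_download = paid_to_developers / 0.70 / downloads
print(round(gross_per_download, 2))  # about 0.29 dollars
```

In other words, even before Apple's cut, the average download grossed well under a third of the canonical 99¢ price — which is the point of the paragraph above.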
Later in the presentation, Jobs unveiled iAds, Apple’s venture into the advertising business. iAds are to iOS apps what Google ads are to websites, only with the promise of being more engaging and less annoying. This raises another question: What the heck is Apple doing in the advertising business?
I’ve asked a lot of questions. Now it’s time to have a go at answering them. When Steve introduced iAds, his rationale was, and I quote: “We’re doing it for one simple reason. To help our developers earn money so they continue to create free and low cost apps for users.” Before taking this explanation at face value, let me say that it’s rather slippery of Steve to market iAds this way. It’s slippery because, although it sounds very noble and altruistic, the statement is only half complete.
Way back, near the beginning of time as far as the personal computer is concerned, Apple was rightly convinced that the reason everyday people like you and me bought computers, no matter how pretty they were, was actually to do things with them. And to do things with computers, you needed software. Software made it possible for you to fill in your taxes, or write a letter to your grandma, or maintain a webcam affair with your dominatrix in Tokyo. Without software to run it and make it useful, there would be no reason to buy a computer. To make this idea concrete, Apple’s then-head of software, Jim Hoyt, commissioned the Star Wars-inspired poster “Software Sells Systems” in 1979.
Every product has a complement, which is a thing that you typically buy along with it. Products and their complements maintain an economic see-saw relationship. Cameras and film are complements: when the price of film drops, demand for cameras increases, which is part of the reason digital cameras are so popular these days; the cost of “film” has been driven to zero. Likewise, software and hardware are complements; when the price of software falls, demand for hardware to run that software increases.
Despite countless red herring articles and much pointless heavy breathing about the rivalry between it and software giant Microsoft, the thing to remember about Apple is that it’s not a software company at all, and doesn’t compete directly with Microsoft. Bite into its core and you’ll find that the Apple is not soft, but hard. Apple makes its real money, and a lot of it, by selling hardware.
Apple is also a smart company, and like all smart companies, it is pushing hard to raise demand for its products by commoditizing their complements. This is to say that it is consciously and aggressively lowering the public’s perception of what software that runs on the iPhone should cost, as well as lowering the cost of its own apps and sending strong signals to developers to follow suit. From the inception of the iDevice, Apple has strategically promoted the idea that software for the iPhone (and iPod Touch and iPad) should be as close to free as possible, just as a song should cost no more than 99¢. (Podcasts get rougher treatment: they should always be absolutely free.) 99¢ songs sell iPods like hotcakes, just as 99¢ apps sell iPhones.
But Apple is in a tricky position here. On the one hand, the company needs software to be cheap or free to entice consumers to buy iDevices, but on the other it needs prices to be sufficiently high to entice developers to devote their energies to the platform. If all software for the iPhone were free, there would be no motivation for developers to produce it. And so we have iAds (and pernicious in-app purchases) as Apple’s solution to this thorny predicament.
The purpose of iAds may be inferred by a literal reading of Jobs’s statement, i.e., to subsidize developers by giving them a revenue stream in addition to app sales and thereby to allow them to lower their prices. The knock-on effect of this is to boost iDevice sales, which is how Apple makes its money. So, Steve’s statement is better and more completely written as: “We’re doing it for one simple reason. To help our developers earn money so they continue to create free and low cost apps for users. Free and low cost apps will increase demand for iPhones, iPods, and iPads, and ultimately profit and marketshare for Apple.” Less noble, perhaps, but truer.
The iPad is supposed to be the miracle device that saves the “printed” word. But all I see is a regurgitation of the same advertising model that has been used to subsidize the printed word for aeons. It has never made sense to me that funds raised by selling advertising space to Bloomingdale’s should be used to support reporters risking their lives to deliver news from Afghanistan. One of the most vital aspects of a liberal democracy, an unfettered press reporting and analyzing news, both local and international, thereby rests on the whims of spring season shoe-shoppers. And I foresee that content creators will continue to be marginalized, iPad or no.
Should we photographers, writers, programmers, and designers, we the people who create stuff, be worried about this endless commoditization of content — our content — the original, inspired, beautiful things we produce? Yes. I believe we should. I believe that our work has inherent value, and that we should be able to make a living from the stuff we make. We should not need to be subsidized by advertising. Who in this world wants more advertising?
Perhaps there is hope. Perhaps there is room on the jungle floor for producers of niche content to carve out a reasonable living by making and marketing the things they love to small, passionate audiences. At least that's the view of Ben Thompson, the author of Stratechery (essential reading for anyone interested in the intersection of tech, culture, and business). Personally, I take courage from this graph of Ben's, which he includes in the article "Differentiation and Value Capture in the Internet Age." The key, of course, is differentiation, which is a topic for another day.
(The original, erroneous title of this post was The Relentless Commodification of Content. The problem was my mistaking "commodification" to mean "commoditization." The Marxist idea of commodification is to take something that was previously not saleable and to make it saleable. Commoditization, on the other hand, is to take something that was previously differentiated and to make it generic.)