Five Years Ago
This week in 2010, the world was abuzz over the prototype iPhone that was left in a bar, and we discussed the legal fallout and its implications for bloggers-as-journalists. At the same time, another case found that a blog commenter was not a journalist. In the old-school journalism world, Rupert Murdoch was moving towards paywalls while his competitors were proudly promising to remain free, and the New York Times was struggling to find reasons to like its paywall.
Lots happened on the copyright front this week. A historical association claimed copyright over scans of 100-year-old photos while we asked why UNESCO was launching pro-copyright efforts; Twitter was removing a lot of tweets over bogus DMCA takedowns while a horrible new bill in Congress sought to extend DMCA takedowns to cover “personal information”; the U.S. Chamber of Commerce released a bogus study pushing for more IP enforcement while the IFPI’s latest report actually showed music sales growth in some markets; the USTR released its “revised” Special 301 report (and continued to lie about ACTA) while the Justice Department boosted the number of FBI agents and attorneys focused on copyright infringement; and the RIAA utterly missed the point about Record Store Day but also got the AFL-CIO to sign on in support of a proposed performance tax. Internationally, Shanghai was cracking down on (read: pushing underground) bootleg discs, Chile got a mixed bag of new copyright laws, and an Irish collection society was trying to get music bloggers to pay up.
Ten Years Ago
Back in 2005, many newspapers weren’t nearly as well-established online as they would be in 2010 amidst all the debates about paywalls. It was only just becoming clear (well, it was only just becoming undeniable anyway) that the future of news was online (and that future wasn’t in walled gardens). An interesting upshot was that local newspapers were faring better online, and teaching the rest a thing or two about targeted advertising. Meanwhile, Techdirt got some newspaper love of its own.
The entertainment industry was still desperately trying to make the case for mobile TV, this time suggesting babies might be a great audience, which was insane but at least a little more detailed than “just because”. Some wanted to leapfrog TV and go straight to selling extremely expensive feature-length movies on phones. Meanwhile, Disney was rolling back its failed video-on-demand test programs just as Steven Soderbergh was trying to make more films available in theaters and on DVD at the same time. RealNetworks was offering a music streaming service with an incredibly sad 25 free streams per month, though it’s hard to say whether that’s sadder than Wal-Mart’s custom-burned CDs.
Oh, and Nathan Myhrvold was going around hyping up his plans to create an “invention factory”. I wonder whatever happened with that…
Fifteen Years Ago
This week in 2000, Silicon Valley was in the midst of the dot-com bubble collapse, with the peak now a month in the past. Still, some CEOs were rightly saying that focusing on running a business was more important than the market, and Valley innovation wasn’t going to disappear. People were noting that there was still a lot of investment money out there in the pockets of VCs, while a new book (slightly hypocritically) discussed this question of short- and long-term thinking. Certainly none of this was scaring entrepreneurs away from the web, or stopping dot-coms from throwing big parties and media outlets from listing internet business superstars. It didn’t stop ridiculously expensive dot-com Superbowl ads or, on the other end of the scale, the coming wave of dot-com infomercials. After all, there was plenty of evidence to suggest that online retail wasn’t dead and that ecommerce would keep on growing. Hell, even music sales were up.
Twenty-Two Years Ago
April 30th, 1993 is an important day in the history of the information era: it’s the day CERN announced that the newfangled “World Wide Web” and the protocols and technologies it consists of would be free to anyone, with no fees due. This decision, arguably more so than the technological innovations or the proliferation of home computers or any other factor, shaped the future of the global communication network we all rely on today, and shaped it for the better. Sadly, much of what is happening online now stems from the exact opposite attitude, with even many well-meaning innovators proving too nervous to relinquish control of their creations and let them truly flourish. Hopefully more people will take a moment to think about what the internet might look like if all its higher functions relied on a fragmented mess of proprietary protocols and walled gardens, instead of a unifying web on which everyone can build.