Friday, April 30, 2010

Justifying Limitations on Copyright...

In this ongoing digital debate, there are those who believe that the idea of copyright is sacrosanct and that copyrighted material can never be used without the author's permission. Of course, those individuals are completely clueless about actual copyright law, which expressly allows the fair use of such material for a range of purposes, including commentary, criticism, parody, news reporting, research, teaching, and scholarship.

One problem that Fair Use advocates always run into is how to quantify the positive social value of these unauthorized-but-lawful uses of copyrighted works. After all, when the movie and music industries cite statistics on exactly how much money is lost to digital piracy, it would be nice to have some quantitative data to counter them.

Well, according to a new study by the Computer & Communications Industry Association, reported on the Google Public Policy Blog, the “fair use economy” -- industries that rely on fair use and other limitations on copyright -- generates $4.7 trillion in revenue in the U.S. These industries account for one-sixth of U.S. GDP, one out of every eight jobs, and $281 billion in exports.

Copyright law not only provides artists with certain protections, but also includes important limitations that promote innovation and legitimate re-use of information.

For example, without limits on copyright, search engines would not exist. Indexing the Web would be illegal, because building an index requires first making a copy of every page being indexed.
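To make that point concrete, here is a minimal sketch in Python (using placeholder example.com-style URLs, not any real search engine's code) of the two steps every crawler performs before it can answer a single query: fetch and store a copy of each page, then build an index over those copies. The copying is not incidental; it is the prerequisite.

```python
# Minimal sketch of why indexing the Web requires copying it first.
# The URLs below are placeholders; any real crawler works the same way.
import urllib.request
from collections import defaultdict

def fetch_copy(url):
    """Download a page -- i.e., make a local copy of someone's copyrighted work."""
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8", errors="ignore")

def build_index(pages):
    """Build an inverted index: word -> set of URLs containing it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

if __name__ == "__main__":
    urls = ["http://example.com/", "http://example.org/"]   # placeholder sites
    copies = {url: fetch_copy(url) for url in urls}          # step 1: copy the pages
    index = build_index(copies)                              # step 2: index the copies
    print(sorted(index.get("example", set())))               # step 3: answer queries
```

Scale that loop up to billions of pages and you have, in essence, the cache sitting behind every major search engine -- all of it resting on the assumption that making those intermediate copies is a permissible use.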

The importance of well-designed copyright goes much further, though. iPods, TiVos, and every other digital media device capable of making copies all depend on balanced copyright.


This is what is meant by calling for "copyright reform". It is not, in any way, a call for the abolition of creators' rights, but rather a call for legislation that ensures fair use remains possible. Digital technologies have been used in recent years both to enable individuals to pirate copyrighted works and to empower corporations to restrict the fair-use "copying" of those works. Reform involves re-balancing the two and having the law recognize a new equilibrium appropriate for the 21st century.
  

Thursday, April 29, 2010

Jon Stewart Rips Into Apple...

Fast-forward to 5:25. When even the Daily Show is blasting Apple for acting like a tyrannical Big Brother, you know that this is no longer an issue where only lunatics are taking up the cause.

Best lines...

“You guys are busting down doors in Palo Alto while Commandant Gates is ridding the world of mosquitoes. What the f**k is going on?”

“Apple you guys were the rebels man, the underdogs. People believed in you. But now, are you becoming The Man? Remember back in 1984, you had those awesome ads about overthrowing Big Brother? Look in the mirror, man!”


[Embedded video: The Daily Show with Jon Stewart, "Appholes" -- www.thedailyshow.com]
  

Tuesday, April 20, 2010

Facebook Fan Pages as Business Model...

A question recently came up in a conversation about marketing one's presence in cyberspace. It concerned an individual whose public persona is also their business -- similar to, say, a paid speaker.

The question... Is it better to use your Facebook Fan Page to drive traffic to your website, or to use your website to drive traffic to your Facebook Fan Page?

To be honest, this question blew my mind a little bit. Call me old-fashioned, but I would have assumed that a person's or a business's website was their central hub, and that other marketing efforts were merely supplementary -- their entire purpose being to help brand one's product or service and ultimately drive traffic back to the primary website. However, when my friend contradicted that idea (and actually put it into practice), it piqued my curiosity.

Time for a little research. Using Facebook Fan Pages as the test case (since that is what my friend uses as his primary online marketing vehicle in place of his website), the not-surprising finding is that social media pages get far less traffic than websites, but the traffic they do get is more valuable.

Facebook Fans are valued at an average of $3.60 apiece in earned media for big brands. Compare that to the roughly 20-cent value of each website visitor under a pay-per-click (PPC) formula -- although this number varies widely depending on the website.
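Taking those two figures at face value (and they are rough industry estimates, not measurements), the back-of-the-envelope arithmetic below shows how many ordinary website visits it would take to match the claimed earned-media value of a given fan base; the numbers are illustrative only.

```python
# Back-of-the-envelope comparison of the two valuation figures cited above.
# Both numbers are industry estimates, so treat this as illustrative arithmetic.
VALUE_PER_FAN = 3.60      # estimated earned-media value of one Facebook Fan
VALUE_PER_VISIT = 0.20    # approximate PPC-style value of one website visit

def equivalent_visits(fans):
    """How many site visits equal the earned-media value of `fans` Facebook Fans."""
    return fans * VALUE_PER_FAN / VALUE_PER_VISIT

for fans in (10_000, 1_000_000, 6_400_000):   # 6.4M matches the celebrity figure cited below
    print(f"{fans:>9,} fans ~ ${fans * VALUE_PER_FAN:>13,.0f} "
          f"~ {equivalent_visits(fans):>12,.0f} website visits")
```

By that math, one fan is nominally worth about 18 site visits -- which is exactly the "fewer but more valuable" trade-off described above.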

Granted, one should be somewhat sceptical of this data, but it's at least a reasonable starting point for dialogue.

If we take a look at the overall most popular Facebook Fan Pages, what we find is that the number of fans is usually quite paltry in comparison to the amount of traffic that their website counterparts accumulate. Starbucks and Coca-Cola are among the leading brands, and even they only have between 5 and 6 million fans. To put this back in the context of our original question, the numbers can also be broken down by consumer product, by celebrities/public figures, by TV shows, and more. The data indicates that even the absolute most popular Facebook Fan Pages struggle to exceed a few million fans. And I guarantee that Megan Fox (currently the #2 celebrity on the list) gets more traffic to her website and through Google searches than is reflected by her 6.4 million Facebook Fans.

The great counter to all of this, however, revolves around monetization. Who cares, opponents say, how many people visit your site? What you really want is a loyal group of followers willing to promote you through the tight bonds of networking because that's what will ultimately lead to bigger and better things, including more money.

Frankly, I'm still not convinced. Is anyone really earning more cash from Facebook Ads than they are through Google AdSense? I'd like to hear if anybody can share success stories about individuals who've actually converted their Facebook Fans into a reliable revenue stream. I certainly haven't heard of any, and it's hard for me to imagine that scenario. Unless I'm missing something, there still doesn't seem to be any advantage in using a Facebook Fan Page as one's central online marketing hub.

Then again, I've been wrong before.
  

Thursday, April 15, 2010

Netflix and Wii: A Match Made in Heaven...

The only reason online streaming video hasn't completely taken over the world is that, when we want to watch a movie or TV show off the internet, most of us face the unfortunate choice of either watching it on a small computer screen or forgoing online video altogether and reverting to physical DVDs.

Well, in case you missed it, Netflix has solved that problem. Following the path already taken on the PlayStation 3 and Xbox 360, Netflix subscribers who own a Nintendo Wii can now stream thousands of Netflix movies and TV shows from the internet directly onto their large-screen family-room televisions. This doesn't replace getting physical DVDs in the mail; it comes in addition to it, at no extra cost.

If you're looking for a review, I'll give it to you. I set this up yesterday in my apartment, it took all of three minutes, and now it's the greatest thing in the world! Really, I haven't been so excited about a home technology gadget in years. Now, while I still churn slowly through the 40 DVDs in my Netflix mail queue, I can instantly watch thousands of movies and TV shows without missing a beat. Last night, the Nerfherder Gal and I had a "Weeds: Season 1" marathon -- we'd never seen the show and don't subscribe to Showtime -- and it was all at no extra cost, on the big screen instead of a computer monitor, streamed entirely by Netflix through my Wii.

This is one of those rare moments where you get a clarified glimpse of the future. Is the Wii still even primarily a gaming console? Hulu first illustrated the real potential of online streaming video combined with high-production content. This Netflix-Wii partnership steps it up to another level. Honestly, if Netflix could make a deal with ESPN and the Food Network to stream their content as well, and if Hulu could be similarly delivered to my television, I'd feel comfortable cancelling my monthly cable subscription entirely.

It's coming.
  

Wednesday, April 14, 2010

Twitter Tries to Make Money (Finally)...

For a company said to be valued at around a billion dollars, Twitter still earns almost no money. Well, that's about to change. Maybe. The company announced yesterday that it would start launching "Promoted Tweets", or short advertisements, as a new method of bringing in revenue.

Some Twitter users are calling this a "sell-out" move. But really, Twitter is a business, and for one of the most heavily trafficked sites on the Web, how could they not reasonably try to capitalize on that success?

Moving toward a Google-style, advertising-supported business model, Twitter plans first to include Promoted Tweets at the top of search results, and then to display them in users' personal streams. As Wired comments, this plan is also intended to have a few cascading effects...

While Twitter.com is a popular destination, millions of people use desktop and smartphone apps for the service and rarely use the site at all. Some of those apps themselves have ads which Twitter gets no cut from — including Tweetie for the Mac and iPhone, made by atebits, a company Twitter bought last week. Because Promoted Tweets will appear in search results on Twitter apps too, the company will get a piece of the action on those platforms as well.


However, TechCrunch is a little sceptical as to how well the advertising model will ultimately work...

Quite frankly, counting impressions, clicks or retweets isn’t all that interesting. And there is nothing to indicate so far that clickthrough rates or other conventional advertising metrics will perform any better on a promoted Tweet than in a search ad. My guess is it will probably perform worse because there are just so many more commercially-oriented searches on Google than there will ever be on Twitter.


There is, of course, the inevitable user backlash to consider as well. 71% of users apparently perceive this development in a negative light. But that's a predictable initial reaction that I would expect to fade away over time -- similar to how Facebook users organized a boycott to protest the new "News Feed" feature in 2006, yet that didn't slow the growth of the site's userbase at all. (Boy, does that seem like a relic now.)

I say let Twitter try to make money from advertising. Why not? As long as it remains relatively unobtrusive, users will grow to accept it, just as they did on Google, on other sites, and in other media.

My bigger complaint is with the changing culture on Twitter recently. Whereas it used to be a fantastic place to have conversations and interact with like-minded people, it has really transformed into a one-way broadcasting platform, mainly for already-established personalities and news outlets. It used to be that if you followed someone, they'd follow you back almost every time. Now there are way too many accounts where people have thousands of followers but are only following about a dozen people themselves. Or have staff writers controlling their twitstream for them.

It may be something of a rant, but forget advertising. If Twitter doesn't put the "social" back into the social-networking aspect of the site, that's the real threat to the continued growth of its userbase.
  

Tuesday, April 13, 2010

The Adobe-Apple Flame War...

Who says that computer programming is boring?

An epic fight is occurring right now between two corporate giants of the IT world - Adobe and Apple. It started last week when Apple decided that only apps written in the native programming languages of C, Objective-C, or C++ could be run on the iPhone OS. This meant that Adobe programs that use Flash were effectively banned on iPhones, iPod Touches, and iPads.

Adobe responded first by creating a clever workaround -- a Flash-to-iPhone compiler, which, for you non-programmers, is a tool that sucks in Flash code and spits out iPhone OS code. However, Apple immediately changed its license agreement to prohibit such cross-compilers as well.
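For the non-programmers, here is a deliberately toy sketch in Python of what such source-to-source translation means in principle: read code written against one platform's API and rewrite it as the equivalent call on another. The one-line "Flash" input and "iPhone" output below are invented for illustration and bear no resemblance to how Adobe's real tool works internally.

```python
# Toy illustration of cross-compilation: translating one language's calls
# into another's. The input and output below are invented for this sketch.
import re

ACTIONSCRIPT_SNIPPET = 'trace("Hello from Flash");'   # ActionScript-style logging call

def translate(source: str) -> str:
    """Rewrite a trace(...) logging call into an Objective-C-style NSLog(...) call."""
    return re.sub(r'trace\((".*?")\);', r'NSLog(@\1);', source)

if __name__ == "__main__":
    print(translate(ACTIONSCRIPT_SNIPPET))   # -> NSLog(@"Hello from Flash");
```

Apple's revised license language targets exactly this kind of mechanical translation step, no matter how sophisticated the tool performing it happens to be.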

Now for the juicy part. Lee Brimelow, the platform evangelist for Adobe, blogged a scathing criticism of Apple that has since caught fire in cyberspace...

This has nothing to do whatsoever with bringing the Flash player to Apple’s devices. That is a separate discussion entirely. What they are saying is that they won’t allow applications onto their marketplace solely because of what language was originally used to create them. This is a frightening move that has no rational defense other than wanting tyrannical control over developers and more importantly, wanting to use developers as pawns in their crusade against Adobe.


Brimelow's post concludes with the not-too-subtle statement, "Go screw yourself Apple".

Other big names have since weighed in as well. The Tao Effect's Greg Slepak replied to a missive from none other than Steve Jobs himself, saying...

You didn't need this clause to get to where you are now with the iPhone's market share, adding it just makes people lose respect for you and run for the hills.... From a developer's point of view, you're limiting creativity itself... There are plenty of [applications] written using cross-platform frameworks that are amazing, that he himself has praised. Mozilla's Firefox just being one of them.


Now, of course, a Facebook group has emerged named, "I'm for Adobe". In its manifesto, it states, "There is no longer any debate as to who the “bad guy” is in this story — Apple has proven themselves to be anti-competition, anti-developer, and anti-consumer. I stand with Adobe."

Adobe's John Dowdell has even tweeted that Apple's policies are analogous to China's intolerance of religion, and compared them to the Soup Nazi from Seinfeld.

All this stink over which computer programming language to use.

My readers can guess where I stand on this issue. I've been arguing for years that Apple's policies are anti-developer and a killer of innovation. Their decision to prohibit Flash and all languages other than the C family isn't surprising. It's just the next step down the slippery slope they've been on for quite some time now. When will anyone listen?

In the meantime, Megan Lavey is right when she says that most of the 85 million iPhone OS users "don't care how those apps are created as long as the app experience is compelling -- they wouldn't know an IDE from an SDK, or be able to tell Xcode from Flash on a bet".

For a topic filled with such geek-speak, this flame war has passions running mighty high. And it's only the beginning...
  

Monday, April 12, 2010

Kicking Off a Complicated and Long-Term Programming Project in the Real World...

The following post was submitted by guest blogger, Ben Oksman.

I am currently in the process of starting an extraordinarily complicated development project. Unlike in academia, the slightest problem or delay with my projects could have dramatic consequences. So one of the questions always on my mind is how to start a new project without becoming overwhelmed.

A little about me:

I am currently the director of technology for a medium-size hedge fund in the NYC area. I have been designing financial systems for over 15 years. I come from an electrical engineering and mathematics background but have been a computer guru all my life. Many of the systems I have built have tens of millions of dollars passing through them on a regular basis, have multiple heavy users, and communicate with a large number of counterparties. Moreover, they have to process large amounts of data and make complicated calculations in real time. These same systems also regularly use dozens of thorny algorithms and formulas that, if they went wrong, could have dramatic real-world consequences.

Here are some of the primary design issues I face:

  • Generally speaking, my users have only a vague idea of what they want and are mostly inaccessible.

  • Time to market is critical.

  • Information on critical parts of the project is limited or overwhelming - including third-party APIs, protocols, and data sets.

  • Access to third-party commercial products and outside help is extremely limited.

  • I have very few people who can double-check my work directly or whom I can bounce ideas off of.

  • Although somewhat useful, design processes and documents tend to be disconnected and waste a tremendous amount of time.


Here is some of what I have learned and what I typically do to overcome these problems:


  • Close your eyes and dive right in. Starting a new project is like jumping into a cold swimming pool. You know it's going to be a shock, but it's going to be worth it in the end.

  • Compartmentalize. Break your project into small components in your mind or on paper and work on them as if they were unrelated.

  • Name your project. Give it a unique and interesting name. This will help you commit to the project as a good name is often hard to find. My current project involves research and exploring new ideas so I called it Voyager.

  • Know your design and architectural patterns beforehand. You may not know all the requirements for the project yet, but you should have a solid feel for the design patterns and architectures available to you. Before the project begins, choose which patterns you will generally follow to solve the unknowns going forward. For example, on the current project I chose a simple three-tier architecture and will follow general object-oriented design patterns (see the sketch after this list).

  • Leverage existing infrastructure, open source software, and frameworks. This will save you a great deal of time. Choosing a framework, in particular, will solve a large number of routine problems right out of the starting gate and will guide your selection of architectural and design patterns. I use the Spring Framework for most of my projects.

  • Start small but think big. I usually start by forming the project file structure and coming up with some basic interfaces for the most critical parts of the project.

  • Keep it simple and don't focus on the details. The foundation and walls of a house take only a couple of days; the devil is in the details. Oftentimes, users are more interested in seeing a working system than in having all of the requirements met. A simple login screen with some basic functional requirements can go a long way toward satiating a tough crowd.

  • Don't be a perfectionist. You will make mistakes and will have to redo large parts of your code. Coming to terms with this will help you alleviate your anxieties.

  • Focus on added value. Remember you are not making a Hollywood blockbuster. At the end of the day what matters is how this will add value to your career and/or the firm you are working for. Don't worry about bells and whistles. You can add those later.

  • Sleep. It's critical that you sleep, and sleep well. I find that most of my problems are solved by a good night's sleep. Oftentimes, when I am stuck on a particularly tough problem, a good night's sleep will bring results.

  • Use caffeine effectively. Although caffeine can improve your development productivity and reduce attention lapses, it can also build up your anxieties and has limited effect on more complex cognitive function. Use it appropriately and watch for its effects.

  • Diet and exercise. My work colleagues joke that I am the most fit director of technology they know. I see myself as a cognitive athlete and try to be very body-aware. Things like flexibility, cardio health, muscle mass, and the type and quantity of food you eat can have a dramatic effect on your productivity. Starting a project can be particularly overwhelming, and without some balance you will soon find yourself hampered by your body's inefficiencies. I, for example, have a mixed regimen of cardio workouts, strength training, and flexibility exercises. I eat small lunches to avoid the midday energy lapse and have several small, healthy snacks.

  • Stand. I, for one, am a big proponent of the sit-stand desk. Sitting is one of the most inactive things you can do, and it creates a psychology of inactivity. Get a sit/stand desk, and when your blood starts to pool and you are tired of standing, lower your workstation to a sitting position. Sitting all day will also tighten up your hip flexors and put pressure on your hamstrings and lower back. Eventually, this may create muscle imbalances that sap your strength or distract you with pain.
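As promised above, here is a minimal sketch of what "choosing a three-tier architecture and defining basic interfaces first" can look like in practice. It is written in Python purely for brevity (Ben's own projects use the Spring Framework), and every name in it -- Trade, TradeRepository, and so on -- is a hypothetical placeholder rather than anything from his actual system.

```python
# Illustrative-only skeleton of a three-tier design: interfaces first,
# real implementations later. All names are hypothetical placeholders.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Trade:                          # minimal domain object shared by all tiers
    symbol: str
    quantity: int
    price: float


class TradeRepository(ABC):           # data tier: how trades are stored and fetched
    @abstractmethod
    def save(self, trade: Trade) -> None: ...

    @abstractmethod
    def find_by_symbol(self, symbol: str) -> list[Trade]: ...


class TradeService(ABC):              # service tier: where business rules will live
    @abstractmethod
    def book_trade(self, trade: Trade) -> None: ...


class TradeView(ABC):                 # presentation tier: what the user eventually sees
    @abstractmethod
    def show_positions(self, trades: list[Trade]) -> None: ...


# A first end-to-end slice: in-memory stubs that satisfy the interfaces,
# so there is something demonstrable long before the details are filled in.
class InMemoryTradeRepository(TradeRepository):
    def __init__(self) -> None:
        self._trades: list[Trade] = []

    def save(self, trade: Trade) -> None:
        self._trades.append(trade)

    def find_by_symbol(self, symbol: str) -> list[Trade]:
        return [t for t in self._trades if t.symbol == symbol]


class SimpleTradeService(TradeService):
    def __init__(self, repo: TradeRepository) -> None:
        self._repo = repo

    def book_trade(self, trade: Trade) -> None:
        self._repo.save(trade)        # validation, pricing, etc. come later
```

The in-memory stubs give you the thin, end-to-end "working system" that the keep-it-simple bullet recommends showing to users early, while the interfaces keep the compartments from bleeding into one another as the real implementations arrive.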
  

Wednesday, April 07, 2010

Net Neutrality Loses, But Is Not Dead (Yet)...

Monster news yesterday. A federal appeals court ruled that the FCC did not have the legal authority to regulate the network management practices of internet service providers (ISPs). In plain English, the court effectively killed Net Neutrality.

Or did they?

You see, Net Neutrality is the long-standing principle - also called the First Amendment of the Internet - that all web content must be treated equally. No matter whether a site delivers streaming video or plain text, data is data, and when you pay your ISP for web access every month, you are assured equal access to all websites regardless of their type of content.

The FCC was responsible for ensuring that Net Neutrality remained in place and that large telecom and cable operators like Comcast, Verizon, and AT&T did not divide the internet into two parts: one "fast lane," where giant corporations would pay extra to have their websites delivered at faster speeds, and one "slow lane" for small businesses, schools, non-profits, and individuals who could not pay the extra fee and whose sites would therefore be delivered at slower speeds. Net Neutrality was designed to prevent this type of splintering into the digital haves and have-nots.

With this appellate court ruling, the sky seems to have fallen! Bloggers are proclaiming the end of cyberspace as we know it. It is nothing less than a transformational shift, they say, and a corporate power grab. This is akin to right-wing pundits like Glenn Beck decrying Net Neutrality as a "Marxist plot to take over the internet".

Everyone ought to relax. I've always framed the Net Neutrality issue not as the Average Joe vs. the evil global corporation (the David-vs.-Goliath analogy), but as one group of giant corporations vs. another. In a practical sense, the Average Joe website owner doesn't compete for speed with the giant websites anyway; they just don't have the hardware or other resources. So really, what we're talking about here is a battle between the ISPs of the world, like Comcast and Verizon, and the major internet players, like Google, Microsoft, and Yahoo.

Now, you can certainly have an opinion about which side you favor, but it's misleading to frame the issue as being something other than one gargantuan industry versus another. Personally, in full disclosure, I favor the websites in this battle only because their industry remains less monopolistic and because I believe that there is far more potential for technological innovation and economic growth in the hands of website operators than there is in those of the dozen or so major telecom players who've been gathering cobwebs for decades.

Stepping back from the brink, a few things become clear as a result of the court's ruling. As Tech Trader Daily points out, there are three possibilities for what will happen next:
  1. The FCC will ask Congress to pass legislation granting them the needed authority (that they previously thought they had).

  2. The FCC will ask Congress to outright pass Net Neutrality legislation.

  3. The FCC will reclassify broadband services as a "utility" in order to bring such services under their jurisdiction.


The third option is being described as the "nuclear option". Turning broadband into a Title II service with the "common carrier" label would mean far more regulatory obligations, extending well beyond Net Neutrality. It would grant the government the ability to regulate broadband the way it regulated telecommunications when AT&T was still a monopoly, including price controls. If this were the course pursued - and, let's be clear, most observers do NOT believe it will be - then the result would be a virtual halt to all telecom infrastructure investment for years. And that's something that nobody wants.

Thus, if you want to boil all of this down into a few bullet points to impress your co-workers, let us say simply that the court has ruled against Net Neutrality and that, despite some claims of apocalypse, Congress is likely to take up the issue next. If Congress succeeds, Net Neutrality will return, perhaps stronger than it was (with an actual legislative mandate); if it fails, then in practical terms large website companies like Google and Microsoft will have to pay an extra fee, while the rest of us will be affected only slightly. In either case, the Net Neutrality debate shall live on.
  

Tuesday, April 06, 2010

Cory Doctorow Frames the iPad in Terms of the Free Culture Debate...

Now that the iPad has finally been released, let the praise and criticism both come pouring in. But forget product reviews - that's the purview of a myriad of other sites. The absolute most interesting article related to the iPad is Cory Doctorow's piece bringing the iPad into the Free Culture debate.

This essay has already garnered so much attention in the blogosphere that it is fast becoming a seminal work that deserves to be enshrined in the Internet Canon. Read it here.

And the best line... "The real issue isn't the capabilities of the piece of plastic you unwrap today, but the technical and social infrastructure that accompanies it."


Why I won't buy an iPad (and think you shouldn't, either)

I've spent ten years now on Boing Boing, finding cool things that people have done and made and writing about them. Most of the really exciting stuff hasn't come from big corporations with enormous budgets, it's come from experimentalist amateurs. These people were able to make stuff and put it in the public's eye and even sell it without having to submit to the whims of a single company that had declared itself gatekeeper for your phone and other personal technology.

Danny O'Brien does a very good job of explaining why I'm completely uninterested in buying an iPad -- it really feels like the second coming of the CD-ROM "revolution" in which "content" people proclaimed that they were going to remake media by producing expensive (to make and to buy) products. I was a CD-ROM programmer at the start of my tech career, and I felt that excitement, too, and lived through it to see how wrong I was, how open platforms and experimental amateurs would eventually beat out the spendy, slick pros.

I remember the early days of the web -- and the last days of CD ROM -- when there was this mainstream consensus that the web and PCs were too durned geeky and difficult and unpredictable for "my mom" (it's amazing how many tech people have an incredibly low opinion of their mothers). If I had a share of AOL for every time someone told me that the web would die because AOL was so easy and the web was full of garbage, I'd have a lot of AOL shares.

And they wouldn't be worth much.


Incumbents made bad revolutionaries

Relying on incumbents to produce your revolutions is not a good strategy. They're apt to take all the stuff that makes their products great and try to use technology to charge you extra for it, or prohibit it altogether.

I mean, look at that Marvel app (just look at it). I was a comic-book kid, and I'm a comic-book grownup, and the thing that made comics for me was sharing them. If there was ever a medium that relied on kids swapping their purchases around to build an audience, it was comics. And the used market for comics! It was -- and is -- huge, and vital. I can't even count how many times I've gone spelunking in the used comic-bins at a great and musty store to find back issues that I'd missed, or sample new titles on the cheap. (It's part of a multigenerational tradition in my family -- my mom's father used to take her and her sibs down to Dragon Lady Comics on Queen Street in Toronto every weekend to swap their old comics for credit and get new ones).

So what does Marvel do to "enhance" its comics? They take away the right to give, sell or loan your comics. What an improvement. Way to take the joyous, marvellous sharing and bonding experience of comic reading and turn it into a passive, lonely undertaking that isolates, rather than unites. Nice one, Misney.


Infantalizing hardware

Then there's the device itself: clearly there's a lot of thoughtfulness and smarts that went into the design. But there's also a palpable contempt for the owner. I believe -- really believe -- in the stirring words of the Maker Manifesto: if you can't open it, you don't own it. Screws not glue. The original Apple ][+ came with schematics for the circuit boards, and birthed a generation of hardware and software hackers who upended the world for the better. If you wanted your kid to grow up to be a confident, entrepreneurial, and firmly in the camp that believes that you should forever be rearranging the world to make it better, you bought her an Apple ][+.

But with the iPad, it seems like Apple's model customer is that same stupid stereotype of a technophobic, timid, scatterbrained mother as appears in a billion renditions of "that's too complicated for my mom" (listen to the pundits extol the virtues of the iPad and time how long it takes for them to explain that here, finally, is something that isn't too complicated for their poor old mothers).

The model of interaction with the iPad is to be a "consumer," what William Gibson memorably described as "something the size of a baby hippo, the color of a week-old boiled potato, that lives by itself, in the dark, in a double-wide on the outskirts of Topeka. It's covered with eyes and it sweats constantly. The sweat runs into those eyes and makes them sting. It has no mouth... no genitals, and can only express its mute extremes of murderous rage and infantile desire by changing the channels on a universal remote."

The way you improve your iPad isn't to figure out how it works and making it better. The way you improve the iPad is to buy iApps. Buying an iPad for your kids isn't a means of jump-starting the realization that the world is yours to take apart and reassemble; it's a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.

Dale Dougherty's piece on Hypercard and its influence on a generation of young hackers is a must-read on this. I got my start as a Hypercard programmer, and it was Hypercard's gentle and intuitive introduction to the idea of remaking the world that made me consider a career in computers.


Wal-Martization of the software channel

And let's look at the iStore. For a company whose CEO professes a hatred of DRM, Apple sure has made DRM its alpha and omega. Having gotten into business with the two industries that most believe that you shouldn't be able to modify your hardware, load your own software on it, write software for it, override instructions given to it by the mothership (the entertainment industry and the phone companies), Apple has defined its business around these principles. It uses DRM to control what can run on your devices, which means that Apple's customers can't take their "iContent" with them to competing devices, and Apple developers can't sell on their own terms.

The iStore lock-in doesn't make life better for Apple's customers or Apple's developers. As an adult, I want to be able to choose whose stuff I buy and whom I trust to evaluate that stuff. I don't want my universe of apps constrained to the stuff that the Cupertino Politburo decides to allow for its platform. And as a copyright holder and creator, I don't want a single, Wal-Mart-like channel that controls access to my audience and dictates what is and is not acceptable material for me to create. The last time I posted about this, we got a string of apologies for Apple's abusive contractual terms for developers, but the best one was, "Did you think that access to a platform where you can make a fortune would come without strings attached?" I read it in Don Corleone's voice and it sounded just right. Of course I believe in a market where competition can take place without bending my knee to a company that has erected a drawbridge between me and my customers!


Journalism is looking for a daddy figure

I think that the press has been all over the iPad because Apple puts on a good show, and because everyone in journalism-land is looking for a daddy figure who'll promise them that their audience will go back to paying for their stuff. The reason people have stopped paying for a lot of "content" isn't just that they can get it for free, though: it's that they can get lots of competing stuff for free, too. The open platform has allowed for an explosion of new material, some of it rough-hewn, some of it slick as the pros, most of it targetted more narrowly than the old media ever managed. Rupert Murdoch can rattle his saber all he likes about taking his content out of Google, but I say do it, Rupert. We'll miss your fraction of a fraction of a fraction of a percent of the Web so little that we'll hardly notice it, and we'll have no trouble finding material to fill the void.

Just like the gadget press is full of devices that gadget bloggers need (and that no one else cares about), the mainstream press is full of stories that affirm the internal media consensus. Yesterday's empires do something sacred and vital and most of all grown up, and that other adults will eventually come along to move us all away from the kids' playground that is the wild web, with its amateur content and lack of proprietary channels where exclusive deals can be made. We'll move back into the walled gardens that best return shareholder value to the investors who haven't updated their portfolios since before eTrade came online.

But the real economics of iPad publishing tell a different story: even a stellar iPad sales performance isn't going to do much to stanch the bleeding from traditional publishing. Wishful thinking and a nostalgia for the good old days of lockdown won't bring customers back through the door.


Gadgets come and gadgets go

Gadgets come and gadgets go. The iPad you buy today will be e-waste in a year or two (less, if you decide not to pay to have the battery changed for you). The real issue isn't the capabilities of the piece of plastic you unwrap today, but the technical and social infrastructure that accompanies it.

If you want to live in the creative universe where anyone with a cool idea can make it and give it to you to run on your hardware, the iPad isn't for you.

If you want to live in the fair world where you get to keep (or give away) the stuff you buy, the iPad isn't for you.

If you want to write code for a platform where the only thing that determines whether you're going to succeed with it is whether your audience loves it, the iPad isn't for you.

  

Monday, April 05, 2010

Graduate Students Hack Your Cell Phone...

Being on multiple cybersecurity email lists, I'm sometimes fascinated by what's happening out there. This morning, I learned that a group of graduate students from Rutgers University, working under a grant sponsored by the National Science Foundation, were asked to take a smart phone platform commonly used by software developers and develop malicious applications that a user might not even notice.

Witness your tax dollars being put to work...

Suppose you're a criminal who wants to surreptitiously monitor someone's every move and even eavesdrop wherever they take their phone. Yes, as it turns out, there's an app for that, too.

Few smart phone users realize that the same characteristics that make these devices so useful can be hijacked and used against them...

The team decided to inject software components known as rootkits into the phone's operating system. Rootkits are a particularly devious threat to a computer, because they attack the operating system itself. Traditional antivirus software, therefore, may not be able to detect them because they don't appear to be stand alone applications or viruses. Most desktop computers are protected from rootkits by something known as virtual machine monitor, but because of their limited size and limited energy resources, smart phones don't deploy these monitors, making it very difficult to know a rootkit attack has taken place.

Once the rootkits were in place, the researchers were able to hijack a smart phone by simply sending it a text message. This allowed them to do things like quietly turn on the device's microphone, enabling them to hear what was going on in the room where the phone had been placed. Another attack trained the phone to use its GPS capabilities to report the phone's exact location without the user's knowledge. By turning on various high-energy functions, the team was even able to rapidly drain the phone's batteries, rendering it useless.


It's important to stress that the Rutgers team presented their results at a conference and even posted a webcast - a demonstration that there was no malicious intent on their part and that their hacking was done for research purposes only.

The dirty little secret in cybersecurity circles is that, in order to defend against hacking threats, would-be experts must themselves learn how to hack. Courses are routinely taught on the subject, systematizing certain practices that could potentially be used in, shall we say, an unethical manner.

But don't let stories like this one - about respectable researchers - frighten you. Like guns, hacking doesn't destroy things; people who hack do.
  

Friday, April 02, 2010

The Digital Divide and Changing Roles of Public Libraries...

Traditionally, we all knew that libraries were places to get books. For many of us, it was a rite of passage - and as commonplace as apple pie - for our parents to bring us to the library from a very early age for exactly this purpose.

But libraries these days have been completely transformed. It wouldn't be hard to argue that books aren't even their primary purpose anymore. Instead, they now draw most of their visitors by serving as community centers - offering classes and lectures on various topics, "renting" out CDs and DVDs, and, of course, providing free internet access.

Judy Breck of SmartMobs highlights a MacArthur Foundation SPOTLIGHT review, which reports that public libraries are the primary source of internet access for 44% of people living below the poverty line and which provides the following statistics...

  • While social networking was the most popular use of the internet at public libraries, education was a strong second—especially for young adults from low-income households: Among young adults age 14 to 24 in households below the federal poverty line, 61 percent used public library computers and the internet for educational purposes.

  • Young adults in general are the heaviest users of internet access at public libraries: Nearly half of all 14- to 18-year-olds (an estimated 11.8 million users) reported using a library computer during the last year, and 25 percent of that group did so at least once a week. Teenagers reported that one of the most common uses of library computers was to do homework.

  • More than 32 million visitors use library computers for a variety of educational activities, including searching for and applying to GED and graduate programs, completing online courses and tests, and applying for financial aid. More than half of library patrons who used library computers to seek financial aid received funding.


Certainly, these are enlightening and extremely promising figures. The Digital Divide among members of our society is a growing problem, but libraries are clearly doing their part to counteract it.

A question: If you were going to make a sizable donation to your local library, do you believe your money would have a greater impact if it purchased books or if it installed a few more high-speed internet terminals?