Google’s Gigabit Bonanza

Last week Google announced it will finance ultra high-speed residential Internet access over fiber in select US communities. North American broadband households get average downlink speeds of around 5 Mbit/s. No wonder this initiative has caused a lot of jaw-dropping, soul-searching and wild-assed speculation.

The announcement has garnered very positive support from Washington’s advocacy community. Others have fretted about Google’s real motives, such as whether the company plans to become a retail ISP. The incumbents and their fellow travellers have dissed the venture. For example:

“The Google plan is short on details, with no information on capital spending, and, in our view, should primarily be seen through the lens of regulatory posturing,” Sanford Bernstein senior analyst Craig Moffett wrote in a research report. “We do not view Google’s announcement as a serious threat to the broadband businesses of either the cable or telecom operators.”

Short on details? Jeez, the project just got announced. The details will come as part of the “experimental” process. Like, that’s the whole point. And the suggestion Google is engaging in “regulatory posturing” is a hoot coming from the carrier sub-culture, which pretty much invented regulatory posturing.

It’s the ownership, stupid

Let’s not forget that the US carriers have a habit of suing the shit out of municipalities that have tried to build fiber networks intended to offer what the local telco or cableco can’t or won’t offer. Google has now handed them a serious migraine, related to both speed and ownership. The incumbents don’t want their customers to know that, if only the world were a little different, they could be enjoying much faster speeds, at cheaper rates.

But the real nightmare isn’t the prospect of having to offer faster speeds to keep up with Google. It’s the ownership model that’s scary, and the concurrent loss of control over the gates to cyberspace. Google is, after all, making this a muni fiber project. If I were an old-style gatekeeper, I’d be plenty worried about this statement:

“We’ll operate an ‘open access’ network, giving users the choice of multiple service providers. And consistent with our past advocacy, we’ll manage our network in an open, non-discriminatory and transparent way.”

Choice? Open and non-discriminatory? Sounds a lot like Net Neutrality. And in some ways, maybe like Bill St Arnaud’s vision of condominium fiber.

Speed and fear of the unknown

Making access 200 times faster isn’t trivial. It’s difficult for most people to imagine what the end-user experience will be like in the 100-to-1000 megabit zone. Nevertheless, Washington’s advocacy groups are emphatic that no speed is too fast – and a prosperous future depends on setting very high bandwidth targets (yes, somebody will still have to pay for all that bandwidth). This week, in a preview of the FCC’s Broadband Plan, Chairman Genachowski set the following target:

100 Squared Initiative: 100 million households at a minimum of 100 megabits per second (Mbs) — the world’s largest market of high-speed broadband users — to ensure that new businesses are created in America and stay in America.

Even this scheme falls short, according to prominent consumer advocate Mark Cooper, of the Consumer Federation of America.

“Noting there will be 130 million U.S. households by 2020, Cooper urged the FCC to strive to reach 120 million households with ultrafast broadband. He said it is a realistic figure consistent with the agency’s push to raise the broadband adoption rate from 65 percent to 90 percent by 2020.”

Where’s Canada?

In his report on Google’s plan, CBC’s Peter Nowak said it was “unlikely” such an experiment would find its way up to Canada. In one sense, that doesn’t matter. Canadians will notice – eventually – if the access experience becomes significantly better to the South.

On the other hand, nothing will change as long as modesty reigns in Ottawa. The CRTC has just issued an unusually bold report in which it makes recommendations for significant policy changes. It’s entitled Navigating Convergence: Charting Canadian Communications Change and Regulatory Implications. I’ll be discussing it in more detail in a subsequent post. For now I want to single out one assertion it makes about speeds:

“Significantly higher speeds are not currently being driven by direct consumer demand; few residential users actually need 100 Mbps to enjoy their Internet experience at present. Rather, the current impetus is largely competitive marketing and the recognition that if history is any guide, applications will quickly develop to take full advantage of any increases in capacity once the infrastructure is in place. Moreover, on a competitive level, the telecommunications companies’ desire to compete with cable companies in the broadcasting and VOD market will likely fuel their move to FTTN/VDSL, as this will allow them to offer an IPTV service with greater capacity and potential for interactivity than their current satellite television offerings (p.75, emphasis added).”

This statement makes some mighty big assumptions about competition and what’s in consumers’ heads. Of course subs aren’t clamouring for ultra high-speed access. They have no idea what that is, what it would cost or who might be able to supply such access – let alone imagine the kind of immersive environments a few hundred megs can make possible.

Not only is the end-user set aside in this equation; so, at least by implication, is government. Instead of that old vision thing, as in what the FCC will deliver next month, this position leaves progress to market forces. Higher speeds will come not as part of a grand design in which we strive for ambitious social goals; they will come as a spinoff from market-share battles among Canada’s half-dozen broadband incumbents, which control 80% of the market. Innovation doesn’t come from Rogers cranking up its retail bandwidth just enough to say “Canada’s fastest high-speed Internet” – with or without a straight face.

And I’m not going to trust history to show the way. In fact, if history is any guide, it tells us that when the carriers are in charge, they’re not inclined to innovate. Worse, they’re inclined to block innovative applications, especially when they might compete with their own apps – video library, VoIP, cloud storage, IPTV, music downloads, security apps and so on.

How could we possibly use a gigabit? Look no further than the Google blog post that started all this commotion:

“Imagine sitting in a rural health clinic, streaming three-dimensional medical imaging over the web and discussing a unique condition with a specialist in New York. Or downloading a high-definition, full-length feature film in less than five minutes. Or collaborating with classmates around the world while watching live 3-D video of a university lecture. Universal, ultra high-speed Internet access will make all this and more possible. We’ve urged the FCC to look at new and creative ways to get there in its National Broadband Plan – and today we’re announcing an experiment of our own.”
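And the arithmetic behind that film claim is easy to check. Here’s a quick back-of-envelope sketch (the 5 GB file size for an HD feature is my assumption; the speeds are the figures quoted above):

```python
# Back-of-envelope download times for a high-definition feature film.
# Assumed file size: 5 GB (hypothetical). Speeds come from the figures in
# this post: ~5 Mbit/s average, the FCC's 100 Mbit/s target, Google's gigabit.
FILM_GB = 5

speeds_mbps = {
    "typical North American broadband": 5,
    "FCC '100 Squared' target": 100,
    "Google's gigabit fiber": 1000,
}

for label, mbps in speeds_mbps.items():
    seconds = FILM_GB * 8000 / mbps  # 1 GB = 8000 megabits (decimal units)
    print(f"{label}: {seconds / 60:.1f} minutes")
    # → typical North American broadband: 133.3 minutes
    # → FCC '100 Squared' target: 6.7 minutes
    # → Google's gigabit fiber: 0.7 minutes
```

At a gigabit the film arrives in about 40 seconds – comfortably inside Google’s “less than five minutes,” which at that speed would actually accommodate a file of roughly 37 GB. At today’s 5 Mbit/s average, the same download takes well over two hours.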