Monday, May 31, 2010

High Fives

Once again this weekend I participated as a runner in Ottawa Race Weekend. Unlike so many other years, the weather, as many openly remarked before the start, was pretty much perfect. Combine that with the fantastic crowds lining the route and it was a lot of fun to push through the physical discomfort of a long, fast road race. For myself I will only say that while I didn't do as well as I wished, I did as well as I could reasonably expect. That counts as a success.

That, however, is not what I wanted to draw notice to in this retrospective look at the event. What I noticed this time is the number of little kids lining the route holding their hands up and out hoping for a high five from passing runners. I've seen this before but not to the extent that I witnessed this year. But there they were, standing on their own, usually next to siblings or hand-in-hand with their parents, pretty well everywhere.

After watching some of the runners making brief detours to the road edge to meet the demand, I joined in. This is a bit tricky when you're physically extended and moving fast, but the reward is lots of shy smiles lighting up the kids' faces. It was well worth the effort to carefully connect with their outstretched palms. Despite some initial worry, it was actually quite easy to gauge how to manage this high-speed manoeuvre without slowing me down or slapping their hands too hard.

That's a wonderful memory that keeps me smiling even as I sit here in some discomfort as my body recuperates from the punishment of the run.

Friday, May 28, 2010

Disaster = Profits

As the attempt to stem the oil leak continues in the Gulf of Mexico and the environmental disaster spreads, it is no surprise that the shares of BP, Transocean and other closely involved companies are taking a hit. This disaster will cost them dearly in compensation claims and government penalties, and the amount is unpredictable; markets hate unpredictability, and invariably punish companies facing large uncertainties in their revenues and expenses.

Aside from the specific fallout from the disaster, the implications for the energy sector are bullish. There is of course an immediate emotional response, among both large and small investors, to exit companies involved in petroleum exploration and production beyond those directly involved in the Gulf spill. Some of those sentiments are well founded since many companies will be negatively impacted by Obama's order to halt offshore drilling.

But from a global perspective this is all quite bullish due to basic economics. All of these reactions are serving to reduce current and future supply from what were until now reliable and predictable sources. Yet our economies and personal habits do not so easily shift gears, so demand continues unabated. This is economics 101: reducing supply in the face of steady demand causes an increase in price. This means we can look forward to higher global prices for crude oil (and its refined products) and higher natural gas prices in North America (a more regional market due to transport constraints).
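To make the supply-and-demand arithmetic concrete, here is a minimal sketch with a hypothetical linear demand curve; all of the numbers are invented for illustration and do not describe any real market.

```python
# Hypothetical inverse demand curve: the price buyers will pay at a given quantity.
# Numbers are illustrative only.
def market_price(quantity, intercept=150.0, slope=1.0):
    return intercept - slope * quantity

supply_before = 90.0   # e.g. million barrels/day before the moratorium (assumed)
supply_after = 85.0    # the same market after supply is curtailed (assumed)

print(market_price(supply_before))  # 60.0
print(market_price(supply_after))   # 65.0 -- less supply at steady demand, higher price
```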

Couple this with the forecast of a more severe hurricane season and you get quite rosy possibilities for oil and gas investments that are far away from the Gulf of Mexico. It's alright to feel a little bit uncomfortable with the idea of profiting from the disaster, but do consider profiting in the short to medium term due to supply constraints. Canadian companies are well-positioned to do well in this scenario.

I don't feel any contradiction in being environmentally minded and investing in the oil and gas sector. It is possible to reduce one's own consumption while profiting from the consumption of others.

Wednesday, May 26, 2010

Unpredictability of Communications Act Amendments

I wrote recently about the FCC's latest strategy to deal with the network neutrality issue by reclassifying broadband as a telecommunications service. Not unexpectedly this has upset some politicians who are perfectly happy with the status quo of letting the market, not the government and especially not the FCC, sort things out. What is surprising is that there is now a move afoot by some in Congress to reopen the Communications Act in order to wrest control of the agenda from the Administration and the FCC.

This is an unusual move since, as this New York Times article correctly notes, it is not something that Congress often takes on, or takes on expeditiously.
Any overhaul is likely to take some time. Congress has little time left on this session’s calendar. And it took more than five years to produce the 1996 Telecommunications Act, which itself was the first major overhaul of telecommunications law since the Communications Act of 1934.
Despite the apparently poor odds of Congress following through with legislation, the possibility must not be discounted. From the outside, the complex and fractious grandstanding and hearings always seem to go absolutely nowhere. But as in so many politically charged issues, the thick smoke can hide an actual fire; substantive and even conclusive discussions may develop out of public sight. This is where the US government differs so much from Canada's: there is actual negotiation among the parties, since the Administration can only influence or pressure Congress, not order it to act in any specific way.

I remember when, over 14 years ago, I was certain that Communications Act reform was all but impossible, an opinion I formed from casual observation of media coverage of the antics. I had no particular stake in the outcome, only knowing that whatever did finally emerge would affect my employer at the time (Nortel) to a greater or lesser degree. Imagine my surprise when one morning I walked into work and one of my employees informed me that the Senate and the House had come to an agreement and the Telecommunications Act of 1996 would shortly be signed into law by the President.

That was not the last time those politicians surprised me, but it did teach me to prepare for surprises. Anyone on the outside, even within the US, simply cannot know what is going down behind the veil of the publicity smoke screen, or at least not with any clarity. There is danger in pontificating with imperfect knowledge.

Regardless of Congressional process, the FCC will continue on their path. In a sense they are now in a competition to see which body will take the next step to reform communications law and regulation. I expect the FCC to prevail, but I am now wise enough to know just how wrong I could be.

Thursday, May 20, 2010

Big Tent Regulation

In my earlier post comparing and contrasting telecommunication and information services I promised a third one on where I see regulation evolving in the future. That is the subject of this post. To begin, I will take a brief foray into the past so that we can understand how we've come to the present regulatory dilemma, of which network neutrality is but one aspect. While my focus is on the US, the discussion is applicable to Canada and other countries.

Going back a century or so, it was at first necessary to regulate radio spectrum since managing interference among services could only be accomplished with the heavy hand of the government. Without regulation, commercial, private and government (including military) services would simply select the best frequency (wavelength) to meet their own needs, without regard to other users of the radio spectrum. The problems should be obvious. This brought the FCC into existence -- back then it was known as the FRC, emphasizing its initial focus on radio. Over time the radio spectrum was segregated by service, licenses were issued and technical standards established, and performance was both monitored and enforced. Everyone benefited from this state of affairs, including the public. International treaties were negotiated to deal with the habit of electromagnetic radiation to ignore borders.

Telephone service was by then old hat, having been offered to the public for several decades. Since these signals ran over wires, there was no need for the type of regulation in place for radio. There were, however, other challenges. One was interference, since radio transmissions could be induced on telephone wires. This was not so much a matter of regulation at first since the telephone companies were able, and felt obliged, to take measures to protect their equipment. There was no third-party equipment market selling to consumers since in those early days the entire network, including the telephone sets, belonged to the telco. Therefore, the telco could handle the issue themselves -- there was no such concept as FCC type approvals (rules Parts 15 and 68).

A bigger problem, one with serious political ramifications, was the insatiable appetite of AT&T; for many years they had been buying up smaller telcos (a century ago it was much like a dot-com boom) and were close to creating a monopoly for themselves. In danger of being nationalized by the government, they halted their acquisitiveness, stopping at about 80% of the total market. The existence of several thousand small telcos across the country made it politically unpalatable for the government to nationalize the industry. Interestingly, this 80% figure remained stable for half a century, until implementation of the AT&T Consent Decree in 1984.

After WW-II, television took off, and came under radio regulation for spectrum allocation, technical standards and licensing of transmitters. Networks which had come into existence in the radio era extended into TV and came to dominate programming. Apart from decency, ties to advertisers and other basic rules, content was unregulated.

Then cable came along, improving reception quality and quantity of stations using direct feeds, amplifiers and scramblers (this last was for premium content). Being a new type of distribution model, and even a source of new local content, it also came under regulatory oversight. This became particularly important as cable companies began to offer premium and unique content, required access to rights-of-way, competed with others and rebroadcast and substituted content, including advertising.

At every stage, regulation became more expansive and complex, often falling under several regulatory regimes: federal, state and municipal. As is typical with governments, they tend to add more laws rather than simplifying or removing them. Even so, as technology evolved the regulations failed, as they often do, to keep up with what companies and their customers were able to do. The Communications Act covered in whole or in part: wired telecommunications and information services, wireless broadcasting, cable distribution and broadcasting, wireless personal communications, including paging, cellular and point-to-point, private radio and dispatching, satellite transponders and direct broadcast, and a whole raft of military and commercial services.

One unwritten assumption upon which much of this regulation rested was that there was a strong correspondence between technology and service. Copper loops were good for telephony and low-speed data; coax was for TV and radio distribution; microwave was for point-to-point communication; wireless was divided into bands, within each of which one service utilized one style of modulation (coding) to maximize coordination, sharing and standardization of products. When it first appeared, fibre complemented point-to-point communications technologies and so did not require special attention.

Then everything changed. Copper carried high-speed data and even TV signals; cable carried high-speed two-way data and telephony; wireless carried voice, data, TV and information services; satellite circuits paired with copper to provide asymmetric high-speed data; and every data service supported IP and internet access, which in turn supported a bit of all of the above. Quite suddenly the Communications Act and all forms of regulation became a ball and chain impeding technological progress, while also unable to grapple with the accelerating speed of the marketplace. A decade of tweaking the rules has only made the problem worse. Something has to give.

This brings us back to where we began, with the FCC proposing to introduce new rules to treat broadband as telecommunications rather than an information service. This, I believe, is a sensible step -- at least in concept -- to set things aright once more. In this battle, network neutrality will become a minor skirmish in the larger war, which will be to completely remake communications regulation. The reason hearkens back to an old and greatly reviled buzzword in the industry, one that has been grossly abused in technology-sector marketing for two decades. This buzzword is convergence.

However, unlike in the past, this time convergence actually has some weight since there really is convergence occurring in the marketplace and not just in marketing slogans. Once the move to bidirectional broadband data became standard for copper, coax, fibre and wireless, especially personal mobile wireless, IP became the common services protocol. Since IP, whether private or in the public internet, quite naturally accommodates voice, video, real-time and store and forward services, the old assumption of technology determining service became obsolete. It also separated transmission from service so that service providers lost control of services and content as entrepreneurs and users themselves developed software and services that exploited IP networks. Not only did regulators fall well behind the state of the art, they also found themselves having to regulate not just a modest number of commercial entities but every citizen and enterprise, including those outside the country's borders.

Regulators and legislators were (and are) overwhelmed. They have had some success in using their power over commercial entities to leverage control over what the universe of individuals can do, but this is imperfect and, often, politically sensitive. This goes well beyond my topic today, but suffice it to say that the regulations will have to be simplified to deal with this convergence. We can call this Big Tent regulation since we want to take the now mostly superfluous technology distinctions out of the regulations to focus on the critical issues of services and content. With technology and the marketplace now experiencing convergence, so too must the laws and regulations. There will still need to be technology regulations -- in particular, wireless spectrum usage and equipment type approval -- but those must be treated separately from services and content. Individuals and enterprises providing services and content can now be effectively regulated without regard to the base technology.

This is what I'll be looking for as the FCC process unfolds in the coming months, and possibly years. With all the political sensitivities involved, we can be sure that legislators will get involved to influence the FCC, even going as far as introducing new legislation. It'll be messy, to be sure, but necessary. If they do it well there is a chance it will become a model for other countries. There is precedent for that, as has been seen for spectrum auctions, service provider safe harbours, and competition rules to reduce monopolistic practices.

When I say "do it well" with regard to communications regulation, I mean that the regulator should take the minimum steps to ensure that technology and business model innovation should not be constrained, and that there is a level competitive playing field, creating opportunities for service providers and real choices for consumers. True convergence enhances these possibilities more than has previously been possible. Realizing these benefits will require some wisdom on the part of the regulators and their political master. No matter the outcome, it will be very interesting to watch the process unfold.

Monday, May 17, 2010

Android - Why Google Needs It To Succeed

If you've followed this blog for a while you'll know that I am involved with Android application development. This means I have a vested interest in seeing Android succeed. It doesn't mean that I will be ruined if Android fails or falls short of iPhone's appeal, just that as matters currently stand I benefit from Android's continued market acceptance.

Regardless of my situation, I have always been puzzled by Google's intentions with regard to Android. Their attention (and the resources allocated) to the Android Market and to their OHA partners is exceptionally poor. I would have thought that they would at least give the Android business more personal (human) attention than they do the bulk of their products and services -- Google typically excludes any human-operated aspect from their support systems, relying instead (it seems) on random sampling of feedback and automated, statistically-driven response systems.

It is natural to question just how committed Google is to Android. They have not even adequately explained how HTML5 and their Chrome OS will be positioned versus Android and native Android applications in their overall market strategy. They play Android close to the chest, just as they do everything else. That's their choice. Ultimately, even for a company as technology driven as Google, there has to be money in it.

The many delivered and rumoured phones, tablets and other devices certainly demonstrate that Android has gained broad acceptance among carriers and device manufacturers. Since the platform is available at no charge, this is not how Google makes money from Android. It's been said that the money would flow from the Google experience applications and services that are typically bundled with Android, but that bundling is optional and therefore uncertain, as has already been seen in several devices. Again, Android on its own is not a guaranteed revenue stream for Google.

I suspect the actual business is one of strategic positioning versus the alternative platforms, especially Windows and iPhone. Without a presence in the mobile device market -- Google does not build hardware -- these competitors can lock Google out of the mobile ad market. This comes about in two ways: mobile search and in-app advertising.
...Google prefers to have mobile users access the mobile web instead of have them locked up within smartphone applications. Users tend to use the Google search engine more if they’re surfing the web, which brings Google more cash.
Since Android had to be open with regard to applications and services to gain acceptance with carriers and device vendors, there is risk that Android adopters would sign deals with other search companies such as Microsoft's Bing and other ad networks such as AdMob. However, and I believe this is the key point, Google is at even more risk of losing search and app presence on iPhone and Windows. This is why, for example, the niche mobile advertising marketers are being acquired by Google, Apple and the other large players.

Google is also threatened by Apple's move into in-app advertising with iAd. Since Apple makes its money on selling hardware but not on software or content (these drive hardware sales), they are happy to create strong incentives to app developers and advertisers to use iAd.

If my analysis is right, Android is Google's opportunity -- although an uncertain one -- to protect its ad business for search and apps when its two competitors seek to lock them out on the other mobile device platforms. With the mobile web and apps taking an increasing share of total time spent by users on the internet, this is no small concern, even if there remains a real possibility that native apps will eventually be supplanted by HTML5-driven web apps.

I'll be watching announcements coming out from Google this week during their I/O conference to see if there are any that indicate their future direction on apps, Android, Chrome OS and mobile search.

Thursday, May 13, 2010

Telecommunications and Information Services - Telling Them Apart

The discussion of network neutrality in the US refocuses attention on the distinction between telecommunications and information services. As I mentioned in the previous article, it is largely an artificial one, but one that resulted from the FCC's long efforts -- this goes back 25 years or so -- to draw a bright line between industry sectors and the need for distinct modes of regulation for each. This came about long before the web was invented and before the internet became a subject of public knowledge, when digital switching and modem usage became widespread.

I will begin by reviewing the technicalities of how the law distinguishes between telecommunications and information services before moving onto the present choice that the FCC is taking and why it matters. Here are the two definitions from the US Code that are most relevant to us:
(20) Information service
The term “information service” means the offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications, and includes electronic publishing, but does not include any use of any such capability for the management, control, or operation of a telecommunications system or the management of a telecommunications service.

(43) Telecommunications
The term “telecommunications” means the transmission, between or among points specified by the user, of information of the user’s choosing, without change in the form or content of the information as sent and received.
These are pretty clear, so I only need to note that there are additional sub-definitions for telecommunications, and the one for telecommunication service is merely one of offering telecommunications to the public for a fee. Let's compare telecommunications and information services by means of the two attributes that most distinguish them:
  • Transformation
  • Store and forward
Transformation would seem to be a simple enough attribute, where a service is telecommunications if the provider does not "change...the form or content of the information as sent and received" by the user. With digital switching and transport it turns out to be more complex than one might at first imagine.

The telephone network is optimized for human speech, covering frequencies up to about 4 kHz. This is far from high fidelity: compact discs (CD) employ 44.1 kHz sampling to accurately reproduce frequencies up to 20 kHz, whereas PCM digital codecs for telephony sample at 8 kHz. However, there are many codec standards for accomplishing this task. Your voice may be transformed many times between your microphone and the other party's speaker, including from and to analogue at each end. There are a variety of PCM standards in the wired network, plus more for wireless (e.g. AMR) and for VoIP. The bit stream is transformed at each boundary between these network domains.
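To make the sampling figures and the idea of incidental transformation concrete, here is a minimal sketch. The companding functions use the continuous mu-law formula behind North American PCM telephony; real G.711 also quantizes each sample to 8 bits, which this sketch omits.

```python
import math

MU = 255.0  # mu-law companding constant used in North American PCM telephony

def nyquist_limit(sample_rate_hz):
    """Highest frequency a given sampling rate can capture."""
    return sample_rate_hz / 2.0

print(nyquist_limit(8000))    # 4000.0 Hz  -- telephone band
print(nyquist_limit(44100))   # 22050.0 Hz -- CD audio

def mu_law_encode(x):
    """Compress a linear sample in [-1, 1] into the companded domain."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_decode(y):
    """Expand a companded value back into a linear sample."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

sample = 0.3                        # a linear 'voice' sample
companded = mu_law_encode(sample)   # the transformed value carried on the network
restored = mu_law_decode(companded)
print(round(restored, 3))           # 0.3 -- transformed in transit, same voice out
```

The round trip illustrates the point: the signal is repeatedly transformed between domains, yet what comes out is still, for practical purposes, your voice.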

What saves this type of transformation is that ultimately it's your voice in, your voice out (there are even more nuances in voice-band transmission, but let's not get too detailed). Therefore despite all this transcoding, adding and subtracting and other things, this is still telecommunications. If, however, you utilize a service that modifies your voice to disguise your identity, that is not telecommunications since the transformation is substantive and not incidental; this is an example of a simple information service.

Store and forward is more interesting. It should be self-evident that services like email and voice mail are information services since there is explicit storage and forwarding involved, which we can consider as one extreme. At the other are the many instances of bit-stream storage and forwarding (switching and transport) that are an unavoidable artifact of digital switching. Unlike analogue transmission and switching, there is buffering at many intermediate locations for even an ordinary circuit-switched voice call. It's brief and usually unnoticeable, but there is a measurable latency compared to analogue. For some of the low-bit-rate codecs occasionally used on VoIP services, the latency can be very noticeable, though not as bad as on satellite circuits.
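As a rough illustration of where that latency comes from, here is a small one-way delay budget with hypothetical but plausible values; none of the figures below come from the post or any standard.

```python
# Hypothetical one-way delay budget for a VoIP call (all values assumed).
codec_frame_ms = 20        # audio collected into packets, e.g. 20 ms per frame
codec_lookahead_ms = 5     # low-bit-rate codecs often need some look-ahead
jitter_buffer_ms = 40      # receive-side buffering to smooth uneven packet arrival
network_transit_ms = 35    # propagation plus queuing across the IP network

print(codec_frame_ms + codec_lookahead_ms + jitter_buffer_ms + network_transit_ms)  # 100 ms

# For comparison, propagation alone to and from a geostationary satellite:
satellite_path_km = 2 * 35786           # up-link plus down-link distance
speed_of_light_km_per_ms = 299.792
print(round(satellite_path_km / speed_of_light_km_per_ms))  # ~239 ms one way
```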

This type of store and forward is acceptable for a telecommunications service since it is purely incidental and due to the underlying technology, and because the service remains real-time in human terms. Messaging services like SMS and IM are pseudo-real-time since messages may indeed be stored, or even lost or left undelivered. These would likely be classed as information services. I say "likely" because they have not been as thoroughly tested as some other services; they came about more recently, after the advent of mandated competition, which tended to make these questions less important.
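The reasoning in the last few paragraphs can be summed up as a toy decision rule. This is purely an illustration of the two-attribute test sketched above, not the FCC's actual classification procedure.

```python
def classify(substantive_transformation, substantive_store_and_forward):
    """Toy rule: incidental transcoding and buffering don't count; substantive changes do."""
    if substantive_transformation or substantive_store_and_forward:
        return "information service"
    return "telecommunications"

print(classify(False, False))  # ordinary voice call, despite transcoding -> telecommunications
print(classify(True, False))   # voice-disguising service                 -> information service
print(classify(False, True))   # email, voice mail, likely SMS/IM         -> information service
```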

There is a side effect of declaring internet services, including broadband, to be information services -- which is how they have been classified: it implies that any use of those services is also an information service. Since IM is in this category, but not SMS, its store and forward attributes become largely irrelevant. More fundamentally, while IP packet switching in the internet is substantially equivalent to real-time voice calling -- no substantive storage and forwarding or transformation -- it has been classified as an information service purely as a policy matter so that it could escape undue regulatory control.

With VoIP this policy approach became problematic. Like legacy voice calling, VoIP does meet the test as a telecommunications service, whether as a stand-alone service or when bundled in an application such as IM or Skype. This is a subject I've covered before, where I talked about the duck test: if it quacks like a duck, walks like a duck and looks like a duck, it's a duck. Except that VoIP is an application on top of an information service. This is the type of conflict we can expect when regulators get too specific in their classifications. Technological innovation and the free market always move faster than regulators and governments.

In my next article I'll talk about where I think this is all heading, and how the future of internet and broadband regulation is likely to go. The FCC's present inclination to reclassify broadband service as a telecommunications service is likely to play an important role in the future that I foresee.

Tuesday, May 11, 2010

Network Neutrality in the US

Once upon a time we all accessed the internet in just one way, by dialing up into an ISP modem bank (ok, this isn't entirely true but it's close enough). Eventually, after many years and much consumer demand, there came DSL, cable modems and wireless, and ever-increasing speeds and accessibility. Competition existed, but the choices were often limited and allowed the service providers to exercise their ability to raise prices, lower service quality and interfere in ways they believed to be to their commercial benefit. As dial-up passed into history and the carriers became dominant in the ISP sector, the situation deteriorated. Finally there came the cries for network neutrality.

It is quite natural and expected that carriers would seek to control or otherwise manage internet access services since that is a good way for them to move up the telecom value chain, and avoid becoming a commodity carrier of bits. This tendency is also quite naturally countered with a suitable mix of regulation and competition. However, competition is constrained by the high expense of building facilities to reach the market. Absent prospects for a healthy competitive marketplace, the calls for regulation become more numerous and louder.

In an age when governments of all stripes are trying to reduce regulations as one means to limit government spending, the political reluctance to get involved in the broadband conflict is understandable. Every country responds in its own way. I most recently discussed this in my closing article on CRTC decisions on GAS. Now we have the FCC in the US wading into the morass of network neutrality, which is shaping up to be an epic battle. Although details are sparse, it is worth a look to see what they are attempting to accomplish and why. I plan to skirt some of the blatant politics of the issue so that I can focus on what I believe is more important.

First, I have to say that FCC Chairman Julius Genachowski's public statement is a fine example of clarity in communications. There are many who don't like what he says, but what he says is unambiguously clear. For a politically-motivated agency to do so is to be applauded. I recommend that you read the full statement if this topic interests you. I will only focus on the following extracts:
...the Internet should remain unregulated and that broadband networks should have only those rules necessary to promote essential goals, such as protecting consumers and fair competition.
...
  • Recognize the transmission component of broadband access service—and only this component—as a telecommunications service;
  • Apply only a handful of provisions of Title II (Sections 201, 202, 208, 222, 254, and 255) that, prior to the Comcast decision, were widely believed to be within the Commission’s purview for broadband;...
I had to run to my bookshelf for a bound copy of the Communications Act of 1934 (with amendments) to see what the referenced Title II sections were about. I don't need to review these in detail, and consider it sufficient to mention that the choice backs up the policy direction of "protecting consumers and fair competition." There is a clear intent to prevent ISPs from restricting what services and destinations their customers can reach.

This will be unacceptable to many companies with a vested financial interest in controlling or interfering with access so that they can either derive additional revenue from transmission (e.g. ad injection) or usage (e.g. media downloads), or favour services in which they have an interest. Those companies often have the ear of political leaders, who will exercise their influence on the FCC. It will be quite a fight when the FCC formally issues an NPRM. It will also be interesting to see what lessons the CRTC will learn as they monitor the proceeding down south.

Later this week I will follow up this article with one regarding the distinction between telecommunications and information services. It is largely an artificial one, but one that resulted from the FCC's long efforts to draw a bright line between industry sectors so that they would fall under separate regulatory regimes and policy objectives. As we'll see, the problem is not an easy one.

Monday, May 10, 2010

Hazards of Rescuing Greece

I suspect that many of us have had at one time a friend or relative who has through negligence or design dug themselves into a financial hole, by living beyond their means. In other words, spending more than they earn, or taking on debt they cannot service from their earnings. I don't mean otherwise responsible people that hit a bump on the road of life, such as losing a job or having to deal with some other misfortune, but those who seem destined to make trouble for themselves.

When this person approaches you to guarantee a loan, you will quite sensibly want to think it over carefully. Clearly the lender knows the borrower is high risk and wants to reduce that risk by having you backstop a potential default. This could put you in the position of footing the entire cost of the loan, both principal and interest. There is no direct financial benefit to taking on this risk, only the prospect of helping out a person who matters to you, or perhaps keeping peace within the family.

This is in essence what Europe is now doing. Monday's astounding market rebound reflects the belief of many that the bailout can work, that it will at least temporarily address the risks of lending funds to the EU's misbehaving family members. Greece (and the other so-called PIIGS countries) must become fiscally responsible or the problems will return, and next time they may be allowed to deal with their own demons. Except that next time you can be sure that the rest of Europe will have put in place safety valves whereby Greece can be cut adrift without undue impact on the Euro and global markets. Those safety valves do not currently exist, so this bailout is as much to the benefit of Germany, France and other countries as it is to the PIIGS. They can do this now, while the problem is mostly contained within Greece, since Greece is small in comparison to the other PIIGS and the EU.

None of this solves the cause of the problem, that Greece has been running an enormous deficit for a long time, due to internal societal pressures and government weakness. This will still have to change. Keep in mind that this bailout only requires Greece to reduce their government deficit from 13.6% of GDP to 8.6%. The reduced number remains far above the 3% that Eurozone member countries are required to maintain. The EU and ECB are in a bind in part because they have failed to enforce the 3% rule. That makes them partly responsible for the current crisis. The only reason to moderate the reduction in Greece's deficit is to maintain public order. I find it arguable that citizens of Greece are accountable for this crisis and therefore should bear the burden of a steep reduction in public spending, even into the black. Except that taking the moral high ground is unworkable: the political instability that would result, and spread beyond Greece's borders, makes the go-slow approach necessary, if unseemly. The real threat isn't to the Euro but to the cohesiveness of the EU itself.

For today at least, I am enjoying the market rebound. It should last awhile, but unless more is done to solve Europe's problems there is no reason for complacency.

Friday, May 7, 2010

CRTC Decision 2010-255 on Usage Based Billing

This week the CRTC finally brought down their final decision on usage-based billing for GAS (gateway access service), which is required by non-facilities based ISPs to offer DSL retail services to customers with telco copper loops. No one was surprised that CRTC supported usage-based billing, though you might never realize this from the many and loud comments on this CBC story and elsewhere. This is not a popularity contest where rulings are made on the basis of votes. The CRTC is not unaware of the noise being made by interested parties:
5. The Commission also received a large number of comments, mostly from individuals, that almost unanimously opposed the Bell companies’ applications.
Since I've covered this topic several times in the past, most recently in August of last year (see this October article for my coverage of the traffic management decision), I thought it worthwhile to wade into this one more time to close off any remaining loose ends. Since everything I've said before still applies with this ruling, I can keep this post fairly brief.

The ruling itself is somewhat lengthy. If you want the summary, you can simply refer to the CBC's summary in the previously mentioned article. If the subject interests you and you have an hour to kill, by all means read the full ruling. You will learn some important lessons about what drives these types of decisions. Let's review these (you should also refer back to my August 2009 article).

The CRTC remains firmly focused on promoting facilities-based competition, not resale of incumbents' facilities. This is especially clear in this policy extract from the decision:
V. Are the Commission’s determinations consistent with the Policy Direction?

79. The Commission considers that its determinations in this decision advance the telecommunications policy objectives set out in paragraphs 7(b), (c), and (f) of the Act.[8] The Commission further considers that its determinations are consistent with the Policy Direction requirements that (a) the measures in question be efficient and proportionate to their purpose and interfere with competitive market forces to the minimum extent necessary to meet the above policy objectives, and (b) the measures neither deter economically efficient competitive entry into the market nor promote economically inefficient entry.
In other words, the CRTC is uninterested in promoting retail competition through resale. They want more facilities-based competitors. This will likely only come about through new wireless infrastructure, not copper, cable or fibre, since these are all highly capital intensive and slow to deploy.

The CRTC is also interested in non-discriminatory treatment of both retail and wholesale service providers, but insists on leaving competition to the market to sort out. This underlies their continued emphasis on GAS providers treating the ISPs that buy GAS the same as their own retail customers. Not only that, they would like to regulate copper and cable providers the same, so far as is possible.
80. Finally, the Commission notes the Policy Direction requirement that regulatory measures related to network access regimes, such as wholesale GAS, should ensure the competitive neutrality of those regimes to the greatest extent possible. As noted above, in Telecom Decision 2006‑77, the Commission approved UBB for the cable carriers. The Commission considers that approving the Bell companies’ proposed economic ITMP, with the changes set out in this decision, will allow the Bell companies to apply UBB on a comparable basis to the cable companies, and that such approval is consistent with the competitive neutrality aspect of the Policy Direction.
A final lesson to be learned from this decision is that it is never a good idea to have a regulator intervene in a market to the extent that they must determine service prices. If you have any understanding of network engineering and economics you will be horrified at the superficial analysis and rationale behind the CRTC's setting the price for billing excess gigabyte usage. This begins with paragraph 52 of the decision and rambles on from there.
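To give a flavour of the kind of back-of-the-envelope network arithmetic being alluded to, here is a sketch with entirely hypothetical figures. It shows only that transport cost per gigabyte falls out of a link's capacity, cost and utilization; it says nothing about the rates the CRTC actually approved.

```python
# All figures below are assumptions for illustration, not from the decision.
link_capacity_mbps = 1000       # an assumed 1 Gbit/s aggregation link
monthly_link_cost = 10000.00    # assumed all-in monthly cost of that link, in dollars
average_utilization = 0.30      # links are engineered for peaks, so average use is lower

seconds_per_month = 30 * 24 * 3600
bytes_carried = link_capacity_mbps * 1e6 / 8 * seconds_per_month * average_utilization
gigabytes_carried = bytes_carried / 1e9

print(round(gigabytes_carried))                          # ~97200 GB per month
print(round(monthly_link_cost / gigabytes_carried, 2))   # ~0.10 dollars per GB under these assumptions
```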

With this post I am done with this topic. In reality very little will change: traffic management (throttling) will continue, the majority of broadband users will not be hit with overage charges, and better prices and services will have to wait for effective wireless competition. Cries of woe from customers and ISPs will have no impact on the outcome.

Thursday, May 6, 2010

Palm and HP

In the week since HP announced that it would acquire Palm for $5.70 per share there has been rampant speculation in the trade media as to what it all means. Long-suffering Palm shareholders have stars in their eyes, and are even holding out hope for a better deal, which is reflected in the persistence of trades above the offer price. Typically the share price for a deal expected to close as-is would hover a couple of percent below the offer price, at roughly $5.55-$5.60, establishing an equilibrium between sellers looking for an early exit and arbitrageurs willing to wait for the deal to formally close. A richer bid is unlikely since there is no obvious buyer to whom Palm's assets are worth more. Which brings us to wonder how it is that HP thinks that Palm is worth $1.2B.
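For readers unfamiliar with merger arbitrage, the spread arithmetic looks roughly like this. The offer price is from the announced deal; the market price and the time to close are assumptions for illustration.

```python
offer_price = 5.70        # HP's announced cash offer per Palm share
market_price = 5.55       # assumed pre-close trading price
months_to_close = 3       # assumed time until the deal formally closes

spread = offer_price - market_price
gross_return = spread / market_price
annualized = gross_return * 12 / months_to_close

print(f"spread ${spread:.2f}, return {gross_return:.1%}, annualized {annualized:.1%}")
# spread $0.15, return 2.7%, annualized 10.8%
```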

It's no surprise that HP wants into the smart phone market, since it is an easy reach from their PC business, which focuses on computing hardware; the move was widely expected ever since their largest competitor, Dell, made it. After all, if consumers are beginning to prefer smart phones and tablets over traditional PCs and netbooks, the manufacturers of those devices must follow the money or lose revenue. Except that where Dell chose Android as the platform for its own devices, HP is opting for webOS and devices which, currently, only run webOS.

It is true, as some commentators have said, that this approach differentiates them from Dell, Apple, HTC, Nokia, Blackberry and other smart phone vendors. Unfortunately they have tied up with the platform most likely to lose. While this is something that many have predicted (including me), it is also apparent in Palm's steep revenue decline, which reflects low interest from consumers, carriers and app developers. HP said that they are serious about webOS and plan to invest further in the platform. They have the resources to stick it out for years if they are truly committed, but will webOS see a turnaround now that HP is behind it?

Success can be measured. Key indicators to watch for in the next quarter are a halt to the decline in unit sales and positive indications of interest from carriers beyond Sprint, and especially the largest carriers in the US and elsewhere. They won't close deals that soon, but if they can't demonstrate real interest then there is reason to doubt. Unfortunately for HP, I very much doubt that consumers and app developers will be impressed by HP's purchase of Palm. Consumers may respond to a new and extensive marketing campaign, which is unlikely to occur until the deal closes later this year. Attracting more app developers will be delayed until there are hard numbers that show webOS device sales are trending upward.

Yet there is still the matter of all those smart phone platforms in the market, among which webOS is the weakest of the bunch. Assuming that HP wants to keep webOS proprietary -- unlike Android, Symbian and some others -- they need to start pushing out new and compelling devices soon. But, again, they can't do this effectively until after the deal closes. It is conceivable that in the interim Palm could license the software to HP so that HP can get to work immediately, although this would be irregular since the deal itself will have to undergo some (routine) regulatory scrutiny and could imperil Palm's ability to attract and respond to better offers. This is a matter of shareholder interest that cannot be easily dismissed.

Apart from webOS, the other major assets in the deal are Palm's patent portfolio and the hardware devices. The devices are probably not worth a great deal in comparison to webOS and the patents since HP will have its own device plans and, besides, devices age rapidly in the market and all smart phones utilize components from pretty much the same suppliers: the vendor essentially just assembles them, modifies the software to run on the device, and gets the regulator to certify the final product for sale. Look at the bills of materials for the iPhone and any Android phone and they are noteworthy for their striking similarity.

The value of Palm's patents is debatable. Since I doubt that HP intends to license the portfolio to others or to sue other vendors, the patents' value is most likely to be defensive: to negotiate cross-licensing deals and to shield against suits from others. That is not inconsequential since these lawsuits can result in heavy costs and awards, easily running into the hundreds of millions of dollars, each. This implies that even if HP were to dump webOS and go with Android, purchasing Palm might still be justified in their move into the smart phone business.

Getting back to differentiation, I would say that it is not so much webOS itself that provides this to HP but the Palm team that produced the device and its user interface. In comparison, Dell's and Acer's Android devices appear (at least so far) to be pretty generic Android. This is very unlike HTC, Motorola and even Samsung Android phones. The difference is that PC companies do not have experience developing custom user experiences, relying instead heavily on Windows. However one chooses to value the Palm team, it does give HP a leg up on their traditional competitors. How this will stand them in the broader smart phone market remains to be seen.

After all this rambling, I still find it difficult to come to any conclusion that I find satisfying. HP has a difficult path ahead of it, where success is strongly tied to their unproven ability to nurture webOS into a contender rather than a might-have-been. They will require a healthy dose of luck in addition to flawless execution.

Tuesday, May 4, 2010

Unsustainable Lifestyles

In the midst of so many economic catastrophes that continue to unfold, the situation in Greece is an interesting one. Their dilemma is lamentable even though it is of their own making. It is also having an impact on other countries, including Canada, since a contraction in Europe reduces demand for what we primarily depend on for our own income: commodities.

The reaction of many Greek citizens is, so far, one of denial. They blame their government, which was only doing what the electorate wanted, the foreign lenders, the banks and, well, just about anyone other than themselves. Whether they like it or not, they will eventually have to accept that the problem is their own since no one else can fix it for them. Too often the media reports seem to suggest that the EU and IMF are offering them a handout, but it is really just a hand up. They do not need to accept the offered hand, but they likely will since the alternatives are worse, both for them and for everyone else.

My European experience is limited, so take what follows with a grain of salt. There is a pattern here that is echoed in Spain, Portugal and possibly Italy. They are all countries that are now fervently democratic, yet it was not too long ago that they were all under authoritarian regimes. Of these countries, Italy has been democratic the longest and Greece the shortest. This matters since that past colours the present.

Each of their so-called far-right governments shifted quite firmly to the left once democracy took firm root. One unfortunate consequence was to place priority on outcomes rather than opportunity. This occurred in a culture where governments were responsible for many aspects of society and public institutions. They wanted the social benefits of other European countries, and did not see any reason to delay their implementation. But without having spent the time to build a strong private sector -- or at least one strong enough for what it was being asked to support -- those benefits required funding beyond what the tax base could support. This was made worse by a disrespect for authority that resulted in widespread tax evasion and a fairly large underground economy that was beyond the reach of the tax collector.

The funds therefore came from elsewhere in the form of debt. Lots of debt. More than their country's GDP-worth of debt. Like an individual with a credit-card addiction, they were living well beyond their means, and that is unsustainable. It only lasts for as long as there are willing lenders and the debts can be serviced. Greece can no longer service its debts.
"We want an end to the freefall of our living standards," said Spyros Papaspyros, the head of ADEDY, which represents about half a million workers in the Aegean nation of 11 million.
Unlike you and me, Greece is a sovereign state and does have the power to refuse to service or even repay its debts. There are voices in Greece suggesting that this be done since, they reason, the lenders took the risk and lost. This won't work since Greece's spending trajectory remains unsustainable and can only be funded with additional debt. If they default there will be a scarcity of institutions willing to lend them those funds. With or without the demands of the EU and IMF, onerous spending cuts are unavoidable. This is why the Greek government consented to the terms; there really was no alternative.
"Whether Greece can actually adjust, whether their social cohesion will remain -- that's the key thing to watch," said sovereign ratings analyst Tom Byrne of Moody's.
How long the political and social strife will continue in Greece is difficult to say. We can hope that it ends soon, and that they accept the inevitable. Regardless of who gets blamed, no others but themselves can solve the spending problem. They will have to pay their taxes and reduce public expenditures, including programs and the civil service.

Some countries, especially those in some parts of the third world, have another, darker option that Greece does not. That is to default and then take on debt or "grants" with political strings attached. Usually this involves trading sovereignty or unity for money. The US, Russia and China have each done this many times to acquire military bases, resource rights and beneficial political alliances. It's unsavoury, but it happens. Greece can't do that since it is part of the EU and has no political coin to trade for money. They will have to do it the hard way, by living within their means.
"These government measures are destroying my life," said Panagiota Katsagani, a 25-year-old part-time school teacher who was marching in Athens on Tuesday. "I was planning my future, now I have to go back and live with my parents."
That is a part of the price they will have to pay. There are no good alternatives.