Friday, December 31, 2010

On Reading Orders by Telecommunications Regulators

Going by this site's web statistics, the most popular of my posts are those regarding CRTC rulings and orders. I suspect this is because of the relative scarcity of in-depth discussion on the web (from what I've found) of the actions of the Canadian telecommunications regulator. This is in contrast to the FCC in the US, where every slightest movement on their part is analyzed -- whether rigorously or superficially -- by hordes of bloggers and more-mainstream media outlets. I wish more attention were paid to the CRTC.

For the public, or at least those who pay more than the most casual and passive attention to regulator actions, there is a dilemma: to get beyond superficial, and sometimes misleading, reporting it is necessary to go to the source and read the orders and rules that regulators publish. It seems that not many do this, other than the companies directly subject to those orders and rules. These companies -- typically telecommunications carriers and service providers, but sometimes their major suppliers and customers -- not only read what the regulators publish but are deeply involved in the process, and they retain many specialists and contractors who understand the process and the network of people to influence to achieve their business objectives.

One big reason that few among the public get involved, or even just read the material, is that it is time-consuming. It is somewhat surprising that this avoidance is greater in Canada since, in comparison to FCC orders, those of the CRTC are not difficult to digest. The reason is that the CRTC has more latitude than the FCC to both set policy and establish regulations. The US telecommunications regulation process is more politicized, and by dint of the governing statutes there are many avenues to take the FCC to court to dispute the legality of its regulations; Congress writes telecom laws that go into an unusual level of technical detail regarding what the FCC is required and permitted to do. This is why the FCC is replete with staff lawyers, and why far more lawyers are employed by the industry.

A good case in point is the recent FCC Report and Order on network neutrality: FCC 10-201. This is a big document, weighing in at 194 pages. It is far easier for the public to wait and read the summaries and analyses in the media and on the web. Unfortunately a number of these analyses are less than ideal: the writer may have an agenda, whether to influence the public or to stir up controversy in an effort to gain readers (aka link bait), and without some background knowledge it can be difficult to know whom to trust. (Whether I am trustworthy on that score is moot since, because I see little in the Report and Order I want to comment on, I will not be providing any analysis of it.)

Beyond the mere page count, the core content of the FCC document -- where staff analyze the issues and assess the balance between the Commission's objectives, the law and the positions of intervening parties -- is full of detailed legal references and written in a manner that is meant to withstand the gauntlet of the justice system. On the positive side, that core content is far less than the full 194 pages. This is fairly typical of FCC Reports and Orders. Here's my page count of the various parts of the network neutrality document:
  • Table of contents: 1 page
  • Discussion and determinations (the core of the document): 83 pages
  • Procedural matters: 2 pages
  • Rules: 9 pages
  • List of commenting parties: 16 pages
  • More discussion: 22 pages
  • Statements of Commissioners: 60 pages
So there you have it: 9 pages out of 194, or less than 5%, contain rules. This is dwarfed by the 60 pages of Commissioner statements, which are mostly from the Republican dissenters. Unless you are enthralled by the political angle of telecom regulations you will most likely not miss much by skipping these 60 pages. Even less enthralling are the 16 pages of lists of people and companies that participated in the process (such as it was in this instance).

In fact the really interesting stuff isn't the rules themselves but the main body of discussion regarding how the FCC reached its determinations. While this is still 83 pages, a very rough guess on my part is that close to 40% of that is footnotes. For the casual (!) reader it is fairly safe to skip those, which leaves about 50 pages of text. This is a less-daunting chore since we have managed to exclude almost 3/4 of the document. A deeper reading, unfortunately, will require dipping into some of those footnotes, but only the ones that provide background and not the legalese intended to buttress the FCC's legal position. That isn't so bad.
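For the number-minded, the winnowing described above is easy to check with a bit of back-of-envelope arithmetic (the page counts come from my tally above; the 40% footnote share is my own rough guess, not a measured figure):

```python
total_pages = 194
rules_pages = 9
core_pages = 83          # discussion and determinations
footnote_share = 0.40    # rough guess: fraction of the core that is footnotes

# The rules are a tiny slice of the whole document.
rules_fraction = rules_pages / total_pages             # about 4.6%, i.e. under 5%

# Skipping the footnotes leaves roughly 50 pages of core text to read.
core_text_pages = core_pages * (1 - footnote_share)    # about 50 pages

# Which means we have set aside almost three quarters of the document.
excluded_fraction = 1 - core_text_pages / total_pages  # about 74%

print(f"{rules_fraction:.1%} rules, {core_text_pages:.0f} pages to read, "
      f"{excluded_fraction:.0%} excluded")
```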

Although it is possible to clear the clutter and reduce the quantity of reading one must do, it does help to understand the unwritten rules of the regulatory and political games underlying the proceeding, and a bit of history regarding how these things tend to unfold. That is one advantage I have: although it was years ago, I once participated in these types of US regulatory proceedings. Even so, my guess is that a careful and intelligent reader who is not prone to irrationally clinging to preconceived notions will get a lot more out of the network neutrality debate, or any other contentious regulatory issue, by going to the source and reading the material published by the regulator rather than relying on others; these others may have biases or filters of convenience that distort their reporting or analysis.

The downside is the modest time and effort required to read these documents. My experience tells me that the learning curve isn't steep, so after the first one or two it gets much easier -- there is even a sort of primer in the case of the network neutrality order, in the form of a 5-page press release. If you care deeply about network neutrality, usage-based billing or one of the many other current issues being considered by US and Canadian telecom regulators, reading the core content of these publications can be time well spent.

Wednesday, December 22, 2010

Plausible Deniability in Telemarketing

By now I'm sure that everyone has heard about the fines levied by the CRTC for do-not-call registry violations by Bell Canada, Telus and various firms that they contracted. The only real surprise to me in this matter is that the CRTC took enforcement action of any significant degree.

Although the CRTC claims that these fines -- $1.3M in the case of Bell Canada -- will hurt these firms, they really do not. Not only are the amounts very small relative to their overall business, but considering the degree and duration of the violations it is quite possible the companies made a net revenue gain even after paying the fines. The only company that may have been hurt badly is Xentel, a far smaller company, which was fined $500,000.

The style of enforcement is interesting since it resembles the actions taken in the US by the FCC when it began enforcement of its DNC regulations. First, it targeted large companies, including carriers like AT&T and DBS (direct broadcast satellite) providers, and it timed and bundled enforcement actions for maximum media impact. This is an effective tactic to combat public unhappiness with ongoing DNC violations. Going after the biggest companies, especially those that are dominant in their sectors, also works well since there is always an undercurrent of distrust and dislike of companies that we are often unable to avoid giving our business to. The CRTC may have adopted the same tactics in hopes of achieving a similar positive public relations impact.

There is also the simple fact that large companies, if only because of the size of their businesses, are going to show up in the list of top offenders.
...the wireless sector had the distinction of taking the top three spots with Rogers and Telus ranking second and third respectively. There were also hundreds of complaints against Canada's top financial institutions and retailers including RBC, CIBC, Scotiabank, TD Canada Trust, and Sears.
Regarding the companies themselves, it is not unusual for companies like Bell Canada and Telus to outsource what they consider non-core functions. It allows them to maintain business flexibility by contracting services as needed, without building up an in-house telemarketing operation and without the bad optics of later laying off that staff.

It also has the further advantage of giving them plausible deniability when the CRTC comes knocking. This allows them to claim that the contracted companies were renegades that (plausibly) violated the DNC regulations without explicit direction from them.
On Monday, Bell said it had terminated contracts with two telemarketing companies and suspended “several others” as a result of the investigation. Like Telus, Bell pledged to stiffen guidelines for telemarketing practices.
The deniability is plausible, but I simply do not believe that they did not know of or contribute to the violations by the contracted companies. I am not implying that they gave their contractors explicit directions to ignore the DNC registry, only that I can easily imagine there was a bit of "nudge, nudge, wink, wink" going on in parallel with the more formal instructions. Consider these points:
  • The duration of the violations.
  • The reputation of some of these firms is not always the best. They are often known to be aggressive in their methods for their other customers, such as charities. They would also have known which numbers were most likely to get a positive response from their historical database, although I of course don't know if they used that data in their contracts with the telcos. I can only suspect the possibility.
  • Many of the numbers from which the telemarketers won business for the telcos must have been on the DNC list, and telcos should have known it. I'll bet they were careful not to cross-check the two lists.
  • Worse, many of those numbers must have also been on the companies' own opt-out lists of people that explicitly requested that they not be contacted. The CRTC mentioned this point, though not quite in the same context.
Although plausible deniability does not excuse Bell Canada and Telus from paying fines and making other restitution -- and I doubt that they would have ever thought it would -- it does help them smooth over the inevitable public outrage by (as shown above) promising to do better in their contracting practices in future. In other words, they take responsibility while also shifting the ultimate blame to others. I think it is more likely that they knew exactly what they were doing and set things up so they could offer the plausible deniability excuse when trouble struck. The script they're following looks too perfect to support their innocence. However this is only my speculation, not an accusation.

Tuesday, December 21, 2010

Lunar Eclipse and Overwrought Coincidences

Lunar eclipses are delightful to watch. I've witnessed the more awesome spectacle of a solar eclipse, but lunar eclipses win in the long run because they're more frequent, are visible to half the planet, last a long time and are pretty and safe for viewing.

This one is being aggressively promoted in the media as special because it coincides almost exactly with the solstice. Since the regression of the lunar nodes (the points where the Moon's orbit crosses the ecliptic) is not in resonance with Earth's orbit around the sun, this particular coincidence is no more or less probable, or interesting, than an eclipse falling on any other chosen calendar date. All this coincidence really tells us is that we can finally look forward to increasing hours of daylight and insolation (although the weather doesn't start warming up until February).
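A quick back-of-envelope calculation shows just how unremarkable the coincidence is. Assuming eclipse dates are spread roughly uniformly over the calendar (which is the point of the node-regression argument above), and taking about 2.4 lunar eclipses per year as a rough long-run average (my assumption, not a figure from any eclipse table), a near-solstice eclipse should turn up every half-century or so:

```python
# Probability that any one lunar eclipse lands within +/- 1 day of the solstice,
# assuming eclipse dates are effectively uniform across the calendar year.
window_days = 3.0
p_single = window_days / 365.25           # under 1%

eclipses_per_year = 2.4                   # rough long-run average (assumption)
expected_per_year = eclipses_per_year * p_single

years_between = 1 / expected_per_year     # roughly one every 50 years
print(f"p = {p_single:.2%}; about one solstice eclipse every {years_between:.0f} years")
```

Uncommon, then, but hardly a once-in-civilization event.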

One coincidence that isn't a coincidence is that this eclipse occurs at a full moon. This "insight" comes from an interview with some new-agey sort of person that I heard on the radio. Well, duh! I suppose we can also add the, um, coincidence that we see the eclipse occurring at night. These facts are about as coincidental as a flipped coin that lands heads-up showing an impression of Her Majesty. It is also why this eclipse will not be visible from anywhere in Antarctica -- the explanation of which I'll leave as an easily-solved puzzle for the geometrically inclined.

Apart from the obvious things about lunar eclipses, there are a couple of items that are less widely considered. The first is that when we see a lunar eclipse, if someone is on the moon (anywhere on the hemisphere facing Earth) they will be simultaneously viewing a solar eclipse.
...the view from the moon during the eclipse, with the Earth in front of the sun, would be a spectacular red ring in the black sky.
It should be no surprise that the colour of the sunlight scattered by the Earth's atmosphere (around its visible circumference) is brownish-red (or copper), since that is the tint the moon takes on during a lunar eclipse. In similar fashion, the frequency of solar eclipses for Moon dwellers is the same as that of lunar eclipses for Earth dwellers. From the Moon, the apparent diameter of the Earth is quite a bit larger than that of the Moon as seen from Earth, and so it is more likely to cover the sun.
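To put rough numbers on that last claim (using round figures for the radii and the mean Earth-Moon distance), the Earth seen from the Moon spans nearly four times the angle the Moon spans seen from Earth, and comfortably exceeds the Sun's roughly half-degree apparent diameter:

```python
import math

def angular_diameter_deg(radius_km, distance_km):
    # Full apparent angular diameter of a sphere of the given radius,
    # seen from the given centre-to-observer distance.
    return math.degrees(2 * math.asin(radius_km / distance_km))

EARTH_RADIUS_KM = 6371
MOON_RADIUS_KM = 1737
EARTH_MOON_KM = 384_400   # mean distance, in round figures

earth_from_moon = angular_diameter_deg(EARTH_RADIUS_KM, EARTH_MOON_KM)  # ~1.9 deg
moon_from_earth = angular_diameter_deg(MOON_RADIUS_KM, EARTH_MOON_KM)   # ~0.5 deg

print(f"Earth from Moon: {earth_from_moon:.2f} deg; "
      f"Moon from Earth: {moon_from_earth:.2f} deg")
```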

However, there are no "terran" eclipses when the Earth is full (Moon between the Earth and sun) since the Moon's shadow never covers more than a small area on Earth's surface. That occurs when we on Earth see a solar eclipse.

The second interesting thing about lunar eclipses I want to mention refers back to an earlier post of mine about why the full moon always passes high overhead during the winter months at higher latitudes, like here in Ottawa (and at high southern latitudes, such as southern Chile); that, too, is no coincidence. What this means is that the best lunar eclipses are those that occur around the winter solstice, just like this week's, because the Moon passes high overhead for optimal viewing.

Unfortunately this is also what I have always hated about lunar eclipses, because the best ones occur when you have to endure cold winter nights if you want to watch them properly. They're never quite so nice when seen through a window, which in any case will be difficult since in many houses the Moon will be so high as to be blocked by the eaves of the roof. In other words, to stay warm while watching this eclipse you'll probably have to deal with window-glass distortion and an uncomfortable viewing angle.

Despite having said all this, my plan (I am actually writing this the night before) is to stay warm and in bed and pass on the joy and the cold. Anticipation for this event by people here in Ottawa may be for naught in any case since it now looks as if clouds are going to spoil the event. Well, there's always the internet so I'm sure there will be lots of photos making the rounds on astronomy blogs Tuesday morning.

Friday, December 17, 2010

Central Banks vs. Public Debt

Imagine that there is a car in front of you that is suffering from a range of mechanical ills: wheels out of alignment; needs an oil change; transmission won't reliably shift into reverse; and so on. Someone then places a tool in your hand and tells you to get to work. You look down and what you see is that you're holding an impact wrench. It's a very powerful tool but wholly inappropriate for most of the work ahead of you. However, it's all you have so, good luck, and give it your best shot.

Central banks are often in a similar dilemma: the economy can suffer from a variety of ills due to many and complex causes and inter-relationships, yet they must attempt to get the economy back on track with pretty much one tool, that of monetary policy. They are often smart enough to do the best they can with the tools they have but without access to a wrench and other useful tools there is a limit to what they can realistically accomplish. Often they must resort to leveraging the grand stature of their institution by giving speeches and influencing those holding the proper tools -- industry, consumers and government -- to effect desired outcomes.

Consider this quote from Peter Foster's opinion piece in the Financial Post:
“Cheap money is not a long-term growth strategy,” warned Mr. Carney during a speech in Toronto on Monday. But where did this cheap money originate? Also, from what I can remember of economics 101, cheapness is a signal to purchasers to buy, and that includes buying money. People are acting entirely rationally. The only problem is that they are likely not aware that they may have been lured into a cul-de-sac by delusions of macro management.
And this one by Maxime Bernier, also in the Financial Post:
Mr. Carney offers us three “lines of defence” that are clearly an admission of impotence.
Here we have Bernier, a government MP and former cabinet minister, complaining about the BOC's impotence when it is the government that sets the BOC's powers -- in effect the pot calling the kettle black. For his part, Foster blames Mark Carney for only having, and then using, the limited toolkit the government has provided him with (elsewhere in the piece, he also seems to be confused about the respective roles and powers of the BOC and the federal government).

It is the government, not the central bank, that has the better toolkit for repairing the economy. This starts with building confidence among the true economic players: citizens, both as consumers and as business owners. The government can also use its powers over taxation, which can target problem areas more accurately than is possible with the central bank's interest rate policies. For example, it could lower corporate tax rates, which would have the effect of encouraging private sector investment and hiring much as lower interest rates do, but without simultaneously encouraging borrowing. That is, leave more capital in the hands of those entities that can give the economy the push it needs.

There have been attempts by the BOC and the government to blame the banks, since they are the ones we go to for our borrowing needs. This is unfair: the banks lend money as a business proposition and make loan decisions based on risks associated with both the broader economy and the individual borrower. Ed Clark, CEO of TD Bank, has quite rightly deflected the criticism right back at the government. The points he raises in this article are spot on in my opinion. If the government, for example, wants to rein in low-quality mortgage risk -- most commonly associated with the longest terms, with their lower monthly payments but higher total interest costs -- it should prohibit those mortgages.

Speaking of blame, we should also beware playing the blame game when it comes to the US Federal Reserve or the Bank of Canada. It is easy to point fingers and they are tempting targets. Yet they would have an impossible task if they were the only institutions expected to right what everyone else has set wrong. They can ease interest rates lower to make it less expensive for consumers and businesses to spend and invest, but that policy can spark investment bubbles and inflation. Go the other way and, as happened so famously following the 1929 crash, we can be pushed into a deep depression. Finding an optimal middle ground, if it even exists, is more than a little challenging for a central bank.

Even so, Mark Carney is not being entirely forthright regarding debt and, as I will come to, neither is the government. The Governor's warning goes something like this:
When rates do begin to rise again, Carney said, the repercussions may be fierce and have the potential to catch many with debt loads they can no longer afford.
This is true and, although there are words of agreement from Flaherty, there is no mention of the government's own debt problem. They tell us, as individuals and as business owners, to be careful not to take on debt that we cannot easily repay when interest rates rise once more, while at the same time the federal government is taking on over $50B of debt in the current fiscal year. That is not really government debt; that is public debt. On our behalf the government is borrowing money against the wealth and wealth-production capacity of the Canadian public. That debt, too, could easily become difficult to repay when interest rates rise.

There is a measure of hypocrisy when they fail to discuss government-incurred debt, debt which is also our debt and subject to the same risks. I do not mean to criticize the government for having used this debt to smooth over the worst impacts of the economic air pocket we've just been through, only that they should not avoid lecturing themselves at the same time as they lecture us. The lecture is a good one for both the private and the public sectors.

The thing is that Flaherty does intend to rein in spending, eventually, yet he has missed an excellent opportunity to lead by example and explain how both types of debt are due to public borrowing. Perhaps he is being cynical in an attempt to retain some flexibility to keep spending, and taking on more public debt, for a while longer.

The question is even more pertinent in the United States, where Federal Reserve Chairman Bernanke is committed to the extreme government debt policies he has been pushing, including financial sector bail-outs, which very much depend on keeping interest rates low at least until some of that debt can be extinguished. We had all better hope that he does a good job of juggling interest rates and debt policies, since if he or the US government stumbles, the Canadian economy will also suffer.

Tuesday, December 14, 2010

Cord Cutters: Small Numbers Matter

One of the more-recent terms being tossed about in the telecom trade press is cord cutters. It is being applied in particular to cable customers who terminate their cable service, including TV and broadband, in favour of some alternative. There are not many alternatives. For TV it is OTA (over the air) broadcast, satellite and, in a minority of cases, telco fibre such as Verizon FiOS.

The questions are whether the phenomenon is real and, if it is real, is it significant? To date the number of these cord cutters is deemed to be small since the quarter-to-quarter downward move is vanishingly small. Yet it is not this small drop that is the question, since it could be a statistical blip or a temporary impact of the recession, but rather that the growth has vanished. Growth matters since that is what investors want, generally preferring that (if the choice must be made) over flat but reliable dividends.

Cable TV certainly reached market saturation years ago, so subscriber numbers can only move higher as the population grows; they could also rise if service were extended to more rural areas, but this is unlikely ever to occur. In other words, the market is the cable companies' business to lose, just as telephony has been for the telephone companies. With growth in raw subscriber numbers stalled, cable growth must come from increased ARPU (average revenue per user). The required growth has at different times in the past been satisfied with incremental channel tiers, PPV (pay per view), broadband and telephony. PPV is under threat from the likes of Netflix's streaming entertainment, telephony continues to grow at a slow pace, while fibre, DSL and especially wireless are increasingly meeting the needs of bandwidth-hungry consumers. This appears to be one of Comcast's motivations in its current dispute with Level 3, even as it pushes to consummate the deal to purchase NBC Universal and gain control over the content its competitors need.

Unfortunately for them, simply raising rates, either directly or by usage-based billing, only makes competitive alternatives look more attractive. Although the alternatives are not many and not particularly cheaper, every upward tick in the price does drive a small percentage of subscribers to defect. This is an important signal for a couple of reasons. First, every dollar of revenue lost falls almost immediately to the bottom line -- profit -- since many of their costs are not elastic, or at least cannot be reduced quickly.

Second, and perhaps more importantly, demand can fall far more precipitously than it rises. That is, like an avalanche, one modest snowfall or a quick thaw can trigger a sudden dislocation of the snow cover. The cable companies are treating their customers like the proverbial frog in a pot of heating water, except that people are (usually) smarter than frogs and will jump out when the heat becomes uncomfortable. This is more likely to occur when the market, like theirs, is saturated. For a comparable situation you should read this nicely done analysis of RIM's woes.

To conclude, it is not the small number of cord cutters that matters but the trend and the increasing motivation of their customers to defect en masse. We can only know that this is truly occurring after the fact, in a retrospective analysis. Nevertheless, whether we are cable company investors or customers, it is a situation that could reward close attention over the coming year.

Tuesday, December 7, 2010

Influencing Network Neutrality Outcomes

One thing you will likely notice when you pay close attention to any public policy discussion is that every party to the discussion will attempt to steer the outcome in a direction that serves their own interests. This is especially evident when there are different and divergent views. A common technique is to frame (or spin) the very definitions of the foundational ideas and catch-phrases to align with their preferred mode of thinking about the issues. This is equally true when it comes to network neutrality.

In my previous article I drew attention to this definitional issue. The issue exists because there is no broadly-accepted or legal definition of the term network neutrality; the legal definition is the more important of the two since it will persist and be enforceable even when the discussion becomes confused. In the article before that one I listed a few major alternative, but not necessarily mutually-exclusive, meanings of network neutrality. Every party to the discussion tends to build its list of wants into its particular definition of the term, whether that party is a consumer, a carrier, a content provider, the FCC or a politician. Even academics and industry analysts need to be watched carefully since many are not neutral on neutrality; many have identifiable interests or ideological perspectives that can bias what they say, and the media tends to highlight those with the more extreme views.

Interested parties are therefore angling for influence and see the public relations battle as one where they want the prevailing understanding of network neutrality to align with their interests. It is also important to note that in addition to defining network neutrality, they also wish to define what it is not. For example, a network owner with media interests (such as Comcast or Bell Canada) might like to exclude equal traffic priority for other content providers from the network neutrality debate.

When these companies wish to create public support for their ideas they will target their messages accordingly. For the people's representatives in political office they will talk of (and exaggerate) the number of jobs and the economic activity, including taxes, for which their industry is directly and indirectly responsible. However they will often choose not to mention the future potential for economic growth if other industries and business models are enabled by forms of network neutrality that are less friendly to their business interests. Being well-established with deep pockets, they also have the capacity to contribute to politicians' and parties' campaign funds, and that gets them a degree of access and influence that may not be available to others.

They will also stoop lower to get the public support they need. For example, they might announce that if they can't throttle, otherwise manage or charge extra for heavy-duty downloaders -- which network neutrality, they say, will make impossible -- the pipes will get blocked up and you, dear user, will have trouble downloading dancing baby videos from YouTube. "Oh noes!" you might say to yourself, that's unthinkable, so of course network neutrality shouldn't allow that to happen:
Fourth: Network management. ISPs need incentives to run their networks, and we want those networks to be the “freest and fastest in the world.” Therefore, “reasonable network management" will be allowed in order to deal with harmful traffic, congestion, and other network problems. Again, we'll need to wait for the rules to see what might count as reasonable and what might not, and who decides.
The regulator itself is a party with its own self-interest to protect. It is said that the first priority of any bureaucracy is to perpetuate its own existence. Therefore in the coming meeting we should expect the FCC to promote a view of network neutrality that requires FCC oversight, ensuring its continued relevance for years to come. Its task isn't easy since it must navigate an obstacle course of unfriendly politicians and industry power to develop policies and regulations that also maximize the consumer interest and the national interest. To see this, all you have to do is read through last week's statement by FCC Chairman Genachowski. Consider, for example, the following sentence:
Informed by the staff’s additional legal analysis and the extensive comments on this issue over the past year, the proposal is grounded in a variety of provisions of the communications laws, but would not reclassify broadband as a Title II telecommunications service.
Notice how they've backtracked on reclassifying broadband as a telecommunications service -- and therefore on applying the non-discrimination aspects of common carrier law -- to appease voices in Congress that want to assert their own power to determine policy. Statements like these are a good way to keep score on how successfully the various interests are wielding influence.

It isn't just the FCC that has to navigate the minefield: the same applies to the industry players themselves. Think back a couple of months to when Verizon and Google published a joint position on network neutrality. The predominant reaction that I noticed was one of outrage from the public, and even from some of the companies that compete with one or both of these behemoths. The FCC itself was more circumspect. At the time I mentioned that this was one way to promote progress on a contentious public policy debate, since the regulator and the government are potentially freed from having to (unavoidably) upset the status quo, creating both winners and losers; if the contenders agree up-front, that can create the conditions for an acceptable compromise.

The reality of joint-company proposals such as that by Verizon and Google is unfortunately less than it may seem. It is most enlightening to actually read the statement. I recommend doing so, although it's a painful document to peruse. The reason it is painful is not because the joint proposal is so terrible, but because it is so vague. Ultimately it is almost useless since it gives little of concrete value for the FCC to deal with. Let me pick one pseudo-random passage to highlight this point:
Additional Online Services: A provider that offers a broadband Internet access service complying with the above principles could offer any other additional or differentiated services. Such other services would have to be distinguishable in scope and purpose from broadband Internet access service, but could make use of or access Internet content, applications or services and could include traffic prioritization. The FCC would publish an annual report on the effect of these additional services, and immediately report if it finds at any time that these services threaten the meaningful availability of broadband Internet access services or have been devised or promoted in a manner designed to evade these consumer protections.
Take a moment and try to parse that text. It is vague beyond reason and wholly unsuitable as guidance for effective regulatory enforcement. It wouldn't, for example, stop Comcast from favouring its (coming soon) NBC Universal content by applying "traffic management" if it were to level the sort of charges it recently made against Level 3, and therefore against Netflix, a competing content provider. Even their recommendations on FCC enforcement provisions are weak, since they make enforcement conditional and propose fines that are (for the companies involved) inexpensive. A lesson here is that companies that compete to some degree are often no better placed than a third party, including regulators, to find a middle ground that most would at least grudgingly accept; disparate and competing interests are inherently unresolvable if the objective is winner-take-all. I don't believe anyone should worry overmuch about how much impact the joint Verizon-Google proposal will have.

When there is a joint proposal, or at least some grudging agreement among competitors, sometimes it does matter. Although it is fair game for companies to attempt to directly influence the regulator through the formal process -- you can't oppose this speech just because you don't like their message -- there is (at least) one way in which it can be judged as less than fair. To show this I have to backtrack a bit on something I said last month:
...the cable companies had to support the telcos so that the CRTC would be encouraged to choose the option that most benefited them...
Joint company proposals to the regulator (the CRTC in the above case), when they come from competitors, can in one sense be seen as promoting progress on a controversial topic such as network neutrality. However, sometimes when competitors agree they do so to give the impression that the issue is settled, sparing the regulator the tricky task of balancing the interests of competitors, large and small, new and incumbent. That is the minefield where the regulator often makes the mistake of choosing favourites among technologies and business models in an attempt to promote competition without unduly giving anyone an advantage.

The potential trouble comes when the joint company position, if adopted by the regulator, has the effect of further entrenching those companies in a way that disadvantages consumers, other competitors or the greater public good. For example, if Bell Mobility and Rogers Wireless were to propose a (hypothetical) CRTC-mandated "network improvement fee" of $100/month on every subscriber's bill, they would achieve network parity, massively increase their revenue and negate the lower prices of the new wireless carriers. While this is of course an extreme example that would never happen, it helps to illustrate the possible threat represented by the more nuanced proposal that Google and Verizon made to the FCC on network neutrality.

The machinations and vague threats that are now coming more and more frequently will not abate during the coming FCC meeting and subsequent rule-making process. It's terribly irritating but also important. Unfortunately the voices of the large companies and their interests are likely to drown out the quieter, more dispersed voices of the public, at least those in the public who can afford to pay attention and see past the spin.

The effect will be to manufacture what will be called the public good rather than focusing on what is good for the public. Of course the debate itself would be superfluous if there were more competitive choices, since the companies would have to serve the public interest to avoid losing customers. Failing that, regulation is the second choice, poor as it so often is.

Wednesday, December 1, 2010

Comcast vs. Level 3 vs. Network Neutrality

My preceding article on the bafflegab that follows the network neutrality debate like a malevolent black cloud only briefly touched on the brewing dispute between Comcast and Level 3. Within hours it blew up into a widely-covered issue that, in the spirit of the bafflegab I talked about, is being used by many to trumpet their own entrenched position in the internet food chain. I had intended today to build upon the theme I introduced in yesterday's post, but this dispute simply provides too good an opportunity to highlight a few key items in these companies' war of words that fit well with that theme.

First off, I do not intend to do what others are already doing; there are a few (maybe more) excellent articles that dig deeper into what the dispute is really about, but that may get drowned out in the flood of media coverage. There is one in particular I want to recommend, which was published by Ars Technica, that is short, lucid and does a good job of covering off the deeper nuances of the dispute and also provides references to some more comprehensive background material. If you want to learn more rather than merely pick sides in the fight, go read it.

There is only one sentence that I want to mention here since it goes to the heart of just what network neutrality is all about:
The DC group Public Knowledge blasted Comcast's stance as a net neutrality violation.
I like this since it is such pure nonsense. The thing is, there is no commonly-accepted definition of network neutrality when it comes to the internet -- which includes access, transport and services -- and there is certainly no law on the books that anyone is violating. That statement is mere misdirection with the apparent intent to sway others towards Public Knowledge's position.

To understand this more deeply we need to look at just how the internet is assembled -- the Ars Technica article provides better references, so look there if you want more than the following brief and somewhat superficial description. The internet was originally constructed of autonomous systems -- public, private or government, irrespective of size -- that interconnected by mutual agreement for the mutual benefit of their users, using data connections whose cost was either shared in some fashion or covered by one of the two parties. The telco monopolies, which were common carriers, provided those long-haul and short-haul transport facilities and had no interest whatsoever in what was being carried; they billed for the transport and never even saw what was inside those pipes. This is peering in its simplest form.

As the internet opened to the public, the structure had to evolve. First, ISPs came on the scene with their racks full of modem banks and oodles of telco business lines that users dialed to access the internet and local ISP services like email. Those phone lines were subscribed under public tariff and were subject to the telco's common carrier responsibilities; that is, the telco had no choice but to offer those business lines as long as its network equipment wasn't harmed. At first this was good business since it brought a lot of new revenue, both from the ISP lines and from all the second lines that residential customers installed for their computers.

The ISPs negotiated peering agreements with other ISPs to mutually terminate traffic to their customers, which included both users and services, and to route traffic to other ISPs further removed, even across the globe. The industry moved toward more stratification between access ISPs that served users, backbone providers that only provided transport and routing between access ISPs and other backbone providers, and service providers that provided the content that users wanted to access.

Backbone providers found it easy to peer among themselves since their traffic flows tended to be similar in both directions, and not charging for that traffic made sense: the burden of accounting could be dispensed with and the monthly net tended towards zero. The traffic differential between access ISPs and backbone providers, however, became increasingly unbalanced: as users consumed services that exploited the growing access bandwidth, they did not transmit much data upstream. Besides, since the backbone providers could not cover their business costs by peering, they charged the access ISPs for their services -- though not for the raw transport, which the ISP alone was responsible for by contracting with a telecommunications provider, usually a telco or a raw transport provider such as MCI.
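
The peering logic described above can be reduced to a simple rule: if traffic in the two directions is roughly balanced, settle for free; otherwise the net sender pays. The 2:1 tolerance and per-Gbps rate below are hypothetical illustrations of the idea, not any carrier's actual peering policy.

```python
def peering_settlement(sent_gbps, received_gbps, max_ratio=2.0, rate_per_gbps=1000.0):
    """Decide whether a peering link is settlement-free.

    sent_gbps / received_gbps: average traffic each way across the link.
    max_ratio: hypothetical tolerance for imbalance (here 2:1).
    rate_per_gbps: hypothetical monthly fee charged on the excess traffic.
    Returns (settlement_free, monthly_fee_owed_by_net_sender).
    """
    heavy, light = max(sent_gbps, received_gbps), min(sent_gbps, received_gbps)
    if light == 0 or heavy / light > max_ratio:
        # Imbalanced: the net sender pays for traffic beyond the tolerated ratio.
        excess = heavy - light * max_ratio
        return False, excess * rate_per_gbps
    return True, 0.0

# Two backbones with near-symmetric flows peer for free...
print(peering_settlement(10.0, 9.0))   # (True, 0.0)
# ...while a 5:1 imbalance (the ratio Comcast cites against Level 3) triggers a fee.
print(peering_settlement(50.0, 10.0))  # (False, 30000.0)
```

The interesting disputes arise exactly at the boundary: both sides agree on the principle of balance but not on where the tolerance line sits or what counts in the measurement.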

Notice that in none of this did I mention regulation once. That's because there was none. The data transport facilities that everyone used came from common carriers of one sort or another, but while the common carriers were tightly regulated the users of their transport services were not. It's the same whenever you pick up the phone and call someone; the telco provides a tariffed service as a common carrier but you are not regulated. The common carrier does not and, importantly, must not concern itself with how you are using that tariffed service. If they do get involved then they are almost certainly breaking one or more conditions of their licenses or even the law.

Unlike network neutrality, common carriage has a definition in law and is enforced through government licensing and oversight, complete with penalties for non-compliance. There is, however, a benefit to the common carriers: even if you use a common carrier service to commit a crime (such as making a drug deal or arranging a murder), the carrier is protected by law from any liability; you can't sue Bell Canada because someone used their services to commit a criminal or civil offense that causes you harm. Similar legal protections have been extended to ISPs and related services in the US for nearly 15 years by means of Section 230 of the Communications Decency Act, although they are not common carriers.

Getting back to the evolution of the internet: as the business potential grew large and broadband replaced dial-up, through a multi-year frenzy of mergers, acquisitions and emerging new business models, we now have the carriers in an enviable dominant position where they can dictate terms to others. They have a pretty solid lock on access and transport, wireline and wireless, where they are (as Broadband Reports puts it) the "troll under the bridge". They almost always stand between end users and web-based services, and they are unencumbered by common carrier regulations: the regulations are not applied to corporate entities, but selectively to each line of business that provides those particular services. For example, Verizon is a common carrier but also an ISP. The same applies to Level 3, with its transport and telephony services distinct from its routing and CDN services.

This last point is where an important nuance comes up in the dispute between Comcast and Level 3. Level 3 has a financial advantage over pure CDNs like Akamai since it can "sell" transport and routing between business units at a discount to what it offers other companies. This may have helped it win Netflix business from Akamai. From Comcast's perspective, there is a legitimate issue with regard to peering with Level 3 since, due to Level 3's CDN business, we would expect the traffic imbalance to be greater than with other backbone providers and ISPs.
Now, Level 3 proposes to send traffic to Comcast at a 5:1 ratio over what Comcast sends to Level 3, so Comcast is proposing the same type of commercial solution endorsed by Level 3. Comcast is meeting with Level 3 later this week for that purpose. We are happy to maintain a balanced, no-cost traffic exchange with Level 3. However, when one provider exploits this type of relationship by pushing the burden of massive traffic growth onto the other provider and its customers, we believe this is not fair.
Of course there is already an imbalance that any pure access ISP like Comcast will see, since the bulk of its customers are end users that primarily download content served by other ISPs. There is nothing to stop Comcast from getting into the content hosting business; it just isn't what they've chosen to do. The traffic imbalance is not due to any nefarious action on Level 3's part, just a natural consequence of Comcast's and Level 3's respective business priorities. Recall that none of this is regulated; while each company does have some FCC-regulated business (Comcast's cable TV business and Level 3's transport business), their internet and broadband businesses are pretty much unanswerable to government regulators. Each company chooses its internet business activities as it sees fit.

To be more blunt: this dispute, which has nothing to do with network neutrality and where peering is optional, not mandated by law, is almost entirely a private commercial contract negotiation between two companies. Neither is above slinging around terms like network neutrality, monopoly and fairness if doing so wins them political and public allies to buttress their side of the negotiations. Here's part of what Level 3 has to say:
John Ryan, Assistant Chief Legal Officer of Level 3 Communications, Inc.: "...the fundamental issue is whether Comcast, as the largest cable company in the country with absolute control over access to its cable TV and broadband access subscribers, has the right to unilaterally set a 'price' for that access that effectively discriminates against competitors of Comcast’s cable and Xfinity content..."
Notice how Level 3 tries to drag in Comcast's regulated business activities by insinuating that there is "leakage" between their regulated and unregulated business units which, if true, would justifiably draw government investigation.

The stakes for Comcast are quite high in this game of brinkmanship. While they do have a dominant position as gatekeeper to a huge body of internet access customers, just as they do for TV content distribution, and they are making inroads against the telcos by winning away telephony business, they are in fact surprisingly vulnerable. First, there is the talk about "cord cutters": customers, especially 20- and 30-somethings, who are showing signs of abandoning cable services -- TV, broadband and telephony -- since prices keep rising, customer service is awful, and those services are increasingly redundant with wireless voice and data. In other words, why pay for both if you favour internet media over TV and one device, the smart phone, gives you everything you need everywhere you go? Comcast doesn't have a wireless business. Further, even at the high prices charged for tethering, it can be the superior and cheaper choice for connecting a PC or netbook to the internet (when a larger screen makes sense), compared to paying two broadband bills.

Even worse for Comcast is if the FCC and Congress feel that companies like Comcast are becoming overly aggressive in their broadband billing and content practices for a service that is now seen as a pretty essential utility. Political influence goes only so far, and its limit is reached when the voters start shouting for Comcast's blood and want the government to take action.

Once they complete the acquisition of NBC Universal, if they continue to jack up cable and broadband rates, offer poor service and use their growing control over content, both TV and internet-based, to their own advantage, they might find they've painted themselves into a corner they can't get out of. Not only is it more likely that the FCC will pursue turning broadband into a common carrier service, they may also get the Congressional support that they need and currently lack. Worse, it is even conceivable, if still very unlikely, that the FCC could invoke Title VI of the Communications Act and mandate "must carry" for internet media content (Hulu, YouTube, Netflix, etc.) in a manner similar to cable channels, if Comcast is too aggressive in prioritizing content from those willing to pay a premium. Companies with a dominant or monopoly position providing an essential service have to take care not to cross that invisible and shifting line in the sand that signals an end to government indifference.

Ultimately it is the consumer that will pay for all of this, since all business costs are passed along in one way or another to the end user, and governments know this:
This is a problem the Congress and regulators cannot ignore. Just as in the recent retransmission fights in the pay TV world, these rumblings between giant companies leaves consumers in the lurch, even though they’ve actually paid for access to the Internet — that is, the whole Internet, not one approved by Comcast or some other company. The problem, of course, is lack of competition in the broadband markets.
Network neutrality is just a word (or two), and no matter how much it is used to misdirect and obfuscate and attack a company's competitors, the real metric is when the howls of the public become audible in Washington. This is about politics and voter dissatisfaction; network neutrality is a definitional sideshow that should not distract our attention from the real issues at stake in disputes such as that between Comcast and Level 3.

Tuesday, November 30, 2010

Tactical Misdirection in the Network Neutrality Debate

As the FCC proceeding on network neutrality gets closer to a critical point it is interesting to witness how the contending parties are attempting to influence the regulator, Congress and the public. These generally fit within the following categories:
  1. Obfuscate the issue so that it becomes difficult to impossible to discern what network neutrality actually is and, when confusion is successfully established, offer a clear and simple solution which coincidentally (!) benefits the proposing party or at least is disadvantageous to its competitors.
  2. Obfuscate another's obfuscation of the core issues (see previous point) and then offer a clear and simple solution which... well, you can fill in the rest.
  3. Disentangle and clarify the core issues so that everyone can have a clear understanding of the numerous technical and business aspects of network neutrality. This may be done in an effort to facilitate the discussions, but it can also be used to promote an agenda. The agenda may be to maximize the public good, but that is still an agenda.
Perhaps a simpler way of saying this is that everyone has an agenda and will do what it takes to promote that agenda, even if occasionally one must go so far as to resort to the truth! Unfortunately the volume of proposals, attacks, misdirections and often poorly-informed reporting makes it difficult for the targets of all this (mis)information to understand anything other than what the best-placed voices want them to conclude. In other words, the public discourse on network neutrality is not only a mess, it's a political mess.

The FCC is quite used to putting up with attacks from the public, industry and Congress. While some of it is justified, it is also true that there are many competent people on staff who understand the issues perfectly well. With perception being reality (as the saying goes), they end up spending much of their time fending off the attacks rather than crafting good public policy that is in accord with the letter and spirit of the law. That, of course, is the objective of those attacks; while the FCC gets knocked around, the key players -- those that stand to gain or lose from what the FCC ultimately decides -- are busy influencing the politicians who are in a position to make their interests take precedence in the FCC's deliberations. Politics and regulation are inseparable, whether the industry is telecommunications, resource extraction, banking or pharmaceuticals. This isn't going to change.

Despite the sorry state of affairs, allow me to briefly describe what I believe are a few of the key industry issues that get bundled under the network neutrality banner, since they are so often obfuscated, conflated, misrepresented or otherwise mangled by so many of the antagonists.
  • Carrier control over vertical services: It is always in the carrier's interest to be the provider of all services offered to the public over its network. Vertical services -- value-added services over and above raw carriage -- have higher margins than transporting bits. This should be apparent if you think about it, although I went so far as to calculate the added benefit years ago when the array of services was much smaller than today. Even better is if the carrier can not only offer these services but also exclude others from offering them over its network. This is why the FCC's intent to reclassify broadband under common carrier regulations is bringing out the big guns in political lobbying. This is also why Bell Canada is so opposed to GAS-enabled 3rd-party ISPs, why Bell Canada toyed with building a content empire, including CTV, and why Comcast is acquiring NBC. However it is possible to push too hard and suffer what the Australian government has just done to Telstra by splitting wholesale (network utility) from retail (vertical services). Yes, sometimes governments really do try to favour consumers over major corporations.
  • Traffic management: Every carrier requires mechanisms to manage network performance so that no one user sees more than their share of service degradation; network capacity is always finite -- even though it is expandable, within the bounds of reasonable capital expenditures -- and there is always the risk of one user's activity impacting the service quality of other users. This is especially true for broadband, more so than telephony, because there is a wide range in the volume and timing of traffic demand across the user base. The problem comes when traffic management is surreptitiously employed to favour the carrier's own business interests over those of other services that their broadband users choose to access, even services not directly competitive with equivalent offerings from the carrier. It has been surreptitious, or even misrepresented, because blatant discrimination can result in serious political fallout and harm the carrier's long-term interests if the government feels compelled to quell voter outrage with heavy-handed market intervention. Traffic management will remain an issue even if broadband falls under common carrier regulation, because it is very much a cat-and-mouse game where the carrier has all the control and outsiders are hard-pressed to prove a solid pattern of discrimination.
  • Volume and speed pricing: When control over vertical services and traffic management are prohibited or otherwise fail to glean the revenue and market control the carriers would like, they can in the end resort to pricing strategies. These fall into the categories of usage-based billing (UBB) and speed tiers, where the more data you consume and the higher the access speed you desire, the more you pay. If (and I mean if) the carriers are truly relegated to the role of broadband utility, with no competitive vertical services and no ability to throttle selected applications, it can be reasonably argued that pricing should be in proportion to network load. Where this crosses the line into the realm of network neutrality is when they price by volume and speed in a manner that is implicitly discriminatory, especially if they pursue this strategy in parallel with offering competitive vertical services. This has come up time and time again in the CRTC proceedings on Bell Canada's GAS technology coverage and traffic management practices, where the accusation is that if Bell does not similarly throttle or price its own retail services then it is ensuring that 3rd-party ISPs offer an inferior service. Also telling is how carriers position and promote volume and speed tiers that inherently place a monetary burden (or at least allegedly so) on users who subscribe to services such as Netflix, where there is in effect an extra charge to the user for downloading a movie or other bandwidth-intensive media. That is, even if the carrier doesn't offer, or is prohibited from offering, these high-value vertical services, by pricing broadband with surgical precision it can directly derive revenue from them.
Precision is required: if the price-volume boundaries are too aggressive, the majority of users will be hit even though these types of services currently have low penetration (although this will change); in the other direction, the carriers will fail to garner the desired revenue from carriage of those services. That is, they want to get paid, but without attracting unwanted attention from the regulator. This is the reason they make a public show of all the pain they feel when a (currently) small minority of users use these services. If successfully executed, the regulator doesn't act and their revenue will climb substantially in the coming years. Of course if they offer competitive media services they can even double dip.
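
The mechanics of volume and speed pricing described above are easy to make concrete. The tier prices, data caps and overage rate below are invented for illustration and do not reflect any carrier's actual pricing; the point is only to show how a cap placed just below the usage of media-heavy subscribers turns their viewing into carrier revenue.

```python
# Hypothetical speed tiers: (monthly base price, included data cap in GB).
TIERS = {
    "basic": (30.00, 25),
    "fast":  (45.00, 60),
    "ultra": (65.00, 125),
}
OVERAGE_PER_GB = 2.00  # hypothetical charge for each GB beyond the cap

def monthly_bill(tier, usage_gb):
    """Compute a usage-based bill: base price plus overage beyond the cap."""
    base, cap = TIERS[tier]
    overage = max(0, usage_gb - cap) * OVERAGE_PER_GB
    return base + overage

# A light user pays only the base rate...
print(monthly_bill("fast", 40))    # 45.0
# ...but add ~30 streamed movies (~2 GB each) on the same tier and the
# carrier indirectly collects revenue from a service like Netflix.
print(monthly_bill("fast", 100))   # 45.0 + 40 * 2.0 = 125.0
```

Shift the cap or the overage rate slightly and the revenue captured from streaming users changes dramatically, which is exactly why the boundary-setting requires such precision.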
Apart from the tactics I listed at the beginning of this article I was planning to say a bit more about how the US carriers are manipulating the regulatory proceedings and political process on network neutrality to get what they want, but I've run out of time. I'll come back to this in a day or two.

To end this article I will repeat a point that I (and many others) have stated: in a truly competitive market there is no need for regulation of company business practices and services, since market forces alone will compel all companies to lower prices and offer more and better services if they intend to survive and maximize shareholder value. We are still a long way from achieving that in the telecommunications industry, despite what many of the dominant carriers would like us to believe. There are few enough choices for consumers to reach the content and services they desire that all these political games do have a major impact. The fight for and against network neutrality will be with us for years to come.

Thursday, November 25, 2010

White Gold

With the ubiquitous attention on gold's rise it is easy to forget that there are other commodities doing very well. Gold also benefits from an allure that dates from antiquity. Yet when you look at the mundane facts of price and return on investment, it is less compelling. Priced in US dollars, gold has increased about 25% since the beginning of the year. This is nice, except that when measured in Canadian dollars the rise is smaller: around 20%.
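
The gap between the USD and CAD returns is just the currency move: a USD-priced gain shrinks for a Canadian investor when the Canadian dollar appreciates against the US dollar. The exchange rates below are rounded illustrations chosen to be consistent with the roughly 25%/20% figures above, not precise historical quotes.

```python
def return_in_cad(usd_return, usdcad_start, usdcad_end):
    """Convert a USD-denominated return into CAD terms.

    usd_return: fractional return in USD (0.25 means 25%).
    usdcad_start, usdcad_end: CAD per USD at the start and end of the period.
    """
    return (1 + usd_return) * (usdcad_end / usdcad_start) - 1

# Gold up ~25% in USD while the CAD strengthens from ~1.05 to ~1.01 CAD/USD:
r = return_in_cad(0.25, 1.05, 1.01)
print(f"{r:.1%}")  # roughly 20% in CAD terms
```

The same arithmetic explains why any USD-priced commodity looks less impressive to holders of a strengthening currency.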

This was brought home to me when I spoke recently with someone in an industry that depends on cotton. This commodity makes gold look like a value laggard in comparison. While it is particularly volatile right now, it has appreciated about 60% since the beginning of the year, and peaked recently at 100% (all in USD). I don't follow cotton at all, yet this surprised me since I would have at least expected it to get more media attention. Except that the media, like me, can't seem to tear its eyes away from gold.

Although I have no intention of investing in cotton or cotton producers, and it may be in a bit of a bubble right now, it is interesting to look at some of the factors driving it higher. However please keep in mind that my understanding is not deep.

It seems that a large part of the problem is in Asia, especially China, where higher labour costs are causing disruption in the industries where cotton is a major input, such as clothing. As their ability to compete with other countries declines due to higher wages and lower margins, manufacturers resort to hoarding. This may start simply enough, with inventory building up due to slower production, but as supply gets tied up the price rises and leads to more hoarding. The theory here seems to be that there is a greater return to a manufacturer from holding onto the physical commodity until the price rises even further, and then selling it to other manufacturers that use it in production.

This is not sustainable since, unlike gold, cotton is not used as a global hedge against currency depreciation; it is pretty much only valuable as a raw material to make clothing, bedding and other marketable products. When used to speculate, the result is hoarding:
China’s leaders promise to hunt down and punish hoarders and speculators. According to Andy Rothman of CLSA, a broker, some traders are taking possession of agricultural commodities in the hopes that prices will rise.
It seems that hoarding is a common feature of Chinese business and is not limited to cotton and other globally-traded commodities. It is happening with perishables such as tomatoes and may also be partly the case with rare-earth metals.

Considering that it takes a warehouse and freight transport to hoard and trade a monetarily-significant quantity of cotton, I imagine that ordinary speculators must rely on ETFs and futures contracts. Don't go looking to empty your closets of old clothes to profit from the cotton bubble.

Monday, November 22, 2010

Similarity Of Political Polls and Markets

As I skimmed through this National Post article about the resurgent popularity of French President Sarkozy, I was reminded of the more than passing similarity between politics and markets. At first blush this may seem an odd comparison but not, I think, on closer examination.
  • Both politics and markets probe human sentiment, and are therefore subject to the same attributes of human psychology, especially that of large groups. In particular, both quantify human sentiment.
  • Opinions are often strongly influenced by what others are saying or doing. If you think gold is in a bubble, you likely begin to doubt yourself when you see all the people selling jewelry for its melt value and the "gold is going to the moon" headlines in the edgy investment media outlets. The same is true if you keep hearing how good (or bad) a job a politician is doing when you voted against (or for) them last election.
  • When the mob gets enough momentum you get a transitory bubble in the market, and in politics you get Trudeau-mania or Obama-mania. Of course this also works in the other direction to push the market into a tailspin, such as in late 2008, or to oust a political leader, as Gordon Campbell of B.C. recently discovered.
  • Opinions are unstable between elections, just as views on companies and commodities are leading up to an investment decision. Opinion polls and pontificating on markets are easy since people have no skin in the game.
  • Opinions can be entirely different once an election occurs, just as when a buy or sell decision is executed. People then become invested in their choice for a time, often rationalizing their choice to themselves and others, even when it goes wrong. The shouts of denial can get quite loud when the elected do what the voter does not agree with, just as when an investment goes sour. Just look to any pro-Obama or Sarkozy voter, or any gold short.
The particular reason the news about Sarkozy's popularity struck me is that the polls remind me of a dead-cat bounce in the market. If you are unfamiliar with that pattern, it is when a terminally-ill security crashes to the ground and then has a feeble bounce off the pavement before staying down for good. This uptick in Sarkozy's numbers could be just that.
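
The dead-cat-bounce pattern can be stated mechanically: a steep drop, a feeble partial recovery, then a fade. The thresholds below are arbitrary illustrations of the idea -- not a real trading signal -- and the same check applies whether the series holds prices or poll numbers.

```python
def is_dead_cat_bounce(series, crash=-0.30, weak_bounce=0.10):
    """Check a price (or poll-number) series for crash -> feeble bounce -> fade.

    crash: minimum fractional drop from the initial level (e.g. -30%).
    weak_bounce: maximum fractional recovery off the low that still counts as feeble.
    """
    peak = series[0]
    low_i = min(range(len(series)), key=lambda i: series[i])
    low = series[low_i]
    if low_i == 0 or (low - peak) / peak > crash:
        return False  # no crash from the starting level
    after = series[low_i + 1:]
    if not after:
        return False  # still falling; no bounce observed yet
    bounce = (max(after) - low) / low
    # Feeble bounce, and the series has already slipped back off the bounce high.
    return bounce <= weak_bounce and after[-1] < max(after)

print(is_dead_cat_bounce([100, 60, 63, 61]))  # True: -40% crash, ~5% bounce, fades
print(is_dead_cat_bounce([100, 60, 85, 90]))  # False: the recovery is real
```

The honest caveat, as with Sarkozy's numbers, is in the last return: you can only classify the bounce after watching where the series goes next.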

Of course this isn't certain until the opinion cycle completes and we know for sure whether the decline resumes or his reputation is on the mend. This, too, is true in the markets. If it were possible to bet on his future popularity, it would be just like trading on pure technical analysis rather than fundamentals, at least for those like me who know little of French politics. Both are bets on the behaviour of masses of other people.

Friday, November 19, 2010

Nokia Continues to Spin Its Wheels

Two months ago I (and many, many others) speculated or hoped that the changes in Nokia's top management augured well for a change in strategy. In particular, I wondered how they might now respond to the challenge from iPhone and Android. The choice is stark: stick with Symbian, which is old and difficult to upgrade quickly enough to compete on features, or switch to Android. Those aren't the only choices, since even Windows is now showing some promise. However they haven't diverged far from their previous trajectory, instead banking on MeeGo.

The question I have to ask is how exactly does this decision help them? They know they have to do something, and fast, but I suspect that pursuing MeeGo will only delay their ability to effectively compete. MeeGo remains feature poor in comparison to iOS and Android, and those platforms continue on a fast development pace. Yet their stated reason for sticking with a unique platform is to differentiate themselves from others.


There was an interesting article in the Wall Street Journal this week on Nokia's current direction that is well worth reading. I'll quote a few lines from that article in this post.
Though the go-it-alone strategy puts Nokia in competition with an increasingly powerful Google, the rise of smartphones has forced cellphone makers to differentiate their products and generate profits increasingly through the software they offer. Using Android or another platform would likely leave Nokia in the steadily lower margin business of hardware.

Alberto Torres, Nokia's executive vice president for MeeGo computers, argues it also would tie its hands in distinguishing its smartphones with new innovations, ultimately benefitting Google's search business at Nokia's expense.

Referring to other handset makers that have adopted Android, he said in an interview at the Dublin developers meeting this week: "Frankly, some of these alternatives in the market are not necessarily providing a lot of opportunity for innovation, and that is what we hear from people who are using those platforms at the moment."
Nokia has said this before and I still don't buy it since they are not at all clear in stating just what it is that will demonstrate their innovation. That is, what will MeeGo allow them to do that Android cannot? Innovation does not just mean different; it has to mean something unique or better. Instead we hear again about the Ovi store, their developer community and proprietary applications.
Nokia also has spent heavily to catch up to the iPhone and Android with its own platform and set of software services, under the brand Ovi. Those investments include its $8.1 billion acquisition of digital map maker Navteq in 2007, which competes directly with Google Maps.

Pairing with Google at this point would mean negating all of those investments, said Roberta Cozza, an analyst with Gartner. "Putting everything into Google's hands would mean all the work on Ovi would be gone, and I am not sure what that would change for them," she added.
This is misleading since it is certainly possible to put their maps applications onto Android and still keep their Ovi brand. They can even negotiate with carriers to choose their proprietary apps over Google's for the Android devices they market. I am left wondering if they are feeling uncertain about competing head-to-head with Google and prefer to use platform lock-in to promote their apps while also barring others.

The danger is that they could lose both the phone and software business if Nokia smart phone products continue to lag and the carriers simply go with the platforms, and device vendors, that give their customers what they want. As time goes on, that list is less likely to include Nokia, with or without MeeGo and Ovi. App developers will continue to make the same decision, leaving Ovi with a growing application gap.
Nokia's decision to push MeeGo over Android stems in part from MeeGo's capability to support not only smartphones, but a variety of products consumers use including tablets, televisions and even automobiles, [Gartner] says.
This is obviously false, and I am astonished to hear it from anyone, especially an analyst that follows the industry. By next year the market will be awash in Android tablets and a growing list of other devices. In contrast, MeeGo is still in catch-up mode.

Nokia has to seriously -- and I do mean seriously -- determine how they can be different with a compelling platform and portfolio of products and services. One can only hope that they do know and are choosing to play it close in their public statements for the present. Nokia is a good company so when the new management team comes to a point where they are able to implement major changes, they may do so. To succeed they need good hardware, user interfaces and a few innovative apps and services, all of which are within their ability. However none of this requires Symbian or MeeGo.

Wednesday, November 17, 2010

Economic Impacts of Spectrum Auctions

Radio spectrum is a peculiar asset. It exists, it is continuously renewable, and under certain circumstances it can even be shared. If its use is uncontrolled there is ample opportunity for impairment of services which exploit the resource. There is good reason to believe that eventually -- but not anytime soon -- with more agile and intelligent technology, sharing and optimization can be largely automated. For the present we need regulations and licensing to allocate spectrum to specific users and to govern how it is used.

Up until the mid-1990s the commercial and technological requirements for mobile phone spectrum were quite modest. The number of users was comparatively small and analogue voice technology was amenable to fairly uncomplicated channel management protocols. Governments tended to assign spectrum licenses to a select group of companies, and used those licenses as a tool to either further strengthen incumbent carriers -- some that were owned in part by the same governments -- or to foster competition by splitting spectrum among several companies with the demonstrated wherewithal to build a sustainable business.

Then came the idea that spectrum is a national asset that can be assigned a market price and can therefore be leased to competing entities by means of open auctions. Always ready to open the treasuries to new revenue streams, especially revenue not derived from broad-based taxes, the politicians listened and made it so. This idea was implemented with enthusiasm in the US, where they raised many billions of dollars. Canada and other countries picked up on the idea and spectrum auctions spread across the globe. Governments enjoyed the windfall of the wild bidding wars that ensued, and their populations liked the idea that taxes could be avoided even while expanding public services with this new revenue source. Of course nothing is ever that clean and simple: there is a public cost, even though it is sufficiently disguised to fool many people.
Besides the specific problems that they raise, spectrum fees share one big problem with auctions: They can too easily be used as cash cows. Experience shows that the government always needs more money, in booms and in recessions. Nearly $6-billion has already been earned by the federal government in auction proceeds, and $130-million is paid in annual spectrum fees at current rates.
Even at $130M annually, these spectrum fees are not onerous. If we assume 24M mobile phone users in Canada, that works out to about $0.45/month/subscriber. That isn't much, even if it is annoying, and it is hidden in the price of service. Yes, you and I are paying those spectrum fees indirectly since, as with all input expenses, they are calculated into the price of the service.
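The per-subscriber figure above is easy to check. A quick sketch, using the article's $130M in annual fees and this post's own rough assumption of 24 million Canadian mobile subscribers:

```python
# Rough check of the annual spectrum fee burden per subscriber.
# The subscriber count is an assumption for illustration, not a
# precise industry figure.
annual_fees = 130e6   # total annual spectrum fees, in dollars
subscribers = 24e6    # assumed Canadian mobile subscribers

per_sub_per_month = annual_fees / subscribers / 12
print(f"${per_sub_per_month:.2f}/month/subscriber")  # → $0.45/month/subscriber
```

Even doubling or halving the subscriber estimate leaves the monthly burden under a dollar, which is the point: the annual fees are a nuisance, not a major cost driver.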

This is fair; the carriers are businesses and they should be free to recover their expenses and earn a profit. Except that what we have here is a type of hidden government tax, where the government charges the carrier, which then recovers the cost from subscribers. Unlike the fake fees the carriers are so fond of loading onto phone bills, this is a real cost of doing business that is imposed by the government. Unlike the manufacturers' sales and excise taxes that were replaced by the GST two decades ago, in part to make explicit to consumers those previously "hidden" taxes, spectrum fees are a current hidden tax on us.

While the annual spectrum license fees are small, that is not true of the basic licenses themselves. As the article pointed out, this amounted to a one-time fee of $6B. That works out to $250/subscriber (this is a grossly simplified model, though sufficient for the present discussion). This is amortized over a longer term than the annual fees so that, assuming a 10-year term (for planning purposes), it works out to about $2/month/subscriber. This is beginning to become a significant portion of the prices we pay, and it gets worse. First, the $6B must be paid up front once the auction is concluded (or sometimes in payments over a year or two) and in most cases the winning carriers must borrow to pay for the licenses. Just like with mortgages, the final cost can be substantially more than the original spectrum license fee. This is accounted for in the carriers' financial calculations so that the prices they charge us also cover their borrowing costs.
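The same simplified model can be sketched out, including the mortgage analogy. The 8% annual borrowing rate below is purely an illustrative assumption, not a figure from the article:

```python
# Simplified amortization of the one-time license cost, per the text:
# $6B up front, an assumed 24M subscribers, 10-year planning horizon.
license_cost = 6e9
subscribers = 24e6
years = 10

per_sub = license_cost / subscribers          # $250 per subscriber
unfinanced_monthly = per_sub / (years * 12)   # ≈ $2.08/month

# If the license is financed like a mortgage, the level monthly payment
# follows the standard annuity formula. The 8% rate is an assumption
# chosen only to illustrate how borrowing inflates the true cost.
r = 0.08 / 12                                 # assumed monthly rate
n = years * 12
financed_monthly = per_sub * r / (1 - (1 + r) ** -n)

print(f"unfinanced: ${unfinanced_monthly:.2f}/month")
print(f"financed:   ${financed_monthly:.2f}/month")
```

Under these assumptions the financed figure comes out roughly 50% higher than the unfinanced $2/month, which is the mortgage effect described above: subscribers end up covering the borrowing costs as well as the license fee itself.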

There is more. The way in which the government conducts the auction affects the development of competition, or the lack of it. This happens in two distinct ways, which I'll cover in turn. First up, which companies can bid for spectrum.
Rogers Communications Inc. chief executive Nadir Mohamed urged Ottawa to speed up plans on a key auction for new airwaves while saying federal officials must ensure the bidding does not disadvantage "made-in-Canada companies" like Rogers.

"In the last spectrum auction, the government restricted who could bid . . . existing customers were disadvantaged and unable to bid on certain blocks of this spectrum. This can't happen again," he told business leaders during an address to the Economic Club of Canada on Wednesday.
Here we see how governments restrict which companies can bid in spectrum auctions in order to enable competition. Unlike with wired telecommunications where incumbents have a dreadful advantage over new entrants, the playing field in wireless is fairly amenable to policies of this sort. An additional benefit of restricted auctions of this type is that the winning bid is likely to be lower. When the financially-stronger incumbents are kept out, the new companies are less likely to run up the price in a bidding frenzy; they all have somewhat similar financial realities and can be expected to bid in accordance with that reality. This keeps the spectrum license fees lower and, at least in theory, allows the new entrants to keep their prices lower than older companies that may have paid more in open auctions.

Yet even so not all is well, as we discovered back in the 1990s. New entrants such as Microcell (the original Fido) were eventually acquired by the incumbents, and the same can occur again. Lower spectrum license fees due to restricted auctions are only relatively lower; they are still very expensive for a company starting with no revenue and about to embark on an expensive network construction project. All of this must be financed and it leaves these companies vulnerable to business hiccups, increasing interest rates and revenue downturns due to price competition.

If the incumbents are allowed to bid on new spectrum, the new entrants will typically find themselves paying more for the licenses they do win. We should expect that the hungry and stronger incumbents will bid aggressively. They win whether they win or lose the auctions: if they win the auction, competition is avoided and they can raise prices to their captive market to recover the cost of the newly-acquired licenses; if they lose the auction, they have driven up the license fees by participating in the auctions so that their competitors are thus more likely to fail or to fail sooner due to the increased financial burden. We lose because, no matter which carrier wins the auctions, we will be charged higher prices to compensate for their spectrum costs.
[Pierre Peladeau, CEO of Quebecor says,] Predictably, the same incumbent voices that opposed an equitable distribution of spectrum in the last auction can once again be heard calling for an auction devoid of any rules that could hamper their dominant role in the market.
The second impact of spectrum auctions is a little more subtle, but just a little. Although the carriers must pass on their license fees to customers, there is also a limit to how much they can load our bills in a competitive market. Higher costs due to these auctions led to higher prices and, importantly I think, slower network builds. There is only so much capital intensity a company can withstand before compromises must be made. If you can't raise prices you must slow network investment. If you do raise prices, you slow the pace of customer acquisition or you market service more toward business subscribers. Gaining subscribers when you are new to market and are in the midst of building a network is never easy as, for example, Wind Mobile and Quebecor are discovering.

I believe this is one of the reasons why Europe, Japan and some other countries were able to grow their wireless markets faster than the US and Canada over the previous decade and a half. Saddled with heavy debt from both capital expenditures and spectrum licenses, new carriers here are at a disadvantage in creating competition. In the absence of healthy competitors, the incumbents can keep prices high and not particularly care about rapid innovation or expansion.

As far as public policy goes on spectrum auctions, the government is sacrificing telecommunications infrastructure and the economic activity it would create for the immediate gratification of money in the treasury. As citizens, we have largely bought into this bad bargain, and we are paying the price. Except that the price is extracted from us in the form of high mobile phone bills rather than direct taxation.

I recall one time, many years ago, I inadvertently stumbled into this debate in a meeting with senior FCC staff. It turned out that the most senior person there had been deeply involved in selling the idea of spectrum auctions to Congress. Let's just say that I quickly discovered the political investment that the government had put into this policy instrument and that they didn't take kindly to criticism. The thing is that at first blush it really does seem like a good policy, but I believe as fervently now as I did then that it is a bad policy that costs all of us more money and poorer services and choices than alternatives.

Friday, November 12, 2010

Gold, Reserve Currencies and Global Trade

Imagine that you wake up in the morning and you hear about an unbelievable cataclysmic event that has occurred overnight. As you struggle to clear the fog from your head you think you heard on the clock radio that due to some mysterious cosmic event everything is now exactly half the size it was yesterday.

What a crazy thing you might say to yourself. But how would you convince yourself that this extraordinary occurrence is true or false? You look around and you notice that everything appears the same as before. Then it hits you that if everything is half the size, including yourself, you might be unable to tell the difference. Think about it a little more and you wonder if by 'everything' that really does include everything or just some fraction of the world or cosmos.

You vaguely recall from high school science that if you halve linear dimensions then volume, and therefore mass, will decline by the cube. So you jump onto the bathroom scale and see that it gives the same reading as before. It would seem that whatever occurred with the foundations of physical laws the springs and levers in the scale also seem to be responding in a proportionate manner. You might then consider looking at the sun or moon to see if they've changed, but then you remember that even if you are smaller that if these heavenly bodies are still the same size and at the same distance they would also have the same appearance. Of course there may follow catastrophic astronomical events if their mass stayed the same while that of the Earth decreased, but, well, it all gets very complicated very quickly. All you know is (if it's really true) that there must be something detectable to scientists or they would not know that this peculiar event had occurred and it would not have made the morning's news.

Now imagine instead a slightly (but just slightly) more believable news story. Instead of a change in physical laws, imagine that overnight the Governor of the Bank of Canada announced that effective immediately the loonie is worth exactly half its previous value. At first you're shocked, but then you notice that little has actually changed. The money in your pocket and bank account is unchanged, and the prices of goods and services are, at least at first, either unchanged or randomly altered due to general confusion among the public. The intrinsic value of things and labour is no different, so perhaps this makes sense. Or perhaps not, as we will see.

When you are lost at sea and the sky is cloudy you search for a reference point to position yourself. It is the same with currencies (and the laws of physics): you can only make sense of the new value of the loonie by comparing it to something else. What else, after all, could the Bank of Canada have meant except that the loonie is half its previous value in comparison to some reference. For several decades now the world has most often used the US dollar as a reference point. Before then it was gold. Once we left the gold standard -- opting for floating currencies -- the size and resilience of the US economy made that choice only natural. We are now witnessing what happens when their economy becomes unhinged. Pushing the earlier analogies a bit further, this is like discovering that the spit of land you've spied is itself floating freely on the sea's surface, leaving you feeling lost once more.

In the case of the hypothetical devaluation of the loonie, the existence of the US dollar serves us well as a reference since our economies are so inextricably linked and the US economy is so much larger. Declaring, by fiat, that the loonie is now worth US$0.50 is insupportable unless we at the same moment sever all financial and trade ties to the US. If we don't, there will be a rush of capital across the border and abrupt shifts in business relationships that would seek a new equilibrium point, one that (if we ignore the disaster such a rapid change would entail) will restore the loonie to its previous level.

However if the US and Canada jointly halved their respective currencies the impact would be slower if even more monumental, with the loonie getting tossed around like a leaf in the wind as the world comes to terms with the change. The US Federal Reserve is pursuing a policy very much like this at the moment, and many currencies, not just ours, are getting tossed around.
Bank of England Governor Mervyn King said ...“There needs to be a genuine recognition that there is a collective interest in the path along which the current-account imbalances unwind,” King said. “Unless we recognize that, then we will face a situation where more and more countries will resort to policy instruments that will be damaging to everyone.”

If they persist we will likely see increased momentum to remove the US dollar as the common reserve currency in favour of, most likely, a basket of other currencies. Interestingly this is not too unlike the policy China follows in setting the value of the yuan, which they do not allow to float freely. Other, usually smaller, countries have tied their currencies in this manner to the US dollar, except in China's case there are serious impacts on global competitiveness and trade. This is what makes the currency dispute between the US and China of such global import, and why it is getting so much attention at the present G-20 meeting.
“We will never seek to weaken our currency as a tool to gain competitive advantage and to grow the economy,” Treasury Secretary Timothy Geithner told CNBC in response to Mr. Greenspan's commentary.
...
Mr. Geithner had proposed resolving the currency dispute by setting limits on current account balances, which measure trade-and-investment flows.
Ahead of the crisis, the U.S.'s current-account deficit swelled as households consumed imports with borrowed money, ...

Mr. Geithner's bid for current-account targets fell flat amid the anger over the Fed's monetary policy.
Here we see a reference to debt, much of which is being purchased by China. They are obviously interested in maintaining the value of their investment in the US economy, which would suffer if either the US dollar is devalued or, equivalently, the yuan is raised. China isn't budging so the US is initiating a sort of currency war.

Other countries are not willing to sit on the sidelines while this dispute unfolds, perhaps in a manner that will disrupt the functioning of the global economy. We live in a world where all national economies are so intricately tied together that currency volatility can quite easily disrupt international business patterns and send all of us into a deep recession. Currency stability is needed to allow business investment, today, to proceed by removing the risk of value destruction, tomorrow.

A likely mechanism to replace the reserve currency model might be a basket of major currencies with some type of common asset base to make it workable. Although gold is being brought forward once more as a possible asset base, this is unlikely.
The World Bank's Robert Zoellick says he didn't propose a full return to a gold standard. His objective, he said, was to point out that the gold price is sending a message that the policy fundamentals within the G20 are rotten.
However, gold is not really a suitable standard in the present era since it is no more stable in value than any other commodity; that is, gold does not have any intrinsic value, just scarcity and short-term appeal to those searching for hedges against currency devaluation and economic disruption. Here's Zoellick again:
The point on gold, and this is the golden elephant in the room, whether people recognize it or not, it is being used as an alternative monetary asset. So I'm not saying return to the gold standard as a control of money stock. But what I'm saying is the price of gold has been telling people is that there is a lack of confidence in some of the fundamental growth policies. So gold in that sense is a reference point, it's an indicator. Now people might wish it wasn't so. But I'm describing the facts as they see it and saying to policymakers: "You have to recognize what this says about the fundamentals of the policy you are pursuing." [You can't achieve confidence with] exchange rates and rebalancing alone.... You want to get the private sector back engaged. The time of government fiscal expansion and programs has run its course.
His viewpoint seems perfectly sensible. Gold, in fact, only has the value that we ascribe to it. The combination of scarcity and demand make it a rising asset class, but one that could reverse tomorrow. Its price is just as volatile as the currencies to which some would seek to tie themselves. Gold isn't even a very liquid asset, for a variety of practical and legal reasons, and for the majority of people it is merely an abstraction underlying the variety of financial instruments that derive from real stores of the metal.

Whether currencies or metals, investing in either is nothing more than betting on human belief systems, and what those beliefs are likely to be in the future. We should leave betting to the poker table and manage the global economy more deterministically. Like it or not, depending on one's political outlook or financial interests, some measure of coordinated monetary policy is necessary to keep our economies, jobs and trade reliably functional.

The US has less than 5% of the planet's population, but it no longer has 30% of the global economic activity. As more populous countries like China and India, not to mention Brazil, Indonesia and many more, continue to rapidly grow their economies and, of course, the well-being of their peoples, it is a question of when, not if, the US will lose its position as the world's leading economic engine. As Brazilian Finance Minister Guido Mantega said this week:
“The U.S. economy used to reign absolute, it was the strongest economy in the world and stood out from the others,” Mantega told reporters. “Today that is no longer the case.”
The faltering of that engine over the past two years only brings that time closer. As they become one among several dominant economic giants it is only sensible to move away from the US dollar as a reserve currency.

Its replacement will not be gold. Gold's price will continue to move up and down as currency volatility continues, and that volatility is enough to drive investor interest. I don't believe there is anything to be gained in thinking that the gold standard will return, ever.

Monday, November 8, 2010

Rural Broadband: CRTC Decision 2010-805

With the number of articles I've written recently on CRTC actions you might imagine that I follow them quite closely. In actuality I don't. Typically I peruse the major online news media and trade publications on a regular basis, and if I notice an article related to Canadian telecommunications (an interest of mine) I may then skim it for any potentially interesting content. If it's interesting enough I may search for related material, and if it looks worth commenting on I will go to the source -- the CRTC documents -- to see what they're up to. (The CRTC does have a what's new page but I don't read it and, as far as I can see, they don't supply an RSS feed.)

It was in this fashion that I came across CRTC Decision 2010-805 on rural broadband last week. Although this is an important topic in its own right, it isn't one that especially interests me; it was something else about the decision that caught my eye since it said something about how the CRTC operates.

I'll come to that in a moment, but first let's briefly look at what the decision itself is about. Much of rural Canada is (using CRTC parlance) in high-cost serving areas (HCSA). The density is low, the wires are long and even wireless captures fewer customers per tower. Once you leave towns and cities, and are not on a major transportation corridor, not only do you often not have access to broadband service or cable, you are also very often without cell phone service. This is quite a challenge to any objective to extend broadband services to these areas. In light of this, and a pool of over $300M that was slated to be rebated to overcharged Bell Canada customers, the CRTC would like to see this used to extend broadband service to 112 rural communities in Ontario and Quebec.
9.      In Telecom Decision 2010-637, the Commission indicated that the Bell companies’ original proposal to use HSPA wireless technology to provide broadband services in the approved communities (the original proposal) did not satisfy the Commission's requirements as set out in the deferral account decisions. Specifically, the Commission indicated that the original proposal did not offer features comparable to broadband service in urban areas such as (i) a variety of service options, including various speeds and usage caps, (ii) an option for a greater than 2 gigabyte (GB) monthly usage allowance, and (iii) an insurance option that would provide an extra 40 GBs of usage for $5 per month. The Commission also considered that the original proposal would not represent the use of least-cost technology. The Commission therefore approved the use of wireline DSL technology and fixed the amount of funds available for broadband expansion at $306.3 million to serve all of the approved communities.
This is in fact our money (if you're a Bell Canada telephony customer in these provinces) since it should never have gone to Bell Canada in the first place. However that doesn't terribly concern me since, while it sounds like a lot of money, that $306.3M is only about $15 per person or perhaps $40 per average household. Despite the modest amount of money involved, it does worry me that the CRTC would use their power over the telcos to decide that money (our money) should be used to further a broader policy objective. This ought to be a political policy decision rather than a regulatory directive to Bell Canada to invest it elsewhere. Even if we concede that the initiative is worthy of this use of our money, it is still only a fraction of what it will cost to truly extend broadband throughout rural Canada.
Mr. Garneau's remarks before the commission are important mainly because significant broadband policy, from the regulator or from Parliament, would require hundreds of millions of dollars, possibly billions, from the federal government...MTS Allstream, a Winnipeg-based service provider, suggested during the hearing last week that this could cost upwards of $7-billion.
...
“If telecom providers are permitted to pick and choose customers and areas they want to serve, all efforts to achieve universal, affordable broadband are doomed,” said John Lawford, counsel with the Public Interest Advocacy Centre, an Ottawa-based consumer advocacy group.
The carriers will, quite reasonably, target service for areas and customers where there is profit to be made. Where it isn't profitable or insufficiently profitable in comparison to making the same investment elsewhere, it is good business to not invest in rural broadband. If the government decides, for policy reasons, that rural broadband is desirable, it should be funded from the public purse -- the general tax base -- like any other program. That is, the government pays the carriers to deploy rural broadband. These types of programs can easily turn into a morass if not handled properly, as evidenced by the corruption-plagued USF (universal service fund) in the US. We don't want to repeat that here.

By now you are thinking that this is what caught my interest about CRTC Decision 2010-805, but it isn't. Instead it is an inconsistency between this decision and an earlier one on high-speed wholesale: CRTC Decision 2010-632, which I wrote an article on, and another on the comments of one of the commissioners. Here is the passage of that earlier decision that I found contradictory to the present one:
Competition drives innovation and provides consumers with a choice of service providers and service characteristics. The Commission notes that ILECs and cable carriers are offering their retail Internet services at increasingly higher speeds. The Commission considers that, at present, retail Internet service competition results primarily from services provisioned using wireline facilities. Other retail Internet services, such as those offered using wireless and satellite facilities, are not generally substitutes for wireline facilities at this time.
In that decision, the CRTC said that wireless was not a suitable alternative to wired broadband service, cable or DSL. This was a point of contention in 2010-805, where Bell Canada pushed for HSPA+ (wireless) service as the appropriate technology to extend broadband to those 112 communities. The CRTC agreed! Not only did they agree, they further departed from their earlier decision by requiring that third-party ISPs be given access to that technology through GAS.
11.  Bell Canada indicated that it would also file a tariff to provide access to wholesale HSPA+ wireless broadband services under similar terms of service characteristics as the Bell companies’ existing Gateway Access Service (GAS),[5] in order to allow competitive providers the ability to offer retail broadband services to end-users.[6]
The reasoning presented by the CRTC in 2010-805 does not make clear why wireless is appropriate for these rural communities and for GAS and, further, they make no reference to 2010-632. This is quite interesting. What it really exemplifies to me is the CRTC latitude to make poorly-reasoned and inconsistent decisions without any political consequences. The situation is very different in the US where FCC decisions are frequently and vociferously criticized by members of Congress and even other branches of the government, and their rulings often end up in federal court. It is not surprising that the FCC employs many lawyers to carefully argue their decisions on the basis of legal statutes.

Unlike in another recent decision, this time the cable companies were very critical of Bell Canada's proposal to use HSPA+ for rural broadband.
13.  Barrett, EastLink, RCI, and Videotron opposed the revised proposal. These parties submitted that, while they supported the principle of technological neutrality, the Bell companies should deploy a wireline DSL solution as originally directed by the Commission.

14.  RCI and Videotron submitted that HSPA services are now, or are expected to be, available from Bell Canada, RCI, and Videotron in most of the approved communities. As such, they argued that it would be inconsistent with the deferral account decisions to approve the revised proposal in order to fund broadband service where such services are already offered.

15.  Barrett also argued that the Bell companies should be required to provide access to individual components of the wholesale HSPA+ service, rather than the proposed aggregated solution.

16.  EastLink and Videotron argued that approving the revised proposal would result in the subsidization of Bell Canada’s mobile voice service. They indicated that it would be inconsistent with the Policy Direction[7] to distort the competitive market for mobile voice services in the approved communities by funding a technology that could provide both voice and data services.

17.  Barrett, RCI, and Videotron submitted that the revised proposal does not adhere to other principles in the deferral account decisions, as it does not represent the use of least-cost technology to deploy broadband services. These parties argued that alternative broadband service providers could provide a comparable service at significantly less cost than Bell Canada, and submitted that if the Commission approves the revised proposal, it should allow for competitive bidding to see whether other companies could provide the HSPA+ service at less cost.
I suspect they are right to level these criticisms. It does seem unfair that Bell Canada can, with money that ought to be rebated to customers, fund the deployment of network equipment that competes with cable and mobile services from other carriers. It is also interesting that Commissioner Katz, a former executive of Rogers, wrote a dissenting opinion in CRTC Decision 2010-637 arguing in favour of wireless for these rural communities, a view that was subsequently accepted in 2010-805.
I fail to see the logic in limiting the Bell companies' ability to use alternative technologies that meet or exceed the requirements imposed in the deferral account decisions.
Although I applaud Commissioner Katz's view that the CRTC should focus on service objectives rather than specific technologies, I believe the cable companies have a valid argument that DSL could be just as cost-effective for these rural communities. Both DSL and HSPA+ require back-hauling traffic to the core network, from the central office or the tower, respectively. Locally, the copper already exists, so the primary economic comparison is between DSLAMs and HSPA+ base stations (towers). Unfortunately, as is routine in these matters, Bell Canada's network costing figures are confidential. All we know is that the CRTC reviewed the submitted material, not how persuasive it was or whether it bore any resemblance to the true costs.
23.  With respect to the proposals to allow for competitive bidding in order to ensure the use of least-cost technology, the Commission notes that it rejected this idea both in Telecom Decisions 2006-9 and 2007-50, since it would add a significant layer of complexity, delay the implementation of broadband expansion, and result in substantial administrative and regulatory burden. The Commission considers that these reasons continue to be valid.

24.  In light of all of the above, the Commission finds that Bell Canada’s HSPA+ wireless broadband proposal is consistent with its determinations in the deferral account decisions. The Commission therefore approves the revised proposal.
Once again we see the capriciousness of an opaque regulator in our telecommunications market. We all need to keep a close eye on the CRTC and similar regulatory bodies if we are ever to see an end to invasive and poorly justified distortions of the free market. This should worry everyone, including those individuals and companies that benefitted from these recent decisions; today's winners could easily become the losers in the CRTC's next decision.