Tuesday, September 30, 2008

First Look at Comcast's Neutral Throttling Scheme

Last week Comcast submitted to the FCC their detailed proposal for congestion throttling of broadband subscribers, one that does not discriminate on content. When I read reports about it I was curious, so I read through the document to see the details, or at least the details that they disclosed. While this is a US regulatory process it could very well impact Canadian ISPs, especially in light of the throttling of P2P and other applications by Rogers, Bell Canada (both Sympatico and unrelated ISPs) and others.

The questions I had were: how does it work, will it meet its objectives, and who is impacted, and how? My hypothesis on a first skim of the document was that it would not do quite what they claim. Then I read it more carefully, and now I am more sure of it. It isn't that it's a bad plan, just that it's deficient in certain important ways. Let me take you through my thinking on this, and my conclusions.

For purposes of argument I will construct a simple cable broadband scenario with lots of easy to manipulate numbers. It doesn't fully correlate to reality or Comcast's figures though it should prove sufficient for a quick analysis. You'll see that it's easy enough to alter the numbers for a real case study while using the same methodology. Here are the attributes of the scenario I am constructing:
  • 200 Mbps downstream capacity serving 275 subscribers (275 is Comcast's average number). Comcast's scheme targets both upstream and downstream data, and although I am only describing downstream, upstream is similar. Also note that the cable access is shared, unlike DSL. However it does correlate well with the shared link between the DSLAM and IP core, so in that way my analysis could also apply to DSL. In other words, Bell Canada could, if the CRTC nails them, implement this scheme (with suitable interfaces).
  • Each subscriber's downstream capacity is 8 Mbps, as determined by modem, policies and DOCSIS level.
  • From the above, bandwidth is overbooked at over 10:1. This is perfectly reasonable and good engineering practice as we'll see.
  • At 'busy hour' there are 200 active subscribers (~75%), with an average long-term (15-minute window) utilization of 0.85 Mbps (my invention, but probably not an unreasonable choice). The pseudo-normal curve of subscriber counts will show a peak near 0.85 Mbps, with a declining number of subscribers at lower (toward 0 bps) and higher (toward 8 Mbps) speeds.
Here we have a long-term utilization of 85% (200x0.85 = 170 Mbps out of 200 Mbps), which Comcast correctly notes is within the range where congestion can appear often enough to impact subscribers. When congestion does occur it will impact all subscribers regardless of their individual traffic profiles since every packet has an equal probability of being delayed (buffered) or lost (protocol or application timer expiry, or buffer overflow); all subscribers start with the same 'best effort' service level.
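The busy-hour arithmetic above can be checked with a few lines of Python. The figures are from my invented scenario, not Comcast's actual network data:

```python
# Toy cable-segment scenario from the post (my numbers, not Comcast's):
# 200 Mbps of shared downstream capacity, 275 subscribers total,
# 200 active at busy hour averaging 0.85 Mbps each over 15 minutes.
capacity_mbps = 200.0
subscribers = 275
active = 200
avg_rate_mbps = 0.85
per_sub_cap_mbps = 8.0  # each modem's provisioned downstream rate

offered_load = active * avg_rate_mbps          # aggregate busy-hour load
utilization = offered_load / capacity_mbps     # fraction of shared capacity
overbooking = (subscribers * per_sub_cap_mbps) / capacity_mbps

print(f"Offered load: {offered_load:.0f} Mbps "
      f"({utilization:.0%} of capacity, overbooked {overbooking:.0f}:1)")
```

Running this confirms the 170 Mbps (85%) figure and shows the overbooking ratio is in fact 11:1 for these numbers.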

As an aside, at windows shorter than 15 minutes the probability of congestion (>80% utilization) increases. This phenomenon is a consequence of the statistical nature of communications by multiple independent transmitters. Therefore at very short windows, say 1 second, congestion is common but is usually invisible except as some (variable) latency as packets are buffered until the link is free. As congestion increases, so does packet loss (and retries).

Now let's say there is one subscriber downloading at the maximum rate (8 Mbps) in the measured 15-minute window. Comcast will throttle that one subscriber to, perhaps, 1 Mbps. That drops the utilization from 170 Mbps to 163 Mbps, or 81.5% of capacity. Congestion is reduced but not to below their objective of under 80%. If that subscriber is only using 6 Mbps, which is still above Comcast's threshold of 70% of the maximum 8 Mbps, the effect is even smaller. However this should not be disregarded, since congestion becomes severe because of a 'knee in the curve' that is typical in packet networks. That is, if the knee is near 80% of capacity, the impact of a 1% increase from 84% to 85% can be much greater than a similar 1% increase from 80% to 81%. Therefore in the instance of one heavy usage subscriber being throttled, the improvement for the other 199 subscribers can be significant.

To get below 80% they will need to throttle 2 such heavy usage subscribers, which is 0.73% of the total of 275 subscribers (both active and inactive during the period). By Comcast's own figures from the Colorado throttling trial, about 0.36% of the more than 6,000 subscribers were throttled at one time or another. These were not even concurrent, and they did not say how many subscribers were throttled concurrently at most. This implies that my choice of 2 concurrently throttled high usage subscribers would be uncommon.
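A minimal sketch of the throttling arithmetic, using the same invented numbers (not Comcast's):

```python
# How much does throttling heavy users help? Same toy scenario:
# 200 Mbps shared capacity, 170 Mbps busy-hour offered load,
# flagged subscribers capped to 1 Mbps.
CAPACITY_MBPS = 200.0
LOAD_MBPS = 170.0

def load_after_throttle(load, heavy_rates, cap=1.0):
    """Return the new aggregate load after each heavy subscriber's
    rate is reduced from its current value to the cap."""
    for rate in heavy_rates:
        load -= (rate - cap)
    return load

one = load_after_throttle(LOAD_MBPS, [8.0])        # one 8 Mbps user capped
two = load_after_throttle(LOAD_MBPS, [8.0, 8.0])   # two such users capped

print(f"one throttled: {one:.0f} Mbps ({one / CAPACITY_MBPS:.1%})")
print(f"two throttled: {two:.0f} Mbps ({two / CAPACITY_MBPS:.1%})")
```

This reproduces the figures in the text: one throttled subscriber leaves the segment at 163 Mbps (81.5%, still above the 80% target), while two bring it to 156 Mbps (78%).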

Let's now look at the case where there is no subscriber using more than 70% of their 8 Mbps capacity in the 15-minute window. We are still at 85% of the 200 Mbps capacity, yet there is no one to throttle according to their policy. In other words, we have 200 fairly average subscribers, or we may have one or more heavy usage subscribers who game the policy by deliberately throttling their traffic to below 5.6 Mbps (70% of 8 Mbps) during busy hours. Comcast can then only respond by changing the policy or reducing the number of subscribers per shared access segment to fewer than 275, or I suppose they could ignore the problem and allow the congestion to affect all 200 subscribers.

I don't have the numbers at hand, though it is known that the average user's traffic rate and total traffic are increasing, due in part to richer web content and the popularity of video sites (Hulu, YouTube, etc.). So even if a link is now at 70% utilization at busy hour, Comcast need only wait a few months until it reaches 80%. They will still need to upgrade the network, at some cost, regardless of high usage subscribers such as BitTorrent users.

My conclusion is that their throttling scheme, while it is application agnostic, does not solve any problem. All they are doing is delaying for a short period the need for network upgrades. Otherwise their only alternative is to use the technology to put in place increasingly severe policies, which over time will decrease their delivered bandwidth to subscribers to much lower levels, even (horrors!) to below that of their DSL competitors.

Update: 22 out of 6,016 users is 0.36%, not 3% as I originally stated. Now corrected.
Update (Oct 1): Yikes! Another error. 8 Mbps - 1 Mbps = 7 Mbps, not 5! Fixed throughout subsequent calculations. Conclusions remain valid.

Friday, September 26, 2008

Cogent and Network Competition

With all the noise recently about telecom providers charging high rates for low-bandwidth services like SMS, Bell Canada rumoured to charge for mapping (while impairing competitors), and mobile providers blocking or impairing VoIP (via their control of user appliances), it was refreshing to read this Forbes article about Cogent and its CEO, Dave Schaeffer.

While I have nothing to say about his aggressive, though legitimate, business tactics, the network logic is compelling despite being an old story: keep the network stupid and cheap, and watch telecom innovation and usage explode upward. As Cogent is showing, competition in the backbone (transport between carriers and between the core and the access networks) is vibrant, fierce and dirty, and fairly healthy.

Contrast this with the edge. That's where there is still a monopoly, or oligopoly at best, and so service providers try to squeeze every penny they can from customers. While I don't blame them for doing so, after all it's their job to make money for shareholders, it smells of desperation. They seem to be rushing headlong, often clumsily, to find new things to charge for and to fight every competitive option by leveraging their privileged position as network "gatekeeper". I believe the tide has turned against them and these efforts to stem the tide will, in retrospect, be seen as futile.

I don't claim this position as an ideological rant, but rather as a reasonably informed and (mostly) disinterested observer. Consumers, including businesses, are becoming far too aware of how they are paying more than they ought and this in turn creates a market for new competitors. The access network oligopoly is ever-so slowly being wedged open and as it does so it becomes too difficult for the incumbents to push the door closed; the opening will only grow wider.

Those who want this to happen can all help bring it about in their choices as telecommunications consumers. Choose providers and equipment vendors that are increasing competitive options. This makes them stronger and thus able to make further investments. There can be a cost of inconvenience since, as we see, the incumbents will fight back by trying to impair the services or business of the upstarts, whether through technological or political means.

Ultimately there will be broad economic benefits as telecommunications get cheaper and both technology and content providers have easier access to the market. The biggest cost will be that the incumbents will lose their dominant, or perhaps just their privileged, position. That's an acceptable cost to all but their shareholders.

Thursday, September 25, 2008

EU Splitting Telcos: Network & Retail

I was surprised when I read this article saying that the EU is going to split incumbent telcos' businesses into separate network (wholesale) and retail units. I know the Europeans are more interventionist than here in Canada or in the US, however I believe this is going too far.

In comparison, the US opted for non-structural separation (accounting and marketing) between wholesale and retail units in the 1996 Telecom Act, along with a sunset clause that has now fully kicked in. Even with a Democrat in the White House the government did not go so far as to force a permanent severance between the incumbents' business units; it is too interventionist and anti-free market for American tastes. There were consequences of this choice, which I discussed back in July (under "The Worst Job There Is"), but that's what they wanted.

In Canada the situation is similar to the US though the CRTC made the conditions for wholesaling less favourable to the new entrants, except for DSL wholesale which is better for ISPs in Canada than in the US.

My gut reaction to the EU action is that it goes too far. The above-referenced CBC article makes the tired claim that the networks were largely funded (indirectly) by taxpayers through regulations that pretty much guaranteed profits to the telcos. The argument is valid but easily overused; expecting eternal gratitude from the telcos, at a severe cost to their business, is not reasonable. Besides, the population did derive secondary benefits for their investments, such as portfolio appreciation (investments directly or via government ownership) and lots of spin-off economic activity.

Regulation has its place and I have argued in favour of it in other circumstances. In this instance, though, you cannot regulate yourself into a healthy and vibrant telecommunications sector. What is needed is more competition, not more businesses that are parasitic on another, large and regulated one. That's what this model promotes. It depends on the belief that the existing access networks are a natural monopoly, so that this sort of telco wholesale-retail separation is necessary.

The assumption is that the access network natural monopoly is perpetual. Therefore, because the expense required for a new company to build a similar network is so high, it hobbles their ability to offer competitive pricing. Another less well understood cost is one on society since these companies need to tear up streets, string cables, raise towers and otherwise mar the cityscape with their networks. To an ordinary citizen these networks may seem redundant and therefore unnecessary.

The question to ask before enacting this severe form of EU regulation is: will regulation stifle the impetus of both incumbents and their competitors to create new networks that, in future, will break the natural monopoly? New wireless technologies are already pressing the boundaries, getting ever closer to being fully competitive with wired networks, whether they be twisted pair, coax or fibre. If the incumbent's core network business effectively becomes a utility (including rate-of-return guarantees) they will have no incentive to take on the risk of deploying new technologies, including fibre. The new entrants will always find it cheaper to leech off that incumbent's network rather than build their own. The bigger risk is that society's future economic growth will be limited just to get what may only be a short-term benefit.

I lean towards more open competition combined with rules to ease experimentation and exploitation of new technologies. I believe this will deliver the biggest benefits to the economy and society at large. I am against regulations that have the effect of locking down current business models and technologies, and that consequently discourage innovation.

Monty Python-esque All-Candidates Debate

Here they go again. This week there was a "debate" among local candidates here in Ottawa West-Nepean that was nothing more than a contest between Conservative (Baird) and Liberal (Pratt) candidates to see who can behave the worst. Sounds like they both won.

Their clumsy tactics remind me of an old Monty Python skit. This is the one where a "Mr. Hilter" is running for a council seat (UK version of a municipal election) as the first step to world domination. Hilter, played by John Cleese as I recall, bears an uncanny resemblance to you-know-who, and who, perhaps, did not die after all. In one of his clumsy campaign tactics he gives (shouts) a street-corner speech to a bemused crowd of several passers-by. Circulating through the "crowd" is Graham Chapman dressed as a common citizen but bearing a striking resemblance to a certain former chief of propaganda. As Hilter makes a nonsensical campaign promise Chapman whispers to one of the listeners, "he's right you know."

Unfortunately that's far too similar to what happened at the all-candidates debate. At the least, those two contenders filled the audience with party shills who would applaud or cheer whatever their candidate said and try to shout down the other candidate. I wonder how many regular citizens were among the 300 in attendance.

Do they honestly believe this crap furthers their causes? All I've heard in response to this event is disgust or disinterest. I suspect only partisans were satisfied with the evening.

These antics make Baird and Pratt look like buffoons. Perhaps that's what they are.

Wednesday, September 24, 2008

Venture Capital Exits

There is a dearth of venture capital (VC) in Ottawa. [... pause ...] Alright, everybody knows that. Despite this truism there is a lot of money out there. That is, the wealth is there but not the investment activity. Why not?

The other fact everyone knows is that the general investment climate at this time is one of fear. Investing, which includes VC, is a pendulum that swings aperiodically between greed and fear. We are in a prolonged period of fear. It is a fear of calamitous losses resulting from any investment, and especially illiquid investments like those in technology start-ups.

I wish the environment were better since the technology sector in Ottawa is moribund with no obvious signs of improvement in the foreseeable future. Money is the fuel that will light that fire, but it is staying put under investors' mattresses. What I want to talk about in this post is why that money is not seeing the light of day. I think it is better to understand what's driving this behaviour rather than reading more articles about how awful things are, or about the new models of investing that are nothing more than ways for investors to rein in their risk.

Investing of any sort, which naturally includes VC, is all about making money. Forget the mushy words about investors' purported enthusiasm for telecom (or Web 3.0, mobile, clean tech, etc.). While often true, it is irrelevant. Pleasant feelings may drive their interest in playing the VC game, and this includes angels, but it remains (always, always, always) about the money. If they don't see the prospect of a substantial return, the money stays under the mattress.

Now we can look at how they determine the prospects for a beneficial financial outcome. For the moment let's gloss over their due diligence of prospective start-ups; we'll assume the company is good to go so that is not at issue. We will also assume the VC has a portfolio of companies so the usual start-up attrition rate is taken into account.

The way I would then summarize the thinking of a VC is as follows:
The valuation at exit 'x' must exceed the valuation at investment by some factor 'y', and the exit must happen within 'z' years.
Take these numbers and the start-up attrition rate, do some rudimentary actuarial calculations and you get an expected rate of return. Simple, isn't it? Except of course for knowing the values of x, y and z. And there's the rub.
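A back-of-envelope sketch of that calculation. The parameter values below are purely illustrative assumptions of mine, not figures from any actual fund:

```python
# The VC arithmetic from the post, reduced to one function:
# y = exit multiple on the invested capital,
# z = years until the exit,
# p = probability a portfolio company survives to an exit at all
#     (the start-up attrition rate folded in; failures return ~0).
def expected_annual_return(y, z, p):
    """Annualized expected rate of return across a portfolio."""
    expected_multiple = p * y
    return expected_multiple ** (1.0 / z) - 1.0

# e.g. a 10x exit in 5 years, but only 1 company in 4 gets there:
r = expected_annual_return(y=10.0, z=5.0, p=0.25)
print(f"{r:.1%}")   # roughly 20% per year
```

Simple, as the text says, as long as you actually know x, y and z; with no comps to anchor the exit valuation, every input to this function is a guess.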

If you don't know the exit parameters it is not possible to come up with an acceptable post-money valuation. Forget all those fancy algorithms and charts that depend on rigourous scientific and market analysis to come up with a number. It's almost entirely a wasted effort. Of course it sounds good to the company making the pitch ("our company will be worth $500 million in 3 years based on our business plan and industry valuation metrics..."). Now go sit on the other side of the table. That isn't the sort of talk that pries money out of their hands, and especially not for angels who you are asking to part with their own money rather than money from a fund being managed for their limited partners.

What will instead impress them is industry comparables (or simply, comps). These are ongoing exits of other companies that are operating in the same space as the prospective start-up. Those companies don't have to be identical to yours, just close enough to make the comparison meaningful. Let's say your start-up makes widgets. You have competitors who also make widgets (except that your widgets are superior). Several of them were acquired or did an IPO in the past 6 months and their valuations at exit ranged between $300 and $450 million. You make better widgets, you have delivered prototypes to customers and they even like your prototypes better than the other guys' widgets (guys who just got bought out for $375 million).

Now you have the investor's complete attention. Their next question should be something like, "if you get the money you're looking for how many widgets can you produce by next Christmas?" Give a good answer and they'll offer you twice the amount you want. They see the exit possibilities and want to strike quickly while the iron is hot; widgets could fall out of favour next year. So you negotiate and end up doing the deal and go on to fame and fortune.

Except that back here in the real world there are no IPOs or acquisitions of companies making widgets. In fact, while the market for widgets is healthy it really isn't growing much if at all. So no comps. No comps, so no exit prospects or any way to judge exit valuation. The money stays under the mattress and customers make do with today's batch of widgets.

Perhaps a very aggressive investor will offer some money for a disproportionate fraction of the company (i.e. low post-money valuation) to allow you to make a superior but far less featured widget by the Christmas after next, in the hope that the environment will eventually improve and offer an exit. Sure the exit valuation may be depressed, but since the post-money valuation is kept very low there is still a prospect for an outsized return, and if you flame out, well, at least the investment wasn't large.

So that's how to know the investment environment in Ottawa is improving - when the pace of technology exits picks up, if it picks up again. If you cannot wait you should do what others are doing, which is to build their businesses with little or no investment capital. It is especially doable for software products and services. Or, you can follow the money to a sector that does have better exit possibilities, like clean tech.

There are strategies for entrepreneurs to follow that don't involve waiting for the world to change. Patience is not a virtue since the wait could be very long.

Tuesday, September 23, 2008

Unintended Consequences of Attack Ads

I don't plan on more than a few posts about the election, and even that will be non-partisan in nature since I am vehemently non-partisan; I tend to despise all the parties, and party-centred politics, to a greater or lesser degree. With that in mind I would say that the barrage of attack ads against Dion by the Conservative party can have consequences they do not intend. Let me explain by showing how I interpret these attacks.

I almost always discount any attack ad because it is so expected in any competition. We all know that Coke hates Pepsi and that Pepsi hates Coke; of course I expect that Liberals and Conservatives hate or at least dislike each other. So what? That's expected, and irrelevant to me and my vote. Instead, when I hear a non-stop series of attack ads this is what I think:
  • Every attack ad is a lost opportunity to tell me what the Conservatives stand for and what they will do in the next Parliament. They are doing this to deflect attention from an agenda I might otherwise dislike, or because they have nothing concrete to offer. That is, they are declining to promote their own agenda.

  • The relentless attacks on Dion lead me to believe the Conservatives are afraid of him because there may be more substance to him than I realized. Since the Conservatives are so worried about him I need to pay closer attention to what he is saying. If the Conservatives really thought he wasn't a threat, their campaign would ignore him.
With those lessons, learned at the Conservatives' expense through their attacks on Dion, I am now more likely to seriously consider the Liberals. It's still a dilemma for me since I have ample reason to turn my nose up at the prospect of voting for either. I may say more about this in a future post.

Monday, September 22, 2008

Whither Nortel?

As a former employee of Nortel back in the '90s I am saddened by their present dilemma. Saddened, yes, but not surprised. This slide in Nortel's fortunes reflects what, in my view, is a long-standing and never corrected problem in the fundamentals of their business. They still don't seem to know what they want to be when they grow up. That is, just what business are they in?

Let me give an abbreviated review of Nortel's history to put this in some context. At each major phase, as I will define it, I will try to summarize their business focus, with the understanding that I have to simplify, though hopefully not over-simplify.
  1. Pre-1996: Engineering-driven, hardware-based product business, selling primarily to telephone companies and secondarily to enterprises. The only significant service business was post-sales product support; telcos had their own large internal organizations to develop systems and processes to use the products. Major competitors in the carrier space were AT&T (now Lucent), Siemens, GTE, Ericsson, Alcatel, and some 2nd-tier vendors like Tekelec. It was difficult for smaller vendors to sell into the carriers.

  2. 1996 to 2001: There are two new strategies that overlap, but with different start times. With the opening of the telecom market in the US in 1996, and elsewhere shortly thereafter, there were a large number of start-ups in the carrier space. These included the large optical players like Qwest and Level 3, hundreds of CLECs, a smaller number of wireless players (by means of the then novel spectrum auctions), and cable companies (MSOs) that turned their attention to telephony. Nortel approached these companies the same way as before - sell them products to build telephone networks. Since many of these companies were neophytes, or at least new to the segments they were entering, Nortel and other vendors needed to offer more. Apart from the disastrous vendor financing, Nortel developed a services organization to help these companies build their business plans and network plans, and even to help operate their networks. Apart from network operations, the planning work was given away; these services became part of the cost of sales.

    The second strategy that began in 1998 was to begin its transformation into an IP company (Roth's "right angle turn"), since they correctly ascertained that IP was the future of networks, and Cisco was increasingly a competitor. Not having time to transform the company's products overnight, Nortel began its strategy of acquiring IP companies, starting with Bay Networks. The bidding war among Nortel and the other major vendors was intense and the number of companies bought was enormous. They could afford this with their 'dot com' inflated stock valuation.

    By 2001 when their stock tanked (along with everyone else's) Nortel had failed on both strategies. The first being to build a services business as a complement to their products business, and the second to become a leader in the IP products business, including telephony, IP infrastructure and applications. The reasons they failed are many, but are not the subject of this article. Suffice to say that 2001 was the start of a very difficult time at Nortel. Most of the services business was dismantled and purchased product companies were shuttered. The remaining product lines saw their revenues plunge as many of their customers disappeared, along with the money Nortel lent them to finance their purchases, and their former large customers merged and cut capital expenditures. Not only that, without a sustainable IP presence, Cisco and many newer companies were making large inroads into Nortel's traditional customer base and were setting the technology agenda for network evolution. And so the cutbacks began, as did Nortel's gradual slide into irrelevance.

  3. 2002 to 2008: As the company contracted they were able to bring costs more in line with revenues (even with the accounting scandals of this period), and so were able to keep the lights on. After all, Nortel still had top-notch products and engineering even if it all looked more and more like a patchwork quilt in comparison to what their customers were looking for. Revenue was also impacted by the transformation from a hardware-based business to one that was increasingly software-based. There is a comparison that can be made to the IBM PC business in the early days where they persisted with a hardware-based business model, with the software being thrown in for pretty much nothing. We know what that led to: Microsoft's growth into a giant while IBM lost the PC business entirely. Unlike IBM, however, Nortel has not succeeded in turning services into a significant business. Their business is still product based. Even with some excellent success with IP telephony this is primarily a software business with its lower revenues in comparison to hardware.
So there we are as 2008 comes to a close. There is no clear foundation in their business - it remains a patchwork. Their attempt to refine the product portfolio by selling or killing products and business units only answers short-term financial metrics and further confuses their customers. Just what kind of company is Nortel? At the moment no one really knows, and possibly not even within the company, though we can hope. Consider this quote from a recent Globe and Mail article:
Among investors, there was increased frustration that Nortel lacks the scale and the financial flexibility to become a top player again. “They're worth more dead than alive,” said a portfolio manager of one large equity stakeholder who asked not to be named.
Nortel is playing defense, not offense. That is not a business strategy; it is one that smacks of desperation, which is the view of many analysts watching the company. The stock price cannot recover until they define themselves clearly and at least put in place a credible plan to execute. Investors have to see the potential for growth to bid up the stock. As was noted in this article in the Ottawa Citizen, technology companies are often priced on growth, and so it is common to see the market cap of high-growth companies at many times annual sales. Nortel as of Friday was trading with a P/S of about 0.13!

So what are Nortel's options? I see three main strategies they could follow, or some sensible combination of them:
  • Continue as a product-based business. Unfortunately they will never regain their past dominance. They don't have the talent (or the ability to attract the best talent), the R&D budget, or the speed, and investors will not give them the time needed. They can keep the products and technology that are most profitable or promising and get rid of the rest. Management is doing that now. MEN is a puzzle since it is profitable and no one in the company knows or will say why it's being sold, so it may indeed be an indication of how big Nortel's problems are; many suspect they are doing this because they need the money now to shore up the company. If they continue with an incomplete mix of products they cannot be a prime vendor to the major carriers. Instead they may be relegated to supporting the bids of larger vendors with the few pieces they can supply.

  • Become a network integrator. If you don't have all the products in-house and want to bid on the largest network projects you must bring in other vendors to supply the missing pieces. These outside products may indeed form the lion's share of the bids. Nortel can rebuild its credibility with the carriers by becoming effective at doing this. I know they have tried (including from personal experience), though perhaps not as successfully as others. It requires building an 'ecosystem' of products that they can guarantee will inter-operate and that also use a common management framework. They must also demonstrate the expertise to support these products. They have made some missteps in the past though if they begin to take it seriously, rather than treating outside vendors as a threat, they could excel at this strategy.

  • Service business, both consulting and operations. I have my doubts that they could regain credibility in this space, especially since much of that expertise is long gone and they may not have the luxury of time to build the business. Further, the competition is much stronger now than it was a decade ago.
This article is going to end with a whimper, not a bang. I have no magical solutions to offer. At worst Nortel could be dismantled, with its valuable products and people sold piecemeal to other companies. The company is now so sick that may be all their investors will have the patience for. It will be a loss to Ottawa (and perhaps even Canada) even should those pieces survive since there will no longer be a dominant, or any, Canadian branding on those businesses.

The company has repeatedly failed in the past to reform itself into a major player with the strengths to compete in the new telecom environment. Why would this time be different? In conclusion, I am not hopeful that Nortel can survive the decade.

Thursday, September 18, 2008

Reregulation

This Business Week article is interesting since it shows how the US government is getting serious about returning to increased regulation. While this is currently being driven by the woes caused in part by the deregulation of financial markets some years back, the momentum to regulate is crossing all sectors, including food safety.

This topic is only being addressed in isolated talking points in our election. Both recent governing parties have failings in this area. For the Conservatives it is ideological aversion to regulation. For the Liberals it is a tool to manage the deficit. In my opinion, regulation should be more apolitical. Every party should be promising improvements in safety through a strong, independent regulatory regime. That's an important way for our governments to serve us.

Unrealistic Expectations For Banks and Other Financial Institutions

I had a good laugh this morning when reading this article in the Ottawa Citizen. I think it takes a totally nonsensical view of the business world. Whence comes this expectation that the most amoral of institutions, the corporation, and one whose core purpose is the generation of returns for its investors, is somehow going to unilaterally do something other than just that? Investors expect nothing less from management of their corporations, including all the various financial institutions, or they push those teams out and replace them with management that will generate the returns they expect. I wrote about this once before, so you can see a bit of my own ideological thoughts emerging here.

No, it is not the job of corporations to rein in their quest for profits. That is the job of the society in which they operate. Our appointed agent for this task is government. On our behalf, and to protect us, government must regulate corporations so that they cannot get away with, at least not for long, activities that threaten us and our society. Banking and finance are no different in this than are food safety, pollution controls, environmental management, and all the rest. Regulation of industries that can cause damage in their relentless pursuit of profit is necessary. Laissez-faire capitalism was supposed to have gone out of style in the 19th century.

Government is to blame here, and us for allowing them to get away with dereliction of duty. There seems to be an ideologically-driven tendency of governments here and in the US lately to eschew regulation. It is also driven by a desire to reduce government expenditures, which is very much in vogue with Canadian federal governments, whether Liberal or Conservative.

I'll say it again - self-regulation is an oxymoron. Give a teenager a car and he (or she) will almost certainly speed. Putting up speed limit signs while the police duck into the nearest Tim Hortons is avoiding the problem. The speed limits must be actively enforced for drivers to begin altering their behaviour. Financial regulations must be enforced if our large banking corporations are to begin to avoid risky, though profitable in the short-term, behaviour.

The responsibility is ours. We must hold our governments accountable for their failure to protect us with effective regulation and regulators. Everything else being said is just so much hot air.

Update: Here is an excellent post on this topic by Barry Ritholtz I read after writing my post. The punch line is particularly relevant here: "Chalk up another win for excess deregulation . . ."

Tuesday, September 16, 2008

Bell ExpressVu - The Shakespeare Decision

This is a potentially promising development for everyone who uses any telecom or broadcast service. It is a controversial decision, and since I am certainly not a lawyer I can't say if it is likely to be sustained on appeal.

Apart from the "administrative fees" covered in this particular case, there are more that we are all aware of. Perhaps the most ridiculous are those "system access charges" for cell phone and long distance service. These are thinly disguised basic service charges that you must pay, and are named in a way to make them appear as justified supplementary charges or even as charges required by the government. They are careful to never claim this, they just let the misleading names they give those fees lead customers into deceiving themselves.

Once upon a time Bell Canada and other telephone companies were stringently regulated on the fees they could charge for late payment of bills, service and equipment installation, unjustified service calls, and much more. A lot of it was nonsense and needed to be loosened up in our now more competitive market. The trouble is that the prices continued to rise past the point of reasonable cost recovery, and then even more fees were invented.

It doesn't end with telecom services. Think of all those shifting and unpredictable charges on air travel. Not only do they use these charges to claim that the basic fee (just like on your telecom bills) isn't changing, it's just all those pesky "extras" you are requesting. These fees are charged even when travelling on frequent flier points, making them far less valuable. Of course that is their intention.

As Shakespeare is so oft' quoted: "What's in a name? That which we call a rose by any other name would smell as sweet." Or as more popularly quoted: "a rose is a rose by any other name." That's the value of this court decision. Calling a fee by any other name does not change its true nature. In this court decision, ExpressVu's late fee administrative charge is fundamentally an interest charge. What they call it is not important. It's still a rose.

The same legal argument may very well apply to system access charges and all the rest of their ilk. This could be a huge win for consumers when dealing with quasi-monopolistic businesses. It just requires someone in Canada, someone with the tenacity of Peter de Wolf, to take on telecom service providers and the airlines.

Monday, September 15, 2008

Why Do Markets Drop So Much During a Crisis?

As we watch the markets free fall around the world today it is worthwhile to take a moment to understand why this happens when there is bad news. After all, it is fair to ask why the stocks of so many companies unrelated to those directly involved see their prices go down. Let's take a short tour, using as our example the present crisis in the US financial sector.

Before starting let's also understand the mechanics of stock prices, and the price of any security in an open market. It's all about supply and demand. If there are lots of sellers and few buyers at one point in time the price of the security will tend to drop until equilibrium is restored between sellers and buyers. Conversely, prices tend upward when there are few sellers and many buyers. It's an instantaneous effect. Other investors and speculators may then jump in as the price moves. There is no pricing god in the market - it's all people, programs and psychology.

Now let's consider factors that likely play into this morning's activity. We must also acknowledge there are huge numbers of players and while it is technically possible to track every trade (by regulators and governments) it is not possible to know why everyone makes those trades. Every trade has its buyers and sellers, and all have their unique reasons.
  • Loss of confidence in financials: Our economy, including the value of currency, is largely supported by confidence in the entire system. Lose that confidence and investors head for the hills. They will take losses in low-confidence assets and hope to shift that wealth into higher-confidence assets. Gold and other precious metals usually benefit, though gold isn't up that much today.
  • Lack of confidence is contagious: There is a secondary impact from those not directly involved in finance, such as you and me. The thinking of many is that if the experts are scared then so should I. Thus, more selling.
  • Active trading rules the day: It doesn't matter if you and I decide to hold our investments through this crisis. Share price is determined by those who are buying and selling today, and which way the balance between them tips. This includes short sales, options and other derivatives. All it takes is one seller and one buyer to move the quoted price, and there are lots of buyers and sellers today.
  • Mutual funds, including index funds: A large amount of personal investments, including RSPs, are not made directly in public companies but through intermediary funds sold to the retail market by banks and other financial outfits. When an investor sells units in these funds, and especially when there is an imbalance of outflows over inflows, the fund managers need cash to pay out those investors. Since these funds keep very low cash balances they need to raise cash by selling shares in the securities held by those funds. Thus, more selling pressure in the market.
  • ETFs: Ditto.
  • Hedge funds: Ditto. However, actual outflows are delayed since most funds only allow redemption at fixed calendar intervals. Nevertheless, fund managers may want to raise funds now in anticipation of future redemption. This is especially true of funds that are performing poorly, and there are lots of those right now.
  • Second-order effects: As prices drop in the financial sector, the wealth of ordinary people suffers at least a paper reduction. As these numbers erode, people respond by slowing contributions to RSPs and RESPs, which reduces buying pressure in the market. People also tend to spend less, which can impact the future profits of companies selling discretionary products and services to consumers, and of companies that sell to those companies. This tends to cause (with more lag time) selling of shares of companies throughout the economy. Commodity prices also tend to fall, which is of particular interest to Canada, since less business activity reduces demand for commodities of all types, including metals and oil.
  • Dollar impacts: When commodities fall the Canadian dollar is weakened because there is the implication that the future wealth of the economy is reduced. This can drive up the (local) price of some commodities and stocks that have strong ties to foreign currencies, and especially the USD.
  • Third-order effects: As the dollar drops, exporting businesses may get a boost to their revenues and profits since their products and services become more attractive. This is unlikely to fully compensate for primary business impacts but will give the stocks of these companies better relative performance.
It's all a tangled mess because our economy has so many complex inter-relationships, domestically and to other countries. At some point there will come a new period of relative stability in the markets though it is very difficult to predict at what price levels. I won't even venture to make a prediction. For now we sit and watch (and perhaps do some buying and selling) as this shock ripples around the world. Even as I type this the markets are bouncing off their lows, though there's no telling if this is or isn't a bottom.

Some Android (and Java) Lessons Learned

In my spare time over the previous 3 weeks I developed a small application for Google's Android platform. I did this to learn about it in more detail than is possible by only reading general descriptions. The utility for me is to better inform my future decisions for commercial ventures.

A secondary objective was to have a bit of fun. I have not programmed in a very long time and I wanted to challenge myself to see if I could do it. Not only have I never programmed in Java, I have never programmed in any OO language, so I needed to endure some painful re-education. Add to that my unfamiliarity with modern development tools (Eclipse in my case), mobile devices, and of course the entire Android OS with its APIs and emulator. Looking back, perhaps I should be surprised that I accomplished anything at all in just 3 weeks!

As the title says I want to relate what I learned during this process. This article will most likely be of no interest to non-programmers. To programmers it will perhaps be of no more than cursory interest. Nevertheless I find it instructive to document these points if for no one's benefit other than my own.

My first attempt to start coding in Java was to use the monkey-see, monkey-do method. This entails liberally cutting and pasting structure and code snippets from sample programs and usage examples shown in the documentation, then modifying these segments in accord with a general sense of how things ought to work. I've done this in the past and it can be quite successful. However in this case I failed miserably. The structural paradigms underlying object-oriented languages are too different from my experience with the earlier generation of 'linear' languages. An analogy to my difficulty would be when a speaker of English, French, Spanish and German decides to learn a new language by leveraging knowledge of sentence structure and common linguistic roots. This can work well when tackling another European or Indo-European language, but not for, say, Japanese. Its historical roots are entirely different.

Sadly this meant I had to hit the books (web sites) to learn the basics of OO and Java. I tried the various introductory material on Sun's and other sites, but found that tutorial material primarily targets beginning programmers, so the material was too elementary for my needs. I settled on the Java language reference manual as my primary guide. While this would be a terrible way for a beginner to learn, I was able to skip around and learn the elements of OO fundamentals, Java structure, and the particulars quite quickly - about 8 hours spread over 3 days. Though still a neophyte I now grasp the principles well-enough to understand what I see and write code that works, without simply aping what others have written.

With that introduction to my quest I will list some curiosities that popped out at me as I went along. If you already know all this stuff well, you might not find this of interest, though perhaps for others who find themselves in my particular situation (there must be some out there) it may be useful.
  • Java is a police state language. Back in graduate school (a long time ago), a visiting professor (I forget the name now but he was well-known in formal language theory circles) liked to characterize computer languages by two general attributes: easy to write good programs, and hard to write bad programs. The latter he called a police state language. Java seems to fit the latter attribute pretty well. Something you must do when faced with a police state language is to learn all the things you must do and those things you cannot do. This can be tedious unless you are very experienced with the language since it requires lots of reference material at hand to write any code at all. With a language like C, to pick a random example, you can get going much quicker, with the attendant danger that it is easy to create bugs that are hard to pin down.
  • Java has strong typing that is very flexible, but which I found hard to navigate. Perhaps my biggest difficulty was type conversion. Java, along with reams of imported classes, has several ways to do this, and it is never very clear which to use in each case. Let me give an example. I needed to do lots of conversions among String, CharSequence and Float. Depending on the classes and their methods, I had to determine if I could do a cast or use a suitable method (and the class that defined it). It can be tedious finding the one to use in each case. Then there was a case I ran into where I needed to convert a String into a CharSequence but a cast wouldn't work and I couldn't find a method that fit. In frustration I placed the String object into the parameter that required a CharSequence and ... it worked.
  • Strong typing can let you down at run time. While the Eclipse IDE is good at finding compile-time typing errors, there are many cases where this can't be done, about which the Java reference manual gives ample warning. When a type error occurs at run time it can be time-consuming to find, understand and correct. Then there was an oddity I discovered (at least I found it odd): when invoking a method with a non-void result type, say boolean or float, Java is indifferent as to whether the result is used. For example, say there's a method declared as 'public float fred (int v)'. The method can be used in the statement 'fred(x);'. The result of the method isn't referred to in its use, and Java is indifferent about it. This seems bizarre seeing how strict Java is about usage of types. There may be a good reason for allowing this in the language, but I cannot guess what.
  • Logic flow is easy to subvert into something resembling spaghetti. You can, for example, place a return statement pretty much anywhere to end execution of a method. Similarly, you can insert a finish() anywhere to terminate an Activity. This violates all I learned about good program structure; I learned, and came to believe, that logic flow should be logically nested with no ability to jump into or out of a code block except at the beginning and end, respectively. It's almost as bad as the old go to statement. Related to this is the ability Java permits to dynamically assign a type to an object. Considering what a type can be, including a large swath of code, while being nicely fungible it also allows the creation of unfathomable code. Sure, lots of languages like C allow this, but, again, this seems like an inconsistency with Java's rigidly-enforced type rules in other circumstances.
  • Locating and structuring nested objects, packages and inheritance can be important. For my first program I had a lot of difficulty deciding how to arrange classes of objects and methods. There are many ways to do so and it is up to the programmer's judgement to decide how to do this. With everything else on my plate I simply gave up and put the entire application in one package with one class, with all the primitives and other objects available to all the many methods I defined. I did keep some primitives local to a method, but that was an exception to the rule. I know I'll have to do better in future if I want to keep applications easy to work on, and also when I need to launch multiple activities.
  • A large effort is required for designing UI and persistent data storage. Applications for mobile devices have to pay close attention to interactions with the user. Time is profitably spent designing screen layouts and informative feedback. Interaction requires buttons, menus, forms and information displays of many types. That means laying out screens, usually in XML, in great detail and providing support in the code for all the user activity events as they click and type away. Of course all the data entered and generated often needs to be stored for later retrieval, which requires good schema design and database interrogation techniques. After doing all this I found that the core application code can be a small part of the actual effort!
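To make a couple of the points above concrete, here is a minimal sketch in plain Java (no Android APIs, and with hypothetical names like 'fred'). It shows why passing a String where a CharSequence is expected "just works" (String implements the CharSequence interface, so no cast or conversion method is needed), how String/float conversions go through wrapper-class methods rather than casts, and that calling a non-void method as a bare statement is legal with the result silently discarded.

```java
public class TypeNotes {

    // A method with a float result, as in the 'fred' example above.
    static float fred(int v) {
        return v * 2.0f;
    }

    // A method that takes a CharSequence parameter.
    static int length(CharSequence cs) {
        return cs.length();
    }

    public static void main(String[] args) {
        // String <-> float: no cast exists between them; use the
        // wrapper-class methods instead.
        float f = Float.parseFloat("3.5");   // String to float
        String s = String.valueOf(f);        // float to String

        // String to CharSequence needs no conversion at all, because
        // String implements the CharSequence interface -- which is why
        // simply passing the String into the parameter worked.
        int n = length(s);

        // Calling a non-void method as a statement is legal; the
        // result is silently discarded.
        fred(7);

        System.out.println(n + " " + f);
    }
}
```

The language permits the discarded result because a method call is a valid "expression statement"; the same rule is what lets you call a method purely for its side effects.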
Apart from Java, I also had a lot to learn about the Android APIs to write even the simplest program. I won't mention all of that here except to note that the documentation, while extensive, is in many cases difficult to navigate to the information I specifically need. There are also errors sprinkled throughout, though documentation quality is gradually improving. In some cases the material still reflects the older SDK and not the most recent one (0.9).

Now that I have gotten as far as developing a working application I may stop. I am mostly interested in understanding enough about Android to make better choices in application design, and not to necessarily do the development myself. Despite the difficulties I encountered due to my rusty skills I did enjoy myself. I now have some dim remembrance about why I originally got into the software field so many years ago.

Friday, September 12, 2008

Canadian Business Competitiveness

As the focus of much of my career, telecommunications is close to my heart. The business of telecommunications services in Canada, not products, stands out as an anomaly. Competition is poor, prices are high, customer service lags, and our home-grown businesses compete poorly on the global stage. These are all related.

In this CBC article (Sept. 10) about the state of the telecommunications business in Canada, this quote caught my attention:

Moreover, the safe and easy Canadian market has also kept the likes of Bell, Rogers and Telus from expanding outward to become globally competitive companies, like Britain's Vodafone Group PLC, Spain's Telefonica SA or Germany's T-Mobile, all of which are multinational empires.

"Nobody is stepping out into a world setting because they're doing quite well here in Canada, thank you," Milway says. "None of them have been able to take this Canadian base and become a world leader like [BlackBerry maker] Research In Motion."

When you have it safe in your home market, and our oligopolistic telecom market is fairly safe, competition is unnaturally constrained. Businesses operating in that sector lose their competitive edge; like any skill it must be used or it is gradually lost. Like animals in a zoo, these businesses' existence is safe, but they would have a hard time surviving should the cage doors be opened, whether to wander out into the wider world or to have outsiders wander in.

If our corporations can't compete domestically they absolutely can't compete globally. World-wide there are now many telecom corporations that have sharpened their teeth after years of brutal, and sometimes fatal, competition. Canadian telecom companies are comparatively slow to respond to threats - they are too comfortable and complacent.

Letting in a few competitors would make for some positive change. First, consumers would benefit from effective competition, which should improve pricing and service selection. Second, the incumbents could start down the road of improving their domestic competitiveness. Third, in the longer term, Canadian businesses would have the skills to expand globally.

On a practical level I do concede that cable and conventional telephony are not often multi-national in scope due in large part to the dependence on building out a wired infrastructure. Since this is not true of wireless service it is eminently suitable for open competition, despite some doubts I expressed regarding the coming round of wireless competition in Canada.

The other exportable business is the selling of expertise. Bell Canada did do this in the past (BCI) but they could not sustain its viability. I don't believe it would be successful if tried today since I doubt they have anything unique to offer foreign operators.

Our largest corporations could be the country's biggest asset in increasing our society's prosperity, if they would accept competition or if the government made them accept it. This goes beyond telecom to also cover commodities and banking. Miners, drillers and wood harvesters could move up the value chain, but most often do not. Banks could expand internationally today, and while they do so they are not as aggressive as they might be. They are also comfortable with a regulatory environment that discourages outsiders from operating here.

Overall I don't find the present situation encouraging. I say we need to open the door to more competition to strengthen our businesses and economy. Let the weak businesses fail.

Thursday, September 11, 2008

Fast Cleanup for Sewage Spill Legal Process?

Following up from my earlier post, I was pleased to see that our mayor, Larry O'Brien, plans to have the City of Ottawa plead guilty to the 2006 sewage spill charge. The first benefit is that it saves the costs of defending against the charge. Of course that will mean a possibly large fine will be assessed, which as I had argued could harm Ottawa and its taxpayers.

I am guessing he would only be taking this route if he has received some verbal assurance from the province that the assessed fine would be applied back to the city's infrastructure budget. O'Brien was quoted as saying that this was his hope, though I don't have a link for that (heard it on the radio). I don't know if the court could make this ruling directly or if the province would have to grant the money back to the city as a separate action.

Hopefully this is how it all turns out, and is what I had suggested as the best outcome. It would be nice to believe the mayor reads this blog but I very much doubt that! Perhaps contrary to typical political thinking, sensibility has prevailed at both levels of government. Now that would be a welcome change.

Wednesday, September 10, 2008

LHC Armageddon (a parody)

The following is a parody in the form of a fake news article that I posted earlier this year as a comment on a popular science blog, under a different pseudonym. With the turn up of the Large Hadron Collider scheduled for today I thought it timely to republish it on my own blog, and taking the liberty of making a few edits.

The theme was to satirize the few odd folks out there who seem to think the LHC will destroy the world, and to also take a dig at a few other loons unrelated to the LHC issue. I hope you enjoy reading it as much as I enjoyed writing it.

------------------***------------------
ZP
Foiled Terrorist Attack on Science Installation
Tuesday April 1, 12:01 am ET
By Zeppo ze Zeppelin, for Zathurts Press
Armageddon averted, for now

Geneva, Switzerland (ZP) -- Disaster averted as crazed group of Christian terrorists attack site of nearly-completed particle accelerator.

Earlier today a group of about half a dozen attackers assaulted and gained access to CERN's LHC (Large Hadron Collider) situated on the border between France and Switzerland. They were heavily armed and swiftly made their way to the interior and took hostages. The hostages, all of whom are now safe, were particle physics students and tradesmen working on the installation which is scheduled to be activated later this year.

After securing themselves and their hostages they communicated by telephone to major media outlets and CERN project headquarters. In an unexpected twist, the terrorists demanded, not that the facility be destroyed or otherwise disabled, but rather they wanted it turned on immediately or they would begin executing hostages one by one, or as they described it, dispatching them to their maker.

Specifics of their objectives remain unclear since they tended to incoherence and self-contradiction during these telephone conversations. According to a CERN spokesperson one thing they seemed upset about was a legal suit recently filed in Hawaii. Even the spokesperson seemed to struggle with the tortuous logic of the terrorists' concerns as he continued to explain. "That suit is already known to us and is an ill-considered attempt to stop the activation of the collider since they claim that it might destroy the Earth. Those fears are absolutely groundless." In response to further questions he went on to talk about the team's enthusiasm about the possible creation of novel particles such as 'strangelets' and miniature black holes.

The terrorists, however, believe that the LHC, which when it is finally activated will be the most powerful particle accelerator ever constructed, will result in the destruction of the planet. If the Hawaiian court, whose jurisdictional authority is dubious, does issue an injunction there is a risk the accelerator's activation would be postponed indefinitely. The terrorists, who are members of an extremist fundamentalist religious sect, believe activation of the device will bring the day of judgment foretold in the Bible. Apparently they wish to hasten the purported end of creation and, it seems, to prevent the 'unworthy atheistic scientists' from being the instigators of the End when the accelerator is activated in due course. One of the terrorists in an abusive rant also derided Darwin's theory of evolution and, if it were true, which he also asserted was a lie, ending the world would stop that as well.

The situation seemed quite dire until project staff traced the telephones being used by the attackers to the vicinity of one of the facility's particle detectors. The facility, being 27 km around, would have taken too long to search even with the rapid deployment of French special forces. Luckily the terrorists eschewed using wireless phones since they feared getting cancer, presumably in the short time remaining to the 'End'. Some quick calculations by a small group of bespectacled scientists led to a conference with the military commander and an unusual tactical gambit. The science team's leader, Dr. Fritz Kugelblitz, spoke afterward to reporters.

"When we discovered they were hiding inside the ATLAS detector we had an interesting little idea. There was some risk to the hostages but no more than an assault by the soldiers, and we could do it much faster." Simplifying the matter for us he described how the terrorists minds were clearly highly polarized in comparison to the general population, which they could somehow make use of. "What we did, in short," continued Dr. Kugelblitz, "was to briefly activate the detector's enormous magnets."

While the technical explanation is complicated the result was spectacular, and fatally gruesome. One of the hostages, still seemingly in shock and being escorted to the waiting medical team, told of hearing a loud hum and seeing the eyes of the terrorists glowing brightly. They then all fell to the ground with their liquified brains pouring out of their ears.

According to Dr. Kugelblitz's explanation the terrorists' highly-polarized and low-energy neurons enabled the powerful magnets to spin their brains inside their craniums at relativistic velocities. The hostages were unaffected.

When asked about the glowing eyes, he mused. "Ah! Bremsstrahlung."

How the terrorists gained access to the secure site is still unclear. It is rumored however that a security camera at one of the guarded entry portals, where the guards were subsequently found to be sound asleep, shows a dapper looking gentleman who bears a striking resemblance to media personality Ben Stein. In the video he is seen talking to the guards. Just talking.

In a bizarre coincidence, several PETA activists were apprehended on site the same day by the already-deployed military force. It was determined they were on a sabotage mission to disable the very same installation. As they were being led away in handcuffs, one of them who gave her name as Jaime Poutine, told reporters that their group is trying to put a stop to unrestrained scientific curiosity which, PETA claims, is having a deleterious effect on feline mortality rates.

"Despite today's extraordinary events," the CERN spokesperson said, "the Large Hadron Collider project remains on schedule."

Tuesday, September 9, 2008

Why Is ISP Customer Service So Awful?

This is an inescapable question for anyone using the internet. It is a mantra of successful businesses that good service is a proven method to keep customers loyal and to win new customers. Yet ISPs, particularly the large ones, often provide dreadful customer service. The poor ratings and tales of customer woes from US and Canadian subscribers are legion on the pages of DSL Reports and elsewhere.

Some time back I dropped my first broadband ISP, Rogers, after a series of infuriating encounters with their customer service. Interestingly, not long afterward I was at an industry event where a Rogers executive was speaking. He found himself having to answer a question about their poor customer service. The audience was friendly, mostly telecom industry management, so he dealt with the question with honesty and a sense of humour. All he could do was roll his eyes, agree that there was room for improvement, and that it was outside his division's scope of control.

I have to imagine what it must be like when Rogers' top management meets in its boardroom in Toronto (I've been there - it's very nice) and the topic of customer service comes up. Despite having to defend the company in public I suspect they are quite blunt among themselves because they typically don't enjoy having unhappy customers. But consider the corporate perspective. The executive in charge of customer service strives to keep the cost of it low and thus contributes to maximizing corporate profitability. Should he or she get negative feedback from executives responsible for Rogers' various products, including broadband, the head of customer service might well ask them if they are losing customers and revenue, or if their churn rate is higher than the industry average. If the numbers show that Rogers is not doing so badly, the customer service head could counter that there is no reason to increase customer service expenditures. The CEO is likely to support this position.

Similar discussions on customer service quite likely occur within other large ISPs and providers of other telecom services like mobile. While there is a large and vibrant ISP industry in Canada, thanks in large part to DSL wholesale, the competitive threat to the majors is modest; most people are content to subscribe to bundles since the internet is primarily for casual usage and they get a price break on the bundle, and they may have become accustomed to the poor customer service they get.

Considering the effective oligopoly in the ISP business, an argument can be made that the CRTC should take a heavier hand at regulating customer service here, as they do for basic telephony and cable. This is where the majors can point to all those small broadband ISPs and say there are many competitive alternatives, and that if customers really felt their service was so bad they would switch providers, yet they choose to stay. This is not the first time a dominant telecom player has ceded a fraction of the market to a large number of small companies to avoid significant government intervention. AT&T chose this path back in the 1920s, among other tactics, to avoid the threat of nationalization by the US federal government. They had in part become large by acquiring large numbers of these small companies over a couple of decades, then stopping at about 80% of the market. They kept this fraction of the market until they were broken up in 1984.

Despite their need to tolerate the many small ISPs, it is no surprise that Bell Canada is also willing to constrain their ability to compete too effectively. Throttling (and capping) wholesale DSL may very well play into the same strategy.

Poor customer service from the major ISPs has deep roots.

Monday, September 8, 2008

One Securities Regulator in Canada

Regulation seems to be becoming a current topic on this blog now that I've written a couple of articles on the subject, and I have a couple more in mind that may appear this week. It was therefore serendipitous that this Globe and Mail article by Bill Downe, BMO's CEO, caught my attention this morning.

The message in the article is clear and so does not need any elaboration. It suffices to say that I agree with it.

The one supplementary point I would make is my hope that if we do finally get a single, national securities regulator in this country that it will have teeth. There is a lot of misconduct under the radar in corporate governance in Canadian-listed companies, in which we are hardly unique. The record of the OSC, among other provincial bodies, has been less than stellar in identifying and prosecuting abuse of shareholders' trust. Mostly they wait for the SEC in the US to take action and then cooperate with the US investigation or, later, engage in mimicry with a similar but ineffective action of their own. Try to think of any successful action by the OSC against a corporation or its directors and officers, especially one undertaken on their own initiative.

So, yes, we do need one regulator, but it should also be one that holds public corporations to account. Without that it is no more than an administrative bottleneck for Canadian businesses, as Mr. Downe alludes to in his article.

Media Trivialization of the Federal Election

The trivialization of our electoral process has begun. It all started Sunday with the, get this, live coverage of Harper's walk to Rideau Hall. Live coverage? This is nothing more than the theatrics of political ritual with no meaningful content, yet the media gives it live coverage. Perhaps they think Jean will tell Harper, "well, no, you cannot dissolve parliament."

The media seems to think it is necessary to fill air time and column inches with a quota of material even if there is not that much substance to report on any given day, or even every hour on the hour. Thus they often focus on the trivial. Get ready for long discussions about colours of ties and dresses, and, when the candidates go casual, how many buttons they leave undone at the tops of their shirts. They may even bring in experts to expound upon the importance of buttons to the candidate's message to voters.

The media will also periodically complain how the candidates and their teams craft their glib answers to questions that are really just oft-repeated campaign slogans. The same media will then go on to only quote those very same sound bites in their reports. They may even analyze them in great detail to add depth.

Why do they do this? What is the benefit? Who benefits?

The media tend to over-hype predictable and lengthy events like elections since it is one way to draw an audience and sell advertising. Federal elections fit their need nicely now that the summer olympics are over. More air time means more ads. So they inflate the trivial to draw an audience and keep them glued to their television sets in order to maximize profit. They know that a large fraction of the population is vulnerable to entertaining coverage, so that is what they seek to do - entertain, not inform.

Certainly they will report on consequential matters as they arise and, apart from the aforementioned live coverage of the Rideau Hall visit, what I did hear today was actually pretty good. However the election has only started, so there is much that is new and fresh. Within a week or two I expect we will have to wade through lots of dross to find useful material. They will carefully space out the good bits so that we are exposed to more ads, and therefore to more of the trivial stuff that fills out the otherwise empty minutes in between. Election coverage is an excellent example in support of my previously-stated view that all streaming media news is a waste of my time.

I believe it will be better to use the internet for election news, even if only to visit the sites of the very same media outlets. Just avoid the embedded streaming media and other trivial content so you can better use the time to learn those things that will inform your voting choice.

Friday, September 5, 2008

Headline Sentiment

A random selection of article headlines about gold I see this morning:
There's lots more of the same, especially when you are willing to dig below the higher-rated blogs and news services. You don't need to read any of these articles to learn something important (I read only one of them).

When any stock or sector suffers a rapid decline (there are lots of those right now) armies of rationalizers materialize seemingly out of nowhere. They argue vehemently against the decline and pounce on any rumour of manipulation, ignorant speculators (never themselves!), supply problems or unheralded demand to argue that the decline is unjustified and the turn is coming soon. Real soon.

The message I see is that those still in the declining stock or sector are in denial. Yes, they may be right eventually, but for now they are wrong. So what will they do if the decline continues, for whatever reason? Eventually they will sell, get taken out on margin, or perhaps simply get very quiet as they cuddle their worthless paper assets. This means that right now there is more fuel to drive the decline further.

The same thing happens with rising stocks and sectors, except then it's those on the short side that raise the biggest fuss. You don't have to look far to find examples; energy was a good one as oil rose to $140 and natural gas to $13.

If you are holding any equity, or even looking to get in at the 'bottom', when the rationalizers are at their loudest, you should rethink your inclination. It doesn't mean it would be wrong to invest there, it just means you need to look past the noise to see if you have a solid, evidence-based reason to be there. That reason can be based on fundamentals or technicals. Also, ensure you have a contingency plan, such as a rigid stop loss, in case you get it wrong.

Thursday, September 4, 2008

Self-Regulation is an Oxymoron

You know the drill by now: aircraft safety, food safety, water safety, bridge safety, nuclear safety, and so forth. The public (all of us), as users and funders/purchasers of these critical goods and services, rely on some certainty of safety, however it may be enforced. Market forces in many cases are sufficient to incite producers of those goods and services to take safety seriously, but there are cases where that is insufficient. That is where regulation comes in.

You cannot regulate yourself. Sure, if you use a loose dictionary definition of regulation it is possible - just keep an eye on yourself to confirm you're performing as you should to some self-stated set of criteria, and do it diligently and honestly. In the commercial realm, regulation requires a more stringent definition where personal safety is involved. Regulation should at least include:
  • Performance standards and metrics to reach a desired outcome. These should be based on sound scientific methods of analysis and evidence.
  • Structural separation between the regulator and the regulated. The intent is to avoid common commercial or other interests that would stain the impartiality of the regulator.
  • Accountability of the regulator and the regulated. As it is often said: a contract is only as strong as the penalty clause. There must be consequences for failure to meet performance standards by all involved in the process.
  • Process oversight. Yes, even regulators need to be regulated. The entire regulation process should be defined and monitored by an accountable industry or professional body, or a government agency protected from political interference.
As I read about the Maple Leaf saga, the lack of action on recommendations resulting from the Swissair disaster, and other recent stories, I notice how one or more of these fundamentals of regulation have been violated. There is room for improvement, along with constant vigilance to reduce the frequency of relapses.

Like it or not, politicians set the tone so it is up to us to hold them accountable. Investigative reporting serves to shine a light on things we might not otherwise be aware of, for which we ought to be grateful since those who would benefit from a failure of effective regulation thrive in the shadows. If the public fails to act on these revelations we can expect more of the same in future. Ultimately the responsibility is ours alone.

Wednesday, September 3, 2008

Calm Before the Volatility (Market) Storm?

This afternoon I called up chart after chart of small and mid-cap technology and telecom stocks trading on NASDAQ to see if I could see any commonality in their technical behaviour. There appears to be a sameness among them, though I would only call this sample anecdotal, not a true analysis of the stocks in these sectors.

A surprising number of them are in volatility squeezes (narrowing of Bollinger Bands), sitting on top of their 50 DMA lines, or otherwise trading very placidly on low volume and low price volatility. Some of this low volatility is due to low volumes across the markets in the lead up to Labour Day although, so far, not much has changed this week. This leads me to suspect that the market, or at least the technology sector, is likely to see the start of a strong trending move, up or down, in the next week or so. My gut feel is that it'll be up.
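For readers unfamiliar with the term, a volatility squeeze can be spotted numerically: the relative width of the Bollinger Bands shrinks as a stock trades placidly. Here is a minimal Python sketch of that calculation, using the usual 20-period, 2-standard-deviation convention and made-up price series rather than real quotes:

```python
# Sketch of a Bollinger Band "squeeze" check on a closing-price series.
# The price data below is fabricated for illustration only.
from statistics import mean, pstdev

def band_width(closes, window=20, k=2.0):
    """Relative Bollinger Band width over the most recent `window` closes."""
    recent = closes[-window:]
    mid = mean(recent)                  # middle band: simple moving average
    sd = pstdev(recent)                 # population std dev of the window
    upper, lower = mid + k * sd, mid - k * sd
    return (upper - lower) / mid        # narrower bands -> smaller number

# A placid series: tiny oscillation around $25 (squeeze candidate).
quiet = [25 + 0.05 * ((i % 4) - 2) for i in range(40)]
# A volatile series: much larger swings around the same level.
wild = [25 + 1.5 * ((i % 4) - 2) for i in range(40)]

print(round(band_width(quiet), 4))  # small relative width
print(round(band_width(wild), 4))   # much wider bands
```

A chartist would typically flag a squeeze when the current width sits at or near its lowest value over some lookback period, rather than against a fixed threshold.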

Commodities on the other hand look like they're in for a longer period of slumping prices. It isn't only that energy, gold and the rest dived after Gustav missed hitting any critical infrastructure, it's that they dived below the price levels before the threat from Gustav was priced into the market. Next week should be a good time to fill up the tank.

Cloud Computing: History Doesn't Repeat, But It Does Rhyme

Shades of the 1970s! Back in that ancient period we had cloud computing. Really. If you're a young 'un, let me take you on a brief tour of the world of computers and networking back then as experienced by a small subset of the population, including me.
  • Computers were almost always large mainframe or minicomputers located at a distance from users which they accessed using dumb terminals and (usually) proprietary dedicated networks or 300/1200 bps telephone modems. All the applications and data storage were centrally located. We called it time-sharing, but it is essentially what we now call cloud computing.
  • We could look to see who was logged onto these central computers, and if they were we could send short text messages to each other. Instant messaging did exist back then. When these computers were networked we could even message across the continent and overseas.
  • Store and forward text messages became widely available by the early 1980s, extending the more rudimentary systems available in the 1970s. We had email, too.
  • I would like to say we had webcams, but sadly the digital imagers we had back then had extraordinarily poor resolution, a few shades of gray, were terribly expensive, and came with no software tools to retrieve and manipulate images. Nevertheless it was done by the more adventurous with lots of spare time.
Of course it isn't the same this time around. Cloud computing builds on a vastly different environment. The personal computer, cell phone and internet made for a remarkable transformation. Now everyone is connected, from pretty much everywhere, rather than a few geeks talking to each other from terminals they had to travel to use.

Cloud computing in today's sense is far superior, with only a passing resemblance to the time-sharing systems of an earlier era. The best thing about it in my opinion is that we get to regain the freedom of being untethered from our little electronic boxes that contain all of our applications and data. This means we can reach our personal data environment from anywhere, without the risk engendered by carrying expensive, fragile and bulky hardware around with us (and probably without adequate backups). With all the computers I use, and the cranky, unintuitive networking that Windows provides, I go crazy sometimes when trying to find some file I stored somewhere on one of these boxes, or perhaps even on one of several web-based services. Ideally, cloud computing will sort out this mess, making all of our lives a little easier.

Will Google win this particular battle? Maybe, though I don't think it matters since there will be competitive alternatives. Better still, I am hoping that enterprises will be able to adopt these cloud computing platforms and replace the many proprietary networking tools that confuse and confound both their employees and IT departments. IT departments will also win since they will be able to centralize software and hardware maintenance, even do backups and provide social networking tools for their users, rather than trying to manage a sprawling population of individual PCs, each with its own little hardware and software quirks, and even quirkier users.

It'll be just like the 1970s all over again. Except this time I'll be a comfortably dumb user, and not working in IT on those old mainframe behemoths in too-cold machine rooms.

Snakes & Frogs

In the glorious weather this past long weekend in the Ottawa area, during an outdoors excursion I stopped to rest in the cool shade of a particular tree. The exact location is unimportant, but it is recognizable and a comfortable place to rest. It reminded me of a curious if common event that happened in that exact spot a couple of years ago.

The tree was large and alongside an overgrown ditch and a bare patch of packed dirt and stone. As I was sitting there I noticed a brown frog sitting in the dirt a meter or so from my feet. It blended well into the browns and greens of its surroundings. Probably the only reason I noticed it is that it hopped a short distance. Looking down at it I also saw a garter snake slowly slithering towards it.

At first I didn't see what was happening. The frog was very clumsily moving away from the snake and the snake was ineffectively trying to sneak up on the frog. The frog only hopped about 10 cm each time, then it would sit there quite placidly. When the snake approached again it would do the same thing, always hopping directly away from the snake's general direction of movement. The frog was doing the absolute minimum to keep the snake at a distance. Conversely, the snake was getting nowhere in its apparent quest to catch the frog.

Looking at it from an evolutionary perspective, I should not have been surprised at the mutual tactics of predator and prey. If frogs were too successful at eluding snakes, we might expect fewer garter snakes in the world; if snakes were too successful at catching frogs, it is frogs that would see their numbers diminished. However it comes about, there is an unsteady equilibrium between the two at the point where they are very nearly equal in their opposing strategies. That is a very neat conclusion, if an oversimplified one, since there are other species that would fill the void in the predator or prey landscape if the equilibrium between frogs and snakes were upset.

To continue the story, by means of a series of small hops, the frog landed a short distance in front of my feet. In its attempt to come around toward the frog, the snake discovered me, in particular my legs and shoes. Changing direction to make use of this welcome obstacle, the snake came around the heels of my shoes. While it formed itself to the curves of my shoes I kept as still as I could. Very slowly it used the screening my shoes afforded to get closer to the frog than it had yet managed. The frog, ignorant of the looming threat, sat quite still.

When the snake's head reached the farthest reach of its blind, the tip of my right shoe, it was about 10 cm from the frog. That's when it coiled up and lunged. Its tiny fangs got hold of one of the frog's hind legs and held on tight while the frog tried to escape. The snake held on quite easily. Its quest for lunch was won.

I continued to sit very still since I was very curious to see how it would make its meal. I saw two problems. One was that the frog was about twice as thick as the snake's mouth and body. The other was that the snake's only tool was its teeth. To get more of the frog into its maw it would have to do it in steps, needing to release its grip at every step.

I discovered much later that garter snakes are mildly venomous, just enough to keep the frog from having the reflex speed to escape in the brief time the snake needs to use its backward-curving teeth to move up the frog's body. Probably that's why it waits after the initial grab before continuing its feeding.

There's no point in describing the deadly process further other than to note that the snake was perfectly capable of gradually drawing the entire frog into itself, taking about 15 minutes in all. With its now distended body the snake slowly slithered back into the shaded ditch to digest its meal, while I continued on my way.

Despite its macabre nature it was fascinating to watch something that in our urban lifestyle is often only understood intellectually rather than observed. Nature can be cruel, but that is the story of life on Earth. This never-ending game between predator and prey is one of the fundamental drivers of the process of evolution, weeding out the relatively weak while leaving the strong to reproduce, which over eons, quite accidentally, produced us.

Tuesday, September 2, 2008

Political Leverage of Innumeracy

With all the talk about a federal election this fall, I will instead focus on provincial Ontario politics for this look at how numbers and public inattention are used by politicians for their own ends. I'll use a trivial example to start.

Recently the Ontario government announced a $1.1B allocation of funds to municipalities for various infrastructure needs. Ottawa's share is $77.2M. Never mind for the moment that this is our own money (transferring our provincially-paid income and other taxes to another level of government). The first question is, is there anything remarkable about this amount of money? No, not really. Being fair-minded, meaning they want to avoid showing favouritism if at all possible, they would tend to allocate the funds in proportion to municipal population. Take Ottawa's share as a percentage of the $1.1B (7%), then apply that same percentage to Ontario's current population (about 13M people), and you get about 900,000. This is pretty close to Ottawa's metro-area population (excluding Gatineau, of course).
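The proportionality check is simple enough to verify in a few lines. This sketch just redoes the arithmetic with the figures quoted above:

```python
# Is Ottawa's share of the $1.1B fund roughly proportional to its
# share of Ontario's population? Figures are from the paragraph above.
allocation_total = 1_100_000_000   # $1.1B provincial infrastructure fund
ottawa_share     = 77_200_000      # $77.2M allocated to Ottawa
ontario_pop      = 13_000_000      # ~13M people in Ontario

fraction = ottawa_share / allocation_total
implied_pop = fraction * ontario_pop

print(f"Ottawa's fraction of the fund: {fraction:.1%}")  # about 7%
print(f"Implied population: {implied_pop:,.0f}")         # roughly 900,000
```

The implied figure of roughly 900,000 people is what makes the allocation look like a straight per-capita split rather than anything remarkable.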

There is also the matter of whether that 7% is proportional to the taxes we pay into provincial coffers. I will leave matters of wealth redistribution unaddressed in this article since, provincially, it doesn't appear to be as big a factor as it is with federal to provincial transfer payments.

Now that I've dealt with this simple case as an introduction I will move on to a slightly more obscure matter - how politicians announce program funding increases. You know the type I mean, where, with some regularity, you hear an announcement such as X% increase in funding to, say, health care. At first blush these announcements sound welcome, yet what are we really getting? To look at this properly we need to first consider two secular economic effects:
  • Population Growth: For the past decade the population growth in Ontario has stabilized at around 1.3% annually. Despite some rhetoric about how immigration (a significant fraction of our population growth) steals jobs and so forth, the GDP does roughly scale with population. This is as expected since every new person (immigrant or born here) is, in equal proportion to the population, working, paying taxes and buying goods and services. Therefore every Y% growth in population roughly increases government revenue by Y%.
  • Inflation: This effect is better known, but let's recap it briefly. When inflation (a measure of the increase in price of a basket of goods and services) occurs there is again, roughly, a proportional increase in GDP, taxes and, yes, government revenue (here's one document to wade through if you have an hour to waste). Recently, GDP has been tracking 0.5%-1% less than inflation + pop. growth, but over the long term this relationship generally holds, which you can see from the charts near the bottom of the referenced page. Government programs are part of the same economy, so the cost of those programs also roughly tracks inflation. As an aside, ideally we'd like to see GDP growth higher than pop. growth + inflation, as evidence of higher productivity, but this is sadly not the case in Ontario.
What this means is that if a government program is currently funded to the tune of, say, $10B, and a government minister announces a $300M increase in funding this year, that program is quite likely being cut back. With inflation running at ~2.8% right now (due in large part to energy) and population growth at 1.3%, funding would have to increase by about $410M just to maintain existing service levels (assuming other factors, like productivity, are unchanged).
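This stand-still test reduces to one line of arithmetic. A sketch using the post's own 2008 figures, and its rough assumption that required funding scales with inflation plus population growth:

```python
# Compare an announced funding increase against what inflation plus
# population growth alone would require to hold service levels constant.
# Figures are the post's 2008 numbers; the scaling rule is its rough model.
current_funding = 10_000_000_000  # $10B program
announced_raise = 300_000_000     # $300M headline increase
inflation = 0.028                 # ~2.8% annual inflation
pop_growth = 0.013                # ~1.3% annual population growth

# Increase needed just to stand still under the rough model.
required_raise = current_funding * (inflation + pop_growth)

print(f"Required:  ${required_raise / 1e6:.0f}M")  # about $410M
print(f"Announced: ${announced_raise / 1e6:.0f}M")
print("Real cut" if announced_raise < required_raise else "Real increase")
```

On these numbers the $300M "increase" falls about $110M short of standing still, which is the whole point of the calculation.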

In this calculation I am assuming the government program is offered to everyone and costs are aligned with inflation. This is rarely true for any one program, for no other reason than statistical variance in inflationary costs, but would be about right across all programs.

In conclusion, be wary of politicians bearing gifts! Look just a little bit closer at the funding increases they keep announcing, especially with a federal election growing nigh. Notice how they tend to announce absolute dollar amounts rather than percentages, or if they do mention percentages how they omit the context. In 2008, a 3% to 4% annual increase is not an increase - it's a static funding level. A 2% increase is a funding cut. Regardless of your personal values and beliefs about government policies and programs, you need to arm yourself against the machinations of politicians.