In Part 5 I walked through the history of wireless (cellular) telephone service and how the telco has struggled to maintain control over that phone. One point I emphasized was that the line of demarcation, that imaginary or not-so-imaginary line, is where one side definitely belongs to you and the other side belongs to them. The nature of the phone itself makes the line a bit fuzzy. I suggested that the line is at the interface between the SIM card and the phone.
This works well enough for basic cell phones, those that do little more than make phone calls and support vertical features like Caller Id and voice mail, and a few simple phone-resident features like a directory, call log and ring tone selector. There is little question that in this situation the telco is very much in control. That situation changes with the smart phone.
To begin, let's go back about 10 years. At that time the smart phones were not very smart. It was still quite a challenge to pack enough computer and data communications hardware into a hand-held device. There was also the matter of the radio technology, both in the phone and over-the-air, that made data communications difficult, expensive and slow. Most of the wireless infrastructure was still analogue, not digital. If you had a phone back then you'll know what I mean.
Many of those early smart phones were Symbian-based from Nokia. The screens were still monochrome and low resolution, and almost all without keyboards. If you wanted computer technology in your pocket you used a PDA, a space which was ruled by Palm. As the technology evolved there were endless discussions about who would win the smart phone wars (though they weren't often called that at the time), the phone makers or the PDA makers. I would argue that the ultimate winner was neither, but rather a small Canadian firm that started with an email device that functioned like a 2-way pager. That of course was RIM. It didn't stop there, as next came Apple and ... well, so on to today.
Let's return again to 1999. To get the best use out of the technology it was necessary to customize the services and communications protocols to fit within the constraints of both low-bandwidth and low-computing capacity. This was the dot-com era yet it was not possible to give unfettered internet access to these devices; browser software was much too large to fit in the phones and the data networks could not handle the browsing traffic load, even though it was almost exclusively text plus static images. This was bad for users and an opportunity for the telcos.
Look at the diagram on the right. Even in 1999 voice minutes were only modestly profitable on the wired network, and accounted for most of the profit on wireless. The telcos knew as well as anyone that voice minutes would decline in profitability, so if they were to sustain their profits they needed to replace that revenue with something else. Vertical services were, and are, one important piece of the puzzle. If you didn't already know, the profit margin on vertical services like Caller Id, voice mail, call waiting, call forwarding and so on is close to 100%. About their only costs are service provisioning and marketing, since the technology to provide those services is already in the networks and the phones.
When it came to data services, it is no surprise that they envisioned a similar structure. At the time they charged very high prices for basic data transport to and from the phone (and regrettably still do in Canada), and expected a gradual decline in price, similar to what they were seeing for telephony. Vertical services were seen as the likely profit leaders. The leading service was email, but there was much more. Here is where they exploited the limitations of the technology. With open TCP/IP internet access out of the question, they made themselves the gatekeepers of the specialized software components and protocols that enabled service access. This was the origin of the term walled garden.
The telcos worked with some of the then high-flying tech start-ups that were at the forefront of the software and services specially built for the wireless networks of the day. A couple of notable companies were Openwave (for WAP - Wireless Application Protocol - and gateways) and Infospace (service broker, payment processor and WAP-optimized user interfaces). For a time they were on top of the world as the world's largest carriers pounded on their doors, practically begging to buy their stuff. Of those two, Infospace was the more fragile since, in true walled garden fashion, each of the carriers had its own plans for services and payment systems. The services enabled ranged from the basics, like phone directories and restaurant listings, to the more valuable like travel schedules and reservations, email, even concert tickets, and, yes, all those popular and often-annoying ring tones.
The telcos' idea was to get a piece of the action in each of these transactions, much like the credit card companies. Infospace collapsed as the telcos developed their own WAP-based services, but even those eventually went nowhere. Yet for a while they did have some success with this strategy.
Time marched on and very soon those phones really were becoming smart phones. Users were also chafing at the restricted offerings and limited choices that were painfully obvious when compared to what was available from any computer with an internet connection. The market pressures were strong. Once the PDAs became internet-enabled the carriers gradually opened TCP/IP gateways that let through a first trickle of internet traffic, which soon threatened to become a torrent. Sure, they were charging a lot for data transport, but they were gradually losing control of the most valuable part, the vertical services. The sweep of technology-enabled change was faster and stronger than they could control.
Today it is common to use smart phones with large, high-resolution LCD displays, copious memory and fairly powerful computers and attendant software to access the internet much like from any desktop. Surprisingly there are still some remnants of those walled garden services here and there, but it is the internet that now rules. The telcos are back to charging for megabytes and minutes, which is exactly where they don't want to be.
Even these businesses are under threat. With phones that will preferentially select a Wi-Fi connection if one is available, more of the wireless data traffic is being diverted to other networks and services. VoIP and web-enabled voice IM are threatening the revenue from voice minutes. They can't stop use of Wi-Fi but they are still able to seriously impair competitive voice services.
The next article in this series will look more closely at smart phones and competitive voice services. I'll show how the demarcation point is no longer a point but instead a coordinated set of distributed points within the phone and in the network.
Friday, January 30, 2009
Tuesday, January 27, 2009
Tax Cuts - Thoughts Before the Budget
Can tax cuts stimulate the economy? Or, to put it better, is the dollar amount of a tax cut a better or worse stimulus than placing that same amount elsewhere? We want to get the most out of the debt our federal parliament seems certain to saddle us with so we should try to get the best possible value. I am watching for this in today's budget announcement.
Of course we all love tax cuts, but they are not always the right answer. Let's consider the recent GST cuts. If these were an effective stimulus we should expect to have seen an increase in economic activity and consequent tax revenue. This doesn't appear to have happened. I say doesn't appear since it is difficult to disentangle the effects of this one item from all the other, sometimes larger, factors affecting the economy and government balance sheets. I am no expert so I will refrain from offering an opinion. All I can do is note that this is being said by many who do have some level of expertise in these matters.
Whether the experts are right or wrong, for the moment let's assume they are right - the GST cuts did not stimulate the economy. Why might that be? If I think about my own behaviour it makes some sense. Frankly, I have not noticed the impact of the GST cuts. A 1% or 2% change in the total cost of my purchases pales in comparison to the price volatility I encounter. Gas prices can shift 10% or more overnight. Some basic foodstuffs have gone up 50% in the past year. A decline of 2% is negligible in this environment. It is invisible. Because it has been outside my notice, it has not altered my purchasing behaviour. I believe the case would be different for high-priced items like cars, but I haven't bought one that recently.
I don't know if my experience is typical so I will avoid generalizing. It does however cause me to wonder what kind of tax cuts would affect me. My guess is that they would have to be taxes whose very presence, or absence, is impossible not to notice. One example would be income tax. We all peruse our paycheques and grimace at the large amounts in the tax box. Should that number go up or down, we immediately notice. Another is capital gains, although it is a minority of the population that directly pays taxes on these. Again, it's right in your face as you fill out your tax return.
I suspect these are the sort of tax cuts that have at least a chance of altering consumer behaviour. The GST cuts are simply too unremarkable and spread too broadly to be very helpful. This is where targeted government-sponsored spending projects are quite different, in that they concentrate capital in selected sectors and regions. This makes them highly visible. The visibility is not always good since, because it is concentrated, there are those who go without the stimulus. Yet even if there is dilution by spreading the capital over many projects country-wide the effects can be considerable since they become newsworthy. That can boost confidence, and therefore spending - initiating an upward spiral. Perhaps this can get us by until the US economy shows signs of recovery.
I am not terribly happy with any stimulus, as I wrote once before in regard to the auto sector. If we are to have it then let's get it doing the best job possible for us by focusing investment not on financial sinkholes, but on areas where there is a strong likelihood of increased citizen confidence and an eventual return on investment, or at least something close to break-even.
Labels:
Politics
Monday, January 26, 2009
Controlling The Phone - Part 5: Wireless
Back in the first article of this series I described how the demarcation point between you and the telco has had large consequences for the shift in power away from the telco. Consumers benefited when an industry was built up around providing innovative (and not so innovative) products and services to them once the telco was kicked out of the home. That was for legacy telephone service provided over copper loops. Now we come to wireless where many of the same battles are being re-fought.
About 2 out of 3 Canadians have a cell phone. In some parts of the world the proportion is much higher, and in a few places cell phones outnumber people. Yet for all this phenomenal success the industry is relatively new, having only achieved notable commercial stature about 25 years ago. This industry was not instigated by telcos; it was driven by entrepreneurs. The telephone companies saw little need to lead the charge since the new service appeared to have little utility, being expensive and perhaps of interest only to a niche market. When it did gather significant momentum and they saw that cellular service would certainly grow to challenge their monopoly on telephony, they swooped in and bought out the entrepreneurs. These transactions benefited everyone: the entrepreneurs got rich and consumers got other benefits. This was mostly a US story since in most other countries, notably in Europe, it was the traditional telco that typically introduced cell phone service.
In those early days the phones were analogue and, while sophisticated for the time, were unintelligent devices; it was still difficult to get computer hardware and software into those tiny devices, and in any case they would have had little to work with since there was no general-use data channel. That only became generally available a little over a decade ago.
Standards were still unstable, so we also had the situation where the mobile service provider had to control the phones that could use its network. In important respects the environment was very similar to wired telephony back in the 1950s and 1960s, pre-Carterfone. There was no demarcation point; the mobile phone provider owned the whole show, even your phone, regardless of whether you leased or purchased it. The phone was nominally untethered by the fact it was wireless, yet the tendrils of control were so strong you could almost see the wires trailing on the ground behind you as you moved about.
Cell phone technology has evolved. Pick up your phone and see if you can find a demarcation point. It should be clear that it isn't outside the phone since that is all owned by the network operator, including the spectrum, towers and switching equipment. Do you own the phone? You did purchase it, so in one sense you do. To use a weak analogy, let's compare this to software. You do not truly own the Windows software on your computer; it is owned by Microsoft. You are licensed to use it within certain limits they define, often called RTU - Right To Use - in the business. Similarly, while you own your phone, your use of it is limited. Sure you can throw it out the window or stomp on it if you please, but when it comes to actually using it, you find that the operator controls the phone.
The phone is tied to the network and uniquely identifies you and your service contract. This is accomplished by hard-coded data within the phone. It is possible to break this lock with some expense or with time and ingenuity. This is less necessary now since, depending on the technology in use, many phones have the network codes on a separable device - the SIM card. This means you have more control over the phone you purchased, except when you want to use an operator's over-the-air voice and data services. That is generally considered a reasonable compromise. You could say that the demarcation point is in the socket on the phone into which the SIM card is inserted.
The SIM card came about largely as a result of government action in Europe. There was always something functionally similar inside the phone, but regulations caused it to be exposed to subscriber control. It existed because phone manufacturers, including companies like Nokia, Samsung, Motorola and others, sell their products to many operators with unique requirements on how the phones need to be controlled. Good engineering and cost management dictated that the customization should be constrained to as few components in the phone as possible.
We should not skip over the business reasons why the wireless operator feels compelled to control the cell phone. They typically offer it as a loss leader packaged in with a service contract. Marketing departments know that consumers are put off by a high initial price. So they lower the up-front price by selling the phone to you well below cost, sometimes for free, knowing that they'll capture the revenue during the contract life. Simple arithmetic dictates that the charges for service are well above cost. If we paid full price for the phone and paid a lower monthly rate, in the end the total expenditure would be about the same. It's just marketing. Except that once they have you, you keep paying every month, usually even past the original contract term. They like that, and they use various techniques to dissuade you from switching, some of which can be of questionable ethics.
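To make that arithmetic concrete, here is a minimal sketch in Python. The prices are entirely made up (a $600 phone, a 36-month term, and monthly rates chosen to illustrate the point); no real carrier's plan is implied.

```python
# A minimal sketch of the subsidy arithmetic; all prices here are assumptions.
def total_cost(phone_price, monthly_rate, months=36):
    return phone_price + monthly_rate * months

subsidized = total_cost(phone_price=0, monthly_rate=70)      # "free" phone, higher rate
unsubsidized = total_cost(phone_price=600, monthly_rate=53)  # full-price phone, lower rate

print(subsidized, unsubsidized)  # 2520 vs 2508: roughly the same over the contract
# The catch: after month 36 the subsidized plan keeps billing the higher rate,
# even though the phone was paid off long ago.
```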
Coming back to the phone, we have a clear demarcation point on the modern cell phone, or at least most of them. Yet we still need to connect to a wireless operator's network to get to use the phone. In the early days this wasn't too controversial since all you did was make and receive phone calls: a phone was just a phone. That is no longer the case. Now we have smart phones, exemplified by Apple's iPhone, a leader in this rapidly-growing market segment. In time we can expect that all cell phones will be smart phones, exploiting their data connections and increasing internal capability to support a wide range of software applications.
In my next article I'll take on smart phones, from their first appearance to their probable future, continuing my focus on how the telephone companies continue to control the phone.
Labels:
Technology
Friday, January 23, 2009
Comcast Digital Voice and VoIP
I was surprised to hear this week that the FCC is going to probe Comcast's differential treatment of internet telephony services relative to its own Digital Voice service. The issue, supposedly, is that since both are VoIP, treating them differently amounts to discrimination. In particular, the VoIP-based services of others can be throttled like other internet traffic, while Comcast's own service is not. Therefore this becomes a question of network neutrality.
This is hogwash. The problem appears to be a confusion between technology and service. This may or may not be deliberate on the part of an FCC that has been, by many accounts, biased against the cable companies versus the telephone companies. Regardless of the political angle, let's look at the technology and understand better what is going on here.
First, VoIP is a technology, not a service. Like that of every PacketCable-compliant operator, which covers pretty well every North American cable company, Comcast's own telephony service is based on VoIP technology. Without delving into technical details, this means that the voice and associated call-control signalling ride on the TCP/IP protocol stack, which itself rides on top of the digital coaxial cable service (or hybrid coax+fibre), which serves as Layer 1 (the physical layer) of the protocol stack.
Like every other service on the coax, telephony service is allocated exclusive bandwidth (or a channel if you prefer). This channel is quite narrow in comparison to the coax capacity, which is typically at least 750 MHz these days, except that since the coax is common to many subscribers the total bandwidth consumed can be much higher. There are other channels set aside for digital television and broadband internet. The television channels are quite wide but the amount of bandwidth is independent of the number of subscribers. Broadband internet bandwidth, like telephony service, scales with the number of subscribers.
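As a rough back-of-the-envelope illustration of that sharing, here is a sketch with assumed numbers (a 750 MHz plant, a 6 MHz voice channel, four roughly 38 Mb/s downstream data channels, 500 homes on a node); none of these reflect Comcast's actual channel plan.

```python
# Back-of-the-envelope sketch of how the coax is shared; every number is an assumption.
plant_capacity_mhz = 750        # total usable RF spectrum on the coax
telephony_channel_mhz = 6       # channel set aside for cable telephony
internet_channels = 4           # downstream data channels, ~38 Mb/s each
subscribers_on_node = 500       # homes sharing this segment of coax

internet_capacity_mbps = internet_channels * 38
print(f"Telephony occupies {telephony_channel_mhz / plant_capacity_mhz:.1%} of the plant")
print(f"Shared internet capacity works out to "
      f"{internet_capacity_mbps / subscribers_on_node:.2f} Mb/s per subscriber")
```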
The fact that VoIP is used for the cable company's own telephony service is a red herring. They could use that bandwidth to support any other technology that is digital. The first generation of cable telephony was circuit-switched digital, not packet. And not all packet technology need be TCP/IP. But I suppose that since it wasn't VoIP technology it didn't trigger any zealotry.
In fact, the situation here is equivalent to the telephony and broadband segregation on the telephone company's DSL-enabled copper loops. Here each loop is dedicated to a subscriber, at least as far upstream as the DSLAM, with the analogue telephony and broadband internet services frequency multiplexed. Voice extends up to about 3 kHz, with the broadband data channel starting somewhat above that and continuing up to (depending on the technology) several MHz. Since the modems and telephones can misbehave when exposed to the other service's content, the DSL provider has you put filters on each of your phones (the DSL modem has its filter built in).
So all this FCC action is about inflaming latent resentment against the big bad company and not about actual bad behaviour. Love them or hate them, Comcast is doing nothing wrong. In this case it's the FCC at fault.
Maybe next they'll go after Verizon or another telco, wondering why plain old telephone service isn't subject to the throttling on DSL internet access services. The FCC would still be in the wrong, but at least then they'd be consistently wrong.
Labels:
Technology
Wednesday, January 21, 2009
Venture Exits (Again)
I've written about this before, as have so many others, but it bears repeating: the possibility of exits, and lucrative ones at that, is at the heart of the venture capital market. In that light I want to draw attention to this blog item by Om Malik.
No IPOs and no acquisitions mean no appetite for investment. Until investors, both angels and VCs, replenish their coffers by liquidating current assets, and do it at break-even or better, not much will happen, in Ottawa as elsewhere.
Counterintuitively, if you do believe in this sector then this would have to be a terrific time for investors to become more active. Valuations are about as low as they can go, which means they can get a larger slice of start-ups and increase potential returns. This brings to mind the old market cliche that when there's blood in the streets, buy!
Will investors see things this way? Probably not. Indeed that is one important reason why valuations are dreadful for both public and private tech firms - no buyers, just sellers. It takes a lot of faith or insight (or both) to operate counter to the prevailing trend. Yet I could not advise anyone to take the risk since it is so difficult to tell whether that dim glow in the distance is light at the end of the tunnel or a trick of the eye.
Labels:
Business,
Technology
You've Never Done That Before
Like most everyone I do pay attention to mass media. It's hard to avoid, and should not be avoided, since so much good and timely news comes to us that way. We do need to remain wary since there is a lot of blather in the mix, some of it downright intellectually insulting.
One of the common trite themes in the media (and, yes, also in blogs) is to critique those who aspire to a position or level they have never before held. It is so easy to say: he or she has never done that before. That becomes a platform from which the commentator then goes on to demean or dismiss the person. I find that incredibly irritating, especially since the message so often comes from mediocre journalists or "talking head" analysts who have themselves never achieved much of anything.
This style of put-down is ridiculous. If no one ever tried to achieve objectives that they'd never before accomplished, every human being would remain in their crib for life. Never tried crawling before? Who do you think you are, baby? And walking? Well, you can't even crawl yet. This is the domain of the incorrigible pessimist, the lazy or incompetent (who hate to see others succeed) and the schadenfreude-prone (who like to see others fail).
This particular subject came to mind as I read about Yahoo!'s new CEO, Carol Bartz. I was quite astonished at the number of articles and other commentary that criticized her for never having run a company like Yahoo! before, arguing that this portends dire consequences for both her and the company. Of course we also heard the same thing about Obama and the presidency both during and after the campaign. We get the same in domestic politics, with the wags going nuts saying, "I told you so," after Dion bombed out. But at least he tried, which is more than can be said of so many of the naysayers. Since when is striving and failing an indictable offence? It's about improving ourselves and others, or at least making a valiant attempt. Bartz may fail, but then, like Zafirovski at Nortel, she is taking on quite a challenge. Good for her, I say.
I'm sure you could come up with countless other examples, from sports teams (they've never won the cup before), to individual athletes (how can he possibly expect to break that world record), to friends and family (she's never run before so she'll surely hurt herself trying to improve her health). I also see it a lot in the Ottawa technology community. Investors are always willing to bet on people who have successfully built (and exited) a start-up, yet are frequently dismayed at the idea of betting on an untested team. But avoiding such bets holds good people down and misses out on potentially high returns (with the customary risks). What you then see are many budding entrepreneurs cut down and opportunities lost. It's a shame. If Ottawa's tech scene continues to decline this attitude will be partly to blame; it is wrong to blame the problem solely on those who are making the attempt. Yet that is what happens in private with many I have known, even if the public reasons for failure are quite different.
Regardless of what your own endeavour may be, in business, community action, amateur athletics, or even making your own furniture, listen to yourself. The critics may be right, but critics by their nature always criticize, especially when you attempt something they have not had the courage to attempt themselves. They don't know, and perhaps you don't either. Try something new occasionally anyway. Failure isn't so bad since you're sure to learn something, maybe even something important.
Monday, January 19, 2009
Economic Ties to the US
In the US it's Martin Luther King day, so the US markets are pretty much all closed except for some futures activity. It's holidays like this that remind us how dependent the Canadian economy is on the US, seeing that the bulk of our economy is export-based and the majority of our exports go to the US.
On this lazy day on the Toronto Stock Exchange, indices and stocks are barely moving and with pathetically low volume. Those that do show some activity are often issues that are sensitive to the futures markets, and usually take their cue from international markets.
While this certainly is a welcome break it is also sad in that it makes plain just how much of our economy is not really under our control. The politicians can prattle and spend all they like but a recovery in Canada will not come until the US recovers. There is no short-term solution that we can implement here.
Labels:
Markets
Sunday, January 18, 2009
Controlling the Phone - Part 4: Intelligent Networks
In Part 3 I introduced the concepts of stupid and intelligent networks, and how intelligent networks have been making their appearance in the telephone networks. At the close of the article I talked about Intelligent Networks (IN) as a specific class of implementation of the more general concept, but held off on talking about why IN was of importance to the telcos back in the 1990s.
You may be surprised to learn that it had nothing whatever to do with you and me, the consumers of telephone services. From the telco's perspective, you are, to a degree, a captive audience that they already keep under control by means of restricted signalling at the network edge (Part 2) and various non-technological means, including government lobbying. When many consumers chose to ditch their wired phones in favour of a mobile phone, the telcos bought the cellular phone companies, further restricting alternatives.
IN was instead driven by the consequences of the love-hate relationship between the telcos and their suppliers, in particular the major manufacturers like our own Nortel Networks.
As with anything having to do with business, we need to follow the money to understand this point. Observe the diagram on the right. Here we see the consumer, you, as the source of all revenue. For a little over a decade now, you have been able to select from one of very few service providers. They in turn spend your money on the equipment and services needed to operate the services you've paid for. In an ideal world, you are satisfied with the value you are getting for your money and the telcos and their suppliers all make a reasonable profit.
Dig a little deeper and you find more forces at play than financial balance sheets. You depend on the telephone company, and you now have some, but not a lot, of choice. Similarly the telco has a dependence on the suppliers of the hardware and software that comprise its network and services. This latter dependence is at the heart of IN.
The switches from Nortel and Lucent, among others, are not simply switches. They also contain the software and the signalling endpoints that enable the bulk of the services you use. Because these switching platforms are closed systems the telco must request features from the manufacturer. The process is long and tedious as the parties negotiate, with the manufacturer looking for an up-front commitment on sales volume before even beginning development.
The telco has other difficulties. In North America they pretty much all use switching equipment from just two suppliers: Nortel and Lucent. Since they use both, it does them little good if, say, Lucent agrees to build a new service when Nortel does not. The telco needs to offer the service to everyone, otherwise the marketing and operations are a nightmare, and the regulator gets on their case for failing to serve segments of the population. I've been a part of these sorts of negotiations and I can tell you there is no love lost on either side. There is a mutual dependency between them, but it isn't a happy one.
Then there's the matter of pricing. Here we see a common thread with other software products. When Nortel sells a software package to a telco, even though the price is high it is a one-time sale. In contrast, the telco uses that software to generate recurring monthly revenue forever. The manufacturer has had to accept this situation since they in turn have a dependency on the telco - the telco is their sole distribution channel for reaching the consumer market (see the diagram above). When the manufacturers did try to structure software pricing so that they got a piece of the retail action (they called this "risk sharing") it was no surprise that the telcos pushed back, and pushed back hard.
As you can see there is ample reason for friction between these two sets of corporate behemoths. Theirs is not a warm and cozy business relationship. With this knowledge we can now understand why IN matters:
IN was the telco's way of wresting control of service content, cost and delivery from their suppliers, with the ultimate objective of winning a greater share of the profits. To control the phone, your phone, they needed to free themselves from the clutches of their suppliers.
The telcos also had to establish signalling standards so that equipment from disparate suppliers could talk to each other as software intelligence grew in complexity and capability. They and their proxies (primarily Bellcore, since transformed into Telcordia) attempted to use the standards bodies to advance their cause.
All of this took many years. It not only sucked up the time and resources of many major corporations, the entire concept suffered from some fundamental problems. Here are what I believe are the worst of these problems:
- Expertise: There has always been a tendency, among those who wish to exercise control, to trivialize the scale and complexity of the technology that comprises the network. This has often been true of the telcos. "It's just software," you could almost hear them say. The truth was harsh to those aspirations. Whatever one may think of the equipment manufacturers, one should acknowledge the depth and breadth of expertise they have developed. For all their own expertise in operating networks and marketing to a vast population, this was expertise the telcos did not have. Even when handed the tools they made little progress. A host of proprietary service creation environment (SCE) products came on the market, all of which promised great riches from new services yet failed to deliver more than the basics. Even those were bought at great expense due to technical challenges (see the next bullet). APIs like Parlay and JAIN came on the scene and did little better. Most telcos ended up outsourcing their feature development back to those same manufacturers, but by paying for the show they retained much of the control over those services.
- Theoretical challenge: The base code of the switching equipment is, as already mentioned, closed. Unlike, say, Microsoft's platforms, no APIs are provided to enable outside software development. IN was an attempt to define a middle layer, somewhere between raw code (like C) and high-level flow charts, that would define atomic service building blocks and the operations that could be performed on or with them. These would plug into a limited number of "triggers" implemented on the switches (see the sketch after this list). Anyone with a passing knowledge of formal language theory would likely become intrigued, or even uncomfortable, on hearing this description. It's a rat's nest of theoretical problems, some approaching intractability. As the difficulty came to be understood, the scope of IN was scaled back. In the North American system, AIN, there were precious few triggers exposed by the switch call model, and even those had to be treated tenderly to avoid catastrophe. Interactions among components were hard enough on their own, and were made immensely more difficult by opening up the software to outside agents with unknown and unpredictable behaviour. The problems remain to this day.
- No new features: If you're like me you use the phone to call people and talk to them. You also make some use of additional (vertical) features such as Caller Id, Call Waiting, Call Forwarding (to voice mail), Call Screening, Hold, Conference, and perhaps Calling Name and a few others. What more do you need and are willing to pay for? The answer is: not much. Even when more features are bundled into the service price they are rarely used. Most suffer from being only occasionally used, which means you cannot remember how to use them. Using them is never easy because of the limited signalling capabilities that the network exposes to consumer devices. There are only so many dial codes and combinations of hook-switch flash that a person can deal with. What people want is not more vertical phone services, which they certainly will not pay for, but rather more communications possibilities, such as those enabled by the internet. There is no market demand for IN-enabled services.
- Consumers are tapped out: We are not looking for ways to spend more on phone service, but rather to do more, or even the same, for less. If a telco does somehow succeed in inventing and deploying a new vertical feature that garners some market interest, there is little willingness to pay. Consumers are refusing to spend more on phone calls. They are looking beyond that to smart phones and the internet to deliver something totally new, which they may indeed pay for.
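Here is the sketch promised in the "Theoretical challenge" bullet: a toy illustration, in Python, of the trigger-plus-service-logic idea. The trigger name and the screening rule are hypothetical simplifications for illustration, not the real AIN call model.

```python
# Illustrative only: a toy model of external service logic armed on a switch trigger.
from dataclasses import dataclass

@dataclass
class CallEvent:
    trigger: str   # hypothetical trigger names, e.g. "off_hook_delay"
    calling: str
    called: str

def screening_service(event):
    """Toy service logic: block calls to premium-rate numbers."""
    if event.called.startswith("1900"):
        return {"action": "disconnect", "announcement": "call_blocked"}
    return {"action": "continue"}

# The switch exposes only a small, fixed set of trigger points.
TRIGGER_TABLE = {"off_hook_delay": screening_service}

def process(event):
    handler = TRIGGER_TABLE.get(event.trigger)
    # With no trigger armed, the switch falls back to its built-in call processing.
    return handler(event) if handler else {"action": "continue"}

print(process(CallEvent("off_hook_delay", "6135551234", "19005550000")))
```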
Lastly, VoIP (internet voice) now has enough momentum to finally kill the legacy voice networks. It won't happen quickly but it will happen. VoIP threatens to upset all the telco's plans to maintain control of your phone and protect the revenue of the services they sell. VoIP and mobile will be subjects of future articles in this series.
Labels:
Technology
Friday, January 16, 2009
Adjectivitis and the Weather
Superlatives are superlatively superfluous. Or something like that. Adjectives are a bit of a disease that you learn to avoid in business (if you dare) since they add nothing and can alienate those they are meant to impress. Are you intrigued by products that are the best, or new and improved, or indestructible? I hope not. We should all have become just a little bit wary of these verbal attacks on our credulity.
Something like this happens when it comes to Canadians and the weather. Cold enough for you? It isn't enough to say that it's -30 C, we have to add that it is the coldest this winter, this decade or this century. And it isn't just cold, it's very cold or extremely cold. Never simply cold.
We hate the cold, but seem to revel in enjoying how much we hate it. So we harp upon the temperature as if cold weather in January were remarkable. Or perhaps it's just me that's a bit jaded since I've lived in a colder climate than that in Ottawa. I don't remember which comedian said this, but I like the line: Canadians have to be tough since just the weather is enough to kill you.
But the thing that really gets to me is that we just have to add in the wind chill. Now it's no longer -30, now it's -40. To emphasize the point, many will even forget to mention that it's a wind chill number, not actually the temperature. Wind chill is Canadians' way of adjectivizing numbers; saying -40 is apparently a more sophisticated way of saying it's -30 C with a breeze.
The thing is (and you can do this experiment yourself), if you take a thermometer outside where it's -30 C with a howling wind and hold it out there for a while, the reading will quickly drop from +20 C and then slow as it approaches and finally reaches -30 C. The wind chill may be -60 yet the thermometer will steadfastly refuse to go any lower, and it certainly will not go to -40 where the mercury will, the urban legend says, freeze and then explode in your face. None of this will happen. The reason it won't happen is that the temperature is -30 C, not -60 C, despite the attraction of those beautifully-large negative numbers.
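For reference, the current Environment Canada (and US NWS) wind chill index comes from a simple formula; here is a small Python version so you can see how far the "feels like" number drifts from what the thermometer itself reads. The example wind speed is arbitrary.

```python
# Standard wind chill index: air temperature in degrees C, wind speed in km/h.
def wind_chill(temp_c, wind_kmh):
    if temp_c > 10 or wind_kmh < 5:
        return temp_c   # the index is only defined for cold, windy conditions
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

print(round(wind_chill(-30, 20)))   # about -43 "feels like"; the air is still -30 C
```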
If you're adventurous, you can try the above experiment slightly differently, in line with this fascinating snippet from the Wikipedia entry on wind chill:
The method for calculating wind chill has been controversial because experts disagree on whether it should be based on whole body cooling either while naked or while wearing appropriate clothing...
I would love to observe these experts when they're comparing these competing techniques. Who knew scientists were so tough.
So what does it all mean? It means that it's -30 C with a nasty, heat-sucking wind. That's just what it is. When it's windy that thermometer gets to -30 C faster; the wind, like a fan pulling air through a car's radiator, causes a more rapid transfer of heat. Adjectives need not apply. It's just that a wind chill temperature is sexier than watts per square metre, which does accurately tell you how cold it feels.
When Environment Canada went to this latter style of reporting in parts of the country years back, people hated it. I think they hated it even more than metric. No one complained that the reports were wrong - they were in fact extraordinarily correct, surprising everyone who knows to never trust a weather forecast - but a statement like "the wind chill is 2,400" just didn't excite people. There were no opportunities for adjectives. Eventually they gave it up and went back to the temperature style reports which, while mostly meaningless, fit better with Canadians' world view, which includes not trusting the weather report.
The only cure for this disease is called spring. Then we can complain about how wet it is and try out some new adjectives to describe that.
Wednesday, January 14, 2009
Nortel - The End Game Begins
Perhaps I should apologize first for adding to what is by now a deluge of blog posts and media coverage on the latest milestone in Nortel's downfall - protection from its creditors. It is nevertheless wrong to pass it over without a few words since I am a part of the Ottawa tech scene and was employed by Nortel in the past.
Despite the media attention, in the near term things will be pretty much business as usual for Nortel and its employees. They have a real and viable business, and customers that depend on them being there. That won't change. What will happen is that behind the scenes the company will be assessed solely on its present market value by the financiers. Promises, leadership, vision, strategy... all those things are now dead. Now it's a matter of coldly looking at the assets, including the value of the various enterprises, and selling them to willing buyers. No one believes that there is any added value in keeping those pieces together except perhaps as a bundle to simplify the sale to a single buyer, if there is one. If there's any money left over after the assets are sold, shareholders may see something out of it. Realistically the chance of that is slim to none. Nortel shares are now truly worthless.
The attempt to stick the blame on someone will continue as it has for some time already. In that regard I am in agreement with Gordon Pitts that it is unfair to tar and feather Mike Zafirovski. His error was to take on the job of CEO with insufficient due diligence; had he done more, he would have known that he did not have all the resources needed to turn the company around. True, he could not fully predict the melt-down in the economy, but he could have at least known that Nortel was more fragile than expected. It's like the flu: a healthy person who catches it is out of commission for a few days; someone who is already critically ill dies.
I hope that matters work out for the best with the company and its employees. Those I have spoken to recently are incredibly fatalistic. Ask them about the company and all they do is shrug their shoulders and give a thin smile. It seems that all they want is closure instead of this long drawn-out uncertainty. At least now they may get that.
As to the market, I have to say I am very often wrong about my guesses yet I am unhappy to have gotten it right with my most recent post on Nortel's stock. Perhaps like the proverbial broken clock that is right twice a day I sometimes, too, am right.
For market players there is one lesson to be learned from the trading in Nortel's shares the past few days and most especially in pre-market (and post-announcement) trading in the stock this morning. You might wonder, who is buying all those shares that everyone is dumping like rats jumping off a sinking ship? Short sellers are the most likely answer. They've made their money and now want to lock in profits by covering their short sales with shares bought on the market. This activity keeps the shares above zero. While shorts could get a few more pennies by holding their positions down into shareholder oblivion, their positions would then be locked in for a protracted period which can be costly. So they buy. Anyone else who was buying this morning made a bad mistake.
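A minimal sketch of that arithmetic, with an entirely made-up position size and prices, shows why covering now beats riding the shares all the way to zero:

```python
# Hypothetical short position; none of these numbers are real trades.
shares_shorted = 100_000
entry_price = 2.50   # price at which the borrowed shares were sold
cover_price = 0.30   # price paid this morning to buy them back

locked_in = shares_shorted * (entry_price - cover_price)
left_on_table = shares_shorted * cover_price   # the most extra gain from holding to zero

print(f"Profit locked in by covering now: ${locked_in:,.0f}")
print(f"Maximum extra gain from waiting for oblivion: ${left_on_table:,.0f}")
# The small extra gain comes at the cost of borrow fees and months of tied-up capital.
```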
Controlling the Phone - Part 3: Stupid vs. Intelligent Networks
Years ago, before the internet became the phenomenon that it is now, David Isenberg put forward the idea of the Stupid Network. In a short and very readable paper he laid out the basic tenets of the concept and its benefits, which included accelerating the deployment of new technologies and driving innovation by unlocking control of the network. In essence, the network would continue to support foundational functions such as transport and routing, while service intelligence was to be located outside the network.
This certainly did not endear him to his employer, AT&T. The telco business model was, and is, strongly dependent on maintaining control over the network, and therefore the services it enables. They were right to be concerned. All these years later, having nearly achieved what was called the death of distance, much of the telco's profits come from features above and beyond the basic business of making voice and data connections.
Now, before we look at the competing concepts of intelligent versus stupid networks, it may be helpful to compare telephone networks and computer networks, and especially how the latter evolved over the years.
Up to about the early 1980s the principal model of computing reflected the underlying economics. Computers were large, expensive and required trained experts to install, operate and maintain them. Software was constrained by the hardware and communications channels. This necessitated thin clients and centralized intelligence. Therefore we found dumb terminals, often shared rather than one per person, low-speed and error-prone communication channels, and large time-sharing mainframe computers that supported dozens or even hundreds of users. Over time the terminals got increasingly intelligent and those central computers did get smaller and less expensive, yet this model of a dumb edge and a smart centre persisted for a long time.
In my youth, before I got involved in telecom, I worked in this business and got to know it very well. I changed my career focus as the technology evolved. When personal computers (PCs) began to make inroads I, along with the rest of the industry, made the move. Rather than a small number of computer manufacturers like IBM, Honeywell, Burroughs and Amdahl, innovation and entrepreneurship flourished. Microsoft, Lotus and others appeared, and they and the hardware vendors sold direct to businesses and consumers. I still remember the incredible degree of empowerment my co-workers felt when, with a PC and spreadsheet right there on their desks, they could do stuff without pleading for someone (like me!) to write some software and get them access to valuable machine resources. Anyone with eyes could see right then how powerful a concept it was to have an intelligent "edge" on the computer network.
The virtuous spiral did its job well, creating even more opportunities for everyone. Within a surprisingly short span of time the mainframe was sidelined - not quite dead, but dying - and everyone had a PC on their desk.
It was inevitable that all these sophisticated yet isolated devices would leverage the network effect by starting to communicate with each other. It started with the simple things, like email, then suddenly the web was among us. The internet is today the ultimate stupid network, capturing very well the essence of Isenberg's vision.
It is no coincidence that telephone switching followed a similar path at first. Yes, it did lag some years behind computing but that was justified, at least partially, by the need to harden the technology for 24x7 service availability. Even so, by 1980 the switching equipment was almost all digital and computer-driven. But then matters languished for many years despite the introduction of some very sophisticated hardware and software inside the networks, far on the other side of the demarcation point, and out of the reach of consumers (see Parts 1 and 2). The phones looked prettier and did a few new things like speed dial, in-home wireless, speaker-phone and the like that did not require sophisticated signalling to the network. That was not and is not made available to residential customers.
Inside the network (PSTN or Public Switched Telephone Network) some changes were apparent. There were now feature servers and a closed signalling network for sole use by the telcos - SS7. I won't bore you with a lot of industry jargon, so let's just call it by its most generic title: the Intelligent Network, or simply IN.
As I mentioned in part 2, residential customers were not offered signalling that made use of the underlying network capabilities. There were limited exceptions, mainly targeted at enterprise customers, but also offered to competing carriers when they were forced to do so. (That I'll perhaps cover in future so that I can focus here on you and me, the typical residential customers.) So, while the network evolved, mainly providing operational and economic benefit to the telco, consumers saw very little in the way of substantial improvements. To them the network looks much the same now as it did three decades ago!
This may seem odd since surely there are lucrative opportunities to earn more revenue by offering customers more, or at least better, services. But as I've mentioned before, in the non-competitive local telephone market that was in place pretty much everywhere in the world until the late 1990s, they had little incentive to do so. It wasn't just their fault; governments and regulators share the blame for making them complacent. Again, that's another story and I do want to talk about the technology.
Which brings us, after a very long and hopefully not too sinuous a path, to IN. This became a force in the industry through the 1990s, which may seem surprising since the work on IN was virtually invisible to everyone outside of the telecom industry. Despite the massive effort put into IN by the telcos and the major (and minor) equipment vendors, in the end it was stillborn. It does have some successes, which are most notable for being so rare. For one, there's the Calling Name feature. Others include call screening features that go by various names, including Call Director and Call Intercept, and then there are a few more sophisticated services like Verizon's iobi.
Why did IN fail to fulfill its objectives? Why did consumers never see the rich set of features that IN promised? For this we have to go back and understand the telco motivations to build IN, some nasty technical issues, and in particular (to reprise the title of this series) their overarching need to keep control of the phone. The primary reason why they even bothered to pursue IN, and, make no mistake, it was driven by the telcos and not the equipment vendors, is probably very different from what you might guess. I'll get into that in Part 4 of this series.
Labels:
Technology
Tuesday, January 13, 2009
Air Canada Gets It Wrong, Again
It is truly astonishing when a company is persistently unable to respect its customers. Customers are of course responsible for every dollar of revenue a company takes in. It is simply not sustainable to abuse your customers except in a monopoly or oligopoly. How fortunate for Air Canada that it is operating in the latter environment. Even so, they continue to push their customers closer and closer to the edge.
This article in today's Globe is worth a read. It isn't anything new for those of us who have been long-standing Aeroplan members and have made use of reward miles. Air Canada's tactics over the past several years have, in my case, driven me from their program. This was relatively painless since I was only using points, not adding to them, because I had otherwise taken to avoiding Air Canada.
Aeroplan, like the similar programs of other airlines, has been commonly known as a loyalty program. The idea is that benefits accrue to those who patronize the sponsoring company over a long period of time.
I won't repeat what the article reports, in particular regarding how Air Canada is whittling away the benefits, and adding to the costs, of remaining a loyal customer. What I do want to do is expand upon a couple of points that the article does not mention.
...Aeroplan spokeswoman JoAnne Hayes ... emphasized that mileage levels required for Classic rewards have been virtually unchanged since Aeroplan was founded in 1984.
In recent months, a round-trip reward flight booked between Toronto and Vancouver cost 25,000 Aeroplan miles, plus an $80 fuel surcharge, $4 in GST on that surcharge and $46.55 in various other fees.
Sounds reasonable, doesn't it? Well, it isn't. What Ms. Hayes fails to mention is that the cost to consumers of earning those miles - Air Canada's revenue - has gone up in proportion. That is, we pay more now to acquire points since ticket prices have gone up over the years. It is dishonest to allude to their increased costs without mentioning that revenue per mile has gone up as well.
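A rough illustration of the point, using invented fares and earning rates (only the 25,000-mile reward level comes from the article):

```python
# Hypothetical numbers to show rising revenue per reward, not actual Aeroplan data.
reward_miles_needed = 25_000           # Toronto-Vancouver Classic reward, per the article
miles_earned_per_round_trip = 5_000    # assumed earning rate

for year, fare in [(1984, 400), (2008, 800)]:   # assumed average round-trip fares
    flights = reward_miles_needed / miles_earned_per_round_trip
    spend = flights * fare
    print(f"{year}: {flights:.0f} paid round trips, roughly ${spend:,.0f} in fares "
          f"for one 'free' ticket")
```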
My final point is that Air Canada, by their punitive conditions on redeeming points, is turning their best customers against them. People who have 40,000 points to spend on a winter getaway earned those points by flying with Air Canada, not once, but many times. These are the people Air Canada must keep satisfied since they provide so much of its coveted business. Others who scrape together points from LCBO purchases over several years, while certainly not to be abused, are of lesser business importance. Air Canada may get away with their bad behaviour with the latter group, but not the former. Aeroplan has become a disloyalty program.
So, with Air Canada it's still the same old, same old. Hopefully Westjet will dig the knives deeper into Air Canada now that, after all these years, they are planning their own loyalty program. Their record tells me they are more likely to do it right.
Labels:
Business
Friday, January 9, 2009
OC Transpo Strike - Financial Windfall For Taxpayers
There was a good article in the Ottawa Citizen on the financial implications of the current contract Ottawa has with the Amalgamated Transit Union, and what the union wants. The author, Randall Denley, correctly points out how these costs affect you and me, the taxpayers. So far, so good.
Now look at this quote from the article, related to the recent survey taken of Ottawans' opinion of the fairness of the city's offer to the ATU:
Despite the inconvenience, a city poll shows that 63 per cent of the public backs the city's position while only 14 per cent are taking the union's side.
This provides a deeper insight into the public's attitude. The majority in Ottawa do not use public transit and so are not directly inconvenienced by the strike except in increased commuting times due to traffic congestion. I am part of that majority. Not only is the inconvenience small, this strike is putting money in my pocket. Here's how.
It hasn't been widely talked about, although Alain Mercier did mention it publicly at least once, that the longer the strike continues the better for the city's financial position. The reason is that OC Transpo operates at a loss. With operations suspended, those losses are averted. Regrettably I don't have the numbers at hand, although I do vaguely recall that the amount is some millions of dollars per month (Update: $3M/week according to this article).
Great! Not only am I not inconvenienced, the longer the strike the better the city's books will be balanced. That means prospects for lower property taxes. Count me in.
Perhaps this makes me look overly mischievous, but keep in mind that the ATU is looking out for its members' self-interest so I feel justified in looking out for mine. I want Council to hold firm.
Labels:
Politics
Palm and Competitive Shakeouts
One business area that is of some particular interest to me is that of smart-phones. Because of this I paid some attention to the announcements this week from Palm about their new phone, the Pre, and its operating system and features.
It's getting rave reviews, yet it may not matter. I don't have a crystal ball so I won't foolishly predict whether or not this will turn around their fortunes (or even bump their stock back up to its glory days). What I do have to wonder is whether, no matter what they do at this point, they have a hope.
Technology moves fast. What was science fiction yesterday is today's innovative darling and tomorrow's everyday product. The transition from today to tomorrow is one of business and competition. That is, once the innovator has done his or her work, many companies will jump on the bandwagon and start producing competitive products and services. Some of these will be great and others will suck, but in its totality there is a steady progression of improvements.
This is usually the stage where venture capital comes in, by placing bets on products and teams who they believe can come to dominate the new product category. They know most will not survive - that is the nature of the beast. For the most part, markets tend to initially embrace diversity and then punish it. All that competition tends to cause especially uneven quality and incompatibility among competitors' designs. This is rarely sustainable in the long run. Eventually consumers and enterprises make their final choices and stick with them. Then the shakeout begins.
Smart-phones are like that. Today we have Apple, Nokia, RIM, Palm, Google, Microsoft and more vying to be one of the survivors in the smart-phone software and applications space. We have seen the same thing many times before: VCR formats, PC operating systems, office productivity suites, PC manufacturers, CRM, online auction-houses, social networks, and so very much more. Palm is not well-positioned right now relative to its competitors, and we know from history that quality is often not a meaningful indicator of success. Market momentum can count for much more.
I have my own interest in this particular smart-phone battle since I have made a business bet on Google's Android system being one of the survivors. A bet is necessary to play in this game, and while I can do all the due diligence I please there can be no certainty that I have made a good bet. All I can do is wish Palm luck with their new product and fighting spirit. But make no mistake, the market momentum is not moving their way despite this week's positive exposure so their battle is an uphill one. That hill is steep.
Labels: Business
Thursday, January 8, 2009
Ban Historical Rationalization
I had to laugh, in sympathy, at this rant by Paul Kedrosky. Media and analysts have a pathetic habit of comparing the current market and economic downturn to those in the past. This is what we expect from sports commentators (e.g. how do the Sens' road losses compare to...). In sports that at least makes some sense, since the rules of the game were much the same decades ago, so there is some semblance of respectability in the comparisons.
It is not so for messier real-world matters like the economy. It is different this time. The causes are different, the economic structures and instruments are different, people's investing, purchasing and saving behaviours are different, and the politics are different. I, too, harped on this once before but it bears repeating: there is always a first time, and it is untrue that there is nothing new under the sun.
This does not mean that it's all gloom and doom. I am reasonably optimistic about investing prospects in 2009. There are lots of great deals out there, provided you don't expect to rake in big profits inside of a short time span.
Labels: Markets
Wednesday, January 7, 2009
Controlling the Phone - Part 2: Residential Signalling
In the previous article I stopped after showing how a demarcation point permitted competition for phones and wiring in the home. What I will now discuss is how the functionality, or intelligence, of home equipment remained limited by signalling technology.
Every home with classical telephony service (POTS) is served by a pair of copper wires called the local loop. It's a loop because the two wires form the two sides of an electrical circuit, with the generator (the switching system port) in the telco central office and the load (telephone or modem) in the residence completing the circuit. You might hear this 2-wire loop called a twisted pair or tip-and-ring, which refer to electrical matters outside the scope of this article. You can also ignore the 3rd and 4th wires you'll see if you cut open that cable in your home, since they are unrelated to the telco service. Also not relevant is that nowadays the copper loop terminates in a multiplexer or DSLAM, since this equipment keeps the electrical "view" at your home identical.
This circuit supports both AC and DC. When the phone is inactive (on-hook), the circuit blocks DC. Power ringing is AC so that it can operate the phone's bell, whether an electro-mechanical ringer or an electronic one. That is one example of signalling that the telco provides to you. Another example of AC signalling from the central office is the 1200 bps data burst that delivers Caller Id.
When you go off-hook to answer the phone or to dial out, both the AC and DC paths are active, except that the AC signalling is now kept to voice-band levels and frequencies. Examples include DTMF digits, conversation (voice or dial-up data), the Call Waiting tone, and so on. Some features are initiated by the user with a flash signal, which is a momentary interruption in the DC path, but not so long that it looks like you've hung up.
That's pretty much all. Now, try to imagine all the nifty features and services you can build with that range of signalling options. Not easy, is it? There isn't much to work with here since you are dealing with signalling technology that dates from around World War II.
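To make the point concrete, here is a minimal sketch, in Python, of that entire signalling vocabulary as described above. None of this is real telephony code and the names are my own invention; the point is simply that the whole repertoire fits in one small table.

from enum import Enum, auto

class LineState(Enum):
    ON_HOOK = auto()   # DC path open; only AC signals (ringing, Caller Id burst) reach the phone
    OFF_HOOK = auto()  # DC loop closed; voice-band AC (DTMF, speech, tones) plus the hook flash

class Signal(Enum):
    POWER_RINGING = auto()    # AC ringing voltage from the central office
    CALLER_ID_BURST = auto()  # 1200 bps data burst delivered while the phone is on-hook
    DTMF_DIGIT = auto()       # in-band tones dialled by the subscriber
    VOICE_BAND = auto()       # conversation, dial-up data, Call Waiting tone
    HOOK_FLASH = auto()       # brief interruption of the DC path to invoke a feature

# The entire "vocabulary" of the analogue loop, as described above,
# fits in this one small table.
VALID = {
    LineState.ON_HOOK:  {Signal.POWER_RINGING, Signal.CALLER_ID_BURST},
    LineState.OFF_HOOK: {Signal.DTMF_DIGIT, Signal.VOICE_BAND, Signal.HOOK_FLASH},
}

def can_signal(state: LineState, signal: Signal) -> bool:
    """Return True if this signal is meaningful on the loop in the given state."""
    return signal in VALID[state]

if __name__ == "__main__":
    print(can_signal(LineState.ON_HOOK, Signal.POWER_RINGING))  # True
    print(can_signal(LineState.ON_HOOK, Signal.DTMF_DIGIT))     # False: must go off-hook first

Every residential feature of the analogue era - Caller Id, Call Waiting, Call Forwarding and the rest - had to be squeezed through that tiny table.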
Originally this paucity of signalling was necessary because of the primitive technology (from our modern perspective), which was designed for good economics, reliability, and to make it possible to simply dial and talk. Caller Id was added over 20 years ago and had to be carefully engineered to work reliably without affecting users who didn't subscribe to it. DTMF (or touch tone) came many years before that.
The technology constraints that dictated this simple signalling disappeared many years ago. Attempts to modernize the local loop and its feature capabilities started back in the 1970s. The history of these (mostly) failures isn't pretty, especially for me since I participated in it for over 10 years during the 80s and 90s. Perhaps the biggest boondoggle of that time was ISDN, so let's look at it briefly since it exposes the theme I am developing in this series of articles: how the telephone company seeks to keep to itself the technology that enables revenue-generating services.
ISDN was a digital service that, for homes and small offices, supported two 64-kbps channels for voice or data and a 16-kbps channel for packet data and fairly sophisticated call signalling. The signalling system was extensible and comprehensive, able to support some very interesting and innovative services, especially when combined with those two voice-data channels. While the protocol was not based on IP, it certainly could have supported it (remember, this was long before the internet entered the wider world).
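To make that channel structure concrete, here is a minimal sketch in Python. The dataclass and field names are my own and purely illustrative; nothing here comes from the ISDN specifications themselves.

from dataclasses import dataclass

@dataclass
class Channel:
    name: str
    kbps: int
    purpose: str

# The basic rate arrangement described above: two bearer (B) channels plus
# one signalling/packet (D) channel.
BRI = [
    Channel("B1", 64, "voice or circuit-switched data"),
    Channel("B2", 64, "voice or circuit-switched data"),
    Channel("D", 16, "packet data and call signalling"),
]

total_kbps = sum(ch.kbps for ch in BRI)
print(f"Aggregate payload: {total_kbps} kbps")  # 2 x 64 + 16 = 144 kbps

That 16-kbps channel is the part that mattered for services: a genuine, extensible signalling protocol rather than the tones and hook flashes of the analogue loop.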
So what happened to ISDN? One big problem was business inertia. Monopolies have little incentive to innovate and embark on major capital-intensive projects regardless of the purported benefits; it was far easier to sit back and let the guaranteed profits flow in from the existing services. The other big problem, the one of interest to us now, is that ISDN made service innovation possible for outsiders, even users themselves. Here was a major threat to the core of the telco business model. While the downward-spiralling path had many bizarre convolutions, this is what ultimately doomed ISDN.
With no business leadership the technology teams were hopelessly adrift, so the fight drifted into the realm of the equipment vendors, who filled the vacuum with unproductive squabbling in a battle for competitive positioning. There were even several attempts to dumb down ISDN by effectively turning the signalling into a digital analogue of tip-and-ring (so-called stimulus signalling). It was ugly, pointless and fruitless.
The issue didn't die until the internet and early forms of DSL exploded onto the scene in the 90s. The world had gone IP mad. Primitive versions of internet voice (VoIP) then appeared and began to deliver what ISDN had promised decades earlier. Unloved, ISDN crawled into a corner and died. No one noticed.
With VoIP on the scene and with no more investment in legacy switching infrastructure, the battleground for services has moved on to mobile, VoIP and the core of the network. You should not expect any new services on the legacy telephone network.
Before ending, I must mention that enterprise telephony has been far more innovative than residential. The reason is that, as with residential, once the demarcation point moved outside the building there grew a booming industry in private branch exchanges (PBXs). Also, the trunk varieties used to interconnect medium and large enterprises to the network carried more signalling services than residential 2-wire loops. A lot of innovative services came to the business world because of this. Local company Mitel built its fortunes in this market a generation ago. However, since VoIP and the internet are similarly rolling over this market sector, I will not delve further into the history of PBX technology.
In my next article I'll talk about Intelligent Networking (IN), which is the network-side complement to ISDN. That technology was nearly as stillborn, but it further shows how telcos strove, unsuccessfully, to fence in service intelligence and its revenue. Later I'll talk about each of the other new service playgrounds: mobile and VoIP.
Labels: Technology
Monday, January 5, 2009
Controlling the Phone - Part 1: Demarcation
Right before Christmas I promised a series on how telephone companies use technology to control you, the subscriber, and their own business interests. I am pushing aside political and regulatory activities to some extent in this series so that I can concentrate on technology, knowing full well they are inextricably tangled. By focusing on the technology I believe I can remove some of the emotionalism that inevitably creeps into this topic.
To begin this series I will talk about the demarcation point. This is the invisible point (or line) that separates your sphere of control from theirs. For classical, wired residential service in Canada and the US the demarcation point is usually on the outside wall of the residence. The location for apartment and condo dwellers is typically indoors but outside the individual unit. (Wireless mobile service is quite different with respect to demarcation, so I will cover that in a separate article.) Within your sphere of influence you can do as you please provided you do not damage the telephone company's network. It wasn't always this way.
Historically there was no demarcation point. The telephone company's network extended right up to the telephone appliance, including all the inside wiring. If you go back 100 years this actually had technological justification: telephony was new and standards were non-existent, so the only way to assure proper operation was a totally integrated network from a single source.
AT&T took the lead by becoming both an operator and a manufacturer while acquiring smaller outfits, including some in Canada. As a result, by the middle of the 20th century the standards for telephony equipment simply became the way that AT&T did things. Because of their dominant position other operators and manufacturers gradually adopted AT&T's specifications. This was less true on other continents, with the reasons being fodder for a future article in this series.
As time passed, we reached a point where dominant, near-monopolistic telephone companies used the technological argument of network protection long past the time it had much technical justification. Regulators and entrepreneurs knew it, and also understood how it was being used as a barrier to competition. The regulators, then as now, did not have the technical knowledge to effectively counter those arguments. Instead, it was the courts that ultimately changed the playing field through some landmark decisions in the US (further reading: Hush-a-Phone, Carterfone).
In my own lifetime I can remember when my parents had to provide a doctor's certificate to the telephone company to lease a long telephone cord so our one phone could be carried to a sick family member's bedside. Today this sounds laughable, almost inconceivable. They did this as a means to protect their revenue from extension phones: if you have a long cord there is less incentive to subscribe to extension phones. Of course they did charge a monthly fee for the long cord, but it was more affordable than a second phone.
That is the absurdity of the demarcation point in action. As a result of the referenced landmark decisions, the demarcation point was gradually pushed back. First came the telephone sets themselves, permitting competitive phones; this led to speaker-phones, wireless phones, computer modems and more. Next came the inside wiring, permitting extension phones to go wherever the subscriber desired, or simply letting you take the one phone from room to room by plugging into a phone jack (before this there were no jacks, or they were prohibitively expensive). Homeowners could also do their own wiring or hire a contractor. Finally, the telephone company was pushed outside the house entirely. All of this was enabled by technical standards enforced by the regulator, in many cases under force of court order.
This is all behind us now, yet the telephone companies fought these standards every step of the way. Even when they knew they would have to yield they still sought to slant the regulations, aiming to reduce the functionality that competitors could offer relative to their own equipment, or to increase the expense of compliance. It was not a pretty process; it was akin to a decades-long mud-wrestling contest.
Look at the back or underside of any telephone, modem, computer or other device that plugs into a phone jack. You should find a CRTC/CSA or FCC compliance sticker indicating that the device complies with the relevant regulations, which ensure the device will not harm the telephone company's network. Notice that the sticker says nothing about the quality of the equipment, or even whether it will work as intended, only that the network won't be harmed by it. The regulations protect the telephone company, not the consumer.
The next article in this series will be on residential telephone signalling systems.
Labels: Technology
Friday, January 2, 2009
Persistence of Corporate Culture
Welcome to 2009. I want to say a few words about Air Canada's widely-reported failures over the holidays in the service they rendered, or more often failed to render, to their customers. I believe there is a lesson to be learned here that extends well beyond this one company.
Consumers' Association of Canada president Bruce Cran said most of the complaints are about Air Canada's response to the situation.
"I don't hear anyone complaining about the weather, it's the treatment that they were given, or the lack of decent treatment by the airline in question," Cran said.
More specifics are in the article itself, but to me the greatest failing is one of honesty: Air Canada has a long-standing habit of lying to their customers. Examples of this can be found at other airlines, though perhaps not with the persistence and dogmatism that Air Canada has exhibited, and for so long a period of time. It is, in a sense, a seemingly inseparable part of their corporate culture.
Going back well over 20 years, from when I first started flying frequently to now, Air Canada has behaved this way. This is astonishing when you look at all that has happened over that long span of time. The company itself has been transformed, in stages, from a tightly-regulated, market-dominating crown corporation to a loosely-regulated public company with a far less dominant market position. It has seen competitors come and go, it acquired one large competitor (Canadian Airlines), its industry partnerships have shifted to and fro, and foreign operators have been given increasing freedom to operate on Air Canada's home turf.
Through all of this they have retained their original arrogance and disdain for their primary revenue source: us. It may have originated from the natural self-centredness of government bureaucracy or from their near-monopolistic historical roots, yet it is still there. Individual employees have gone against the trend and rendered superior service, in my experience and in countless other cases. Unfortunately they stand out all the more for their rarity.
I would characterize this one element of their corporate culture this way: A dollar in our pocket today is worth more than the promise of two dollars in our pocket tomorrow. In other words, once they sell a ticket they will do anything, not stopping even at outright lies, to keep that money, even at the expense of driving away that customer's future business. Underneath this I think you'll find that they believe that customers have little choice and therefore they have little to lose.
Let's catalogue some of the ways they lie. All of these are intended to hold onto your money after you've purchased a ticket.
- If a flight is delayed because the plane was held up elsewhere, they will tell you it will be there in 15 minutes when in reality it is still on the ground in a far-distant city. They want to prevent you from seeking a competitor's flight until you have no option except to wait for the delayed flight.
- They will push back from the gate and sit there, sometimes for hours, knowing they cannot take off due to local or distant problems with air traffic and weather, and keep telling you it'll only be a few more minutes. If they stay at the gate you have the option to disembark, which they want to prevent.
- A flight is canceled due to equipment failure and they put you on the waiting list for another flight, telling you there are lots of seats available when they know there is almost no chance you can get a seat. Again, they are discouraging you from seeking a competitor's flight.
- They issue boarding passes to two passengers for the same seat and try to shift the blame onto one or both passengers. At this point they can bump you to another flight, having successfully kept you from seeking competitive alternatives.
Why do they do this when so many other airlines are open and honest with their customers? They seem to get away with it, to a degree, although their business has suffered. How is it that through decades of turmoil in the airline industry and in their own company, and frequent turnover of senior management, this culture of dishonesty persists?
I don't know the answer. I can point to similar examples in other industry sectors, and I've even worked for companies exhibiting similar long-term dysfunctional behaviour. If Air Canada has not been cured of its inveterate lying by now, there may be little hope. The only recourse is for the market to punish them. I am doing my part by avoiding Air Canada whenever I have a choice.
Labels: Business