I decided to rush out a brief post before my holiday break when I read this article. I fully sympathize with the need for more broadband competition and for broadband deployment to under-served areas; it's just that we should not confuse the two, since their respective drivers -- competition and ROI (return on investment) -- are completely different animals.
Line sharing (aka network access unbundling) can improve broadband competition in currently served areas by driving down prices and creating incentives for better service where facilities-based competition is limited -- this is the reasoning behind GAS (gateway access service) in Canada. However, not only will line sharing fail to create an incentive to deploy broadband to under-served areas, it will actually slow deployment to those areas.
I would have thought this would be obvious, but perhaps it isn't. I think the discussion has been confused recently by reports about AT&T's declining investment in their networks, which is occurring even in the face of a moderately competitive market for mobile wireless in the US. The contention seems to be that the present level of competition isn't sufficient to prod AT&T to maintain or increase their capital budget, and that the regulator should therefore encourage investment by other means. It must be said that the criticism of AT&T may turn out to be misplaced: it could very well be that they are making a sufficient investment in their wireless data network and that the decreases are elsewhere in their budget; AT&T is a diversified corporation with several major lines of business.
Getting back to the topic at hand, if line sharing for currently-served areas ends up reducing the price for broadband service (or increasing investment in the service side of the business), there will be less money available for capital expenditure in the network. Of course one could argue that the carrier should find cost savings elsewhere in their operations to compensate for the reduced revenue, although that is easier said than done when you consider just how much they have been trimming the fat (including massive headcount reductions) over the past several years.
If you then combine this price pressure with the loss of service revenue due to line sharing on new technology -- deployment to under-served areas or upgrades of existing networks -- the business case for capital investment gets much tougher: same or higher costs, and less potential cumulative revenue from served customers. This situation in the US has its parallel in Canada, as exemplified by the recent government decision to overturn the CRTC ruling on GAS for higher-speed technology. I discussed this point in an earlier article.
With that, I am out of here. My next post won't be until after the coming Earth perihelion in early January, assuming we survive our annual close brush with the Sun. Enjoy the holiday.
Wednesday, December 23, 2009
How Rogers Wireless Can Win
In my previous post I said that it is possible that Rogers will benefit, not lose, from Wind's entry into the Canadian mobile wireless market. While I did briefly touch on the reason -- roaming charges -- I now think it is an interesting enough point to deserve elaboration. That is the purpose of this article.
As mentioned before, Wind's network will take some time to roll out, whether across our urban areas, the suburbs, transportation corridors or the great expanse of rural areas. They make up for their current low geographic coverage through a roaming agreement with Rogers Wireless, the only one of the big 3 to use the same radio technology as Wind: GSM. The CRTC has mandated that the incumbents must enter into these roaming agreements with the new entrants, and that the roaming charges in those agreements must be reasonable (probably cost plus a small profit, although I have not looked into the actual language). The quoted rate is $0.25/minute, which appears to meet the requirement.
Let's look at a sample scenario using round numbers that I believe are good approximations to many typical cases, so we can see how Rogers' business changes when subscribers switch to Wind. The format of my analysis should be usable by anyone interested in doing some digging and plugging in figures more accurate than my approximations.
The market shares of the big 3 are not equal, but let's assume that they each have 1/3 of the total market and that their subscribers are all equally likely to switch to Wind. Now we can look at what happens when one subscriber from each -- User R (Rogers), User B (Bell) and User T (Telus) -- switches to Wind.
If each subscriber's current bill is $50/month (not including taxes), each carrier has their revenue reduced by that amount. Wind charges less, so their revenue increases by $90 when they charge these subscribers $30/month. We can now back up and exclusively look at the impact on Rogers -- we'll come back to the relative impacts on Bell, Telus and Wind at the conclusion of this analysis.
Rogers has a marginal cost to support each customer, mostly due to billing and customer service (terrible as it is), that goes to personnel, systems, mailings and so forth. If we assume that this cost is $6/month/subscriber, the net revenue loss is $44/month. I'm assuming that the ETF (early termination fee) that subscribers pay when they break their contracts to switch to Wind will cover the subsidy on any phone Rogers may have provided for free.
Each of the subscribers who switched to Wind will spend a portion of their air time on Rogers' network. For a typical user (and deliberately choosing nice, round numbers!) we can estimate this at 40 minutes/month. This works out to $10 for each of Users R, B and T, or a net revenue to Rogers of $30 (I am making the reasonable approximation that the roaming fees collected by Wind are forwarded to Rogers without any transaction fees). Rogers' net loss has now been reduced from $44 to $14/month.
If the 40 minutes of roaming represent 1/3 of a typical subscriber's total monthly air time, these three subscribers' combined roaming load is equivalent to one Rogers Wireless subscriber. Therefore the network cost for Rogers is the same as before Wind came on the scene: Rogers lost one subscriber, then gained three 1/3-subscriber traffic loads. The network load is not distributed the same way, since roaming happens mostly in outlying areas and rarely in urban cores. Even so, we can call it a wash since the overall long-term equipment impact is similar.
If the actual roaming load is less than what I've described, the roaming revenue declines and so does the network load. Since the marginal network cost is lower than the marginal revenue for every unit of air time -- if this isn't clear, think about it some more -- less roaming means less net revenue for Rogers; the consolation is that the lower network load allows some capacity increases to be deferred to a later date, which saves capital expenditure in the near term.
The final tally is a net loss of $14/month/subscriber, rather than the $50/month that had appeared on the subscriber's statement. This isn't bad! Ok, so while my earlier guess that they might show a net benefit was wrong, they still do very well. This becomes especially apparent when compared to Bell and Telus, which each show a net loss of $44/month (using the same set of assumptions). As for Wind, they are carrying a large debt load and are further burdened with a high ratio of capital expenditures to revenue. Of the four carriers, it should be clear from even this simple analysis that Rogers Wireless will be the net winner for at least the next year or two.
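For anyone who wants to play with the numbers, here is a minimal sketch of the tally above. Every figure in it is one of this post's round-number assumptions, not carrier data, so treat it strictly as a toy model.

```python
# Toy model of the scenario above. All figures are this post's round-number
# assumptions, not actual carrier data.

MONTHLY_BILL = 50.00     # what each switching subscriber used to pay ($/month)
SUPPORT_COST = 6.00      # marginal billing/support cost per subscriber ($/month)
ROAMING_RATE = 0.25      # roaming rate paid to Rogers ($/minute)
ROAMING_MINUTES = 40     # assumed roaming minutes per Wind subscriber per month


def rogers_monthly_change(switchers_per_carrier=1):
    """Net monthly change for Rogers when this many subscribers leave each of
    the big 3 for Wind and then roam on Rogers' network."""
    lost_margin = -(MONTHLY_BILL - SUPPORT_COST) * switchers_per_carrier          # -44 per lost subscriber
    roaming_revenue = 3 * switchers_per_carrier * ROAMING_RATE * ROAMING_MINUTES  # +30 from Users R, B and T
    return lost_margin + roaming_revenue


def bell_or_telus_monthly_change(switchers=1):
    """Bell and Telus lose the subscriber but collect no roaming revenue."""
    return -(MONTHLY_BILL - SUPPORT_COST) * switchers


print(rogers_monthly_change())         # -14.0
print(bell_or_telus_monthly_change())  # -44.0
```

Changing ROAMING_MINUTES shows how sensitive the result is to how much time Wind's customers actually spend outside Wind's coverage, which is the sensitivity discussed above.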
Perhaps that is why, of the big 3, Rogers raised the least stink when Clement overruled the CRTC to let Globalive turn up the Wind network.
Rogers said it believes competition is good for Canadian consumers. "We've always thrived in a competitive environment and we're ready to meet the competition head on," spokeswoman Odette Coleman said.

Too bad that a good deal for Rogers Wireless is not so good for us until Wind's geographic coverage is far better than it is now.
Labels: Business
Monday, December 21, 2009
Switching Carriers, Or Not
I hate to say it, but mobile wireless competition in Canada has not arrived. Even worse, it could take some time yet. Neither Wind Mobile nor the other new entrants to the market will have a significant impact until later in 2010 at the very least. It is also possible that competition could be set back by Wind offering service now when their network is only partially turned up, which could cause a backlash from their early adopters when they see what their bills are really like in the first months of service.
The threat of competition made quite the impression on the share prices of Rogers, Telus and BCE in the days after Minister Clement overturned the CRTC ruling on the foreign ownership of Globalive, which owns Wind. While it is said that the stock market is a future-discounting mechanism looking forward about 6 months, these declines are premature. I am beginning to doubt that their 2010 calendar-year earnings will be significantly impacted by Wind and the rest, and that the predictions of many analysts will therefore be proved wrong. Unfortunately this also means that the big 3 will not be quick to drop their prices in response to Wind's opening salvo of simple, low-price contracts (and full-price smart phones).
At the risk of stating a tautology, a key feature of mobile phones is the ability to, well, be mobile. If you try to do that with Wind's service you will pay a high price. Their coverage areas are (sensibly enough) in the urban cores of our major cities, but it will take time for that coverage to be extended to the suburbs, and perhaps much longer before they cover rural areas and major transportation corridors. If you are truly mobile, expect to pay a substantial amount for roaming ($0.25/minute plus long distance).
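To make that concrete, here is a rough, back-of-the-envelope estimate of what roaming could add to a Wind bill. The per-minute rate is the one quoted above; the usage figure is hypothetical, chosen only for illustration.

```python
# Back-of-the-envelope roaming estimate. The $0.25/minute rate is the one
# quoted above; the minutes are hypothetical.

ROAMING_RATE = 0.25  # $/minute on Rogers' network (long distance extra)


def monthly_roaming_cost(minutes_outside_wind_zones):
    return ROAMING_RATE * minutes_outside_wind_zones


# A commuter who spends, say, 200 minutes a month outside Wind's urban
# coverage would add this much to their bill, before any long distance:
print(monthly_roaming_cost(200))  # 50.0
```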
From what I've seen, it appears likely that Wind, slow as they are in getting their network in place, is further along in their geographical roll out than the others. Facilities-based competition is what we need, but it's expensive and it takes time. It may in fact turn out that Rogers' business will increase as Wind enters the market since they will have windfall revenue from Wind's customers who will spend a lot of air time roaming on their network. This revenue could more than compensate Rogers for the customers they lose to Wind since they will also gain air time from the subscribers Wind takes from Bell and Telus. (Wind subscribers can't roam on Bell or Telus networks since Wind uses GSM technology which, of the big 3, only Rogers supports.)
This situation reminds me of the mid-1980s when, for a few years, AT&T's biggest customer was MCI; MCI entered the switched long-distance market in the late 1970s as the FCC and the courts opened it to competition, but their geographical coverage was poor while they built out their network, so they leased facilities from AT&T, the former monopoly and now MCI's competitor.
It looks like I won't be switching to Wind anytime soon.
Labels: Business, Markets, Technology
Wednesday, December 16, 2009
Climate Change: Who Decides What Our Government Does
Everyone seeks to influence the decision-makers in our lives. This is true at work, within the family, and on matters of governance. On the other side, the decision-makers need input from experts, their constituents and even their social groups. However, when all is said and done, the decision-maker is responsible for making decisions.
That explains accountability and responsibility, but still leaves the decision-maker in jeopardy. No such position within any sphere of control is permanent and unchangeable. That is, decision-makers are judged and can be removed. When we speak of government leaders, change can come by the ballot box or by other means, largely determined by the prevailing system of government.
This makes it crucial for decision-makers to listen to their judges if they wish to remain in power. As I watch the protests going on in Copenhagen, I do wonder who the decision-makers are listening to, and who is most likely to sway them in their deliberations. Is it the protesters, whether peaceful or violent, or is it their electorates?
Leadership has a cost. If you go against the electorate to do what you believe is right, you may lose your position. If you follow the electorate -- policy-making by opinion poll -- you may keep your position only to be excoriated later for having chosen power over doing the right thing. Politics isn't easy! It is particularly difficult in the present circumstance since the electorate is uncertain -- we generally want to deal with the problem but are not ready to pay the price -- while the activists are certain (climate scientists, who are only rarely activists, aren't "certain" but strongly lean toward action).
As I write this, it appears that our decision-makers are listening to neither the protesters nor the scientists. I understand the former since protesters do not represent the majority, and seek to circumvent the lack of wide public support by confronting the politicians. I understand the latter since doing what is in a sense "right" will have a cost -- make no mistake about it -- on our standard of living, for the near term at least.
We can criticize our decision-makers, who are all easy targets, yet their job is difficult. The loudest voices seeking to influence them are the ones least able to determine their own political futures, even if they are right. The quieter voices are those of the electorate and the corporations that will pay the price for any radical action, since those are the ones whose immediate interests are threatened. Yes, the electorate's interests are tied to corporate interests whether or not this is acknowledged: we are mutually dependent on economic stability and maintaining profitability, jobs and living standards. Even so, in the long-run the status quo is unsustainable: fossil fuels will inevitably rise in price and climate change will occur to a lesser or greater extent, with all the consequential impacts on local and global economics and social order.
So, what should our leaders do? Go slow, go fast, or find some wishy-washy middle ground? Our current federal government seems to strongly favour the status quo. They may only change their position in order to align our actions with those of the world's major economies, which at least preserves our ability to trade and therefore offers the best economic protection. Either way, the climate will continue to change and further action will be deferred to a future time when the problem is more acute. Despite the howls of the protesters and the warnings from the scientists, the government will most likely win this round, where winning is measured when Canadians next go to the polls. Whether they, and we, are right is another matter.
Labels: Politics
Monday, December 14, 2009
Canadian Enough? Telecommunications Policy Objectives
It seems that my prediction that Globalive would get the nod from the federal government was correct. That doesn't make me feel omniscient since the outcome was just too predictable. The halfhearted negative response from two of the incumbents (Telus and Bell) tells me they expected this result as well.
For those paying attention, this was only one of two important telecommunications policy decisions on Friday by the government, both of which went contrary to earlier CRTC decisions. While in seemingly different industry segments -- wireless and DSL -- I believe these decisions were closely related. I'll get into this later in this article.
First, we should dispense with the Globalive decision, in particular regarding whether this company meets the rules for Canadian ownership and control. I believe the CRTC got it right when they ruled against Globalive. I also believe Clement got it right when he overruled the CRTC.
Clement stressed that Friday's announcement was not giving Globalive special treatment. "Let me state for the record, government is not removing, reducing, bending or creating an exception to Canadian ownership and control requirements in the telecommunications and broadcast industries," he said.

I am not being inconsistent by agreeing with both the CRTC and the Minister. The CRTC made a correct ruling (so far as I can tell) on the specifics of the law, despite pejorative comments that can be found on media comment threads and in various blogs:

"The CRTC's seemingly arbitrary decision to impose restrictions on Globalive..."

The only uncertainty I have is in regard to the fine details of the Globalive-Wind-Orascom corporate structure, with which I am unfamiliar. If you have ever had to deal with the legal specifics of shareholders agreements, voting rights, share classes and corporate decision procedures, you will know that much is hidden deep in these matters and can be difficult to unravel even by experts in corporate law. I cannot say how much of these specifics are understood by the Commissioners and the Minister, though I would think they understand it enough for their purposes.
Nevertheless, as a matter of national policy I think Clement got it right. As of right now it is reasonable to state that Globalive is Canadian owned and controlled. This may not be true tomorrow: it will depend on the gotchas in the corporate documents and debt instruments should Globalive fail to meet its future obligations to Orascom. That was why the CRTC correctly noted that Globalive's Canadian ownership and control situation is unstable. Clement disregarded the uncertain future in favour of increasing competition today, something which is sorely needed in the present market.
Here is where we can begin to tie this decision with the one on DSL. On Friday, Clement voided the CRTC decision from earlier this year that the incumbents will need to extend GAS (Gateway Access Service) to technologies that support higher data rates. My view is that both decisions by Clement were united by one policy objective, and one that I've discussed in earlier articles: to promote facilities-based competition, not competition by forcing sharing of incumbents' networks. Globalive is building telecommunications infrastructure, and the government wants exactly this. GAS encourages the opposite by extending ISPs' abilities to stay in business with minimal network investment while discouraging incumbents from investing in infrastructure since the return on their investment will be reduced.
"With access to advanced broadband services denied, MTS and small ISPs are now under pressure to invest more in their own infrastructure."As a broadband consumer who uses an ISP that takes advantage of GAS, this decision is not to my personal advantage. Even so, I have to recognize that in the long run it is to the benefit of the country. There can be no guarantee of this: the decision reduces competitive options for consumers today, but it is merely neutral on creating investment incentives for ISPs other than those operated by the incumbents. That is, Clement's decision limits ISP dependence on the incumbents' networks but provides no other specific incentive to invest, except insofar as they may feel compelled to do so if they wish to remain viable well into the future.
I will now return to the Globalive decision's implications on Canadian ownership and control of telecommunications networks and services to better understand the future implications. Some believe we are on a slippery slope to seeing this and other critical infrastructure open to being acquired by foreigners.
"It has established an enormous precedent going forward as to how people are supposed to interpret our Canadian ownership laws," said Michael Hennessy, [Telus] senior vice-president of regulatory and government affairs.I disagree with this sentiment. The present law only made sense when the industry was run by monopolies or at most two or three major corporations. For the purpose of national security and economic stability, in the past we did need to prevent foreign control of these companies. (Students of history will know that this was not always true since Bell Canada, BC Tel and others were foreign-controlled at times, and no catastrophes ensued because of this.) However, with competition it is less important that each individual company be Canadian owned and controlled: if Globalive defaults on their debt and Orascom takes it over, the overall wireless industry will still be majority owned and controlled by Canadians.
It is correct to ask what is to become of the ownership and control requirements contained within the Telecommunications Act. Perhaps the right solution is to require that some threshold percentage of each critical industry segment -- wired and wireless -- be Canadian owned and controlled in aggregate. This suggestion has its own difficulties but may be preferable to the present law, and it could provide a workable balance between promoting competition and protecting the national interest.
It will be interesting to see if this government opens up the law to amendment. I don't think it's a priority of theirs, so for the time being they may be content to deal with related issues ad hoc. I doubt that the government much cares for the sensibilities of the CRTC Commissioners and will think nothing of overruling them again in future.
Thursday, December 10, 2009
Excessive Corporate Profits ... Yum!
I am amused by stories like this one that demonstrate how some people don't understand how businesses operate within the total economy. It is only possible to be horrified by corporate profits if you assume that capitalism and free markets are evil, with money being transferred from our pockets to the balance sheets of large corporations. That isn't how it works.
Banks don't earn money. Really. A bank, like any company, is a fictitious entity that only exists on a piece of paper. A bank does not drive fast cars, do drugs or chase after loose women. It's the bank's owners -- the shareholders -- that earn money from the company's profits (dividends), and capital gains (hopefully!) from increases in enterprise value. Society as a whole gains through taxation of those corporate profits, which, conceptually at least, keeps taxation of private individuals at a lower level, unless of course you're a bank shareholder and must pay taxes on dividends and capital gains.
Investors have an easy solution to this phantom problem: buy shares in companies that make obscene profits. Credit card interest rates and bank fees too high while savings interest rates are too low? Buy bank shares. You whimper in pain every time you fill the tank on your SUV? Buy shares of petroleum producers. Your cable bill has gone up again and again? Buy shares in the cable companies. When done well, you can gain more through investing than you lose through being a customer of those companies.
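To put some hypothetical numbers on that argument (the fee amount and dividend yield below are invented for illustration, not any bank's actual figures), here is roughly how much stock it takes for dividends alone to offset a given amount of annual fees:

```python
# Illustrative only: hypothetical fee and dividend-yield figures, chosen to
# show the shape of the trade-off rather than any particular bank's numbers.

def stock_needed_to_offset(annual_fees, dividend_yield=0.04):
    """Dollar value of shares whose dividends alone cover the fees you pay."""
    return annual_fees / dividend_yield


# Paying $300/year in bank fees? At an assumed 4% dividend yield you would
# need about $7,500 of bank stock for the dividends to cover it (ignoring
# taxes and any capital gains or losses).
print(stock_needed_to_offset(300.0))  # 7500.0
```

The catch, of course, is that you need the capital in the first place, which leads to the next point.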
Not to be too dismissive about these companies' prices, there is one segment of society that is hurt: the poor. They generally are unable to invest enough to compensate for the price increases while also maintaining their standard of living. While this has no easy answer, at least corporate profits generate government tax revenue (both corporate taxes and taxation of individual shareholders who partake of the banks' profits) that is then used to provide social services and keep the marginal tax rates low for those at the lower end of the income scale.
I am just not able to feel any outrage at all when I see stories like the one referenced above. It simply isn't a problem.
Wednesday, December 9, 2009
Unfairly Tossing a Fair Coin
I had a good laugh while reading this CBC article about teaching students how to cheat at coin tossing. The article does not relate the ways by which this can be done; however, the techniques have been well understood, and used or abused, for a very long time. If you are unfamiliar with the "tricks" involved, it may be difficult to picture how it could be done; that is, how the tosser can cheat without rousing the suspicion of observers.
In a traditional coin toss the coin is launched from a flat position with the spin imparted by an off-axis force from (usually) the thumb. A coin flipped in this way has two superposed motions: a gravitationally-determined parabolic arc and a rotation about an axis that contains a diameter of the coin. The result is a coin spinning rapidly end-over-end and travelling through the air in an arc. Since the spin about the coin's diameter is many times faster than the coin's travel along its arc, and since neither the spin rate nor the flight time is precisely controlled, the result of the toss -- heads or tails -- can be said to be random. It isn't entirely random: it is possible that a tosser with a sufficiently fast eye and hand could interrupt the coin's flight at just the right moment to reach a desired outcome. However it is good enough to be an uncontroversial way to decide the kickoff in a football game.
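To see why this works, here is a minimal sketch of the idealized traditional toss: end-over-end spin, coin caught at launch height, air resistance ignored. The spin rates and launch speed are illustrative guesses, not measurements.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def toss_result(spin_hz, launch_speed, start="heads"):
    """Idealized end-over-end toss: the coin leaves the thumb spinning about a
    diameter at spin_hz revolutions per second, rises and falls under gravity,
    and is caught at launch height. It shows its starting face whenever its
    normal is within 90 degrees of straight up."""
    flight_time = 2.0 * launch_speed / G                  # time to go up and come back down
    total_angle = 2.0 * math.pi * spin_hz * flight_time   # rotation about the diameter
    same_face = math.cos(total_angle) > 0.0
    other = "tails" if start == "heads" else "heads"
    return start if same_face else other


# Tiny changes in spin rate (or launch speed) flip the result, which is why an
# ordinary toss is effectively random -- and why a practiced tosser who can
# reproduce both quantities can bias it.
for spin in (19.0, 19.5, 20.0, 20.5, 21.0):  # revolutions per second
    print(spin, toss_result(spin_hz=spin, launch_speed=2.5))
```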
That was a simple toss, but there are other ways to spin a coin. Further, these various motions can be combined to create some complex, seemingly-random spins that are often easier to manipulate. If you want to find out all the ways to spin a coin (and cheat) I recommend you read the description provided by the physicist and statistician E. T. Jaynes. He went so far as to train himself in the various techniques and then test the results. From the raw data of many, many hundreds of trial runs, he became very accomplished at achieving almost-certain outcomes with tossed coins. The book where his description can be found is called Probability Theory: The Logic of Science, which you may want to avoid purchasing since it is heavy, large, expensive and highly technical. Happily, early drafts of the text are available online. One place is here (browse down to chapter 10) provided you have a Postscript viewer handy. If not, do a web search and you will find versions in other formats, though you may have to hunt a bit in each one since there is no consistency in the work's structure and content across its many versions. You'll also need a good imagination since Jaynes provides no diagrams, just precise descriptions.
If you don't want to do all that work, let's review a few of the spin types that a fair coin can exhibit. The most trivial is where the spin axis goes through the coin's centre and is orthogonal to the coin's surfaces; in other words, the spin axis is at a right angle to that in a traditional coin toss. No matter how much the coin spins, the result of the "toss" is trivially identical to the face that is showing at the start.
Next, we can launch a coin spinning on its orthogonal axis along the same parabolic arc as in the traditional coin toss. However, if no additional spin component is employed, this form of travel will not affect the outcome: it will still show the same face as if the coin were not launched. You can try this by placing a coin in your palm and accelerating it upward without changing the attitude of your palm. You could also do it with your thumb but it is more difficult to centre the force so that the coin does not tumble. In either case, this is difficult to do while also imparting spin around the orthogonal axis.
A slight variation is to launch the coin so that the same face of the coin points inward (along the coin's orthogonal axis, and approximately toward the parabola's focus). This is much like the tidally-locked orbit of the Moon around the Earth where the same face (but not quite due to libration) always faces the Earth. This is nearly as easy as a flat toss, but this time the outcome of the coin toss will be opposite to the face shown at the start.
Now it gets more complicated. Consider a spin axis that is intermediate between the two preceding cases of the orthogonal and diameter axes. The resulting motion is a wobble, much like what you get by spinning a plate on a table if you launch it at an angle to the surface (so that the plate touches the table at only one point). You do this with a coin by applying the force from your thumb off to one side of the coin while simultaneously pulling your thumb towards yourself. You may have to brace the coin with another finger so it doesn't slide off your thumb, or you can direct your thumb's motion at an angle off the vertical. It can be learned with only a little practice.
The wobble amplitude (the angular difference between the highest and lowest excursions of the rim at any fixed point at the coin's edge) is important to creating the impression of a fair coin toss; to an untrained eye a large wobble is hard to distinguish from a spin about the diameter axis, if the spin rate is high enough. However if you think about it for a moment I think you'll see that the outcome of the toss is exactly the same as the original, trivial case where the spin axis is orthogonal (no wobble, and the top face doesn't change). Combine this with a flat or tidally-locked arc and you can get any outcome you desire. According to Jaynes, depending on the resiliency of the landing surface you can often let the coin land on the floor without greatly reducing the predictability of the outcome. This tactic also has the advantage of allaying the doubts of observers about the toss's fairness. It is of course better when the surface is soft enough that the coin doesn't bounce, such as a natural-turf football field.
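For the geometrically inclined, the wobble claim can be checked with a small sketch of torque-free precession: in free fall the coin's normal keeps a fixed angle to the (space-fixed) angular momentum vector and precesses around it, so if that cone is narrow enough the starting face stays up for the entire flight. This is a simplified model I am adding for illustration; it is not Jaynes' own treatment, and the 45-degree threshold it exhibits is a property of this idealization.

```python
import numpy as np


def same_face_stays_up(psi_deg, samples=3600):
    """Torque-free precession sketch for a wobbling coin launched heads-up.

    The coin's normal keeps a fixed angle psi to the space-fixed angular
    momentum vector and precesses around it. Sweep one full precession cycle
    and check whether the normal ever dips below the horizontal, i.e. whether
    the up-facing side ever changes during the flight."""
    psi = np.radians(psi_deg)
    n0 = np.array([0.0, 0.0, 1.0])                     # normal starts pointing straight up (heads up)
    L_hat = np.array([np.sin(psi), 0.0, np.cos(psi)])  # angular momentum direction, angle psi from the normal

    min_up = 1.0
    for phi in np.linspace(0.0, 2.0 * np.pi, samples):
        # Rodrigues' rotation of n0 about L_hat by the precession angle phi
        n = (n0 * np.cos(phi)
             + np.cross(L_hat, n0) * np.sin(phi)
             + L_hat * np.dot(L_hat, n0) * (1.0 - np.cos(phi)))
        min_up = min(min_up, n[2])
    return min_up > 0.0  # True: the starting face stays up for the whole flight


for angle in (10, 30, 44, 46, 60):
    print(angle, same_face_stays_up(angle))
# In this model the starting face never changes as long as the wobble angle is
# under 45 degrees, which is what makes such a toss controllable.
```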
There is more to it if you want to get into even more arcane combinations of spin axes that give highly deterministic outcomes while appearing to have random, unpredictable flights. Jaynes pretty much goes through all the possibilities in his book so I'll leave off here and let the curious read what he has to say on the topic.
Labels: Science
Tuesday, December 8, 2009
Government Policies to Push IP-based Communications
Last week the FCC in the US publicized a proceeding to solicit questions on the content of a forthcoming NOI (notice of inquiry) on VoIP and IP communications. In effect they are asking the public to tell them what questions they ought to be asking.
"...appropriate policy framework to facilitate and respond to the market-led transition in technology and services, from the circuit switched PSTN system to an IP-based communications world."This first quote from the short (3 page) document seems appropriate. They recognize that communications is increasingly occurring on packet-switched networks, and they are concerned that their policies and regulations may hinder this market-driven change. So far so good.
"...we seek to understand which policies and regulatory structures may facilitate, and which may hinder, the efficient migration to an all IP world."This second quote is very similar in tone to the first, but with the addition of a desire to get into details. This is also very fair and so quite reasonable. I believe that the real, underlying problems are not so straight-forward, and I expect that this will muddle the long process of public engagement they are beginning. One important reason for this is that while the public is being invited to respond, as usual in these matters it will be the industry players, especially the largest incumbent corporations, with the greatest budgets and motivation to participate.
This is not to say the FCC will not try to be fair. I have dealt with the FCC in the past and they are for the most part eager and able to do good work. They will nevertheless find it difficult to achieve a fair outcome. To see this we need only go back in time to earlier initiatives, such as, for example, the attempt to achieve equitable competition in the local telephony market with the 1996 Telecommunications Act.
Look around now and you'll see that after nearly 14 years we have a largely reconstituted oligopoly of massive telcos surrounded by a host of mostly tiny niche competitors. The only competitors with any heft are the cable companies, but they needed very little of what the Telecom Act engendered to be successful. It was enough that they were permitted to enter the telephone market and able to secure interconnection with other phone companies. Almost everything else in the Act was about unbundling and securing access to the equipment and facilities of the telcos. With their own facilities-based networks, the cable companies didn't need this; what delayed their entry into the market was more mundane: upgrading their technology and mustering the will to compete.
In that light I have to wonder what this FCC proceeding can realistically accomplish. VoIP already exists both as a technology and as a basis for driving down the price of telephone service through regulatory arbitrage, by riding on the broadband and (more recently) wireless consumer services provided by the incumbents. Will the FCC dismantle the existing plethora of bizarre rules for inter-carrier settlements and fees for user-funded services like 911 and the Universal Service Fund boondoggle? Many of these persist for political reasons to support high-cost rural areas, but they are often misused and are poor at achieving their intended results.
I have come to strongly believe that regulations seeking to micro-manage the business operations of corporations cannot succeed; they only end up creating unintended consequences by distorting the normal operation of the free market. Regrettably, facilities-based competition is the only effective solution -- regrettably because it is expensive and takes a long time to flower. With effective competition there is no need for regulations about IP communications: let the market decide on the best technological solutions and services to serve and grow the market, since the public will have a real choice.
What the FCC should be doing is fine-tuning its regulations so that the major facilities-based telecommunications providers -- telcos, cable companies, mobile wireless and fixed wireless -- have a reasonably equal shot at success. If they can do that, they will have a far better chance not only of meeting their objectives but also of re-energizing this critical industry sector.
If you'll excuse one note of what I believe is justified cynicism, I am not focused on the US because it is an important and nearby market (which it is), but rather because what they do will be implemented in some fashion here in Canada about 3 years later. Only then will Canadians see effective telecommunications competition and some real choice. So let's cheer on the FCC!
Labels:
Business,
Politics,
Technology
Wednesday, December 2, 2009
A House Stands On Its Own
Imagine that you commission a house to be built for you and your family. All seems to go well during its construction. Every inspector has finally given the nod that all is well, and that the routine problems that crop up in any custom job have been addressed. The contractor is about to hand you the keys and collect your cheque for the final invoice when you learn something disturbing. Many of the tradespeople who built your house were not model citizens: some were murderers, swindlers of the elderly, child molesters and, yes, even washed-up politicians. What are you going to do?
If the inspectors have done their jobs, the house will stand. It will not suddenly disintegrate or have a lower value because the character of one or more of the workers is called into question. Regardless of character, even if one or a few workers were less than capable at their tasks -- say, attempting to hammer in a nail using a stapler -- presumably this was caught by the contractor or an inspector and the incompetent person was forthwith escorted from the job site. The work was inspected and the house will stand. Believing otherwise will only cause you to suffer monetary or other stress-related difficulties down the road. The house is a house regardless of what you choose to believe.
Now let's move on to a similar situation that is getting a fair amount of media attention: the hacking of the email and files of the CRU (Climatic Research Unit) at the University of East Anglia in the UK. Scientists and technicians associated with this group played a role in the IPCC's report on climate change. The leaked material shows that scientists are as human as you and me, complete with instances of braggadocio, rudeness, conflict (sometimes personal) and sloppy thinking. This is unsurprising. I would hope that no one still believes the stereotype of the grim scientist in a white lab coat humourlessly pronouncing incomprehensible equations and facts to anyone silly enough to engage them in conversation.
"The e-mail exchanges among several prominent American and British climate-change scientists appear to reveal efforts to keep the work of skeptical scientists out of major journals and the possible hoarding and manipulation of data to overstate the case for human-caused climate change."The above quote from a blogger at the National Post goes a little far in the interpretation of what the material reveals of real scientists doing real work. To claim fraud and malfeasance is not justified, or at least not yet. Like in the house example, it seems that the prospective owner is calling in more inspectors to inspect the house yet again, and the work of the previous inspectors, all because the poor character of some of the workers has come to light.
It is not surprising that hard-nosed skeptics and outright political opponents of the case for climate change would latch onto any opportunity to re-open the science with the objective of reducing its credibility. However, opposition is not equivalent to proof of anything. For now the house still stands. It may be that the foundations of the house are indeed rotten and every inspector on the job site has missed this obvious flaw, although that is improbable. More likely is that some of the building materials were defective or several studs were incorrectly secured. It would be surprising if it were not so, since perfection is unattainable even with the purest intentions and diligence from everyone involved. This is as true for science as it is for house construction, and flaws of that kind will not cause the house to fall.
The rhetoric is a distraction. I am content to watch while the inspectors go back in and poke around. Possibly they will suggest some remedial work, though even if it is not done I doubt that the building will fall. This is no house of cards, so a flaw here and there will not bring it down. So be prepared to pay up and move in when the latest batch of inspectors complete their review.
Friday, November 27, 2009
FCC Unbundling Revisited
It is said that in a liberal democracy the aim is equality of opportunity, while in socialism the aim is equality of outcome. Governments here and around the world have struggled with this dichotomy in developing and modernizing telecommunications regulation. The root of the dilemma is that telecommunications has in most countries become a monopoly or small oligopoly, with the government often taking a stake in these enterprises. The reasoning usually came down to a belief that telecommunications infrastructure, especially the "last mile", is a natural monopoly, and the realization that telecommunications is not only an important utility but one that is critical to society's functioning.
When the United States implemented local telephone competition in the mid-1990s, they recognized the need to tilt the playing field in favour of the new entrants, if only for a limited period of time. They correctly recognized that building a telecommunications infrastructure is extremely capital intensive, and that without some legislative assistance it would be difficult for new entrants to compete with incumbents whose capital spending was a far smaller percentage of their overall budgets. As an additional spur to competition -- to reduce the required capital intensity and to speed the availability of geographically-broad alternatives -- the incumbents were required to unbundle local loops and other parts of their networks.
This all happened during the Clinton administration, which was willing to manage outcomes to a limited extent, with the intention that the country eventually return to a state of equal opportunity in the telecommunications industry. They put the industry on notice by building sunset provisions for many of these intrusive measures into the 1996 Telecommunications Act. Similar laws and regulations were adopted in many other developed nations, including Canada, though to a lesser or greater degree in accordance with local political leanings. All this meddling is political, so it is no surprise that it was, and is, very much affected by the politics of the day. This is amplified in the US, where socialism is an insult more than simply a description of an economic system.
With this history in mind, I was surprised to learn that the US is considering entering the fray once more with a modified form of unbundling to increase broadband competition. While there is little doubt that more broadband competition in the industry is helpful to consumers, this degree of government intervention is treacherous.
Certainly the current US administration is closer in sentiment to the one that spawned telecommunications reform than the most recent one. However, I think they are picking their fights carefully, choosing to focus on a few very specific issues such as health care; intervention in the broadband market would only distract from higher priorities. I do not believe the FCC will be encouraged to mandate any further unbundling in the industry, not even the somewhat more benign form of DSL service unbundling that we have in Canada (Gateway Access Service). A form of it does exist in scattered locations in the US, but without the tight price and availability controls we have here. There are also technology issues with unbundling -- copper loops are not seen as sufficient for the market's evolving requirements -- since it presupposes a network architecture that may not survive far into the future.
Wireless is the main disruptor here, and governments are -- correctly, I believe -- looking to these technologies to provide infrastructure-level competition in the industry, across both wired and wireless: in Canada this will take the form of new cellular providers, and in the US it will be companies like Clearwire. This would relieve them of the political risk of trying to regulate separation of physical network components from retail services, which requires permanent and intrusive oversight.
For at least the present, the likelihood of that level of intervention is small in both the US and Canada. In Canada that also means we should not expect any expansion or improvement to GAS, ever.
Labels:
Business,
Politics,
Technology
Tuesday, November 24, 2009
Gold and the Canadian Investor
If you're a Canadian investor in the gold sector, it is quite possible that your returns don't seem to be as good as the headlines would have you believe. Gold is breaking to new highs every second day, but is your portfolio benefiting?
I know that last sentence sounds like a sales pitch, but it isn't; I simply want to talk about the gold market from a Canadian perspective since it is not the same as for an investor in the US, which is the source of much of the media noise. One big difference is the currency. Do not be misled by the quoted price for this precious metal -- or indeed for most commodities -- since prices are quoted in US dollars, and that currency is faring poorly. Consider the adjacent 12-month charts: one for the commodity and the other for the loonie's exchange rate with the US dollar.
In rough numbers, gold has risen from about US$800 to US$1,150 (the quoted ETF -- GLD -- is an approximate proxy for the underlying commodity). During the same period the loonie has risen from about US$0.81 to US$0.95. If you invested in commodity futures or an ETF, your return in Canadian dollars is about 23%, while gold itself rose about 44% in US dollars. In other words, for every 1% rise in the US-denominated gold price, your investment returned only about 0.5%. That's one good reason, as a Canadian, to be wary of the hype surrounding commodity bull markets.
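For anyone who wants to check the arithmetic, here is a minimal sketch in Python using the rough figures above (not real market data); it shows how the rising loonie eats into the Canadian-dollar return:

```python
# A minimal sketch of the currency adjustment described above.
# The prices and exchange rates are the approximate figures from the post.
gold_usd_start, gold_usd_end = 800.0, 1150.0   # US$ per ounce, roughly a year apart
cad_usd_start, cad_usd_end = 0.81, 0.95        # US$ per C$ at the same two dates

usd_return = gold_usd_end / gold_usd_start - 1

# Convert both endpoints into Canadian dollars before computing the return.
cad_return = (gold_usd_end / cad_usd_end) / (gold_usd_start / cad_usd_start) - 1

print(f"Return in USD: {usd_return:.0%}")   # ~44%
print(f"Return in CAD: {cad_return:.0%}")   # ~23%
```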
There is an alternative to the commodity and that is to invest in the producers: the gold miners. Consider the adjacent chart of one popular gold miners index. (Yes, it is a US index, but many of the world's major gold producers are Canadian, trade on exchanges in both the US and Canada, and, in general, perform similarly.)
The chart's shape is similar to that for the commodity, except look at the numbers. Over the same 12 months, the index has risen from about 23 to 51, an increase of 122% in USD and 89% in CAD. This is unsurprising since company profits typically move as a multiple of the commodity price. The reason is simple: if it costs a producer $500 to produce an ounce of gold, the profit is $300 when the commodity sells for $800 and $650 when it sells for $1,150. That leverage is why the producers outperform the commodity. Of course, and most importantly, the same reasoning works in reverse when the commodity price declines. The downside risk is in proportion to the upside opportunity!
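A minimal sketch of that operating leverage, using the same illustrative $500 production cost, makes the point numerically:

```python
# A minimal sketch of the operating-leverage argument above. The $500/oz
# production cost is the illustrative figure from the post, not a real number.
COST_PER_OUNCE = 500.0

def profit_per_ounce(gold_price):
    return gold_price - COST_PER_OUNCE

low, high = 800.0, 1150.0
commodity_gain = high / low - 1                                   # ~44%
profit_gain = profit_per_ounce(high) / profit_per_ounce(low) - 1  # ~117%

print(f"Commodity gain: {commodity_gain:.0%}")
print(f"Profit gain:    {profit_gain:.0%}")
# The same multiple works in reverse: a drop from $1,150 back to $800 cuts the
# profit per ounce by more than half while the commodity falls only ~30%.
```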
My message is simple enough: if you are a Canadian who feels tempted to invest in gold, think carefully before investing in the commodity itself. The potential returns (and risk) are lower than investing in the producers. If you are willing to manage the risk, invest in the miners, either directly or via a fund. The advantage of a fund is that you are less exposed to operations problems that plague every miner from time to time.
Having said all the foregoing, keep in mind that this entire discussion hinges upon the price of gold continuing to track the US dollar. The strong correlation is only partly due to investors running to gold as a hedge against US public debt, which is helping to devalue their currency. This situation can change, and quickly, so act cautiously with your money.
Labels:
Markets
Friday, November 20, 2009
More Bad News For Android Apps
Stories about the difficulty, and even futility, of making decent revenue from apps published on Google's Android Market are legion. I am no exception, having written most recently on the topic earlier this week. I want to draw attention to this story about Gameloft and their negative experience. Without meaning to demean the longstanding complaints of 1-man and 2-man app shops, I think it's important when even a mid-size commercial app developer -- in this case, games -- expresses the same concerns:
"We have significantly cut our investment in Android platform, just like ... many others," Gameloft finance director Alexandre de Rochefort said at an investor conference...mostly due to weaknesses of Android's application store..."Google has not been very good to entice customers to actually buy products. On Android nobody is making significant revenue," Rochefort said.The reason for my concern is that I expect that larger, more-established companies have better and more frequent opportunities to engage with the organizations such as Google that individual developers do not. We know very well that the folks at Google Checkout and Android Market are notoriously uncommunicative and unhelpful with their customers: merchants and ISVs. I do not know if an outfit like Gameloft has better communications channels than the larger community, however my own business experience says that it is reasonable to assume that they do. If they do not, that is a further worry about the commercial maturity of these Google operations.
So when Gameloft says they can get no joy from the Android Market and are going to abandon products for the Android platform for at least the short term, I have to wonder what they know that they are not telling us. Are they getting unacceptably vague feedback from Google regarding improvements, or even a complete unwillingness to talk? Are they unimpressed with the prospects for brand-name alternatives, and especially carrier app stores?
Hope is not acceptable in a business plan. Tying a company's success to the unfulfilled promise of Checkout and Market must not be absolute: contingencies are required. I interpret Gameloft's message as saying that they see no suitable alternatives on which to plan their business. That's troubling.
Labels:
Business
Wednesday, November 18, 2009
Asymptotic Pricing: Unattainable Cell Phone Rates
"...Recipriversexcluson: a number whose existence can only be defined as being anything other than itself." This fictional entity, which was invented by Douglas Adams, is at the heart of Bistromathics -- also an Adams' invention -- as described in the third book of his five-book trilogy: The Hitchhiker's Guide to the Galaxy. Although entirely fictional, the idea of a recipriversexcluson is not entirely useless to the better understanding of cell phone service pricing, as we will soon see.
Cell phone plans and their pricing can be terribly confusing and opaque. In this post I will not be attempting to cut through all that nonsense; it needs doing, but not here. Instead I will focus on one core portion of the pricing model: voice minute rates. That is, the actual amount you are paying per minute to use your cell phone. So let's drill down into that a bit further. To start, I will consider a simpler model that exemplifies the concept I want to explain, which is -- to coin a term -- asymptotic pricing.
In the past -- and perhaps still today, I don't actually know -- Chapters book stores offered a discount card. How it worked is that for $25 you got 10% off all your book purchases. The term was one year (if I recollect correctly). To many book lovers that may at first blush seem like a good deal. The first time I was offered this card by the cashier it did momentarily pique my interest. However a few moments' thought was enough for me to find the flaw in the offer, so I smiled knowingly and politely declined. Let's look at how it works, with the aid of a graph.
You pay $25 up front, so you are immediately in a losing position. To dig yourself out of this hole you must buy books, lots of books. How much? Well, to break even you must purchase $250 worth of books (before sales taxes) since 10% of $250 is $25. That is, you pay $225 for the books plus $25 for the discount card, which equals the undiscounted total of $250. Of course, beyond that you do begin to see real savings. As an aside, note that for Chapters there is a marketing advantage in that you are motivated to buy lots of books, perhaps more than you otherwise would. If you do, they win, and if you don't, they also win. Win-win for them and lose-sorta win for you.
Regardless of how much you spend on books the one total discount you will never see is 10%. You can get arbitrarily close to that number by buying ever more books, but even if you buy all the books in existence, and even all those that existed in the past and that will exist in the future, you will never save 10%. That is because the initial outlay of $25 doomed you as surely as the poor horse whose rider dangles a carrot in front of its nose to make it run faster; the horse will not get the carrot no matter how fast and far it gallops.
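To make the arithmetic concrete, here is a minimal sketch in Python of the effective discount as a function of how much you spend; the $25 fee and 10% rate are the figures from the card offer described above:

```python
# A minimal sketch of the discount-card arithmetic described above:
# a $25 card that gives 10% off all book purchases for its term.
CARD_PRICE = 25.0
DISCOUNT = 0.10

def effective_discount(book_spend):
    """Net savings as a fraction of the undiscounted spend (can be negative)."""
    if book_spend <= 0:
        return 0.0
    net_cost = CARD_PRICE + (1 - DISCOUNT) * book_spend
    return (book_spend - net_cost) / book_spend   # equals DISCOUNT - CARD_PRICE / book_spend

for spend in (100, 250, 500, 1_000, 10_000, 1_000_000):
    print(f"${spend:>9,} in books -> effective discount {effective_discount(spend):+.3%}")
# $100 leaves you worse off (-15%), $250 is exactly break-even (0%), and even
# $1,000,000 of books only gets you to +9.998% -- never the full 10%.
```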
In mathematics that 10% figure is an asymptote: a number that can only be reached by a limiting function, in this case with the limit taken to infinity. (There is a pale similarity to the earlier Hitchhiker's reference.) With this simple example in hand we are now ready to tackle cell phone voice minutes pricing.
Forget about data, SMS, vertical services, premium phones and other add-ons to your cell phone plan; it's only voice minutes we care about here and now. Even so, the graph (below) is more complex than the discount card example. In this hypothetical though typical pricing plan we see that the basic plan costs $30 (you can add in other fees and taxes if you wish to make it more realistic) for 150 minutes of air time. Again, we're keeping it simple, so no free weekends and so on; we'll stick with minutes that are chargeable.
If you give any thought at all to the price when you are considering signing the contract, it is very likely that the one figure you will calculate is your expected price per minute by dividing the monthly fee by the number of minutes. In the present case that is $30 divided by 150 minutes, which yields $0.20/minute. However, as you can see in the graph, that value is only attained if you use precisely 150 minutes. If your usage is any other amount you will pay more. If you use less, you might pay substantially more per minute. (I've assumed a typical value of $0.35/minute for every minute you use beyond the 150 limit.) This is a bit like coin flipping: flip a fair coin 1,000 times and the likelihood of getting exactly 500 each of heads and tails is much lower than you might guess. At least if you do the flipping experiment many times the average will tend toward 500, but this is not the case with cell phones since you are relying on your (poor) ability to estimate how much you will use your phone:
Three-part tariffs optimally exploit overconfident consumers because overconfident consumers both underestimate the likelihood of very high usage, and the need to pay high overage charges, and underestimate the likelihood of very low usage, and the likelihood of not getting a refund for included “free” minutes.
The first thing to realize is that the first estimate of the per-minute price is an unrealizable optimum. Second, unlike the book card case, the $0.20 rate is not an asymptote since you can (if you're very careful) actually get that rate. However (third point), there is an asymptote in the graph, except it's not where you might have expected it to appear. The asymptote, like in the book card case, is also a limit for minutes taken to infinity. The asymptote is at $0.35, which is the overage rate.
If you think about it, that makes perfect sense. With unbounded use of your phone -- which is still very finite, coming in at 44,640 minutes in a 31-day month -- you will pay as much as $15,601.50, or $0.349495967/minute. That's pretty close to $0.35 though not equal. This limit is approached when minutes of use (MOU) is much, much greater than the 150 minutes in the basic plan.
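If you want to play with the numbers yourself, here is a minimal sketch in Python of this hypothetical plan; it reproduces both the $0.20 optimum and the $0.35 asymptote:

```python
# A minimal sketch of the hypothetical plan above: $30 for 150 included
# minutes, then $0.35 for each additional minute.
BASE_FEE = 30.0
INCLUDED_MINUTES = 150
OVERAGE_RATE = 0.35

def monthly_bill(minutes_used):
    overage = max(0, minutes_used - INCLUDED_MINUTES)
    return BASE_FEE + OVERAGE_RATE * overage

for minutes in (50, 150, 300, 1_000, 44_640):   # 44,640 = every minute of a 31-day month
    bill = monthly_bill(minutes)
    print(f"{minutes:>6} min: ${bill:>9,.2f} total = ${bill / minutes:.4f}/minute")
# The per-minute rate bottoms out at the advertised $0.20 only at exactly 150
# minutes; below that it soars ($0.60 at 50 minutes), and above it climbs
# toward the $0.35 overage rate, reaching about $0.3495 at 44,640 minutes.
```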
Well, that's it. You now understand cell phone pricing just a little bit better than before. Just don't bring any of this up at parties or you will shortly find yourself with no conversational partners.
Tuesday, November 17, 2009
Making Android Pay
Earlier this year I decided that for our small venture we would not publish anything other than free apps on the Android Market. Initially this was because of the woes of Google Checkout, but the trouble has gone far beyond that, and can no longer be attributed to Google's growing pains. It's a shame since there is a lot of potential with the built-in Android Market app on pretty much all Android phones: now over 3M out in the wild, and slated to grow to over 10M a year from now.
Google is putting precious few resources into Android Market and Checkout. You can count the countries from which developers can publish paid apps on the fingers of both hands and still have one finger to spare. Improvements to Market and the Market app have been even slower in coming. I have to believe this is deliberate. It seems that, sticking to its knitting, Google is not keen on direct sales, preferring instead to distribute innovative and interesting technology and to make its money on ad revenue. They won't spend, say, $5M to improve and extend Market (Checkout would probably need even more), but they will spend $750M to purchase AdMob.
Money is not the problem. The pace of improvements is glacial and the problems are widely known and discussed -- often with great vehemence. Rather than hope for better, I project forward from their known performance and reach my conclusion: the Android Market may never pay for the bulk of app developers. Even their most recent Android Market Developer Distribution Agreement only further emphasizes their disinterest in enabling paid apps. That is their right.
Despite this lament it has never been true that publishing mobile apps is the road to riches: precious few win at that game and I had no illusions. For me the attraction was product placement, since every Android user has the opportunity to see and acquire our company's products with the built-in Market app, which is a cheap way (for us!) to build the core of a sustainable market. Every other method is either more expensive or attracts fewer eyeballs.
Carrier billing is a step in the right direction, but insufficient by itself: it addresses the limitations of Checkout but not Market. Carrier "channels" on Market seem to me to be more of a ghetto than a platform from which to get noticed. Verizon has made some progress -- well before introducing Droid -- with their own app store, and they may be further encouraged by early Droid sales. While this strategy will cause fragmentation and increase developer costs, it could relegate the Android Market to only being a pool for free apps. That will still be of immense value to Google and most developers, just not the path to commercialization for ISVs that are treating their businesses as more than a hobby. It is interesting, though a bit worrisome, that the enabling corporations in the field are not yet clear on how to proceed.
Carrier stores are perhaps the best hope for Android app developers. I am beginning to suspect that the Android Market carrier channels are no more than a short-term fix that will be replaced in 2010 with true carrier-sponsored app stores, and their own built-in market apps. I have indications from private sources that at least a few North American wireless carriers are moving in this direction. Exactly what these stores will look like and whether they will be better than Google's offering for app producers and consumers, I unfortunately do not know.
Which gets me back to my own sales strategy. Our view now is to use the Android Market as one way to generate awareness within the target market -- everyone with an Android phone -- and attempt to leverage that with sales through other channels. The legalities of the agreements that every developer is bound to by publishing to the Android Market must be carefully navigated but are not insurmountable. The freemium model can still work on the path we propose to follow.
Labels:
Business
Thursday, November 12, 2009
Still No VC
This is a regrettably long-running story: there is no venture capital money around, and especially not in Ottawa. There is some hope that the situation will change, but that is only a hope, not reality as yet. Knowing all this, it was with some interest that I read this article about the effect Nortel has had on the Ottawa high-tech industry. What Pat DiPietro had to say in that article is very much on point even if it is nothing we don't already know.
First, there is a dearth of seasoned technology executives in Ottawa. This is true, and has been true for a long time. This is not to say we don't have them, it's just that their numbers are disproportionately small in comparison to the engineering talent. As Pat says, much of the blame must be laid at Nortel's doorstep. Bell-Northern Research was constructed on the AT&T Bell Labs model, where the technology and research organization was kept separate from the business units. For anyone who has not experienced it directly, this must seem a perplexing concept. After all, how can it possibly make sense for the product builders to be disconnected from the product marketers and sellers? It is ineffective except in the now-extinct monopolistic telecommunications industry structure. It had its good points, but speed, efficiency and relevance were not among them. In Nortel's case, since the vast bulk of the business talent was not in Ottawa, we now don't have enough of it to spread around.
Second, we seem not to have people with the needed skills. While it is true that a talented engineer can quickly come up to speed on new technologies, it is less so as the engineer ages. The most productive and creative engineers are young. This is of course not always true, just true enough that it is a valid generalization. A good engineering team should combine the energies of the young guided by seasoned professionals, thus most profitably exploiting the best of both groups. The old Nortel hands are often not found in these senior roles. Many have left the industry, taken on junior roles in government or elsewhere, or are otherwise uninvolved for a diversity of reasons. This is a subject all its own which I won't get into here. The result is that we do have a pool of telecommunications software and hardware professionals in town, but telecom infrastructure products are not where the buzz is these days. The younger folks who are well-versed in the new technologies and customer-facing services are somewhat adrift in their various "hobbyist" start-ups without either funding or experienced leaders. There is too often a large gulf between the young and the old in this town.
Third, we sell too soon, and thus fail to maximize the value from our limited pool of small, successful companies. I think that Pat is off base on this point. For one thing, the reason many start-ups in Ottawa sell too early is that the VCs won't or can't make follow-on investments (a recent example is Third Brigade selling itself to Trend Micro). The money is usually needed if growth is to be stimulated rather than allowed to proceed organically while generating modest profits. Pat can look in the nearest mirror to spot the problem. Apart from the lack of VC money, I am not convinced that the rate of acquisitions is too high; it is a normal sequence of events. It is also to be expected that the acquirers are American since they have the companies with the size and incentive to do these deals. I believe the real problem is that we are not creating enough new, well-financed start-ups to replace those that are acquired. It probably takes 5 of these for every acquired company for the technology industry to thrive; the talent gradually moves into new companies, using money from the acquisitions to grow the local technology industry. Unfunded start-ups don't count in this scheme; these orphans are often little more than lottery tickets.
Ultimately it is money that is the fuel that will light the fire of entrepreneurship in Ottawa. If it stays away too long the talent we do have will drift away. If that does happen it will not soon return.
Labels:
Business
Monday, November 9, 2009
Windows 7: Meh
I have long ago stopped paying attention to Windows releases from Microsoft. While it's true that every version does add new functionality, almost none of it is so striking that it's worth more than brief attention. Windows 7 is no different, no matter what short-term impact it might have on the PC market.
For me, Windows is simply there; I need it to make my PCs function. That's all. The only time I notice it is when it breaks or makes it difficult for me to do something that is urgent. Those are negative points, not positives. There are many positives in its continuously-improving features, it's just that those features fall into the categories of "well, of course they did this" and "what took them so long?" This covers everything from UPnP and auto-update to driver availability and TCP/IP support.
None of it makes an impression on users since this is at best nothing more than keeping up with Apple and Linux, or trying to keep user hostility under control in the face of failures, poor usability and high cost. Few users love Windows; they simply need it in their daily lives. Most don't switch to Mac or Linux since it isn't worth the bother or the learning curve, or because the applications they are accustomed to or need are not available elsewhere.
Pretty icons and translucent surfaces are nice to look at but are not selling points. Most of the applications that are familiar to users -- Office, Internet Explorer, Outlook, etc. -- are not part of Windows, and are only noteworthy in the context of Windows because many of them are not available on other platforms. Competitors don't restrict themselves in this way since such a policy only reduces their potential market. For Microsoft, the only advantage, although it is an important one, is that once these applications are loaded up with your address books, archived emails, bookmarks and so forth, they make the cost of switching high. In marketing speak this is known as stickiness.
Of course, that is why competing products make it easy to import user data and documents from Microsoft applications. Microsoft has fought this in the past with hidden APIs and non-disclosed data formats. This secrecy has declined remarkably over the years as Microsoft has battled anti-trust charges in the US and Europe.
When I purchase my next PC I will be buying Windows 7 as well. It's included in the price and it's already installed. That is simplicity itself. The thing is, if it were to instead come with XP or Vista I would be equally happy. For Microsoft this is a problem that in the long run is certain to bite them.
Labels:
Business,
Technology
Thursday, November 5, 2009
Choosing the Right Price
Markets go up, and they go down. Or to be more precise, individual stocks, commodities, other financial instruments and collections thereof (indices, ETFs, mutual funds) go up and down. Rarely do prices sit at one level for any length of time, whether the duration is measured in minutes or years. This raises the all-important question when one is looking at a real-time quote: is this the right price? What we mean by this is, is there some rational, evidence-based justification for this price, or is the market simply wrong?
That's a great question, and probably the best possible question. Is there an answer? More importantly, if there is an answer, is it knowable? Having that knowledge is a near certain path to untold wealth; all you need to do is determine what the right price is and place your money where the price correction -- which you are sure must occur, sooner or later -- will yield profits. Of course it isn't that easy, and even the very best get into heated arguments over what is the right price.
Here we have Rogers and Roubini arguing, among other things, about whether the current price of oil is right or wrong. Rogers says that the most recent dip in the price was the market getting the price wrong, therefore the 47% rise off the bottom is not only unsurprising, but may yet indicate that the market is still undervaluing the commodity. On the other side, Roubini says that since the economy is still in distress, the previous low price is closer to being right, therefore the recent 47% rise is unsustainable and we should expect a correction.
Who is right? Is the price of oil too high, too low, or just about right? I certainly don't claim to know. If you really care about the fundamental arguments about pricing, these sorts of discussions are very pertinent: you need to know which of these guys is right. Since (as is commonly understood) the market is a mechanism for discounting the future, the current price is an indication of the expected future payoff of current economic trends, typically about six months out. If you believe that, and since oil is in an uptrend, you must believe that the market is saying we are gradually pulling out of the recession. Of course if you think the economy really stinks, you should believe the market is being stupid, so you should go short on the commodity or the producers and you'll be that much richer in 2010.
If you ignore the fundamentals and only look at the chart for guidance -- this is called technical analysis -- it is enough to know that there is an uptrend. Yes, it has to end sooner or later, but what you should do is go long now, then cash out when the trend turns against you. You too, if you get it right, will be richer in several months.
The one thing I do like about technical analysis is that you don't need to know whether the price is right. It's enough to know that other market participants believe the price is too low -- which is what is driving buying and the trend to higher prices -- and go along for the ride while that belief dominates. It is however crucial that, once you take this approach, you do not subsequently become a believer. When the market turns, sell. With that discipline you are most likely to profit, and you can do it without knowing whether the price is right or wrong. Leave that for the pundits.
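To make that discipline concrete, here is a minimal sketch in Python of one trend-following rule of the kind described above: stay long while the price holds above its moving average, and sell when it drops below. This is only an illustration of the idea, not anyone's actual trading system or a recommendation; the five-period window, the crossover rule and the sample price series are my own assumptions.

# A minimal sketch of the trend-following idea: stay long while the price holds
# above its moving average (the "uptrend" signal), and sell as soon as it drops
# below. The 5-period window and the sample price series are illustrative
# assumptions, not data or rules taken from the post.

def moving_average(prices, window):
    """Simple moving average of the last `window` prices (None until enough data)."""
    if len(prices) < window:
        return None
    return sum(prices[-window:]) / window

def trend_follow(prices, window=5):
    """Return (index, action) decisions: 'buy' when the price moves above its
    moving average while flat, 'sell' when it falls below while long."""
    decisions = []
    long_position = False
    for i in range(len(prices)):
        ma = moving_average(prices[:i + 1], window)
        if ma is None:
            continue  # not enough history yet
        if not long_position and prices[i] > ma:
            decisions.append((i, "buy"))
            long_position = True
        elif long_position and prices[i] < ma:
            decisions.append((i, "sell"))
            long_position = False
    return decisions

if __name__ == "__main__":
    # Hypothetical price series: a dip, a sharp recovery, then a rollover.
    sample = [40, 38, 36, 35, 37, 40, 44, 49, 52, 51, 48, 45, 43]
    for index, action in trend_follow(sample):
        print(f"day {index}: {action} at {sample[index]}")

Run as-is, the sketch buys once the recovery pushes the price above its average and sells when the rollover drags it back below, which is the whole point: no opinion about the "right" price is needed, only a rule for following the crowd and an exit for when the crowd turns.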
Labels: Markets
Wednesday, November 4, 2009
H1N1 Scare, Politics and Vaccine Fears
The progression in the H1N1 pandemic story has fascinated me. We went from panic to complacency, then back to a more earnest form of panic: huge line-ups to get the vaccine. Much to my amazement, through all of this our various governments, federal and provincial, have stuck to their guns and performed quite well. The Liberals' recent attempt to cast the federal government in a negative light in this matter -- presumably hoping to profit from some understandable frustrations in how the vaccination program is unfolding -- is merely one more failed effort on their part to latch onto an issue, any issue, that will elevate their and their leader's public standing.
It seemed that while H1N1 was only killing the elderly and the sickly, public sentiment became little more than a collective shrug once the flu season ended in the spring. In that light, and given the typically moderate symptoms suffered by almost all who have been infected, the continuing pressure from the government on the pharmaceutical industry to prepare for a mass vaccination program had the appearance of gross incompetence and needless expense. When the unfortunate deaths of otherwise healthy children occurred recently, the potential seriousness of the threat became very real for most of us.
H1N1 is not killing a disproportionately higher number of people than other strains of the virus, yet there is the danger that it could. Since it more aggressively attacks the respiratory system and is highly contagious, there is an opportunity for a deadly mutation to turn the pandemic into at least a minor catastrophe. Or, it might not.
Either way, the risk of taking the vaccine is lower than the risk of not taking it. Even if you do wish to take the risk with your own health, there is the matter of whether you want to risk the health of others with whom you come in contact if and when you do become infected (rough predictions say 25% to 35% of us will eventually get infected, though not all will get sick).
However, rather than repeat what is said better elsewhere, what I want to say is that the point about supposed vaccine risks is becoming moot. When children started dying, the public mood changed. Now the trend is pro-vaccination. It seems that while we had the luxury of time to sit back and pontificate without expertise, avoiding the vaccine was a valid form of personal expression. Now it isn't. H1N1 kills in modest numbers among the healthy and affluent, kills more among the poor and sick, and could mutate to kill indiscriminately. That knowledge may now be enough to quell the evidence-free tirades against vaccination. I hope this trend will seep into other conversations about the supposed risks of vaccinations for other diseases. It should.
Friday, October 30, 2009
No Competition for You
The media and blogs are rife with articles about -- and condemning -- the CRTC decision to block GlobeAlive/WIND from winning a wireless telecommunications license. As Michael Geist correctly points out, the blame does not lie with the CRTC. This will require political action to amend the law or, as an interim measure, to find some way around it in the near term. I have more faith in the present government acting than I do in the Liberals, when I consider their respective past behaviours with respect to protecting "Canadian-ness" versus open markets.
The incumbent operators are of course pleased with this development -- as well they should be -- as witnessed in this Telus press release and the quote from Rogers in this article. However, I wonder if they will pursue this aggressively when this decision is appealed. It's one thing to gloat but another to directly lobby cabinet. They do not want to be seen to be publicly against competition.
Which brings us back to telecommunications law in this country and the role of the CRTC. Those with a long memory will recall that the Canadian Radio-television and Telecommunications Commission was originally named the Canadian Radio and Television Commission; same abbreviation (CRTC), but with an expanded mandate. It is no surprise that telecommunications law and regulation mirrors the zealous protection of Canadian content that existed for the media. Ownership matters since that tends to reflect content, business locations, domestic employment and the ability of the government to control these companies.
Back when the media and telecommunications sectors were more strongly dominated by a small number, or only one, company -- whether nationally or by region -- there was some legitimate argument for the domestic ownership requirements. Perhaps even more than that, telephone service -- which at one time was almost the entire telecommunications industry -- is rightly held to be an essential public utility. Pre-competition, there was in effect a social contract between the industry and the various governments whereby the industry had to meet a long list of service objectives in return for a virtually guaranteed profit stream.
With competition in place, there is no longer a requirement to protect the incumbents. If one provider fails or unilaterally decides to withdraw from the market, either partially or entirely, the presence of other players helps to ensure continuity of service. Today we have one foot planted in the past and one in the present. We need to make the final step and strongly promote competition so that there is a diminished need for domestic ownership and control over telecommunications providers. As it stands right now, the CRTC's legally correct decision makes no sense when the government's policy is to promote facilities-based competition. Wireless is the best way to achieve this, and that is what GlobeAlive/WIND will do.
The government will need to act. I predict that they will, and that their decision will be favourable to GlobeAlive.
Wednesday, October 28, 2009
Trust and the NDA
Dealing with legal agreements is never fun, yet in business it is a daily activity. Most employees rarely have to deal with contracts and other legal documents, leaving that for others and the lawyers they hire. Yet in the high-tech world there is one legal document that is very common: the Non-Disclosure Agreement (NDA). It often goes by other names but let's stick with NDA.
The reason the NDA is so common is that a total product solution for a technology-based company often requires complementary products and services from other companies. This is as true for start-ups as it is for the top names in the sector. These partnerships are either initiated by one of the companies or prompted by a mutual customer that directs the two to work together. This activity is often delegated to a business development specialist or, for smaller firms, sales. In very early stage companies this may devolve to a technology executive for the simple reason that they may not have a business development or sales group yet. The first step of substantive discussions usually requires an NDA. Later, if all goes well, a contract will be signed to establish the terms of the business relationship.
NDAs are binding legal contracts, as should be obvious if you read through one in detail. They vary quite a bit in length -- I've seen some that fit comfortably within one page, while others run to a dozen pages or more -- but all are intended to achieve one important task: establish procedures, responsibilities and, yes, penalties, to control the necessary sharing of sensitive or proprietary information or technology. The shared intellectual property can range from disclosure of APIs, sharing of source code or pre-release object code, algorithms, customer lists, pricing, and so on.
For those previously unfamiliar with NDAs, it is sometimes assumed that they are a defensive weapon in much the same manner as a suit of armour was for a medieval warrior. With signed NDA in hand, many companies feel free to open up and share their most cherished secrets and the fruits of their labour, believing that the penalties for breaking the terms of the NDA will keep the other party honest. Regrettably, this is often not the case. Here is a recent example that provided the motivation for this blog post.
If you've ever discussed NDAs with lawyers -- which I've done a lot more than I'd like over my career -- you will find that they can be very relaxed about it all (unless they're particularly desperate for a few more billable hours). In contrast, if you are new to the game and are desperate for good advice on whether to sign an NDA or how to draft one, you may be quite anxious since the success or failure of your company may depend on doing everything right. Don't be annoyed with the lawyer's attitude: it's justified. Go ahead and ask why they don't share your concern and you may hear something like this (paraphrased from a discussion I once had with a lawyer along these same lines).
The lawyer will ask you whether you trust the person or company with whom you are negotiating the NDA. This may seem like an odd question since you are probably thinking that the NDA is an alternative to trust, countering the risk of dealing with a shady character or a company with a hidden agenda by creating legal consequences for misuse of your valuable intellectual property. It isn't, as is beautifully exemplified by the above-referenced case: if they take the goods and run, regardless of your presumed legal protections, you've already lost.
Sure, you can sue, but by then it's too late: it's expensive, the courts can be treacherous, you can be in the right and still lose, and even if you win there may be no one you can collect from. It isn't only the small, asset-poor companies you need to watch out for. I have witnessed the very cream of the tech world sign NDAs with the sole objective of getting the goods from start-ups, which go on to disclose everything in the hope of winning a top-tier partnership or (though this is only whispered in dark corners) being acquired. For the large corporation it can be far cheaper and faster to exploit what they've learned from you, knowing that you do not have the resources to successfully pursue a case against a multi-billion dollar corporation in the courts.
The lesson here is that an NDA is absolutely not a substitute for trust. It is a supplement to a trust relationship, making clear just what can and cannot be done with disclosed intellectual property so that no one makes any mistakes. Even so, no matter how strong the level of trust, you should never disclose more than the minimum necessary. Also, disclose gradually, in careful steps, so that should you realize at some point that you've made a mistake, you can minimize the potential damage. A lawsuit should always be your last option, never the first, and one you almost certainly can't afford.
As when selecting a romantic partner or a building contractor, ask around discreetly before walking down the aisle or tearing down the walls. Also, when you do sign an NDA, treat your new partner's intellectual property with the same care and diligence with which you wish they would treat yours. That way, when the next deal comes around and the other party checks out your reputation, your existing partners will give you the thumbs up.
Labels: Business
Monday, October 26, 2009
Head of State and Republicanism
Like so many peculiar Canadian myths, the one regarding the monarchy continues to baffle me. Whenever a media commentator discusses the question, whether it be pro or con, it warms the blood of many otherwise complacent folk. This article is the latest one that caught my attention.
The reason the question of the monarchy baffles me is that it is a settled question, regardless of the seeming controversy. By saying that I may have in turn baffled you! Let me explain what I mean.
Canada is not a monarchy. It is a republic. Yes, it is true that there is a piece of parchment somewhere that says we are -- as schoolchildren are taught -- a constitutional monarchy. There are also the visible signs of the monarchy such as the Governor General, the monarch's image and Latin caption on our money, occasional royal visits, and so on. This is the obvious stuff that blinds some into believing that we do have a monarch. We don't. Starting with so-called responsible government well before Confederation and up to the ultimate act of making the Supreme Court the highest court of appeal about three decades ago, we have moved slowly but steadily from a monarchy to a republic. But as of three decades ago, Canada is a fully-operational republic. No, not on paper (sorry, parchment), but in the reality of how power is wielded.
The monarch has no regal power in this country. None. Try to imagine, if you can, Queen Elizabeth actually attempting to influence, let alone propose or quash, acts of Parliament, or, going even further, dissolve Parliament. If she or her successor were silly enough to try any of these things you can bet that we would very quickly do what's needed to become a republic on paper (or parchment) as well. The government wouldn't stand for it and neither would the "loyal" opposition or us, the citizens. If we ask the same question of the Governor General, it is less clear, as we discovered earlier this year. The difference is that, while usually a purely ceremonial role, the role of GG does have power, but that power originates with the government -- a government elected by the people -- that nominates the sole candidate for the position. This is just like one style of republic where the President is appointed by the government or elected by Parliament; another way is to elect the President by popular vote, though this step is often skipped where the role is not one of chief executive.
The role of GG or President is not entirely ceremonial since there are cases where parliamentary deadlocks must be broken (e.g. dismiss the government or ask another party to try to form one), and it can also serve as a check on a broadly-unpopular act of Parliament when there is no other body to vet legislation. In Canada's case, the Senate is almost entirely toothless in this regard. I suggest that this one item is where the power of a GG contrasts with that of a ceremonial President: the GG does not dare to block legislation, whereas a President might. The difference is small though necessary but, importantly, has nothing whatever to do with the monarch.
Whether we take the step to formally abolish the monarchy is of no importance to me. It's enough that I know that Canada is, where it matters most -- formal power -- a republic. The imprint on our coins doesn't affect this reality.
Labels: Politics
Thursday, October 22, 2009
CRTC Decision 2009-657 on Traffic Management
It's only the second day since the CRTC issued its ruling on ISP traffic management and there is already a large number of articles (here and here, etc.) -- summaries of the ruling and analyses -- circulating in print and on the internet. There is little point in having me do the same, so I won't repeat what others have already done. Instead I want to look at just how this ruling benefits the large ISPs -- the major carriers against whom the complaints were originally made -- beyond what has already been mentioned. I believe there is a reason why these companies are pretty happy with the outcome:
BCE said in a statement that it thinks the decision is a good one and that its "existing Internet traffic management practices are already compliant with it."
Michael Hennessy, senior vice-president of regulatory and government affairs at Telus Corp (T.TO), said the communications company does not currently throttle traffic. However, it does employ some general caps on bandwidth usage.
"There are growing concerns about congestion," he said, adding the CRTC's decision is "very good and very fair", and that continued network investments and consumption-based pricing are among ways to address heavy traffic volumes.
I suspect these companies are happier with the ruling than is apparent in their public statements; they do not want to be seen as gloating. Let me now go through my reasoning why this is a potentially big win for these companies.
Apart from the notification requirement, the carriers are in total control of their actions. What I mean is, the CRTC is leaving every judgment call and every technical determination of impact on retail and wholesale customers to the carriers; the CRTC has not inserted itself anywhere into the process, so the carriers need not seek approval before acting as they see fit. If you read the decision yourself, you may very well believe that I am wrong about this since the requirements the CRTC lists are, by and large, neutral to positive for the carriers' customers. The only mention I've come across that shows any level of doubt is that it is up to the carriers' customers to call attention to any inappropriate traffic management behaviour, which is very difficult to prove from outside the network. I believe the impact is greater than this would indicate.
First, notification is only a requirement if the carrier is instituting traffic management that is more restrictive than what is extant (paragraph 80). The only judge of this is the carrier itself, and so they can choose not to disclose what they've done (paragraph 84). Further, the criteria to measure restriction are exceedingly loose, apart from VoIP and media streaming (paragraphs 126 and 127), which require pre-approval to throttle or block. This leaves the carrier free to argue, whenever the change is noticed and they are questioned about it, that what they are doing is not more restrictive than what was done before, just different. When conflict inevitably arises, as with the present issue, they will almost certainly continue on with their new traffic management practice unless and until their customers convince the CRTC to act against them. As we've seen, this can take a very long time, during which the carrier can do as they please.
One surprise is that the ruling requires wireless (mobile) data operators to conform to the same traffic management rules (paragraph 116). Although this will suffer from the same loopholes as cable and DSL internet access, it is something positive for consumers. It is also an interesting development in light of the FCC's proposed rules out today (October 22) on the same matter: wireless network neutrality.
CRTC intervention may be even less likely in future, or at least longer in coming, with their direction to the industry to cooperate and negotiate over points of dispute (paragraph 76). This is not a Commission that intends to be interventionist, despite offers to hear complaints in certain cases (paragraphs 127 and 128).
While it is certainly true that there are political arguments to be made as to why the CRTC is acting as they are, there is an economic reason as well. Reading paragraph 35, the CRTC is consistent in not wanting (if you like) to kill the goose that is laying the golden eggs: it's the carriers that are making the investment and taking the business risk of building the internet infrastructure in Canada. ISPs that make use of GAS (DSL wholesale) do not, although they do provide a source of price and service competition in an otherwise near-monopoly market.
My opinion is that the CRTC will continue to regulate with a light hand, and in so doing will tolerate modestly discriminatory behaviour, believing that if the carriers' balance sheets remain strong, investment in their networks will continue. In contrast, they are not (yet) willing to bet on new entrants or existing small players to take a major role in building out infrastructure. Whether you consider this good or bad often depends on where your own interests lie. There is no absolute right and wrong in this matter despite attempts by some to colour it that way.
Labels: Politics, Technology