The recent revelation of a sharp upward revision to the forecast federal budget deficit for the current fiscal year is getting a lot of news coverage, and a lot of disbelief over how this could happen. By 'happen', most commentators mean not the deficit's size but the quality of the government's budget forecasting, and the possibility of dishonesty in its reporting of the numbers.
While I have my own thoughts on the degree of political interference with the budget numbers going back to before the fall election, let's assume for the moment that the government is not acting deceptively in any way. There is ample room for forecasting error during stable economic periods, and even more so in the present crisis. We can criticize the quality of the economists and others in Finance, yet their job is not at all an easy one. Let's start with the following extract from the summary of the budget numbers, which immediately predates the most recent revision:
You should be able to see the previous $34B headline deficit figure in the 2009-2010 column (-$33.7B). It is simply the sum of the revenue and expense items above it in the column.
First, notice the relative stability of expenses in comparison to revenues. Program expenses are not static, but the rate of increase is pretty steady and would appear to be in line with growth trends in the broader economy. This is unsurprising since the government has a lot of control over its expenses. This is even true with the sudden increase in Employment Insurance (EI) payments - if it gets too onerous they could, via Parliament, change the rules to bring it down to a more desirable amount. All programs are like that. Whether they choose to manage those figures comes down to policy and politics. My point is that the government has control over these expenses.
The debt charges are modest and change by an even smaller amount when compared to total expenses. This is expected since this line item is the interest paid on the total outstanding debt. We can therefore ignore it for our purposes.
The fireworks are all in that very first line: revenues. To understand this line we need to recognize that the wealth in this country is almost all in the private sector. If you consider the capacity for wealth creation, then it is entirely in the private sector. The government does not create wealth; instead it takes a portion of the private wealth and distributes it into a variety of public institutions and programs. The best the government can do in regards to wealth creation in our system is to provide a legal environment that is favourable to private sector wealth creation.
With the understanding that the wealth that is the source of all the government's revenues must come from the private sector we can begin to see that the government is only partially in control of its revenues. Let's look at a few examples, all of which (unsurprisingly) relate to some form of taxation.
We know that a lot of government revenues come from income taxes, both business and personal, and that personal income is an expense for businesses (in the form of salaries). Consider a hypothetical company with $100M in expenses and $110M in revenues. Its profit is $10M, or about 9% of revenues. It pays, say, $1M in federal tax on that $10M (since I'm focusing on the feds I will not look at provincial taxes). If the company is in a knowledge industry like high-tech, the bulk of company expenses are in salaries. Let's assume $80M for this company. The personal federal taxes paid by employees might then be about $20M (income tax, GST, etc.).
Federal revenues from this company are something like $21M/year, or nearly 20% of its revenues. This is obviously a very simplified analysis, though I believe it is adequate for my purposes. If it is a public company in a moderately stable industry, the company's stock price will reflect its P/E (price-to-earnings ratio) in some fashion. The earnings are $10M in the present case; divide by the shares outstanding to get earnings per share, and the share price divided by that gives the P/E.
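The arithmetic above can be laid out as a quick Python sketch. All figures are the hypothetical ones from the example; the share count and price are additional assumptions of mine, purely for illustrating the per-share view.

```python
# Hypothetical company from the example; every figure here is illustrative.
revenue = 110e6
expenses = 100e6
profit = revenue - expenses            # $10M

corporate_tax = 1e6                    # assumed federal tax on that profit
salaries = 80e6
personal_tax = 20e6                    # income tax, GST, etc. paid by employees

federal_take = corporate_tax + personal_tax
print(f"Federal revenues: ${federal_take / 1e6:.1f}M "
      f"({federal_take / revenue:.0%} of company revenues)")

# Per-share view: earnings divided by shares outstanding gives EPS,
# and share price divided by EPS gives the P/E ratio.
shares = 50e6                          # hypothetical float
price = 4.00                           # hypothetical share price
eps = profit / shares
print(f"EPS: ${eps:.2f}, P/E: {price / eps:.0f}")
```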
Now let's have the company hit the skids, as many are doing nowadays. Business evaporates, which is reflected in reduced revenues: for example, $70M rather than $110M. However, their expenses are unchanged unless they reduce salaries (their primary expense) by dismissing employees who are no longer contributing to the lowered business volume. Let's assume they choose to dispense with profit for the near term, so they dismiss enough staff to cut salaries by $30M (nearly 40% of the payroll) and bring expenses equal to revenues. Their profit is zero, so no business income taxes are paid. With $30M in salaries no longer being paid, federal taxes from the former employees decline from $7.5M to $1M. From this we see that federal revenues have declined from $21M to $13.5M. That's a lot, so it's good that most companies are not in so dire a situation.
In addition to the lost revenues, the government must also pay EI to those displaced employees. This does not replace all the lost salaries, but it does increase government expenses, some of which are recovered in taxation (though at a lower rate since the personal income level is lower). Let's say that program expenditures increase by $10M and revenues by $1M.
What started as net government income of $21M in this example is now $4.5M, a decline of nearly 80%. In fairness, this is only a short-term effect since economic forces will adjust to the new circumstances: the rise of new industries, recovery of the old, and re-employment of the displaced. But since we're looking at this one budget year, the scenario drawn here is instructive.
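Putting the whole downturn scenario together in one place, a small Python sketch of the before-and-after, using only the hypothetical dollar amounts from the example:

```python
# Walk through the downturn scenario; all dollar figures are hypothetical.
def net_federal_income(corporate_tax, personal_tax, ei_cost=0.0, ei_taxed=0.0):
    """Taxes collected, plus tax recovered on EI benefits, minus EI paid out."""
    return corporate_tax + personal_tax + ei_taxed - ei_cost

before = net_federal_income(corporate_tax=1e6, personal_tax=20e6)

# After the downturn: profit is zero so corporate tax disappears; remaining
# employees pay ~$12.5M; former employees still contribute ~$1M; EI adds
# $10M of spending, about $1M of which is recovered in taxation.
after = net_federal_income(corporate_tax=0.0, personal_tax=12.5e6 + 1e6,
                           ei_cost=10e6, ei_taxed=1e6)

print(f"Before: ${before / 1e6:.1f}M, after: ${after / 1e6:.1f}M, "
      f"decline: {1 - after / before:.0%}")
```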
Now, let's get back to this distressed company with no earnings. Its stock price tanks (we've all seen this happen), and with zero earnings its P/E becomes meaningless. The company is now less able to raise funds by debt or equity, reducing its ability to retool its business or make other adjustments to its operations. Its stock, much of which is in the hands of pension funds or cash investors, suffers accordingly. If the company goes bankrupt, as is likely with GM, the shareholder wealth is reduced to zero, forever.
For cash investors, there are capital losses instead of capital gains, which further reduces government revenues. Desperate shareholders pull money out of their RRSPs, at reduced valuations, to pay the bills. This increases government revenues since the withdrawals are taxable, but the amount is lower than if the withdrawals were made at a higher share value. Everyone reduces spending, unleashing a cycle of reduced GST revenue, further loss of business taxation, and then more employee displacement. Add it all up and you get -- a $50B deficit.
Our economy is a classic non-linear, chaotic system in the mathematical sense. As in weather forecasting, no model of such a system can do a good job when looking forward more than a short distance. When a major exogenous event occurs, predictability becomes a joke. Flaherty is not responsible for that.
This is already a long post and I hope it was worth the time to read. I can't truly quantify or justify the size of the deficit or its frequent revisions, but when I look at the underlying upheavals on the street I can sympathize to a degree with the government. The politics of the matter can only serve to obscure what is largely a problem originating in the private sector - the dog - where the government finances are the tail of the dog: the dog wags the tail, and not the other way around.
One bright point is that the reversal when it comes could be just as sharp. The federal deficit for 2010-2011 could be far better than what's in the table above, even with the upward revision in this year's deficit.
[Addendum: I forgot one important point, the one that ties back to the title of this post. The volatility of the deficit mirrors the volatility of the markets for much the same causes, as demonstrated in the scenario I presented. In particular, small declines in revenue cause large declines in profitability, which then eventually cascade into a sharp decline in the tax base.]
Thursday, May 28, 2009
Tuesday, May 26, 2009
Race Weekend Recap
For those not in Ottawa, we just experienced our annual Race Weekend, where something like 36,000 runners take to the streets in various events including a marathon. It's a lot of fun. I once again ran in the half-marathon, which is the event with the largest participation: about 9,000.
I don't run much as a rule except to train for this one event. However, I am fit enough to put in a decent showing. Now, as I toss my running shoes aside until next spring, I'd like to record a few of my memories of the event. While there's lots to gripe about if you go looking, that is not my aim. After all, it's a fun event even though it is demanding.
With that preamble, on to my list:
- The weather was quite good - neither too hot nor too cold, although the sun was sizzling. My strategy was to grab a cup of water at every station and pour it over my head. That worked perfectly to keep the sun at bay.
- The crowds, as always, were great. Many must have shouted themselves hoarse from cheering the runners onward.
- There has been some criticism of the recent route change that includes several kilometres in Gatineau/Hull, since the cityscape on that stretch is a bit grittier than the section in Ottawa. I find it refreshing to pass through areas where real people live and work. It also has the benefit of multiple ups and downs rather than the dead flat roads on the Ottawa side, adding challenge along with some needed variety. The only peculiarity is the near absence of spectators: it gets eerily quiet as you cross the Chaudière Bridge, and then the cheering returns as you cross the Alexandra Bridge.
- I was surprised the first couple of times that spectators, complete strangers, called out my name as they cheered me on. Then I remembered that the race bib on my chest has my name printed on it.
- The "loot" in the runner's kit was pretty slim this year: a pack each of candies and chips. Not running related, but even so they were all consumed soon enough. I wish the athletic shirts they provide had a small pouch somewhere since they would then actually be useful. It's always a puzzle to figure out how to carry a key, food or other essentials when clothed in only lightweight running gear.
- There was lots of room to relax in the finish area since they strictly enforced the rule that says non-runners cannot enter. I even found some shade to lie down in. The downside is that it made the environment a bit sterile since there was no possibility of interaction between runners and their friends and families, which is often the best part about finishing the race. I would gladly give up my patch of shade to allow everyone to mix freely.
Labels:
Lifestyle
Monday, May 25, 2009
Keeping Up With Moore's Law
Throughout my long career, scientists and technologists have successfully battled to keep Moore's Law on track. It has been so reliable that if you need to increase the speed or capacity of a technology product, it is perfectly acceptable to simply plot the trend out to the planned release date; if the trend meets the requirement, you're done. It's been very handy for those of us in the business of planning future products; doing nothing to achieve one's goals is nice work if you can get it.
As the technological barriers are progressively hurdled there are also those higher barriers that are imposed by the fundamental laws of physics. These range from light speed - limiting the speed a signal can travel within and between circuits - to quantum mechanics - the soft boundary between traditional electronics laws, including Ohm's Law, and the weird regime of the quantum. Reading this recent article in the New York Times called to mind this latter limitation: in particular that the quantum world is largely discrete and so you run into difficulty when the number of discrete particles drops to a low integer value.
As you improve the precision and accuracy of micro-circuit construction, the circuit density increases at a faster rate. This is simply because a micro-chip is 2-dimensional, so density increases with the square of the linear improvement. Unfortunately, as we enter the realm of small quantum units, the number of those units decreases at a similar rate. For example (assuming that all other things remain the same, which they rarely do), if you have 100 electrons within a square circuit area employed to perform some function, doubling the linear density leaves you with only 25 electrons on hand. While this is grossly simplified, it does demonstrate the nature of the challenge: as particle counts drop toward one, the electronic laws that are based on mass quantities of particles become fuzzy in the face of the probabilistic nature of each of those particles. Macro-laws smooth out the individual differences of unpredictable particle behaviour.
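The scaling argument above is easy to tabulate: since area shrinks with the square of the linear density improvement, each doubling divides the electron count per function by four. A few lines of Python make the trend toward single-digit particle counts concrete (the starting count of 100 is the illustrative figure from the text):

```python
# Each doubling of linear density quarters the area available per function,
# and with it the number of electrons on hand (all else held equal).
electrons = 100
for doubling in range(1, 4):
    electrons_left = electrons / (2 ** doubling) ** 2
    print(f"{doubling}x doubling of linear density: "
          f"{electrons_left:.1f} electrons per function")
```

By the third doubling the count is already below two, which is where the bulk-statistics laws of electronics stop being reliable.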
Going to 3D has been a dream for as long as I can remember. In the case presented in the referenced article it can provide a way to avoid the quantum scaling problem for a little longer. However, 3D circuitry has its challenges. I remember when, long ago now, there was a move to 3D circuit boards. These allowed more circuit density with the discrete components then in use by allowing easier interconnection among the huge number of pin-outs of the many ICs and other, more basic components. There was a limit to this since components still had only two surfaces to work with for the most part (wires/traces, not components, were on the interior layers). In time the interconnection problem was eased by better CAD software and chips that replaced dozens of discrete components, making 3D circuit boards less of a focal point for further progress.
Within a chip, going to 3D has the same interconnection challenge while also carrying other penalties. Perhaps the worst are yield and heat. With increasing density there is an initial drop in yield, which is the fraction of components that exit manufacturing without disabling flaws. This can only get worse with multiple layers, where the quantity of deposition and mask steps grows linearly with the number of layers in the 3D stack; since defects compound across steps, the yield hit could be worse than linear. The benefits have to be great to justify the battle over refining processes to achieve economically acceptable yields.
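The yield concern can be sketched numerically. If each fabrication step independently passes some fraction of dies, the overall yield is that fraction raised to the power of the total step count, so stacking layers multiplies the exponent. The per-step yield and step count below are assumed numbers purely for illustration, not data from any real process:

```python
# Compound-yield sketch: overall yield = (per-step yield) ^ (number of steps).
# Both figures below are hypothetical, chosen only to show the shape of the curve.
per_step_yield = 0.99      # assume 99% of dies survive each deposition/mask step
steps_per_layer = 30       # assumed process-step count per layer

for layers in (1, 2, 4):
    overall = per_step_yield ** (steps_per_layer * layers)
    print(f"{layers} layer(s): {overall:.0%} of dies survive")
```

Even with a 99% per-step yield, the geometric compounding means a four-layer stack keeps well under half the dies in this toy model, which is why the economics of 3D stacking are so demanding.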
Heat is already a problem in 2D microchips. It is more acute with microprocessors than memory chips, but it still exists and would need to be dealt with. Every gate generates heat during active operation (such as reading and writing), and heat generated within a 3D stack may be difficult to conduct to the surface for removal. There are of course many avenues open to attack this problem since it is very well understood. The optical method outlined in the article would most likely have less heat to contend with, though at the cost of other difficulties.
Since I am far outside this micro-electronic world I can only watch and wonder how these various difficulties will be overcome. If economical, reliable and, perhaps above all, scalable 3D solutions are discovered, we will be firmly back on the Moore's Law trend line. Within some constraints, for a given technology base, circuit density could increase up to the cube of the linear improvement. The third dimension most likely will not scale as well as the other two, yet even a modest achievement would have impressive results.
Labels:
Science,
Technology
Friday, May 22, 2009
Capping Stimulus Spending
It's a bit funny to hear the IMF saying that Canada can afford more fiscal stimulus. This is true, but it is a bit like my saying that I can afford more crystal meth: I just know enough not to do any more (or, more correctly, any at all!). This is especially important when far too much is already being spent on failing companies like GM. However, I think that is more a political decision to keep onside with the US, just as is being done on vehicle emissions and other programs that affect trade.
I think Flaherty is being appropriately cautious to limit stimulus. Too much is likely to be wasted on projects that would otherwise never be undertaken. Then there are the clamours for money to go to companies and industries that will eventually become extinct regardless of any present action.
Considering that our government is not at all comfortable with this type of spending, and is only doing it out of deference to people's fears, and perhaps some political desperation, I expect them to delay every spending program as the time comes up to write the cheques. What they would hope, in my opinion, is that signs of an economic recovery will begin to appear and they will pounce on those to declare that further spending can be deferred or cancelled. I would agree with this approach, regardless of whether that is truly their intent.
Most of the benefit of stimulus spending is in rebuilding confidence. Showing that they are prepared to spend as necessary may be sufficient, making it possible to forgo actually doing it. Thus we get over the hump to the next growth cycle without taking on excessive public debt. The politics will of course get nasty should this scenario unfold, since it is in the Liberals' interest to hold the government to their spending promises.
Tuesday, May 12, 2009
Who Do You Believe?
Trials, inquiries and hearings, much like scientific investigations, benefit from evidence: the more the better. But what happens when objective evidence is lacking or non-existent? In science, you conclude nothing until you do have the evidence. In judgments involving human activity you have another option: choosing among conflicting claims based on the credibility of witnesses and protagonists.
Right now we have quite the selection of proceedings that are exactly in this state:
Who do you believe? When politics gets involved there is a tendency for us to gravitate toward the party whose political affiliation matches our own. For the politically active, the attraction can be strong enough to completely override good sense. Unfortunately this is a poor way to get at the truth, if that is what one is after.
We are therefore subjected to attempts by the opposing parties to build a record of past credible conduct, or misconduct. They strive to show a record, both private and public, of acting honestly and with integrity, with the implication that past performance is indicative of behaviour in the present matter under investigation.
Like most people, I have had to make these kinds of judgments, and I know just how terribly difficult it can be. Whether it's interviewing a job candidate, questioning a political campaigner, or asking a child if that's what really happened, you have to distinguish an honest person (who may simply be nervous and under stress) from a liar who may be practiced at the art and prepared with believable answers. It ain't easy.
For this reason it is interesting to watch the questioners (lawyers, judges and MPs) in the above-mentioned proceedings to see how they chip away at the testimony of witnesses when that testimony is all there is to judge, especially where those questioners are partisan.
He said, she said... now choose who you believe is telling the truth.
Labels:
Politics
Monday, May 11, 2009
Natural Gas, Again
When I talked about natural gas a couple of weeks ago I didn't expect my prediction of a turnaround to come true quite so quickly. But there it is: six trading days later it breaks out of its technical downward range. This is very encouraging, although it is not an assurance that the price rise will continue.
I am continuing to ride my bet with shares in the (nameless) producer I mentioned in that earlier article. It and other companies in its sector are already showing some excellent price appreciation. There will be twists and turns to come - not a straight line up - and I will be following closely. Price is bound to reverse at some point, and I'll be looking for that 50-day EMA to hold as natural support. On a fundamental basis, provided that investors persist in believing that the economy is recovering, or at least that a recovery is in sight, natural gas will do well.
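For readers unfamiliar with the indicator, the 50-day EMA mentioned above is an exponentially weighted average of closing prices. A minimal Python sketch of the standard calculation, using a toy price series rather than real natural gas quotes:

```python
# Standard exponential moving average: each day blends the new price with the
# running value using alpha = 2 / (N + 1), so recent prices weigh the most.
def ema(prices, n=50):
    alpha = 2 / (n + 1)
    value = prices[0]                  # seed the average with the first price
    for price in prices[1:]:
        value = alpha * price + (1 - alpha) * value
    return value

# Toy series of 100 slowly rising prices, not real quotes:
prices = [3.50 + 0.01 * i for i in range(100)]
print(f"50-day EMA: {ema(prices):.2f}")
```

Because the EMA lags the price, a rising series keeps the average below the latest quote, which is why chartists watch it as a support level on pullbacks.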
Labels:
Markets
Thursday, May 7, 2009
Banning Mobile VoIP
Consider this: you know that you are going to die eventually. Is this a reason to kill yourself now? I would hope you answered 'no'. Yet when the subject is the future inevitability of VoIP on mobile phones there is surprise that the carriers don't cave in now and allow it. They have good business reasons to delay the inevitable; it's to their financial benefit today and possibly for some years into the future.
Microsoft is the latest smart phone OS provider to ban VoIP apps from its app store. I doubt that Microsoft particularly cares, except that it must bend to this carrier demand if it hopes to get phones with its software out into the market. They are not the first to find that they must do this. Android is perhaps the only OS that is more open to VoIP today, since apps can be downloaded from third-party markets, beyond the censorship of the carriers.
The carriers quite naturally want to maintain the revenue from voice minutes, especially roaming and long distance charges, for as long as possible. They also want to set the timetable for any shift of voice minutes to VoIP, hopefully once they have a business strategy that works in their favour. This may be impossible, but they will try. No one should be surprised that the carriers would exercise their ability to impede VoIP for the present.
I covered mobile phone VoIP earlier with respect to the interests of the carriers, and also discussed some of the technical reasons why it is still problematical. Bandwidth is rapidly becoming a non-problem, but battery life, still a significant issue for smart phones, is a particular problem for VoIP: to receive incoming calls requires the power-hogging application-hosting processor to be continuously running. There are feasible solutions, just not ones that are available now. Even if they were, the carriers would try to keep them out of phones.
Mobile VoIP is coming, but will require patience.
Labels:
Technology
Wednesday, May 6, 2009
Intrusive Ads
I am not alone in considering AdBlock Plus my favourite Firefox extension. Unlike some users, I do not block all ads (using the supplied catalogue of ad-server domains), just those that I find most distracting and expensive.
I have the blocking so well tuned now that I rarely have to give it any thought; it just works. It came to mind again when I read this article. It also reminded me of one of the reasons why I stopped reading TheStreet.com some time ago, and why I stopped subscribing to their paid service, Real Money. The article focuses on the free service, yet the preponderance of Flash and other intrusive ads, and related techniques, is no different on the subscription site.
Why do web site operators do this? Some have learned from experience - including the New York Times - that aggressively placed ads, and those that consume bandwidth and increase user distraction, drive away their audiences. AdBlock and similar tools wouldn't be necessary if they treated viewers better. This has also been a regular theme on Techdirt, where a variety of newspaper web sites have been doing this in a desperate attempt to offset the loss of advertising revenue from their printed product. It doesn't work.
As I mentioned above, I do not use AdBlock more aggressively than I deem necessary. I have no objection to advertising on web sites any more than I do in printed newspapers. I want to see good sites be financially viable so that they continue. However, when a site treats me poorly by bombarding me with high-bandwidth, unavoidable and attention-monopolizing ads, I either filter them or avoid the site entirely. I have yet to find any web site that is indispensable; there is always another that can serve just as well. That is, they need me more than I need them. Many sites have yet to realize this.
There are even some sites that refuse to serve content if their cookies are blocked. One example was Information Week until they, apparently, changed their minds. I often block cookies from sites that do not provide me with some value in return for tracking my usage of their properties. If I get that value, and they want to place a cookie, I let them do so. Many news sites value my viewing habits, but I will impede their receiving that value if I do not get something I value in return. Intrusive ads not only have no value to me; their value is definitely negative.
Labels:
Technology