The comments on this post demonstrate a general lack of understanding, so allow me to explain - at least about the dot com part.
Before I begin, some background. I used to work in the dot com world, before, up to, and very shortly after the bust. I was, among other things, my employer's unit coordinator for Y2K readiness. In case you forgot, or lived in a cave, or are under 9 years old: back before the year 2000, programmers lacking foresight wrote lots of software (and manufacturers built many microprocessors hard-coded) to a two-digit standard for years in dates. The problem was that when the year rolled over in 2000, those systems would assume the year was 1900. There was much panic, mostly unfounded, that malfunctioning computers would result in nuclear missiles firing themselves, airplanes and satellites falling out of the sky, every grid going down, dogs and cats living together, and the end of the world as we know it.
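For those who never saw the bug up close, here's a minimal sketch of it - modern Python, purely illustrative, not code from any actual system - along with "windowing," one common remediation technique (the pivot value of 70 below is an illustrative choice, not a universal standard):

```python
# The classic Y2K bug: store years as two digits and assume the century.

def full_year(two_digit_year: int) -> int:
    """The flawed pre-Y2K assumption: every two-digit year means 19xx."""
    return 1900 + two_digit_year

# An account opened in '95, checked in year "00" (really 2000):
opened, checked = full_year(95), full_year(0)
print(checked - opened)  # -95 years "elapsed" instead of 5

# "Windowing," a common fix: pivot two-digit years around a cutoff so
# values below it map to 20xx. (The pivot of 70 is illustrative only.)
def windowed_year(two_digit_year: int, pivot: int = 70) -> int:
    return (1900 if two_digit_year >= pivot else 2000) + two_digit_year

print(windowed_year(95), windowed_year(0))  # 1995 2000
```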
Unfounded as much of this was, there was a legitimate need by the industrial world and the information society to correct the problem. This led to a huge upturn in business, starting around 1998 or so, in the tech sector. Companies like Dell, HP, etc., were cranking out new machines loaded with Y2K-compliant software. That's what most consumers saw. But behind the scenes, industry was buying a ton of products and paying for a ton of services to ensure that things like robotic equipment, networking systems, telecommunications, medical equipment, and vast numbers of other devices were fully upgraded. In my workplace, we installed new operating systems on industrial tools used to manufacture memory chips because they would have malfunctioned otherwise. All of this created huge demand, so the tech sector ramped up production while prices climbed.
Many purchasers, at the personal computer level as well as at the level of massive industrial applications, saw this as a good time to invest in complete system upgrades rather than simply spending on more cost-effective patches. Why spend $500 on a new operating system and software when, for $900, you could buy a whole new system, complete with a faster processor, bigger hard drive, etc.?
You've heard the phrase "a rising tide lifts all boats." Well, it was true in the tech sector. The demand for products resulted in a corresponding demand for related goods and services. Sales of new computers meant sales of more memory chips - so my employer was rolling in profit. Similar situations occurred with manufacturers of modems, monitors, floppy disks and CD-ROMs, and application software. And companies that made these things all needed to expand their factories, resulting in huge profits for companies making things like assembly-line robots, photolithography equipment, silicon wafers, and the like. And this trickled into raw materials. Companies selling chemicals, industrial gases, even cardboard for boxes, made more money.
All this profit resulted in a stock surge. The NASDAQ doubled. Then it doubled again. Fast.
Along the way, investment fever spread, and many startups offering no real product beyond "internet solutions" could make millions on an IPO simply because investors believed the sector was so hot that any startup would eventually make money.
Then came the year 2000.
Y2K caused no serious glitches. There were some smaller ones - I had some systems go down temporarily - but the fixes came quickly and global destruction was averted. Sad survivalists came out of their bunkers realizing they had a five-year supply of canned corn and nothing to save it for. The real problems, though, were still a few months out.
As soon as industry realized no more massive system upgrades were necessary, orders stopped coming in and facilities that had been pumping out product had to shut down. On the home front, a larger than normal number of consumers had recently replaced their PCs outright, so demand for all the product already on the shelves dropped. This rippled through all the supporting sectors as orders for capital goods and industrial materials dried up, and some well-established companies suddenly faced financial hardship.
Worse for investors, all those recent startups that had huge IPOs with no real offering went under fast.
By the end of 2000, the NASDAQ had lost roughly half its value; by late 2002 it would be down roughly three-fourths from its peak. Many tech-sector workers - like me - found themselves laid off. About a trillion dollars' worth of virtual wealth - wealth people had measured in the trading prices of their stocks instead of in fungible, tangible wealth - disappeared. That rippled out to other sectors as many people who had been spending based on their virtual wealth suddenly found themselves out of cash.
Now, some people out there look back on the dot com bubble burst, and want to blame Bill Clinton. Hey, he was President, he must be to blame! Others want to blame the Republican Congress. Hey, they controlled the taxes and the spending, they must be to blame! Others want to blame corporations, while others want to blame too much government. Blame blame blame...
All hogwash. No one was to blame, because nothing blameworthy happened.
This was bad, but it was not a catastrophe. We simply had a market correction. The tech sector was growing, but what would normally have been a decade's worth of growth was crammed into about two and a half years, due to a unique short-term demand spike caused by the need to resolve Y2K issues. That gave us a short-term, unnatural high leading into 2000. Afterwards, as demand deflated, we dropped down - lower than we would have liked, for sure, but not below where we were just a few years prior to the growth spurt. Had the market been free to continue to adjust, it would have rebounded and dropped a few more times before reaching an equilibrium.
Unfortunately, something else came along to spoil things: a year later, when the market should have been recovering, terrorists attacked the United States, throwing the economy into chaos. But within a few more years, the tech sector did normalize - just in time for other sectors, like banking, housing, and automotive, to begin to tank. But that's another story...
So to those commenters out there still playing the blame game on the dot com bubble burst: dude, get over it already! As J.P. Morgan once said, markets tend to fluctuate. The worst thing we can do is use blame for fluctuations to justify government intrusion into the private markets.
programmers lacking foresight wrote lots of software
Oh, not so much, David. They were writing industry-standard code based upon the 8080. You may remember the line attributed to Bill Gates that 640k of RAM ought to be enough for anybody. That was the environment in the late 1970s and 1980s - when these constraints were firmly in place. It really wasn't that they lacked foresight; they simply lacked technological capability.
And it must be noted that the proggies worked rather well, given the limitations. As technology improved, we encountered a hitherto-unknown phenomenon: bloatware.
The first computer I purchased, back in 1980, boasted an unbelievable 128k of bank-switched RAM - a huge breakthrough in technology at the time. It also had two built-in floppy drives, each capable of holding as much as 360k of data. It cost me $3,000.
Coders had to deal with the limitations of the time, and every digit was precious technological real estate.
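To put rough numbers on that - the record counts below are illustrative assumptions, not figures from anywhere - here's a quick Python sketch of what dropping the "19" from stored dates bought on hardware of that era:

```python
# Back-of-the-envelope: savings from two-digit years (assumed numbers).

records = 100_000           # a modest master file on a 1980 system
dates_per_record = 3        # e.g., birth date, open date, last activity
bytes_saved_per_record = 2  # the "19" dropped from each stored date

saved = records * dates_per_record * bytes_saved_per_record
print(f"{saved / 1024:.0f} KB saved")  # ~586 KB

# Against a 360 KB floppy or 128 KB of RAM, that's more than a full
# disk's worth of capacity - which is why the shortcut made sense then.
```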
Posted by: Max | Thursday, 30 July 2009 at 12:59 PM
oh, there is no doubt that at the time it was the right thing to do - but the key words are "at the time." hence, my point about foresight. resources saved on two digits in 1980 did not, in my opinion, always make up for the expenditures necessary to reconfigure the virtual universe on short notice once people began to realize Y2K was a very real thing.
Posted by: Gullyborg | Thursday, 30 July 2009 at 01:33 PM
that was good .. do you have a post which explains the recent downfall of the economy?
Posted by: raj | Thursday, 13 August 2009 at 12:59 PM