kublikhan wrote: Parents were asking themselves: "Why spend $200-$300 on a game console when I can buy a personal computer for the same price that can play games and do so much more?"
But that's just it. They weren't the ones using them. They were buying them for their kids. Sure, there were lots of kids, enough to drive impressive sales figures, but it was still dominated by that one demographic.
Another factor is the shift from 8- to 16/32-bit computing. That shift carried a large price premium. By the early 80s, 8-bit CPUs and 64K of RAM had been fully commoditized, but Motorola 68K and Intel x86 chips were still new. The new machines also required more and more storage, and early hard drives were very expensive. Consequently, 8-bit computers that ran on carts and floppies (like the C=64) hung on in the mid-80s while the IBM PC started to colonize corporate America. By and large, nerds did NOT go for the early PCs because they were simultaneously more expensive and had crappy graphics and sound. Early Macs were black-and-white and really oriented around desktop publishing. Macs and Amigas were priced beyond the reach of most computer nerds. The Atari ST was an also-ran.
By the time we get to eMachines, ISA (remember manually assigning interrupts?) had largely given way to plug-and-play, and we had Windows 95/98, which was on par with MacOS, along with 24-bit graphics cards. The time was right for the PC to trickle down into the mainstream. You also had the translucent iMacs around that time, which were very popular because they were appliance-like.
If you were to make an analogy to EVs, think of the battery shifts from lead-acid to nickel and from nickel to lithium. Things take a dip during these transitions.
So the simple S-curve is an oversimplification in some cases.
With EVs you also have a format war of sorts (similar to VHS vs. Beta) with charging networks, which causes market confusion. Commoditization really cries out for standardization, and we don't have it yet. Also, each car company is inventing its own autonomy suite; they're all reinventing the wheel. Think of the early days of PC graphics cards. Today there are only two R&D firms left, nVidia and ATI. If there's only a certain maximum amount of R&D money to spend, dividing it thinly across many companies seems wasteful. It's OK during the formative stages because it fosters experimentation, but it has to eventually shake itself out. You see this in so many areas: e-commerce before Amazon rose to the top, social networks with MySpace and then Facebook, search with Yahoo and then Google.
When there isn't enough competition, companies make mistakes and push out products that don't really serve the market well. If consumers don't buy in, that sends mixed signals about the whole class of goods. Take, for instance, the shitty appearance of the Leaf. Nissan can have all the motivation in the world to sell EVs, but if they can't design their way out of a paper bag, then analysts who look at their sales figures may just say "nobody wants EVs". No. Nobody wants Nissan's EVs. The whole idea of a capitalist system is to foster healthy competition. Build a better mousetrap and you may see what seemed to be a failed market explode overnight. Smart businesspeople move into a market when conventional wisdom falsely assumes there's no potential, because they realize it's just been poorly served by the incumbents.
Anyway, I am not expecting the doomers here to be well versed in entrepreneurialism. For them, it's all about spinning collapse scenarios while the world just ignores them and goes its own way.