Over at Why Now, we talked about the T-Mobile / Microsoft Sidekick data loss and veered off into Microsoft's technological decline since Bill Gates left in 2000. The speculation is that the changeover from a manager who was an actual technology person to one with no technology background (Steve Ballmer, who came up through the business office, not through engineering) meant that Microsoft's upper management lost the ability to actually evaluate technology proposals and projects and manage their development. The result is a series of products which are increasingly late, bloated, slow, and buggy -- if they manage to ship at all.
This is true of most big companies in the United States today. They're run by salesmen, cronies of the oligarchs who control half the wealth of the US, and salesmen are not by nature reflective souls and are chosen for loyalty, not intelligence. They arrogantly believe it is not necessary to understand the technical details of what they're selling in order to make proper judgements about its content and scope; all they have to do is sell, sell, sell, and it all works out in the end. The problem is that since they don't understand the technology -- and worse, have no desire to understand it -- they're ill-equipped to make critical decisions about product direction and feasibility. They fall prey to yes-men, fads, and scams, and pour company resources into directions that are not productive.

Furthermore, if it's not a product they can sell, they aren't interested in it. Pure R&D is not something they can sell, so they don't spend any money on it. This produces better quarterly profits for a while, but eventually the product line goes stale because it's just bigger/faster/smaller variants of the same old same old, not anything new and fundamentally different (see: Cisco Systems). And that's the state of the US economy today -- not competitive, because it's been starved of core R&D. What innovation is being done is being done by foreigners, or by leftover relics of the '80s, and even that is just a rehashing of concepts we had in the late '70s and early '80s. We have created nothing -- zero -- new in the past thirty years; all we've done is implement things we had already designed thirty years ago but needed time to make smaller/faster/cheaper before it was practical to build them.
Now I hear you say, "iPhone!" But the iPhone contains not a single concept we didn't already have in 1979. Hand an iPhone to a Bell Labs researcher from 1979 (Bell Labs being another of those things no longer around), and he'd say, "Huh, Unix in a telephone handset with an icon-driven touchscreen interface -- why'd it take you thirty years?" Not a single concept there would be new to him. Take him to the Google and YouTube web sites, and he'd say something like, "Interesting, looks like someone finally got Douglas Engelbart's NLS deployed, with a fuzzy content-crawling search algorithm similar to grep and find for locating content." There is nothing -- zero, nada, ixnay -- sold today that would surprise that Bell Labs researcher from 1979. He would be amazed at how small and fast processors and memories had become, but there are no fundamental architectural differences between the VAX-11/780 being sold in 1979 and the ARM processor inside an iPhone -- all that has happened is incremental improvement, making what took a couple of filing cabinets' worth of equipment in 1979 fit into a phone case. No new concepts.
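For what it's worth, the grep-and-find analogy is nearly literal. Here's a minimal sketch -- the directory and search term are invented for illustration, and this is the 1979 analogy, not anything like Google's actual ranking machinery -- of how that researcher might "search the web" if the web were just text files on disk:

```shell
# Pretend each "web page" is a text file under a crawl directory.
# (Hypothetical paths and content, purely for illustration.)
mkdir -p /tmp/crawl
echo "multics protection rings and gatekeepers" > /tmp/crawl/page1.txt
echo "unix in a telephone handset" > /tmp/crawl/page2.txt

# find walks the corpus; grep -l lists which "pages" match the query.
find /tmp/crawl -type f -name '*.txt' -exec grep -l 'unix' {} +
```

Of course, the thirty-years-later work went into crawling at scale and ranking results by relevance; this only finds, it doesn't rank.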
We've been feeding on seed corn these past thirty years, mining thirty-year-old ideas for things worth implementing on the incrementally improved hardware. And crap, we haven't even matched some of the accomplishments that were around thirty years ago. Neither Linux, MacOS, nor Windows includes the security features of Multics that made exploits virtually impossible, for example... a case where innovation has gone backwards since 1979. The problem with munching on seed corn is that eventually you run out of corn... and we're pretty much there now. Incremental hardware improvement is arriving at fundamental physical limits (the speed of light, the size of individual molecules), and we're running out of thirty-year-old ideas to implement on that hardware. In short, we're reaching the end of innovation, thanks to emphasizing selling over creating and manufacturing... and we all know what happens to civilizations that cease to innovate. They die.
- Badtux the Innovation Penguin
I agree of course with the stagnation-nation theory. The problem does seem to be a lack of R&D. The money sources seem to have dried up, which helps no one advance. Seen the computer projection screen and projection keyboard demos? All from something the size of a skinny hot dog. But wonder that it is, it won't save the industry. Education, innovation, financing and development -- the only things that will really save us. And all so underfunded as to be a farce.
Makes me wonder where the next whiz kids will be from: China, Mexico, Cuba, Czechoslovakia, who knows, who knows
w3ski
Just an addendum: the accounting crowd (just as tech-deficient as the sales/mktg. crowd) shares dominance in the corporate boardrooms and sr. mgmt. suites.
Neither acctg. nor mktg., whether in college courses or the workplace, prepares one to manage operations or tech development.
Gee, this is very reminiscent of something I read over at the Baseline Scenario not long ago.
Instead of reinvesting profits, we've spent, at the very least, the last decade reducing costs and improving productivity. These are good things, in the abstract, but the workers who accomplished these tasks got rewarded by having their jobs either eliminated or outsourced overseas.
All the money went into the pockets of the CEOs and their cronies, instead of the product.
Something that didn't get mentioned is the apparently dead concept of fiduciary responsibility. Corporate officers who enrich themselves at the expense of the companies they are running belong in fucking jail.
One quibble. Microsoft products have always been bloated, slow, buggy memory hogs. Can't lay that specifically on Ballmer. It seems to have been integral to Gates' business plan.
Cheers!
JzB the Microsoft hating trombonist
I think there's a lot of truth in your words. OTOH, I think part of the problem is more fundamental. I've been in R&D my whole career, and we seem to be meeting diminishing returns in many areas. In other words, R&D takes more money to make less progress than it used to. Executives see this on the budget sheet and put money to a more effective use (in the short term).
In the specific case of Microsoft, their major problem is that software is never consumed. As a result, consumers don't need a new operating system or word processor every couple of years. Microsoft has to try to sell them one anyway.
Jazz, in my opinion Microsoft got a bum rap in the mid-to-late '90s. From 1995 to 2000 they regularly produced good products that were technically superior to most of the other products on the market, and significantly less expensive than the alternatives that were technically superior. Look at Windows 95, for example. It made MacOS look like an obsolete toy; it made the lowest-end Unix workstations look bloated and outrageously expensive; in short, it was a brilliant exercise in producing the possible in a short amount of time for a modest price. Yes, you had blue screens of death from time to time with Windows 95... but AmigaOS had its Guru Meditations, MacOS had its grumpy faces, and so forth. I.e., it was no worse than any other competitor in the home computer market it was aimed at.
And Windows 2000 is, in my opinion, still the best OS that Microsoft ever released. It was fast, reliable, lean (running well even on limited hardware that Linux with KDE/X Windows was lethargically slow on) and the UI, while not as clean as an Apple design, did not have the arbitrary gee-whiz nonsense that came in with Windows XP that was good for nothing except slowing things down (a process taken even further to ridiculous extremes with Windows Vista).
But 2000 was pretty much the end of the road for Microsoft... since then, they haven't done much of anything except meaningless revs that add no real value.
w3ski: One problem is that after 2001 the immigration doors were slammed shut. That whiz kid from India, China, Mexico, Cuba, the Czech Republic, etc. used to end up here in Silicon Valley sooner or later. Look at Linus Torvalds, the creator of Linux -- he got lured here from his native Finland back in the late '90s, and has been here ever since. We've been cherry-picking the world's talent, and now that's stopped. That can't be helping the innovation deficit either... it takes a suitably large pool of bright people banging heads together to come up with innovations, and having people scattered throughout the world isn't as horrid as it would have been in pre-Internet times, but sometimes there's just no substitute for mass quantities of talent together in one place.
Anon, Ballmer's background is the accounting/business-manager crowd, not the sales crowd.
Jay, there are plenty of things that were implemented 30 years ago in mainframe operating systems that still do not exist in Windows. While operating systems are not consumed, Microsoft could at least bring those features down onto microcomputers now that the hardware is becoming powerful enough to handle them, but they've become incapable of handling such large projects. The small changes they've made internally have been useful, such as the security changes that came in with Vista, but they fall far short of the product road map laid out by Chairman Bill back in the late '90s. Microsoft's current management infrastructure simply isn't capable of such major engineering efforts anymore; there's nobody in charge who has the expertise to decide which projects are worthwhile and which are cr*p, or to evaluate the personnel and see which ones are the performers and which ones are the bullshitters. So things are late, bloated, and buggy, with features of absolutely no useful nature included just because one of the bullshitters talked one of the clueless managers into believing they were necessary. And so it goes.
I wonder how much of the problem is due to the changes in Microsoft's management, and how much is due to the increased complexity of their programs?
Of course, a lot of the complexity is not useful to most users. They add a bunch of useless features to convince buyers to upgrade. The extra features just make the useful stuff run slower. And lately their interfaces feel like playing a tile matching game, picking a little artsy icon and hoping it's the function you want.
Which is why I went Linux years ago, really.
Microsoft has a monopoly. What's there to change?
Besides short-term bottom-line thinking, another big driver of stupidity has been decreasing competition. Starting with Reagan's reign, bigger was supposedly better to help US business compete globally. After that, there was never a monopoly they didn't like. Or, at best, oligopoly, like cable and telcos....
However, in one sense, you're too gloomy. There have been game-changing breakthroughs, but these have mostly been in materials science. Nanotech, PCR, and new methods of genome sequencing result in products or medical advances that weren't envisioned by your archetypal Bell Labs guy. But, except for some "nano" cosmetics or some damn thing, those aren't exactly consumer products. Or they aren't yet.