May 11, 2010
The real reason why Steve Jobs hates Flash
It isn't as much about Steve Jobs and Flash as it is about prognostication. Where will computing be in 5 years? 10?
I always love prognostication in the IT industry. It's like weather prediction... you're lucky if you get some of it right, you hope people will later ignore the stuff you got wrong, and you definitely cringe at the developments you didn't see coming that created significant changes.
Charlie is certainly emphatic. He makes any number of flat-out statements in the post (check out the number of "all", "everybody", and other emphatic, foot-down words). Of course some of these are counted among my favorite IT statements:
The USA has some of the worst domestic broadband in the developed world, because it's delivered over cables that were installed early...
I have mentioned before how much I dislike the blanket statement that broadband service in the US is "the worst" in the developed world. I don't know where, but I know I've mentioned it. Let me recap... when they talk about the mah-velous performance in other countries, are they including the countryside away from the cities? How are they measuring? Is this speed available to everyone? Who pays, and how much does it cost? What is the size of the country compared to the size of the US? Okay, you get the idea. I never see real comparable statistics - just a flat statement. I want to see good impartial data with an apples-to-apples comparison (so to speak). This is always a statement proclaimed as if everyone "knows" it to be true - rather like global warming, they say "the science is settled" and no one questions it.
...the PC industry as we have known it for a third of a century is beginning to die.
PCs are becoming commodity items.
Interesting - that leap of logic. Does this mean all commodity items will disappear because they aren't profitable enough? I expect Target and Walmart will soon be out of business since they depend so heavily on selling all kinds of commodity items at a very low profit margin.
The trend he notes is exactly the trend that occurs with the development of any product. The initial prices are prohibitive, the devices are difficult to operate, and they are mainly used by enthusiasts. Eventually they become easier to operate, the quality improves, the price drops, and regular everyday people start using them if they are fun, useful, and/or make life easier.
Actually, margins have seldom been really high on any type of PC - when the prices were high, it was because the component prices were high. As component prices drop, so does the overall price, allowing more units to be sold. There is nothing new in this curve of development; it's basic economics. This in itself does not portend the imminent demise of any product.
If you look at commodity products historically, those with prices that don't drop are the ones that tend to disappear. Although I'm sure that's not true in every case (few things are always true or false).
Software will be delivered as a service to users wherever they are, via whatever device they're looking at...
Ah the mantra of the dedicated geek. Things will be sooooo much better when we get it delivered to us instead of fiddling about with it ourselves. Of course they usually mean things will be better when the peons get all services delivered. And any service they themselves don't particularly want to pay for or bother with is delivered for "free".
This geek utopia is a tricky place... there is lots of free software and internet access, but enough of a difficulty level to leave the geeks feeling superior to the peons of the computer world. No one ever says who develops and pays for all the free stuff - it's just there.
Let's go back a bit in history for a quickie overview. Computers began as big machines in dark cold rooms inside big businesses and universities that could afford them. After a bit, "terminals" were developed. Programmers/operators could use these to send commands to the computer (so much better than all those cards with holes punched in them!). These were the first "thin clients". They did nothing but have a screen - for showing typed commands and results - and a keyboard. All processing was done on the mainframe.
Then along comes the Personal Computer (for this post the term PC includes Windows, Mac, and Linux desktop/laptop machines). It gave everyone a chance to have access to all that processing power. Now you too could sit at home and do computations - just like they did at NASA to land men on the moon! Every computer everywhere was going to be small and on the desktop! Everything would be done via PC - it was the death of the mainframe server!!! (death I tell you!)
About the mid-90's that started to turn around. Many geeks were shocked to find that PCs (not even Apple's vaunted Mac) could not process data as fast as those behemoth mainframes. Besides, after the millions invested, companies were stubbornly clinging to those nasty old things and wouldn't get rid of them. (How 70's retro could they get!)
Thus blossomed the new idea... everyone would soon move to "thin clients". These would be exactly like the terminals of old. They would be a screen - now in living color! - and a keyboard and maybe a small drive that ran some little bit of software. But mainly they would be just a screen and you would be hooked up to a mainframe... everything would be done via server - it was the death of the PC!!! (death I tell you!)
Sadly for prognosticators all over the IT world, it soon became apparent that businesses and people were once again not going to cooperate! They were stubbornly clinging to their antiquated PCs. They liked their silly little bits of software on their desktop, not stuck out on a server (if you could even get such software on a server!). They liked the speed at which they could get things done when the software was right there. And thus the pendulum swung once again, although not quite so far this time. PCs were in vogue, and mainframes were accepted as a necessary evil to do all that boring computing in the background that keeps business rolling along (much like Mike Rowe's Dirty Jobs).
Now it's swinging again. We have iPads, smart phones, and other pad-type devices that will soon be available to browse the internet, pick up mail, and do some basic tasks. Does this spell the end of the PC? I have no idea. Does it sound exactly like earlier predictions of the demise of one type of computer or another? It certainly does.
Just because there is a new fad does not mean any particular type of device will disappear. "Everyone" will not be making the switch. There are many people who cannot afford the cost of a "smart computing device" - even if people with disposable income believe them to be cheap. There are other considerations that will come into play. I'll leave those for Part 2, as these other issues are of some concern - at least to me, if no one else.
For now - happy computing - either on your Mainframe, PC, or smart computing device of choice.
And on to Part 2. (unless you'd rather bang your head against a wall - I could see the appeal after all this geekiness)
I'm grateful for your post (grateful I tell you!).
Posted by: Rev. Paul at May 11, 2010 11:53 PM (VJui6)
I grew up with the Hollerith code... writing Assembler on IBM 360s and 370s and Amdahl 470 V6's. Gene Amdahl was a former IBM employee who went to Fujitsu for seed money, and built huge-ass water-cooled mainframes. Damn things looked like HAL. Back in the day, I can remember dropping a huge deck of punched cards and having to sort 'em. Double damn.
IBM's proprietary EBCDIC is a bitch to convert to ASCII, especially packed signed data. Unix and Linux became the norm for connectivity. They are inherently networking operating systems, and thus the most efficient at connecting to / serving many client computers. Look at EDI applications, for example.
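Yabu's EBCDIC and packed-signed-data gripe can be sketched in a few lines of Python - a minimal illustration only, assuming IBM code page 037 (real shops may use cp500, cp1047, etc.), and the function names here are my own, not any standard API:

```python
# Decoding EBCDIC text and IBM packed-decimal (COMP-3) fields in Python.
# Assumes code page 037 for the text; other EBCDIC variants differ.

def ebcdic_to_ascii(data: bytes) -> str:
    """Decode EBCDIC bytes (code page 037) into a Python string."""
    return data.decode("cp037")

def unpack_comp3(data: bytes) -> int:
    """Decode packed decimal: two BCD digits per byte, except the last
    byte, whose low nibble is the sign (0xC/0xF positive, 0xD negative)."""
    digits = []
    for byte in data[:-1]:
        digits.append(byte >> 4)
        digits.append(byte & 0x0F)
    digits.append(data[-1] >> 4)          # last byte: one digit...
    sign_nibble = data[-1] & 0x0F         # ...then the sign nibble
    value = int("".join(str(d) for d in digits))
    return -value if sign_nibble == 0x0D else value

# 'HELLO' in EBCDIC cp037 is C8 C5 D3 D3 D6:
print(ebcdic_to_ascii(bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6])))  # HELLO
# -12345 packed into three bytes, 12 34 5D:
print(unpack_comp3(bytes([0x12, 0x34, 0x5D])))  # -12345
```

The text conversion is the easy half; the packed fields are what bite, because a byte-for-byte code-page translation will silently mangle them.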
Bottom line... Big Iron IBM mainframes are history unless they're running Linux... AIX didn't make the cut. Windows' proprietary registry is not really efficient or easy to maintain. *NIX configuration is done with text files... works the same way across the board. Easy to enable one machine to talk to another. This is key. Linux / Unix will prevail.
Great post...sorry to get a little off topic, but I think we've had a similar conversation before. I was probably drunk at the Blade's....but, I can still add and subtract in hexadecimal, and plot rocket trajectories in my head. I'll take credit for a good guess.
Have a good day, my friend.
Posted by: Yabu at May 12, 2010 07:43 AM (VxNeS)