There has been some ... interesting news from the tech sector this week.
Firstly, the Apple vs. Adobe vendetta gets even nastier, with a public letter from Steve Jobs explaining why Adobe's Flash multimedia format will not ever be allowed into the garden of pure ideology that is the iPhone/iPad fork of OSX.
Secondly, Hewlett-Packard are buying Palm, apparently for Palm's WebOS — with rumours of plans to deploy a range of WebOS tablets to rival the iPad — at the same time, they're killing their forthcoming Windows 7 slate, just as Microsoft are killing the Courier tablet project.
Finally, gizmodo (not, perhaps, an unbiased source in this regard given current events) have a fun essay discussing Apple's Worldwide Loyalty Team, the internal unit tasked with hunting down and stopping leaks.
It's probably no exaggeration to say that Apple's draconian security policies are among the tightest of any company operating purely in the private sector, with a focus on secrecy that rivals that of military contractors. But even so, the control freak obsessiveness which Steve Jobs is bringing to bear on the iPad — and the desperate flailing around evident among Apple's competitors — bears some examination. What's going on?
I've got a theory, and it's this: Steve Jobs believes he's gambling Apple's future — the future of a corporation with a market cap well over US $200Bn — on an all-or-nothing push into a new market. HP have woken up and smelled the forest fire, two or three years late; Microsoft are mired in a tar pit, unable to grasp that the inferno heading towards them is going to burn down the entire ecosystem in which they exist. There is the smell of panic in the air, and here's why ...
We have known since the mid-1990s that the internet was the future of computing. With increasing bandwidth, data doesn't need to be trapped in the hard drives of our desktop computers: data and interaction can follow us out into the world we live in. Modem uptake drove dot-com 1.0; broadband uptake drove dot-com 2.0. Now everyone is anticipating what you might call dot-com 3.0, driven by a combination of 4G mobile telephony (LTE or WiMax, depending on which horse you back) and wifi everywhere. Wifi and 4G protocols will shortly be delivering 50-150mbps to whatever gizmo is in your pocket, over the air. (3G is already good for 6mbps, which is where broadband was around the turn of the millennium. And there are ISPs in Tokyo who are already selling home broadband delivered via WiMax. It's about as fast as my cable modem connection was in 2005.)
A lot has been said about how expensive it is to boost the speed of fibre networks. The USA has some of the worst domestic broadband in the developed world, because it's delivered over cables that were installed early — premature infrastructure may give your economy a leg up in the early years, but handicaps you down the line — but a shift to high-bandwidth wireless will make up the gap, assuming the frequencies are available (see also: shutting down analog TV and radio to make room). It's easier to lay a single fat fibre to a radio transciever station than it is to lay lots of thin fibres to everybody's front door, after all.
Anyway, here's Steve Jobs' strategic dilemma in a nutshell: the PC industry as we have known it for a third of a century is beginning to die.
PCs are becoming commodity items. The price of PCs and laptops is falling by about 50% per decade in real terms, despite performance simultaneously rising in real terms. The profit margin on a typical netbook or desktop PC is under 10%. Apple has so far survived this collapse in profitability by aiming at the premium end of the market — if they were an auto manufacturer, they'd be Mercedes, BMW, Porsche and Jaguar rolled into one. But nevertheless, the underlying prices are dropping. Moreover, the PC revolution has saturated the market at any accessible price point. That is, anyone who needs and can afford a PC has now got one. Elsewhere, in the developing world, the market is still growing — but it's at the bottom end of the price pyramid, with margins squeezed down to nothing.
At the same time, wireless broadband is coming. As it does so, organizations and users will increasingly move their data out into the cloud (read: onto hordes of servers racked up high in anonymous data warehouses, owned and maintained by some large corporation like Google). Software will be delivered as a service to users wherever they are, via whatever device they're looking at — their phone, laptop, tablet, the TV, a direct brain implant, whatever. (Why is this? Well, it's what everyone believes — everyone in the industry, anyway. Because it offers a way to continue to make money, by selling software as a service, despite the cost of the hardware exponentially dropping towards zero. And, oh, it lets you outsource a lot of annoying shitty admin tasks like disk management, backup, anti-virus, and so on.)
My take on the iPhone OS, and the iPad, isn't just that they're the start of a whole new range of Apple computers that have a user interface as radically different from their predecessors as the original Macintosh was from previous command-line PCs. Rather, they're a hugely ambitious attempt to keep Apple relevant to the future of computing, once Moore's law tapers off and the personal computer industry craters and turns into a profitability wasteland.
The App Store and the iTunes Store have taught Steve Jobs that ownership of the sales channel is vital. Even if he's reduced to giving the machines away, as long as he can charge rent for access to data (or apps) he's got a business model. He can also maintain quality (whatever that is), exclude malware, and beat off rivals. A well-cultivated app store is actually a customer draw. It's also a powerful tool for promoting the operating system the apps run on. Operating system, hardware platform, and apps define an ecosystem.
Apple are trying desperately to force the growth of a new ecosystem — one that rivals the 26-year-old Macintosh environment — to maturity in five years flat. That's the time scale in which they expect the cloud computing revolution to flatten the existing PC industry. Unless they can turn themselves into an entirely different kind of corporation by 2015 Apple is doomed to the same irrelevance as the rest of the PC industry — interchangable suppliers of commodity equipment assembled on a shoestring budget with negligable profit.
Signs of the Macpocalypse abound. This year, for the first time, the Apple Design Awards at WWDC'10 are only open to iPhone and iPad apps. Mac apps need not apply; they don't contribute to Apple's new walled garden ecosystem.
Any threat to the growth of the app store software platform is going to be resisted, vigorously, at this stage. Steve Jobs undoubtedly believes what he (or an assistant) wrote in his thoughts on flash: "Flash is a cross platform development tool. It is not Adobe's goal to help developers write the best iPhone, iPod and iPad apps. It is their goal to help developers write cross platform apps." And he really does not want cross-platform apps that might divert attention and energy away from his application ecosystem. The long term goal is to support the long-term migration of Apple from being a hardware company with a software arm into being a cloud computing company with a hardware subsidiary — almost like Google, if you squint at the Google Nexus One in the right light. The alternative is to join the PC industry in a long death spiral into irrelevance.
Let's peer five years into the future ...
LTE will be here. WiMax will be here. We will be seeing pocket 4G routers similar to the MiFi but featuring 50-100mbps internet connectivity. (Meanwhile, fibre-in-the-ground speeds will be mostly topped out at 50-100mbps, except where new construction in high-value areas has permitted the installation of gigabit and faster links.) Internet access will be increasingly mobile. Phone screens are okay, but a 7-8cm diagonal screen is too small for anyone aged over 40-45 to be comfortable squinting at: hence the market for larger pads/tablets.
The availability of 50+mbps data everywhere means that you don't need to keep your data on a local hard drive; it can live on a server elsewhere, streamed to your pad as you need it.
Apple is known to be investing heavily in data centres suitable for cloud hosting. There are persistent rumours that "iTunes 10" will be some kind of cloud service, slurping up your music and video library and streaming it out to whatever device you've registered with Apple. There's MobileMe for email, and iWork.com for office documents. There will be more — much more.
The iPad by 2015 will have evolved. There will be smaller models with 7"/18cm screens, and larger desktop models. Most importantly they'll be using newer processors, either a descendant of today's Atom CPU (remember, Apple's demand that developers only use Apple's compiler toolchain mean that Apple can shift the app store to a new CPU architecture quite easily) or Cortex A9 class ARM cores — dual core, 2GHz, and up, vastly faster than the current machine ... or vastly more energy-efficient at the same performance level. But where it will really shine — the value proposition that will keep punters forking over huge gobbets of steaming money, in the midst of a PC industry that's cratering — will be the external benefits of joining the Apple ecosystem.
If you're using an iPad in 2015, my bet is that you won't bother to have home broadband; you'll just have data on demand wherever you are. You won't bother yourself about backups, because your data is stored in Apple's cloud. You won't need to bother about software updates because all that stuff will simply happen automatically in the background, without any fuss: nor will worms or viruses or malware be allowed. You will, of course, pay a lot more for the experience than your netbook-toting hardcore microsofties — but you won't have to worry about your antivirus software breaking your computer, either. Because you won't have a "computer" in the current sense of the word. You'll just be surrounded by a swarm of devices that give you access to your data whenever and however you need it.
This is why there's a stench of panic hanging over silicon valley. this is why Apple have turned into paranoid security Nazis, why HP have just ditched Microsoft from a forthcoming major platform and splurged a billion-plus on buying up a near-failure; it's why everyone is terrified of Google:
The PC revolution is almost coming to an end, and everyone's trying to work out a strategy for surviving the aftermath.
I believe it when I see it. Not so much that your analysis is wrong, but that the industry is right if it's gambling on cloud computing; reminds me way too much about how diskless net computers were going to be the future of computing.
Personally, I don't even trust Itunes enough to let it manage my music library, let alone I trust it to keep important data and software safe.
All very sensible, but the main counter argument to part of this is high contention ratios with wireless broadband. You still need a lot of fibre in the ground to enable all this capacity in dense usage areas. There's a good blog covering this sort of area by a guy called Dean Bubley. John
"...nor will worms or viruses or malware be allowed. You will, of course, pay a lot more for the experience than your netbook-toting hardcore microsofties — but you won't have to worry about your antivirus software breaking your computer, either."
You really think even these restricted devices will have absolutely no bugs that can be exploited to gain unauthorised access ?
You have a lot of faith in the OS and its sandboxing abilities if so.
Interesting. I'd be curious to hear what you think the ecosystem is going to be like outside of the Apple garden
Four words -- wideband radio jamming vandals. If the world is indeed moving towards a wireless cloud connection for data and computing then any spotty fourteen-year-old with a bit of technical acumen (or some downloaded wiring diagrams a la skript kiddie) and a soldering iron can destroy such an environment locally for the fun of it, for the bragging rights or just because. I'm really amazed that no-one at any of the Black Hat conferences has not already perpetrated a hardware widespread 100% wireless DoS on the attendees. It's happened in the past, ask any old radio ham in Europe about Woodpecker, the Soviet-era nuclear-powered OTH radar based next to the Chernobyl reactor complex in the Ukraine.
Basically wireless communication is fragile in so many ways that I can't see anyone, especially in sensitive businesses such as the financial or government sectors using it to any great extent. The promiscuous insecurity of wireless is a side-effect which will be addressed hopefully in a more complete manner than the frantic WEP/WPA update scramble in the 802.x implementations as the security key systems fell to increasingly more powerful computing resources as time went on.
Food for thought. One minor note: the paragraph that begins "Apple are trying desperately to force the growth of a new ecosystem" spells "Unless" "Wnless".
Athan: You really think even these restricted devices will have absolutely no bugs that can be exploited to gain unauthorised access?
Nope. (I have perl and vim on my iPhone. How about you?)
What I do believe is that code signing plus remote deactivation of apps on an always-on wireless device will make it a lot harder for malware authors to get their teeth into a platform; phishing and XSS are more plausible attacks (i.e. they deliver the goods -- stolen credit card/identity credentials -- for a lot less effort).
And I think many members of the general public are sick to the back teeth of being made to leap through flaming hoops to keep their PCs running, and will gladly pay the price for anything that offers a zero-admin experience.
In one word: Android. Google are positioning themselves to be the Microsoft to Apple's Apple.
I'd pencil in HP/Palm as a possible dark horse contender -- Palm was in a hail mary situation previously, but with HP's muscle behind them they could be a player, if HP is willing to bet the farm on their new subsidiary (and more importantly, license WebOS to potential competitors, the better to go up against Android).
Robert: broadband jamming/vandalism is entirely possible, but the sort of script kiddie type who might do it would also be cutting off their own bittorrent feed at the same time. Now, as a weapon in asymmetric warfare it makes a lot of sense ...
You were doing so well and then you said;
"If you're using an iPad in 2015, my bet is that you won't bother to have home broadband; you'll just have data on demand wherever you are. "
Are you sure about that? Really sure? Personally I would have said 2025, if your lucky. Have you seen the coverage grids? Have you experienced how rubbish the coverage is even within these areas? I live just on the edge of my phones 3G network. I can't get a normal phone signal most of the time, let alone the 3G bit. I have to rely on my home network, which is fine except I have to use email rather than texting.
Ever been on public transport and tried to use a phone? Its a foul experience whenever I've tried.
We would be better off if everyone just ran an open wireless network for people to piggy back from.
Carlton, what country are you in? If you're in the USA, of course your coverage is shit; the USA has been systematically under-investing in public infrastructure for decades.
Here in the UK, things are somewhat better. In other countries, such as Japan (where I happened to be last week) "home broadband" means a choice of 100mbps or gigabit ethernet, and they don't even have GSM phones -- they leapfrogged that generation completely.
"The App Store and the iTunes Store" and the Apple Store... the bricks and mortar (and glass and steel) buildings that are more profitable per square meter than Tiffany's :-)
The radio jamming vandalism would be done remotely, in a city centre somewhere or next to an office building, or in the rich people's suburb not in the oik's back garden (although some other oik might well deploy their own jammer there). A deniable bit of circuitry built on an open-source PCB dunked in a bath of isoprop before deployment to eliminate DNA traces, battery-powered and hidden somewhere public difficult to spot or locate. It fires up on a random-ish manner, frequency-sweeps and either emits white noise on the various 802.x channels or in the more sophisticated version it sends plausible traffic to any nearby routers, emulating a SYN-flood attack tying up the control channels.
Imagine the plot of "Unwirer" reworked for vandalism as the intent and you're pretty close to the situation coming down the pike. What would be glorious would be for someone to carry out such an attack at an Apple product launch...
I don't believe in this vision of the future, but if it happens then our hypothetical spotty kids with soldering irons will find they get rather more publicity that just a paper at a Black Hat conference. Unfortunately for them, they will also get rather more dead, eliminated by autonomous micro-UAVs which search out and destroy such terrorist threats to our critical infrastructure. "Sandflies", we will call them.
Charlie, I agree with your thesis, but still have a few nits to pick:
Steve Jobs' learned the value of controlling the sales channel *much* earlier than the App Store. Being frozen out of existing retail channels during his first run at Apple, plus the success he had with the Apple Stores in his second run, probably cemented that lesson in place very firmly.
Apple will probably never be forced to give it's devices away for free, given that the brand has already been transformed into one that can charge a fashion premium. Handbags and luggage are a complete commodity, but Coach and Zero Haliburton (among others) still rake in the $$$. An interesting question is whether a hardware-as-fashion-item world would remain unipolar. I suspect Sony would do rather well.
Free Software/Open Source never did kill off proprietary software, or take over the desktop (or if it does, it will be just as the desktop becomes irrelevant), but it did provide a vital counterweight to Microsoft's hegemony. Similarly, the growing movement around open data and decentralized federated systems are an important counterweight to the emerging world of centralized web services from Facebook, Google, and Apple. We can already see signs that many users are merely trading the onerous tasks of doing system management badly for the onerous task of doing privacy management badly. Apple hasn't really jumped onto the whole 'sharing' bandwagon, so it is too early to say what their take on it will be like, but better and easier system security didn't really protect them from Microsoft, so I don't necessarily think that they are going to have much of an advantage in the world of cloud apps (other than the head start of the App Store), and meanwhile the AGPL/Open Data/Open Web crowd is growing a *lot* faster than the GNU crowd did in its early days.
We certainly live in interesting times.
Charlie, how do you see this shift affecting corporate IT? The buzz has been there before about thin clients (a screen plugged into an Ethernet router, wired to a central server, basically) but it's never really come to pass - most companies still have hundreds or thousands of Windows boxen scattered across their physical offices, hooked up to common email servers (and sometimes shared drives, Sharepoint etc).
What does the tablet n' bandwidth revolution mean for corporations with 5-10 year adoption cycles? Especially when the number one activity carried out on yer average business computer is a heck of a lot of typing?
My take on it is that Mr. Jobs is preparing for a different extinction: His own. The key phrase to my mind was at the beginning, "Apple went through its near death experience". He doesn't state "While I was away" but it's pretty plain that's what he means.
I really think he's more afraid of his own successor at this point than he is of Google, Microsoft, or Adobe. He's seen what happened to Apple while he was away, he's seen what happened to Microsoft when BillG stepped back, and he's setting his company on a course that a dozen Gil Amelios can't undo. With the iPod/iPad/iPhone line, he's given them a pretty clear course to take for a prosperous future path: post-Jobs Apple can fumble along for long enough along those lines to survive until they find a new Visionary Leader.
(Of course, a little voice in my head is whispering, "Well, yeah. It would be naive and quaint to just take the guy at his word.")
What the heck? Charlie, I don't know what went wrong with the Google sign in (case in point about doing privacy and identity management badly?), but I'm Michael Bernstein.
Hi, Michael! (Yes, the google sign-in doesn't seem to be working too well with Movable Type here. Must look into it,when I've got time ...)
Yes, multiple good and valid points. Firstly, the big political hot-potato of the 21st century: privacy in a networked world. (Did I mention why, despite having an account, I don't use Facebook or trust it with my personal details?) Trading off privacy management against convenience is going to be a big story over the next decade.
Corporate IT ... corporate IT has been a generation behind the cutting edge of IT since the 1950s: too many careers are invested in doing things the traditional way. On the other hand, corporations take internal privacy management seriously in a way that individual users don't -- if nothing else, data breaches can result in big fines and loss of competitive advantage. And on the third hand, will corporate IT as it is currently understood still even exist in twenty years' time? It's structures reflect current business management doctrine; if the structure of corporations change, then their IT needs will follow, and that could go anywhere.
As for John Murphy's point: Steve Jobs is in his fifties, a cancer survivor, and a liver transplant recipient. That's pretty heavy, and although he can clearly afford the best medical care that money can buy, it will reduce his life expectancy relative to a healthy adult of the same age. He's probably got 5-10 working years left, but possibly a lot less. As Dr Johnson said, "nothing concentrates the mind like the knowledge that one is to be hanged in a week" -- I suspect Jobs sees Apple as both his life's work, and as something that can be snatched away from him at a couple of months' notice. So he's not merely a Visionary Leader -- he's a driven one.
portable cell phone signal jammer - 30 bucks
http://www.dealextreme.com/details.dx/sku.24230
ruining apples future - priceless
I believe, that in the future people are going to buy a device, and thats it.
Even today more and more people don't want to spend extra money for services and software, especially with good and free alternatives.
Take a look at Android. It integrates almost perfect with Google Services - for free. You don't have to spend any money to get backups of your device's data because the data is in the google cloud.
Google earns money through a second channel - ads.
All those free services and software are just a tool to get as much users as possible and to serve them ads.
Apple does not have anything similar. They have expensive devices, and services for which you have to pay. I am sure itunes makes a lot of money, but there are already so much legal alternatives (e.g. just create a playlist on youtube for your music needs).
With the introduction of the Iphone Apple has got a head start in this whole mobile internet future, but that is melting away rather fast.
Interesting. My next door neighbour is dedicated to investing into old school PC architecture - one computer each for the wife and himself, as powerful as their income can afford, regularly upgraded - and believes that things will largely remain the same in the foreseeable future since his main aim in using computers is to play rather demanding MMORPGs and 3D action shooters, as he believes to be the case with a lot of other people.
My household's gone the other route, more-or-less following the vision described in this post, fracturing our household computing system into half a dozen or so smaller devices (with a grand total of 1 DVD reader among them) tied into a local cloud and hooked further into the outside through a VPN and a static IP pointing to my own app server hosting a bunch of stuff, including a SVN server, file storage, etc. I like and often need having work materials available wherever I am, and while I do back the occasional document up to Google (my choice of horse to back in the coming years) several hardware and service crashes taught me NOT to trust external parties to take good care of my data... and that's without considering the privacy angle.
While a lot of people will, doubtless, buy into Apple's walled garden for the convenience of it, and many will go with Google for a slightly more open playground, rolling your own household cloud is something that will likely become ever easier (and it's not hard now), especially with increasing broadband speeds connecting the actual households to the world at large.
This is why the Ubuntu One Music Store does precisely that already, to blow our own trumpet a bit :-)
You've made a point that many completely miss when discussing Apple's insistance that it own the developer tools: It gives them the ability to move, fast, to any any architecture without having to rely on third parties.
As I'm sure you remember, it suffered - badly - from this when it made the move from 680x0 to PowerPC, when the only IDE capable of building PowerPC code was Metrowerks CodeWarrior. Apple was, essentially, totally reliant on Metrowerks to support any new gizmos in Mac OS - and, despite both company's best efforts, there was always a lag (or worse, bugs) in the tools as a new release of Mac OS happened.
That, I think is what Jobs is referring to when he says "we know from painful experience that letting a third party layer of software come between the platform and the developer... hinders the enhancement and progress of the platform."
Martin: Missed the iAd announcement, did we?
In this brave new world, where do you envisage yourself writing essays like this one? Until there is an improvement over the keyboard (whether physical or virtual), information work will feel much the same as it has for the last thirty years. Sitting in front of a screen, hammering away.
(Charlie - it's John (Gordon) Faughnan - notes.kateva.org)
Terrific essay. You made only one boo-boo -- but it strenghthens your argument. I go at it in a f/u post [1], but the bottom line is that PCs are not cheap at all. Where I live only a minority of the relatively wealthy population can keep a winbox running, and they're dependent on costly ISP services and friend/family tech support that less wealthy people don't have. Forget details like backup (not cheap). Don't mention networks, home servers, etc.
The new generation devices are far less costly to own and operate. The iPad's capped pay-go $15/280 data plan is subversive. I can only imagine AT&T's screaming as Apple squeezed on that one.
[1]http://notes.kateva.org/2010/04/stross-on-post-pc-world-mostly-right.html
The problem with this analysis is that if it was 100% correct, Steve Jobs would be paying a lot more attention to making MobileMe a service that people actually wanted to use -- Apple is falling way behind the curve in cloud computing, which is what you think they think is where their future lies. But if MobileMe is getting a drastic revamp, they're keeping the secret very well.
Yeah, yeah, Chicken Little. We've heard the sky is falling for 10+ years now. You can have my pc when you pry it out of my cold, dead fingers.
No new insight... move along.
Charlie wrote: "remote deactivation of apps"
Yes, and that's just the sort of thing that will keep me and many others happily outside of Apple's glass-walled prison-garden. It certainly looks pretty in there, but I like my freedom, thank you.
You talk about users trading freedom for security in computing. We do too much of this in the real world, and too much of that "security" is illusory. I don't think computer users are clamoring to trade in their freedom for security (there is a bigger market for Windows anti-virus software than for all Mac software combined), and I don't see Apple becoming a dominant force in computing via this model any time soon. It will, IMO, remain a niche provider of peripherals and luxury hardware and software.
PCs are already sold at and below material cost. People still buy overpriced Macs that use identical PC hardware and run the same OS you could run on a Hackintosh. Macs crash. Macs get viruses, too. Mac users run Windows-native software like Firefox that exposes them to security threats. The iPad at launch can't even properly open/edit documents created on a Mac. The Appleverse is no longer a vacuum-sealed utopia. It's an increasingly messy, glitchy, dangerous place, just like every other computing platform.
But users don't buy the hardware or the OS or the apps--they buy the Apple Experience (TM). Why would that change?
The majority of the world runs on Windows, and the internet runs on *nix systems. Selling iPads to e.g. yuppie WASPs in the US is merely a drop in the global computing bucket. Governments and militaries will not switch to Apple (although I hear there's a neat sniper app on the iPhone). Businesses with complex custom software needs (e.g. finance) will not switch to Apple. Educational institutions will not switch to Apple (except in isolated digital arts departments). More importantly, the developing world, rife with pirated or otherwise unlicensed usage of software, most certainly will not switch to Apple.
Apple is trying to position itself as the gatekeeper of content--as are many other big tech companies, including Microsoft--but it is doing so by trying to lock in consumers via hardware. And that business model is demonstrably unfeasible. Software is trending toward platform-agnosticism, thanks in large part to web apps and web-enabled apps. Apple is still trying to push platform-monotheism.
The fact remains that no one has successfully squeezed content into a proprietary mold. Every time they do (Windows Media, Quicktime, DVD security, etc.), we find a way to transcode and liberate it. Apple is savvy to this: it tries to retain control over content by positioning itself as the foremost provider of content and facilitator for content consumption. And permitting Flash on its devices would thwart this effort. Apple would no longer be the omnipotent gatekeeper of content and services on its own devices.
Steve Jobs "hates" Flash because it threatens Apple's position as a gatekeeper of content. Flash applications, games, and media players can deliver content and services to users without passing through the Apple tollbooth.
If Apple is going to maintain a foothold in technology--and I'm sure it will--it won't be some revolution spurred by the iPad and iWork and other half-baked, gloss-coated junk that means nothing to users outside the Apple Reality Distortion Field.
Steve needs to stop forcing Apple into the role of content's toll-taker, and instead focus on providing myriad means of access to content. More gadgets, less overpriced and redundant reinvention of the wheel. More convenience, less restriction. More ways of accessing, creating, and working with content, less single-minded focus on consumption. More expansiveness and pervasiveness, less sulky exclusivity.
I think Apple has a great opportunity to lead a portable computing revolution, but its pompous disregard for the way that people use and want to use computers and data, in favor of pushing a tyrannically myopic and insular view of computing, is its own doom.
Ubiquitous high-bandwidth connectivity has an interesting consequence. Why should Apple (or anyone) spend truck loads of money running tens of thousands of cloud servers in exceedingly expensive datacenters, when they control access to millions of computing devices which they don't even have to pay the power and cooling bills for?
Matthew Seaman: Why should Apple (or anyone) spend truck loads of money running tens of thousands of cloud servers in exceedingly expensive datacenters, when they control access to millions of computing devices which they don't even have to pay the power and cooling bills for?
See also "Halting State" (written in 2005, thank you very much).
If Apple's plan is to become a cloud computing player, they have a lot of work to do. I seem to remember that the launch of MobileMe in particular was a disaster and outages have been common. Furthermore, the ipad and iphone are still tethered devices -- they require a Mac or PC desktop with iTunes to function. Syncing over USB is not very Web 3.0.
I can definitely see itunes 10 and a future iphone OS removing this requirement, but the launch better go perfectly or Apple is going to have a lot of trouble establishing themselves as a cloud player.
If the future is the cloud, then everyone's worst nightmare is the cloud going down -- how long has it been since the last time twitter went down because gmail went down? And if Apple has a reputation as not doing that stuff well, it may not matter how Shiny their products are.
The one hole in your reasoning, which is otherwise quite sound, is that apple still requires a PC to use an iPad fully. You cannot install apps from the cloud directly - you *must* use iTunes software running on a PC. So long as the iPad/iPhone are force tethered to a PC in this way, then I dont see how their model can fully succeed.
also, some tasks are going to require a PC going forward regardless. These are mostly professional level activities such as software development and media and content creation that the cloud just cannot handle (maybe 10 years from now, but I really dont see how your going to get avid protools to run in the cloud).
Another factor is corporations. There are lots of reasons companies cannot allow anyone else to manage their data. In some industries there are actually statues that prevent you from outsourcing your data storage. Until that changes, the private server room (or perhaps private clouds) will still need to exist.
Charlie@10:
I'm in the UK. Coverage sucks. Not to mention the (semi) secret bandwidth throttling that happens.
You have just come back from a month in Japan where, as ever seemingly, the tech is just so much better than here.
We have a proposed tax on broadband to pay for people in the real sticks to get broadband, but nothing which will enforce the roll out of decent networks and coverage for the phone networks. I assume there is massive interest still being paid for the last big auction of bandwidth.
There's something important that too few people realise about corporate IT: it has not changed in any significant way in the past ten years, despite the best efforts of a great deal of very rich companies, with Microsoft at the top of that list. A lot of people want to change corporate IT in a way that makes them the sole holder of a "killer app". None have succeeded. Corporate IT today is doing the same damn things it was doing in 2000, with a different desktop theme.
Notably, and I have warned off a great many small UK companies who wanted to use Google docs: data protection legislation, in all of the EU, prohibits the export of protected data to countries which do not have data protection legislation. The US does not have it, and refuses to pass any such legislation, because corporate data theft is a major industry there. It is actively and severely illegal for any EU business to store any form of information about their customers, such as names and telephone numbers, on a system which might be physically located in the US. That's pretty much every document they process.
The Apple Loyalty Team article predates the confiscation of Gizmodo's computers.
Having worked at Intel, I doubt that Apple's security policies, if in fact accurately reported, are tighter.
Honestly, I'm not nearly as impressed with cloud computing for non-personal uses as the industry. I think the practice will last until either some disaster makes critical business or governmental information unavailable when necessary, or until there is some major security breach in one of the clouds. Perhaps that has already occurred at Google China.
Croak!
Not quite; you still (I think) have to connect it to your computer the *first* time you use it (not sure why :), but after that, you can do everything on the ipad (or ipod/iphone, BTW).
I would tend to agree with the main points and general direction of the arguments. Maybe this is the real reason why we can't watch Flash video on our iPhones (though the last time I looked Apple didn't own Nintendo and I can't watch Flash on my Wii either... is there a global conspiracy by consumer electronics device manufacturers to block end-users playing Flash video?!). One thing I'm sure about is there is still plenty of room for improvement in my mobile user experience, and this is needed before we get anywhere near the vision of the future described here.... we are only starting out on the journey to realise this vision, and an incalcuabley large opportunity exists to be exploited by all kinds of technology players across the whole eco-system before the vision of mobility underpinned by Cloud and SaaS, can become the reality for mainstream end-users of mobile devices. So perhaps a vision of 2020? If history is anything to go by then in 10 years we should have witnessed the rise of another 'Google', a couple more 'Facebooks' and perhaps even a another 'Microsoft' to challenge the existing order...
Starngely, this is nothing new - any of it.
Jobs has seen that the wold is going to change, and is trying to CONTROL ALL OF IT - after the change has occurred.
It can't be done.
Several times previously, other paranoid and megalomaniac, and sheerly otimistic persons, usually with very great TECHNICAL expertise have tried this, in other industries.
But there is always somewhere else you can't control, and inventions and developments you don't own, etc.
George Stephenson tried this, in about 1829-30, and it didn't work. I think Hughes of aircraft notoriety tried it, and there are numerous other examples.
Incidentally, this just puts the lie to the persistent myth that "Microsoft is EVIL, but Apple are saints" ....
I think Apple is trying to position itself as the gateway to content, as is Amazon. They have learned from the mistakes of the past and know that the only way to prevent content from liberating itself is to make the jailed content so cheap it isn't worth the trouble.
Cloud-something is happening but whether it is software-as-a-service aka apple/google or hardware as a service aka Amazon Ec2 that wins out, who knows. In five years it is a fair bet no one will have a lockin. My money is actually on amazon, their model is far, far more flexible.
As far as bandwidth goes, i have an IPad, there is still no use case for it really. There is no killer app for all the wireless bandwidth that an iphone does not solve pretty well. The way that technology evolves will depend highly on the apps that require it.
PCs are becoming commodities, like cellphones and mp3 players. The future isn't really in them any more. Apple seems to be positioning itself well, and can potentially offer something that others don't - quality control and security.
Google is positing itself in a similar position, but without the in-house hardware. They're possibly in a better position than Apple, but rely on other companies as partners and don't have the same degree of control.
I wouldn't write Microsoft off. They may look like they're falling behind in the consumer market, but that's a small part of their business. They still dominate in business applications, and have things in place or in the pipeline for businesses that are way beyond what Apple or Google offer.
In the future you predict, it looks like the concerns outlined in Richard Stallman's fiction piece "The Right to Read" (wikipedia) are going to become a lot more relevant.
(also his piece on "cloud" computing)
great argument.
however, thats a horror future - i certainly dont like it and i'm pretty sure cloud computing is not everything then.
This is sensible and well thought out, but what does it have to do with Flash?
No. I own hundreds of apps on my iPhone, and all of them were purchased and installed over WiFi (or over 3G for the smaller apps). I don't have my iPad yet -- it's 3G and is scheduled for delivery "before 3pm" today -- but I'll be utterly flabbergasted if it requires connecting to a Mac or PC for installing apps.
AFAIK, you only need to connect to a Mac or PC for initial activation, for device backups, and for device OS upgrades.
(And if Apple cared to, surely they could eliminate the need to connect to a Mac or PC with a device OS upgrade.)
Regarding fragility of wireless, you don't need "always connected", just "mostly connected".
HTML5 webapps with local storage (demonstration) should work perfectly in an area with low coverage quality. Storage: it's caches all the way down. Dedicated apps are built with this in mind, of course.
As an aside, it saddens me a bit that Javascript serves as the technological platform to this. Java predicted all this 15 years (half my lifetime!) ago. I believe it came before its time and showed the way (and what to avoid) for the rest.
(offtopic: congrats in the hnews front page ;)
beat off rivals, eh?
(And if Apple cared to, surely they could eliminate the need to connect to a Mac or PC with a device OS upgrade.)
By requiring iTunes to initialise & update the iPhone, Apple maintains a direct link with iPhone owners which no other smartphone manufacturer has: it's a huge advantage. There is no way Apple will break the iTunes requirement for this reason alone.
(And if Apple cared to, surely they could eliminate the need to connect to a Mac or PC with a device OS upgrade.)
By requiring iTunes to initialise & update the iPhone, Apple maintains a direct link with iPhone owners which no other smartphone manufacturer has: it's a huge advantage. There is no way Apple will break the iTunes requirement for this reason alone.
Have you actually used 3G "broadband" services?
They claim to be 7Mb/s but rarely deliver much beyond 100kb/s in reality. Mobile networks are not designed to carry vast amounts of data and while there are improvements in the works they are certainly not on the scale needed to support widespread use as fully featured "cloud devices".
Wifi simply didn't take off and WiMAX doesn't appear to be doing so either. Prior to these there have been several other attempts at wireless internet access and they have all failed for a host of reasons. I suspect the main reason is that the level of investment to make this happen even for relatively population dense places is just too great to be commercially viable.
Sure mobile devices are increasingly important but the scope for them to replace fixed devices, which have much better connectivity and are more comfortable to use (there is no way I'd be able to carry a device with a 22" monitor) is, I think, very limited.
The general point about Apple's move to absolute vendor lock in is a good one though. To what degree they can do so in the face of Google, Amazon and a host of others who are already providing cloud services remains to be seen. I think the days of people accepting such lock in died with AOL and Compuserve.
Great article! I totally agree.
Far to a complex analysis.
The real reason apple don't allow flash is profit & revenue protection. pure & simple
Flash allows other companies such as Facebook - farmville to have access to apples platform to generate revenue & profit.
Apple by banning flash retain total control over all revenue generated by devices.
As google own youtube and now have flash access they could start to remove apple/flash support from all apple devices could be interesting. 90% of the mobile market vs apple......
I find it intriguing that people continue to speculate how Apple is going to be the primary driver of the tech future. Reality check: Apple has 8% of the PC market share, 25% of the smartphone market share, and around 6% of the overall web browser share (including browsers on Apple PCs, iPhone and iPad). Given these numbers, Apple's influence and ability to drive the future of tech (e.g., killing Flash) is overrated. There's no question that Steve Jobs has done an amazing job of making Apple look like the dog wagging the tail rather than the tail wagging the dog. Whether the tail can actually wag the dog remains to be seen.
"Have you actually used 3G "broadband" services? They claim to be 7Mb/s but rarely deliver much beyond 100kb/s in reality"
The reality is, that in 2015, you'll make this comment:
"Have you actually used 5G "broadband" services? They claim to be 700Mb/s but rarely deliver much beyond 100Mb/s in reality"
And those 100Mb will be enough.
why the riaa goes after uploaders ... and the industry want's software as service. control of distribution... apple is becoming a cable tv provider for computing, except you can only get cinemax and no hbo. not api, eh?
Top tip, google doesn't really advertise, but it does check code into publicly viewable codebases.
Chrom OS, looks to my trained eye like something that will actual work. Project Angle and Native Client may not seem like a big deal but they speak of a future not of thin clients but of stateless and remote initiated fat clients.
Clients that can in fact be running windows. The plan being to supplant the old whilst coexisting with it.
Google doesn't have to ban flash, or windows, or anything, it can just outperform it. It will begin to happen towards the end of this year and you won't be impressed by it.
I suspect it will be dismissed as something that just works and isn't really any different to what we are currently doing and doesn't really have any market penetration anyway. Then people will build games for it. Then it will take over.
That Apple will try to move things in the "walled-garden-paid-for-by-subscription" direction seems quite reasonable to posit. How much success they will have in doing this seems, to me, much more questionable. I also have little trouble with the notion that this software as a service notion is believed by those in the computer industry because it provides them with a way to continue making money. I suspect that this belief embodies roughly the same level of realism that the music industry had in *their* belief that suing grannies and teenagers was the best way to preserve their business model.
The public has, in general, not been very accepting of walled garden approaches, but I am willing to listen to arguments that Apple is good enough at building walled gardens that they constitute an exception to this. I am more reticent to believe that software (and storage) rental is likely to be very popular. That runs straight into a major problem in consumer preference- people hate being on the meter and will often prefer a non-pay-per-use alternative even if it will demonstrably cost them more than the metered alternative.
If the success of iTunes and the iPhone indicate that Apple is an exception to the walled garden rule, the relative lack of clamor for MobileMe suggests that the same can not be said for pay-as-you-go computing. As far as the whole concept of putting your data in the cloud goes, it has always seemed to me to be a solution to a problem that I am hard put to see. After all, with local storage so inexpensive and capacious that certain SF writers can speculate about recording every moment of one's life, is there really a good reason not to store the all the information one cares about locally?
Spot on, Charlie.
It's interesting how many people's heads explode about Apple's walled-garden approach but fail to recognize that even three years ago the mobile carriers had a far more restrictive policy.
Today, for $99/year, you can build and sell applications that are permanently connected to the mobile internet and location aware. Those applications have access to orientation sensors and a pretty decent touch sensitive screen. Three years ago you would have needed to roll your own hardware, and then watch the mobile carriers laugh at you as you tried to get it onto their networks.
The astounding thing to me (as a 30-year software developer that spent the last 15 working with some of the largest carriers world-wide, and an iPhone developer for the last 18 months) is that the mobile carriers are falling over each other to get a device onto their networks that will support voice over IP (bye-bye long-distance charges). But they see the writing on the wall - 13% of international calls now are transmitted via Skype while 16% of US households have no landline.
What we are seeing in real time is the death of the voice networks. Apple bust down that door because the iPhone was just too good for AT&T to turn down. I'd bet that the dataplan for the iPad is a direct result of Steve Jobs walking into a meeting with AT&T with a prototype iPhone that runs on Verizon's network.
As for Flash - the iPhone OS is far from finished. We currently have version 3.1 for iPhone/iPod touch, 3.2 for iPad only and 4.0 (beta) for iPhone/iPod touch. Apple has to pull these together into a common platform (4.5?). There's more to come, I'm sure. For example, they still haven't addressed a proper synch framework for applications, and the ability for applications to share data is primitive. They can't compromise their agility just yet to support 3rd party framework builders that will lag still further behind.
My guess is that these restrictions will lift over time, but Apple sees Flash as receding so far in their wake that they dont even care. The target audience for this device is a set of people that don't even know what Flash is.
Greg: After all, with local storage so inexpensive and capacious that certain SF writers can speculate about recording every moment of one's life, is there really a good reason not to store the all the information one cares about locally?
Local storage works brilliantly ... right up until you're mugged/your house burns down/the dog eats your server. There's really no substitute for redundant networked storage.
The thing is, we had this glorious cloud-based computing world before. It was called the mainframe. Everyone worked via dumb terminals or computers running terminal software, and they connected via modems when they were too far for a direct serial connection.
The whole PC revolution was a reaction to that. It allowed people to take control of their computing environment, and not be at the whim of corporate decisions.
So now we're seeing a bunch of people move back to the mainframe model, but now it's a wireless connection instead of a wired one, and a web browser instead of a terminal. Will that be enough of a fundamental difference to make the central control palatable this time?
My guess is no. While some things will migrate back to the cloud, and some users will embrace it, plenty of people will find it as unpalatable as it was before, for exactly the same reasons. Already I'm seeing iPhone users getting pissed off at Apple's shenanigans and switching to Android. I think the only reason cloud computing has caught on as much as it has, is that there are a lot of people new to the computer world who haven't experienced what it's like to get fucked good and hard by a corporation that controls access to your data.
I know that it'll be a cold day in hell before I make myself dependent on having a network connection to get anything done, and dependent on a corporation to tell me what software I can run. Maybe I'm in the minority, but history suggests otherwise.
Stop kidding yourselves!
What Steve Jobs wants is NOT the next evolution in personal computers. What Steve Jobs wants is a walled content garden with only his devices pulled in and paying for access. NO THANKS!
I have an iMac, an iPod iTouch and a Google phone and I can definitely appreciate Steve's paranoid behaviour.
The hard simple truth is only Google has the current business model to support what the market wants. FREE cloud access and virtually unlimited storage. Everyone else, Apple included, can not afford not to charge users fees for their cloud storage. That means Apple HP, Microsoft etc are all at HUGE disadvantages to Google.
Google has been working very hard in the direction of cloud computing for a number of years and it appears that none of the current players want to get caught - like Microsoft's near miss of the Internet back in the 90's - so do the same players now hope not to repeat this near miss for the next generation of personal information.
ALL HAIL TDDM! The device doesn't matter!
Good luck with your walled garden Steve?
Hi Charlie,
It is an interesting read but I feel that there are a number of points that I will (try to) make.
1) Whilst the app market does indeed work well as a closed garden, with Apple as the gate keeper, there still seems to be a pesky app that gives access to thousands of applications - the browser. Mr Jobs may have banned Flash, but if iPhone supports HTML5, it may have been irrelevent. Modern CMS's can have multiple templates for presentation options, so making an iphone version of an online application is trivial. The functionality of browsers (and the apps that they support) is only going to increase with time.
2) "The cloud" may be an adequate solution for non-critical or non-confidential systems, but I myself would not trust an outsourced company to host confidential emails or information critical to our business infrastructure.
Bandwidth may have solved some issues, but it has also made network access more critical than ever.
Yes, obviously the future will be shaped by Microsoft and Nokia since they have the largest respective market shares. Or maybe your argument has a flaw.
So, Apple basically wants to become the next AOL? That worked really well for the first AOL... right until it didn't.
Charlie: "The real reason why Steve Jobs hates Flash .... And he really does not want cross-platform apps that might divert attention and energy away from his application ecosystem."
That is not quite how I would characterize this. Clearly the Safari Web Browser (and the new Opera Mini Browser for the iPhone/iPad) allow access to cloud applications outside the iTunes development ecosystem. This seems to undermine your argument regarding Flash, as HTML and Javascript are x-platform languages.
Note that Apple is excluding other languages and libraries from being used on the platform too.
To me that suggests that the issue is indeed about control of the platform as a consumer device, how it works, what can work and what is the controlled development path. This is in marked contrast to the open environment of PCs which can of course lead to the maintenance problems you cite.
BTW - how did you get perl on your iPhone. Jailbreak it?
To pick a nit, though Japan never picked up GSM, they had several homegrown 2G standards, such as PDC. They just started deploying W-CDMA earlier and with a lot more commitment than US/EU. And most operators have by now terminated their 2G services (NTT still runs PDC, Softbank killed theirs last month).
However, they were a lot more effective in deploying data services over the 2G network with i-mode, which had actual users when others were still convinced that premium SMS is the future (some still think so) or dicking with WAP (which is conceptually like i-mode, but telcos went for milking calves with power drills when deploying it, as usual).
The fact that Japan managed to go ahead with rapid deployment of wireless data and dump 2G quickly was, I think, helped by the fact that the carriers hold on to the handset market with a spiky iron gauntlet. I think that there's still no such thing as an "unlocked phone" over there (though foreign W-CDMA phones mostly work with local SIMs). (This probably also explains how they managed to implement MMS so that people voluntarily use it.)
Regaining the prime pieces of spectrum now used for GSM for high-speed data and whatnot looks a lot trickier at least on the European market. Operators have made some noises about this around where I live, but there's not that much leverage on the handset market. A lot of people use prepaids with the cheapest handset they can find (including 10 year old ones from the recycle bin). Unsubsidized GSM phones start at 20 euros and a reasonable one goes for 50. W-CDMA sets start somewhere around 120. That's a big difference for some and for the operators, prepaid minutes+SMS is still quite a bit of fairly steady cash rolling in on long-since paid-up infrastructure and spectrum.
It might even be that LTE will manage to supercede W-CDMA before GSM is gone (it remains to be seen how LTE will co-exist with W-CDMA spectrum-wise, since there's been pretty heavy deployment on that front in the last few years).
#34
It isn't illegal to store EU personal data in the US, provided the holder complies with the Safe Harbor rules.
http://en.wikipedia.org/wiki/Safe_Harbor_Principles refers.
(This I know, being in the belly of that particular beast)
I've never taken the plunge on an iPhone, or an iAnything, because it hurts my feelings to pay that kind of money for such a small appliance.
I regularly pick up leased boxes with an XP Pro license for my computing needs. All you need is an inexpensive monitor, and you have a perfect system.
Load that box with Linux, and well, it can do pretty close to anything you like (Ubuntu on my laptop, XP on the desktop).
Unless you are a hardcore gamer, or like data privacy, the cloud is a great option. I personally don't mind administrating my own computer, so it's a "no" to anything other than backups being stored in the cloud.
As to Japanese broadband speeds, one thought: Geographic proximity. It's easy to roll out high speed, when the fiber doesn't have far to go, and population density is so high.
You could superglue it onto a big rock.
Until commodity Internet connectivity reaches at least gigabit speeds, cloud computing just isn't going to cut it for our data needs. As connectivity speeds rise, so do our data needs. Even consumer-grade digital cameras are pumping out 12MP images and HD video these days. Storing this content in a cloud is not going to be a satisfying experience for anyone. I can't see the majority of people letting go of the comfort of big screens and tactile keyboards anytime soon. Touch interfaces will have to get a lot better and smarter before this happens.
From my personal perspective, as a professional photographer, I welcome the day when I'll be able to work on high-resolution images on a portable device as comfortably as I can at my desktop. I just don't see that day coming within 5 years.
I think you are spot on, and it starts to explain some curious things in the recent past. I should say that I've been following Apple primarily through the financial news and the picture there is very interesting. First off, they have about $85 billion in cash on their books, and everyone has been complaining that they aren't indicating any chance of paying dividends to shareholders. Then in the past few days they quietly bought an ARM company, and a small firm that makes a voice-based search app for the iPhone. So maybe they are planning to buy their cloud with some of that cash? Who might the targets be? Consider the potential benefits for a merger with Amazon, they get a cloud farm, and another retail source while Amazon gets the iPad as kindle3. Just idle speculation, but wouldn't THAT be interesting.
Excellent analysis. I won't argue the wireless aspect. Who knows what patents Apple itself might hold in that regard or what new thing they have seen that has pushed them this way.
Just want to wake people up to the fact that the *worst* thing hp could do with its new webOS is to *license* it. See why here:
http://ipadtest.wordpress.com/2010/04/30/hp-acquiring-palm-the-picture-version/
You wrote:
Even if he's reduced to giving the machines away, as long as he can charge rent for access to data (or apps) he's got a business model.
It seems really unlikely to me that Apple is aiming to focus their business model on "rent for access." They've always been a hardware company, and they've maintained from the start that the iTunes Store doesn't bring in significant profit.
Back in January, Apple stated (as quoted in an AppleInsider article) that the same is true for the app store:
I suspect the same will be true for books.
Apple has always been a hardware company that makes software (and sells content) in order to get people to buy the hardware. I see no evidence to suggest that that's changing with the iPad.
You're right that iAd changes this equation; advertising can be a huge source of revenue. But I would be very surprised if Apple were to change its entire longterm business model from hardware to ads.
They just had their "best non-holiday quarter ever," while remaining focused on hardware. You suggested that the PC market is dying—but my understanding is that Apple's market share is currently growing. I think there's still a fair bit of life left in their hardware-focused business model.
Hm. Apple, Google, Microsoft, and HP/Palm. What is the future of IBM (similar market cap to Google) and Sony (tiny by comparison) here?
And why has MS canceled the Courier project? What do they think they know that other companies haven't figured out yet?
Will people really be comfortable moving their data into the cloud when a lot of that data consists of pirated music and films? Won't the RIAA and the MPAA just arrange for the laws to be re-written so that everybody's data in the cloud will be scanned for copyright compliance, then deleted or access denied when the data looks like it might be pirated?
You are assuming that the capacity and capabilities of mobile networks will increase so as to support such speeds. There is little, if any, evidence to suggest that this is actually likely.
Can you point to any mobile broadband technology existing now that can deliver 100Mb/s to a sufficiently large number of network nodes to be viable?
VDSL was being demonstrated by BT providing 50Mb/s static broadband service back in 1997. It's now being rolled out in a very limited manner so it's not unreasonable to expect that for something to be delivering 100Mb/s on a 700Mb/s headline rate as you suggest that it should actually exist somewhere and be shown to work.
I don't buy what you're selling here. If your general line of thinking was correct, then thin clients (dumb terminals) would be the rule of the day.
The fact is, thin clients never really caught on because there's something to be said for having the power IN YOUR HANDS, so to speak.
Large servers/desktops/laptops are possible and they are cheap. Therefore, there will always be some kind of a market for them. Just as there will always be some kind of a market for a "sports car" that goes faster and turns better than the family station wagon.
It's in the human nature.
Great work, thank you. One thought about the walled garden: the entire Internet is inside those walls. Increasingly I think we will be no more upset about our inability to install binary code that executes uninterpreted on our devices than we are currently upset about our inability to manage register state on our PCs by flipping switches.
The argument isn't exactly the same because iPhone OS controls access to e.g. the sensors. But:
- that access will tend to increase over time
- we'll still have e.g. Arduino kits
- the "walling" of devices coincides, causally, with growth in their distribution
Freedom is an aspect of many distinguishable parts of our life; opining on the iPad's effect on freedom per se is kind of like saying that the forest is a different color because one species of tree, among many, has bloomed.
What I don't understand is why Microsoft is doing so badly. Seriously, and entirely un-ironically, I don't understand it. The only semi-serious explanation I can think of is that Microsoft really was just the projection of Bill Gates' ego... and, consequently, when Gates left only the shell remained...
Apple is paranoid about leaking information, but there's absolutely nothing new there, ever since Steve Jobs came back. The simple reason is that Jobs wants to reveal his secrets himself and when he's ready to reveal them, rather than let everyone else discuss them to death before the release.
The upside of this is that it's impossible for the media and for users to ignore the fact that secrets are being kept, which generates vast media coverage speculating about what those secrets might be. It's a huge PR win before the announcement, whatever it might be, during the announcement itself, and afterwards, as everyone has to talk about how right or wrong the speculation was.
Put simply, you can't buy that kind of coverage. And Apple doesn't have to, thanks to tight internal secrecy.
Charile,
I'm having a philosophical issue with the move to tablets and the death of the desktop, and it's not about bandwidth or Apple's walled garden. I'm a web developer, and I sit at a computer all day coding (when I'm not drumming up new business). From an ergonomic point of view, tablets and iPads and their ilk are horrendous. I am not going to bend my head down eight or ten hours a day to look at a screen on my lap. And since I don't get to see much sunlight, to me mobile data access mostly means checking my email via the internet on my dumbphone.
So what do you see as the form-factor of these new, nifty data devices everyone is so excited about in, say, the next 20 years? Surely I can't be the only person who is concerned about the health of the denizens of cubicle farms the world over.
Quite frankly, I think the desktop is here to stay, regardless of single-digit profit margins. People still sell hot dogs, even though that market has been the quintessential exemplar of perfect competition for 20 years or more. Sure, my data may live in Google's warehouse, but my user interface is still going to be a screen at eye level with a keyboard on my lap, at least until they come up with something better. What do you suppose that will be?
> In one word: Android. Google are positioning themselves
> to be the Microsoft to Apple's Apple
In one letter: C.
Android doesn't have a C app platform. You can't port desktop class apps to it, you can't port console games to it. You can't port iPhone apps to it.
Android also has no managed app platform, so it has malware. It is not consumer-ready. Consumers are 5-10 times the size of the PC market.
Personally, if I find myself moving to an iPad or similar environment, I will either (a) use it at a desk with a keyboard dock for creating volumes of text, or (b) get myself an old church lectern or similar and work standing up.
Those of us who're old enough remember seeing offices of the pre-typewriter age, when bookkeeping and paperwork was done longhand with a pen and copperplate handwriting. The desks were at waist height or higher, sloping at a 15-30 degree angle, with inkwells at the top/back and a lip at the frontmost edge to stop the ledgers falling on the floor; you could work at them either standing, or using a high stool.
You might be right abou some or even all of this, but I'll be out of computers by then, if so. The "cloud" will never be trustworthy. We lose enough data as it is, there will be no such thing as identity if everything gets moved to the cloud. People will live at the mercy of the latest clever hacker and security will be a happy memory of the past. Not even Apple's famed security paranoia will be able to survive that onslaught.
Whether or not Apple takes over the world, someone better get things moving againm, and fast...the world economy depends on it, ladies and gentlemen!
"featuring 50-100mbps internet connectivity"
You've bought into the 4G hype. Verizon's only promising under 10 Mbps for its initial LTE deployment. In 7 to 10 years, we'll have 50 Mbps rates.
It's all about spectrum and Shannon's Law. Verizon and AT&T now have about 20 MHz of spectrum across the country in 700 MHz to deploy LTE. Verizon is going to use 10 MHz channels plus some MIMO.
The LTE tests of 50 to 200 Mbps (raw speed, not throughput to a given user) involve larger MIMO arrays and 20 MHz or bigger channels and short distances.
I don't think Apple is looking to become a cloud computing outfit. Problem #1 is that the cloud is a fickle place. Remember Friendster? Well, My Space showed them! And just as soon as Murdock spent a few billions on My Space, out comes Facebook. In a cloud, you can be relevant on a Monday and by Tuesday, you're an Internet punch line. Plus, no one is entirely sure how you're suppose to make money. You think hardware is a commodity business, you should see "Cloud Services".
Hardware is much more stable and offers better customer loyalty. And, bigger profits. But, you must be a consumer products company and not a commodity company. The first thing Apple did when Steve Jobs came back was stop building Beige boxes and stop competing against Dell and HP. Apple never looked back.
I see Apple moving into whatever consumer products it can and be at the high end of that market. Will Apple embrace "The Cloud"? Yes, it will, but it will do so in order to get people to buy Apple products. Right now Apple's foray into the cloud have been weak and not entirely successful. That has to change.
If I ran Apple, I'd give everyone who bought an iPad or a Mac two free years of MobileMe. I would include in MobileMe 500Gb of backup diskspace with a Time Machine like front end and abilities. I would offer 100Gb of Dropbox like storage. I would offer push and remote syncing. I would offer service after service. And all for free.
When you setup your Mac or iPad, MobileMe would automatically be setup too and all of those precious services would be given to you for free for two years. And, for your entire family!
Then, at the end of those two marvelous years where you have your entire life built around MobileMe, I would start charging $9.99 or so per month. You'd now have three choices:
1). Stop getting MobileMe and see all of your stuff that you put there die.
2). Pay the monthly fee.
3). Buy a new Apple product.
You know, my son has been wanting a new computer. I could give him my old one, and buy myself a new MacBook Pro... Or, what about getting one for my wife? She's always complaining that I'm hogging the computer and she can never get online. I'll buy her a new computer. Or, maybe she can have mine, and I'll get myself a new Mac...
About the ergonomic problem: There's no a priori reason why 'mobile' apps can't eventually migrate to the desktop. A desktop computer could, e.g., have a separate touchpad interface instead of a mouse.
Hmmm... some general comments:
Quoth: (5 years) LTE will be here. WiMax will be here.
Nope. At least not in a remotely distributed way. There's just too much infrastructure to get put in place for that in 5 years. Maybe 10. We're at pretty much the same stage for LTE today that the carriers were at for 3G in 2000, where they were all claiming that it would be ubiquitous in a few months. It's still not, even in the UK, so I'd counsel caution. I'm also still to be convinced that WiMax will do anything than prove to have been an enormous hole into which companies threw cash. It doesn't have decent load balancing and the WiMax experiences so far, like Clearwire around here, are pretty ropey.
Second: Japan didn't skip GSM. They used another TDMA technology (actually, if memory serves they had 2 competing technologies) instead. The downside is that this effectively blocked the major Japanese phone companies from competing globally with the exception of Sony's ill-starred partnership with Ericsson.
Japan have now moved to WCDMA which is a GSM evolutionary technology and that's going to pay off for them. Although to move to WCDMA was an equipment switch horror of biblical proportions, which is another reason I'm skeptical about fast roll out of LTE.
Third: HP ditching Windows Phone 7. There's two factors at work here. Microsoft have become so obsessed with producing an iPhone competitor they've lost track of their real market. Sucks to be them. That's a problem right there for HP and the enterprise market. However, the real problem is Microsoft have struck a faustian deal with the real devil, Qualcomm Communication Technologies... HP don't like being told who to buy their hardware off, especially when QCT have, shall we say, "entertaining" license models...
So... to your other point, yes, cloud is the way to go, but it will be a Software and Service model - which is good for Apple's Objective C and Android's Java... ahem "Dalvik" approaches. Web Apps and HTML5 won't deliver the goods in the real mobile world so you'll need something on the device to manage and handle the experience.
I suspect that the real reason for the anti-Adobe feeling from Jobs is that it is a real problem for his new "vision" of building a pure One Platform to rule them approach that he's been moving to over the years.
Final thoughts: Based on rumours I'm hearing MS have killed the Courier purely because there's no point. There were dozens of dual screen devices running Windows 7 and Android on display behind locked doors at the silicon vendor stands for Mobile World Congress, and I don't think MS needs to have another failed trip into the product world. Frankly, they're crap at it. However, I am told that we'll be seeing some seriously sweet Windows 7 tablet computers which will be netbook priced with a bunch of nice UI overlays...
Oh, and they will run Flash :)
Your portrayal of the future is terrifying. I, for one, will never give up my wired Internet connection. Nor will I entrust my data to "the cloud."
I think you are pretty much on the money. What people generally don't get is that PCs are not leading edge technology and haven't been since the mid 1980s. PCs are tailing edge technology. The terms "leading" and "trailing" edge refer to the technology adoption curve. Because the majority already have PCs, the PC is in late adoption, i.e. the trailing edge of the adoption curve.
Most people today are deluded into thinking PCs are leading edge mostly because they haven't actually experienced the leading edge of a disruptive technology in their lifetime. Even the internet was already 20 years old when the dot-com boom happened - it was already well into the leading edge and quickly popped over the top of the curve within a few years. I'd been using the Arpanet since 1980 already when the first web server code was published. The value was obvious already to me because I was a true early adopter.
There haven't been many disruptive technologies developed or in development in the US over the last 20 years. Disruptive technologies are usually driven by manufacturing or industrial operations. Just another reason why outsourcing was a very bad idea.
The current state of Green and Alternative Energy technologies are an example of what a leading edge technology adoption phase looks like: disorder, confusion, uncertainty, no obvious winners apparent, diverse and contradictory products, scary to the majority because of all this. If everyone understands the technology and knows where its going, it's not leading edge - ever. This is exactly what PCs were like in the 1970s - when PCs were actually leading edge. Everything since then has just been an incremental refinement progressing toward the trailing edge.
One nit: most of the TV/Radio VHF and UHF bands are especially ill-suited for networking, especially high speed networking. If the US imagines this is the plan for catching up, America is doomed in terms of ever catching up to other countries on network access. First the channel bandwidth is tiny (the entire VHF band is equivalent to a single WiFi channel) and UHF is only a tad better. The more likely scenario is continuing increases in microwave band frequencies where bandwidth increases with the pace of demand.
The propagation of VHF and UHF also works against it - these frequencies propagate far too well. This is why early mobile telephone services from the 1930s and 1940s were epic fails. It was only when cellular took advantage of the more limited range of UHF compared to HF that mobile telephone took off. Going higher in frequency helps more because you can decrease cell sizes but get higher data rates and total date stream count capacity. 5G (whatever that will be) will likely be operating at 10-20 GHz.
@Charlie Stross:
Those Sidekick users were real glad that their data was in the cloud and not available on local machines, weren't they?
Great post Charlie and spot on indeed. Check out my reference to your post on my site. As I wrote about Steve, "Remember Hamlet and the quote, "The lady doth protest too much, methinks"? While in this case it isn't a lady, the question still applies to Apple's CEO regarding his comments on Adobe and Flash.
"The App Store and the iTunes Store have taught Steve Jobs that ownership of the sales channel is vital."
A datapoint: Apple just bought Lala, which sold streaming music in the cloud, and promptly closed it down.
http://www.wired.com/epicenter/2010/04/apple-kills-lala-music-service/
“In appreciation of your support,” reads a note on Lala.com that is only accessible to members, “you will receive a credit in the amount of your Lala web song purchases for use on Apple’s iTunes Store.”
One thing Charlie does not take into account is that Apple is not a hardware company. It is a movement. Just like Linux is. Hardware was never the point, it is not what makes Apple products attractive. Design and marketing are what Charlie needs to look at. Design and marketing will be just as valuable in cloud era as they are today. In last 20 years people were buying what they _want_, what they _believe in_, not what they _need_.
Argh. Hit post too soon. The article I was quoting concludes that Apple intends to use the Lala technology to launch a cloud-based music search, as you suggest.
Isn't Apple's data center in NC nearly completed? Sounds like cloud computing is very much in Apple's short term future.
(I hope so, I've been paying for Mobile Me/.Mac for many years expecting it to reach its potential...)
A datapoint: Apple just bought Lala, which sold streaming music in the cloud, and promptly closed it down.
Yeah. And I am really interested in knowing why they did that.
I'm going to make T shirts that say "show me the backhaul"
Also, if the iPad really does have a 50+% margin the price crash is a while coming
@Michael - the movement is what Apple uses to sell the hardware that at an industry-beating 22-24% margin makes it all the money.
While this is nowhere near the scale of Adobe vs. Apple or the iPhone leak, Apple last week also issued a lifetime ban on a guy buying multiple iPads for friends.
http://www.protocolsnow.com/2010/04/17/how-i-went-from-apple-store-newbie-to-lifetime-ban-in-one-week/
Apple's certainly been involved in some interesting tech stories lately.
"I'm going to make T shirts that say "show me the backhaul""
Funny thing, I was at a talk by Tom Huseby, the godfather of mobile technology investment and that was pretty much the theme of his keynote.
He said people can talk about universal broadband and cloud apps but there wasn't remotely enough fiber in the world at the moment to handle the strain it would place on the back end. All his money is going to network solutions and backhaul providers at the moment.
The funny thing about Apple's purchase of Lala is that Lala has been doing media playback via Flash. But before Apple closed it to new members, Lala did exactly what Charlie suggests "iTunes 10" would do - it allowed you to upload your whole music collection and have it accessible from any Flash-enabled browser.
One thing that Apple might've seen as a positive aspect of Lala, was that there was no dev API that allowed accessing your uploaded tunes via other methods (say, an iPhone app, XBMC/Boxee/etc.)
What Charlie's article boils down to, I think, is a question of access vs. ownership. I think Kevin Kelly wrote an interesting essay about that.
Umm, actually Android does have a C API.
http://developer.android.com/sdk/ndk/index.html
Also, it does have a managed app store and a security mechanism for fighting malware. On Android, each application runs as a separate Linux userid with its own sandboxed data space.
The reason the iPhone is reliant on Apple's code signing app store security wall is that everything on the iPhone runs as root. Damn stupid design decision, and I'm sure they're regretting it.
I've been wondering what Apple has been doing with buying data-centers, and this explanation makes as much sense of anything. Right now though I just feel bad for Microsoft because they're years behind despite having many more engineers than Apple. It's almost becoming shameful for Microsoft.
Still, I'm glad that Apple is trying to kill Flash on mobile devices. It really only makes sense for games nowadays, in my opinion.
As a web developer I hope apps dont eventually replace the need for traditional html web sites. Can you ever see that happening maybe 10 years down the road?
Are end-users really gonna upload 100+ GB of family photos, personal stuff, movies, and, yes, porn somewhere outside their home? Especially with that last one, I think not. Besides the privacy issue of the *IAA folks seeing all that pirated data in the cloud, it's a pain in the ass. On my cable line it would take, oh, forever, and make managing music a nightmare. It's already confusing enough to manage the Music directories on my netbook vs. two redundant ext. HDs. vs. my MP3 player. I have way too much music to fit comfortably on the mp3 player (and it still plays the same damn 10 songs on shuffle).
Frankly, I don't really see a purpose for porn on your phone, but I think others might disagree.
A BIG problem (or opportunity for bastard wifi providers) is that consumers are used to all-you-can-eat internet, and mobile doesn't offer it; my sister recently had a data account with two other people, until one guy scuttled it with nearly 24/7 YouTube/porn streaming, leading to overage charges in excess of $700 in a single month!
The first time that happens, consumers drop 3G/whatever either in anger or inability to pay. Said sis basically can't ever get a cellphone again because bastardly didn't read the small print.
@Dave O'Neill LTE is (as the name states) only an evolution and doesn't require an entire new infrastructure (like the transition from 2G to 3G in the early 00's)
The three largest telecoms in Scandinavia (TeliaSonera, Telenor and TDC) have all announced the availability of LTE later this summer. TeliaSonera even runs a commercial, somewhat limited network in Stockholm and Oslo.
Right now '3' provides the Danes with 21 Mbit speeds in Copenhagen and other big cities (mostly 16 Mbit I think). And that's just HSPA+
Granted we're only 25 million people..
Secondly; with Javascript libraries such as PastryKit and jqueryTouch together with HTML5's offline storage and CSS3 it's perfectly possible to make web application look and feel like their native counterparts.
I disagree...
While these new devices bring a large new population to tech, those in that population will migrate to devices that do more once they see the options that are available.
The PC has been dying for years. And I have yet to see game consoles or those other devices kill the PC. Are netbooks selling so well? Are people throwing away their PCs when they get the iDevice? No.
As for storing info/data on the cloud, I do not advise this-not if I need the data. Cut lines between Asia and the rest of the world, increases in data theft, and other network issues only make that dangerous. How can a firm guarantee uptime if you don't control the network?
While, on average, such uptime is guaranteed, that doesn't help a government access data on the cloud. State secrets, recipe for Coke, and other data on the cloud? I wouldn't put my social security number out there-who knows who owns those routers...
I want my data with me, and I have to have a PC to program, work, and game. Now, the form factor of the PC may change. But lose the computer so I can type on a damn on-screen keyboard? Are you nuts? Nobody needing to turn out text would do that...
I don't serve firms that HAVE to have their data, but after a three day outing of access to data due to cut lines near Taiwan, non-cloud and on-site business grew because the user is not willing to NOT have access.
What you describe in terms of delivery of content sounds very reasonable, I will always want a computer and not a terminal. Hasn't the market spoken? I just get goosebumps at thinking that our computations are run on a server in china or data being stored in Taiwan-not because they aren't smart or capable, but because I can't guarantee backup, ownership, security, access, or uptime. Again, it is all pretty until someone loses a connection...
Business Implications.
You focus on the impact to home users in this entry. However, I see a huge impact of Cloud Computing happening in the realm of businesses.
Cloud offerings like Amazon Web Services, Google Apps, and Microsoft Azure allow small firms to play big. Cloud brings the slashdot-effect to an end. Applications, media, and other content in the cloud essentially become infinitely scalable. Additionally, the deployment and maintenance of these resources becomes a non-factor. (You do touch on this)
The cloud will reinvent the world market to breed many more mom-and-pop operations that can feel like world-class companies.
This feels less like dot-com 3/4 and more like information age 2.
More so than any other country, America can innovate. Cloud will fuel innovation and create a desperately needed, new revenue stream. Information age 2 will push America back to the economic leader of the world by enabling a whole new crop of businesses that don't have to be huge to compete globally.
Now, if we could only fix this patent mess...
Sure it's been said already but 50-100mbps via 4G in five years' time? Not a chance! Cellular network technology simply does not support that kind of capacity in a mass use scenario. Fibre plus wi-fi might but who's going to lay fibre *everywhere*? Even Google might balk at that as it really ain't cheap.
Apple is becoming Microsoft. Remember how they "forced" their browser, media player, etc down our throats?
Microsoft is becoming IBM. Too big to do anything innovative anymore.
Google is going to be where everyone is headed because they will embrace most widely accepted tech while continuing to innovate and deliver.
Interesting future scope.
Here's something to ponder -- what if HP sometime around April 2011 does this:
With every laptop/desktop you get a free Windows powered Palm-ish device. Free, kind of a baby PC.
It integrates, it takes your stuff with you. The BC (baby computer) is your cloud.
I know it sounds ridiculous, but that's a backup isn't it? And it can access cloud services if necessary. But it's not the holy grail.
It's just a free device.
Oh yah HP makes money on a decent 4G connection for always on Internet and it rev shares with a Verizon.
You get 2 computers for the price of 1. Yes, commodity is the key word. But what the HPs and Google's of the world will do with it will destroy anything Apple is dreaming.
With the Flash letter Apple once again has put themselves out of the robust ecosystem that continues to evolve in computing. They don't like to play with anyone or help anyone. Apple is the lone wolf of the computing industry and they are relegating themselves to the same place that they did when they didn't license the Macintosh II platform. In the dustbin.
Just wait and see -- the business models of commodity mean that cheap devices become free no matter how good they are.
Very interesting read, most of which i agree with. BUT, right now, Apple is on their way to being able to control the distribution of all media, i.e. music, movies, tv, books, periodicals, newpapers, games, etc. This is the impetus behind their strategy, more so than cloud computing. iAd can only make money if the content is there to introduce it. No one else matches Apple's ability to sign content deals.
But I think you're underestimating what can happen when behemoth corps panic. The reaction MS, Google and now HP will have is to get the content providers to jump into their ecosystem as well. And they will, no one wants Apple to be the only one who can profit from all this. Initially prices will vary b/c so-and-so publisher wants to charge more for this and that. But in the end, market forces will prevail, and people will start choosing their ecosystem based on content prices. Ads are just a byproduct of this. A hugely profitable byproduct, but still just a byproduct, which explains iAd's timing.
Wow, a lengthy article on the future of computing, 110 comments and only one mention of Chrome OS?
Chrome OS is Google's champion for 5+ years, not Android! One thing I don't get about Apple's strategy is how they expect to control web applications? If they don't allow something in the webstore, people can always just create an HTML5 app for that ... Apple will get none of the profits
Charles,
You forgot to mention one thing, and that's how everything old is new again.
What I mean by that, this strategy, Google's, Apple's and even Microsoft's, is that they are slowly returning us in many ways to the days of mainframes and dumb terminals. Except now they like to call it cloud computing and thin clients.
It's only that now they can do it over broadband and 4G.
I'll still want my desktop though...
"WHat the doormouse said" should be required reading before dissing this article.
Over and over again in silicon valley have the big established companies failed to get onboard, and the small guys won. APple was one of the small guys, now that they are a behemot, things will get interesting.
For people who object that the PC is not dying:
As a user-end commodity, it's not. Lots of people are still going to own PCs of various standard types (desktop, notebook, netbook). But from a commercial, manufacturing sense it's dying much as the calculator market is essentially a dead end -- it is on its way to becoming an ubiquitous and cheap commodity with little in the way of profit in making it because of high competition among low-cost producers.
Similarly for PC software. In this case, the thing that is keeping commercial word processing / spreadsheet etc. packages alive (from a continuing revenue stream point of view) is the corporate market, which isn't fully comfortable yet with OpenOffice (or Google apps) and provides by far the greatest cash flow for those applications. When they do, gradually, the revenue streams associated with the core commercial business applications will be much reduced.
Apple's strategy is aimed at a subset of the home market here, so it's beside the point to look at issues with the corporate market or users (like software developers) for whom a locked-in ecosystem is not attractive.
Also note that in synch with the development of cloud computing is the scaling of larger and larger portable memory storage. Individuals who want to keep some data private can easily store it in local flash drives which they carry around with them, making both private and cloud-stored information available wherever they go, from whatever machine they're at.
I think Charlie's generally right about Apple. What I find unclear is exactly what's going to happen in all the other niches which Apple isn't aiming at but which are also going to be heavily affected by the same general technical changes.
Apple specifically builds premium devices. There is no sign that they are interested in making devices affordable for the billions of people in the world that don't own an iToy.
Atom and Arduino are the leading edge of what is to come, not wifi-web-and-touch-screen. Embedded, powerful and cheap, oh so cheap. iphoneOS doesn't run on these platforms and the Ghanaian farmers watching the forecast, doing soil sampling with cheap sensors and making futures markets for coffee beans on their cellphones don't care about that walled garden -- they're too busy working on the real future to care whether there's a front-facing camera.
To borrow from an old "if you could shoot yourself in the foot in any language" thread, the walls of the Apple garden are moss-covered brick, ivied and topped with granite capstones. Wrought-iron gates are set into exquisitely carved Gothic archways every mile. As you get approach the wall, you see that the bricks appear to be engraved with Runic symbols. Stepping nearer, the runes start to shimmer and twist in unpredictable patterns. Behind the wall you hear the sounds of pageantry and children laughing. When you attempt to throw your Windows laptop over the wall, a laser slices your hand off at the wrist and incinerates the computer.
Yes, there's a world behind that wall, but it's not what it seems. Instead of bloodying your knuckles trying to scale it or losing your mind decoding the writing, focus on the real issue at hand -- how to use our enormous *non-Apple* processing resources to solve climate change, resource depletion and MRSA.
One last thing -- multimode fiber easily carries 1Gbps. It's what the last mile is made of.
Uh, 150Mbit shared, vs 300-500+Mbit on any cable network in the country. DOCIS 3.0 has been out for a few years now. Its a fairly low cost rollout for DOCIS 2 systems, and has more bandwidth (due to channel bonding) than your going to get from wireless for the next 50 years.
So, why is my internet connection only 24/2Mbit? Because nothing in my area can complete. Every time AT&T or someone shows up with a faster connection, TW just goes in the back room and uncaps my modem a little more. When you can actually get 30-50Mbit over the air, TW will just send me a new modem like they did for the DOCIS 2 rollout.
The wire line, PC users will be streaming real HD or x4 HD, and everyone using their iPads will still be stuck just trying to read their email in a dead zone, or waiting 3 minutes to load Google beside the 1000+ users sharing the cell with them.
Some great counter-arguments against the pure 'in the cloud storage' theme in this post:
Justa_Fool | April 30, 2010 17:38 - there will always be some kind of a market for them. Just as there will always be some kind of a market for a "sports car" that goes faster and turns better than the family station wagon. It's in the human nature.
Marc Mielke | April 30, 2010 19:16 Are end-users really gonna upload 100+ GB of family photos, personal stuff, movies, and, yes, porn somewhere outside their home?
Counsel | April 30, 2010 19:32 As for storing info/data on the cloud, I do not advise this-not if I need the data. Cut lines between Asia and the rest of the world, increases in data theft, and other network issues only make that dangerous. How can a firm guarantee uptime if you don't control the network?
Charlie, your counter to these comments seems to be:
Charlie Stross April 30, 2010 16:03 - Local storage works brilliantly ... right up until you're mugged/your house burns down/the dog eats your server. There's really no substitute for redundant networked storage.
However, isn't there a critical distinction between an online backup vs not having any local data and exclusive cloud storage of data?
- Personally, I would like remote storage in case of "mugged/your house burns down/the dog eats your server" but that remote storage must not be Facebook or Google who demonstrate cavalier attitude towards data privacy.
- Sales of portable backup are increasing with decreasing costs.
- Its just not feasible given current/projected wireless data cost/capability to sync/upload mega data.
I want local storage, remote access and backup via encrypted channel.
A good analysis, though such a walled garden apple style ecosystem is hostile territory for opensource. And unfortunately, opensource offers tangible benefits as in mana from the sky, wealth for free. Trying to run a walled garden soviet style central planing economy and keep everybody out who's not wanted, and trying to compete with an economy 2.0 where mana falls from the sky, is like trying to run 100m with one leg tied behind your back.
If what you analyse is what Steve Jobs is trying to do, he's doomed to fail spectacularly. And of course there'd be the nagging little thing with monopoly regulations if he should succeed against all odds.
Worldwide, Apple doesn't even have 3% of smartphones marketshare.
Reference:
http://www.idc.com/getdoc.jsp?containerId=prUS22322210
If you read that study more attentively you'll see that the marketshare they quote is for all phones worldwide, not smartphones.
If you look at smartphone sales in North America (the leading smartphone market), Apple is snapping at RIM's heels in #2. (Source.)
I've been reading the latest Steve Jobs news (http://www.socialnews.biz/tag/Steve%2BJobs) and I would say that he won the debate. Apple is a strong system because they don't allow third parties to make their system unstable. It is a big advantage of the mac over PC.
Okay. Just one thing... project your scenario 40 years into the future. Well hell, no need to go that far, just 10 or 15 years.
We will have two or three cloud providers with non-interoperable apps with all the incentive to lock us into their and their own services. Us being people as well as companies. Theyll be incentivated to provide ERPs, POS tools, private content (bookstores, videostores...etc), but we will never be able to share them with others seamlessly unless they have something that will be compatible. Keep going this line, and they will break the web and on we go to a scenario not that different from the elden days of the unix wars or the browser wars.
In this scenario, I predict some company with money that is loosing to this two or three providers could re-enter the market (microsoft, hp and/or IBm, thats my bet, against google and mac... man, history does repeat itself) by providing a freePad with a freeStore and OpenPlatform applications focused on working with any and all kinds of devices that will comply to an open specification. The grislock will be broken like ms broke the unix market idiocy of its time or like mac broke the soho dynamic of its time or like google broke or like foss liberated the web to go where it wanted... you catch my drift.
In the end, i think, you are seeing only a little part of the great big wheel that will (mhm) keep turning.
Why did you delete my comment?
You are an apple-fan or some paid writer..
when Jobs announces that he's renaming the company ipple next Tuesday this will all be confirmed…
Devil: you are a troll.
Now fuck off, and don't let the doorknob hit you on the ass.
Somewhat off topic but very relevant to this discussion:
http://futureoftheinternet.org/
Very good book that goes beyond Apple and Google, and puts into perspective how and why the Internet became a success. In a nutshell, Jonathan Zitrain explains that the progress and innovation in personal computing and particularly web-based innovation were boosted by the freedom that the Internet and PCs offered, as opposed to the proprietary services from the old days such AOL and Compuserve.
Now Apple and everyone else is trying to put the Internet in a box and, in turn, will remove the freedom that developers enjoyed. The perfect example is the hatred for Flash. You want to develop an iPhone app that uses Flash? Well you can't. Because Apple says so.
Steve isn't thinking too much like you. He saw a market for a device between notebook and a netbook, and dove in. The Itunes was a success because he gambled on the success of a vertical business. The App store was a spin off from that successful idea.
Now the idea of "the cloud" killing off the PC world is a big big change requiring very slow adoption. I'm sure Apple sees it, but is something a competent management will plan slowly for, and not make bet big bets on it.
Buying lala.com... I think it's pretty obvious? Steve Jobs is trying to create the Apple Network--a major broadcast network operating over the internet.
Personally, I think the psychological impact of major surgery has aggravated his egotism. & what will happen to Apple when he dies?
"You'll just be surrounded by a swarm of devices that give you access to your data whenever and however you need it."
As long as they float by themselves like this.
To me that seems to be the main point: Apple's love relationship to media providers puts them in the perfect position to enforce RIAA's wishes for digital media, i.e. a pay-for-play model in which you pay for access constantly (think streaming radio). Media companies have had a hate-hate relationship with the Net for quite some time, doing everything in their power to squash Fair Use and legal sharing. Apple's iTunes was a godsend to them precisely because of its "walled garden" features.
Most people wouldn't know the difference if pervasive wireless broadband would enable them access to their "iTunes Online" from pretty much anywhere, but we are to far away from such deployments for such a gamble to pay off. Twitter crashes, Google DDOS attacks and, for gamers, the PS3 apocalypse should be a good enough argument against putting everything in a cloud. Bottom line: Apple really wants to be a gatekeeper for content as that would ensure them a revenue stream for years to come, with an extra layer of control provided by their hardware lock-in. Microsoft's dream turned to reality by their historic nemesis.
Also, as many have mentioned, one must keep in mind the rest of the world and corporate IT, where Apple is nigh irrelevant and Google, Microsoft, Oracle and others reign supreme. Apple quite simply doesn't have what it takes to even play in those markets and has never wanted to. So my money is in Google, Red Hat, Canonical, Amazon and others in the whole cloud computing fad.
Absolutely agree with you on the commodization of the PC industry, the transition to the cloud will be complete when a derivatives market is established around it, trading futures in compute cycles seems like a good time (Collateralized Storage Obligation scandal anyone?). Also, there is another industry that faced this sort problem: textile/fabrics/fashion. Textiles became so cheap that they were commodity items, and thus we have a fashion industry. The enormous accessory market for smart phones shows that these commodity items are already fetish items. Most of these accessories amount to a sweater for your phone.
@Jacob.
There's a couple of things here. Firstly, for the US market where Verizon is making a lot of hay out of their LTE "trials" in Seattle and Boston - it's a completely new network. But even in Europe where, you're right, it's an evolution, it's not necessarily a straight evolution when you look at the patchworks of 3G networks with different BSS/BSC and Node B configurations all doing wonderful inept jobs at what they do. Scandanavia has some lovely network geometries compared to say, the UK, where I know of 1 carrier where they've 3 different vendors for 3G Base Stations and Node Bs in the London area alone. Germany is an even bigger mess.
Then there's the phone modem side to... so I'm reserving judgement on that.
Likewise, while you can make a web app kinda look and kinda feel like a native app, it still won't work like a native app. At least not unless somebody finds a way to get the different OEMs to start working together (yes, Nokia, Apple, HTC, LG, Samsung etc... I am looking at you)...
So if you want something with secure, persistent on device storage (a must for certain applications), easy access to phone system functions, and so forth, you're not going to want to build a web-app.
It's one of the problems I can see for solutions like Phone Gap - that in the end they'll end up as complicated as writing native and then lose their advantage for anything other than really "basic" apps - and if that's your target then something like "Appcelerator" will be better.
I'm against this.
This age of corporate control must come to an end. Really, this is right out of "Nuck Futz Fundiez" Jack Chick/Left Behind stuff where you need the "Sign of the Beast" (aka absolute allegiance and approval) to "Buy or sell in the market".
And furthermore, I think the big companies, or rather the dictator CEOs want to kill the desktops not to protect from copyright abuse but to kill any individual's ability to compete with them on any level. They don't want people to write software, record music, mix and compile, make games, unless they are part of some corporate team who's job it is to do that.
This must be resisted. I will not buy and iPad, iPhone or any Apple product and I'll be spreading the word for other people not to. No "Negative or false information" just the truth. They block out competition despite operating in a supposedly "Capitalist" society, they ban "Porn" despite America's free speech laws. What they do and say alone is enough.
The "theory" isn't really new (as in the death of the PC as we know it). Larry Ellison and Scott McNealy were running around proclaiming the same thing almost 15 years ago.
Some people want so bad for Microsoft to fall hard and fast that there is always a new paradigm right around the corner that is going to change everything.
Sorry Charlie, the iPad nor anything else is even close to replacing Microsoft's client dominance.
Also Charlie, your analysis about Broadband in the US isn't quite right either, but that's not surprising either.
Good analysis. Yes cloud computing is the future of software/hardware. But...I think Steve's ultimate motivation for creating a walled garden is simply to control the overall user experience. He knows that great hardware doesn't mean shit without great software. So they want/need to control the whole ecosystem in order to put out the best products that the mass market goes nuts over. Flash is something they can't control and therefore becomes a risk to the overall user experience they are trying to create/manage.
The PC industry has reached maturity with regard to what can be done with the established players all working co-dependently (MSFT, INTC, HP, DELL, Adobe, etc). In order to move the user experience to the next level, and provide new innovative features and usage models, you need to tear everything down and start over. Apple is doing that by controlling the whole ecosystem, by creating and molding the iPhone/iPad OS, hardware, software, dev tools, apps store, etc into one cohesive platform that is managed with the best possible user experience in mind. Steve's aversion to Flash is no different than a neighbor who complains about
Charlie, the US isn't technically the leading smartphone market - they're the leading high end phone market. Nokia haven't been able to break the US market but for smartphones they pretty much own everywhere else.
I've seen some extraordinary data on regional download numbers for Nokia's OVI store from Asia.
Nokia has just made another S60 push into the US with one of the top selling phones on T-Mobile - the interesting thing is that people generally don't realise that it's a smartphone they've bought.
The question does remain if that will drive people to download and buy apps for them though.
I like this little thesis however, I think you are grossly over-estimating Apple's chances. Apple gains all of it's press, creative mojo, & its synergies from one man. Steve Jobs. Subtract or delete it, & what is Apple?
Nothing.
Without Steve Jobs, even with this ecosystem, Apple is driven by his salesman presence & personality. Without Steve Jobs to be that pitchman, that creative force, & no obvious successor in sight, Apple cannot sustain itself. Think of it as the Empire parody. The Empire was this huge force in Star Wars. Power, resources, creativity, & force to control an entire galaxy. However, it was because Emperor Palpatine was using a version of "Force Meditiation" to keep his will permeating through the Empire. Steve Jobs is the same way with Apple. If that Emperor is gone, the Empire lost that cohesion that sustained it. This applies perfectly to Apple.
In my opinion, this gambit is doomed to failure. You just can't groom another Steve Jobs. The person has to be born with that talent. I think he's a one in a billion type of genius. Once he's dead, Apple has no future. They won't have that kind of synergy or focus. They won't have someone that inspires the masses. Their products will fall into less than stellar design, Microsoft will fully shift into the cloud because Ray Ozzie can guide them into the next couple of decades, Steven Sinofsky can continue to guide Windows development, & they have a future success post-Bill Gates.
If you take a good look at what's happening with Google Android, Google is rapidly gaining on Apple. What happened to Apple at the hands of Microsoft is being repeated with Android. Android will become the platform just like Windows. It will eventually take Apple's spot & challenge Blackberry's dominance. History is repeating itself again, because Steve Jobs is re-creating the mistakes of the 1980's Apple. It's Mac OS-X Team vs the iPhone/iPad/iPod Touch team. I can see in about 15 to 20 years, Pirates of Silicon Valley 2: Echo of the Past being made. Where Apple is torn apart because Steve puts the Mac team vs the iPhone/iPad/iPod team. Just like Apple II/III vs the Macintosh team. Steve dies, the products aren't as good, the ecosystem keeps them alive. However, they aren't innovating new hardware as the x86 dies & new hardware takes it's place. Apple bleeds money over time & gets back to 1997 again in 2017, 2027 or 2037. This time no company comes to Apple's rescue, it get bought up whole & disappears.
It's like I keep saying, "And you'll see why (2017/2027/2037) will be like worse than 1997."
Apple is doing the same thing the console makers do: curating content on their platform, and taking a cut.
The only difference is that Apple puts more emphasis on maintaining brand, while the console makers are primarily concerned about trying to wring a profit out of their hardware.
The Apple vs Flash conflict is similar to Sony's recent decision to kill Linux on PS3.
Apple is doing the same thing console makers do: curating software on their platform, and taking a cut.
Why won't Sony allow Linux on the PS3,and why do they pump out a steady stream of firmware updates that break homebrew attempts? Because they don't get any money from that stuff, and they are desperate to wring a profit from their hardware.
Just what industry rock have you been hiding under?
How many companies are going to want all of their vital applications and information handled by a 3rd party network? How many people are going to want to do their paperwork on a 3'-5' screen? Who's going to be writing out a 30 page document on a touchpad or write it out for letter recognition software?
The Cloud is a niche, and always will be. The majority of software in the world is not huge corporate creation, but tiny 3rd party groups, individuals with hobbies, and open source designers. The Cloud is only appealing to big companies, and people wanting exposure (who are willing to sit through an extended review process). Everyone else knows full-well that local storage is much more efficient for resources, and far more convenient.
And, let's not forget, the Cloud is only useful when internet is stable. Take a lesson from Ubisoft: it's not. Take a gander at your ISPs which are fighting tooth-and-nail against any form of competition or net-neutrality. You want reliable internet, you're going to need the infrastructure first, and the companies in charge of that want to drag their heels as long as possible.
As for PCs, they will always have a market as well. That markets diminishing, for several reasons, but it won't die out. Ever. Companies don't want your mobile devices, they want hard, locked down machines that won't have any risk being misplaced or tampered with. Computer science, a still expanding industry, has no need for toy technology, and needs the heavy-duty power that won't fit in a hand-held. And no, mobile devices won't catch up, because the requirements will continue to grow as well.
I know you want to be sensationalist to get the readers, but try to tone down on the ridiculous hyperbole. The mobile market and Cloud computing is good for attracting the casual users - the ones who saw a computer as a web-surfing, email sending hunk of plastic. Yes, that's a large market, but hardly enough to kill off the PC.
I think the future will be super-thin substrate screens that are like paper. We will think "remember that first iPad? We thought it was so thin and now it seems like a brick." You will roll your paper-thin tablet into a tube and stick it under your arm, or place it into a protective sheath of some kind. If you want a bigger surface area to work with, you will have larger options, like entire desks or walls that turn on or off when you want them to.
Apple has already patented various 3D themes for a future OS, so I imagine that the OS will be 3D, data will be stored in the cloud, images and fonts will be gorgeous, and the discussion of platforms will be as relevant as talking about which typewriter is better in 2010.
I won't say there's going to be an aftermath, but a lot of things you wrote about are probably going to happen.
Hmmm, sounds like you might be interested in DropBox -- uses the cloud for making the data you select accessible on any of your machines (including iPhones), all nicely encrypted on the remote site, using ssl for transmission and available as a normal folder on your own machine (with a local copy maintained so when you don't have a connection you still have your file).
genoki wrote: "I want local storage, remote access and backup via encrypted channel."
Good point. As Terrabyte hard drives are about to cost $50, it makes more sense to buy 2 of those and connect those to the web from your home and clone the data to some other place of a family or friend you know. That is still cloud storage. Upload speeds, especially with Fiber to the Home, means you can stream any of your personal files over the Internet. Easy synchronizing of your personal files with cloud storage services by Google will also work. But using Google exclusively will not be attactive until Google provides 4-years of Terrabyte Cloud Storage at the same cost as buying 2 Terrabyte hard drives (plus few cents for bandwidth and electricity, notice the hard drives only need electricity when you need to access them on the cloud).
I find it really hillarious that Apple is valued at 4237 Billion. Haha.
All they have for a hope to keep such high revenue and profits is the iphone OS devices which already amount to more than half of Apple's revenues and profits.
$100 Android phones, tablets, laptops are coming. Those are without cellular contracts needed! Those are unlocked! Those are without profit margins to the brands of the past as real competition in the consumer electronics industry is really happening.
If you look at the manufacturing costs and bill of material of such current devices as the Nexus One and the iphone, those cost $150 to manufacture including all materials! Competition with Android means WE WILL be able to buy those kinds of devices for below $150 and NOT have to sign any 2-year contracts with any carriers.
Charlie speaks of Wimax and LTE in this article. I think he should also mention Google's White Spaces. The White Spaces are right now being regulated in the USA and Europe, and it seriously looks like White Spaces is going to happen. What this means is following:
- Free unlimited wireless broadband for all, everywhere, worldwide. No more telecom carrier contracts needed when buying consumer electronics with wireless broadband Internet access.
You may not want to believe that true Free wireless broadband will happen just because the 700mhz TV spectrum from one day to the next will be open and used in a totally unlicensed fashion by all consumer electronics manufacturers. Here's how the free 700mhz network will be built worldwide in the matter of just a few months:
1. check out FON.com, now imagine FON routers based on 700mhz. The router costs less than $20, and can be mass manufactured in millions of units in the matter of months. Now Google can simply organize to distribute those routers for $20 or for free to broadband users around the world.
2. Enforcing Net Neutrality (also imminent in both US and Europe), ISPs of ADSL, Cable and Fiber CANNOT block usage of their consumer lines with routers of free 700mhz Wireless broadband sharing.
3. Clever cloud based management of the White Spaces spectrum, BLAM! you have free wireless broadband everywhere and when you buy a $100 unlocked Android phone, tablet or laptop, it will come with WiFi and White Spaces (also called WiFi on Steroids), which includes unlimited access to wireless broadband including VOIP.
We are living the era of total disruption of all the previous business models. This is absolutely fascinating!
The view in your article may also explain why Apple have $40Billion sitting in the bank and they're continuing to grow it. They're planning and (assisting in) forming the future.
Great article and debate. But we're overcooking it all a bit. The iPad's opened up a new market (not replaced one)
I've certainly found the cloud handy, and now outsource 100% of my porn.
I think the industry will grow, becoming ever more complex, like it has done, it won't all suddenly land in one direction.
If this is at all close to Apple's plan, they are going to have to get a better grip on their own policies. It's in the news today that Apple refused to allow a game that is a cartoon version of participating in a seal hunt (like in Eastern Canada).
There's no blood, and lost points for trying to club baby seals, as it is modelled on the legal activity. But the app is not allowed.
Apple, meanwhile lets you shoot animals, shoot police officers, and flick pygmies into volcanoes.
Apple has thoroughly mixed "stopping things because they are bad for the platform" with "stopping things because Apple thinks they are bad". Having opened that door, Apple will likely find itself under more and more pressure to sanitize the applications.
If the cost of "a platform with no malware" is also "a platform of worldwide common denominator standards", then it would not surprise me to find the platform is incredibly safe, but bland beyond belief.
I'm also curious about the launch of the iPad in Canada. Canadian wireless carriers have gone out of their way to make wireless services a maximally priced solution, not a commodity priced tool. We don't have unlimited wireless here - even the iPhone comes with at most a 3GB plan. (Overage is *expensive*.) I don't think I've ever seen a pay-as-you-go data plan in Canada. This, of course, is great for the bottom lines of wireless carriers, but lousy for devices that need cheap data to make them ubiquitous. This same approach meant that Canada was one of the last places on the planet where you could buy a Kindle e-reader (we've had them for just over five months).
"As I'm sure you remember, it suffered - badly - from this when it made the move from 680x0 to PowerPC, when the only IDE capable of building PowerPC code was Metrowerks CodeWarrior"
Also, he probably remembers the relatively smooth transition from 68k to Intel at NeXT. Some 3rd party apps didn't make the transition, but that was mostly where the vendor was out of business or had lost interest in the small NeXT market.
After going from 68k to Intel, NeXT added support for Sparc and HP PA-RISC, and 3rd party apps made that further transition pretty much without a hiccup.
And this was for a platform that only ever had about 100,000 users.
Microsoft is not doing badly. In fact they are building their own version of the cloud services. Azure. They have the developer base and the money. The have the tools too. Apple has nothing of the sort. It always has been Microsoft vs. Google. To mix Apple with these two is just wrong. Apple is just trying to control content and light devices apps. The lion's share of data and computation (the profitable part) is at stake beetween big g and big m.
Ubiquitous cloud computing on portable devices is bad news for anyone who uses their computers to do non-trivial work. IPad like devices with low resolution screens are acceptable for the twittering masses who don't do anything more intensive than reading MySpace and drooling out twits several times an hour.
Overgrown smartphones, however, are not adequate for people who create content, write software, or do engineering. Try to do structural editing on a >2000 word document on an sub-XGA screen or running Catia on an A4-class CPU if you don't believe me.
If iPad-like dongles take over the computing market, prices for real computers are going to skyrocket due to declining demand and the few remaining hardware manufacturers realizing that they can charge whatever the corporate market will bare. Consider what dedicated image editing platforms used to cost in the days before Photoshop to get some idea of what we're in for. This will not be a good thing for freelancers or the creative industries in general.
The other problem with cloud computing is corporate censorship. If a cloud victim^W user publishes something their cloud provider doesn't like, there's nothing stopping the cloud provider from maliciously reading, altering or deleting the user's data.
Would you trust Apple not to wipe out all of your personal cloud data if your published something they didn't want to hear?
Just a heads-up and traffic warning -- this article has been linked from the New York Times' technology page. With a prominent blue "Antipope" sub-head.
I thought companies, such as Hewlett-Packard, Apple and Microsoft, took singlar verbs. Those companies, individually, are not more than one. "Apple is" not "Apple are."
All of that tirade to tell us that Apple is going into SAAS? Who else and their grand-mother isn't aiming for that exactly?
I don't see iPad/iPhone doing what you say unless/until there's a big dumb-packet wireless company out there that just does packet data. No ringtones, no wallpapers, no SMS, just dumb-ass packets real cheap and real fast. _MAYBE_ bundled VoIP and texting-over-IP.
Incumbent wireless carriers (IWECs?) will fight that tooth and nail, so to get there we need a large, geographically-diverse company to either spam the nation with unlicensed WiMAX or acquire a smaller wireless company and its licenses.
I vote Walmart, since they could rope off a section of their parking lots and lease them to companies wanting to rack some stuff in shipping containers (such as Google, Netflix, et al).
Cringely talked about this years ago, but with wireless getting bigger by the second, maybe it's time?
“ What would be glorious would be for someone to carry out such an attack at an Apple product launch…”
Here in the US, there are explicit laws against jamming devices, as seen in a recent action against a training center jamming cell phone use.
US freedom of speech protection might make it OK to encourage others to break the law, no matter how flagrantly. IANAL. My quick scan of Wikipedia suggests that ecouragement, together with a partial checklist for doing so, might be treated differently in the UK, where it could run afoul of anti-terrorism laws. IARNA UK L.
In Australia too, while Vodafone and Optus claim to cover 95% of the population with their G3 networks (Which even in the Metropolitan areas is very spotty) the Telstra G3 network does actually cover 98% of the vast area of this Country.
It always peeved me off when the iPhone was first released how many restrictions were placed on it because of the limitations/failures in the US tele system that were not present in other Countries that I travel too.
Fortunately, Apple are appearing to learn from that.
With an iPad a user still needs a computer with iTunes.
Yawn. Been there, done that. In the mid 1970's, it was widely predicted that corporations would not own their own computers in the future. Instead, they would have 3270 terminals hooked up to hosted virtual machines on IBM mainframes via SNA, or they would have ASCII terminals hooked up to accounts on mainframes elsewhere via packet-switched networks. A few smaller companies did try this when they computerized their own operations, but ran into the following problems:
1. Service guarantees: If in-house, they can guarantee access to their data. If outsourced, they must rely on the vendor, and how can they trust the vendor?
2. Data ownership. Who owns the data -- the vendor upon whose computers the data resides, or the company? And what happens if the vendor goes out of business?
3. Security. How can they be absolutely sure that the vendor is not in cahoots with one of their competitors, or that someone else can't get access to their data?
4. Bandwidth. You have basically unlimited bandwidth to your computers locally, so your terminals are always super-responsive. The 3270 protocol with its half duplex screen-transmit-based paradigm was a work-around to that problem, but an imperfect one.
5. Data retention issues. Corporations have a legal duty to retain certain data for certain periods of time. They also have a strong desire to purge said data the moment it's not legally necessary to retain it, because if it's still around if (when) someone sues them, it can be subpoenaed and used against them in a court of law. Without ownership of the computing facilities, corporations have much difficulty enforcing data retention policies.
The end result was that when smaller minicomputers and microcomputers became available, the time sharing services went under in short order, because the small to mid-sized businesses whose accounting etc. was done via time sharing services swiftly brought that in-house.
Now I'm hearing the same arguments made about cloud computing -- how it's easier than managing your own infrastructure, about how much more reliable the large systems used for cloud computing are, etc. -- and from a business perspective, it's dejavu all over again. Perhaps individuals will embrace it for their personal data. But I don't see corporations putting any of their corporate data into the cloud, other than data that has to reside there anyway -- i.e., the customer-facing data on their web sites. And as long as corporations are reliant upon in-house computing systems accessed from personal computers, there will be a market for personal computers, if only to access that one application that simply doesn't work "in the cloud" (for me, it's the corporate source code control system). As for the notion that the fundamental structure of how companies do business will ever change: Err, no. We've been doing double-entry accounting for hundreds of years now. Computers didn't change the fundamental structure of how we do accounting, payroll, and other critical business functions of that nature. All it did was make it faster and easier. 'Nuff said on that.
@Cumudgeon (sic) "Ubiquitous cloud computing on portable devices is bad news for anyone who uses their computers to do non-trivial work... prices for real computers are going to skyrocket due to declining demand"
No, it's not. It's a very good thing. Those clouds are powered by actual computers -- actual boxes with actual CPU, RAM and storage running in actual data centers. Those massive investments in scale are driving down the price of commodity hardware, and that's going right to the bottom line of personal computer hardware.
Ask yourself *why* the personal computer keeps getting cheaper. It's because *all computing* is now basically sourced from the same core personal computer components. With few modifications, the processors, motherboards, RAM and hard disks used in servers are the very same ones in your personal computer. They just use more of them, or use some that are trivially different (more cache RAM, RAID, whatever).
People thought that the advent of appliance computing meant the end of personal computing. It didn't. People thought mainframes and thin clients were the future. They weren't. Instead, PCs were the future of the industry, and they've taken over everything. We don't buy mainframes any more. We buy racks full of x86 and x64 hardware.
That's the future Jobs is trying to work around. Apple can't survive on the razor thin profit margins of a boutique PC maker in a world incredibly dense with powerful, incredibly high performance/price PCs. The iPod taught Jobs that making complementary computer products was a huge money maker. The iTunes store taught him that controlling the media channels and the application channels were huge money makers. The Apple store taught him that he could sell these boutique products in actual boutiques to enhance perception of value.
The assertion that larger scale of cloud computing will drive down demand for "real computers" is ridiculous. Servers are made of components. Demand for more server computrons will drive down the price of components. And that's all a home PC is: components. In a very real sense, the cloud is made of personal computers.
Any computer manufacturer that tries to create barriers to entry or vendor lockdown will be roundly spanked by a line of Taiwanese and Chinese PC components makers stretching around the world. The "cost" of a premium corporate server can only be slightly higher than the cost of building and supporting your own components.
It's not the worst time to be a hardcore PC hobbyist. It's the *best* time. We have seen the servers, and they are PCs.
One valid comment Dr. Jobs makes: over the years I have worked with several cross-platform development tools, and normally their UIs are indeed way inferior to native ones. The LCD principle forces it to be that way.
Charlie,
Well done. So many people are completely unimaginative about the power of progress these days, I absolutely LOVED this version of a near future that takes into account the pace of modern technological development. You spent some good time and thought on this.
~Truett Ogden
@Truett
Charlie, interesting analysis, but with some wishful thinking included, like the above phrase. :) The trend I've noticed is more akin to "access to your data when and how we feel like giving it to you." Apple's closed-system and DRM mentality offers plenty of evidence for this (e.g. anti-flash, anti-personally-developed apps, PDF reading okay in email but not via browsers, no sd-card slot or direct USB so one has to load pictures to the iPad via iTunes [yeck!], then can only view them with the specific apps provided, etc. etc. etc.). It's bad for innovation when vendors create artificial walls.
People seem ultimately to place less value on systems that don't allow the kind of "whenever/however" access you mention -- as noted in comments on the ebook pricing survey I did, many people felt ebooks had less value because of the DRM that didn't allow them to treat them with "whatever/whenever" access. But less value doesn't mean no value, so pretty, manicured, walled gardens will persist (and be annoying), while there will remain a market for open access and personal control over one's data. With a cloud you can never be 100% sure they won't change the terms on you (of both access and privacy, as there have been assorted examples of late), so I suspect personally-hosted data will remain desirable.
Mmmmm, while I fully agree there Charlie, you also have the problem of no control over your information when it is "out there" on the cloud or where ever.
I was recently looking at having to upgrade our server at work and in doing so also looked at the Hosting option. Full redundancy, back-up, etc.
While it may not happen with Google or Apple (But still possible), my biggest fear came when I asked myself, "what if the Hosting Company went broke". If the Receivers came in and just locked the doors, how would I ever get access to all of our critical Company information and datafiles?
Fortunately, we operate out of two locations and we were able to put a server in each to provide full and immediately redundancy in the event of any possible disaster.
Having a local home network as described here is good, but it should also be fully supported by another form of external redundancy - whether that be on Google, Mobile Me, etc.
But by the same token I would also build redundancy that I have control over into my Google, Mobile Me stored data with my local home network.
I think you've hit it on the head, but it's ironic that one striking note from a recent presentation by Kobo was that delays in the App approval process were a major spur to getting them to release on multiple platforms. Developers need to protect themselves against arbitrary screw-ups in Apple's approval system.
Let's not forget that Apple lucked into the whole App-store thing. When the iPhone launched there were no Apps, and this whole fantastically lucrative subculture only happened because of pressure from Mac developers.
What I find interesting about these kind of articles and the comments posted, is the all or nothing mentality people stand behind when expressing their thoughts about new technologies.
In my opinion, not very many new fundamental and game changing ideas have come about in the last 10 years. Companies are just trying to improve old ideas and spin them so we will spend more money. One age old comparison that comes to mind is the automotive industry. New features yes... but all cars have had a steering wheel for quite a while now.
As for the PC dying and becoming a commodity. The PC is here to stay... Period! Wasn't that the goal of Jobs and Gates? I have heard it said in a number of interviews that each of them wanted every home/family etc, to own a low cost PC.
Moving on....
iPad = Better Windows XP tablet edition and certainly isn't a new concept. Tablet PC died out because it sucked for inputting data, The iPad won't be much better.
Tablets are ok for consuming information and terrible for creating it. (My Opinion)
Next....
Cloud computing = new age Mainframes.
Cloud computing has many benefits but privacy and remote access issues loom. I do back up some things to google, flikr etc.. But I am not going to put my most private data in the hands of a company unless I absolutely have to.
Next....
Wireless data providers can be reliable and convenient, but will not be faster, more secure or MORE reliable than land lines for a while.
The trick to all of this is a mix of technologies and not convincing yourselves or others that technology is going in any one given direction.
I like to play games on my PC and my PS3. Each serve a similar yet separate purpose, but I certainly don't try to create content, or write code on my PS3 and would have little interest to do so on an iPad (and yes I know you can run Linux on a PS3, blah, blah, blah).
I like to watch movies on my PC and my TV and some times my phone. Why exactly do I need to choose one as my sole source of video.
I think it makes sense that I would use each of these technologies depending on situation and their is nothing wrong with owning multiple devices for multiple situations and purposes. Example... I have a media center hooked up to my big screen and I still subscribe to cable TV and have a DVR.
Long story short.... I want multiple platforms and devices for separate situations with each of them having some functional overlap and if they can communicate with each other..... even better.
I'm going to do something I've never done before -- post without reading 167 comments when I have a cold.
Have you seen what's happening to Gizmodo because they took the iPhone apart?
Great analysis!
"Stench of panic" in Silicon Valley, I would say mixed with the exquisite fragrance of opportunity, as usual in transition periods. "it's why everyone is terrified of Google", except all the companies who leverage our cloud platform to seize these new opportunities, eg http://www.google.com/enterprise/marketplace/
Ping me if you come in the Bay Area, I would love to invite you for lunch at the Googleplex!
This is completely backward: "Even if he's reduced to giving the machines away, as long as he can charge rent for access to data (or apps) he's got a business model."
The machines are the business model; the data is the means. What percentage of Apple's profits are provided by iTunes + App Store? Almost none.
The last thing Apple will ever do is give away machines. The core lesson from the last 10 years of Apple's success is that software and networks are most lucrative as the means to sell devices.
Nice explication of the current trends, Charlie. I think you're right that Jobs is anticipating the cloud and trying to put the Apple brand on it before Microsoft and/or Google can beat him to it. But I'm not convinced that the scenario will actually play out the way everybody is predicting now.
For one thing, as a lot of commenters have pointed out, Corporate IT isn't going into the cloud at a great rate just now. They've spent trillions of dollars over the last 20 years or so building up enterprise infrastructure and management fiefdoms and they're not going to let go of all that really easily. Consider one global company I've worked for, that spent 10 years (just under one person-millenium) and about US $4E8 building a supply chain system with "off the shelf" software. Now that they've bungled their way to something that works as long as they don't try to change it, there's no way they're going to move to someone else's cloud. Rent someone else's computer center and sysadmins, sure, but not push all their corporate-process-specific software out of their own control.
And note that RIM, the Blackberry people, have proven a perfectly good business model for companies wanting to supply peripheral computing power to corporate IT; it's just not the model that all the big boys are lusting after just now, because there aren't as many dollar signs hanging off it.
Incidentally, I haven't read all the comments on this thread, so I apologize if this has already been said, but you should be expecting incoming about now; your post was linked on slashdot a few hours ago.
Ever been on public transport and tried to use a phone? Its a foul experience whenever I've tried.
... then you haven't ridden on the Paris metro - you have signal many meters below ground level. The US is falling behind in infrastucture in all sorts of ways.
"Those clouds are powered by actual computers -- actual boxes with actual CPU, RAM and storage running in actual data centers. Those massive investments in scale are driving down the price of commodity hardware, and that's going right to the bottom line of personal computer hardware."
Actually no. Server kit uses CPUs that are a lot "hotter" than user PCs are fitted with, in some cases literally hotter consuming 200-300W. The server mobos are totally different too with all sorts of options such as Integrated Lights Out (ILO) for remote access that are pointless to fit on desktop kit. Memory -- any real server uses ECC rather than cheaper commodity non-ECC modules, and the module sizes are much larger too, as much as 16GB per stick to allow a single blade server with 2 or 4 processors to be fitted with 100-200GB of RAM. The HDs in servers tend to be SAS or Fiber Channel rather than SATA for improved throughput. The power supplies and cooling systems are again very different from anything in the commodity market.
Servers and desktop/laptops have differentiated to the point where the internals are effectively created by two separate semiconductor industries.
I think this idea of data "in the cloud" is silly. Does anyone seriously believe that people want all of their data stored remotely and essentially loaned back to them when they want to look at it?
Imagine the scenario you are suggesting. I write something from the "terminal" at my home or take pictures with my camera, and everything is stored remotely on the server at (let's invent a company) Doogle. What happens if Doogle goes out of business? Or a disgruntled Doogle employee decides to delete massive amounts of data? Or a disgruntled Doogle employee decides to sell all my data to the highest bidder? Or Doogle decides that maybe I should pay $1000 if I don't want my data deleted today?
Generally speaking, people like to own AND possess things. If you don't believe that ... Have you ever bought a book or a DVD and then decided you'd store it in a public place? Even though you may only watch that DVD once a year, I'm guessing you don't store it in the lobby of your apartment building and just HOPE it will be there next time you want it. But for some reason, you expect people to do this with their private data?
Ridiculous. Will never happen.
There are laws against hacking and interrupting wired networks too, with significan legal penalties for the perpetrators if and when they are caught. It's just that running a DoS across a wired network is much more difficult and more likely to be detected and stopped quickly, and physical access to the cabling is tricky to arrange if that is the route the vandals want to take. With ubiquitous Arduino-style wifi jammers it's a lot easier for the technovandal to get his jollies by causing folks who bought into the wifi cloud concept a lot of grief, and a lot more difficult to stop it happening if some imagination is employed by the vandal when choosing the jammer's operation mode and location (solar-powered jammers, anyone? Only works when the sun shines...)
Wireless jammers for mobile phones are available now for mere money; the specious argument for their existence is to provide interruption-free locales such as theatres, meeting places etc. so that phones will not ring and disturb the event taking place. They cost about 100 bucks or so and cover most commonly used frequencies, outputting as much as 10W of RF per channel. There's no reason that output couldn't be boosted to 50 or 100W with a low-cost external RF amplifier to take down mobile phone service over a couple of square kilometres or so.
The other alternative I mentioned is SYN-flood style attacks where a wifi-equipped PDA or laptop running rogue code attempts to dominate or disrupt a local router and prevent it providing service to other legitimate users. Imagine a virus payload that could install such rogue software on vulnerable devices...
Your post mirrors, in many ways, my view of things perfectly. The PC is dead and the idea of "personal computing" is also dead, replaced with, as you say, gizmos, and a new era of "social computing". The writing has been on the wall for a long time and while some may scoff saying this "thin client" model was tried before, the reality is that there will always be a few pioneers, the Altair, the Sinclair, the Osbourne, before the match takes light as it did with the IBM PC.
The only exception I has is with the idea I read in your post that SJ is somehow in some life and death struggle for survival. The other sure, but Apple is firing on all cylinders and you need look no further than their last quarterly report to see that.
SJ is king of the world right now and well deserved too, perhaps not based on likability but certainly on merit. He is money in the bank to Apple and their shareholders and he can do anything he damn well wants to right now.
Apple isn't struggling, they are leading and the latest new you mention about Window 7 tablet is simply an admission of failure on the part of Microsoft and it's partner to be even in the save game let alone the same ballpark. People still need PCs and they will continue to need them for a while. There is a lot more going on out there than just people tweeting and listening to soft jazz.
Businesses still need to function, people need to work and that's a piece of the puzzle nobody has figured out beyond the PC. People will adopt personal technology at an entirely different uptake rate than their employers or our governments (our biggest employers) will.
I don't see a time when your local utility will be tracking your account in Apple's cloud. They may choose to send your bill through it but their cloud will remain their own. It's easy to carry to the vision of iTopia to the absurd as is often the case when we try to foretell a future which doesn't exist yet.
And Apple is going to do all of this on their iDevices... without a simple garbage collector; technology that has been around and proven for 15 years. Apple's vision of the future involves throwing developers back to the programming stone age. Garbage collection is a critical to modern programming as object orientation and even method invocation; technically you dont need any of them (if you write in assembly) but they sure make everything so much easier and more maintainable. Apple needs to stop hanging on to the past and allow its developers to write code the modern way. Screw Flash. Give me garbage collection or forget it.
Tony, your comment at about 38 contained a whopper of a false 'statement of fact'. You boldly and incorrectly seem to state that an app for the iPad requires that you first acquire it on a PC with iTunes and then transfer it to the iPad. If that is not what you meant to imply then I apologize for this correction. You do need an iTunes account which currently means you need a Mac or PC and the initial step in setting up a new iPad is to connect it via USB to a Mac or PC.
But after that you can purchase any music, movie, TV program or app directly onto the iPad without a Mac or PC anywhere to be seen. The same is true for an iPhone and iPod touch. On a day to day basis they can all be quite independent from any desktop computer. (They can be charged by connection to a power outlet USB).
This may be a distinction without a difference but it would not be hard to imagine a cloud based replacement for the functionality provided currently by a desktop computer. I mention all this because I already acquire most of my apps directly on my iPad and iPod.
"Have you seen what's happening to Gizmodo because they took the iPhone apart?"
It's not because they took it apart, it's because they bought a stolen one and took it apart. Receiving stolen goods is quite definitely against the law.
Engadget posted photos of the new phone. Apple did nothing to them.
Apple - read Jobs - may be gambling on free, more or less, monster pipes but really that is not their big hold card. All they really want is the music/app store to make sure that anything they sell provides them with their vig. Their guaranteed cut of the action is the point. The music store model, which has never gotten a penny from me, was the holy grail. Money for nothing and chicks for free. Apple found enough suckers to give them free money and all they had to do was use the Monsanto model. You want to grow plants on your ground, fine, so long as you pay me for the seeds.
As for the HP plan well they are seriously stupid. Android has a small chance in the same way that Linux has and for the same reason. Apple is the new Microsoft but without the soft and cuddly parts. Microsoft was the new Germany, Apple is the old.
(This from a guy writing this on an iMac and who still has NeXT for Intel software hiding in a room.)
"What would be glorious would be for someone to carry out such an attack at an Apple product launch..."
Your notion of what is "glorious" is profoundly stunted.
That's not glorious, it's petty, sociopathic ankle-biting.
I totally agree. Apple is building at LEAST one mega-gigantic cloud computing server farm right down the street here in North Carolina:
http://www.macrumors.com/2009/07/07/apple-to-build-1-billion-server-farm-in-nc/
Chris
I expect it's only a matter of time (probably less than six months) before the AppleTV is revamped into being based on iPhoneOS and the iPad CPU.
And hooked up to the App Store.
Whomp, practically overnight Apple would have a hi-def game console with thousands of inexpensive games and other apps, an ad network, and a network gaming infrastructure.
All they need are controllers, and iPhones/iPods/iPads could make pretty sweet touch-screen controllers.
"The reality is, that in 2015, you'll make this comment... And those 100Mb will be enough."
Well yes, by modern approximations of acceptable speeds they will, but just as you've noted that broadband services will improve, mightn't our demand for bandwidth also?
It's like saying one hundred years ago I wanted a horse drawn carriage instead of a horse, but now that I want a car instead of a motorbike, I should be happy because the motorbike is enough. The argument is still based on antiquated notions of what 'enough' is.
What happens when streaming Hi-Def 3D movies, TV shows and video games becomes the norm? That's still current technology, not to mention what other advancements the next five years will see. 3D was unheard of (in it's current incarnation) a few years ago...
"All they really want is the music/app store to make sure that anything they sell provides them with their vig."
Except lots of apps are free. No vig. And most are cheap.
Charlie, this is off topic but,... how can you, as a working author, spare the time to read all of these comments? I've spend nearly an hour catching up on them.
The only apple device I have is an iPod. The hardware is reliable, the interface great. If the data on it happened to be stored elsewhere that wouldn't bother me - as long as it was sufficiently secure (non-modifiable, non-deletable, not-readable by others, always available). The only advantage I would expect from offsite storage would be that I could recover it in the event of a hardware failure.
"The availability of 50+mbps data everywhere means that you don't need to keep your data on a local hard drive"
Wireless speeds don't necessarily translate to "I'm outsourcing all my data storage." You might do that, but to others it means being able to have local drives at their house that interface with the network.
This article seems to make the presumption that everything is a black and white binary choice, i.e. that one will either outsource to a cloud computing service or die. In point of fact, I see powerful desktops in the home as serving as a personal cloud server, something which the article does not seem to acknowledge.
Yes, big companies would love it if your vision of cloud computing becomes the one and only thing, but I predict you are wrong.
So let me get this straight. Your prediction is that in spite of after another consecutive year of record profits in excess of 100 billion dollars, the entire PC industry will explode in a fiery apocalypse any day now, and only Apple will survive, protected in its lucite and titanium walled vault. Then, as the fallout clears up, it will survive with its apps and developers dressed in white robes living off survival food and fending off attacks by zombie mutant IT companies and software hackers who used to work for what is now the charred remains of Microsoft, HP, Adobe etc?
Someone should option the rights to this pic. Sounds like a great popcorn seller.
"Microsoft is not doing badly. In fact they are building their own version of the cloud services. Azure."
Azure (Which is mostly "Oh, look: Microsoft is doing Cloud Computing. Yawn.") by itself isn't particularly interesting. The *real* strategic move is a piece of it that has received very little attention to date, currently code-named 'Dallas': http://www.microsoft.com/windowsazure/dallas/
Local storage works brilliantly ... right up until you're mugged/your house burns down/the dog eats your server. There's really no substitute for redundant networked storage.
A couple of responses come to mind here. The first is that networked storage doesn't exactly have a pristine reputation for safeguarding user data either and there is the small matter that your data is inherently more important to *you* than it is to third party custodians. Still, I can agree that the vulnerabilities of networked vs. local backup are different, which means that one can make use of both as a belt and suspenders approach to keeping data safe, which brings me to my second point.
That second point is that if I stipulate what you assert, all that does is demonstrate that the market for redundant networked backup is potentially long-term viable. With cost of local storage so low, I would still contend that there is no reason not to have the primary store of data be local, avoiding any potential latency/contention issues at relatively negligible cost. Put simply, there may be a good reason for wanting your data backed-up in the cloud (at least for those who, unlike me, aren't paranoid about data security issues) but there is no good reason that I can see for customers to want their data to reside in the cloud and the cloud alone.
Apple is clearly on the path to gain Market share focused to win through consumers purchasing in cyber space. The golden hand-cuffs are the cross-section of free apps mixed with purchaseable ones until you can only buy them using their products.
They are essentially reverse engineering Microsofts accomplished plan and its working.
Remember how Apple got started (IBM)
The Data Centers they are buying up will be to support their own information and power.
The sexy hardware will eventually lose it's sex appeal. By this time, Apple will become the evil Microsoft monopoly which will be securing itself daily from the hackers of the world.
E-Commerce has always been the play.....Amazon may come in second. SaaS and absolutely the cloud will be the model of the future as it is secured.
Well, I don't think american is Mr. Stross' first language. He's probably more comfortable with english.
Would you mind if I submit your blog post to OSnews.com? I found this really topical and insightful.
This is completely backward: "Even if he's reduced to giving the machines away, as long as he can charge rent for access to data (or apps) he's got a business model."
The machines are the business model; the data is the means. What percentage of Apple's profits are provided by iTunes + App Store? Almost none.
The last thing Apple will ever do is give away machines. The core lesson from the last 10 years of Apple's success is that software and networks are most lucrative as the means to sell devices.
CORRECT!!! Steve Jobs said it himself during his iPad introduction, "Apple is a mobile DEVICES company."
Apple reversed the HP model: Price the devices (printers) cheap to make a killing on the services (ink) - aka the Gillette model. Apple priced the songs at $0.99 to make a killing on the iPods. Half the apps in the App store are free, and Apple doesn't charge either the developer or the consumer for these apps.
Apple is content to let Adobe, Google, and Microsoft make killings on software as long as it doesn't impede Apple's ability to sell hardware. In his words on Flash, Steve Jobs pointed out that half of Adobe CS users use Macs. Excel, Pagemaker, and Postscript were the killer Mac apps of the eighties.
On the other hand, Steve Jobs won't ever let another company hold Mac software hostage again - if he can do something about it.
@164:
There are many narrow hardware bottlenecks that have to be supported for low cost content creation to continue on relatively cheap consumer PCs. For example, most content creation from writing to video editing requires large, high resolution displays. Displays suitable for more than twitter are already in short supply; shrinking the PC ecosystem towards some CEO's wet dream of SAAS will only make things worse. Don't expect the server market to help here as most servers are headless.
There was a period of a few years--ending only a few months ago--when it wasn't possible to buy a desktop monitor good enough for even hobbyist photo editing without spending over $2000. It only ended when the LCD industry relented and started production of sRGB eIPS panels.
It doesn't take a lot of shrinkage in the PC ecosystem to put up barriers to entry for content creation.
This is a brilliant article, actually. As to whether the time frame is right or not, I could not say but just as in StarTrek, that ubiquitous computer is just there...always and that is what is beginning to happen. Right now, it in the home where there are often several computers lighted and running and waiting for orders. Whether we go to cloud or some other methodology is not so much relevant than that for certain, most people have no real need for a number crunching desktop or even a laptop. They just want to be able to access Email and surf the net, learn on line and so forth and for that, a decent bandwidth will do the job. Considering for ten years I suffered on a 26k modem but did not give up, goes to show how important even a terrible connexion keeps one tethered to the world outside. Just remember the very first time one logs in on some file or webpage across the world. I found myself looking at files in Finland on InfoMac. That was hit..the needle went right into my artery and I was hooked. This was 1994. With the numbers of changes that have occurred since then, how could changes not accelerate faster and faster? Apple is doing the smart thing and you are correct in their reasons for doing what they are doing. Smart, simple, and fast is the name of their game. They knew this with their good webpage from years ago. They had this figured out that we just want to connect NOW, we want safe downloads,safe software, easy and dependable updates, no crap on the computer from Apple and a beautifully built machine. People are willing to pay for this..up to a point but that point is dropping as people also realise that they do not need even an iMac. They need an iPad and that is more than enough. What with tens of thousands of applications ( apps) and piles of music..how can the average user need more? Throw in a super fast 4G..it is a no brainer. A desktop computer is for specialised purposes but for the rest you have it well figured.
"Elsewhere, in the developing world, the market is still growing — but it's at the bottom end of the price pyramid, with margins squeezed down to nothing."
...the real reason why the U.S. is missing the boat - this is all we have to say about billions of new customers finally getting a voice in our global economy. Try to stay away from Steve Job's cool-aid everyone ;)
Okay, this is the US perspective, but I don't know of a nation in the world (maybe France?) that doesn't have the same issue:
Where the hell is the cloud going to be built?
We've already got server farms that have the power needs of small cities.
How many more of these cities are we going to build? And where, pray tell, are we going to get the power for them?
We've already got every undeveloped acre in the US BLM deserts spoken for by multiple speculators, not that most of the projects are viable, but we'd be insane to actually build all these projects (downwind from scouring sand dunes? Great place for a solar farm). Ditto for wind farms, which have fewer issues.
This is the thing that bugs me about the cloud: we can't support the electrical infrastructure we currently have, and the cloud is going to need massive amounts more. Perhaps, just perhaps, switching to cloud computing is going to help force people to fix the mess we're currently in, but I'll believe it when I see it. Personally, I think it's going to crash and burn in the long term.
Most of the power project advocates in the US are pumping out 90 percent BS, to the point where they're ignoring good sites for power plants and line corridors to run them through wilderness areas, apparently just to fuck with the environmentalists. It's stupid, but there you are. That's my definitely biased opinion, but when they want to build downwind from windy sand dunes and across huge desert washes in flash flood country, they're doing it out of blind greed, not because they have a clue.
I hate to think of what will happen when these clowns start building nuclear power plants. Complex energy technology and greedy investors don't seem to mix well in the 21st Century. I suspect some of them are investing in cloud computing as well.
I do agree with the analysis of Apple's and HP's plans, and what Google's doing with Android. The article does somewhat ignore the open-source alternative for skilled users, but I'm confident we'll continue to hack around, regardless of what the majority does. What I disagree with is the practical implementation of broadband wireless on the scale described.
Granted, I expect wireless service providers to stream bits over every scrap of spectrum they can get, and I expect wireless chipset and equipment vendors to continue to develop ever more advanced codings to stuff more bits-per-second through a channel. But I think some real physical limits will limit the growth of available bandwidth, and growth in subscribers will cause demand for bandwidth to outstrip supply.
First, spectrum: there's only so much available, and as some commenters above pointed out, it's not all usable for all purposes. Lower frequencies travel further and penetrate buildings better, giving larger cells covering more subscribers, while higher frequencies allow smaller cells and more density, but suffer more from obstructions. Spectrum is also jealously guarded by incumbents, expensive to increase your share of, and there's only so much that's usable. According to Wikiedia, the North American GSM bands total about 690MHz, but being generous let's say in 2015 we may have 1GHz of spectrum covering the dense urban areas.
Second, spectral efficiency. 802.11a-1999 delivers 54Mbps on a 20MHz channel, or 2.7bps/Hz. 802.16d-2004 gives around 5bps/Hz, and we can expect 4G and LTE to do better. 802.11n-2009 claims a theoretical 600Mbps on 40MHz, but that's acheived with four overlaid signals using MIMO, so the actual efficiency is 3.75bps/Hz. MIMO helps, but it makes devices bigger since the antennas need to be separated physically and polarized differently. Channel coding is one place where Moore's law does help, since more powerful DSPs allow more efficient modulations. However the Shannon-Hartley limit still applies, and even with super low-noise designs you'll eventually get into diminishing returns. The five years between 802.11a and 802.16d almost doubled spectral efficiency in OFDM, and 802.11n is in between but benefits from MIMO. For a ballpark number, a usable 10bps/Hz is probably acheivable by 2015, giving 1Gbps available in the densest cells.
Third, efficiency of the MAC, QoS, and error-correction. 802.11a may offer 54Mbps, but the actual usable throughput is about half that, due to frame timing delays and the primitive retransmit policy. On WiMax networks, half, a third, or a quarter of the bits are sacrificed to redundant error-correction, which is still more efficient than retransmitting an entire corrupted packet. Good QoS also goes a long way towards improving the overall experience, but voice, music, and video will be serviced before the bulk data transfers used for cloud-storage. Even ignoring bandwidth lost due to QoS, a one/eighth reservation for error-correction would give a usable 875Mbps in that dense urban cell.
Finally and most critically, this bandwidth is shared among all subscribers in the cell. 875Mbps sounds great when you're the only one using it, but once you're sharing it with 20, then 50, then 100 other subscribers in that cell, it'll get painful. Your carrier will have some fraction of that bandwidth, and in order to recover the costs of building these dense high-speed networks, they'll have to sign up as many subscribers as possible. Multiply this by the increase in load from people transferring multi-megapixel images and HD video, even highly compressed.
As much as I love high-bandwidth mobile computing, the first two points suggest that bandwidth will be a slow and expensive resource to grow, the third suggests we won't get to use it fully, and the fourth suggests we'll have to share it among a rapidly-growing bandwidth-hungry subscriber base. Barring something which fundamentally alters that balance, I'm skeptical of a rosy future for ubiquitous mobile computing on wireless broadband.
Apple has iAd/Quattro & Siri. They have what they need.
Re-reading my post I see it's too late for me to be doing math (10bps/Hz on 1GHz bandwidth gives 10Gbps, not 1Gbps), but I think the difficulty of growing the bandwidth supplying relative to the incentives driving growth in demand will be the real problem.
Do you truly believe in this "Cloud Computing"?
Do you remember for or five year ago all PC/Technology magazines were all about "new paradigm" of the "Grid Computing" which "WILL CHANGE THE WORLD" ?
Who remembers about grid computing now? Did it change the world? I do not think so.
Each wave of hype about "grid computing" "cloud computing" ... in 4-5 years it will be "shadow computing" or ""ghost computing" or "blaze computing" or whatever new advertisement slogan marketing experts from big companies like Google, Oracle or Apple will invent is about new wave of making money. It is not about technology. What's new about storing your data in a network location? Sun was speaking about it 15 year ago. They just did not manage to wrap it into shiny box of nice marketing terms "cloud computing" and sell it with a high mark up as Oracle did with "grid computing", Amazon/Google/Apple do it with "cloud computing"
Have you ever tried to use "cloud computing".
With all its power outages and other problems?
Perhaps, you want to keep your data on your devise (which become smaller and smaller, have you seen 250G flash drive?) instead of copying it to a "cloud" data storage
I do believe in technologies and I do not believe in all this hypes of "cloud/grid/whatever computing"
Glad I'm not the only one more than a little concerned over the lack of any meaningful discussion on ergonomics.
Tactile keyboards for touch-typing. Looking at the monitor with head/neck at an appropriate angle. Screen REAL ESTATE that's not going to be replaceable in a portable format.
The best counter-argument I've heard is that the ideal computing device will be both. Just as laptops can already be docked for "desktop" duty, so will the pads be able to. But considering the limitations of physics on such machines (chips need cooling... 17" MBP baconator anybody?), I just don't see that becoming the norm outside science fiction for the forseeable future.
Rather, there will still be workstations with keyboards, monitors, and the whole shebang. These might be powered by desktops, increasingly by laptops, and to a small extent (a very small extent unless processor power and storage capacity ramps up rapidly) tablet devices in the next decade or so. But as long as there is that "dock", and a desk with those peripherals, we're not really into a "new era of the tablet". Thankfully. We just have desktop computing in a different form factor. And frankly, I don't see how that's much different that what we have now with syncing portable devices to stationary one.
Can't remember what this has to do with Flash.... something about walled gardens... oh crumbs, we've derailed!
Well. Be fair. It's a movement based largely on the "sexiness" of the hardware. There's no such thing as an Apple fan or an Apple press release/keynote that doesn't gush on about the "gorgeous" industrial design in a dozen different fetishist ways.
iPod. Just an Mp3 player, right? Didn't look like the other MP3 players out there. iPhone, initially groundbreaking due to multi-touch screen... now that there are many, people still almost always comment on competitors' "boxiness" or "lack of sophistication" in terms of design. iPad... come on people, we've had tablets forever. But they didn't look like the iPad and its magic "just waiting to be touched" design.
So yes, it's a movement. And it's about style. Some of it is the simplified, accessible UI (software). But the hardware is a huge factor, make no mistake.
Stuart,
Last time I tried to use Ubuntu One, it took all traffic on my network to 1000 ms latency and uploaded slower than dialup. I've subscribed to the bug on Launchpad, but I haven't heard about any movement.
Allow me a little skepticism that this will compete with Apple if they make a serious foray into this market.
Smoke and mirrors.
Graham: Charlie, this is off topic but,... how can you, as a working author, spare the time to read all of these comments? I've spend nearly an hour catching up on them.
I don't; most of these came in while I was in bed, sleeping.
Luckily I have forseen the likelihood of being simultaneously boingboinged, slashdotted, reddited, blogged in the NYTimes, and making the top ten on HN, and appointed Minions with comment-moderating privs to police the trolls, griefers and flame-merchants who drifted in from some of the sillier, more immature parts of the 'net.
Heteromeles: Okay, this is the US perspective, but I don't know of a nation in the world (maybe France?) that doesn't have the same issue:
Where the hell is the cloud going to be built?
We've already got server farms that have the power needs of small cities.
How many more of these cities are we going to build? And where, pray tell, are we going to get the power for them?
Iceland, apparently. Cold air for cooling even in summer, lots of uninhabited terrain, and copious geothermal energy.
(That's their business plan for digging the country out of its current economic crater, and it's not an obviously stupid one, although other factors -- the cost of laying all the trans-Atlantic cables it'll take, and the inevitable packet latency issues -- need addressing.)
Greetings,
Just to address the many comments that have said, 'Feh! I love my keyboard! After all, I'm a web developer and I can't do THAT on an iPad!'
Yeah. You're right.
But the large, large majority of people aren't web developers. They aren't writers. They aren't...us.
The majority of commentors on this blog post are not the majority of prospective customers, and one thing Apple's been really good at is building products that the non-technically inclined can grok.
Sure, YOU can buy 4TB of disks, network it with your Drobo, install an IMAP and SMTP server on your hand-built Linux box, and serve it wirelessly in your house while you sneer, 'Why would anybody want to store their data in the cloud?!?' and...check your gmail, because that's the email address everybody has for you, and subconsciously you're always worried about your DSL line going down and losing email... :)
Most people won't do that. They'll just check their gmail, or hotmail, or ymail, and not even really THINK about the fact that it's stored 'in the cloud'.
Here's the sad, painful truth. The average person doesn't care about the technical issues we find so transcendent. It's not that they're less intelligent, it's that they have entirely different priorities.
So they'll buy into keeping their data 'available' on Apple's network. They'll buy client devices that you (and I!) will scoff at, but that meet their needs. If Apple can keep the 'brand' of delivering an exceptional user interface (and thus we tie this back to Flash restrictions!), then Apple will capture a substantial number of those users with, what to you and I is just a 'dumb terminal'. Well, okay, maybe a 'mildly intelligent terminal'.
I certainly agree with the technical arguments that many have made about the wireless bandwidth not being supportable. On the other hand, the argument about cloud resources being unreliable is remarkably...backwards. Having built services on a 'cloud' system (EC2) it's...almost unreal how easily redundant you can make your system, and resilient to unexpected loads.
Great article, by the way. Very thought provoking, especially the core concept: whether the future you describe is correct or not, these companies are all acting like they believe that it will happen like that.
And that's the most important piece of all.
-- Morgan
Interesting essay. Some of your points could potentially be extrapolated to other industry sectors also facing challenges to their business models - automotive, hollywood, etc.
Looking at the price of DropBox, and even allowing for the price to halve every year until 2015, I still don't see moving everything into the cloud being that attractive - and I don't, yet, have much digital video.
And while streaming is an option for content produced by other people, I still generate over 1 Gb of still digital photos a year, so god knows what it is for people with children and HD video cameras.
(Although I can equally understand how they could be sold on a system where they don't need to think about backup, etc).
Equally, the App Store strikes me as one in the eye for the idea that 'web apps' / SaaS is the future. The web as delivery mechanism, yes, but I think there is still a - rightful - suspicion to rental models. (See also iTunes sales vs persuading people to subscribe to Spotify).
My guess is that what we'll end up with is some hybrid model - local solid stage storage is getting cheaper and larger all the time, and the best way to improve performance is going to be using that as a huge content cache.
Cloud computing depends on reliable communication.
There's a lot been said above about how easy it would be to disrupt wireless system, and I know some of the people saying it have the technical knowledge to back up that claim.
My wired broadband has lost contact several times this year. In every case, the info I have points to the ISP's hardware suffering from a mammary reorientation event.
Getting this sort of thing to work is going to need some big changes behind the scenes, in how the system is built and managed to be reliable.
Not the computer: the system. I could trust Apple to produce a reliable computer, but how many different parts would I have to trust for a reliable cloud?
A lot of this sounds like sheer lunacy on Apple's part, and pipe dreams from a LOT more people.
I don't think people in Europe, or even the Urban areas of the US, appreciate the SHEER SIZE of the US.
Outside the coastal metropolitan areas, and a few large cities in the interior, people are still using dial-up from having NO OTHER CHOICE (other than satellite, which is expensive and sucks for the gaming aps many want broadband for). There is LITTLE hope of these areas getting reliable 3G coverage, let alone the things to come mentioned in this article, in the next decade.
I personally have several gaming friends (all of us in our 30s & 40s) who recently WENT BACK TO DIAL-UP (from satellite), in Texas, because there was no DSL, no Cable, and no cell signals for using a wireless provider. I live in Kentucky, with access to DSL & Cable, but the cell service sucks for both phone and data, until you get 30-40 minutes closer to a major city, and then only in the I-75 corridor. In fact, my home is in an effective dead zone for wireless, for ANY carrier.
Any attempt to even THEORIZE a possibility of going to a wireless infrastructure in 90% of the land area of the lower 48, is right up there with the idiocy I heard one big-city environmentalist who suggested that everyone in rural Kentucky should get rid of their cars for mass transit (where LARGE county seats have 6-10 thousand people, and most counties are lucky to have a seat with TWO THOUSAND, and have the highest pop density in a 300-400 square mile county, and yet terrain makes it to where even in such small areas, it can take an hour to go, at highway speeds with no red lights (45-55 MPH, not an urban 25 MPH speed limit), a distance that is 20 miles as the crow flies). The population density and terrain of most of rural America CANNOT support wireless, or public transportation, or nearly everything else that the California coastals and Northeast US urbanites take for granted - not even with government subsidies!
I'm trying NOT to interject politics into it, but it does also influence why so many "blue state" types with their high-pop-density infrastructure just don't GET the "red state" types, whose way of life is so different BECAUSE those luxuries the other guys take for granted, CAN'T Exist in their area. (not simply "don't", but CAN'T, because of terrain and the need for self-reliance, in place of utilizing "group" resources like public transportation and cell service - let alone luxuries such as a SEWER SYSTEM).
A lot of number-crunchers and CEOs could really benefit from being totally cut off from their current lives, and have to live utilizing a rural community infrastructure. Maybe they'd make predictions and plans based more on the reality of how 90% of the physical area of the US' population lives (And consider making the MUCH more affordable wired upgrades to the infrastructure, than waste time on wireless pipe-dreams that only affect the 10% that are metropolitan areas).
10 years ago I was using a Pentium 133mhz with Windows 95 installed on it. Today, I use a MacBook that synchs with two seperate cloud servers where I store the majority of my files, plus a separate server where I store work material.
I get my e-mail on a touch screen mobile device that is able to receive a steady 10mbps connection, and will also synchronize files with my mac at home using a cloud service.
The evolution of this platform has gone by in the blink of an eye, and today, I do 50% of my work on mobile devices. I dont even have an iPad yet either.
Anyone who says that this kind of evolution is not possible needs to look to the past for answers. The internet snuck up on us in the blink of an eye, remember that.
All this high level BS about SJ's motivation, when in actuality it is nothing more than G R E E D.
To the fanbois, what planet do you live on? It will be ages before the US has the coverage and speeds needed to provide the infrastructure to support the this thesis.
"And why has MS canceled the Courier project? What do they think they know that other companies haven't figured out yet?"
I think they probably know they don't want to introduce the tablet computer version of the Zune. Just a guess.
In the late 18th, the 19th, and early 20th centuries, "company towns" began to be formed, small communities centered around a factory -- towns that had "company stores" to provide the workers with foodstuffs, clothing, fabrics, hardware goods, etc. In time, these stores came to be considered symbols of oppression.
Something similar has been approached, but until recently been never realized in the new world of data handling.
Imagine, for example, the furor that would arise today were Microsoft to engineer a new Windows operating system that would prevent totally using any word processor other than its own WORD application. Only this year did the European Union force Microsoft to present other internet browsers than its own EXPLORER on an equal footing in the latest version of WIndows.
But Apple, always fiercely defended by its ultra-loyal devoted partisans, has seemingly managed to create its own "company store," successfully selling one data handling device to which it totally controls normal access, the iPhone, and now presumably, the iPad.
Many now look with growing disappointment at the company's restrictions on outside resources, and its censorship or suppression of software it finds objectionable -- sometimes disgracefully on purely competitive business grounds.
Certainly, Apple has the right to sell what it wishes in its own stores, but preventing others from selling software to its products? That's precisely the 21st century update of the "company store." And forbidding outside developers to speak out about their relations with Apple -- is this not Big Brother in action?
When commentators have been critical on this point, Apple devotees have responded: "It's a company, and they can do what they want." And also, "There are contracts for the developers, and they signed them willingly."
Those writers are displaying a woeful misunderstanding the law, because, just because both sides have signed a piece of paper with words written on it, a valid contract is not thereby created. There are many, many reasons such paper agreement can be considered invalid---and chief among them being a finding by a judge that its provisions are against "public policy."
Also, I think there is a reasonable probability that many of the provisions of Apple's absurdly restrictive "contract" with developers for its iPhone (and presumably iPad) system would be voided with a court challenge, since they are clearly against certain public policies. Attempting to forbid, by a specific provision, an outside developer from speaking out about relations with Apple, and about the contractual provisions themselves, is certainly a BIG BROTHER, perhaps Fascistic, tactic! Should this muzzling not be against public policy?
Monopoly avoidance is another such public policy
Perhaps Apple's "company store" policy can also be voided, because Apple does have a quasi-monopoly, established by its restrictive operating systems, over the hardware universe it has pioneered.
Great post indeed.
May I just add that information management is a story of diastoles and systoles alternating virtualization and embodiment.
On one end, there is the cloud, on the other, there is ambient intelligence. In between stands large bandwidth, ubiquitous devices and personalized data-stream processing.
From comments above, I foresee that Europe might be the new place of choice for innovation ;-)
Flannel: To the fanbois, what planet do you live on? It will be ages before the US has the coverage and speeds needed to provide the infrastructure to support the this thesis.
Simple answer: I'm not American, and neither is Apple. It may be headquartered in Cupertino, but it's a global corporation, and it's not catering exclusively to the needs of rural-Americans. In fact, it doesn't need to cater to them. The urban market is more than large enough to provide a decent level of profit.
Your personal corner of the American continental land masses is not the centre of the universe: get with the program.
The investment to build any new processor chip is mind-boggling. Do it right, and billions of new customers at small margins may beat the small number of customers for high power who can pay high prices.
Apple has gone from 68000 through PPC to Intel, and then added iPod/iPad. Premium design, but they can change.
There's been a lot of changes under the hood in the Intel line, but I find it hard to see Microsoft as being flexible.
Apple keeps trying. It also has failures. This is Charlie's extrapolation, rather than a leaked Apple plan, but Apple do come up with new ideas, and sometimes they work big.
Sometimes they work more than once. Look at how they re-worked the all-in-one concept of the Mac Classic.
But for the iPhone they had to work with a mobile phone company. I live on a small, densely populated, island. I'm not in some remote valley, but if Apple picked the wrong partner, there's be no point in me buying the product.
And if I were thinking as Charlie suggests, I'd be looking hard at the intersection of politics and infrastructure. There are all sorts of ways of managing the development of the New Network, but I can see it being worth pushing certain political ways of thinking about problems. Can something such as this be left to an unconstrained market?
Well put. The limit to PC growth is lack of user skill, and we are up against it now. Poorly defended PC software, both OS and applications, requires attention and knowledge to keep it running. Many current users don't apply enough of either, through choice or ignorance.
When most people want to go somewhere in a car, they don't need to be outfitted with a complete machine shop capable of making any kind of vehicle, which is then used to build them a car. That's sort of like giving people a general purpose computer and then using it to run a web browser.
As an ol' "computer utility" guy, who bought the idea that the value of the computer was access to shared data back in the 60s, I view this talk of "the cloud" with mixed affection and suspicion. Yes, of course, the little pile of data on my private machine will be less useful than access to the world's information. Yes, of course, centralized experts COULD have better access to tech that will make computing cheap, powerful, reliable. But the details of how clouds will work are going to determine whether they avoid security, reliability, scalability, and usability disasters.
The great bulk of all wireless capacity that has been created or will be created is accounted for by cell subdivision, i.e. the creation of more cells. The first requirement for a cell is a backhaul link, and 2xE-1 lines is nowhere near enough any more.
Further, I recently noticed Google advertising for Salesforce.com administrators and developers; it looks like once the Google has slurped up all our data, analysed it, crunched it, and found the bits that are really valuable to it, it pumps them into yet another big cloud operation. I wonder if Salesforce uses Google Enterprise?
@175:
I was going to make my perennial plug for American jobs jobs jobs, in this case, a mondo rewiring project (to go with the mondo highway upgrade project, etc. People really seem to underestimate how much literal spadework needs to be done for a lot of projects.) But then this came up @223:
So I'll veer into the another Closing of the American Frontier post. It seems that a common thread for a lot of these discussions is the poor shape of rural regions in the U.S. - not just in terms of the condition of the fiber or concrete roads, but in terms of employment opportunities, education levels in the general populace, age, etc. The urban corridors and the more densely populated suburban regions are just fine, thank you very much.
Given the state of American finances and American can-do attitude, maybe instead of having hundred-gigabyte feeds reconnecting Rural America to the rest of us, just the opposite will be true: those regions incapable or unwilling to rewire their infrastructures to bring them up to 21st century standards will just lag farther and farther behind instead of having anyone else picking up the slack at the federal level. The King Log Way to going Galt if you will. Connecting the dots, I'd guess that things like broadband/cloud computing will cause the Square States to empty even faster than they are now and the mid-21 will see a net population flow to the urban centers and costal regions where amenities like good public transportation (perhaps nuclear powered) is the norm. So instead of saying things like "Why don't we try to colonize the sea floor or Antartica before trying to colonize space?", the new meme might be "Why don't we try to colonize North Dakota before trying to colonize space? The Dakotan regolith is rich in aluminum and other metals which could be processed . . ." :-)
I agree with Robert Cailliau who states he, and I, will always own a computer and not rely solely on any terminal or iPad-like device...
http://www.robertcailliau.eu/Blog/2010/2010-en.html
An idea with perhaps limited appeal: I thought at one time automobiles might be equipped with some sort of computing resources[1]. Noting fancy, but you could have a rig in the trunk (or some sort of secure space) with maybe a terabyte of memory and some modest chipware which could then connect with your tablet or phone or whatever via a secure broadband connection. Shoot, put the heavy machinery on your bike which is parked outside the coffee shop.
A silly alternative future perhaps, but one I find appealing because it's silly :-)
[1]This was more of a story idea, Americans being wedded to the automobiles the way they are. You might as well put in the metaphorical kitchen sink as well as the hobby room in the back of your mechanical mover.
While generally insightful about the trends and implications related to cloud computing, most of the comments miss the point entirely. Some have already hinted that the notion of Apple giving away their devices is ridiculous. Let’s look at the iPad hardware for a moment: A highly sensitive capacitive touch screen matrix with a beautiful IPS panel, high quality glass and aluminum surfaces, all powered by substantial battery reserves and a relatively fast processor. The individual components bring nothing new to the table, but the reality is that no competitor has even come close to presenting us with something this well put together. People did not know they wanted this, they did not expect much beyond unresponsive and inaccurate touch interfaces, TN panels with terrible image quality, easily scratched cheap plastic surfaces, or two to three hours of battery life. Without this foundation, any advantage of the iPhone OS would be diminished. It took Apple’s “magic�, or rather their insistence on design aesthetic, presentation, and usability to focus our attention and make people realize what they really want from this category of device. Their competition is only now beginning to realize this and has demonstrated it by pulling upcoming products that were essentially crippled netbooks.
What does this foundation enable? It primarily enables a platform that is intuitive, responsive, and largely invisible. In other words, it enables an ideal platform which with to consume content and accomplish the basic tasks that even the most simple users expect from their computers, but in a new and engaging way. The hardware is so good for its purpose that it disappears, as it should. Typical issues of malware and mis-configurations also mostly fade away. The question of whether something is stored locally or on the cloud is less relevant. While the nearly port-less iPad certainly benefits from this not-really-that-new category of services, the very fact that such premiums are charged for increased local storage should signal that they are not its raison d’être.
Average users just want computers to work, and as long as they fulfill certain expectations, they are judged on how easily and quickly they allow you to fulfill them. Anyone who has a computer expects to check their email, look at pictures on it, and watch videos of some sort. In fact, the iPad targets people from whom the computing experience is mostly limited to these actions, Apps are the very tasty icing on the cake.
There are important discussions about where computing is moving, especially with all the trends pushing towards a mobile environment. The iPad will not, by itself, be the device that pushes us significantly in one direction or another.
There's another reason why Steve Jobs hates flash:
It simply isn't that good! Flash quality pales next to newer technology.
Flash will be obsolete in a few years.
Spot on. I think a lot of the naysayers aren't thinking broadly enough, or strategically enough. In addition, many of them are getting lost in "forest for the trees details" that ultimately won't matter.
Time will prove you correct and this is something Microsoft is *really* running out of time on.
Hmmm... Moore's law isn't petering out. Look up memristor. That is just one example of a technology that is going to extend Moore's law for at least 20 more years. In fact, the memristor allows 3D circuits that are as tall as they are wide and deep. That means we could see a time in the near future where Moore's law becomes invalid because we have to replace the factor of 2 with a factor of 4.
You can divide computer history into "the before times", the killo age, the mega age, and the giga age. The before times are the times before integrated circuits and before we could put thousands of transistors on a chip. Based on the number of transistors on processor chips the before times end around 1969. The kilo age ran from 1969-1989, The mega age ran from 1989-2009. We are in the dawn of the giga age which should end no later than 2029 when we enter the peta age.
The model you are proposing is basically the same as the television/telephone model. A model in which the terminal device is relatively stupid compared to the network. The growth of the PC, the smart phone, even now the tablet, is based on the terminal device being at least as capable as the network. I believe that the economics of the situation continue to support putting the "smarts" in the terminal device and not the network. As devices get smarter following Moore's law and cheaper following Grosch's law people have a strong incentive to buy new terminal devices every 1 to 2 years as they currently do. OTOH, network operators would have to buy millions of devices, servers and disks, every year to keep up the the advances of Moore's law (and the much faster rate of change of storage devices). The network providers are incented to install devices and run them until they either fail or until continued operating costs exceed the cost of replacement. (That is a bit over simplified, but you get the idea.) That means that the "interesting" parts will continue moving to the personal device while the network become primarily a transport and transcoding service.
Why do I need to store anything in the cloud when I can have a terabytes of storage in my pocket? Why do I need cloud computing service when I can have a thousand cores in my pocket? I only want the cloud for back ups.
The network providers are fighting like mad to keep from becoming a pure commodity. Most hardware vendors realize they are nothing but a pure commodity. Dell figured that out 20 years ago. Failing to accept the fact nearly killed IBM. Now we see that Apple is fighting like mad to keep from becoming a pure commodity. To bad for them. Gates bailed from MS when it became clear that the operating system was about to become a pure commodity.
I've been watching this happen since before times and I hope to live long enough to see the dawn of the peta age. Watching the mighty crumble before the onslaught of technology has been fascinating.
Ol'Bob
I suspect that Steve Jobs is like me. He went to a couple Flash websites. He hated them, because Flash websites break the web experience in several annoying ways. After a few failures in visiting Flash websites he made the same decision I made; that any website made of Flash wasn't worth visiting at all, ever, for any reason.
In other words, Flash sucks. Apple doesn't suck. Thus there cannot be Flash on Apple. It's a perfectly understandable decision. (Whether Flash doesn't suck for other applications than building a website is an open question, but my suspicions are against the technology.)
Besides this issue, Charlie, I think you're right about Apple's ambitions. It agrees with my personal knowledge of where the wireless business is going, plus a few things I know that I probably shouldn't talk about on a public forum. However, the very human forces mentioned above - the failure of other client/server models, plus the preference of people for being in charge of their own data - probably means failure for some or all of Apple's plans.
Albeit your cyber-vision of looking around the curvature of the earth and through the cloud to derive from scifi movies what we have all seen to be thus far a meer fantasy. Apple dictates who your mobile carrier is and how your OS is run like feeding the smelly sheep in their storefronts. There is still something very personal about the Personal Computer. It defines and identifies a space you can call your own.
There is a feature in Windows 7 which I see (in principle, don't worry) as a much more likely way of having your data available everywhere. I'm stressing YOUR data in that sentence. Namely, you have YOUR data on your PC (or whatever) stored inside YOUR home, and you get worldwide access to it, including mobile access. I don't think people will ever store the bulk of their data like they store their money - there's no interest :D, just overhead and uncertainty.
I may just be old-fashioned, but I don't WANT cloud computing or software-as-service; I want a computer that will continue to function when I'm in the middle of nowhere, and not turn into a thousand-dollar doorstop; and programs that I can load, use, and tweak to my heart's content.
The reasons I haven't bought an iPad all run along these lines; if it doesn't have a big hard-drive and enough power to run Photoshop, I don't want it...
I am sorry, but what exactly is MS lacking behind in?
At this point in time, over 80% of the world´s computers are running some sort of windows OS; Xboxes are the best sellers for games platforms; smartphones are using the Windows Mobile system and they have just released a successful OS in WIndows7.
Apple hardly controls the hardware scene with 10 - 15% market shares; and in my opinion they are making a fundamental mistake - the walled garden - which will eventually come back and bite them on the bottom.
We sometimes forget that we are all consumers, and at the end of the day what does the consumer want? good, nice and CHEAP.
Apple mobile products while nice are hardly cheap, and as for being good, they create lots of problems; most notably flash, lack of converging technologies, no usb, tethering and the walled garden ethos. Anyone with a PC can access content and share it very easily, both virtually ande physically but can't with Apple products.
Also, future business models will have to factor in the overwhelming case that people don't want to spend money like they did before. Nobody will pay for something they can get for free.
pepgalego: there are numerous mistakes in your response -- around one per sentence -- but I can't be arsed correcting your misconceptions because it is evening and there is beer to be drunk.
Nice one Charlie..retweeted by me (and some guy named Wil Wheaton).
What is MS lacking in? Innovation. Otherwise they wouldn't be trying to pull a SCO on Android right now. Has anyone actually seen Steve Ballmer and Darl McBride at the same time?
They may have a point though, since Android lifted their C# language and is trying to sell it as 'Java'. Wait...
Yeah but the diskless net computer thing never took off, whereas the cloud computing thing is in full swing already and getting bigger and bigger by the day.
Golly, where do you guys live? I'm located about 5 miles as the crow flies from the main campus of the state university and the best any carrier can do is 1.5Mbps DSL. No wireless--I can see the tower, they just don't bother to point anything this way. I can drive an hour from where I'm sitting to where the best net connectivity is 28.8K modem over POTS. Cell coverage is spotty.
Satellite is available, gives about 1Mbps down and modem speeds up and runs about $99/month. I shouldn't complain, though. There are places in the state without telephone or electricity.
You guys really need to get out more.
Will you trust the cloud with your data when companies have been known to lose data, have data stolen, and go out of business? Well ... would you store your money in a bank? You don't have to go back many years to find that many people would say you were foolish for doing so ... for exactly the same reasons. Banks go robbed; bankers embezzled their customer's money; banks folded and all the depositor's money disappeared. But when was the last time that happened in the US or anywhere in the developed world? As societies we decided that it was important to make bank deposits safe, and we've done so. (Don't bring up the recent financial crisis. No one lost their deposits due to that. Yes, *investments* lost - but we really never claimed to protect those in anything like the same way.)
It's quite true that, today, you have little protection against a variety of cloud vendor failures. But it will soon be in the interest of vendors to put in place insurance, escrow agreements, and other mechanisms so that they are in a position to offer strong assurances that data you hand them will be kept safe. They'll have to do this to get commercial business, and it will then migrate over to consumer offerings as well. Beware, however: The price will go up. The business model of all the "$5/month for unlimited data storage" is simple: If they lose your data, they give you back your $5. While disks themselves are very cheap, keeping them working properly isn't. If you look at the plans they offer to their business customers, you'll see much higher levels of protection, SLA's - and significantly higher prices.
Does this mean you won't carry data around locally? I doubt it, at least not in the foreseeable future. Even if ubiquitous bandwidth goes up - and it will, but it will take a while - latency will remain an issue, network glitches will remain with us for a while, "dead zones" will never completely go away. But I suspect the model will be like IMAP: A "safe" copy out in the cloud, cached copies in one or more devices that are *usually* connected and quickly synchronized, but you can operate in disconnected mode and sync later if you have to.
BTW, if you don't like the money example - consider electricity. Which is more reliable: A local generator or the power grid? Unless your scale is rather large or you are in a very unusual location (at least in the developed world), it would cost you a huge amount to get your local generator to come even close to the grid's reliability. (This has nothing to do with moves to solar or wind or other localized generation - which almost always relies on the grid for backup when the sun "fails" to shine for long enough, or the wind to blow.)
-- Jerry
Is the iPad really that tethered? One of the more eye opening things to me with an iPhone was installing an app on the fly. Anecdote: my partner had to get a plane flight at very short notice on weekend. So we manicly booked a Qantas flight on the web and dashed out of the house to for the airport. And realised we didn't know the departure gate and time might be tight. So while I was driving along an express way and through a tunnel, she downloaded Qantas' app directly to her phone and then checked out her flight booking.
Of course that needed 3G and I gather lots of iPads won't have that, but surely they have the same app store functionality?
However, in general (backups, syncing a lot of other stuff) your tethering comment is on the nail.
- Cameron
I'm a believer in the cloud, but I want it to be My cloud.
It really isnt that hard to do, you just need a cheap linux machine at home, and a good broadband connection with high up and downstream speeds.
My setup:
Open Suse 11.0 running SAMBA, DNS vna VMWare server. All free and running on a PC it cost less than NZ$300 to build. It could be your old desktop. It has 1GB ram. This is enough to run as a basic desktop, and VMWare server for Linux apps. They are very light weight.
My DNS looks up from the root servers so I am not at my ISPs advertising mercy,
SMABA shares disks containing over 1TB of movies, music and ofice data files.
One VM is a Cyrus iMAP mail server that gets mail from my GMail and ISP account.
Another VM is my OpenDNS VPN. I Allow connection from SSH and the VPN through my firewall.
That is the hard part done.
Anywhere I have a net connection on any PC I can install VNC, or OpenSSH client and get a secure link back to home, then run any app I want just like I was at home, including streaming video and music. If I need an app not on my laptop, I simply remote desktop vis VNC to my server with a full desktop Linux.
What makes all this work is STANDARDS. SSH, VNC, HTTP and HTTPS, iMAP, iCAL etc. My cloud, available anywhere I have net access. I dont need iPad to have all my apps on it, I just need it to have VNC or SSH and a network conenction. This is what makes a netbook a NETbook.
I dont need a big laptop for everything any more. I only use my laptop now when I need a bigger screen than my iPhone.
The other advantage is don't NEED to have an iPad, but all the other major players except Dell have just killed their tablets. I cant believe I'm hoping Dell will provide some competition to Apple 8)
I dread Apple being in control of the internet - the company that brought you the mighty mouse whose claim to fame is the un-cleanable scroll wheel! They replaced it by making a mouse uncomforable to hold and no buttons at all! Strange as the Mac Mini is one of the BEST mini desktops ever. How could these designs have both come from the same company?
Creative types won't like this. Why? Let's take video editing for instance. One can easily fill a 500 GB hard drive with raw footage from a shoot - and that doesn't include graphics or even the final product! Imagine trying to download all of that data from the internet.
The interesting thing: Creative types tend to be loyal to apple, but if this is the model of apple, creative types will have to flock to PCs. Then again, as someone who has worked in television for ten years, I can honestly say that PCs do a better job at editing than any mac I've used.
Apple has gone from 68000 through PPC to Intel, and then added iPod/iPad. Premium design, but they can change.
There's been a lot of changes under the hood in the Intel line, but I find it hard to see Microsoft as being flexible.
Historically Microsoft has been far more flexible than anything Apple's ever gone through. Which given the sheer diversity markets its successfully competed in is pretty impressive. Its possible that this is changing, but people have written them off before.
Apple has changed a little in recent years, but they still remain essentially a niche player, who make high end hardware for wealthy consumers who are willing to pay a premium for design. Its a good niche, a profitable one and possibly one that can survive commoditisation. But its still a niche.
I have no idea what the future holds, but the following seems to generally hold for the tech industry:
1) Yesterday's kings are ususally tommorow's road kill.
2) Yesterday's desirable tech, is tommorow's cheap Taiwanese/Shanghai clone.
3) The future is largely defined by the company with the highest volumes, which given Apple's (quite sensible) business plan is unlikely to be them.
Fantastic article.
One huge miss though, Google. They are chasing this exact same future but with on big difference they are 10 steps ahead since from day one their were a cloud operation, focused on data and services, the most successful Ad revenue system ever created and in a much democratic path with its Android OS.
Apple's iron hand is probably going to be their demise or their limitation while Google with Android constantly tries to include hardware manufacturers, from smartphones, TVs and even cars. Google's vision os a unified ubiquitous platform(connected to their cloud and consuming their ads) is a much safer bet than what Apple is doing.
A lot of people seem to think that it's an either/or proposition. Either you control your own data or you trust it to a corporation. I think that we are approaching an era of virtual storage. For example, I have something like 4 computers that I use to some extent, 5 if you count my iPhone. I go to some trouble to make sure that any computer that I might use for a particular purpose has access to the most recent data that I might need, using a mix of manual copying and a patchwork of synchronization software.
But there is another model in which my data is buffered in the cloud. It still exists on my computer, or at least one of my computers, so if I am cut off from the cloud or my cloud provider loses my data, it is at most an inconvenience, as the cloud copy can be reconstituted from my local copies. If my hard disk dies, I can restore it from the cloud (Keep in mind that the average person today does not have their data properly backed up, and you'll begin to get an idea of the appeal). I don't have to worry too much about keeping track of where a particular piece of data actually lives--I can consider all of my computers and the cloud as one big disk. If I need a the latest version of a file in a particular place, it is there, with perhaps a synchronization delay.
Yeah, I know. Do I really want to trust a corporation with my precious data? Actually, most of my data is not that sensitive, and encryption methods exist that would satisfy all but the ultra-paranoid. And my data is already in the hands of corporations. My banks have my account numbers, my social security numbers, my credit card numbers. Anybody I order from has my credit card number at least transiently, and I'm trusting them to get rid of it after I place the order. And of course, the security of my home PC depends upon me, and everybody in my family, being careful about which links they click and which software they install; it's probably no more secure than the computers of the vendors I deal with. I'm sure that there are some people who are so worried that they use a credit card that allows them to create one-off credit card numbers for each purchase, but most people don't go to such trouble. And a product/service that appeals to most people will sell very well.
I haven't been able to read both Charlie's article and all 248 (as of the moment) comments. The comments I've read reflect most of the arguments for and against Apple, Jobs and their rule-the-universe plans I've read or heard.
To date, this has been principally what we in the US call an "inside baseball" discussion. The debate was mainly between us techies and geeks. As a perfect example review John@244.
That changed - at least here in the US - on Thursday 28 April. On that night, John Stewart, the brilliant comic host of the comedy news program, The Daily Show, went on an 8 minute anti-Jobs, anti-Apple rant. The proximate cause was the gestapo-like door breaking entrance into the home of Jason Chen by a California "Rapid Enforcement Allied Computer Team" police unit. Chen is the editor of Gizmodo who reviewed the iPhone 4 which was found in a northern California bar.
Stewart is a self-proclaimed Apple fanboy. But, he overflows with anger at the tactics Apple has used. Although his specific topic is the Chen/Gizmodo fiasco, it seems as though he may have heard some of the complaints coming from the tech community.
I commend it to you all:
http://www.thedailyshow.com/full-episodes/wed-april-28-2010-ken-blackwell
Stewart speaks to an audience which is likely to be Apple oriented. So, his arguments may have more influence on the public than any of us have been able to achieve.
The great thing about fiction is that it can alter not only (present) reality, but the future too.
"The Universe is made of stories, not of atoms" (Muriel Rukeyser, as quoted in Andreas Wagner's
'Paradoxical Life' (2009)).
The greatest added value of your essay is that it has produced 250 comments here, 500 in Slashdot and who knows how many more in other influential sites. All these comments represent a healthy down-to-earth (not atmospheric) think tank that does have the critical mass to alter the future.
In posterity, you will be remembered as Cirrus Minor, the mortal that warned humans of the perils of worshiping false gods bearing cheap gifts.
"despite the cost of the hardware exponentially dropping towards zero"
Asymptotically
/nitpick
@neil weston
'Telstra G3 network does actually cover 98% of the vast area of this Country [Australia]'
no it doesn't, not even remotely.
http://www.telstra.com.au/mobile/networks/coverage/index.html
Looking at the map, I'd say geographic coverage would not greatly exceed 15%. (and this is borne out by my experience travelling through central Australia)
Holy mother of god!!! Its the beginnings of Skynet! Soon there will be Jobbots, I mean robots ruling all....
Info released by Google says that Android users are unlikely (and often unable) to apply the latest updates to their phones.
Android phones are often stuck with the firmware version their phone
shipped with primarily due to fragmentation problems that require the
hardware maker, software platform vendor, and the mobile provider to
work together to create and deliver custom updates for each model.
On the other hand, Apple iPhone updates deploy rapidly.
Hewlett Packard recognizes the problem of synchronizing different hardware and software vendors and bought Palm to try to overcome those issues.
As long as Microsoft relies on vendors like HP to integrate and deploy their product for resale to end-users, Microsoft will be selling crap. Apple owns the channel and is using it to provide superior products and superior experience. The best experience Microsoft can offer is to cancel more of their vaporware.
This goes against everything that made the internet successful. The decentralized nature of network components allowed a type of robust resilience against any type of failure (corporate, physical or otherwise). The openness allowed for an anarchy that led to rapid innovation and global accessibility.
What we are talking about here is analogous to digital democracy versus a digital dictatorship, where steve jobs is the all seeing, all knowing decider.
"I don't understand why microsoft are doing so badly."
Y'know, it's hard to take any of this seriously when we have apple fanboys (not charlie or any of the comments here, thankfully) patting apple on the back because nokia has a measly, tiny 60% share of the mobile market, while the great, all-powerful apple has a MASSIVE 16% share.
So no. What on-the-fly redefinition of 'badly' is MattP using? M$ had a monopoly position in OS and office markets (it still bosses apple by a very wide margin) and used it to force itself on other sectors. That it is doing less well than before is IMHO down to regulators and suits addressing the monopoly position.
I'm not defending M$ in any way. Simply I can't put up with the delusional thinking of apple fans.
It now occurs to me that MattP was being sarcastic. If so, sorry MattP. The rest of my comment stands, tho.
"Will you trust the cloud with your data when companies have been known to lose data, have data stolen, and go out of business? Well ... would you store your money in a bank?"
When the screwup happens, the bank can replace my money with other money; it's nice and interchangeable like that. Data, not so much.
@Tony: You are misinformed. There is an App Store app on the device, from which you can install apps directly. There is an app-size limitation if connected via 3G, with no size limit over Wi-Fi. After the first "activation" connection to a computer, the device can be used on its own, forever, without ever touching a computer again. You can create, sign in to, and sign out of, iTunes Store accounts (also used with the App Store) directly on the device as well.
This applies to iPod Touch, iPhone, and iPad. In addition, all of the above goes for the iBookstore on the iPad as well.
@Tony: You are misinformed. There is an App Store app on the device, from which you can install apps directly. There is an app-size limitation if connected via 3G, with no size limit over Wi-Fi. After the first "activation" connection to a computer, the device can be used on its own, forever, without ever touching a computer again. You can create, sign in to, and sign out of, iTunes Store accounts (also used with the App Store) directly on the device as well.
This applies to iPod Touch, iPhone, and iPad. In addition, all of the above goes for the iBookstore on the iPad as well.
We shouldn't forget the reasons behind Job's 'control freak obsessiveness' attitude. I found this interesting Steve Jobs. Behind technological choices there's still a human story.
Further to my remarks above, Charlie badly needs to check out this eComm Europe presentation by Moray Rumney from Agilent. It's thick with chewy data.
@ 232
Reminds me.
You are talking about the "before", kilo, Mega, Tera-ages.
Right - HOW MUCH TOTAL computing power?
What is the computing information-density on this planet?
Now & in 10 years time?
Charlie got there first, in "Accelerando", and thought 1 MIPS/gram was close to the inflection point. I don't know how far we are down that road yet, or how close we are to uploading Lobsters / Cats / Us, but I think we need some answers based on real numbers, real soon now, before we can continue this discussion in a maningful manner.
As for the greedy fascist S Jobs (see also post #247) - ignore him: he is trying to control the computing universe, and as I said before, other people have tried this, in other technologies, and it has never worked before.
Furthermore his (corporate) behaviour would be illegal here.
good write and very nicely centered around the topic given. i enjoyed reading this. the idea of cloud data storage all the big web3.0 G,A & M(blers) are focusing will happen, soon.
I would recommend The Daily Show rant as well, for UK viewers found on 4OD at
http://www.channel4.com/programmes/the-daily-show-with-jon-stewart/4od#3057607
Shown here on the Thursday, the 29th of April. Only available for another 4 days.
Disclaimer: I only read half thread...
It is funny, that the topic is covered in single sentence/paragraph, and the article continues with irrelevant technology daydreaming.
In my opinion, the primary reason Apple is banning Flash from their new devices is that Adobe failed to follow their architecture. An operating system without sound architecture is doomed to fail. Examples in the Microsoft world, as many as one could care to see.
But, if you follow the Flash developments, you will note that the most recent Flash version (10.1rc2) does now support hardware acceleration on the Mac. Something, Adobe have been claiming is not possible. How come they discovered it is possible?
It has always been possible --- via the proper abstraction MacOS is offering.
If they care to follow the same route with the other Apple platform -- the iPhone OS (which is, basically stripped down Mac OS) -- there will be Flash on the iPhone/iPod/iPad as well.
Is Flash such a big deal? No. There are other methods to deliver multimedia. Developers have been just too lazy to look. The same way, as you describe Apple wanting to control the food chain, for years, was what Adobe was doing with Flash. Is this big deal? No.
In few years, Internet users might not even know what Flash is (was). Just as nobody today knows/cares what V.34(bis) is/was. And it was big thing in it's time.
50-100 Mbit in mobile networks? No way! There is simply not that much spectrum available. WiMAX is too much hype for nothing. How much can fit in 20 MHz? The current 802.11n speeds are 300 Mbps over the air and about 200 Mbps actual data throughput -- for a single client! That is the total capacity of the cell. And it is half-duplex. And needs 40 MHz with MIMO. Achievable speeds drops rapidly with increasing the number of connected clients.
In contrast, any fiber network can run 1Gbit, or 10Gbit speeds, full-duplex. Any fiber can be used with CWDM or DWDM to split it in tens of 1Gbit or 10Gbit links, etc.
The Internet has changed. Once, the Internet was having a high-speed backbone, with speeds falling towards the edges. This is by the way, the model the mobile networks follow.
Today, the Internet is fast 'local' networks, that might span cities, interconnected with slower 'pipes'. And the Cloud servers sit somewhere away. Will this appeal to anybody?
To the mobile user, sure. To the normal computer user, no way.
Muy bueno! excelente!
great article!! congratulations!!
I love how Job's AKA Mr Politician says that Flash cannot support multi-touch input when we wrote an open source library and put on google code almost a year before the iphone was released... (touchlib.com)
Shown later in a demo here: http://www.youtube.com/watch?v=JOckVTgWBr4
> I love how Job's AKA Mr Politician says that Flash
> cannot support multi-touch input when we wrote an open
> source library and put on google code almost a year
> before the iphone was released... (touchlib.com)
I think it's a given that Jobs is not a techie like
you. Whatever he says anything remotely technical
related, all the following phrase - "and widely
deployed as a mass market product"
If touchlib is so good, why doesn't Adobe incorporate
it ASAP? Maybe because your code is GPL'd?
Ah, now you sell the dilemma facing Adobe....
If I understand this right, then Dell's Databackup service is essentially attempting to move my data into some kind of cloud. But Dell is a hardware seller, so how does that fit?
Plus, I have a 1Tb flash drive which connects to my pc by usb 2.0. I can backup all my data from my computers onto it, and when i need to i just connect it to another computer. When full I can get another or buy one twice the size and transfer everything to it. why do I need to risk all my data to the cloud?
Cloud computing is already here in the face of Google.
Apple will have a hard time rivaling Google in this rapid developing market. Apple already have slow development of the Mac line. Large corporations are currently moving away from the Microsoft products, but they are going for Google Apps, not for Apple products. In five years, I suspect there will be devices sold with GoogleOS.
If all apps will be delivered as a service, how come you need to install apps at all on the iPhone? You have no clue what you are talking about. Apps are hybrid on device/saad at best and will stay that way because jobs has more control.
Steve jobs hates flash because he loses control. You can't control what goes into an app if you allow it to run flash.
Allow flash on the iPhone/iPad os and the genie is out of the bottle.
a thought provoking post Charles and the cloud is clearly going to shake things up. I'd expect Apple to play a big part in that but I'm not so sure the availability of 50+mbps data everywhere is going to be quite as everywhere as we may like - in the developed nations yes, but the revenue potential is huge from emerging markets yet I doubt we'll see 50+mbps in many of them for a while. The BRIC countries may not be part of the Apple plan of course.
just a thought...
i also meant to add it's worth a look at Scoble's latest post to see some of the other very big barriers to making this a reality anytime soon
http://scobleizer.com/2010/05/01/the-blacked-out-world-of-music/
Charlie,
Leo LaPorte says you've never hear of This Week in Tech TWIT. If that's true, you're missing one on the best, most entertaining and knowledgeable forums on the net He says he's invited you on several times and, you refused. Listen to or watch it several times and, I think you'll change your mind.
Or, you could take at least some of what he says at face value. Flash is sort of a junky product. If there is another motive it could be more immediate - Flash has become the primary means of streaming music and movies on the web. Maybe Apple would rather people go to an iStore and buy copies.
Corporate IT has already gone cloud. Internally speaking, at least. I work for a major US bank. Desk PC's are almost extinct; they use WYSE thin client replacements, so all data and software are contained on remote servers.
My views are similar to yours about 2015 with one exception. I see most homes running a WYSE-like thin client - all most people need - which requires little (or no) hardware expenses, no software maintenance, and month-to-month rental of data space.
Google could make this happen now, and quite easily.
Apple will lose this race because they are greedy and self-centered. The computing world doesn't, and never has, revolved around Apple. I agree with you about the Mac, but I see Apple circling the drain as a viable competitor in 2015.
Google will bury Apple.
Tom in Delaware
> Google will bury Apple.
Microsoft will bury Apple - 1997
IBM will bury Apple - 1981
Good comment. But I think you should mention that Google and Jobs will decide what information will be good for us, either in advertising or in everyday news. As the present development shows nobody will stop this.
Thought provoking piece.
Then the Internet goes 3D, and we all need a desktop again, maxed out with features and appropriate hardware accessories. And so once again the mobile smart phone/netbook/ipad is back to being a second-rate auxilary device used in transit.
Nice article, and clearly lots and lots of comments, but most of them miss the point. The idea of moving data to a central server and using a "lite" computer for access has been around for at least 10 years (ignoring the original data terminals linked to mainframes). Most of the comment take up one or other specific point, e.g. wireless fragility, virus protection, etc. None of this is fundamental to the issue, which is how to create a new high-value (read high-profit) business model to replace the commodity model now emerging for hardware devices. As an aside, no one has mentioned the lease or rent model for hardware which I feel could also have a place in the future.
Anyway this a call to drop comments about app. details and focus on the basic idea - cloud+lite devices, and a new business model!
Nokia are building a huge cloud too and are investing very heavily in software engineering disciplines behind it.
and what happens with linux :D
there probably gonna be some servers from where you could access linux os for free... or free software becomes forbidden by law....
Hint: don't compare the US market for mobiles to say the European market.
One of the showstoppers for any competition is that most US carriers are using contracts/services bound to a specific device. (I remember my first mobile, pre-GSM, when I was in highschool around 1991/1992 had no SIM slot either.) Without the possibility of using a different phone device (even worst case eating the price of the subsidy), there cannot be even theoretically a market.
Fact is that I've been using data on mobiles since at least 2005, I've been installing at least Java apps without asking the carrier for a very long time, ...
One of the major reasons everyone is looking to the cloud and "software as a service" is that it will completely eliminate software piracy. Or at least that is what Microsoft believes.
I read Jobs' letter, and frankly, it's bull. He says that Adobe has exclusivity to Flash player? Who cares, it's free! I have already played probably half of the games in the app store for free on newgrounds (amateur animator's website). Furthermore, what Jobs' doesn't say is that many flash movies are limited to the origin website (like newgrounds). I agree with Adobe: Jobs just wants his money-sucking app store.
Bruce:
Real reality check:
(And 24% behind you know who).
This will never happen in 5 years.
One reason: Porn.
As long as explisit websites use flash, machines that won't use flash will never be a true success.
Granted, everybody want one, but they will return to a device that can use flash because of this.
The technology of wireless broadband will never outclass cabled broadband because of nature. Sure, in cities it'll be ok, but in places with fjords, mountains and maybe even some obscure earth radiation, it will never work.
(Talking to myself - how pathetic is that?)
I forgot about the Lala streaming music service that also has a substantial cloud presence -- Apple bought them and is "shutting it down" at the end of this month. Expect to see their tech resurrected as a new wrinkle in itunes (or some suggest -- a web-based version of itunes)
Uh. The Grid IS a cloud. New name, same tech both still working.
I don't see this as an either/or situation. Just like radio receiver users eclipsed radio broadcasters but hasn't ended HAM radio, so too will Data devices (both mobile and desktop) users gradually surpass regular computer users. And that's cool. Data devices, whether tethered to large corporate clouds or small household clouds (already working on this at my house), will continue to multiply.
As for the ergonomic complaints above, c'mon, don't be deliberately dense. Apple already makes docks that can output video and already make displays with the computer attached to the back (iMac). So sure, in a year or two there will be 24" tablet screens for desktop use and they'll likely span across multiple screens. Or you'll be able to dock your iPhone and have a 24" monitor and wireless keyboard available. Thinking the current 3"/9" diagonal screen is the end of hardware development is just stupid.
Finally, will customers really avoid Apple's walled garden for the freedom of doing what they want on their systems? Dude! Most users, even my wife who works in tech support, does not need or want that kind of freedom. Users want their messaging, games, entertainment and basic photo manipulation. They want a bright shiny store with clean aisles, full of bright and shiny and that's it. Walmart knows this and Apple's figured it out as well.
Now, how difficult does Apple make it, for a determined user to gain control of the hardware? Not very. 'Course, Apple doesn't care about tech users. That's why they're now pushing 'Magic' and for most folks, that's going to be good enough. Just being able to load software, transfer images and sounds and not being nickled and dimed by the phone company is a step up. Currently, iPhones help folks feel more in control then they have been.
This is an excellent entry, one of the most provocative I have read in a while. I wonder however, if, while gazing into the future, you may have used too narrow a lens.
Pushing different aspects of computing out into the cloud affects far more than just personal computing. In fact, I think we could make a reasonable argument that personal computing will be the least impacted area of computing as the cloud becomes ubiquitous. Why? Several reasons, the most significant of which is the fact that cheaper/less-powerful computers are in most cases not going to be necessary. Moore's Law keeps working, and computers get more powerful and cheaper at a predictable rate. True, like one of Zeno's paradoxes , even though we keep halving the price, computing never quite gets to be free, but the distinction likely approaches irrelevancy as the prices decline. And vendors still remain profitable with narrowing margins if they are able to streamline their business processes. The best example of that is probably the disk drive business, which has existed on razor-thin margins for years. Economic Darwinism, perhaps.
For me, a much more interesting aspect of what is going to happen in the cloud has to do with the impending shift from using computer products to using computer services. Once data and services get pushed out into the virtualized space of the cloud, the relationship between data and physical location starts to become irrelevant. It isn't irrelevant today, mostly due to network latency, but as new technologies provide ways around those latencies (not ways around the laws of physics mind you, but ways of compensating for them) most users won't care whether the data they are accessing is next door or on the other side of the planet. That being said, personal computing is less likely to find immediate advantage in this. It's much more likely that web-based services will be most immediately useful to environments that benefit greatly from collaborative work or that use the web to increase the availability/continuity of things that are important to them. These are of course much more likely environments for commercial computing than they are for personal.
Anything in the cloud is, by definition, going to be accessed virtually by people/processes that for the most part don't really care where the data is or what processes intermediate. In light of that, I wonder if Apple playing nicely with Flash is even important? Buyers, both for personal and business computing, will make purchasing decisions based on their own needs. Those that love Apple will live without Flash. Those that love flash will live without Apple. And those like me, who like Apple and find utility in Flash, will pick and choose according to our needs.
HP is making some very nice products.
One thing apple does extremely well: a complete hardware and software line.
IF you could live with just Dell or HP or just Sony, you could do better... but still, apple would outshine.
the flash thing? that is very strange and seems like bad business...
on the other hand, see what apple did with os 4 and multitasking... my guess is that flash does NOT run well, foreground or background vis a vis power.
BUT... I sure would like some DATA to prove that. Gee.
Moore's law, as typically used today, is expanded to both clock rate and chip fabrication / component density increases.
Clock rate increases have petered out, they're approaching an asymptotic limit in the roughly 5 GHz range at the moment, with slight increases as transistor sizes shrink.
Component density increases are on track for several more generations and then they hit another "...and then a miracle occurs" in the long term semiconductor fab roadmap.
Memristors are an example of a truly 3-D chip component, but they're special. We've already stacked many metal layers on top of each other, handling ever more complex on chip interconnection requirements, and reducing the limits on chip area usage down to approaching actual silicon plus spacing buffers used in logic elements. But most components can't grow up to the degree memristors can.
Memristors are a special case; they're not even really "silicon chip fab", they're wires with a bit of titanium oxide in between them, and a little doping. You can easily go from one layer (X lines, memristor layer, Y lines) to two (X, mem, Y, mem, X2) and more (X,m,Y,m,X,m,Y,m,X repeat ad infinitum).
But this is because they're not using silicon, particularly not using single crystal silicon for a logic element in the memristor itself.
Silicon is still, for the moment, needed for transistors. And a real limiting factor in memristors is that at some point on the input you have to select which word line you have energized, and take the results of the crossbar memristor off into transistor logic land.
Various other approaches than silicon may take us further, including doped nanotubes and photonics and spintronics. But for the moment, with devices we understand now and can fab now, memristors included, we aren't 3D.
s."Well put. The limit to PC growth is lack of user skill, and we are up against it now. Poorly defended PC software, both OS and applications, requires attention and knowledge to keep it running. Many current users don't apply enough of either, through choice or ignorance."
I can't buy that. There has *always* been a lack of user skill. For instance, the security community has been trying to teach people to not open random email attachments since roughly forever. That has unsuccessful, over a period in which the installed base grew by several orders of magnitude. The same mistakes are repeated time and again, but not by users alone. An example of this is the iPhone running everything as root--a typical vendor FUBAR, and shades of single-user Windows from the 1990s.
I don't see this situation as getting anything but worse, for as far out as I can see. But neither do I see it having much affect on a headlong rush into the future. I have no idea why that is, but not understanding it doesn't provide me a means of denying it.
People, in the main, will trust their data to the cloud, and continue to trust very shoddy software. Vendors, in the main, will abuse that trust, and provide said shoddy software.
It is what it is. I mostly try not to think about that rather dismal vision, and concentrate on getting my (microscopically small) piece of it right. At some point there will have to be a societal solution, but that will arrive piecemeal, and create a rather large mess. For instance, I live in a country (US) with benighted data privacy laws--nothing like what the EU enjoys. I'd love for that to change, but even opening the discussion will get our politicians into a Porn/Terrorist/ThinkOfTheChildren lather. We could end up worse off than we are now.
Another way that could go is for legislation mandating financial responsibility for producing software that allows security breaches, as advocated by Bruce Schneier. Where would that leave software with GPL, BSD, etc., licenses, in the Land of the Free, and the Home of the Lobbyists?
Has Leo LaPorte read and posted here?
He says that Adobe has exclusivity to Flash player? Who cares, it's free!
As someone who's been bitten by Acrobat (forced update, encrypted files no longer readable until out-of-print vendor updates them) I want something that isn't exclusive to store my data, thank you.
On: "The target audience for this device is a set of people that don't even know what Flash is."
I tend to disagree on this one, even the most computer illiterate users I've met, know what Flash is, even if they don't see how widespread its use has become.
I attended the Adobe CS5 roadshow recently and the Flash session was packed, but 90% of the developers and designers had an iPhone or Macbook.
While these are mainly web developers and not Objective C coders, their clients want the product to be seen on all hardware types from mobile to desktop, they just aren't going to (or have the skill to) build an iPhone app that replicates the website, cut-and-paste some PHP code they can handle, understand that PHP code and they are in deep-water.
The end-user might not care if it is an iPhone app or web site, (except the inconvenience of having to close down the browser and startup the app) but the users do want the content. If it is only a few sites that don't build iPhone apps, then the user will blame those sites that don't, however if it is the majority of sites that don't build an app, then the user will blame Apple for excluding them from the general Internet.
I'm a JEE coder, so my front-ends aren't pretty, but the idea of NOT being cross-platform and open-standards in the JEE World is a curse of software maintenance pain that nobody in their right mind would pursue, our product choices are based on avoiding vendor lock-in, the iPhone/iPad/iTouch platform is one I recommend our business units to avoid. I wonder how many other Corporates see it the same way?
"Uh. The Grid IS a cloud. New name, same tech both still working."
For sufficient nebulosity in the definition of 'cloud', and assuming that 'The' Grid exists, when there are actually many grids, implemented with varying amounts of software commonality, and not being able to tell which one you thought was the new name.
'Cloud' is the one term which can compete with 'Web 2.0' in ill-defined hand-wavy marketing bullshit. Grids, OTOH, are generally very well defined. If I want to participate, for instance, in the LHC grid, at any tier, there are requirements to be met. I need to have a certain amount of bandwidth, storage, and CPU cycles available, I have authentication and authorization requirements which have to be met, etc.
In either case, QoS is a large factor. But grids are more about research, where budgets are tight, whereas clouds are about marketing. There are various implications here. For instance, programmers in the grid community are probably writing in different languages (FORTRAN is much more common) and code is probably more modular. Grids, still being closely tied to the science/engineering communities, place value on code portability. Again, budgets are tight.
Clouds, OTOH, are very much 'flavor of the month' commercial approaches, where vendor lock-in is important. If you write an app for the Google cloud, it's likely to be a major porting job to switch to, say, Amazon Elastic Compute Cloud (EC2). Some libs exist to help you, on some platforms, but by their nature, they can only deal with the common denominators, and common denominators are intentionally minimized by the cloud vendors, for obvious financial reasons. Vendor support for those libs obviously ain't gonna happen, save as a one-way (to our 'solution') proposition.
grid != cloud, for nearly all marketing definitions of 'cloud'.
IT vendor marketing has been interesting to watch (corporate clouds versus corporate grids) as you can get some insight into CxO cluefulness by watching where they come down on this issue.
And I think many members of the general public are sick to the back teeth of being made to leap through flaming hoops to keep their PCs running, and will gladly pay the price for anything that offers a zero-admin experience.
Microsoft's sales figures and the lack of take up of Linux contradict you.
Because, as you point out in your article, (words to the effect of) people will have to pay more for this groovy new ultra-safe experience. So they'll be charged for the safety, as an extra, when it ought to be there anyway.
The idea that one day I will have to pay a subscription so that I can "access" my own data, is so messed up I'm at a loss to express myself.
Apple need to wake up and forget this "everyone locked into an eternal subscription model" dream of theirs - people are sick to death of hearing about soaring profits from a company that overcharges for everything.
The problem with this "everything will be in the cloud" idea is that it means I don't have control, or meaningful ownership, of my data. And that, I'm afraid, is unacceptable.
Blah Blah Blah... & Bill Gates is still the richest man in the world.
Doesn't anybody get it? Steve knows that every Apple fanboy, status driven consumer, and old fart doesn't know anything about computers, so hes totally cashing in on it!
If you think that all your problems on a computer, like viruses, malware, and bugs can be stopped by a particular OS, then, sorry genius, you are a dumbo too.
Dont get me wrong, Apple does sell great products, and even has opened the world of technology and the internet to a lot of people. But they are a business, and a business is made for one thing, MONEY!
For everyone out there who thinks that Apple is here to save them, or believe if I have Apple then that equals no more problems, you might want to think about that everytime you see that rainbow wheel spin for an eternity
Computer problems can only come from one thing, and thats user error. The computer only does what you tell or click it to do, and some of those clicks can be harmful.
I like to think of the compuer and the internet like driving on the streets. There are some fun places to go to, and not so much fun places to go to. If you don't know how to get to where you are going or don't know how to drive, then get the f*#! off the road you moron!
dave h: Microsoft's sales figures and the lack of take up of Linux contradict you.
Microsoft does not sell operating systems to the general public. (Have you ever tried installing Windows from scratch? Painful, just painful.)
Microsoft managed in the late 80s to secure a cosy bundling deal with the hardware vendors, none of whom felt like devoting the software resources to writing their own competing OS from scratch. Once established, they structured their wholesale pricing to lock out competitors -- big discounts off a spuriously high list price, which would go away if the vendors didn't ship an MS OS on every box sold. At the same time, network externalities kick in; if all PCs come with Windows, then there's no point writing software for any other platform because everybody's got windows.
The consumer, of course, had zero say in this. The effect of the Windows monopoly has been to keep them unaware that things could be different in any way. Don't get me started on "19xx is going to be the year of UNIX on the desktop" (followed by 20xx and Linux). The Mac environment between 1990 and 2000 was succumbing to the same bloat and obscurantism as Windows, and was (sensibly) taken out back and shot, to be replaced by a descendant of NeXTStep with a MacOS compatability bag on the side; a decent workstation OS, but most of its complexity is deliberately hidden from the user (who'd flee screaming if they had to grapple daily with the tools hidden in /Applications/Utilities, rather than going through a patchwork of Assistants).
Speaking as a sometime IT professional I agree with you about the undesirability of data lock-in in the cloud. On the other hand, so do a lot of other people (what is Open Cloud?). And despite my contrarian position, I don't think Apple will succeed in locking their customers in, and I believe that like the Mac before it, the iPad will become a more open platform over the next few years ... but they'll provide a valuable service for those who are too clueless to secure their assets for themselves.
googly eyes: If you think that all your problems on a computer, like viruses, malware, and bugs can be stopped by a particular OS, then, sorry genius, you are a dumbo too.
Actually, I have a theory about where we went wrong. First, in 1949 we ended up with the Von Neumann architecture rather than the Harvard Architecture in (almost) all of our computers. Using common addressable storage for code and data was the first big mistake. Next, around 1970-72, Ken Thompson decided to save bytes when designing the C programming language by using null-terminated strings, rather than a type-safe data structure storing length metainformation (as in, say, Modula-2). Thus permitting all sorts of stack-smashing and buffer-overrun attacks.
Alas, we're locked in these horribly insecure design decisions at such a fundamental, deep level in our computing architectures that we can't back out again without reinventing everything, from microprocessor instruction sets to low-level programming languages and even software engineering techniques.
But we know how to fix our security headaches. We just lack the will to ditch everything and start again.
Computer problems can only come from one thing, and thats user error. The computer only does what you tell or click it to do, and some of those clicks can be harmful.
Written by a true non-programmer. (Anyone else remember Intel's F00F bug?)
This is the future of computing:
Free software, free OS, no AV-software, mostly mobile connection.
This future is really not made for Apple and especially Microsoft. Google, Linux and open source software are the winners.
Sorry, can't see that. I'm not in their marketing area.
try this one http://tv.gawker.com/5526868/jon-stewart-slams-apple-over-handling-of-gizmodo-case
Matias: This is the future of computing: Free software, free OS, no AV-software, mostly mobile connection.
Would you care to justify your assertions?
(Bear in mind that someone has to make the hardware. Hardware vendors are usually in it for a profit. If free software stacks are available, some will seek to pare their margins to the bone and sell on low price to the consumer; others will look for ways to differentiate themselves, add value, and thus gain the ability to charge more. What makes you think that one side of this see-saw will automatically win?)
For the most part and for most people, I think the data security fears connected with cloud computing will be a non-issue (that is, not in fact a meaningful barrier - whether it turns out to be a meaningful risk is something only time will tell). Most of us use cloud computing already at least some extent anyway. I've been known to doomsay about the security of cloud servers myself - and yet I use Gmail perfectly happily for work purposes, and as such have dozens or hundreds of confidential work files sitting on Google's servers. Immediate convenience and efficiency win out over theoretical danger every time.
There will always be people who take pains to protect their data privacy, who fear government surveillance, etc etc, and suich people will always find their own ways to live outside the mainstream ecosystem. But most people won't much care about this in reality - in part because they'll have implicit faith in the system, in part because they won't be doing anything (very) wrong, and in part because they'll know that no one is likely to be very interested in them anyway. With some simple reassurances in place (basic encryption, say, as with teh Goog's default use of https) most people will be perfectly happy to switch to cloud computing and data storage.
A lot of commentators on tech blogs and teh like seem gobsmacked by the way Apple, Google et al seem to be lunging for this clouded future that - to them - seems utter anathema. But in the area of mass-market tech, tech geeks are not the real market. Charlie's vision seems utterly plausible to me because it's not just a corporate wet dream but a consumer dream... Charlie has described a perfect computing environment for ordinary people, even if it's a nightmare to Cory Doctorow.
outeast: you may find it amusing to know that Cory and I are sometime collaborators (on fiction, at least).
Well, I know you've got that 'Atlas Rebound' in the works* but I hadn't known of your earlier collaborations. I'll have to hunt them down this evening.
(Oddly, I had Cory and you on my mind today before I even read this post, having finished Palimpsest this morning and started on Cory's Little Brother. It has been interesting to segue from one to the other, especially given your choice of ubiquitous surveillance as the Stasis' defining characteristic of civilization.)
*Yes, I know.
Jobs is smart enough to know that there are some serious things coming down the pipes.
But Apple has successfully used marketing to sell more expensive hardware the entire life of the company. Their niche has been steady and successful for them.
Commodity hardware and software have not broken Mac enthusiasts away from their platform of choice. And seriously, I don't see that changing.
The threat to Apple is that people realize they've been paying too much. If they haven't by now, will they ever?
On the security issue, I also am going to be one of those Luddites who keeps their personal data on a local machine while everyone else uses the cloud.
"For the most part and for most people, I think the data security fears connected with cloud computing will be a non-issue."
Well, not for me.
But you have hit on what bugs me about a lot of these comments. While I'm a professional paranoid (though I joined the profession through inclination, not to mention perhaps some childhood trauma), I agree that most people--and this includes a lot of quite very knowledgeable IT folks--have a lower threshold of paranoia than I do, and will actually use Google for their e-mail, and so on.
Moreover, who am I to object to that? They put their security/convenience trade-off elsewhere from where I do. That's not wrong; that's just an economic decision. And one I participate in: paranoid or not, I use a laptop that has not always been continuously in my physical possession or secure storage.
That's where the other side of the paranoia extends: my encrypted backups are not much good to me if I don't have access to the keys to decrypt them.
*Sigh*.
Then too, what if the iPad -- and forthcoming GoogleSlate, UbuntuTablet -- thing puts a disruptive complexion on the whole general consumer economy; making window-shopping mall-rats out of us all? I, for one, would not weep and wail. I can see a critical mass making total creative use of the situation -- even to the point of it being good for the planet. But another critical mass, of people-disrupted, will not be happy, and I think I see them already having intuitions of having their outlook undermined: digital realities coming home to roost on a much bigger nest than Wikipedia and Craigslist. When your Nexus 2 and my iPad become publishers of the main things we consume on them, will uber blogness start to get on the really long end of the lever? Will my barista be that much happier to see me, or that much more distressed over further novel limits on all the career upgrade paths?
Bill Gates is the second richest man in the world.
I would sum up this long thread of comments as "Wow, it's like talking to my elderly parents about technology". They're quite literate and argue well but fail to grasp the fundamentals of what will happen even in the near future, stuck too much on the recent past and learning through fashionable tech blogs rather than through experience (Ok, i'm not including Mr. Stross on this generalization!)
Here i will try to fix some popular misconceptions:
- Steve Jobs is neither a God nor Devil. He is merely a person.
- If you have complaints about what's out there, make something better. Don't whine about how a company is being too closed or making stuff too expensive for you. Beat them. If you can't beat them, then quit whining.
- The walled garden model of AOL has returned in a huge way with Facebook. Don't underestimate them; they are Google's biggest direct rival.
- Macs have taken over the developer market and in colleges. Go to a modern tech conference or go to a top college; count the number of Apple logos you see.
People are only now beginning to realize that computers are not a commodity and it is worth more to pay more (People who whine about price still live in their geeky high-school years when they had no money).
- Flash interfaces are garbage. Steve Jobs hates garbage interfaces.
- "Cloud" is irrelevant. We already are living in the cloud. A huge majority of computing time these days is spent using "Cloud" resources, i.e. the INTERNET.
- Datacenters are irrelevant. Well, not really, but the amount that people write about them is ridiculous. Datacenters are merely industrial factories. They're not all that special. Sure, Google or Apple might be able to pay 1/2 the price per rack than I can due to their scale, but it's really not that big a cost in the whole scheme of things. Civilians simply obsess about it because it's something they can grasp, while something like "paying for skilled programmers" goes whoosh over their heads.
I can't legitimately run OS X on commodity hardware and I don't find Windows or Linux meet my needs so I don't see how I've 'been paying too much' when there isn't a satisfactory alternative.
And I like the hardware's industrial design better than anything I could cobble together for a similar price as well.
The threat to Apple is that people realize they've been paying too much. If they haven't by now, will they ever?
I've got two Macs, purchased because they did what I wanted them to do, at a price I was willing to pay. I could have saved money buy buying a commodity PC and installing Linux and open source software, except that the work-arounds I'd have had to do to actually use them for what I want would have used enough hours to buy another Mac, at minimum-wage rates.
If you like futzing around, open source is probably the way to go. If you just want to use the computer to do something so you can go back to your life, which is mostly offline, then a packaged solution may be well worth the money. It is for me, anyway.
I'm quite curious about this, being one of those developers who doesn't, and won't, use a Mac. What does it do that you needed that something like Ubuntu didn't? Speak to me as someone who just upgraded from fvwm1 to fvwm2 last year, and that merely to use Gnome (I'm not saying that was a good idea), albeit not Metacity or Nautilus.
As for Andrew's points: is there a God, then? Woz? Dan Bricklin? Alan Kay? Knuth? Turing? Or is there just not one?
This is a great discussion, but what about the developing world. The dynamics and the economics of countries like India will not allow a cloud sort of model on wireless for a LONG time. India is still struggling with access... leave questions about cloud based access.
And given that there are a billion people in India, another billion in China, and another billion in Africa - high-end devices and the associated ecosystem that you're talking about will either not happen for another 7-8 years or you will see a whole different avatar.
A perspective to keep while addressing such paranoia: USA+EUROPE > THE WORLD :)
I have seen a lot of posts by people who confuse cloud computing with touch-typing articles on a portable device. Nothing could be further from the truth.
What we will see is specialises devices, all connected to the cloud in a mix of local & remote storage. Check e-mail : portable device, write a novel? Desktop bound device with nice keyboard, multiple screens.
And the demise of Microsoft? Microsoft has found itself a very nice specialised megaton of an office niche (for now) : remote desktop servers.
There is lots of IT companies now providing this service -- they just hire rack space in a data centrum close to a high speed junction of the internet.
Your small organisation does not want an IT person?
Just remote desktop to an organization that manages, backups & and provides all your critical apps and data (Office, Outlook whatever) on a Windows 2008 server.
And that Windows 2008 server is probably just a VMWare image that can be managed & moved between actual hardware at will.
For the end user the experience is the same -- same word, same excel. They just share part of a server.
Not an option if you are bandwidth & latency starved ; but in the connected world a good example of how the cloud is making inroads into companies as well.
@Aashish Ramdas
To quote someone smarter than me: "The future is already here - it is just unevenly distributed". (William Gibson)
Different parts of the world move at different speeds. By looking clearly you can see possible futures.
Really enjoyed a series about the Indian railways; and how things keep moving when my Western European commuter train would have long given up.
@ 304 - 307
I may be OT (again?) but, erm...
Do the conditions and problems specified in the posts above still operate if we are in economics 2.0 or higher versions?
And how close are we (that is the developed world) to that state yet?
Or even its accelerating preconditions?
Nice article. It just got quoted on Blogging Stocks.
I agree with most of this, but the real reason Steve hates Flash is the reason that I hate Flash. In fact, first, I hate almost everything about Adobe as a business. I'm reasonably satisfied with Acrobat Pro 9 as an application, but Adobe has this gawdawful patch installer that has made my life miserable for years. I've worked hard to remove all Adobe products from our workflow and Acrobat is the last one remaining. InDesign is very nice also, but I avoid it as much as possible. I have no artistic skill so Photoshop and Illustrator are not on my radar. Dealing with Adobe is not worth the pain.
Now Flash. Flash sucks. It encourages every dim-witted web designer and marketing weasel to flood my browser with irrelevant moving swooshy logos accompanied by 1980s porn music when I trip over one of their web sites. I make it a point to not do business with any organization whose website has Flash on their home page. Flash is buggy. Whenever browsers fail, Flash is usually involved. I don't want Flash on my phone, e-reader or computer.
Requiring iPhone/iPad developers to use the Cocoa Touch SDK is a good and sensible position for Apple. Most bad software is cobbled together by untrained tinkerers who understand neither the data structure, nor the purpose of the applications they code. The Apple Human Interface Guidelines and SDK frameworks eliminate entire classes of failure. Developers who want to design for the iPhone/iPad or even Mac, will do a better job, faster if they use the proper tools. These arguments for off-beat or compromise cross-platform IDEs remind me of the authors who query our agency thusly: "Pleese read my fiction book, as it is a powerful work of imagination that will appeal to readers of Hulk Finn and The Color Purple." It helps to know the language of the target audience.
"Charlie, how do you see this shift affecting corporate IT? The buzz has been there before about thin clients ... but it's never really come to pass - most companies still have hundreds or thousands of Windows boxen ..."
"What does the tablet n' bandwidth revolution mean for corporations with 5-10 year adoption cycles? Especially when the number one activity carried out on yer average business computer is a heck of a lot of typing?"
I'm not Charlie but I've seen this play out before. I started back when people used punched cards for data and now do a lot of my work on an iPhone. I was in the middle of the insurance industry in the 80s and watched this all happen then. Just with different nouns.
Back then corp IT was totally about 327x terminals accessing mainframes. Then all the data was massaged and stored in batch runs at night. Insurance companies in the US like Travelers, Aetna, The Hartdord, etc... had 50,000 of these terminals that came up every work day. Their programming was done in CICS, IMS, and other IBM "lock in" technologies and ONLY worked on 327x displays or emulators of the same. (I never understood the value of tying your data base to a particular terminal at the time until I came to realize later that it was a marketing tie-in, NOT a technical benefit.)
Mini computers were around, especially Wang WP systems. PCs were starting to show up but the IT staff tended to mock both as they weren't "real" computers. Just look, people had been claiming the end of the mainframe since the Apple II, VIC-20, etc... They would just never be good enough.
Over time purchasing cycles we as much based on the quality of the 327x emulation as anything else. And IBM was working very hard on "improving" 327x at the time. Basically making it more complicated by the month to thwart the emulations. And to be honest it worked for a while. Around 85 or 86 it as amusing to watch a PC-XT come into a company for use with spread sheets and such then the corp it guys would standarize it and with an SNA card and 327x emulation software from IBM and there only be 40K or so left for the user to run Lotus in. This on a PC with 640K of RAM. The point, IBM and the IT guys didn't care if it every ran Lotus 123. But to be allowed into the corp on the network it had to have 327x. So a lot of PCs wound up with no network connection and there was a LOT of re-typing of data. Pissing off users to maintian corp standards is NOT a new thing.
Eventually IBM lost the desktop and most folks today don't even know what a 327x is. Mainframes are still a very big market. But it's a data center back end only. It' mostly irrelevant to the desktop. At least from the viewpoint of the user touching a computer on their desk.
Now Windows is in the same position. They own the desktop. They keep doing making improvement to lock folks in. ActiveX, .Net, etc... Many decisions on bringing in a non-Windows based PC are dictated by the ability of the system to run a virtualized version of Windows, but the rules for doing this keep changing.
But over time the future or something similar to it as Charlie described it will come to pass. The question is more one of how long rather than if.
Now that I've read all the way through this I'm struck by a few things.
Most folks here don't seem to know much history. The transition Charlie is talking about has happened before. Tapes to disks, cards to 327x, 327x to minicomputers, 327x & minicomputers to PCs, and now PCs to mobile devices. There's nothing new here. And each transition seems to have as a requirement to repeat all the mistakes of the previous ones.
As to vendor lock in, research IBM mainframes about them putting core OS functions into microcode then patenting the instruction implementations. You COULD run an IBM OS on other computers but it would run much less optimally. Or the free software as long as you ran it on an IBM system. Look at MS and the DR DOS fight with Win 3.0/3.1. And there's more.
Plus a lot of folks here don't seem to even understand current facts or the obvious. Just because a tablet is flat it doesn't have to be used that way on a desk. iPad stands for use on a desk seem to double in number every week. And you just plug in a keyboard when you use it this way.
As to "you can't do this with radio" there are lots of options that break the current way we do things. Fiber to the block with smaller WiFi cells is an option. Look at Sirrus/XM radio and what they had to do to make satellite radio work in large cities. Is this what will happen? Who knows but something WILL happen. And yes rural US is an issue. And not we're not all going to move to the big cities. But areas like eastern KY (I grew up in western KY and it's not much better) and eastern NC where the population density is below any practical CURRENT way to give them broadband are problems. But you can move to Raleigh, Greenvile, Wilmington, etc... if you need better. No need to move to NYC. Something similar will happen in India and Africa and other such places. Lots of Africa has cell towers that run off solar with diesel backup because the demand is there. Something will happen.
The AVERAGE consumer buys a windows machine now because it's familiar. But that is changing. People used to buy bias ply tires in the US instead of radials for a long time then over about 5 years bias ply vanished. It was the same issue. Familiarity.
There's more but most of the comments here seem to be by people who have a limited experience of the history of computing and a total lack of the ability to put themselves in the place of a typical consumer buying a product. As others have said, just reading this blog makes us not typical consumers.
Great blog article and great discussion points. As a local Silicon Valleyer, born, raised, and living and working next door to all the above I do see this dramatic undertow current of technology shifting focus happening around here. Palm, literally 200 yards from my front door, is now HP's latest gamble to stay "in the game." We'll see... HP, by the way, next door to where I work, has had a buzz about the HQ campus since the iPad launch...and fellow Microsofters who work at the MTC, just a stones throw from the Googleplex at Shoreline Ave., is scrambling to get some game aside from their Enterprise customers.
Apple's got the whole world in SV scrambling because they are actually cutting the market and making it happen, whereas the others have just been talking it over their afternoon Starbucks or Peet's Coffee runs. Heck, I'm using my iPhone to buy my Starbucks latte with their iApp, which incidentally, can now be used in EVERY Target retail store with a Starbucks...yep they have infused Apple seeds into the coffee beans here!! Honestly...Apple's reasons for the uber secrecy is no surprise. They grabbed and caught the right set of waves just as others thought it wouldn't break until 2012 or so. I mean, would you necessarily want your iPhone 4G being reversed engineered by some Android guy who had beers with you over a Shark's hockey game night out and see them come next Monday with YOUR design-to-market???!!! Think not! The valley is too small and people know people around here...everyone! Heck, even Larry Ellison knew about the cloud concept back in 1985... a year after Macintosh, no secret, but still he's not sharing his strategic Captain-of-Industry vision just so IBM can undercut him after some keynote at Oracle World...nope, not a chance. With Ironman 2 releasing here this Friday, May 7th, we see Oracle boasting that it is the KING of the CLOUD. Go figure...especially now with their Sun rising computer armor.
With local broadband now being piloted in the greater SF Bay Area at speeds up-to 50MB/s it is no doubt there will be more touting of putting your personal data on a easy to access, fast, reliable, and highly available cloud...but I still think we'll have broadband networks into our homes. WiMax and 4G offerings will slowly build speed as local communities buy-into broadband as we have here in Mountain View, CA with Google Wi-Fi for the whole city.
Steve Job's clearly understands what Intel knew years ago, especially with Moore's Law. 10x every 6 years...heck being a former Synopsys guy who listened to tape-out engineers talk about this day in/day out regarding 40 & 30nm chips...it is only relative in time that an iPhone dock with a 30" touch-screen, maybe even your home HDTV-3D setup will be that screen...it's all here... just in crude pupa form. Once the industry realizes what Apple's about to launch with the Lala.com streaming entertainment technology for iTunes...the very idea of buying a 4TB storage unit becomes irrelevant. Seagate, WD, and HDS are going to refocus on small, portable, encryptable solid state storage for consumers, and larger, faster high-capacity systems for the cloud. We're already seeing that generation of storage arrays hit our data center designs...it's inevitable.
So, speculate and postulate as you will... but be prepared... the future is happening now and has left the building as far as local Silicon Valley is concerned. It is only now a race to market for a competitive edge for those who didn't innovate the current consumer inflow that Apple has harnessed with subtle genius. Definitely have to say, sitting on my iPad 3G typing this has been fun and has me more involved in my computing than I have been in years...Flash is dead...or at least on life support. :)
So as we say here in SV... change is good, change is change, change is inevitable... those who resist, well, best of luck to ya and hope you enjoy driving your petroleum-based transportation! ;) (yeah, we've got Tesla here, too!) :)
-M@
I've come to this a bit late and right down the bottom of the comments but I just wanted to say that this piece is the single most chilling thing I've ever read by Charlie.
Mathew: With local broadband now being piloted in the greater SF Bay Area at speeds up-to 50MB/s it is no doubt there will be more touting of putting your personal data on a easy to access, fast, reliable, and highly available cloud...
Is that MB or Mb per sec?
If the latter, we've had it around here for years.
If it's the former, well, it suggests SF is finally playing catch-up with the state of the art.
Last month, visiting friends in Tokyo, they noted that in their (new) apartment block, the choice is between 100BaseT and Gigabit Ethernet to the fat pipe in the basement ...
The success or failure of new technologies depends on the ability of existing infrastructure to support them. I've seen a lot of drive-bys in the comments posted here by people who were clearly mistaking the conditions in their own parochial back yard for the Natural Order if the Universe. In contrast, Apple appear to be cosmopolitan enough to realize that the USA's antiquated utilities aren't the be-all and end-all of the game if you're playing on a global scale, and to be designing a product line for the next decade rather than the next earnings quarter.
I can run Xcode and Interface Builder and hence create apps for the iPhone, iPod Touch and iPad.
@174:
Corporate IT isn't going into the cloud at a great rate just now. They've spent trillions of dollars over the last 20 years or so building up enterprise infrastructure and management fiefdoms and they're not going to let go of all that really easily.
True, they are not going to let go: but large corporates also disappear all the time as they fail; they are on a lifecycle like everyone else.
not in europe
really good cell phone coverage
Here in the UK, things are somewhat better. In other countries, such as Japan (where I happened to be last week) "home broadband" means a choice of 100mbps or gigabit ethernet, and they don't even have GSM phones -- they leapfrogged that generation completely.
Really? In Japan they leapfrogfed GSM?
Ehm, no. The underlying network, their 2G network, is called PHS. It's a different sort of cel network then GSM is and is incompatible.
The only reason you can now finally use a phone in Japan is because your phone has WCDMA support.
WCDMA/UMTS also runs on 2G PHS systems which means that you are running your voice calls over UMTS now.
They never used GSM, true, but PHS was and still is, crap as well with much lower reach then GSM has.
Charlie,
In case you haven't heard it, the Angry Mac Bastards discussed this essay this week (starting about 20min in).
http://angrymacbastards.blogspot.com/2010/04/episode-58-is-up.html
Sorry, wrong episode: http://angrymacbastards.blogspot.com/2010/05/episode-59-is-up.html
For some commentary by the Apple engineer who wrote launchd (the OSX program that replaces *init* found in other *nix) see the first 3 minutes of his talk at Google in 2007. The talk is long ago, but given that data point and keeping Charlie's forecast (above) in mind, I think Charlie is on the right track. (Hint: It's not all about Flash. There is a bigger goal.)
http://video.google.com/videoplay?docid=1781045834610400422#
Hi temptingfate.
To compare with Monsanto it would need to accomplish tree facts:
-Each song would be purchased only in iTunesStore.
-You could listen it only a limited reproductions number, and then vanished.:)
-In the Apple hardware you only could have and listen songs purchased in iTunesStore.
None of this is true, then it is not the same by far.
And for the rest of the subject, I thing that the article magnified the things. These are business, hard business perhaps, but not a war.
Only said that we, the people, must be aware against monopolistic strategies and think more in Open Source ecosistem, as simple as life do it.
Uf! what a long name, hello!
I have a Mac, I'm running a Mac, I have going into web you posted with a Mac, and they pop up and say me that have and update about an application that I don't have. I can see that things haven't changed yet and more than twenty years my post-traumatic stress after have left microsoft is a ghost that not crash like windows. ¿ And say you that is going to the Cloud? aaaaarg! :o) Something similar to V
!Es broma Manuel!
Yes, an Apple Server Center is coming inside de Black Mountains, and the legacy of Jobs will go to the open source boys'. You can bet for it. The last Jobs' joke.
20 + year IT pro would never allow anything personal on the cloud. New people and stupid folks wont know enough to care. So this is going to go cloud no matter what. He who has the most money will win. Right now that is still MS. and yes you can arrive late and still win.
It's great to see what the megalomaniacs have planned for us little guys. "Computing in the Cloud" sounds just lovely until that puffy little cloud becomes a thunderhead and ends up holding everyone's data hostage.
All this sounds so wonderful, but as that old saying goes, "who watches the watchmen?"
Perhaps I'm being a bit paranoid, but having all one's data on someone else's servers, even if it's encrypted and you have their "cross-my-heart-hope-to-die" promise that it will be safe doesn't mean a tinker's damn in the real world.
interesting but I think ultimately wrong. The entire premise is a bit off base. PC's aren't about where the data is stored. They are about DOING STUFF to data.
For example - I might use my PC to watch flash video - its doing 'stuff' to the data coming over network. I use my PC to play games - its doing some stuff to the data here and the data coming over the internet (WoW) for example. I might use it to fix a picture - or skype to my friend in Japan.
Why would we cease to need PCs? When we no longer need to do any significant 'stuff' to data no matter where it's stored OR we don't feel the need to do this stuff locally.
I don't see that happening very soon. Computer Scientists have a track record for thinking of alot more STUFF to do..neat stuff.
This is why time sharing failed. People discovered that they would rather buy their own machine so they could do that "stuff" quicker and faster not have to share with other people.
Anytime their is a lull in technology this end of the PC thinking starts to pop up. But soon as really interesting and difficult things like accurate universal translation, holodeck like 3d games, useful personal robotic companions and more start to actually appear people will be less willing to share their "CPU time." They will want to own it..
Anyway we will see what shakes out but I think this is a rather dim view of possible progress. The 'PC' might disappear but people are going to want to own their own processing power so in any real sense its going to live on..
AOL was a successful walled garden for many years, but not forever.
Charlie, the future you describe for the iPad also reads like the future of Google's Chrome OS.
The fact that Google has gone so far as to create Chrome proves that they've seen the future are are prepared for it. The difference between Google and Apple is that Google's infrastructure is already in place and is mind bogglingly powerful. Chrome will be wonderfully open, it is linux after all.
@ 304
I'm definitely not a true programmer, but you are so smart that you looked beyond what I was trying to say. I'm talkin' about basic computin', which is what the baby boomers do! For a baby boomer to worry a bug in the code is beyond my point.
@ 304
I'm definitely not a true programmer, but you are so smart that you looked beyond what I was trying to say. I'm talkin' about basic computin', which is what the baby boomers do! For a baby boomer to worry about a bug in the code is beyond my point.
Comparing USA's broadband to UK or Japan is apples to oranges. Try updating a landmass of 3.719 million square miles compared to a 94,058 square miles. Same applies for Japan.
Try updating a landmass of 3.719 million square miles compared to a 94,058 square miles.
But it's not a landmass of 3.719 million square miles in population terms -- more than 50% of the population lives in large cities/metropolitan districts. You only have to hook up the population, not the uncharted wildernesses of North Dakota and Nebraska; the wilderness you can leave to satellite phones.
Not a good trustworthy theory....
When the iPad came out, I thought of this series of videos that (strangely) were produced by Microsoft a few years back. I met the Microsoft VP that was in charge of producing these as he needed a venue in Hong Kong to film some sequences. Anyway, it strangely looks closer to Apple's future reality than a Microsoft future vision:
http://www.youtube.com/watch?v=V35Kv6-ZNGA
http://www.youtube.com/watch?v=W0UuzwS2z-8
I enjoyed the article, but your use of plural verbs where singular ones were needed was distracting.
I enjoyed the article, but your use of plural verbs where singular ones were needed was distracting.
You're American.
Standard English usage is to refer to corporations -- collective groups of people -- as plurals.
Consider your deviant colonial usage sneered at ;-)
Microsoft cancelled plans for the Courier to please OEMs in their ecosystem... apparently Gates was part of that decision. I know, hard to believe after he championed the product for well over a decade; they gave up just when they were in sight of having their own product. They probably thought the odds of beating the successful iPad, and the likes from OEMs, was not a war they wanted to fight now as they are all focused to "being in the cloud" somehow, somewhere. With Microsoft far behind in consumer innovation and time to market with the Kin (competing with the iPhone) and the Zune (competing with the iPod), there is so much underdogging that one is willing to play. Ozzie, where are you?
Okay, I've only made it to comment 177 but I'm getting there.
Have you ever bought a book or a DVD and then decided you'd store it in a public place? Even though you may only watch that DVD once a year, I'm guessing you don't store it in the lobby of your apartment building and just HOPE it will be there next time you want it. But for some reason, you expect people to do this with their private data?
Um, I store all my cash at the bank. I expect it to be there to be withdrawn when I need it. I can access it from wherever I want with a shiny piece of plastic and a password. Sure, the bank might go out of business. Sure, they might be robbed or burnt down. But they have security measures in place and contractual obligations that make me feel safe enough to keep my lifes worth with them.
Seems similar to me.
While your analysis of the Apple/Flash situation may (or may not) be correct, your grammatical use of possessive nouns is highly unusual and personally I would say highly annoying.
I note these examples from your article: "HP are...", "Microsoft are...", Apple are...," yet you refer to "data is stored" when, to be grammatically and numerically accurate it should be "data ARE stored."
Why is this a big deal? A company is an individual unit, even if comprised by multiple sub-units or multiple personnel "units." If you are teaching others to use incorrect grammar you are doing more harm than you might imagine. Still not sure? Imagine if I were to have said, "If you IS teaching others..." wouldn't that make any difference to you? One would likely be surprised to hear you say "no."
And while not many would argue that you should use "data are" rather than today's more accepted "data is" should further prove my point. Just because you is [sic] writing in English don't mean nothing [sic] unless your writing are [sic] good.
Do we agree? If we does, we is good.
While your analysis of the Apple/Flash situation may (or may not) be correct, your grammatical use of possessive nouns is highly unusual and personally I would say highly annoying.
No it's not: you're American, I'm British, and you're in the wrong (British usage wins, on this blog).
Next, I suppose you'll be telling me you drive on the "right" side of the road even though most of the world's population drives on the RIGHT side rather than the LEFT. Trust me, when I'm driving in the UK I will be driving on the left side even if it isn't "right." Likewise, when visiting Los Angeles one would be foolish to drive on the left side even if he thinks he is right by doing the opposite.
As all good writers know, it's usually best to write for your audience. If most are British, write in the British style. If most are American, write in the American style. If most is urban and speaks Ebonics, best you be write in urban style.
But, these is just suggestions for yous who want to write well, right.
YO! sounds great... Bring it on! Its time the "computer" became a real world user device, instead of a techy annoyance that you keep having to get your brother/father/son/techy friend to fix.
Next, I suppose you'll be telling me you drive on the "right" side of the road even though most of the world's population drives on the RIGHT side rather than the LEFT.
Nope, I drive on the CORRECT side of the road.
Here's a hint: this is MY blog. I set the grammar rules, in accordance with English usage. You are a guest here. Read the moderation policy before posting again, lest I consider you to be a troll and ban your sorry ass.
In my humble Opnion Steve Jobs is overestimating his power. Newer in the history any new technology has been successful without keeping backwards compatibility.
OS/2 was much superior to Windows, but Windows was just the more backwards compatible challenger. Linux and Open Ofiice only became usable as a desktop when it begun to be able to be backwards compatible with the mainstream's documents
With his crusade against Flash, which is currently in use in millions of websites, he must fail and he will fail.
Does he really think he has the power to ignore millions of creative Flash developers and millions of their customers in communication departments who do work using his Apple products?
Does SJ really think they will be glad to recode their sites just to please his visionary dreams?
If I were one of them, I would be so angry that I'd never buy anything else from Apple again in my life!
Laszlo
Newer in the history any new technology has been successful without keeping backwards compatibility.
Tell that to Steve Jobs -- specifically, explain why the Macintosh was a complete failure (zero backward compatability, remember).
Does he really think he has the power to ignore millions of creative Flash developers and millions of their customers in communication departments who do work using his Apple products?
Yep, apparently he does. And yep, YouTube have recoded all their videos to H.264; according to a recent study, 60% of videos on the web are already iPad compatible, and the proportion is rising fast.
"OS/2 was much superior to Windows, but Windows was just the more backwards compatible challenger."
Were you around at the time. MS kept changing the Win API, many feel to block OS/2. And IBM was still in the mode of doing EVERYTHING as if it was a mainframe sale. Gerstner worked hard to change that but OS/2 died long before he got close to that goal.
"With his crusade against Flash, which is currently in use in millions of websites, he must fail and he will fail. Does he really think he has the power to ignore millions of creative Flash developers and millions of their customers in communication departments who do work using his Apple products?"
A key point. As of now there's not a production version of FLASH on any mobile device. Maybe this will change after Google's conference this week but just maybe a part of this was Job not wanting Apple to be the point on Adobe's deployment of this technology.
Nicely ironic really that the company that has historically been the tools merchant for those "creative Flash developers and millions of their customers in communication departments", is the one to be confronting their favourite tool of trade.
Flash has always been ah - one of those things that is bound to exist in our world, but really wouldn't be needed if things were done a better way. If half the energies expended in those creative departments went instead towards a a bunch of good artistically minded machine coders (you know - the ones doing amazing things with the machines around about when Charlie was doing uni) .. well, what a world that would be.
Same thing could be said of the games industry really as that of the web publishing. Actually it's sort of melding into one nasty thing lately. Makes you want to turn off the computer and go read a good book i tell ya.
Anyway I think there's a lot of folks around who'd like to see flash return from whence it came. There are a few flash killers lurking around besides apple. Unity engine for example brings things closer to what people 'should' be doing when making their machine dance.
Ah am I the only one a bit scared that 'cloud computing' - as nice and environmentally friendly as it appears, is just another abstraction layer that lets us happily dismantle the planet beneath us before we have a clue on how to live on it sustainably?
Ah am I the only one a bit scared that 'cloud computing' - as nice and environmentally friendly as it appears, is just another abstraction layer that lets us happily dismantle the planet beneath us before we have a clue on how to live on it sustainably?
That is a topic for an entirely different discussion.
(And as we're dropping back to one legit comment a day -- and a dozen spams -- I'm closing the comment thread herewith.)