Making predictions about Apple is risky business. Heck, predicting anything in the tech industry is risky business, and most people who do would rather you eventually forget they ever did: as we know, being a pundit means never having to say you’re sorry.
I’m willing to risk some skin on one matter, however: Apple’s operating system future. In short: Mac OS X is the new Classic, and iOS will move over to non-touchscreen interfaces, the desktop, and x86, while taking on more Mac-like features. This might seem unexpected coming from me, critical as I have been here of many aspects of the new computing iOS in particular embodies (but also the Mac App Store, Gatekeeper, etc.). But while I am critical of these aspects, I am also lucid enough to have realized, in particular when the iPad shipped, that iOS is a revolution, and that all computers will one day work that way.
The iPad revolution
I used to think that our “old world” user interfaces, while not perfect, could eventually be learned by a majority of the population; I did not believe that only stupid people refused to work with computers (I very well understand unwillingness or inability, which sometimes aren’t completely distinct, to deal with the necessary complexities of today’s computers), but I also believed that our current state in computing was as good as it could be given the constraints, and that eventually computer usage would become part of the basic skills, like literacy and mathematics. In fact, I still believe this could have happened with current computers given enough time.
But Apple has shown a much better way. The Mac and Windows PC of 2010 are pre-alphabetic writing systems; nice pre-alphabetic systems, mind you, that anyone could learn in theory, but in practice only a minority of scribes were ever literate. By contrast, iOS is the first alphabetic system.
During the iPad introduction, Steve strangely chose to trace the iPad’s roots back to the beginnings of Apple, with the Apple II (including a rare reference to Woz) and the first PowerBook; strangely, because what the introduction of the iPad best corresponds to is the original Macintosh. Both are compact computers with completely different graphical interfaces first seen elsewhere (the Lisa/the iPhone), but which suddenly took on a much broader significance, and were as a result derided as being for play and not real work. While it was not necessarily the Mac itself which succeeded afterwards, we all know where the ideas the original Macintosh pioneered in the marketplace ended up.
What about the incumbents?
But does this mean that current operating systems will change to become more like iOS? I don’t think so; or rather, for those which try to do so, I don’t believe it will work. As we have seen with Mac OS X, operating system transitions are as much about transitioning user expectations as they are about the technology itself. A change of this magnitude requires new operating environments that people can see with a fresh eye, rather than modifications to an existing OS, where any change is perceived as grating, or may not really work unless most applications are on board.
For instance, in Mac OS X Lion Apple changed the built-in apps to automatically save documents being edited, and to adopt a number of new behaviors (for instance, when a document is about to become an email attachment, the application editing the document, if any, is asked to save its work to disk, so that the document state the user sees in the editing application is what actually gets attached), and encouraged third-party applications to do the same. This is undeniably progress: think of the aggregate amount of work that is no longer lost to power failures or application crashes across all Mac users running Lion or later.
But many people (including noteworthy people) complained about one part of this change: the loss of the “Save As…” command. The very notion of that command (that you are currently making modifications to a file which you intend not to apply to that file, but to write elsewhere instead) is deeply rooted in the assumption that the system does not write file modifications until you explicitly tell it to; so “Save As…” is foreign, at best, in an autosave world. At the same time, people have gotten used to it, so are they wrong to miss it? After some adjustment attempts, Apple eventually relented and added back full “Save As…” functionality, but I don’t know if they should have; in particular, I have not checked how it interacts with the “attach to an email causes a save” feature.
Speaking of this feature, what do you think happens when someone uses it with, say, a TextEdit document currently being edited, where it works as expected, and then uses it on an unsaved Word document (assuming Word has not been updated to support the feature)? The last saved version gets attached instead of what the user expects, and he only finds out when his correspondent replies with an inflammatory email telling him there is nothing new in the document, and what have you been doing all this time? As a result, the feature becomes poisoned in the mind of the user, who will adopt paranoid, near-superstitious steps to ensure the latest version always gets used in attachments. A feature that works unreliably is worse than no feature at all.
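To make the failure mode concrete, here is a toy sketch of the “save before attaching” handshake; all names are invented for illustration and correspond to no actual Apple API. The system asks the editing app to flush its in-memory state before grabbing the file, and an app that never implemented the hook silently yields stale content.

```python
# Toy model: the system-side attachment logic politely asks the editing app
# to save first; an app that doesn't support the hook attaches stale data.

class Editor:
    def __init__(self, saved, unsaved_edits="", supports_autosave_hook=False):
        self.saved = saved                      # content already on disk
        self.unsaved = unsaved_edits            # edits living only in memory
        self.supports_autosave_hook = supports_autosave_hook

    def flush_to_disk(self):
        self.saved += self.unsaved
        self.unsaved = ""

def attach(editor):
    # Ask the editor to save before reading the file off disk.
    if editor.supports_autosave_hook:
        editor.flush_to_disk()
    return editor.saved                         # what actually lands in the email

textedit = Editor("Draft v1. ", "New paragraph.", supports_autosave_hook=True)
word = Editor("Draft v1. ", "New paragraph.", supports_autosave_hook=False)

print(attach(textedit))  # → Draft v1. New paragraph.
print(attach(word))      # → Draft v1.   (the edits never left memory)
```

The asymmetry is the whole problem: from the user’s point of view both apps look identical at attach time, yet only one of the two attachments is current.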
Another such problematic change is auto-quitting applications that are not frontmost and have no document open. What is the issue here? After all, the application has no visible UI in this case, and when recalled it would behave the same way (typically, by showing a new untitled document) whether it was auto-quit or not. Except this is not actually transparent at all: most notably, the application disappears from the command-tab switcher, so when the user wants to switch back to it using command-tab, he can’t.
What about the feature of saving the list of applications running upon shutdown, and reopening them upon restart (which in turn causes them to restore their open windows and state, if they support it)? There is a problem with that too: applications can still veto a shutdown. So if half the applications have already quit when one blocks the shutdown, then once you have cleared the situation with the reluctant app and shut down again, only the applications still running at that point are saved and restored. So, oops, the feature works only partially and inconsistently, which I argue is worse than not having the feature at all.
Etc., etc. The list goes on. Now, as John Siracusa ponders in his Lion review, users are expected to look down on applications (or mechanisms, like command-tab) that ruin the experience for everyone else, but it is not necessarily easy for the average user to make the correct connection, especially when the failure (a wrong attachment, an incomplete list of applications relaunched after the reboot) is removed from its cause. But even these cases are easier to trace than what is coming next.
Copland 2010: Avoided
In Avoiding Copland 2010, John boldly (given, again, the risk inherent in doing so) tried to predict Apple’s operating system future, referring to the situation of the Mac prior to Mac OS X, Copland being Apple’s failed attempt to give the Mac a modern operating system. As he explained, what the “classic” MacOS lacked compared to Mac OS X was:
- systemic features: memory protection and preemptive multitasking,
- a modern programming framework: Cocoa with its object-oriented programming model, in the form present in Mac OS X,
the combination of the two being more than the sum of its parts. He then went on to draw the parallels that could be made between the Mac OS X of 2005 and the traditional MacOS of 1995, to figure out what Mac OS X would need in the future in order to stay competitive; but he focused on the second category of improvements, namely what kinds of programming paradigms Apple might need to adopt (functional programming, garbage collection, dynamic languages, etc.). I remember reading his article and being skeptical: after all, with memory protection and preemptive multitasking in Mac OS X, applications were now isolated from each other; so, from then on, whatever new operating system functionality turned out to be needed to stay relevant (be it systemic features or a better programming framework), applications could adopt it individually without having to have the others on board, and no clean break similar to Mac OS X would ever be needed again.
Of course, I was wrong. But not because a better programming framework turned out to be necessary (I don’t feel there is much competitive pressure on Apple on this front); rather, I forgot the one big globally shared entity through which applications can still interfere with each other: the file system. So a new clean break is likely needed to provide sandboxing, the systemic feature to solve the situation. With sandboxing, applications… well, I hesitate to say that they would be properly isolated, because I now know better than to proclaim that, but they certainly would be better isolated by an order of magnitude.
And on the sandboxing front, Apple is certainly among the most competitive, if not in front (at least as far as client software is concerned)… thanks to the iPhone, which was sandboxed from day one. Thanks to a hardware opportunity a few years prior, Apple already has in house a production-quality, shipping operating system with a third-party software ecosystem, ready for the challenges of the ’10s; compare with the ’90s, when they had to buy such a system from NeXT. This time Apple is ready.
Playing in the sandbox
I have not yet shown why sandboxing is so necessary; it is certainly less obvious than for memory protection and preemptive multitasking. Doing so would take an entire post, but I should mention two advantages here: demystifying the file system, and privacy. For the first, it may be hard for us nerds to realize how mysterious the file system, or at least some aspects of it, is to the average user, what with it mixing the operating system, applications, user files, downloads, and the like. Why did this folder for a game (containing, as it turns out, game saves) suddenly appear in the Documents folder? And so forth; I am sure many such interactions that I no longer even think twice about are instead frustrating mysteries for many users. Sandboxing would put an end to that.
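As a rough illustration of how per-app containers demystify things, here is a toy path broker; the layout is merely modeled on a container scheme, not a faithful reproduction of macOS’s, and the function names are mine. Each app’s files are confined to its own container, so the game’s “Documents/saves” folder lands in the game’s container rather than in the user’s Documents folder, and attempts to escape are simply refused.

```python
# Toy per-app container model: app-relative paths are mapped into the app's
# private container, and paths that would escape it raise an error.
import os.path

CONTAINERS = "Library/Containers"

def resolve(app_id, path):
    """Map an app-relative path into the app's container, refusing escapes."""
    root = os.path.join(CONTAINERS, app_id, "Data")
    full = os.path.normpath(os.path.join(root, path))
    if not full.startswith(root + os.sep):
        raise PermissionError(f"{app_id} may not touch {full}")
    return full

# The game writes to "its" Documents, not the user's:
print(resolve("com.example.somegame", "Documents/saves"))
# → Library/Containers/com.example.somegame/Data/Documents/saves

# Reaching outside the container is refused:
try:
    resolve("com.example.somegame", "../../../Documents/taxes.pdf")
except PermissionError as e:
    print(e)
```

The point of the sketch is the invariant, not the paths: nothing an app does can surprise the user outside its own little world.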
As for privacy, you may say that you trust the software installed on your machine. But that’s the thing: you need to entirely trust something before you install it as a Mac app; there is no way to benefit from an application that is useful yet in which you do not have complete trust. Contrast that with the web, which is the platform actually challenging Apple the most in this area: with its same-origin restrictions, the web already has a strong sandboxing model, one which allows people to use Facebook and a myriad of other services, easily adopting a new one with low commitment and low friction, confident that each service only knows what the user entered into it (granted, the myriad of “like” buttons across the Internet means that Facebook is aware of far more than what users explicitly tell it, but that’s between Facebook and the sites integrating the “like” button; sites can choose not to do so). Apple needs sandboxing in order to stay competitive with the web.
Of course, you have probably noticed that Apple actually tried to retrofit sandboxing onto Mac OS X. When I heard that Lion would feature application sandboxing, I tried to imagine how such a feat could be possible while keeping the Mac file usage model, and figured they would have to turn the open and save panels into a privileged external application, which upon selection by the user would grant the sandboxed app the right to operate on the file, all while keeping API compatibility; no small feat. Which is exactly what they did. Impressive as it may be, I feel this effort to maintain the “all files mixed together” file system while adopting a sandboxed underlying infrastructure will eventually go to waste. Apple will need to build a new user model to go with the sandboxed infrastructure anyway (for instance, users could rightly wonder why a file “just next” to one they selected is not automatically picked up, say the subtitles file next to a movie file): a new user model where documents of each app are kept together by default, and “all files mixed together” is reserved for specific scenarios for which there could be a better UI.
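The tension above can be seen in a toy sketch of that privileged open panel (class and method names invented): the panel, running with full privileges, grants the sandboxed app access to exactly the file the user picked, so the subtitles file sitting right next to the chosen movie is, by default, off limits.

```python
# Toy privileged open panel: the user's explicit pick is the access grant;
# anything not picked, even an adjacent file, stays unreadable to the app.

class OpenPanel:
    def __init__(self):
        self.granted = set()

    def pick(self, user_choice):
        # Runs outside the sandbox; records the user's selection as a grant.
        self.granted.add(user_choice)
        return user_choice

    def app_read(self, path):
        # Runs inside the sandbox; only granted paths are readable.
        if path not in self.granted:
            raise PermissionError(f"sandbox: no grant for {path}")
        return f"contents of {path}"

panel = OpenPanel()
movie = panel.pick("/Movies/holiday.mkv")
print(panel.app_read(movie))               # works: the user selected it
try:
    panel.app_read("/Movies/holiday.srt")  # the file "just next" to it
except PermissionError as e:
    print(e)                               # denied: never explicitly granted
```

Which is precisely why a user model built around per-app documents fits the infrastructure better than one where any app may expect to wander the whole disk.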
And from a software development standpoint, the retrofitting of sandboxing onto Mac OS X has been less than stellar: deadlines pushed back multiple times; Apple needing to implement scenarios they had not foreseen but which turned out to be so important for users that apps would not ship without them, like security-scoped bookmarks; notable app developers publicly giving up on sandboxing; other apps having to become clunkier or lose features in order to adopt sandboxing… All for little benefit since, as far as I can tell, an unsandboxed app can still read the data stored by sandboxed apps.
The Mac OS X transition model
In theory, Apple could have introduced Mac OS X in a less disruptive fashion. Instead of pushing it to consumers when they did, they would have kept Mac OS X 10.1 and 10.2 developer-only, while encouraging developers to ship carbonized applications. Then, for 10.3, with a sufficient number of Mac OS X-optimized apps around, they would have introduced it to consumers as simply the evolution of MacOS 9, with minimal cosmetic changes: carbonized apps and new Cocoa apps running natively, other apps running in a completely transparent Classic environment, and users benefiting from a more stable operating system without any visible transition.
But in practice, that would never have worked, for numerous reasons; foremost is the fact that third-party application developers are not perfectly cooperative. They are ready to listen to and adopt Apple initiatives, but there had better be some sort of immediate payout, because otherwise they are spending time, and thus money, changing their code while adding exactly zero features. So developers would have waited for Apple to ship Mac OS X before carbonizing their apps (otherwise, what’s the benefit in a MacOS 9 environment?), and Apple would have waited for a critical mass of carbonized apps before shipping Mac OS X to consumers. Uh oh.
Instead, by shipping Mac OS X 10.0-10.1 to early adopters, then progressively to less early adopters, Apple provided an incentive for developers to ship carbonized apps right from the start: first, some specialized developers would have enough of their audience on Mac OS X to make it worth it, then more and more developers would find it worthwhile to port, etc. More importantly, early adopters and early applications would actually set the expectations, instead of incumbents setting them as in a “progressive” transition; so if an app got ported technically, but not in spirit (say, it used a lot of resources), it would stick out among the ported apps and be pressured to conform to the new standards. And, playing just as important a role, the Classic ghetto clearly marked which apps would all go down together whenever one of them crashed, delimiting the boundaries of the negative network effects/broken windows (where a minority of apps could ruin it for everyone else). The Aqua interface was an essential part of this strategy: no mere eye candy, but eye candy that coincided with a major OS environment change and helped mark it for average users.
By contrast, what Apple is currently doing with the “iOS-ification” of Mac OS X is merely to add superficial enhancements from iOS to a system whose behavior is still fundamentally defined by existing users and existing applications, a system which has to stay compatible with existing installs, existing peripherals, existing workflows, existing mechanisms, etc. Mark my words: history will look back on the recent “iOS-ification” of Mac OS X as being as quaint and superficial and meaningless as the Mac-like interface of the Apple IIgs.
Mac OS X as the Classic of iOS
I expect that Apple will soon realize that trying to drive Mac OS X toward having all iOS features is a dead end. And some setback or other, leading to cost-saving measures, or just their obsession with efficiency, will make them suddenly question why the heck they are spending effort maintaining two operating systems in parallel. I do not think Mac OS X and iOS will merge, either; that kind of thing only happens in tech journalists’ wet dreams. There will be no Mac OS XI. Instead, my prediction is that the Mac will get a new OS, and Mac OS X will run in a Classic mode inside that OS. But Apple already has a new OS, called iOS, so that new OS would be a “port” of iOS to the desktop, called… “iOS”.
In practice, this would allow Apple to provide an incentive for applications to adopt modern features like sandboxing and a more modern interface (larger click/touch targets, etc.), by having them run outside the Mac OS X ghetto and inside iOS instead, thus for instance making their data impossible to read from an app running inside that ghetto; and I am sure other such immediate incentives are possible, given the benefits of sandboxing to users. Currently Apple can get some applications to adopt sandboxing thanks to their clout, embodied in the Mac App Store, but application developers resent it; there have to be more positive ways of encouragement. Also, the “iOS-native” area would again be defined by the early adopters (here, in fact, the existing mobile iOS apps; see below), so an application lacking, for instance, autosaving would quickly stick out and be pressured to adopt it, if it weren’t mandatory in the first place in order to run outside the ghetto.
Meanwhile, the new Classic environment, with Mac OS X running inside, would allow users to keep using “legacy” Mac OS X apps during the transition, exactly the same way the Classic environment of Mac OS X eased the transition from MacOS 9. The current Mac OS X interface (which only has a passing resemblance to Aqua at this point) would be the new Platinum, a ghetto inside the world of the native iOS interface, which would play the role of the new Aqua.
Now, given all the speculation about Apple potentially adopting ARM chips for future MacBooks, or potentially using Intel chips in future mobile devices, I think it is important to clarify what I am talking about here. I expect Apple to keep using the most appropriate chip for each device class, which for each existing Apple product line is still ARM for tablets and smaller, and Intel for the MacBook Air and up, and I don’t expect that to change in the short run (Intel and ARM chips are only now starting to meet at the performance/power point in the no-man’s land between the two device classes).
So I expect that iOS, for the purpose of running on laptop- and desktop-class devices, would be ported to x86. This is not much of a stretch; extremely little of one, in fact. First, iOS already runs on x86 (at the very least, the whole library stack and some built-in apps), and second, most iOS App Store apps already run on x86, both thanks to the iOS Simulator. This unassuming piece of iOS app development tooling, which includes the iOS runtime environment, runs natively on the host Mac, without any processor emulation involved, and iOS apps must be compiled for x86 to run on it. Since many iOS development workflows rely on running the app in the iOS Simulator (for convenience when debugging, better debugging tools, and easier, faster design iteration), most iOS apps are routinely compiled for, and work on, x86, and would easily run on the x86 iOS port. And Apple, having most of iOS running on x86 already, would have little trouble porting the remainder; once ported, it could even be installed on Macs shipping today, without requiring new machines.
What would the interface of iOS on the desktop be? I can’t remember in which Apple product introduction this was mentioned, but the host said that touching a vertical desktop screen does not work in practice, and that reading and viewing a horizontal desktop screen isn’t practical either (and that they had tried). So while some are clamoring for iOS to come to the laptop (and desktop) form factor so that we can finally have large touchscreen devices, they are misguided: if it were practical, Apple would already have done it on the Mac, with Mac OS X. Maybe this will happen some day, but it would happen independently of iOS running on desktop-class devices.
Instead, since using a touch interface with a mouse is at least tolerable (as opposed to using an interface meant for a mouse on a touchscreen device), iOS on the desktop could be used, at least for its basic functions, with a mouse; but in fact a Magic Trackpad would be recommended (and bundled with all desktops from that point on), as providing something of the best of both worlds: multitouch, and the ability to browse a screen much larger than typical finger movements allow, using the usual mechanisms of acceleration, lifting and landing elsewhere, etc. Of course, since it would be a touch interface separate from the screen rather than a touchscreen, there would need to be a visible pointer of some sort; likely not the arrow shape we’ve all known and loved since 1984, as that arrow is meant for clicking with pixel accuracy, which is antithetical to the iOS interface paradigm. Maybe something as simple as a circle could do the trick.
There remains the most important and thorny question: what would these laptops and desktops now meant to run iOS be called; would we still call them Macs? I have no idea.
iOS would have to change, too
But just as Mac OS X was not Rhapsody, iOS, in order to run on the desktop and be an attractive target for current desktop apps and users, would have to adopt some Mac features. For one, the model of designing iOS apps specifically for each screen size clearly wouldn’t scale; there would have to be ways for apps to reliably extend their user interface. Most of the infrastructure is there, mind you (even with springs and struts a lot can be done), but there would have to be new mechanisms to better take advantage of space when it is available; think CSS media queries and responsive design for native apps.
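As a sketch of what springs and struts buy you, here is a simplified one-axis autoresizing computation. The function and flag names are mine, and unlike the real implementation (which apportions the size change proportionally to each flexible part’s size), this version splits the parent’s size change equally among the springs; the idea is the same: struts pin margins and sizes, springs absorb the change.

```python
# Simplified springs-and-struts resize along one axis: fixed parts (struts)
# keep their value, flexible parts (springs) share the parent's size delta.

def autoresize(parent_old, parent_new, child_pos, child_len,
               flexible_leading, flexible_size, flexible_trailing):
    """Compute a child's new position and length along one axis."""
    delta = parent_new - parent_old
    springs = sum([flexible_leading, flexible_size, flexible_trailing])
    if springs == 0:
        return child_pos, child_len     # fully pinned: nothing moves
    share = delta / springs             # each spring absorbs an equal share
    new_pos = child_pos + (share if flexible_leading else 0)
    new_len = child_len + (share if flexible_size else 0)
    return new_pos, new_len

# A text field 20pt from each edge with a flexible width: parent grows 500 → 800,
# the field stays at 20 and stretches from 460 to 760.
print(autoresize(500, 800, 20, 460, False, True, False))   # → (20, 760.0)
# Same field with only a flexible trailing margin: it stays put entirely.
print(autoresize(500, 800, 20, 460, False, False, True))   # → (20, 460)
```

This is exactly why springs and struts alone don’t get you responsive design: the child can stretch or stay pinned, but it can’t restructure itself when the available space crosses a threshold, which is what media queries add on the web.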
But having apps themselves scale with the screen is only part of the challenge: with all that screen space, we certainly would not use it to run one app at a time, so ways to have multiple iOS apps on screen at the same time would have to be devised. It would be an awful lot of work to define and implement this new on-screen multitasking model for the 21st century, no doubt about it.
Then iOS would have to adopt some sort of document filing system, of course, as well as Developer ID, because having all apps come from either the iOS App Store or the Mac App Store would be untenable for a desktop machine with many uses, corporate ones in particular.
Then it would need the ability to specify different default apps for, say, browsing or email, as well as more comprehensive systems for apps to communicate with each other: services, for instance, and better URL handlers (with a UI to specify the default in case more than one app claims support). And an iCloud that doesn’t suck, but that’s a necessity for iOS on mobile anyway.
Most significantly, iOS for the desktop would need to support some Mac OS X APIs, possibly with some restrictions and dropped methods/objects (somewhat like Carbon was to the Mac Toolbox), most notably AppKit, but not only that one; though it is safe to say Carbon wouldn’t make the jump.
And Apple would need to add (meaningful) removable media support, external drive support, backup onto such a drive, support for burning discs, wired networking, multiple monitors, a Terminal app, a development environment, some solution to the problem of communal computing/multiple users, etc., etc.
So, easy, right? Just a little bit of work. In all seriousness, it would be a big endeavor, bigger maybe than Mac OS X, and, like Mac OS X, iOS on the desktop would probably require a few iterations to get right. That is why Apple, whatever they actually intend to do (because they will not keep maintaining two operating systems forever), should start the process by telling developers as early as possible; Mac OS X, for instance, was first introduced in May 1998 (how time flies…), and wasn’t really ready to replace MacOS 9 until 2002 or so.
So, what do you think will happen? Even if I am completely off base here, I hope at least to have provided some interesting food for thought on the matter.