Steve

I wasn’t sure I should write something, at first. Oh, sure, I could have written about the fact I didn’t dress specially Thursday morning or didn’t bring anything to an Apple Store, as I thought for Steve I should either do something in the most excellent taste or nothing, and I couldn’t think of the former (and so I kicked myself Saturday when I went to the Opéra Apple Store to buy a Lion USB key, saw them, and thought “Of course! An apple with a bite taken out of it… dummy!”). Or I could have written about the fact he was taken from his families at way too early an age. Or about the fact that, except for this one (and variants of this one, though one would have been enough), I was appalled by the editorial cartoons about the event (“iDead”? Seriously?). Or about a few obituaries I read or heard where the author put some criticism along with the praise (which by itself I don’t mind, honestly, he was kind of a jerk), but put it in a way that suggested the good could have been kept without the flaws, while, for instance, in an industry where having different companies responsible for aspects of the user experience of a single device is considered standard practice, being a control freak is essential to ensure the quality of user experience that has made Apple a success. Or about how his presence in the keynotes during his last leave of absence (while, on the other hand, he stepped back from presentation duties during the previous one), and his resignation merely six weeks ago, both take on a whole new meaning today.

But at the end of the day, what would I have brought, given the outpouring of tributes and other content about Steve Jobs, many from people more qualified and better writers than I am? Not much. However, I read a piece where the author acknowledges the impact Steve Jobs had on his life, and I thought I should, too, pay my dues and render unto Steve that which is Steve’s, if only to help with the cathartic process. I hope it will contribute something for his family, his family at Apple, his family at Disney/Pixar, and the whole tech and media industries in this time of grief.

I was quite literally raised with Apple computers; from an Apple ][e to the latest Macs, there has always been Apple (and only Apple) hardware in the house, for which I cannot thank my father enough. As a consequence, while I had no idea who Steve Jobs was at the time, he was already having a huge impact on me. Not because I think he designed these computers all by himself, but because, by demanding seemingly impossibly high standards from the ones who designed them with him, or in the case of later Macs, by having made enough of a mark at Apple that the effect was (almost) the same, he ensured a quality of user experience way beyond that of any competitor, which allowed my young self to do things he wouldn’t have been able to do otherwise, and taught him to expect, nay, demand similar excellence from his computing devices.

Then I started learning about him when he returned to Apple in 1997, from a press cautiously optimistic that the “prodigal son” could get Apple out of trouble, and then saw how spectacularly he did so. I indirectly learned from him (in particular through folklore.org) that it takes a great deal of effort to make something look simple, that there is no such thing as good enough, merely good enough to ship this once (because, on the other hand, real artists ship), and that the job of the software developer is to be in service of the user experience, not to make stuff that is only of interest to other software developers and remains in a closed circuit.

Imagining my life had Steve Jobs not made what he made is almost too ludicrous to contemplate. Assuming I would even have chosen a career in programming, I would be developing mediocre software on systems about as usable as a mid-nineties Macintosh, if that, and with very little of its elegance (come on: setting aside any quibble about who copied whom, do you think Windows or any other operating system would be where it is today were it not for the Mac, at the very least, competing with it and prompting it to do one better in the usability department?). And the worst thing is that I would have been content with it and considered it as good as it gets, and it would have been the same for almost all of my peers.

It’s thus safe to say that as far as my influences go, Steve Jobs is second only to my closest family members. By envisioning the future, then making it happen through leadership, talent and just plain chutzpah (for good or ill, it doesn’t seem to be possible to make people believe in your predictions of what the future will be made of, other than by actually taking charge and realizing it), he showed us what computers (and portable music players, and mobile phones, etc.) could be rather than what most people thought they could be before he showed us. And by teaching a legion of users, multiple generations of developers, and everyone at Apple to never settle for great but always strive for the best, he has ensured the continuation of this ethic for a few decades, at least (this is, incidentally, the reason why I am not too worried about the future of Apple, Inc.).

Thank you Steve. Thank you for everything. See you at the crossroads.

Benefits (and drawback) to compiling your iOS app for ARMv7

In “A few things iOS developers ought to know about the ARM architecture”, I talked about ARMv6 and ARMv7, the two ARM architecture versions that iOS supports, but I didn’t touch on an important point: why you would want to compile for one or the other, or even both (thanks to Jasconius at Stack Overflow for asking that question).

The first thing you need to know is that you never need to compile for ARMv7: after all, apps last updated at the time of the iPhone 3G (and thus compiled for ARMv6) still run on the iPad 2 (provided they didn’t use private APIs…).

Scratch that: you may have to compile for ARMv7 in some circumstances. I have heard reports that if your app requires iOS 5, then Xcode won’t let you build the app as ARMv6-only. – May 22, 2012

So you could keep compiling your app for ARMv6, but is that what you should do? It depends on your situation.

If your app is an iPad-only app, or if it requires a device feature (like video recording or the magnetometer) that no ARMv6 device ever had, then do not hesitate: compile only for ARMv7. There are only benefits and no drawbacks to doing so (just make sure to add armv7 to the Required Device Capabilities (UIRequiredDeviceCapabilities) key in the project’s Info.plist, otherwise you will get a validation error from iTunes Connect when uploading the binary, such as: “iPhone/iPod Touch: application executable is missing a required architecture. At least one of the following architecture(s) must be present: armv6”).
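
For reference, here is roughly what that entry looks like in the Info.plist source (a minimal sketch of mine, not taken from any Apple sample; any other capability your app requires would simply be an additional <string> in the same array):

<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>armv7</string>
</array>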

If you still want your app to run on ARMv6 devices, however, you can’t go ARMv7-only, so your only choices are to compile only for ARMv6, or for both ARMv6 and ARMv7, which generates a fat binary that will still run on ARMv6 devices while taking advantage of the new instructions on ARMv7 devices1. Doing the latter will almost double the executable binary size compared to the former; executable binary size is typically dwarfed by the art assets and other resources in your application package, so this typically doesn’t matter, but make sure to check the increase. In exchange, you will get the following:

  • ability to use NEON (note that you will not automatically get NEON-optimized code from the compiler, you must explicitly write that code; see the sketch right after this list)
  • Thumb that doesn’t suck: if you follow my advice and disable Thumb for ARMv6 but enable it for ARMv7, this means your code on ARMv7 will be smaller than on ARMv6, helping with RAM and instruction cache usage
  • slightly more efficient compiler-generated code (ARMv7 brings a few new instructions besides NEON).
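
To make the NEON point concrete, here is a minimal sketch of my own (the function name is made up) showing how a single source file can carry a NEON path that only exists in slices compiled with NEON enabled (in practice, the ARMv7 one), while the ARMv6 slice falls back to plain scalar code:

#include <stddef.h>

#if defined(__ARM_NEON__)
#include <arm_neon.h>
#endif

void AddVectors(float* dst, const float* a, const float* b, size_t count)
{
#if defined(__ARM_NEON__)
    /* Only compiled into slices built with NEON enabled (i.e. the ARMv7 one):
       process four floats per iteration. */
    size_t i = 0;
    for (; i + 4 <= count; i += 4)
    {
        float32x4_t va = vld1q_f32(a + i);
        float32x4_t vb = vld1q_f32(b + i);
        vst1q_f32(dst + i, vaddq_f32(va, vb));
    }
    for (; i < count; i++) /* leftover elements */
        dst[i] = a[i] + b[i];
#else
    /* ARMv6 slice (or any build without NEON): plain scalar code. */
    size_t i;
    for (i = 0; i < count; i++)
        dst[i] = a[i] + b[i];
#endif
}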

Given the tradeoff, even if you don’t take advantage of NEON it’s almost always a good idea to compile for both ARMv6 and ARMv7 rather than just ARMv6, but again make sure to check the size increase of the application package isn’t a problem.

Now I think it is important to mention what compiling for ARMv7 will not bring you.

  • It will not make your code run more efficiently on ARMv6 devices, since those will still be running the ARMv6 compiled code; this means it will only improve your code on devices where your app already runs faster. That being said, you could take advantage of these improvements to, say, enable more effects on ARMv7 devices.
  • It will not improve performance of the Apple frameworks and libraries: those are already optimized for the device they are running on, even if your code is compiled only for ARMv6.
  • There are a few cases where ARMv7 devices run code less efficiently than ARMv6 ones (double-precision floating-point code comes to mind); this will happen on these devices even if you only compile for ARMv6, so adding (or replacing by) an ARMv7 slice will not help or hurt this in any way.
  • If you have third-party dependencies with libraries that provide only an ARMv6 slice (you can check with otool -vf <library name>), the code of this dependency won’t become more efficient if you compile for ARMv7 (if they do provide an ARMv7 slice, compiling for ARMv7 will allow you to use it, likely making it more efficient).

So to sum it up: you should likely compile for both ARMv6 and ARMv7, which will improve your code somewhat (or significantly, if you take advantage of NEON) but only when running on ARMv7 devices, while increasing your application download to a likely small extent; unless, that is, you only target ARMv7 devices, in which case you can drop compiling for ARMv6 and eliminate that drawback.


  1. Apple would very much like you to optimize for ARMv7 while keeping ARMv6 compatibility: at the time of this writing, the default “Standard” architecture setting in Xcode compiles for both ARMv6 and ARMv7.

China declined to join an earlier coalition, Russia reveals

The saga of France’s liquidation sale continues (read our previous report). Diplomatic correspondence released yesterday by Russia in response to China’s communiqué reveals that China was asked to join an earlier coalition to acquire South Africa’s nuclear arsenal (an acquisition China mentioned in its communiqué as evidence of a conspiracy), but China declined.

This would seem to undermine China’s argument of an international conspiracy directed against it; at the very least, it strengthens the earlier coalition’s claim that its only purpose was to figuratively bury these nuclear weapons. It should be noted that the high-profile countries Russia and the USA are members of both coalitions.

China then answered with an update to its communiqué (no anchor, scroll down to “UPDATE August 4, 2011 – 12:25pm PT”) stating the aim of this reveal was to “divert attention by pushing a false ‘gotcha!’ while failing to address the substance of the issues we raised.” The substance being, according to China, that both coalitions’ aim was to prevent China from getting access to these weapons for itself, so that it could not have used them to deter attacks, and that China joining the coalition wouldn’t have changed this.

Things didn’t stop there, as Russia then answered back (don’t you love statements spread across multiple tweets?) that it showed China wasn’t interested in partnering with the international community to help reduce the global nuclear threat.

For many geopolitical observers, the situation makes a lot more sense now. At the time the France sale was closed and the bids were made public, some wondered why China wasn’t in the winning consortium and had instead made a competing bid with Japan. China and Japan are relative newcomers to the nuclear club, and while China’s status as the world’s manufacturer pretty much guarantees it will never be directly targeted, its relative lack of nuclear weapons is the reason, according to analysts, it has less influence than its size and GDP would suggest. Meanwhile, China is subjected to a number of proxy attacks, so analysts surmise increasing its nuclear arsenal would be a way for China to deter such attacks on its weaker allies.

So the conclusion reached by these observers is that, instead of joining alliances that China perceived as designed to keep the weapons out of its reach, China went all or nothing. But the old boys’ nuclear club still has means China doesn’t have, China lost on both counts, and now it is taking the battle to the public relations scene.

Geopolitical analyst Florian Müller in particular was quoted pointing out that, given the recent expansion of its influence, it was to be expected that China would be targeted by proxy, and that other countries were likely just following their normal course rather than engaging in any organized campaign.

So to yours truly, it seems that while the rules of nuclear deterrence may be unfair, it is pointless to call out the other players for playing by these rules, and it makes China look like a sore loser. But the worst part may be that the Chinese officials seemingly believe their own, apparently self-contradictory (if they are so much in favor of a global reduction of nuclear armaments, why wouldn’t they contribute to coalitions designed to take some out of circulation?) rhetoric, which would mean the conflict could get even more bitter in the future.

France goes down, its nuclear weapons, and China

So France is going belly up. Kaput. Thankfully not after a civil war, the strife being more on the political side, though a few unfortunately died in some of the riots. But after regions like Corsica, Brittany and Provence unilaterally declared independence, after Paris declared a real Commune in defiance of the government, and after Versailles, the usual fallback, did not seem safe either, it became clear there was no way out but the eventual dissolution of the old, proud French Republic; much like the USSR dissolved in 1991, but without an equivalent of Russia to pick up the main pieces, the Paris Commune being seen as too unstable.

Among the numerous geopolitical problems this raised, one stood out. Among its armed forces, the French Republic had under its control several nuclear warheads, the missiles to carry them, and a fleet of submarines to launch them. Legitimately terrified that these weapons could fall into the hands of a rogue state or terrorist group, the international community sustained the French government long enough for it to organize a liquidation sale of its nuclear armament and other strategic assets. But Russia certainly wasn’t going to let the USA buy them, and neither were the USA willing to see Russia get them. Realizing that making sure these weapons didn’t fall into the wrong hands was more important than for either party to take control of them itself, Russia, the USA, and a few other countries like India, the United Kingdom, etc. formed a coalition and jointly bid for, and won, the dangerous arsenal.

Though they agreed on a few principles before making this alliance, it was considered urgent to get control of the arsenal in the first place, and at the time the sale was closed the coalition had not agreed on what to do with those weapons. But most geopolitical observers and analysts agreed that the coalition would end up keeping the weapons around just in case, but inactive and offline, if it did not simply dismantle them outright; after all, for them to be used would require the joint agreement of all parties, an agreement that was extremely unlikely to ever be reached.

But China suddenly started publicly complaining that the members of the coalition were engaged in a conspiracy against it, citing military interventions from some of the coalition members in foreign countries, various international disagreements, and now this France liquidation sale (China did not take part in the coalition; it made, with one ally, a separate bid for these weapons but was eventually outbid by the coalition). Observers, however, were skeptical: these events did not seem connected in any way except for being mentioned together in that communiqué; plus, as if the joint ownership didn’t already ensure at least immobilization by bureaucracy, the coalition includes one partner of China: Brazil. And the fact the coalition spent quite a bit of money to acquire this arsenal, more than some initial estimates, probably only reflects the importance of keeping these weapons out of the wrong hands, given the unstable international landscape, what with rogue states, terrorist groups, less than trustworthy states gaining importance, etc. Not to mention China itself bid rather high in that auction.

In the end it is suspected that, while Chinese officials may believe this conspiracy theory themselves, these complaints made in public view were actually intended to fire up nationalism in the country, or even better, in the whole of East Asia.

The saga, unsurprisingly, didn’t stop there: Russia answered; read all about it in the followup. – August 5, 2011

In support of the Lodsys patent lawsuit defendants

If you’re the kind of person who reads this blog, then you probably already know from other sources that an organization called Lodsys is suing seven “indie” iOS (and beyond!) developers for patent infringement after it started threatening them (and a few others) to do so about three weeks ago.

Independently of the many other reactions this warrants, I want to show my support, and I want you to show your support, to the developers who have been thus targeted. Apparently, in the USA, even defending yourself to find out whether a claim is valid doesn’t just cost an arm and a leg, but can put such developers completely out of business with the sheer cost of the litigation. So it must be pretty depressing when you work your ass off to ship a product, a real product with everything that entails (engine programming, user interface programming, design, art assets, testing, bug fixing, support, etc.), only to receive demands for part of your revenue just because someone claims to have come up with a secondary part of your app first, this someone being potentially anyone with half of a quarter of a third of a case and richer than you, since you’d be out of business by the time the claim is found to be invalid. It must be doubly depressing when the alleged infringement comes from your use of a standard part of the platform, one that you should (and in fact, in the case of iOS in-app purchase, have to) use as a good platform citizen.

I have known about iconfactory.com and enjoyed their work for fifteen1, er, twelve years now; I use Twitterrific, I have bought Craig Hockenberry’s iPhone dev book, I follow him on Twitter and met him once. I know that the Iconfactory is an upstanding citizen of the Mac and iOS ecosystem and doesn’t deserve this. I am not familiar with the other defendants, but I am sure they do not deserve to be thus targeted, either.

So, to Craig, Gedeon, Talos, Corey, Dave, David, Kate and all the other Iconfactory guys and gals; to the fine folks of Combay, Inc; to the no less fine folks of Illusion Labs AB; to Michael; to Richard; to the guys behind Quickoffice, Inc.; to the people of Wulven Games; I say this: keep faith, guys. Do not let this get you down, keep doing great work, know there are people who appreciate you for it and support you. I’m supporting you whatever you decide to do; if you decide to settle, that’s okay, maybe you don’t have a choice, you have my support; if you decide to fight, you have my support; if you want to set up a legal defense fund to be able to defend yourselves, know that there are people who are ready to pitch in; I know I am.

And in the meantime, before the patent system in the USA gets the overhaul it so richly deserves (I seriously wonder how any remotely innovative product2 can possibly come out of the little guys in the USA, given such incentives), maybe we can get the major technology companies to stop selling their products in that infamous East Texas district (as well as the other overly patent-friendly districts), such that the district becomes a technological blight where nothing more advanced than a corded phone is available. I don’t think it could or would prevent patent lawsuits over tech products from being filed there, but at least it would place the court there in a very uncomfortable position vis-à-vis the district population.


  1. Let’s just say my memories of my time online in the nineties are a bit fuzzy; it’s a recent browse through digital archives that made me realize I in fact only discovered iconfactory.com in 1999.

  2. The initial version of that post just read “innovation” instead of “remotely innovative product”; I felt I needed to clarify my meaning

April Fools’ 2011

So, if you read my previous post before today… April Fools’! And not in the way you might think. This behavior of the iPad 2 is real, I did not make it up; I did indeed verify it this week. The joke is that I claimed to be surprised, hoping to make people believe this unexpected behavior was an April Fools’ joke. Posting strange-sounding yet true information on April the first—now that is the real prank.

It’s hard to tell how successful I was in tricking people into believing this was a joke; I did however get a few emails explaining (as I pretended to request) how such a thing was possible. Congratulations guys, you did not fall for it!

I completely expected this behavior of the iPad 2: I knew about ARM having a weakly ordered memory model, and have known for some time (this test code was prepared over the last few weeks, for instance). By pretending to be surprised, I attempted to raise awareness of this behavior, which many people are completely unaware of; indeed, programmers have rarely been exposed to weakly ordered memory systems so far: x86 is strongly ordered, and even if these programmers have worked on ARM, they have only worked on single-core systems so far (the only consumer hardware I know of that exposed a weakly ordered memory model is the various dual-processor PowerPC Power Macs, which are not very common, and back then Mac code was mostly single-threaded). I’ve been thinking about ways to raise this awareness for some time, but it was hard to figure out how, since it was pretty much a theoretical concern as long as no mainstream multi-core ARM hardware was shipping. But now that the iPad 2, the Xoom, and other multi-core ARM tablets and handsets have shipped, I can show everyone that this indeed occurs.

Later today or tomorrow, I will replace the contents of that post with a more in-depth description and a few references, in other words the post I intended to write in the first place, before I realized I could turn it into a small April Fools’ prank. It will be at the same URL; in fact, you might have noticed the slug did not really match the title, which I intended as a small hint that something was off…

Whether you thought the iPad 2 behavior was a joke, you knew this behavior was real but believed I was genuinely surprised, or you saw right through my feigned surprise, thank you for reading!

(On that note, I should mention I have been sloppy so far in checking what my spam filter catches, and my ISP deletes those messages automatically after one week. So if you ever wrote me and I never answered, this may be why. My apologies if this happened to you; please send the email again if you feel like doing so.)

ARM multicore systems such as the iPad 2 feature a weakly ordered memory model

At the time of this writing, numerous multicore ARM devices are either shipping or set to ship: handsets, of course, but more interestingly this wave of tablets, in particular the iPad 2 (but not only it), seems to be generally based around multicore ARM chips, be it the Tegra 2 from nVidia, the OMAP 4 from TI, etc. ARM multicore systems did exist before, as the ARM11 was MP-capable, but I’m not aware of it being used in many (or any) devices open for third-party development; this seems to be really exploding now with the Cortex A9.

These devices will also introduce many programmers, for the first time, to a surprising system behavior, one which, if it isn’t understood, will cause crashes, or worse.

Let me show what I’m talking about:

BOOL PostItem(FIFO* cont, uint32_t item) /* Bad code, do not use in production */
{ /* Bad code, do not use in production */
#error This is bad code, do not use!
    size_t newWriteIndex = (cont->writeIndex+1)%FIFO_CAPACITY; /* Bad code, do not use in production */
    /* see why at http://wanderingcoder.net/2011/04/01/arm-memory-ordering/ */
    if (newWriteIndex == cont->readIndex) /* Bad code, do not use in production */
        return NO; /* notice that we could still fit one more item,
                    but then readIndex would be equal to writeIndex
                    and it would be impossible to tell from an empty
                    FIFO. */
                    
    cont->buffer[cont->writeIndex] = item; /* Bad code, do not use in production */
    cont->writeIndex = newWriteIndex; /* Bad code, do not use in production */
    
    return YES; /* Bad code, do not use in production */
}

BOOL GetNewItem(FIFO* cont, uint32_t* pItem) /* Bad code, do not use in production */
{
#error This is bad code, do not use!
    if (cont->readIndex == cont->writeIndex) /* Bad code, do not use in production */
        return NO; /* nothing to get. */
        
    *pItem = cont->buffer[cont->readIndex]; /* Bad code, do not use in production */
    /* see why at http://wanderingcoder.net/2011/04/01/arm-memory-ordering/ */
    cont->readIndex = (cont->readIndex+1)%FIFO_CAPACITY; /* Bad code, do not use in production */
    
    return YES; /* Bad code, do not use in production */
}

(This code is taken from the full project, which you can download from Bitbucket in order to reproduce my results.)

This is a lockless FIFO; it looks innocent enough. I tested it in the following setup: a first thread posts consecutive integers slightly more slowly (so that the FIFO is often empty) than a second thread, which gets them and checks that it gets consecutive integers. When this setup was run on the iPad 2, in every run the second thread very quickly (after about 100,000 transfers) got an integer that wasn’t consecutive with the previous one received; instead, it was the expected value minus FIFO_CAPACITY, in other words a leftover value from the buffer.
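
For what it’s worth, the test harness amounts to something like the following sketch (a simplification of my own of the actual project, which is what you should use if you want to reproduce the results; names here are illustrative):

#include <pthread.h>
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>

static FIFO fifo; /* the structure used by the code above, zero-initialized */

static void* producer(void* unused)
{
    uint32_t next = 0;
    for (;;)
    {
        if (PostItem(&fifo, next))
            next++;
        usleep(1); /* post slightly more slowly than the consumer drains */
    }
    return NULL;
}

static void* consumer(void* unused)
{
    uint32_t expected = 0, item;
    for (;;)
    {
        if (!GetNewItem(&fifo, &item))
            continue; /* FIFO empty, try again */
        if (item != expected)
            printf("got %u, expected %u\n", item, expected);
        expected = item + 1;
    }
    return NULL;
}

int main(void)
{
    pthread_t prod, cons;
    pthread_create(&prod, NULL, producer, NULL);
    pthread_create(&cons, NULL, consumer, NULL);
    pthread_join(prod, NULL); /* neither thread returns; run until killed */
    return 0;
}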

What happens is that the system allows writes performed by one core (the one which runs the first thread) to be seen out of order from another core. So the second core, running the second thread, first sees that writeIndex was updated, goes on to read the buffer at offset readIndex, and only after that sees the write in the buffer to that location, so it read what was there before that write.

A processor architecture which, like ARM, allows this to happen is referred to as weakly ordered. This behavior might seem scandalous, but remember your code is run on two processing units which, while they share the same memory, are not tightly synchronized, so you cannot expect everything to behave exactly the same way as in the single-core case; this is what allows two cores to be faster than one. Many processor architectures permit writes to be reordered (PowerPC for instance); among other things, permitting this allows a significant reduction in cache synchronization traffic. While it also allows more freedom when designing out-of-order execution in the processor core, it is not necessary: a system made of in-order processors may reorder writes because of the caches, and it is possible to design a system with out-of-order processors that does not reorder writes.

x86, on the other hand, guarantees that writes won’t be reordered; that architecture is referred to as strongly ordered. This is not to say it doesn’t do any reordering: for instance, reads are allowed to happen ahead of writes that come “before” them, which breaks a few algorithms like Peterson’s algorithm. Since this architecture dominates the desktop, and common mobile systems have only featured a single core so far and thus don’t display memory ordering issues, programmers as a result have gotten used to a strongly ordered world and are generally unaware of these issues. But now that the iPad 2 and other mainstream multicore ARM devices are shipping, exposing for the first time a large number of programmers to a weakly ordered memory model, they can no longer afford to remain ignorant—and going from a strongly ordered memory model to a weakly ordered one breaks far more, and much more common, algorithms (like the double-checked lock and this naive FIFO) than going from a single processor to a strongly ordered multiprocessor system ever did.
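
To illustrate that last point, here is the textbook two-thread Peterson lock (a sketch of my own, not code from the original post); the store to flag[self] followed by the read of flag[other] is exactly a write followed by a later read, so even on x86 both threads can end up in the critical section at the same time unless a barrier is inserted between the two:

/* Textbook Peterson lock for two threads (self is 0 or 1); volatile only
   restrains the compiler, it does nothing about the processor's reordering. */
static volatile int flag[2];
static volatile int turn;

void peterson_lock(int self)
{
    int other = 1 - self;
    flag[self] = 1;   /* "I want in" */
    turn = other;     /* "but you go first if we both want in" */
    /* On x86, the read of flag[other] below may be satisfied before the
       store to flag[self] above becomes visible to the other core; both
       threads can then see flag[other] == 0 and enter at the same time. */
    while (flag[other] && turn == other)
        ; /* busy wait */
}

void peterson_unlock(int self)
{
    flag[self] = 0;
}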

Note that this can in fact cause regressions in already shipping iOS App Store apps (it is unclear whether existing apps are confined to a single core for compatibility or not) since, while very few iOS apps really take advantage of more than one core yet, some nevertheless will from time to time, as they are threaded for other reasons (e.g. to have tasks run in real time for games or audio/video playback). However, Apple certainly tested existing iOS App Store apps on the iPad 2 hardware and would have noticed if it caused many issues, so this probably only affects a limited number of apps and/or occurs rarely. Still, it is important to raise awareness of this behavior, as an unprecedented number of weakly ordered memory devices are going to be in the wild now, and programmers are expected to make use of these two cores.

What now?

So what if you have a memory ordering issue? Well, first, you don’t necessarily know that it is one; just like with threading bugs, the only thing you know is that you have an intermittent issue, and you won’t know it is memory ordering related until you find the root cause. And if you thought threading bugs were fun, wait until you investigate a memory ordering issue. Like threading issues, the scenarios in which memory ordering issues manifest themselves occur rarely, which makes them just as hard (if not harder) to track down.

To add to the fun, the fact your code runs fine on a multicore x86 system (which practically all Intel Macs, and therefore practically all iOS development machines, are) does not prove at all that it will run correctly on a multicore ARM system, since x86, as we’ve seen, is strongly ordered. So these memory ordering issues will manifest themselves only on device, never on the Simulator. You have to debug on device.

Once you find a plausible culprit in the code, how do you fix it (since often the only way to show the root cause is where you suspect it is, is to fix the code anyway and see if the symptoms disappear)? I advise against memory barriers; at least with threading bugs, you can reason in terms of a sequence of events (instructions of one thread happening, one thread interrupting another, etc.); with memory ordering bugs there is no longer any such thing as a single sequence, each core has its own; as in Einstein’s relativity, simultaneity in different reference frames is now meaningless. This makes memory ordering issues extremely hard to reason about, and the last thing you want is to leave one incorrectly resolved: it’s neither done nor to be done.

Instead, what I do is lock the code with a mutex, as it should have been in the first place. On top of its traditional role, the mutex ensures that a thread that takes it sees the writes made before the mutex was previously released elsewhere, which takes care of the problem. Your code won’t be called often enough for the mutex to have any performance impact (unless you’re one of the few working on the fundamental primitives of the operating system or of a game engine, in which case you don’t need my advice).
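
Concretely, a fixed version of the FIFO above looks something like this sketch (mine, not code from the downloadable project; it assumes the same FIFO_CAPACITY constant and BOOL type as above, and the mutex field must of course be initialized, for instance with PTHREAD_MUTEX_INITIALIZER):

#include <pthread.h>
#include <stddef.h>
#include <stdint.h>

typedef struct {
    uint32_t        buffer[FIFO_CAPACITY];
    size_t          readIndex, writeIndex;
    pthread_mutex_t mutex;  /* protects all of the above */
} LockedFIFO;

BOOL PostItemLocked(LockedFIFO* cont, uint32_t item)
{
    BOOL posted = NO;
    pthread_mutex_lock(&cont->mutex);
    size_t newWriteIndex = (cont->writeIndex + 1) % FIFO_CAPACITY;
    if (newWriteIndex != cont->readIndex)
    {
        cont->buffer[cont->writeIndex] = item;
        cont->writeIndex = newWriteIndex;
        posted = YES;
    }
    pthread_mutex_unlock(&cont->mutex); /* releasing the mutex publishes the writes */
    return posted;
}

BOOL GetNewItemLocked(LockedFIFO* cont, uint32_t* pItem)
{
    BOOL got = NO;
    pthread_mutex_lock(&cont->mutex); /* taking the mutex makes prior writes visible */
    if (cont->readIndex != cont->writeIndex)
    {
        *pItem = cont->buffer[cont->readIndex];
        cont->readIndex = (cont->readIndex + 1) % FIFO_CAPACITY;
        got = YES;
    }
    pthread_mutex_unlock(&cont->mutex);
    return got;
}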

For new iOS code, especially code meant to run on more than one core at the same time, I suggest using Grand Central Dispatch, and using it in place of any other explicit or implicit thread communication mechanism. Even if you don’t want to tie yourself too much to iOS, coding in this way will make the various tasks and their dynamic relationships clear, making any future port easier. If you’re writing code for another platform, try to use similar task management mechanisms; if they exist, they’re very likely to be better than what you could come up with.
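
To give an idea (a minimal sketch of my own; ProcessItem and the queue label are made up), the producer/consumer exchange above can be expressed with a serial dispatch queue, with no shared FIFO and no mutex at all:

#include <dispatch/dispatch.h>
#include <stdint.h>
#include <stdio.h>

static void ProcessItem(uint32_t item) /* stand-in for whatever the consumer does */
{
    printf("got %u\n", item);
}

int main(void)
{
    /* A NULL attribute yields a serial queue: blocks run one at a time, in
       submission order, and GCD takes care of memory visibility between the
       submitting thread and the queue. */
    dispatch_queue_t consumerQueue = dispatch_queue_create("net.example.consumer", NULL);

    uint32_t item;
    for (item = 0; item < 100000; item++)
    {
        dispatch_async(consumerQueue, ^{
            ProcessItem(item); /* item is captured by value at submission time */
        });
    }

    dispatch_main(); /* in a real app, the run loop plays this role */
    return 0;
}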

But the important thing is to be aware of this behavior, and spread the awareness in the organization. Once you’re aware of it, you’re much better equipped to deal with it. As we say in France, “Un homme averti en vaut deux” (a warned man is worth two).

Here are a few references, for further reading:

This post was initially published with entirely different contents as an April Fools’ prank. In the interest of historical preservation, the original content has been moved here.

PSA: “previous” and “next” links in archives

(PSA, for the readers not familiar with this bit of US culture, stands for Public Service Announcement; these are similar to ads, except that instead of promoting a product, they serve to forward a message of public interest, like “Don’t do drugs”)

On the web, many archives can be browsed chronologically; I’m not just thinking of blogs here, as many web pages can be thought of as being part of an archive, webmail for instance. And more often than not, the links to do so are labelled “previous (item)” and “next (item)”. And therein lies the rub: “previous” will typically take you to a newer item, and “next” to an older item. Wait a minute…

Oh, I certainly see the faulty logic that leads there. It started innocently enough, with “next” denoting the link to the next page to visit in order to see more items in the archive, but then “previous” came in for the link going in the opposite direction; and now many sites are content with this.

But there are much better choices. One, for instance, is “earlier” and “later”; “earlier” has the advantage of being relatively positive (as compared to, say, “older”), which is good for encouraging the reader who has reached the bottom of the front page to dig deeper into your archives.

Now why am I calling attention to this? Because I am guilty of it myself. Indeed, when I posted “Raising the Level of Discourse” my blog gained a second page. I immediately went to check it and saw that it indeed features “Previous Page”. The technical reason is that it seems to be the default for the theme I’m using, but that’s hardly an excuse; even if I can’t modify the theme myself (I’m on WordPress.com and have to use the available themes; I can customize one or two things and add CSS, but that’s it), I have to own up to my choices: a good craftsman does not blame his tools but, if they are not up to the task, either fixes them himself, gets them fixed, or changes tools altogether. However, this takes time, which I haven’t taken for this yet, so in the meantime it is still “Previous Page”; I apologize for the inconvenience.

The issue was resolved when I switched to a different theme in the beginning of 2012; I’m keeping this post for historical interest. – May 22, 2012

So it’s true

So it’s true. Along with announcing support for subscriptions, Apple has confirmed the policy changes that many suspected were behind the rejection of the Sony Reader app: apps can no longer link to a place to buy content for the app (there can still be such a place, it just must not be linked from the app), and must instead offer an in-app purchase that is at least as advantageous (Apple first released a press release announcing support for subscriptions and this new policy as it would apply to them, then confirmed the new policy would also apply to all paid content used by apps, not just subscriptions).

In the way it’s presented in the quote attributed to Steve Jobs in the press release, it sounds like a wager between two gentlemen: a friendly, if interested, contest to see who can bring the most people in, everything else being otherwise equal. That the publisher earns less in one case, yet maintains equivalent prices, is not unheard of, either: for instance, books sold directly often have the same price as on Amazon or in a bookstore (setting aside any shipping, of course), even though the retailer (and this includes Amazon) takes quite a bit of margin. In fact, in some cases like the tabletop gaming world, publishers have a store on the web but downright encourage customers to buy from their friendly local game store, because they know the benefits these places provide. So on the face of it, this looks reasonable.

Except for this: what value, exactly, does Apple bring to the table here? For in-app purchases that unlock features, there is a quite justified case to be made that, since Apple distributes the app with the feature potentially present, just not unlocked yet, the hosting, screening, and to an extent, DRM services provided by Apple apply as well to in-app purchases, justifying the same 30% cut. However, none of these apply to content in-app purchases: new content can be added all the time, without the need to send a new binary to customers (as an aside, I wonder when Apple gained the ability to let in-app purchases be added without a new binary), so this content does not come out of Apple’s bandwidth, never goes through the approval process, and likely has its own DRM.

This leaves payment processing. Now don’t get me wrong, iTunes is quite an awesome payment processing system. Back when I was a teenager I really wanted to pay for shareware, but I had no way to do so, and my parents refused to do so for me, so I played pretend by printing the order form (this was even before we had Internet access). But a teenager today can buy a prepaid iTunes card with his pocket money pretty much anywhere, or use iTunes credit gifted by his parents, and buy apps for his iPod Touch, the family iPad, or the family Mac. This is awesome. So I think Apple can justify having a payment processing fee slightly larger than, say, that of PayPal. But most definitely not 30%.

Oh, there is, of course, the immense privilege of your app being allowed to run on Apple devices, but it has been established again and again that it is a deadly sin to make it difficult for people to build on top of your platform, because a platform is only as good as what has been built on top of it. No successful platform has ever collected royalties from applications, except for game consoles, but you have to remember consoles are sold at a loss or zero margin (recouped on the games), which is not exactly the case for iDevices.

The end result is that, contrary to the cases I mentioned earlier where publishers would earn less, but still earn money, through another retailer, this leaves publishers with the prospect of selling to Apple device owners at a loss, due to a fee disproportionate to the value Apple brings, as it is certainly larger than the share of profits the publisher can afford to spare for just payment processing (given that they still need to do pretty much everything else). Even if the publisher has a way to sell directly to these same consumers, and even if he were confident most of the sales would occur directly, uncontrolled selling at a loss is still way too big a risk for any publisher to take. I don’t think Apple is seeking rent as much as trying to establish a balanced system, but even with the best intentions they have set up a policy that will drive publishers away.

Even if you see no or little problem on Apple’s side with this new policy, consider the following: the end result of this, whatever the reasons or whichever way the faults lie, will be that the Apple “ecosystem” will have its own specific publishers (of books, movies, comics, etc…), either Apple itself or ones that are, in practice, dependent on Apple; publishers different from the ones used by the rest of the world. Is it really what you want? Is it really what Apple wants?

Back in the ’80s, Apple with the Mac was years ahead of everyone else, and made obscene profits exploiting its then-current strengths while positioning itself too narrowly, before this caught up with them in the end. While the mechanisms and the kind of partners (application developers/content providers) this is happening with are different, I wonder if that’s not what is going to happen with iOS as well.

First Impressions of the Mac App Store

I try to be original in the subjects I tackle, but if you are a Mac user, there is no escaping the Mac App Store, which is probably the most important thing to happen to the Macintosh platform since Mac OS X, at least. It remains to be seen whether its impact will be for good or for ill, but for now, I’ve given it a test drive.

Trial Run

After uneventfully updating to 10.6.6 and launching the Mac App Store application, I decided to buy Delicious Library to catalog my growing collection of webcomic books (it’s not as big as the one Wil Shipley pimps in the Delicious Library 2 screenshots, but I’m getting there), and of course to get a feel for how the Mac App Store works for a paid application download, not just a free one. This was when I encountered the first issue:

[Screen capture of a Mac App Store dialog in French, with the text cut off in the middle]

“effehargements”? I’m afraid I don’t know that word

Gee, things are off to a good start… I mean, did localizers get so little time to give feedback on the size of user interface elements that this couldn’t be fixed for release? Any other explanation I can think of would, in fact, be worse. I’m not going to dwell on it too much since it’s likely to be fixed soon, but it’s a bad first impression to make.

After logging in with my Apple ID as instructed, I was unsurprisingly told I had new terms to accept. Less expected is the fact that these terms are an extension of the iTunes Store terms and conditions; apparently the commercial relationship users of the Mac App Store have is an extension of the one most of us already have with iTunes, not an entirely new one or an extension of the Apple Online Store one. The main reason, I guess, is that they can use the credit card already associated with your iTunes account, and any iTunes Store credit you may have; plus, that way the Mac App Store benefits from the iTunes Store infrastructure (servers and such).

Of course, by the time I was done reading the terms, my session had expired; it’s as if they weren’t expecting you to read them… I’m noticing this everywhere, though, it’s not just Apple. So I logged in again, accepted the terms, and bought Delicious Library. As widely reported, the application then moved to the Dock with a nice, if slightly overdone, animation (sure, have an animation, but they could have used a simpler one), where it showed a progress bar while it downloaded, up until the download was complete, at which point it jumped once and stayed in the Dock (while, behind the scenes, it had been put in the Applications folder). This may seem gratuitous, but to me this is indispensable for the buying/downloading experience, as opposed to the disconnected experience of downloading software on the Web.

I then tried out Delicious Library, entering a few books, etc. (unfortunately, I do not have a webcam attached to my Mac Pro, so I had to enter the ISBNs by hand). I’m not going to get into a review of Delicious Library here; I just checked that the application was working correctly.

Then, I checked something I had been wondering about. Even though the Mac App Store only works on Mac OS X 10.6.6 onwards, this is not necessarily the baseline for the apps bought on the Mac App Store themselves: apparently, they can support earlier releases of Mac OS X, including Leopard. Obviously, they cannot be bought from there; they have to be transferred from a Snow Leopard Mac where you bought them. But I was wondering how the computer authorization process (documented in various articles, like Macworld’s hands-on, read just above “Work in progress”) would work on a Leopard machine where the Mac App Store cannot be installed.

So I took the Delicious Library application and moved it to my original MacBook, which remains on 10.5 for a variety of reasons (I don’t have another Mac with Snow Leopard on hand to test on pre-Mac App Store 10.6.5, unfortunately). When I connected my MacBook to the network (for the first time in a year), there was no update, which would have been necessary to add such support. And when I tried to run Delicious Library, this is what I got:

[Delicious Library crash report, listing an “unknown required load command 0x80000022”]

Uh oh, Wil

This error is, in fact, not related to the Mac App Store at all; it seems instead that the application relies on some other Snow Leopard-only feature, probably by mistake. Apparently, this build was never tested on Leopard. I double-checked, and the application does declare it can run on Leopard in the Mac App Store application, as well as in its property list (from which the Mac App Store information was probably generated). So, I went looking for a free app that would run on Leopard; Evernote fit the bill, so I downloaded it and transferred it. It ran without problem; however, being a free app, it did not need to validate its license on the MacBook or anything of the sort, so I would have to test with a paid app. Osmos declares it runs on Leopard (as early as Tiger, in fact, though it’s Intel-only, so not on a PowerPC machine), so I bought it (the things I’ll do for you people) and transferred it. But it didn’t run any better than Delicious Library, though for a different reason (it required a version of libcurl more recent than the one found in Leopard). So, it’s another app that hasn’t actually been tested on the baseline Mac OS X version it claims it supports, great. I stopped the expense there.

Note that a large majority of paid apps actually require Snow Leopard, if their Mac App Store listings are to be believed. I’d wager that none of the paid apps that declare otherwise were actually tested on Leopard, and that all paid apps actually require Snow Leopard, and probably 10.6.6, to run correctly; anyone care to confirm otherwise? I have no intention of spending a bunch of money to test that theory.

General Criticism

Besides the events of this run, I want to make more general observations on the Mac App Store. Contrary to the music, movie, book, comic book, etc. industries, where digital distribution is a relatively new phenomenon, people have been selling computer software over the network (not even necessarily the Internet back in those days: think CompuServe, AOL, the numerous BBSes…) – and making a living out of it – since the beginning of the nineties, if not earlier. And yet, even after 20 years, for the majority of Mac users the act of buying software still means the brick and mortar store, or at best, a mail-order store like Amazon. There is no household-name software that’s distributed mostly digitally, except for some open-source applications like VLC or Firefox, expander/viewer/reader companion apps, and rare successes like… uhh… I’m sure I’ll think of one eventually.

Welp, while my questioning of whether such software existed was rhetorical at the time, it turns out there is a piece of software that in fact qualifies: Skype; its unusual business model is irrelevant: indisputably, it is commercial software mostly distributed digitally. This goes to show that when you solve a very practical problem with killer tech, you can overcome the barrier between digital distribution and the mainstream Mac market; that being said, it remains a hard problem for anyone else. – May 22, 2012

I’ve said the Mac App Store is probably the most important thing to happen to the Macintosh platform since Mac OS X, and that’s because it promises to provide, at last, a way to distribute software outside of the brick and mortar stores that the rest of us will actually use; this, in turn, will allow developers who do not have the means to distribute their products in stores to reach the majority of Mac users; of course, virtual or physical, shelf space and attention remain limited, but now we can avoid a hugely inefficient step in the middle.

Since the Mac App Store will set the expectations of Mac users for years to come, how it works, what it allows users to do, the kind of software found on it, etc., are extremely important, not just for Apple in the short run, but for the health of the platform many years down the line. To me, the Mac App Store delivers in the main area it was supposed to: providing a great, integrated, end-to-end buying/downloading experience. However, it falls short to a greater or lesser extent in all other areas.

Let’s begin with the design. It’s a straight port of the “App Store” iPad app. Really, couldn’t they have done better? Surely they could have made better use of the space afforded by the desktop, instead of using the strict iPhone/iPad tabbed design. Why devote an entire tab to updates; couldn’t this be put in a notification area visible in all modes? And breadcrumbs? Are they forbidden now? But the worst is surely that weird window title bar, with no title, and the stoplight window controls in the center left of the bar; I mean, is space at such a premium that they couldn’t have gone with a traditional unified title and toolbar design? It would have worked very well with the Panic toolbar design! To add insult to injury, these “toolbar tabs that go to the top of the title bar” are actually click-through! Argh! Now not even the top center of a window is safe for clicking (the worst thing is, I was already instinctively avoiding them when clicking to bring the Mac App Store window to the foreground, showing how thoroughly pervasive click-through has already damaged my computer habits).

As I’ve said, the Mac App Store provides a good and, more importantly, connected experience, from the buying intent to the moment the app is ready to use in the Dock. I’ve heard some complain about this automatic Dock placement, but to me this is not a problem, or to be more accurate, this is not the problem with it. If you think about it, this policy actually is the most efficient: as it stands, if, in the minority of cases, you don’t want to regularly use the application you just bought, you can just drag it out of the Dock; otherwise, you do nothing. The alternatives are not placing it automatically, in which case, in the majority of cases, you are going to fetch the newly bought app from the Applications folder to put it in the Dock (which is more work than dragging an application out of the Dock), or asking you each time, in which case every time you have to read a dialog, choose on the spot, and click; and let’s not even mention having a preference for it. The same goes for automatic placement in the Applications folder. Yes, I know browsers have these kinds of options (Downloads folder/Desktop/ask each time), but that’s mostly because of historical reasons, your not necessarily expecting to be downloading something, and the wide variety of things you could be downloading from a browser.

But that’s from the perspective of an experienced user. For user actions, and doubly so when actions are made to happen “automatically” like that, the way to undo the action should be obvious from the way it was done (or shown; animations are very useful for that); I don’t mean the way to reverse it immediately if the action was a mistake (the undo command is here for that), but the way the action can be reversed later if so desired. Here the way to “undo” the Dock placement remains reasonably obvious (drag it out of there), but users are going to think doing so gets rid of the application. Besides the fact this is not the case, users will be reluctant to move applications out of the Dock for fear of not being able to find them again, and will keep them all in there. Yeah, I know, Mac OS X Lion and the Launchpad are supposed to solve that, but they’re not there yet, and in the meantime, the Mac App Store is here and users will use it. People do not get confused by however complex the system is underneath (do many even suspect that applications are in fact folders containing the binary and support files? No.), but by “innovations” that purportedly simplify some aspect of the task while leaving some or most of the complexity still visible elsewhere.

Besides the way the Mac App Store application currently works, there are issues with what the Mac App Store enables, or to be more accurate, does not enable.

For a long time, developers have asked for a way to update their applications as part of the (Apple Menu)→Software Update command, or to get access to the crash reports from their applications that were sent to Apple (though the issue was more that the “send the crash report to Apple” feature gave the expectation that the developer could do something about it or was notified of the issue). But I’ve always felt that this could not be done without Apple and the developer being in a tighter relationship, because of the potential spoofing and security issues that could occur; and now the Mac App Store is that relationship. However, there are Mac App Store improvements that could be given to non-Mac App Store applications (and it’s in the long-term best interest of the Mac platform that this option remains viable); for instance, it’s been a long time since distribution through a disk image was cutting edge, and Installer packages are too interaction-heavy. It’s not possible to have one-click download and installation from the web for obvious security reasons, but couldn’t Apple make available an application packaging method that could be downloaded, then, when double-clicked, would ask something close to the quarantine question, possibly show an EULA (as disk images can do), then install the app in the Applications folder, show where it is, and discard the package, without any further interaction? I’m sure plenty of other improvements of the sort could be made.

I also take issue with many Apple policies around the Mac App Store. Many of them are the exact same complaints developers have had about the iOS App Store (with the exception, of course, of the inability to distribute applications outside of it; developers can distribute betas and other special versions of the application however they want); while they may seem like developer complaints that users shouldn’t care about, most of them disrupt the relationship between users and developers, resulting in a lose-lose situation. These issues are, among others: not inclusive enough (for instance, applications cannot install kernel extensions; why have that facility then?), no customer information whatsoever, a user “review” system that gives the expectation developers can give tech support through it even though they can’t, still no “unfiltered Internet access” rating, no support for upgrade pricing, and most of all, no real support for demos/try before you buy.

That last one is the most infuriating. Apple is showing all Mac users a more practical alternative to shrink-wrapped software stores, and the policy is still that you have to buy with your eyes closed, only on the basis of a description, a few screenshots, and “reviews” that… could be better? Frak! And don’t tell me this demo business is confusing; people get to try before they buy in real life all the time: with TVs, consoles, audio systems, etc. in the electronics store; with clothes, shoes, etc.; with cars at their auto dealer, etc., etc., etc. Do I need to go on? I’ve always been suspicious of the “experience optimized for impulse buying” argument for the absence of real demos on the iOS App Store (it seems to me there are already plenty of apps at impulse buy prices, so it would be a good idea to encourage non-impulse buy prices), but here on the Mac App Store it makes no sense at all. Oh, sure, developers can distribute a demo from their web site, but it feels about as disconnected as a broken wire. This, alone, will ensure that I will rarely, if ever, buy again from the Mac App Store; not because I’m going to go out of my way to avoid using it, but because I’ll always be afraid of wasting my money on something useless, as I never buy on impulse. Practically all the downloaded software I own, I bought after trying it, and I’m not going to start changing that now; that would be going… backwards, back to the time of brick and mortar stores, precisely those the Mac App Store is supposed to make obsolete.

By the way, in case you have an iOS device and want to encourage “try before you buy”, there is a simple way: go to the “Try Before You Buy” featured group on the App Store (there is an iTunes Store link badge for it), download 10-20 of the apps there that seem interesting to you, and try them out. That’s it, that’s all I’m asking of you: there is bound to be a few that you will like and for which you will buy the complete version; this, in turn, will send Apple the message that yes, we do want to try apps before we buy them.

I’m deeply torn about the Mac App Store; not just how it currently is, but the whole principle of it. As it currently is, it works and will be used without a doubt, while having a number of issues and setting a number of bad expectations. While I have no doubt many issues will be fixed, Apple has been pretty stubborn about some of them (I mean, for how long have we been asking for a trial system on the iOS App Store?). And there needs to be life outside the Mac App Store, but Apple seems utterly uninterested in improving anything there.