iOS 11 and its built-in App Store: weeks six and seven

Following the disruptive changes in iOS 11 and the fact we will have to use its redesigned App Store app going forward, I am diving headfirst and documenting my experience as it occurs on this blog. (previous)

Not much to report for these two weeks, as I’ve been very busy covering the Saint-Malo comics festival for Fleen, then writing up those reports, then making up for lost time at the day job…

Though in a sense, it is an experience I can report on, as I covered this festival, and in particular took notes, exclusively using an iPhone (5S), an iPad Air 2, and an Apple Wireless Keyboard, and the setup worked very well. Half of the reports themselves were also typed up on the iPad. The notes were taken, appropriately enough, in the Notes app, and reports typed up directly in the Mail app (I used one or two additional apps, e.g. iBooks to keep offline access to the festival schedule). It is hard to say how much I benefitted from the new system functionality, especially as it relates to multitasking, compared to what was already present in iOS 10, but it served me well; no regressions, at least.

  • I did have to relearn how to put two apps side by side (here Notes and Mail), but that was only a small learning bump.
  • The system generally does not allow pasting as plain, unformatted text… which is an issue when composing an email from data copied from many different sources. Get on that, Apple.
  • In the Mail app, the text editor would turn my quotation marks into French guillemets («»), and while I can explicitly pick straight quotes on the virtual keyboard with a long tap, I have not found any way to do so with a physical keyboard… So I left them in; my editor had to contend with them when editing my piece for publication.

iOS 11 and its built-in App Store: weeks four and five

Following the disruptive changes in iOS 11 and the fact we will have to use its redesigned App Store app going forward, I am diving headfirst and documenting my experience as it occurs on this blog. (previous, next)

  • Last time, I forgot to mention that not only did the version number go directly from 11.0 to 11.0.2, but the latter was itself quickly superseded by version 11.0.3 (a software update which I still perform through my Mac, out of an abundance of caution, and which regardless requires quite a bit of downloading). I wonder what happened there…
  • I use ellipses (…) quite often, and it took me some time to realize why I sometimes couldn’t find them on the iPad: they are gone from the French keyboard… I have to stick to the English one.
  • I haven’t managed to transfer old apps to older devices yet, but what I have done is uninstall a number of apps, especially ones that often get updated (Candy Crush, anyone?). This has resulted in a notable decrease in the number of apps I have to update at the end of the day, which is both a relief and a reduction in how much I have to download.
  • Speaking of which, the storage preferences attempt to show the date each app was last used, but this does not take into account use of the app through its extensions (a bundled keyboard, a share sheet extension, etc.).

iOS 11 and its built-in App Store: weeks two and three

Following the disruptive changes in iOS 11 and the fact we will have to use its redesigned App Store app going forward, I am diving headfirst and documenting my experience as it occurs on this blog. (previous, next)

  • I think they went a liiiiiitle overboard with drag and drop. Case in point: I often long-press in Mobile Safari to either preview a link or get the “title text” of an image (a very common practice in webcomics); but with iOS 11:

    1. I must wait even longer, because the browser has to allow for the possibility that the gesture is a drag, and
    2. I must also move even less during that time, because otherwise the browser will interpret it as a drag.
  • When tapping an HTTP link in-app, iOS now always goes to the relevant app if there is one (instead of the somewhat random behavior of iOS 10), and no longer offers to go to Mobile Safari (through the top-right app-plus-arrow button, the symmetric of the top-left one that is still there and allows returning to the app where the link was tapped). This is actually a regression for me: when using Twitterrific I sometimes want to go to the Twitter app (to vote in a poll tweet, for instance), but sometimes I do want to go to Mobile Safari, most often in order to open a new tab so that I can view the tweet later (e.g. it has a video and I don’t have my headphones on); in the latter case I now have to copy, manually switch to Mobile Safari, open a new tab, and paste-and-go. Another impact is that if the app refuses to load the content for some reason (e.g. Instagram refuses to do so if you are not logged in, which can happen for multiple reasons, such as multiple people using the iPad), there is no offer to go to Mobile Safari either, again requiring the manual process (though in my case I solved this by uninstalling Instagram).
  • Speaking of which, Mobile Safari on iPad now no longer groups tabs in tab browsing view. I am neutral on the change.
  • In the Calendar app, you can no longer leave the event name empty. I use this to record instances where I miss a radio broadcast (e.g. the train was going through a tunnel while it was airing): I just have to adjust the time and put the event in a dedicated calendar, which makes recording this quick and easy (important, so that I don’t end up forgetting). Now I also have to title those events “Nothing”. Small but annoying.
  • On the iPad keyboard, when in symbol mode the key in place of the “shift” key is titled “#+=”, as on the iPhone… but unlike on the iPhone, those characters are not present on the keyboard you get after pressing that key. Huh?

iOS 11 and its built-in App Store: week one

Following the disruptive changes in iOS 11 and the fact we will have to use its redesigned App Store app going forward, I am diving headfirst and documenting my experience as it occurs on this blog (next).

  • Syncs are obviously much faster now. Beforehand the routine would be:

    • start the download of all app updates just before I leave home for work (…provided I remember)
    • once back home from work, sync the iPhone (~10 minutes)
    • then sync the iPad (~10 more minutes)
    • then trigger a Time Machine backup, to be sure the device backups end up in there.

    Now it is more like:

    • once back home from work, sync the iPhone (~1 minute)
    • then sync the iPad (~1 more minute)
    • then trigger a Time Machine backup.
    • Meanwhile, tell my iPhone and iPad to download their app updates during the night.

    The undeniable advantage is that I can do these operations one right after another without interruption, whereas with the long syncs I would have to go do something else while iTunes did its thing to the iPhone or iPad… and often forget to launch the next operation until a few hours later.

  • I hate the new way the lock screen wallpaper comes into view, because for a split second you get the dreaded feeling that the midtones are too dark. Indeed, if you’re like me you have bad memories of images being transferred from a Mac to a PC without gamma correction and the midtones appearing too dark as a result (same if you’ve worked on video or image editing, or you’ve prepped files for printing, etc.). I suppose I’ll get used to it at some point.

  • The built-in QR code scanner works well. The implementation as a pseudo-notification is pretty nice for ensuring you pick the right code when multiple are nearby:

    Interface of the built-in QR code scanner in iOS 11, inside the Camera app

    (this is from a sheet of paper I put up to list the references of a few art pieces on my wall at work, in case my coworkers get curious)

  • I haven’t transferred the 32-bit-only apps to the respective older devices (iPhone 3GS and iPad 2) yet. The hardest part will be transferring user data; I will probably need to start from a backup, merge it with a backup of the old device (because some apps were already there), and reload that backup… fun stuff.

  • The new ability to offload apps (remove the app while keeping its data) is a godsend. Previously, in order to preserve game data (which took me a long time to obtain) weighing on the order of a few kilobytes (not even a megabyte), I was forced to keep 600 MB of app around, like an anchor tied to that data. No more.

    The Chrono Trigger app shown as taking up 649.3 MB (app) and 41 kB (data)

    Note: you must be signed in with your Apple ID in the App Store app for the option to appear (when I attempted to use it on the iPad, I had just logged out for unrelated reasons, and was wondering why the option would not appear as it did on the iPhone).

  • Animated GIFs put in the camera roll now do animate when you view them there. However, as a result you cannot edit them.

  • The Files app. This will be an ongoing section because I expect never to be done with it.

    Note that the Files app does not provide storage space by itself; rather, it is meant to unify the view across the different storage providers (iCloud Drive, Dropbox, Amazon Drive, etc.) on your device; I use Documents by Readdle. But it enables more than just this unified view.

    • A major change it enables is that you can now download unrecognized files, including blobs, from the browser to the Files app. Of course, the file still has to go into one of the storage providers, but the semantics are different from the traditional “open copy in…”: in the latter case, for instance, Documents by Readdle could decline to open the file, which would prevent me from saving it there, whereas through the “Save in Files…” option in the share sheet it is saved there without trouble.

    • However, there are limitations. In particular, Mobile Safari royally ignores the download attribute (rdar://problem/34745102/), including its value (blobs are saved with the name “unknown.dms”), as well as the “Content-Disposition: attachment” HTTP header (which is supposed to force the browser to download the resource, rdar://problem/34721730/): it always attempts to load the resource as a first step, which may be dangerous in some cases (a small sketch of a server setting that header follows at the end of this post).

    • Also, all the file management issues we’ve been asking Apple for years to improve were dumped wholesale into the Files app, in particular the presence of file type extensions:

      In the immortal words of John Siracusa: “Would the real unknown please step forward”. Come on, Apple!

    • However, if Mobile Safari is able to display or play the media, you seem to be out of luck: I have not seen any option to download it. Images you can save to the camera roll, but audio files, for instance, have no download option of any sort.

    Plenty more discoveries to follow, I am sure…
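
For reference, the “Content-Disposition: attachment” header mentioned above is easy to reproduce on the server side. Here is a minimal sketch (Python standard library only; the handler, file name, and payload are made up for illustration) of a response asking the browser to download the resource rather than render it, which is exactly what Mobile Safari was ignoring in my tests:

    # Minimal sketch of a server response carrying "Content-Disposition: attachment".
    # The handler, file name, and payload are hypothetical; this merely illustrates
    # the header Mobile Safari is supposed to honor by downloading the resource.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class AttachmentHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"some generated report"  # placeholder payload
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            # This header asks the browser to save the resource (as report.txt)
            # instead of attempting to display it.
            self.send_header("Content-Disposition", 'attachment; filename="report.txt"')
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8000), AttachmentHandler).serve_forever()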

iOS app management removed from iTunes (a first reaction)

Perhaps a bit lost in the noise of last week’s announcements was the release of iTunes 12.7, which removes iOS app management (oh, and ringtones, too): you can no longer buy iOS apps on the desktop, or update them, or sync the ones you bought to your device except in an ad hoc way.

I admit I was taken by surprise. I heavily use these features: in principle, I do not download apps or app updates directly to my iPhone or iPad (there are exceptions). If I am at home on WiFi, I figure I might as well use my Mac, and elsewhere I’d rather not eat into my cellular data cap, battery life, etc. Plus, I do find app browsing in the built-in (iOS) App Store app to be a substandard experience. Yes, some of us still don’t buy into the idea that the handheld device is necessarily self-sufficient; I mean, I’d very much like to see you add freely distributed music (which, as a result, is not in the iTunes Store) to your iPhone music library, or back up your iPhone to a non-Internet backup location, using solely the iPhone itself. As long as I can’t do that and have to sync, I might as well use sync for everything (and honestly, I don’t mind sync per se).

And of course, speaking as a developer-adjacent person, I have to wonder about the impact when potential customers who come across a link to an iOS app while browsing the web on their desktop… can no longer buy it there. There will be lost sales until Apple improves the situation (QR codes would be a start, for instance).


Now, thinking about it more, I may be able to live without this feature. I don’t reorder apps on the iPhone screen from iTunes; app thinning means that, even with two devices, my bandwidth use should actually be lower (compared to downloading the “fat” app or app update as I do today); I haven’t tried to keep superseded versions of apps just in case an update ruins one; I will search for and discover apps on the device if I have to (the main trouble in this scenario being, for me, the lack of free trials, and that is not changing); I haven’t switched devices in a long time (though that might change in the next few months)…

So I’ll try it out. After one last sync, I will update iTunes later today and see if I miss anything. Maybe iOS 11 will help, maybe it won’t, maybe Apple will improve the experience (I wouldn’t hold my breath; the iOS 11 feature set is already known, normally).


But as with yesterday’s post, my biggest worry is historical preservation. What happens, in the long run, to apps that are no longer sold in the iOS App Store? Will they only remain on the devices where they were bought, with no chance of ever being transferred to a different device? But I fear this is the last of Apple’s worries…

What benefits does iOS 11 get from being 64-bit only?

By now you have proooooooooobably heard that iOS 11 will consider apps that still have not been ported to 64-bit mode as obsolete. In practice, by refusing to run them.

Now, this post is not about how to port to 64-bit (I mean, if that is your concern, Apple has been encouraging you to do so for years now…), but rather about why Apple did so. Why obsolete perfectly good 32-bit code and apps? I do not have all the answers, but I have a few. Let us first see why 64-bit is the better choice if we have to choose between the two, and then why Apple chose not to maintain both.

Why is 64-bit only better than 32-bit only?

That one is an open-and-shut case: in this earlier post I already presented how the then-new iPhone 5S 64-bit environment was an overall benefit; and the benefits have only grown since then (as I wrote: “native 64-bit math is a plus for some specialized tasks and the future”), so there really is no question. If Apple had to drop one, it had to be the 32-bit environment.

Why not both?

hardware savings

In theory, Apple could save silicon area on their post-iOS 11 hardware designs (iPhone 8, 8+, and X) by omitting the parts of their processor design that serve only for ARM/A32 mode execution (which they have the power to do; remember, they design their own ARM CPUs now). Indeed, while the cleanup from ARM/A32 to ARM64 was not nearly as dramatic as x86 to x86-64, some instructions and instruction semantics were dropped, though how much this could save in terms of execution units is way beyond my expertise; more important, probably, are the savings in the instruction decode circuits: not only is there no need to support Thumb, but the instruction formats were completely overhauled between ARM/A32 and ARM64, the former being quite convoluted (plenty of non-uniform formats, one-off cases, and split fields).

In practice, I wonder if this is worth the trouble. I think ARM processors are meant to start up in 32-bit mode before being raised to 64-bit, anyway, and there may be additional compatibility constraints (e.g. with drivers, hypervisors). Even if they did take advantage of this, this is not the main driver.

software savings

That is where the real savings are. Through the equivalent of app thinning, Apple could already eliminate the 32-bit parts from their kernel and built-in applications, but they would still have had to provide the 32-bit slice of the library stack (everything from libSystem to UIKit) so that 32-bit apps could keep running. And that does take up some space on your iPhone or iPad storage (which I have not measured, to be honest)… but more importantly, this slice would take up space in RAM, next to its 64-bit equivalent (always present, since built-in apps use it), as soon as, and for as long as, a 32-bit app was running.

This is the message Apple has already been not-so-subtly sending users when warning that running 32-bit apps would slow down the device: iOS devices have traditionally been quite RAM-constrained, and even if that has eased a bit in recent years, any RAM savings are worth taking: they allow more tabs to remain active without having to be reloaded, more apps to remain frozen and only have to be (quickly) thawed instead of relaunched, etc., improving the overall experience. And so keeping the 32-bit library stack loaded in RAM on most iOS devices, right next to the 64-bit one, was starting to look like a waste of precious resources.

Was it worth it?

Heck if I know. I do not think I will be too affected, given the apps I own, but I am always worried about such obsolescence, especially from a digital preservation perspective. That being said, for the purposes of saving such history it is best to rely on a historical device (such as one that can’t be updated to iOS 11), because there are many other reasons why historical iOS software just stops running anyway. I keep my old iPhone 3GS for that purpose, and it is already loaded with a number of apps that simply don’t run any more on my iPhone 5S running iOS 10.

WWDC 2017 Keynote not-quite-live tweeting

(Times are GMT-7; the timestamps correspond to a real-time, though not live, viewing of the 2017 WWDC Keynote and Platforms State of the Union.)

Apple to phase out usage of Imagination Technologies GPU in iOS devices

Big news dropped recently: via Daring Fireball, we learn that Apple has notified Imagination Technologies that it will no longer be using their products in new iPhone, iPad, or iPod Touch designs within a 15- to 24-month timeframe.

The GPU has been the biggest driver and bottleneck of iOS performance for some time already, if not since the beginning then at least starting with the iPad and Retina devices, compounded when iPads became Retina themselves: iOS SoCs have long been characterized as bandwidth monsters (relative to other mobile devices), with most of that bandwidth connected to the GPU so that it can feed the screen pixels. It is the GPU that is mostly responsible for scrolling smoothness, for the number of layers you can have on screen before performance takes a dive, for the performance of games, etc. CPU performance improvements, by comparison, improve the iOS experience much less (mostly in the browser). If you’ve been curious enough to look at die teardowns of iPhone chips, for instance here for the iPhone 7, you know the GPU can take as much space as the multiple CPU cores, and on iPads a truly outrageous amount of silicon area is taken by the GPU alone. And you are more than aware of Apple’s reliance on graphical effects (not just partial transparency, but now also translucency, blurs, etc.) in the iOS interface, all of which are generated by the GPU. So the GPU on iPhones and iPads is of strategic importance.

If you need a refresher: Apple has been using PowerVR GPUs from Imagination ever since the original iPhone. More than that, though, it is the only outside technology (and a significant one, at that) that is and has always been an explicit dependency for iOS apps: readers of this blog don’t need to be reminded of Apple’s insistence on owning every single aspect of the iOS platform (if you missed the previous episodes, most of it is in my iPhone shenanigans category) so as not to let anyone (Microsoft, Adobe, whoever) get leverage over them, but graphical technologies have been a notable exception, being more than mere software. For instance, while Apple uses OpenGL ES, and now Metal, to abstract away the GPU, a number of PowerVR-specific extensions have always been available and Apple encouraged their use. Even if Apple has recently tried to wean developers away from these extensions, and stopped advertising the GPUs to developers as being PowerVR products (starting with the A7/iPhone 5S, if I recall correctly), iDevices are still using Imagination products, and PVRTC (as in PowerVR Texture Compression) textures are still a common sight in the bundles of iOS games and other apps, for instance.

So the first challenge here is the dependency on these extensions. I don’t see Apple getting developers to make such a transition so quickly, especially as the first devices without Imagination tech are going to be available 12 months before the deadline (the iOS product lines have become too complex to perform the hardware transition all at once), which would leave developers 3 to 12 months to transition… So most likely, Apple is going to have to keep supporting those extensions, and this is going to expose them to intellectual property issues (patents or otherwise). Beyond the extensions developers explicitly use, there are all the performance characteristics and tradeoffs specific to PowerVR that iOS games have unwittingly become dependent upon (e.g. whether to use complex geometry or compensate with shaders, how best to obtain some effects, etc.), which Apple would have to reproduce as closely as possible, or at least not regress on, in a new GPU.

And even if Apple started from a blank slate as far as third-party software is concerned, it has many technological challenges to overcome. Much like audio and video codecs, graphics processing technologies are patented to the hilt; but unlike audio/video codecs, there is no FRAND licensing, no patent pool, no single licensing counter for GPU tech; instead, existing GPU companies live in an uneasy truce, given that they are all exposed to each other’s patents. And mobile GPUs are a particular breed within this universe, with techniques adapted to mobile constraints, like Tile-Based Deferred Rendering (present in all PowerVR GPUs). Apple has managed to build its own CPU with great success, so I have little doubt that it will manage to develop its own GPU, especially given its expertise in SoC design as well. But I also see patent royalty payments in Apple’s future.

So what does this mean for iOS developers? For now, nothing. There is nothing to justify scrambling to remove any PowerVR dependency at this point, and it is pointless to second-guess the performance characteristics of these future Apple GPUs. Best to wait for Apple to come forward. But there is some transition ahead, because at least some long-held assumptions about how iPhone graphics work are going to be challenged when the new Apple GPU eventually appears. If anything, I am surprised that such a glaring external dependency in the iOS platform managed to remain for so long, and it will be interesting to see how this plays out and how Apple manages any necessary transition.

See also: Ryan Smith’s take at AnandTech, a reference on the subject.

APFS’s “Bag of Bytes” Filenames (Michael Tsai – Blog)

I have sooooooooo many questions. I mean, first I have the same ones as Michael, but on top of that:

  • “bag of bytes”, but I hope at least that the file name, even if not normalized, is guaranteed to be valid UTF-8, right? Right? Right?
  • In some circumstances, it is possible for the user to type the beginning of a file name to select or at least winnow the file selection; is there going to be guidance on how to perform this?
  • Sorting file names for display. Oh, the fun we shall have with sorting. Again, will guidance/a standard function be provided?
  • Normally this should result in fewer issues for software that writes a file name as any valid UTF-8 string and then expects a file with that exact name to be in the directory listing, as that will at least be the case more often (I must admit I don’t fully understand the issue that led to the Apple response in the first place, though I understand the Apple response even less). However, when performing manipulations with NSString/NSURL/Swift String, do those preserve composition enough that developers can rely on them for that? (See the small sketch after this list.)
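
To make the composition question concrete, here is a small Python sketch (an illustration of Unicode normalization in general, not of APFS itself; the file name is made up): the same visible name can be spelled with different code points, and a “bag of bytes” namespace keyed on the raw bytes will treat the two spellings as distinct unless someone normalizes on both the writing and the reading side.

    import unicodedata

    name_nfc = "resum\u00e9.txt"                        # "resumé.txt" with a precomposed é (NFC)
    name_nfd = unicodedata.normalize("NFD", name_nfc)   # same visible name, é as e + combining accent

    print(name_nfc == name_nfd)         # False: different code points
    print(name_nfc.encode("utf-8"))     # b'resum\xc3\xa9.txt'
    print(name_nfd.encode("utf-8"))     # b'resume\xcc\x81.txt'

    # A byte-keyed namespace sees two different names; software that writes one
    # spelling and looks up the other will only find it if both sides normalize:
    print(unicodedata.normalize("NFC", name_nfd) == name_nfc)   # True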

Now, granted, I know two people this will make happy (or, OK, less unhappy)…

EDIT: One additional data point about this is that, in a similar situation, even Apple doesn’t get it right (coincidentally, fixed in Safari 10.1 and iOS 10.3). Let me tell you, this issue was a bear to isolate.

I admit:

  • I have no idea where this was in Safari, though it is safe to say Apple has responsibility for that code,
  • Safari is already compensating for invalid data, the URL should be properly escaped in the first place, and
  • this is when using HTTP, not the filesystem.

Nevertheless, this shows that Apple themselves sometimes get it wrong and normalize strings in a way that causes issues because the underlying namespace has a dumb byte string as its key. So if they can get it wrong, third-party developers will need all the help they can get to get it right.

EDIT: New info: there will be a case-insensitive variant for the Mac, which will also behave differently with regard to normalization.

I think “normalization-preserving, but not normalization-sensitive” means that (like HFS+ on the Mac, unlike APFS on iOS) you cannot have multiple files whose names differ only in normalization. And you can look up a file using the “wrong” normalization and still find it. Additionally, beyond what HFS+ offers, if you create a file and then read the directory contents, you’ll see the filename listed using the same normalization that you used.

This is my interpretation as well.
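
To make that reading concrete, here is a toy model in Python of what “normalization-preserving, but not normalization-sensitive” would mean (a sketch of the semantics as I understand them, not of APFS; all names are made up): names are stored exactly as given, but uniqueness checks and lookups go through a normalized form.

    import unicodedata

    class ToyDirectory:
        """Toy model: store names as given, compare them in normalized form."""

        def __init__(self):
            self._entries = {}  # normalized name -> (name as given, contents)

        def create(self, name, contents):
            key = unicodedata.normalize("NFC", name)
            if key in self._entries:
                # Two names differing only in normalization collide.
                raise FileExistsError(name)
            self._entries[key] = (name, contents)

        def read(self, name):
            # Looking up with the "wrong" normalization still finds the file.
            return self._entries[unicodedata.normalize("NFC", name)][1]

        def listdir(self):
            # The listing shows the exact spelling the creator used.
            return [stored_name for stored_name, _ in self._entries.values()]

    d = ToyDirectory()
    d.create("caf\u00e9", b"...")                                  # created with a precomposed é
    print(d.read(unicodedata.normalize("NFD", "caf\u00e9")))       # found despite the other spelling
    print(d.listdir())                                             # ['café'], spelled as created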

Curtain update

I took advantage of the recent update to JPS to experiment a bit with Curtain. I significantly retooled it toward one goal: separating the generation of the deployment package from the deployment itself.

While the initial version of Curtain benefitted from many influences, one I completely forgot to take into account was Alex Papadimoulis’ teachings, more specifically those about release management and database changes. Especially the commandment that builds be immutable, and that what gets deployed to production be the same thing that was deployed to the earlier environments.

When I recently re-read those two articles for inspiration at work, I thought: “Uh, oh.”

Indeed, with Curtain the deployment process is a function not only of the revision that we ultimately want there, but also of what was previously there, in order to support proper rollover of resources (itself necessary because of offline support). And as originally designed, Curtain would just adapt its deployment to what was previously there, which means that, if I wasn’t careful and did not double-check that staging was properly rolled back to what is present in production (which, let’s admit it, we’ve all neglected at some point), then the Curtain deployment to staging would not be representative of the eventual deployment to production. Oops.

So Curtain has been updated so that, rather than performing the deployment itself, it generates a package containing the generated files; this package doubles as a Python script which, when invoked, performs all the deployment steps on the target of choice. The script itself is dumb and makes no decisions, so it can be invoked multiple times and always perform the same job; but before operating, it also checks that the data previously present corresponds to the expectations it was generated with. That way, we can use the same script multiple times, once on staging and once on production, and be certain that the two deployments will be the same. And Alex will be happy.
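
To give an idea of the shape this takes, here is a heavily simplified sketch of such a generated script (the manifest format, file names, and digests are hypothetical placeholders, not Curtain’s actual ones): the expectations are baked in at generation time, and the script refuses to operate on a target that does not match them.

    #!/usr/bin/env python3
    # Sketch of a generated, self-contained deployment script in the spirit
    # described above. Paths, manifest, and digests are made-up placeholders.
    import hashlib
    import shutil
    import sys
    from pathlib import Path

    # Baked in when the package was generated: what the target is expected to
    # contain beforehand, and the files (shipped alongside this script) to install.
    EXPECTED_BEFORE = {
        "assets/app-v41.js": "placeholder sha256 digest recorded at generation time",
    }
    FILES_TO_DEPLOY = ["assets/app-v42.js", "index.html"]

    def sha256(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def check_target(target: Path) -> None:
        # Refuse to run if the target does not match the state this package was
        # generated against, so staging and production get the exact same deployment.
        for relative, expected_digest in EXPECTED_BEFORE.items():
            existing = target / relative
            if not existing.exists() or sha256(existing) != expected_digest:
                sys.exit(f"target does not match expectations: {relative}")

    def deploy(package_dir: Path, target: Path) -> None:
        check_target(target)
        for relative in FILES_TO_DEPLOY:
            destination = target / relative
            destination.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(package_dir / relative, destination)

    if __name__ == "__main__":
        deploy(Path(__file__).resolve().parent, Path(sys.argv[1]))

The real script does more (rollover of versioned resources, in particular), but the principle is the same: decisions at generation time, verification and dumb copying at deployment time.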

One more thing. In my initial post, I also completely forgot to mention another influence: Deployinator. Many aspects of Curtain come from Deployinator: deployment as a single operation, deploying assets as a layer separate from code, versioning these assets as part of the URL, etc. The lessons from Deployinator were so obvious to me that it did not even occur to me to mention where they came from. That omission has now been repaired.