Eric Roller's Development Blog


On the development for iPhone using iOS.

Framing iPhone 14 screenshots

- Posted in iOS by

When trying to place generated screenshots into device frames (automated by fastlane's frameit), I discovered that, since the iPhone 14 devices have small bezels, the corners of the screenshot can be seen spilling outside the template frames for these devices:

trim_devices = [
    "iPhone 14 Plus",
    "iPhone 14 Pro",
    "iPhone 14 Pro Max",
]

I solved this by cutting the corners off the screenshot images before running frameit. This can be done with three ImageMagick commands, which apply a rounded-rectangle mask with 50-pixel-radius corners:

`convert '#{file}'
    -format 'roundrectangle 1,1 %[fx:w+4],%[fx:h+4] 50,50'
    info: > mask.mvg`
`convert '#{file}'
    -alpha transparent
    -background none
    -fill white
    -stroke none
    -strokewidth 0
    -draw "@mask.mvg" mask.png`
`convert '#{file}'
    -alpha set
    -bordercolor none
    mask.png
    -compose DstIn
    -composite '#{file_name}'`

In my setup, I am operating in a temporary directory (with a language subdirectory, e.g. "en-US"):

Dir.mktmpdir do |tmp|
  path_temp = File.join(tmp, language)
  puts "Operating in #{path_temp}"

  num_found = 0
  # ...(see below)...

Then for each of the devices, depending on whether it is a large iPhone 14 device or not, I either create a corner-cropped image in the temporary directory or place a copy of - or rather a symbolic link to - the original image:

path_shot = File.absolute_path(File.join(
      "..", output_directory, language))

run_devices.each do |device|
  Dir.glob(File.join(path_shot, device + "-*.png")).each do |file|
    file_name = File.basename(file)
    Dir.chdir(path_temp) do
      if trim_devices.include? device
        # Cut off rounded corners using ImageMagick:
        puts "Rounded frame for: #{file_name}"
        `convert '#{file}' -format 'roundrectangle 1,1 %[fx:w+4],%[fx:h+4] 50,50' info: > mask.mvg`
        `convert '#{file}' -alpha transparent -background none -fill white -stroke none -strokewidth 0 -draw "@mask.mvg" mask.png`
        `convert '#{file}' -alpha set -bordercolor none mask.png -compose DstIn -composite '#{file_name}'`
        Dir.glob("mask.*").each { |mask| File.delete(mask) }
      else
        puts "Linking frame for: #{file_name}"
        File.symlink(file, file_name)
      end
    end
    num_found += 1
  end
end

After adding Framefile.json as well as any other needed files to the temporary directory, frameit can be called to do its work there:

frameit(
    use_platform: "IOS",
    path: tmp
)

Thereafter the "*_framed.png" images need to be copied before the temporary directory is deleted. Here is an example where the result frames are placed in a separate "en-US_framed" directory:

path_fram = File.absolute_path(File.join(
    "..", output_directory, language + "_framed"))

Dir.glob(File.join(path_temp, "*_framed.png")).each do |file|
  file_name = File.basename(file).sub(/_framed/, "")
  puts "Frame generated: #{file_name}"
  FileUtils.cp(file, File.join(path_fram, file_name))
end

MC-Timer UI Design

- Posted in iOS by

For a new version of MC-Timer, I have been looking at reorganising the playback screen, especially the text pointed out here:

MC-Timer progress text below the rings. Song: Sia - The Greatest

Being near the bottom, I don't think people look at this text much, and combining step values with timing values is probably confusing.

So maybe it will look better when annotated directly over the progress rings? Thankfully SwiftUI makes it easy to experiment with different layouts.

MC-Timer progress text on the rings

Yes, this helps in understanding the values, but I had to drop "Music", "Pause", and "Countdown", which did not look good as curved text. However, the overall aesthetic suffers tremendously.

So it is probably better to keep the texts in the top corners:

MC-Timer progress text in the top corners

This looks much better, and grouping the timing values on the left vs. the step values on the right also helps. There is no additional explanation of what the "8 / 121" values mean, but it becomes apparent whenever the red progress bar increments.

But the wide "15s Pause" text spilling over into the top of the red ring is not ideal, so I will split it into two lines like this:

MC-Timer split progress text in the top corners


My app "MC-Timer" supports playing music from a mixture of sources: Apple Music curated playlists, Apple Music catalog playlists, Apple Music songs, library playlists, as well as individual songs from your media library.

Playback for songs from your media library and for those streamed from Apple Music works well with the iOS media player.

However, for items in your own "catalog playlist", the Apple Music API will return playParams in the JSON response that may look like this:

"playParams": {
    "id": "i.DVENxPRTdlJoV",
    "kind": "song",
    "isLibrary": true,
    "reporting": false,
    "purchasedId": "253867849"

By the way, parsing this into a dictionary of type [String: Any] is a huge pain, and I wish the media player API could just accept the JSON as is. Apple, please add: MPMusicPlayerPlayParameters(json: String).
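As a minimal sketch of that parsing step, one can dig the "playParams" dictionary out of the response with JSONSerialization; the response string below is a shortened, hypothetical example, and the resulting [String: Any] is what MPMusicPlayerPlayParameters(dictionary:) expects:

```swift
import Foundation

// Extract the "playParams" dictionary from an Apple Music API JSON response.
func extractPlayParams(fromJSON data: Data) -> [String: Any]? {
    guard let root = (try? JSONSerialization.jsonObject(with: data))
            as? [String: Any] else { return nil }
    return root["playParams"] as? [String: Any]
}

// A shortened, hypothetical response:
let response = """
{ "playParams": { "id": "i.DVENxPRTdlJoV", "kind": "song", "isLibrary": true } }
""".data(using: .utf8)!

let playParams = extractPlayParams(fromJSON: response)
// playParams can now be handed to MPMusicPlayerPlayParameters(dictionary:).
```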

To play this song, one can pass on the dictionary with the parameters to the music player as follows:

if let mppp = MPMusicPlayerPlayParameters(dictionary: playParams) {
    // …
}

However, it only captures part of the dictionary:

(lldb) po mppp
<MPMusicPlayerPlayParameters:0x280d983f0 {
    id = "i.DVENxPRTdlJoV";
    isLibrary = 1;
    kind = song;
}>

And when you try to play that:

player.setQueue(with: MPMusicPlayerPlayParametersQueueDescriptor(playParametersQueue: mppp))

you will see:

2021-03-12 14:02:37.105160+0100 app[1626:732477] [tcp] tcp_input [C13.1:3] flags=[R] seq=2895265255, ack=0, win=0 state=LAST_ACK rcv_nxt=2895265255, snd_una=3660699433
2021-03-12 14:02:39.764732+0100 app[1626:732039] [SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play" UserInfo={NSDebugDescription=Failed to prepare to play}

and the music player will play any random song from your library instead.

Neither does it work to play the song via its store identifier:

player.setQueue(with: [ "i.DVENxPRTdlJoV" ])

Apparently, this has been a known problem for years, and it has not been fixed.

I hear that the purchasedId should be used instead, but this is undocumented. Also, if that is the case, MPMusicPlayerPlayParameters should handle it under the hood.

On Apple TV with tvOS 14.4, once Apple Music access is enabled, the following commands will cause an app to crash:

let songStoreID = "900032829" 
let musicPlayer = MPMusicPlayerController.applicationQueuePlayer  // [1]
musicPlayer.setQueue(with: [ songStoreID ])
musicPlayer.prepareToPlay()    // <-- crashes here [2]
musicPlayer.play()   // if skipping [2], it would crash here instead

In the Xcode console log, I note these messages:

// [1]
[APSLogUtils] [AirPlayError] APSCopyPairingIdentity:627: got error 4099/0x1003 NSXPCConnectionInvalid
[MediaRemote] Error fetching local receiver pairing identity (OSStatus = 4099)

// [2]
[SDKPlayback] applicationQueuePlayer _establishConnectionIfNeeded timeout [ping did not pong]

Note: It does NOT crash when using MPMusicPlayerController.systemMusicPlayer.

Apple feedback id: FB8985422

New App: MC-Timer

- Posted in iOS by

I can finally reveal what I have been playing with recently:

MC-Timer App Icon MC-Timer

It is a workout timer which plays music, aimed at training sessions with repeated high-intensity and rest phases. The app will play music when you are working and will be quiet when you are not.

A typical usage example is circuit sessions where you work hard for instance for 45 seconds and rest for 15 seconds, and repeat. In a group, you may want to use the quiet periods to tell your friends what the next exercise is.

MC-Timer Playback screen

You can freely configure the timings and your music playlist, even add songs from Apple Music.

As a universal app, it supports both iPhone and iPad. There is even a playback app for Apple TV.

There are neither subscriptions nor in-app purchases, no ads, no pestering review requests, and no usage tracking.

The app is written entirely in Swift, uses MusicKit for playback, and CloudKit with Core Data to synchronise sessions between devices via iCloud. I suppose it could have been finished earlier, but I decided to transition to SwiftUI, which requires iOS 14. Fastlane is used to automate the creation of screenshots.

SwiftUI development would be impossible without the preview canvas, where our view model can be inspected live. When using Core Data, however, I don't want to preview actual data; I want to supply custom preview-specific data.

To do this, I am creating an in-memory persistent store with hard-coded data:

import SwiftUI
import ShopCore    // Private framework
import CoreData

// …


struct ContentView_Preview: PreviewProvider {
    static var previews: some View {
        return ContentView(store: previewStore)
    }

    static let previewStore: ShopKeeper = {
        // Create a dedicated instance, without loading from Core Data
        let store = ShopKeeper()

        // Create the preview coordinator
        let context = PreviewCoordinator.shared.viewContext

        let previewShop = NSEntityDescription.insertNewObject(
            forEntityName: ShopModel.entityName(), into: context)
            as! ShopModel

        // Data to be shown in the preview canvas:
        previewShop.name = "Preview Shop"
        previewShop.comment = """
This is a dynamically created preview shop that
is stored in an in-memory "persistent" store.
"""
        // …

        try? context.save()

        // Show just our preview shop
        store.list = [ previewShop ]
        return store
    }()
}

And this is how the in-memory persistent store is set up for the preview:

private class PreviewCoordinator {
    static let shared = PreviewCoordinator()

    let objectModel: NSManagedObjectModel
    let storeCoordinator: NSPersistentStoreCoordinator
    let viewContext: NSManagedObjectContext

    init() {
        objectModel = NSManagedObjectModel()
        objectModel.entities = [
            ShopModel.previewDescription(),
            // …
        ]

        storeCoordinator = NSPersistentStoreCoordinator(
            managedObjectModel: objectModel)

        viewContext = NSManagedObjectContext(concurrencyType:
            .mainQueueConcurrencyType)

        do {
            try storeCoordinator.addPersistentStore(ofType:
                NSInMemoryStoreType, configurationName: nil, at: nil,
                options: nil)
            viewContext.persistentStoreCoordinator = storeCoordinator
        } catch {
            fatalError("Could not create the in-memory store: \(error)")
        }
    }
}

To make this work, our managed object needs to be able to supply its entity (class) name:

extension NSManagedObject {
    public class func entityName() -> String {
        // Typically: "MyApp.MyModel"
        let moduleAndClassName = NSStringFromClass(object_getClass(self)!)
        // Return just: "MyModel"
        return String(moduleAndClassName.split(separator: ".").last!)
    }
}

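Since NSStringFromClass needs the Objective-C runtime, the string-splitting step can be illustrated on its own; this is a plain-Swift sketch of the same "drop the module prefix" logic:

```swift
// The same logic as entityName(), applied to a plain string such as
// NSStringFromClass would return:
func bareClassName(_ moduleAndClassName: String) -> String {
    // "ShopCore.ShopModel" becomes "ShopModel";
    // a name without a module prefix is returned unchanged.
    return String(moduleAndClassName.split(separator: ".").last!)
}
```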
I cannot load the Core Data model description, since it is located in an embedded framework. Therefore, the entity property cannot be used; instead, the ShopModel needs to supply a preview-specific entity description:

import CoreData


extension ShopModel {
    // Returns the entity description, mirroring what is defined
    // in the .xcdatamodeld file for Core Data.
    // Beware: To be kept in sync with the .xcdatamodeld !!!
    public class func previewDescription() -> NSEntityDescription {
        let entity = NSEntityDescription()

        entity.name = entityName()
        entity.managedObjectClassName =
            "ShopCore." + (entity.name ?? "none")
        entity.renamingIdentifier = entity.name

        entity.properties = [
            NSAttributeDescription(name: "id",
                type: .UUIDAttributeType, optional: true),
            NSAttributeDescription(name: "name",
                defaultValue: "New Shop"),
            // …
        ]

        return entity
    }
}

The most critical part turned out to be managedObjectClassName: it needs to be set correctly, or else the as! ShopModel cast will fail. Normally, the class name would be something like "ShopApp.ShopModel", but in my case, where ShopModel is declared in an embedded "ShopCore" framework, it is "ShopCore.ShopModel". To eliminate these conflicts, it is a good idea to manually set the Module (to "ShopCore") for each entity in the .xcdatamodeld editor.

To streamline the above code, I also added a convenience initialiser, with defaults that are specific to my use case:


extension NSAttributeDescription {
    // Used below to create properties for the entity description.
    convenience public init(name: String,
        type: NSAttributeType = .stringAttributeType,
        defaultValue: Any? = nil, optional: Bool = false) {

        self.init()

        self.name = name
        self.renamingIdentifier = name
        self.isOptional = optional
        self.defaultValue = defaultValue
        self.attributeType = type

        switch type {
        case .stringAttributeType:
            self.attributeValueClassName = "NSString"
        case .dateAttributeType:
            self.attributeValueClassName = "NSDate"
        case .UUIDAttributeType:
            self.attributeValueClassName = "NSUUID"
        default:
            self.attributeValueClassName = "NSNumber"
        }
    }
}

Update History

[2020-08-14] Now using a constant managedObjectClassName prefix.
[2020-08-09] Renamed entityDescription() to previewDescription(), clarifying why entity cannot be used. Moved previewStore into the ContentView_Preview struct, avoiding the use of a global variable. Moved the convenience init() to be last.

For a long time, I have been trying to avoid scroll bars in the screenshots that I generate. The solution that I came up with was simply to wait for them to disappear:


// Wait for the scroll indicators to be hidden
Thread.sleep(forTimeInterval: 2.5)


I have also been using fastlane for several years to automate this. While I try very hard never to add any screenshot-specific exceptions to the code, there has always been the odd need to do it, for instance to handle situations that are not supported by the iOS simulator.

Preprocessor Macros

For code sections that are only used during development, you should mask them out using "preprocessor macros" like:

#if DEBUG
print("Data finished loading.")
#endif

This works via an Xcode project setting that adds DEBUG to SWIFT_ACTIVE_COMPILATION_CONDITIONS, which results in the command-line option -D DEBUG being passed to CompileSwift. You can see this if you dive deep into your build logs.
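Since DEBUG is resolved at compile time, the inactive branch is not compiled at all and the check costs nothing at runtime; a minimal sketch:

```swift
// Compilation conditions are evaluated at compile time: only one of the
// two branches below ever exists in the binary. Built without "-D DEBUG",
// this function returns "release".
func buildConfiguration() -> String {
    #if DEBUG
    return "debug"
    #else
    return "release"
    #endif
}
```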

Similarly, you can mask sections that are only (or never) to be used in the simulator like this:

#if targetEnvironment(simulator)
    // iOS simulator only
#else
    // iOS device only
#endif

Simple inversions are also supported in Swift using an exclamation mark (read it as: "not"):

#if !targetEnvironment(simulator)
    // iOS device only
#endif

Fastlane-Specific Code

The same can be done for fastlane-specific code, but it is necessary to define a macro, for instance in the fastlane/Snapfile. Here we define FASTLANE:

# Add a define for xcodebuild:
xcargs "OTHER_SWIFT_FLAGS='-D FASTLANE'"

which allows us to write in the code:

#if FASTLANE
    // fastlane only
#endif

No Scroll Bars in Fastlane

Using the above macro, I have now simply disabled the scroll bar indicators in the code:

override func viewDidLoad() {
    super.viewDidLoad()

    navigationItem.rightBarButtonItem = editButtonItem

    #if FASTLANE
    tableView.showsVerticalScrollIndicator = false
    tableView.bounces = false
    #endif
}

And this means that the test script can run without the additional delays shown above.

[UPDATE 2021-01-21] In SwiftUI, you could use:

List {
    // …
}
.onAppear {
    #if FASTLANE
    UITableView.appearance().showsVerticalScrollIndicator = false
    #endif
}

Bash Script to Scale AppIcons

- Posted in iOS by

The Old Automator Flow

In a previous post, I described an Automator workflow to rescale iOS app icons. While that has worked well, it had potential for improvement:

  1. The workflow document is not easily edited, at least not on a slow machine like mine;

  2. When saved, the workflow document contains a preview image which blows up the file to about 1 GB. The preview image can be removed, but that adds yet another required manual step.

Because of these shortfalls, I became increasingly reluctant to use it and found myself resizing images manually again.

Welcome to sips

I don't know how long it has been there, but it turns out macOS contains a command-line utility to convert images: sips (scriptable image processing system). You can check if it is installed on your machine using:

$ which sips
/usr/bin/sips


$ sips -h
sips - scriptable image processing system.
This tool is used to query or modify raster image files and ColorSync ICC profiles.
Its functionality can also be used through the "Image Events" AppleScript suite.

    sips [image-functions] imagefile ... 
    sips [profile-functions] profile ... 


Using sips

With some tips from this post, I have now set up the following script, saved in a file like ~/bin/resizeAppIcons (i.e. in a directory listed in your $PATH):


#!/bin/bash

help=0
errs=0
colours=256
# Path to pngquant as bundled with ImageAlpha (adjust if needed):
pngquant="/Applications/ImageAlpha.app/Contents/Resources/pngquant"

args=$(getopt h $*)     # do not use "$@" to work with 'set' below.
if [ $? != 0 ]; then ((help++)); fi

set -- $args
for i ; do
    case "$i" in
        -h) ((help++)); shift;;
        --) shift; break;;
    esac
done

if [ $help -gt 0 -o $# -eq 0 ]; then
    echo "USAGE: $(basename $0) [-h] [pngFiles...]"
    exit 2
fi

if [ ! -x "$pngquant" ]; then
    echo "Warning: ImageAlpha.app is not installed"
fi

for img in $* ; do
    echo "$img"

    # Get the current image size: ( width height )
    size=( $(sips -g pixelWidth -g pixelHeight "$img" | grep -o '[0-9]*$') )

    # We expect 1024 x 1024.
    if [[ ${size[0]} -ne ${size[1]} || ${size[0]} -ne 1024 ]]; then
        echo "Error: Image size should be 1024 x 1024 (not ${size[0]} x ${size[1]})."
        ((errs++))
    fi

    # Split the file path:            path/to/filename.png
    dirn=$(dirname "$img")          # path/to
    base=$(basename "${img%.*}")    #         filename
    extn=${img##*.}                 #                  png

    if [[ $extn != "png" ]]; then
        echo "Error: Expected image format is PNG (not $extn)."
        ((errs++))
    fi

    if (( $errs )); then
        echo "Too many errors; giving up"
        exit 1
    fi

    # Using all the known image sizes:
    for width in 16 18 19 32 36 38 40 48 55 58 60 64 80 87 88 100 120 128 152 167 172 180 196 216 256 512 1024 ; do

        outfile="$dirn/$base-$width.$extn"

        # Do not overwrite existing files
        if [[ -f "$outfile" ]]; then continue; fi

        echo "--> $outfile"

        # Copy or resize the image
        if [ $width -eq 1024 ]; then
            cp "$img" "$outfile"
        else
            sips -Z $width "$img" --out "$outfile" > /dev/null
        fi

        # Use pngquant from ImageAlpha to reduce colours and size.
        if [[ -x "$pngquant" ]]; then
            $pngquant -f $colours -o "$outfile" "$outfile"
        fi
    done
done

exit 0

Feel free to adjust the for width in ... line to use the image sizes that you need.


The script requires a 1024 x 1024 pixel PNG image as input. It creates the scaled images in the same directory:

$ resizeAppIcons ./work/export/AppIcon.png
--> ./work/export/AppIcon-16.png
--> ./work/export/AppIcon-512.png
--> ./work/export/AppIcon-1024.png

Image Alpha

Finally, I can recommend the use of ImageAlpha, which will optimize your images and reduce the colours. Once it is installed in /Applications, the script will pick it up.

For my simple icons, I can get away with just 256 colours. If your icons are more complex, adjust the $colours variable as needed.

Here is a piece of code that looks quite reasonable, but it doesn't work. The plan is to assemble a playlist of multiple songs starting off with a list of their persistent IDs:

let persistentSongIDs : [MPMediaEntityPersistentID] = ...

let musicPlayer = MPMusicPlayerController.applicationQueuePlayer
var filterSet = Set<MPMediaPredicate>()

for songID in persistentSongIDs {
    let predicate = MPMediaPropertyPredicate(value: songID,
            forProperty: MPMediaItemPropertyPersistentID)
    filterSet.insert(predicate)
}

if filterSet.isEmpty == false {
    let query = MPMediaQuery(filterPredicates: filterSet)
    let descriptor = MPMusicPlayerMediaItemQueueDescriptor(query: query)
    musicPlayer.setQueue(with: descriptor)
}

So, why does it not work? We are creating search predicates with each of the persistent song IDs; surely this should give us a list of songs that we can play, right?

Well, no.

As soon as you query for multiple songs, the result list will always be empty.

The reason for this is simple, if you think about it. What we are searching for is:

"persistentID == 5819395988480015566"
"persistentID == 9054558313999882624"
"persistentID == 8511246475365999992"

But all of these search predicates are combined, i.e. they all need to be true to find matches.

Let's rephrase this: the following will never be true:

"persistentID == A" && "persistentID == B" && ...

So How Do We Fix This?

We need to search for this:

"persistentID == A" || "persistentID == B" || ...

But that cannot be expressed in a list of search predicates. The solution is to search for each ID in turn and assemble a playlist manually:

var mySongs = [MPMediaItem]()

for songID in persistentSongIDs {
    let predicate = MPMediaPropertyPredicate(value: songID,
            forProperty: MPMediaItemPropertyPersistentID)
    let query = MPMediaQuery(filterPredicates: [predicate])
    if let song = query.items?.first {
        mySongs.append(song)
    }
}

if mySongs.isEmpty == false {
    // Let's play !!!
    musicPlayer.setQueue(with: MPMediaItemCollection(items: mySongs))
}

When creating screenshots with fastlane, there is support for checking at runtime within the app whether fastlane is being used by looking at a user default:

if UserDefaults.standard.bool(forKey: "FASTLANE_SNAPSHOT") {
    // runtime check that we are in snapshot mode
}

Unfortunately, that does not work from within the test code since the tests are run in a separate helper app!

In the test code, I would like to use something like the following such that I can test and tweak the code within Xcode before running fastlane on it:

    snapshot("1_init", timeWaitingForIdle: 0)
    let attach = XCTAttachment(screenshot: XCUIScreen.main.screenshot())
    attach.lifetime = .keepAlways
    add(attach)

One would think the FASTLANE_SNAPSHOT=YES build setting would be suitable for this task, but I have not found any way to detect it from within Swift, possibly because values like YES are not supported by the compiler.
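As a plain-Swift sketch of why this fails: a build setting only exists at build time, so a runtime check would need state that survives into the running process, such as an environment variable. This is an illustrative workaround of my own, not fastlane's mechanism, and the variable name is only an example:

```swift
import Foundation

// A build setting like FASTLANE_SNAPSHOT=YES is invisible at runtime;
// an environment variable, by contrast, can be inspected by running code.
func isSnapshotRun(_ environment: [String: String]) -> Bool {
    return environment["FASTLANE_SNAPSHOT"] == "YES"
}

let underSnapshot = isSnapshotRun(ProcessInfo.processInfo.environment)
```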

This, however, seems to do the trick: add this line to your Snapfile:

# Add a define for xcodebuild:
xcargs "OTHER_SWIFT_FLAGS='-D FASTLANE'"