A quest for automated iOS marketing shots
Sun, 29 Jul 2018 10:16:23 +0100

Motivation for automation

Legend has it that if a person happens to find your app or game in the App Store, they'll be put off if the video previews and screenshots are not in their language. And even if the app is indeed localized, it won't count as localized if you don't prove this with screenshots.

The other troublesome legend is that successful apps have responsive interfaces that adapt well to different screen sizes and aspect ratios.

So for Silabitas I spent time adjusting the font size and the location of the different UI elements for all the different iPad and iPhone screens, and I localized it to all the languages I know: English, Japanese, Spanish, and Catalan. The first version of Silabitas was also translated to German, Chinese, French, Portuguese, and Italian, but I dropped support for those because the cost was too high, and there were zero users anyway.

But even with just 3 languages (Catalan is not supported in the App Store, so I don't need to take screenshots for that one), if I want 6 screenshots for my game, that means 6 screenshots × 3 languages × 7 screens (5.8'', 5.5'', 4.7'', 4'', 12.9'', 10.5'', 9.7''). That's 126 screenshots. I used to take those manually, but that's a lot. And if you want at least one video preview per language, that's another 21 videos.

For Silabitas' latest update I thought it was time I automated this, because it's a pain. I read about fastlane and I thought it would all be easy-peasy, but it wasn't that easy... I'll describe here the different steps and approaches for automating the process as much as possible.

Screenshots with Fastlane

Just follow the documentation to set it up. It's all quite easy, but I encountered a couple of issues. The first one is that multiple languages are apparently not supported in the Swift setup. I found out that you can simply create a "Snapfile" inside the fastlane folder for it to work. So ignore the "Snapfile.swift" that gets created in that folder, and put your list of devices and languages in the "Snapfile",

devices([
   "iPhone 5s",
   "iPhone 8",
   "iPhone 8 Plus",
   "iPhone X",
   "iPad Pro (12.9-inch)",
   "iPad Pro (10.5-inch)",
   "iPad Pro (9.7-inch)"
])

languages([
   "en-US",
   "es-ES",
   "ja"
])

Then, you create some UI tests that call the snapshot function at the right moments. To generate all the snapshots, you call fastlane snapshot from the command line. It will start the appropriate simulators in the different languages, so the process can take a couple of hours. Once it's done, it will create a summary page in HTML and a report in the console.
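For reference, a minimal snapshot test looks roughly like this. This is just a sketch based on fastlane's SnapshotHelper.swift (which provides setupSnapshot and snapshot); the class, screenshot names, and "startButton" label are example names of mine:

    import XCTest

    class SilabitasSnapshots: XCTestCase {
        override func setUp() {
            super.setUp()
            continueAfterFailure = false
            let app = XCUIApplication()
            setupSnapshot(app) // from fastlane's SnapshotHelper.swift
            app.launch()
        }

        func testTakeScreenshots() {
            let app = XCUIApplication()
            snapshot("0MainMenu") // captures the screen for every device & language
            app.buttons["startButton"].tap()
            snapshot("1LevelSelect")
        }
    }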

Now you can try to set up deliver to automatically upload the screenshots to App Store Connect. Again, the setup for Swift seems to fail to create the appropriate files, so I had to manually create a "Deliverfile" inside the fastlane folder. I couldn't find much information about the format, but I set up these 3 things,

username("my developer account")
app_identifier("the Bundle Identifier of the app")
overwrite_screenshots(true)

When I call fastlane deliver it generates another HTML page with a preview of what's going to get uploaded, but the screenshot links point to the wrong place. I have to manually move the preview file to the right folder in order to see the summary properly. And if I accept the changes and continue the upload, I always get several errors in the console.

I searched in several forums, but I couldn't figure out how to resolve this. In the end I gave up and uploaded the screenshots through the App Store Connect website...

So I couldn't fully automate this, but at least I managed to automate the most tedious part: taking the screenshots. Uploading them manually through the Media Manager in App Store Connect means drag&dropping 7 sets of 6 images (multi-selecting the 6 screenshots) for each of the languages, so 21 drag&drop actions. Not ideal, but I can live with that.

Screenshots for SpriteKit

The challenge I had with Silabitas is that it uses SpriteKit, and there's no out-of-the-box support for UI testing it. But before addressing SpriteKit, make sure that if you have any UIButtons, they all have a unique accessibilityLabel set. If they don't have a label, you can access a button from a UI test by its text (if it has text), or by the name of the image it displays. But if you've localized your game, these texts and images will probably change with each locale, so the tests will fail. Use accessibility labels instead.
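Something along these lines (a sketch; "startButton" is just an example name):

    let startButton = UIButton(type: .custom)
    startButton.setTitle(NSLocalizedString("Start", comment: "start game"), for: .normal)
    // the title changes with the locale, but the accessibility label stays stable,
    // so a UI test can always find this button as app.buttons["startButton"]
    startButton.accessibilityLabel = "startButton"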

For SpriteKit you will have to manually create accessibility elements at the same locations as your sprites, and assign them unique labels that you can identify. I followed this nice post from StackOverflow: Step by step guide to create accessibility elements for SpriteKit. By the way, once you have this set up, I assume you can also use VoiceOver to control your app, so it's a double win, but I haven't actually tried.
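The gist of that approach is to expose one UIAccessibilityElement per interactive sprite. Roughly something like this sketch, as a method of your SKScene subclass, called once the sprites are laid out (the method name, and the assumption that every interactive node has a unique name, are mine):

    func setupAccessibility(in view: SKView) {
        var elements: [UIAccessibilityElement] = []
        for node in children {
            guard let name = node.name else { continue }
            let element = UIAccessibilityElement(accessibilityContainer: view)
            element.accessibilityLabel = name // e.g. "piece.8" or "cell.4.1"
            element.accessibilityTraits = UIAccessibilityTraitButton // appears as app.buttons[name]
            // convert the node's frame to view coordinates (the view's y-axis is flipped
            // with respect to the scene's), then to the screen coordinates that
            // accessibilityFrame expects
            let topLeft = view.convert(CGPoint(x: node.frame.minX, y: node.frame.maxY), from: self)
            let viewFrame = CGRect(origin: topLeft, size: node.frame.size)
            element.accessibilityFrame = UIAccessibilityConvertFrameToScreenCoordinates(viewFrame, view)
            elements.append(element)
        }
        view.accessibilityElements = elements
    }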

When you create a UI test in Xcode, there's a record button that appears when you place the cursor inside a test function. If you click record, it will record all the interactions you make, mainly button taps. However, it fails to capture the accessibility elements we've just created for SpriteKit. But if you gave them proper names, you can write the test by hand. In the case of Silabitas, you first select a "piece", named with a number that is its position in the list of pieces for that screen, and then tap on the board, which I label with the name "cell" plus 2 integers for X and Y. So one of my tests looks like this,

    func testSortija() {
        let app = XCUIApplication()
        app.buttons["startButton"].tap()
        app.buttons["Level2"].tap()
        app.buttons["stage.0"].tap()
        app.buttons["piece.8"].tap()
        app.buttons["cell.4.1"].tap()
        app.buttons["piece.3"].tap()
        app.buttons["cell.5.2"].tap()
        snapshot("4StageSortija")
    }

Attempt to automate video captures with simctl

Now the most challenging part. I used to create all the videos manually, on a real device, or playing in the Simulator and recording with simctl as I played. Then, I would use iMovie to create an App Preview, manually editing the timing and the transitions there (an App Preview can't be longer than 30 seconds). But that meant I usually created the videos in just one language, English, and used the same one for all the other languages. I'm happy to give up on nice editing in favor of an automatic process for all languages.

So I created a UI test that would be my demo session. In Xcode, I created a separate scheme, and in the test settings of that scheme I selected only this demo test, so that when I run the tests from the command line, only this test is called. Check this post. Then I created this simple script to start the test in a Simulator, and wait for the simulator to boot to start a recording with simctl,

xcodebuild -project Silabitas.xcodeproj \
           -scheme "SilabitasTest" \
           -destination 'platform=iOS Simulator,name=iPhone 6,OS=11.4' \
           test &

# https://coderwall.com/p/fprm_g/chose-ios-simulator-via-command-line--2
# wait until there is a device booted
count=`xcrun simctl list | grep Booted | wc -l | sed -e 's/ //g'`
while [ $count -lt 1 ]
do
    sleep 1
    count=`xcrun simctl list | grep Booted | wc -l | sed -e 's/ //g'`
done
echo "Recording video... Press CTRL+C to stop..."
xcrun simctl io booted recordVideo silabitas-en-iPhone6.mov

Unfortunately, this produces blank videos. I'm not sure why. At first I thought it was because I was running it in background mode, but I also read there's a bug in Xcode 9.4.1: https://stackoverflow.com/a/51120591/1765629. So I installed Xcode 10 beta 4, and then I was able to capture, but the result was glitchy. I think it's out of sync or something.

I tried starting the UI test from Xcode and then manually calling simctl from the command line as soon as my app starts, but I also get glitchy videos. I could try using a real device, but then I wasn't sure how I would capture. Even if I could script QuickTime to do the screen capture, as soon as QuickTime gets hold of the device, it disconnects it from Xcode, so the UI test would get interrupted 😅... So I decided I'd better give up on this route.

Automating video captures with ReplayKit

So the only alternative I had left was to take the captures from inside the app. This has the advantage that the capture can be scripted from the UI test. I checked several sources, but this post was the clearest and most straightforward.

In my UI tests I pass a special argument to my app like this,

app.launchArguments.append("--uitesting")
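
Reading that flag on the app side is a one-liner; a minimal sketch (the constant name is mine):

// e.g. in AppDelegate.application(_:didFinishLaunchingWithOptions:)
let isUITesting = CommandLine.arguments.contains("--uitesting")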

I use this in my AppDelegate to set up a demo account (so I don't log into Game Center) that has several stages unlocked. But I also use it to add a special button to the main screen, with an accessibility label called "Record", that I can easily tap from my UI test. My UI test is wrapped with this,

   app.buttons["Record"].tap()
   sleep(3)
   if app.alerts.count == 1 {
       app.alerts["Allow screen recording in β€œSilabitas”?"].buttons["Record Screen"].tap()
   }
   // play...
   // [...]
   // stop recording and save the video
   app.buttons["Record"].tap()
   sleep(5)
   app.toolbars["Toolbar"].buttons["Save"].tap()
   sleep(4)

If you've given permission once, you may not get the alert; hence the "alerts.count" guard. The other problem here is localization, since I'm not sure the system alert asking for recording permission has a generic accessibility label. However, I found out that for automating the localization of the video recordings it is better to leave your device set to English, and change the localization settings programmatically when launching the app. That way you can have one UI test per language and record all the languages in one session. The system will be in English, so the system alert will still show in English, but your game or app will appear localized. Convenient ☺️. See the UI test code below,

private func launch(app: XCUIApplication, lang: SupportedLang) {
    var language = ""
    var locale = ""
    switch lang {
    case .en:
        language = "(en)"
        locale = "en_US"
    case .ja:
        language = "(ja)"
        locale = "ja_JP"
    case .es:
        language = "(es)"
        locale = "es_ES"
    }
    app.launchArguments += ["--uitesting"]
    // -AppleLanguages expects a plist-style array, hence the parentheses
    app.launchArguments += ["-AppleLanguages", language]
    app.launchArguments += ["-AppleLocale", locale]
    app.launch()
}

func testGameSessionEnglish() {
    let app = XCUIApplication()
    launch(app: app, lang: .en)
    recordGameSession()
}

func testGameSessionSpanish() {
    let app = XCUIApplication()
    launch(app: app, lang: .es)
    recordGameSession()
}

func testGameSessionJapanese() {
    let app = XCUIApplication()
    launch(app: app, lang: .ja)
    recordGameSession()
}

private func recordGameSession() {
    let app = XCUIApplication()
    app.buttons["Record"].tap()
    // ... see the wrapped recording code above
}
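
For completeness, the app-side code behind that "Record" button is plain ReplayKit. Roughly something like this sketch, inside the view controller that owns the button and conforms to RPPreviewViewControllerDelegate (the method name is mine):

    import ReplayKit

    func toggleRecording() {
        let recorder = RPScreenRecorder.shared()
        // isAvailable is false on older devices; more on this below
        guard recorder.isAvailable else { return }
        if recorder.isRecording {
            recorder.stopRecording { previewController, error in
                // present the preview so the UI test can tap its "Save" button
                if let preview = previewController {
                    preview.previewControllerDelegate = self
                    self.present(preview, animated: true)
                }
            }
        } else {
            // the first call triggers the "Allow screen recording" system alert
            recorder.startRecording { error in
                if let error = error {
                    print("Could not start recording: \(error)")
                }
            }
        }
    }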

So, all this sounds nice, but... there are some gotchas. First, ReplayKit doesn't work on older devices, like the iPhone 5 (the iPhone 5s should be fine). The isAvailable field will return false, so at least you'll know. I have an iPhone 5 I use for development, so this is a bit of a bummer. Especially since the second problem is that ReplayKit doesn't seem to work in the Simulator either. I couldn't find any documentation on this, but that's what I read in some comments here and there. It's strange, because isAvailable returns true, but I don't get the recording-permissions alert, and when I press stop, the callback of the stop function is never called. This sounds like this issue, but that issue happens on real devices as well. I got it on my iPad, but I think that was because the UI test, for some reason, tapped the "Do not allow" button, after which screen recording seems to stop working in the following sessions: no alert is displayed, and the startRecording function doesn't return any errors, so it's apparently working, but the stop callback never happens. I solved this by rebooting the iPad.

So... the remaining problem is borrowing real devices. I have an iPad Pro 9.7'', the iPhone 5 that I can't use for this, and an iPhone X. It's a bit of a pain finding real devices, but I'll come up with something 😅. For the iPads, even if the UI changes slightly (font sizes, etc.) from device to device, I've simply recorded the 3 videos for the 3 languages on my iPad and rescaled them. It's the same aspect ratio, so there aren't any strange deformations. Read the next section.

Final video touches

No way I'll edit the videos manually again with iMovie. It may be prettier, but it's too much work. I think what's important is that the user gets a feel for the real app, including the visuals for their language. Because my UI test generates a video that is 38 seconds long, I've sped it up a little with ffmpeg. Also, ReplayKit creates a 60fps 1920x1440 video, and both the frame rate and the resolution are wrong for App Store Connect. So here are the conversions I do from the command line,

# The UITest produces a 38~39 sec video. Speed it up to be ~29 secs
ffmpeg -i Silabitas2.0-ja.MP4 -filter_complex "[0:v]setpts=0.75*PTS[v];[0:a]atempo=1.33[a]" -map "[v]" -map "[a]" Silabitas2.0-ja-faster.mp4

# App Store complains about too high frame rates! Change to 30fps
# https://stackoverflow.com/a/26730600/1765629
ffmpeg -i Silabitas2.0-ja-faster.mp4 -r 30 Silabitas2.0-ja-30fps.mp4

# The capture produces a 1920x1440 video, but App Store Connect
# wants them to be 1600x1200 for 12.9'' and 10.5'' iPads
ffmpeg -i Silabitas2.0-ja-30fps.mp4 -vf scale=1600:1200 -c:a copy Silabitas2.0-ja-1600x1200.mp4
# 9.7'' iPad
ffmpeg -i Silabitas2.0-ja-30fps.mp4 -vf scale=1200:900 -c:a copy Silabitas2.0-ja-1200x900.mp4

You could change the script to loop over all the languages. And you could probably do the whole transformation with just one call to ffmpeg, but ffmpeg is too confusing for me... When I combine parameters I always end up with a video out of sync, or other weird problems...

Once I find all the necessary iPhones, I'll end up with 21 videos that I'll have to drag&drop to upload manually, again. I've already uploaded the 9 iPad videos and they look fine 👍. Here's one of those videos, which I've also uploaded to YouTube.

Summary

This was a long post, but that's because I've been struggling with this the whole week! So it did feel like a "quest"! I "wasted" many hours, and I couldn't get it all fully automated. There are also a few gotchas, but at least I can:
  • automatically capture all the screenshots for all devices and languages through UI tests and fastlane;
  • automatically capture a demo video for all languages in a given real device.
For the video capture, I still have to start the test manually on each of the relevant devices, but that's 7 times at most. Then I have to copy all the videos back from the devices to my laptop, and process the files with ffmpeg.

I had already encountered the problem of not being able to use the Simulator in the past, with Palettist. That's because that app is Metal-only, and Metal is not supported in the simulators 😭. So I guess for Metal apps I'll have to plug in the devices one by one, and maybe set up fastlane to use one device at a time for the screenshots.

And for both the videos and the screenshots, I still have to use the App Store Connect web interface to drag&drop 21 sets of images and 21 videos. But overall, this is much faster than what I did for the previous release of Silabitas. Oh, and please check out the game!

silabitas.endavid.com

The latest update with 20 new stages should be out soon... As soon as I find iPhones to capture all the missing videos...

