When your company is running marketing campaigns that constantly accentuate a privacy focus, it is incongruent to keep stuff like this in the shadows. Apple didn’t try to hide that this is happening but it didn’t do much to keep people informed either.
On its machine learning blog, Apple explained how it uses customer voice samples to validate the accuracy of "Hey Siri" detection.
The iOS Security white paper says Siri voice recordings are saved for six months with an ‘anonymous’ identifier that can group recordings from the same person together, and then the recordings are kept for at least another two years without the associated identifier. It says the saved audio is “for use by Apple in improving and developing Siri” with “ongoing improvement and quality assurance”. What it does not say explicitly is that these recordings are reviewed and assessed by humans.
If you give it a second of thought, then of course these clips have to be human-assessed. To improve software to be more human-like, you need humans to label the training data sets and check sample outputs. However, this indirection may not be immediately obvious to people who don’t have a firm grasp of how machine learning works. The onus should not be on the customer to know, or to guess. The paragraph should say it explicitly; something like ‘anonymised recordings are reviewed by Apple employees for ongoing improvement and quality assurance of the Siri service’.
It’s also important to point out that the content of the white paper is not what is shown to a user when they activate Siri on their iPhone or iPad. The “Ask Siri, Dictation & Privacy” screen focuses on how Apple handles your data to process Siri requests in the moment, and the copy quickly glosses over the retention policy details. It does not mention the six-month or two-year periods, for example.
What is also weird is that Apple has no opt-out for Siri data retention. You can turn off Siri completely and Apple will delete any data on its servers that can be traced back to your account, but you cannot elect to use Siri and not have Apple keep data to improve the service. Plenty of other iOS features provide this granularity, with dedicated options to “Improve Maps”, “Share iCloud Analytics” and the like. It’s a strange omission that Siri offers no such controls.
The worst part of the 3D Touch user interface was the home screen quick actions system. The quick actions are useful, but the gesture of applying force to an app icon clashed with the gesture of long-pressing on the icon to make the icons jiggle and enable editing of the home screen arrangement. They are technically wholly independent interactions; one based on time, the other on pressure exerted. In reality, people found it nigh-impossible to differentiate those actions. It’s hard to long-press on a screen without also pressing firmly.
This overlap resulted in a laundry list of issues in iOS where the same UI elements would respond differently to pressure and long-press gestures. The home screen was by far the worst offender, though. Jiggle mode has existed since the dawn of time. All of a sudden, starting with the iPhone 6S, ‘long-pressing’ would sometimes work correctly and sometimes trigger this strange menu of actions — with no escape hatch.
The new design in iOS 13 beta 4 is definitely an improvement over what came before. 3D Touch, Haptic Touch and long-press gestures are now standardised into one context menu system. If you press down on an icon and wait, the quick actions menu always displays first. If you keep waiting with your finger still on the display, the menu dismisses and jiggle mode activates. If you apply force when you press down on a pressure-sensitive iPhone display, the same sequence of events happens. The benefit you get from 3D Touch now is that it reduces the delay before the quick actions menu is shown, thanks to the more affirmative initiation.
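This unified behaviour maps onto the context-menu API that iOS 13 exposes to developers. A minimal sketch, assuming a hypothetical `IconViewController` with placeholder actions (the menu contents here are illustrative, not the home screen's real quick actions):

```swift
import UIKit

// Hypothetical view controller demonstrating the iOS 13 context menu API.
// One interaction covers 3D Touch, Haptic Touch and long-press alike.
class IconViewController: UIViewController, UIContextMenuInteractionDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.addInteraction(UIContextMenuInteraction(delegate: self))
    }

    func contextMenuInteraction(_ interaction: UIContextMenuInteraction,
                                configurationForMenuAtLocation location: CGPoint) -> UIContextMenuConfiguration? {
        UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { _ in
            UIMenu(title: "", children: [
                UIAction(title: "Share App",
                         image: UIImage(systemName: "square.and.arrow.up")) { _ in /* share */ },
                UIAction(title: "Rearrange Apps",
                         image: UIImage(systemName: "arrow.2.squarepath")) { _ in /* enter jiggle mode */ }
            ])
        }
    }
}
```

The system decides how the menu is invoked (force on 3D Touch hardware, a timed press everywhere else), so the app only declares the actions.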
Moreover, if a user does release their finger early, the new system is still forgiving to the ‘average user’ who probably just wants to edit their home screen. This is because the quick actions menu now includes a “Rearrange Apps” item. Tapping this button enters jiggle mode. You can’t really get stuck in the wrong place anymore.
There are a couple of warts as of beta 4. It would be better if all of the new iOS 13 context menus featured taller rows. They are barely 44 pixels tall; this means small font text, tiny icons and not too much leeway in where your finger can be when you are quickly sliding around. In the old 3D Touch menu design, the buttons were like three times the size and the ergonomics were better for it.
The “Rearrange Apps” button is a great addition (and something I’ve been advocating for a while) but it is a bit lazy to just append it to the bottom of the other actions. It should be distinct from the app actions, probably as a pill button in the corner of the screen, like the Done button that appears when jiggle mode is active.
They also have this seamless transition from sliding to select an action in the menu to dragging the app icon if you keep going. This is a clever shortcut but it’s frankly a bit finicky right now. I find that the safe zone of the pan gesture is too small; it’s far too easy to start moving the app icon when you don’t mean to. And when that happens, the result is a messed up home screen. Not the nicest thing to have to deal with.
Hopefully, these niggles will get patched up by the time iOS 13 is finished. Overall, it’s a much better experience. And it’s an experience that is consistent across device types. It doesn’t matter what kind of screen technology your device has; all iPhones and iPads have access to the same features, and they all behave (mostly) the same.
If you take the Wall Street Journal’s timeline as truth, ignoring the silly sales statistics meant to paint a picture of a company in turmoil, then Ive’s exit should only be seen as a positive step. It doesn’t make sense to have a person in the leadership team who is slowing down the product development process out of fatigue. The recent Bloomberg report mostly corroborates the same account of events; Ive was intimately invested in making the Apple Watch, and gradually shirked his responsibilities in the years after the Watch debuted.
Along these lines, the real criticism should be pointed at Tim Cook for mismanagement of leadership. Ive was gracefully transitioning away from the company with his ‘promotion’ to Chief Design Officer. It was perfectly set up for a clean departure, with Richard Howarth and Alan Dye’s faces viewable on the Apple Leadership page since mid-2015. However, the Wall Street Journal says it was Cook who convinced Ive not to leave in 2017, and put him back at the top of the pyramid. History now shows that Ive retook the management role somewhat unwillingly, reportedly frustrating the product development of things like the iPhone X user interface as subordinates looked for approval from a leader who had either lost focus or interest (or was understandably preoccupied by the declining health of his parents).
Ive sold his soul to Apple for a long time, did excellent work, and he now wants to chill out. Who can blame him. I think a lot of people underestimate Ive’s impact on the company. Up until 2015, he was integral and working hard. I can’t get this Jobs quote from the Isaacson biography out of my head:
He’s not just a designer. He has more operational power than anyone else at Apple except me. There’s no one who can tell him what to do, or to butt out. That’s the way I set it up.
Apple employees waited on his opinions and craved his approval. The Wall Street Journal post cites a good anecdote from Ive’s time as Chief Design Officer (the first run at retirement) where employees who were supposedly under Howarth and Dye’s purview were not interested in judgements that were not Ive’s. Even Cook himself seemingly sided with Ive to begin making watch wearables in the first place.
It will be a hard gap to fill. It was probably the right time for him to go, but that doesn’t make dealing with the repercussions any easier. The company renowned for its design expertise literally doesn’t have a leader of design anymore; neither Evans Hankey nor Alan Dye is becoming a senior vice president. Moreover, the design vice presidents fall under Jeff Williams and not Cook directly. Cook is unusual in having a high number of direct reports compared to most CEOs at other companies, yet design is apparently not worth his unqualified attention anymore. To stress this point, the five VPs currently named on the Leadership page all report to Cook directly, and in the two-year span of Dye and Howarth’s stewardship, both of them were also listed as reporting to Cook.
I think fears that Apple is transforming into a highly-efficient factory line of iPhones, iPads and $999 last-generation MacBook Airs are overblown. But clearly design is going to be less influential under the new structure than it ever has been. As someone who loves it when Apple releases opinionated products, not necessarily chasing new product categories every year but adventurous products fuelled by daring decision-making and effusive care, a demotion of design leadership does not sit right with me. It didn’t feel right on day one and I can’t shake the consternation a few days later.
Jeff Williams is great, but is he a best-in-class design lead who is not laden with myriad other responsibilities? If Apple wants to preserve its culture of design, Apple should have an SVP of design.
There’s no one who can tell him what to do, that’s the way I set it up.
It’s been hard to compose a post-WWDC impressions post because there was just so much content to take in and parse. Apple delivered meaningful updates to every single operating system. The sheer quantity of customer feature and developer API announcements is almost overwhelming, and unlikely to be repeated for a while. For the user-facing things, I think you can also get the sense that this stuff has been in the works for a while and not rushed out the door. The releases have that little bit extra polish and finesse. All told, there’s a lot to like.
The leading flagship feature for the iOS 12 release was performance improvements, particularly for older devices. Yet, just one year later, Apple is set to one-up itself. Due to innovation in how the App Store packages and serves apps, users are apparently going to see 50% smaller download sizes and up to twice-as-fast app launch times. iOS 13 also includes caching of app runtime dependencies and optimisations to other parts of the system, which provide additional benefits on top of the app packaging enhancements. iOS 12 did wonders for older iPhones, but iOS 13 is looking to provide noticeable speed boosts for all hardware models. Last year, I was concerned that the bug fix and performance focus might not be sustainable; you can’t go every year solely focusing on stability and sidelining new stuff. It’s hard to say for sure from beta 1, but I get the feeling that iOS 13 will continue the positive reputation of iOS 12 and bring a long list of feature additions and enhancements.
Apple’s take on Dark Mode is unsurprisingly to use true black as the base layer. This is okay, although it is definitely conducive to OLED smearing. Scroll a grouped table view with Dark Mode active and the background of the first row of every section will wobble like a jelly. It’s an annoying effect, but it’s hard to argue against the beauty of true black designs when looking at mostly-static scenes. Apple can sufficiently mitigate OLED’s burn-in issue with software, but there isn’t really a way around smearing other than selecting a different colour palette. Presumably, newer OLED screens in future iPhones will be better at minimising the tradeoffs, and of course Apple is rumoured to be working on microLED panels that will not be afflicted with any of OLED’s problems.
A clever nuance of Dark Mode is that the base colour defaults to black, but the system has a concept of ‘elevation’. If a view is elevated, then the base colour changes to a dark grey. This applies to modal presentations inside the app (which now default to a card style by the way) and even the app itself. On the iPad, Slide Over apps are adapted to the elevated appearance for instance. This means they can visually contrast against the apps beneath them. It’s clever.
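In code, the split surfaces through the new `userInterfaceLevel` trait and the dynamic system colours. A small sketch resolving `UIColor.systemBackground` under both levels (the trait combinations are illustrative):

```swift
import UIKit

// Resolve the dynamic systemBackground colour for Dark Mode at both levels.
// At the base level it is pure black; at the elevated level (cards, Slide
// Over) it becomes a dark grey.
let darkBase = UITraitCollection(traitsFrom: [
    UITraitCollection(userInterfaceStyle: .dark),
    UITraitCollection(userInterfaceLevel: .base)
])
let darkElevated = UITraitCollection(traitsFrom: [
    UITraitCollection(userInterfaceStyle: .dark),
    UITraitCollection(userInterfaceLevel: .elevated)
])

let baseColour = UIColor.systemBackground.resolvedColor(with: darkBase)
let elevatedColour = UIColor.systemBackground.resolvedColor(with: darkElevated)
```

Apps that adopt the semantic colours get the elevated appearance for free; hard-coded blacks do not.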
Adding editing controls for video — two thumbs up from me!
The new Photos tab in the Photos app is really cool. Digital photography and ever-increasing storage capacities enable people to shoot lots of pictures, many of which are taken just in case someone in the frame had their eyes closed or wasn’t looking at the lens. When taking photos, it’s empowering to have the flexibility to take these backup shots, but they are essentially redundant duplicates that hold little long-term value. It takes professional photographers hours to go through their bank of dailies and pick out what to keep and what to delete. Normal people don’t bother with any of that. The Days view does a good job of letting people go back and see their best shots … without having to take the time to curate their libraries by hand. Switching from Days to Months to Years is enjoyable too. The app features incredibly beautiful transitions from state to state; the Days collages group up into months of photos, which stack neatly when viewing Years. The dynamism and interactivity make the other tabs — which have not been updated — pale in comparison. They just feel somehow further apart and disconnected.
I’m so happy to see street-level imagery come to Apple Maps; Apple is far behind Google in the breadth of its image collection, but better to start now than never. The interface in the Maps app is really nice to boot. You can smoothly pan from place to place instead of the jerkiness of Google’s Street View. It’s not hard to see how the same imagery, with overlaid points-of-interest markers, could be applicable to an Apple Car HUD or augmented reality glasses experience. The Share ETA option is clever and useful. It’s not an original idea — there are apps like Glympse where this feature is their entire premise for existence — but it’s much better being built in to the same app. You can just start a journey home, and it will automatically notify your significant other if you hit heavy traffic and are going to be delayed.
The mechanics of CarPlay, in which the car acts as a dumb client and the phone renders the entire interface, mean that Apple can improve the in-car experience for users just by upgrading the OS of the phone. That will really come home to roost in iOS 13 with the biggest update to CarPlay since its debut: a new multi-widget Dashboard home screen, a brand new Calendar app, and a new light UI theme.
Probably my favourite change is the addition of SF Symbols, an expansive icon set designed to match the San Francisco system typeface. It’s not perfect, but it’s close. No more do we have to suffer the boxy fish hook as the system-wide share icon. The Symbols glyphs have rounded corners, non-infinitesimal line weights, and are inviting to look at and touch. The iOS 7 iconography was defined by its clinical adherence to geometry; SF Symbols are worlds apart. What’s more, Apple has undertaken a major effort to actually use these icons in its apps. Almost every iOS 13 system app has been updated to the new iconography (and it’s only beta 1, so maybe the stragglers get addressed by September).
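Adoption in third-party apps is essentially a one-liner. A quick sketch using real catalogue names (the configuration values are arbitrary):

```swift
import UIKit

// Load SF Symbols glyphs by name; nil is returned if the name is unknown.
let shareIcon = UIImage(systemName: "square.and.arrow.up")

// Symbols scale with a font-like configuration instead of shipping fixed PNGs.
let config = UIImage.SymbolConfiguration(pointSize: 22, weight: .medium)
let trashIcon = UIImage(systemName: "trash", withConfiguration: config)
```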
I think the rebranding of iOS for iPad to iPadOS is weird. It feels like fan service to cash in some goodwill. The first home screen may have a widget sidebar now, but it’s still very recognisable as the same operating system as the iPhone. Perhaps it will diverge more in the years to come, but in that case, they could have saved the rename for a better time. In terms of features, it was a solid showing — the Files and Safari changes stick out most in my mind — but I remain unsatisfied with the iPad’s multitasking metaphors. The new multi-window modes enable more sophisticated workflows (like having multiple Messages conversations open as separate windows, and flicking through them with the new Slide Over home indicator gesture) but also introduce more complexity to an already confused metaphor. I think it’s insane that they still haven’t added a way to easily multitask with apps not in your Dock.
The bevy of new faces in watchOS 6 have the potential to become long-standing favourites. Modular Compact is a really good execution of an analogue face with space for richer Infograph-style complications. The new numerals faces are a more stylish interpretation of the utilitarian X-Large face. But the standout winner is California. It may have a niche name, and its default configuration touts the eponymous horological combination of Roman numerals and normal digits, but it can be customised to be arguably the best all-round face. It looks great in full-screen and circular modes; the circular version is like Infograph with a bit more restraint. You still get the bezel text complication, and four rounded complications in each corner, but the centre is a simple analogue face. It’s a modern Utility.
The additions of Audiobooks, Calculator, Cycle Tracking, and Voice Memos fill out obvious missing fundamental features of the Watch form factor. Having a dedicated app for measuring ambient volume feels a bit overkill, but warning notifications about being immersed in loud environments for too long are potentially life-changing for some people. It’s interesting that Apple put a dedicated App Store on the Watch; maybe it will drive watch app downloads higher, but I’m not sure people will really be spending their day searching through the store. It definitely gives exposure to the selection of developers Apple features on the App Store’s main screen, but I don’t expect Watch users to dive any deeper — at least on the watch itself. The big news for watchOS this year is SwiftUI. I have long campaigned for something better than WatchKit, and SwiftUI definitely fits the bill. There is so much that is possible now that simply wasn’t before, and much more is now easy when previously it was so hard to achieve that basically no developer bothered. I know I’m excited to develop for the watch again.
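To illustrate the gap, here is the sort of watchOS screen that was painful to express in WatchKit: a dynamic list with a custom row layout. A SwiftUI sketch with a made-up model type:

```swift
import SwiftUI

// Hypothetical model for illustration.
struct Workout: Identifiable {
    let id = UUID()
    let name: String
    let minutes: Int
}

struct WorkoutList: View {
    let workouts = [Workout(name: "Run", minutes: 32),
                    Workout(name: "Swim", minutes: 20)]

    var body: some View {
        // A two-line row layout declared inline; under WatchKit this
        // required storyboard row controllers and manual reload plumbing.
        List(workouts) { workout in
            VStack(alignment: .leading) {
                Text(workout.name).font(.headline)
                Text("\(workout.minutes) min").foregroundColor(.gray)
            }
        }
    }
}
```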
It was interesting that Apple spent relatively little stage time (either in the main keynotes or the sessions) discussing Catalyst, née Marzipan. Catalyst will have by far the biggest impact on the Mac ecosystem in the near term. Apple didn’t shy away from the fact that ported iPad apps will not be as high quality as apps made for the Mac first. Catalyst was positioned as a way to get your existing iOS apps available on the Mac with almost no work. I think that’s fine. They may not be as good as real AppKit or SwiftUI apps, but they sure beat apps built on Electron. There is going to be an explosion of Mac apps this autumn. The Mojave Marzipan quartet seems to be largely unchanged in macOS Catalina, but the new Podcasts app makes a concerted effort to be a better platform citizen. It is not merely a port of the iPad interface; it probably doesn’t even share that much code under the hood. I think Apple is hoping that developers’ Catalyst apps will catch on, which will then allow them to justify dedicating resources to code for macOS — whether written in AppKit, UIKit or SwiftUI.
I haven’t updated my MacBook to Catalina yet, but I probably will soon in order to get to work on bringing my suite of iOS apps to macOS. This also means I haven’t tried Sidecar for myself, but from everything I’ve seen it looks fantastic. Rendering the Touch Bar on the iPad display is a stroke of genius. The Bluetooth mesh Apple is building with the Find My network has a lot of potential and will clearly be bolstered by an Apple-branded tag device sometime soon. “Find My” as an app name is downright stupid though. Find a better name. macOS 10.15 says goodbye to the iTunes brand, but it seems like most of its code is preserved in the Music app. I think I would have preferred a larger departure, but the strip-down-what-already-worked plan means they haven’t upset the loyal hardcore iTunes fanbase — most features of iTunes remain. The benefits of splitting up iTunes are really seen in the other two apps. The TV app will offer a good platform for accessing the TV+ service on the desktop, and it brings support for Apple’s 4K HDR movies to the Mac for the first time. The dedicated Podcasts app is a huge leap over what Apple had provided on the Mac before; the podcasts section of iTunes sucked.
HomeKit Secure Video is a great feature with the potential to disrupt the business models of a lot of incumbent smart security camera manufacturers who have been depending on subscription services to boost revenue. Analysis of the video to generate significant event notifications, like spotting a person moving, is handled locally by the user’s HomeKit hubs, and the video is then stored in iCloud in an end-to-end encrypted fashion. The 200 GB iCloud tier enables recording for one camera and the 2 TB plan ups that to five cameras total. There is seemingly no option to pay for additional cameras even if you wanted to, which is a weird omission.
I was really surprised and pleased when Apple announced multiple-user support for tvOS; I never thought that was seriously on the cards. The home screen changes for Apple TV also look fantastic and console game controller support for iOS, tvOS and macOS really boosts the chances of Apple Arcade being a success. I also can’t wait to try out the multi-user features on the HomePod — alas there is still no beta process for that device.
The Mac Pro also looks like a winner for the market it addresses. Apple’s new cheesegrater highlights the bastardisation of the word ‘pro’ in consumer technology. It’s not pro as in iPad Pro or MacBook Pro; it’s pro as in pro-grade. It talks to a very small niche, but that niche appears to be well served. The Pro Display is aimed at that same audience of video and audio professionals. Apple is not targeting the developer market with the Mac Pro, but that’s fine; the iMac Pro is a great choice for developers wanting a desktop computer.
And yes, Apple announced an overpriced display stand in probably the most tactless way possible.
For the first time ever, the tvOS portion of the WWDC keynote featured major new enhancements to tvOS. That sounds like a stupid sentence but it’s true. We have been all too accustomed to the TV announcements being about upcoming content partnerships and not much else. This year is different.
Announcing multiple-user support for tvOS is a huge deal. I am really looking forward to Apple TV+, and the extent to which I will enjoy the service is inextricably linked to the TV app on Apple TV itself. The current status quo is one shared Up Next queue and one set of Watch Now recommendations for everyone who uses the Apple TV in the living room. tvOS 13 adds a Control Centre-esque sidebar that lets you quickly switch between user profiles. Now, everyone will be able to have their own independent bucket of TV shows, music and movies. There is an API for third-party apps to respond to changes to the current user profile too, so Netflix or Plex could match up the Apple TV’s system user profiles with the user accounts of their own services.
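A sketch of how an app might react to a profile switch. The `TVUserManager` symbol names below are my reading of the early tvOS 13 material and should be treated as assumptions, not the confirmed shipping API:

```swift
import TVServices

// Pure helper: the status we would surface while reloading for a profile.
func bannerText(forProfile identifier: String?) -> String {
    return "Switching to profile: \(identifier ?? "shared")"
}

// Assumed notification name; verify against the final TVServices headers.
let token = NotificationCenter.default.addObserver(
    forName: TVUserManager.currentUserIdentifierDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    // Re-fetch Up Next and recommendations for the newly active profile.
    print(bannerText(forProfile: TVUserManager().currentUserIdentifier))
}
```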
Karaoke-style synchronised lyrics are a fantastic addition to the Apple Music experience. The UI looks great on iOS and tvOS alike; it’s clearly a direct result of Apple’s acquisition of Shazam. tvOS 13 also transforms the home screen into a canvas for video previews based on the currently-highlighted app icon. You can see full-screen film trailers and music videos when hovering over the TV app and Music app respectively. Mirroring the UI vocabulary of the recently-updated TV app, users can swipe up to take the previews full-screen and reveal extra information. Developers can create these new rich Top Shelf carousels as well. An app can show multiple items at once in a carousel; each item can feature descriptive labels and an action button to open the app straight to that particular piece of content. tvOS arguably received a more significant home screen redesign than the iPad did.
Finally, Apple announced that the Apple TV (as well as iPhone and iPad) will support PlayStation and Xbox controllers as input methods. This is clearly driven by the impending launch of Apple Arcade and I don’t think we would have seen Apple make such a partnership if it wasn’t motivated by Services revenue opportunities. It’s sort of the gaming equivalent to putting the TV app on Samsung smart TVs. Even if you don’t subscribe to Arcade, you can now pair standard console controllers to your iPhone, iPad or Apple TV and enjoy playing existing App Store games that much more.
The table view cell highlighting behaviour is an example of how iOS 7’s design principles resulted in a design that is functional … but boring. There wasn’t anything whizz-bang about how it worked before. Just a few subtle choices to add that extra level of professionalism and sophistication.
The iOS 1-6 highlight was not simply a wash of blue, but a subtle gradient of blues shifting from light to dark. The variation in colour was just enough to suggest that the previously-dormant cell had been active and was popping off of the page.
Simultaneously, the cell content would invert. The black text and accessories, like the disclosure triangles, lit up as a white silhouette on the blue backdrop. The cell reacted as a unit, and the flash of vibrancy rewarded the user for interacting with the screen.
Most third-party developers — and some Apple apps — have already broken ranks with the official HIG and are on the path to more playful and lush designs. No one knows yet what form UIKit will take in iOS 13.
The standard set by iOS 7-12 is much more drab, almost clinical. It’s a flat, nondescript grey that seems like it was chosen specifically because it would not draw the eye. The grey is close enough to white that anything white would not have sufficient contrast, so the illumination effect is also no longer present. Cell content no longer reacts in tandem. The whole interaction is a lot more lifeless. Rather than the UI egging the user on to complete the tap action, today’s iOS drearily yawns and says “okay, if you must”.
This is just one of the laundry list of things that people railed against in 2013. Criticism died down as people acquiesced to what was given to them. Many critics, myself included, accepted the iOS 7 design as a rush job and thought that Apple would obviously catch their breath and ‘fix it’ over the next couple of OS versions. I don’t think anyone at the time expected us to still be stuck with these missteps six years later.
When you use Apple Pay at a brick-and-mortar store, the phone actually doesn’t do very much. The phone simply communicates a token which relates back to the card on file. It’s the merchant terminal that is connected to a server and sends off the transaction. This is why Apple Pay works on a watch or phone with no network connectivity.
With a standalone NFC tag, it’s the sticker that is offline and inert, and the phone does the heavy lifting of making the transaction and sending it to a server to be processed. I think what is happening here, essentially, is that scanning the sticker with your device triggers an Apple Pay on-the-web transaction in the background. For the user, it’s convenient not to have to visit a website, download an app or otherwise log in beforehand.
The same result could be achieved with a special-format QR code that encodes the payment data but that would add friction like needing to position the code in the camera frame — and would preclude the watch altogether.
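For completeness, the public mechanism for reading an inert tag from app code is Core NFC. Apple Pay stickers will presumably be handled by the system rather than by third-party apps, so this is just a sketch of the phone-does-the-work model:

```swift
import CoreNFC

// Reads NDEF messages from a passive tag; the tag itself stays offline and
// the phone performs any follow-up network transaction.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    func begin() {
        session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // The sticker only stores a payload; the phone would take it from
        // here, e.g. by kicking off a payment request against a server.
        for message in messages {
            for record in message.records {
                print("Payload: \(record.payload.count) bytes")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        print("Session ended: \(error.localizedDescription)")
    }
}
```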
What wasn’t clear is whether the NFC stickers are proprietary and will only work with Apple devices or if this is all based on some kind of industry standard. I assume it is the latter, though, as Jennifer Bailey showed the stickers bearing the generic contactless logo and not Apple Pay branding.
The share sheet has to be up there as one of the most used parts of the iOS system. As it is the only way for an app to expose all sharing services installed on the device to the user, pretty much every app shows a share sheet controller at some point in its flow. The component supports effectively no customisation points for its appearance or behaviour, so its presentation across every app looks the same. You see it a lot.
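For reference, presenting one takes a couple of lines, and the API surface backs up the point about customisation: excluding activity types is about the only lever. A sketch with a hypothetical helper:

```swift
import UIKit

// Build the standard share sheet for a URL. Apart from excluding a handful
// of activity types, the appearance cannot be customised, which is why the
// sheet looks identical in every app.
func makeShareSheet(for url: URL) -> UIActivityViewController {
    let sheet = UIActivityViewController(activityItems: [url],
                                         applicationActivities: nil)
    sheet.excludedActivityTypes = [.assignToContact]
    return sheet
}
```

An app would then call `present(makeShareSheet(for: url), animated: true)` from any view controller.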
I’ve never been particularly thrilled by how the share sheet looks or works. Since iOS 7, the panel has been split into three sections: AirDrop, Share and Action. AirDrop is handy but doesn’t justify a row to itself; really, it should be an option in the Share section. Nobody can adequately nail down whether an operation belongs in the Share bucket or the Action bucket. The concepts are too closely intertwined: sharing is an action, and to share is to act. Customers and developers don’t know and don’t care in equal measure. It’s a jumble, and people only learn where stuff is by rote, relying on muscle memory rather than any semblance of a sane ordering system.
The share sheet would be greatly improved if it was oriented around people first. Maybe have one section that is about sharing with others (Messages, Mail, Twitter) and then a separate section which is about sharing to yourself (Files, Notes). What if the share sheet showed a list of people rather than a list of apps? The Apple Clips app uses private APIs to do something custom along those lines, substituting the AirDrop row for a list of Messages contacts to share your videos with in a single tap.
What Gurman describes for iOS 13 sounds somewhat along these lines. It’d be cool if the recommendations would surface sharing shortcuts for people across all apps on the phone, but I wouldn’t be surprised if it was limited to Messages contacts only.
Some people following up on this story quickly picked up on the fact that the creator of PanelKit, Louis D’hauwe, now works at Apple, as if that is somehow meaningful. From my perspective, it’s more of a neat coincidence. D’hauwe joined the Xcode team in late 2018, a wholly separate department from the OS groups, and months after the ideation and initial implementation of the upcoming ‘pro’ iPad features had happened.
PanelKit, or draggable panes in general, is not a new invention. There’s plenty of prior art for Apple to draw upon; they don’t need to hire a specific person to do it. In fact, this is very much a touch translation of an AppKit concept; detachable popovers are used liberally throughout macOS and have existed since Lion. (Popovers originated on iOS with iPhone OS 3.2, and the AppKit equivalent came later with additional features.)
I love the whole idea and the addition of detachability adds a lot of power to an iPad UI without burning screen real estate or requiring vast diversions from the idiomatic touch/gesture vocabulary. I doubt you even need to add a single element of visible interface to support it. Just show a popover, drag on the titlebar, and boom. You’re in.
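On the Mac side, opting into detachment really is that small. A sketch with an illustrative controller name:

```swift
import AppKit

// Hypothetical inspector presented as a popover; returning true from
// popoverShouldDetach lets the user drag it off into a floating panel.
class InspectorController: NSViewController, NSPopoverDelegate {
    func show(relativeTo anchor: NSView) {
        let popover = NSPopover()
        popover.contentViewController = self
        popover.behavior = .semitransient
        popover.delegate = self
        popover.show(relativeTo: anchor.bounds, of: anchor, preferredEdge: .maxY)
    }

    func popoverShouldDetach(_ popover: NSPopover) -> Bool {
        return true
    }
}
```

A touch equivalent could work the same way: drag on the popover's titlebar and it tears off.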
A common issue with getting stuff done on the iPad is that you are always digging into and out of menus, dismissing menus only to open them again a minute later. The desktop solves this with multiple floating windows that can overlap and occlude windows not currently being used. A lot of those same benefits can be derived from floating panels. It’s worth restating that no single change will turn the iPad into a competent productivity machine that can rival a Mac. It’s an iterative process, every positive addition eating away at more of the outstanding (and ever-dwindling) pain points.
Of course, any third party app can roll their own detachable popovers today — PanelKit is the proof — but the reality is nobody does because it’s too much effort. Making it a system feature will coerce all developers into using it and make the overall platform experience more intuitive for users with the same learned behaviours carrying over across apps from different developers.
One aspect that is neither confirmed nor denied in Rambo's reporting is whether a detached panel can escape the bounds of the application. This is something that no third-party developer can do. An app today cannot draw outside of its window rectangle, whether it's full-screen, in Slide Over, or in a Split View. I would hope that Apple's approach does allow for that capability, particularly for Split View contexts. It means that a user could drag a panel over on top of the sibling app for safekeeping, without covering the active app at all. In my imagination of how this would work, these panels would still be transient, such that they go away when the app is backgrounded, but whilst an app is front-most, I think the detached panels should be able to be placed anywhere on screen.
I made a rough mockup to demonstrate my vision. In the example, I am busy writing an article in Pages and using Safari in Split View to look up things for reference. I detach the Pages formatting popover and place it to the side, temporarily covering Safari whilst I work on my document. These formatting controls are now always available to me with a tap, unlike the current status quo where it would require repeated taps on the toolbar button to show and hide the modal popover. When I'm done with it, I just flick the popover off the side of the screen and it automatically dismisses.
The Netflix edge is that you pay one monthly fee and you can watch everything. There are no restrictions on the content library. If you see something in the app, either overtly recommended or found through search, you can just watch it.
The Apple TV proposition is not that. Apple is reselling ‘channels’ from networks like HBO and Showtime, and essentially offering its own channel of original content in the form of Apple TV+. As the TV app synthesises content from many sources, looking for something to watch will involve skipping past a lot of recommendations that are not accessible to you.
Let’s say you pay up the as-yet-undisclosed monthly bill for Apple TV+. You can’t freely browse inside the TV app. The TV app is going to show you content from other sources that you don’t yet pay for. Content that you can’t watch unless you cough up. Apple TV+ will only include new Apple exclusive shows — about two dozen at launch. There’s no back catalogue to fall back to. The TV app will have to recommend other sources otherwise the app will feel like a barren wasteland. Apple is also financially incentivised to advertise Apple TV Channels to its users.
The latest iOS and tvOS betas include the new TV app and demonstrate exactly this. Scroll around and you quickly run into banners for Showtime with one-click buttons to sign up and subscribe. It is contradictory to me that Apple designed the TV app in this way, a pseudo-advertising platform, at a time when many people are switching to streaming services because they want to get away from ads and commercial breaks. This factor alone will limit the enjoyment of the Apple TV+ service and impair its adoption.
I believe Apple TV+ will foster talent and debut many incredible shows, but I don’t like the idea of navigating past buy buttons when I just want to watch TV. As it stands, Apple will not provide that experience. I would like to be able to tell the TV app to only show me stuff I am subscribed to, but I am not convinced that Apple will ever include an option like that as it would hurt the sales of Apple TV Channels.
The AirPower mat is the perfect example of how hard it can be to make something simple. You can get very close easily; there are obviously plenty of multi-device chargers on the market. But doing it in the way Apple envisioned is clearly so difficult that Apple itself has given up. It is saddening because I had bought into the uncompromising wireless charging vision the moment that Schiller presented it alongside the iPhone X. The dream is now dead. Apple is one of very few companies that cares enough about the finer details of this experience and has the money to fund the development effort. Realistically, no one else is going to try.
The AirPower mat was not going to be financially significant for the company, but it heralded real advantages of wireless charging and would have been a fantastic complementary product for Apple customers who constantly juggle charging their iPhones, Watches and AirPods on their nightstand. This is the reason why Apple showed it at the September 2017 event. It was a true step into the future. The iPhone X was the 'future' phone. One backed up the messaging of the other.
This justification makes it no less embarrassing for them. AirPower was important enough to share stage time with the 2017 iPhone lineup, so it's a big miss however you slice it. It's also a first in the company's modern history. I can't think of another product that Apple announced and simply never, ever, shipped. The 3 GHz Power Mac G5 is the closest parallel I can think of, and that was merely a failure to deliver a single configuration of a product line. In 2003, Apple released Power Mac G5s and promised that they would debut 3 GHz CPU options within a year. The 3 GHz Power Mac G5 never materialised, highlighting limitations in the PowerPC architecture, and Apple transitioned the Mac to run on Intel CPUs not long thereafter.
I’m sad that the product will not exist and I’m also not thrilled with how Apple handled the cancellation. When Apple finally decided to release the AirPods wireless charging case earlier this month, which carried a hefty premium over the normal second-generation AirPods, they clearly knew that they had given up on the mat. They decided to wait until after the rush of AirPods orders had gone through to announce AirPower’s fate. Therefore, plenty of people bought the wireless charging model with the AirPower mat use case in mind, none the wiser to Apple’s internal plans. I am one of those buyers. Apple made more money by making its announcements public in that order. Even if the total of those purchases is small, it is a bit sketchy. I know I regret paying the extra £40 for my new AirPods.
I don’t believe that Apple planned to release three product updates in three successive days in March. At the very least, the new AirPods seem like something that was meant to be out for Christmas. The iMac spec bump might have been intended to be a 2018 thing too. However, the company wanted to draw a line in the sand and make its Monday event solely about software and services.
So, Apple made the best of a bad situation and engendered excitement for relatively-minor announcements in a way that I haven’t seen them do in a long time. The Tim Cook sketch memes on Twitter were great and the community was actively excited to see what was coming the day after. The hype train came to a halt when Apple ended the streak on Thursday, but as they never promised to have something new for every day of the week, no one really came away feeling disappointed.
I mean, we all know how interesting credit cards are.
It also acts as a nice backstop for the services bonanza taking place on Monday. Apple has already paid some attention to its hardware user base, so if you watch the spring event and are completely uninterested in news, games, TV or the credit card project, at least you have already been thrown a bone the week before.
The shared motif of all three product category updates was the feeling that these teams were told to do whatever update they could without changing the chassis design. For the iPad Air and Mini, this is passable. The bezels are big, but it is reasonable to keep the new-generation design as a ‘Pro’ differentiator for a year. The 2018 iPad Pro models only just came out, after all. The Air represents good value for money and the Mini carves its own niche (à la Mac mini). The bifurcation of the Apple Pencil accessory is annoying and inelegant. It doesn’t feel like asking too much for Apple to flatten the edges and add a magnetic charger into these models, but alas they did not do this.
The iMac update is so incremental it’s almost insulting. The only change to the default configurations is the inclusion of eighth- and ninth-generation Intel CPUs. The Radeon Pro Vega GPU option is a $750 upgrade. I wasn’t expecting a chassis redesign (although one is very much warranted, almost a decade deep with the same design), but the lack of an SSD-only, T2-equipped iMac configuration was a disappointing shock.
The new AirPods are cool. I’ve ordered a pair, primarily so I have the wireless charging case at hand for whenever the AirPower mat arrives. My number one complaint with the current AirPods is that, sometimes, it can take like 20 seconds to switch from my iPhone to my iPad. When it looks like it is behaving normally, this action takes about three-to-five seconds. Apple markets the H1 chip as halving the average connection time, and I hope the reliability is improved too.
The new AirPods are described in documentation as ‘second-generation’ but these are more like AirPods 1.2 plus the wireless charging case. Nothing wrong with that but I’d expect more significant change next year. The pricing is a bit awkward. $159 for the new AirPods with the standard charging case, or $199 for the new AirPods with the wireless charging case. I saw plenty of people who were confused by this and thought it was an Apple TV-esque stunt where the old model was continuing to be sold at the same price point. You pay an extra $40 just for the wireless charging function. That is a pretty steep premium for what you currently get in exchange. I predict, maybe next year, the wireless charging case will become standard across the line and the $199 model will be more differentiated, perhaps using features like noise cancellation as the up-sell. The $199 price point isn’t going to disappear.
Shortcuts is an overloaded term in the Apple world. What I am referring to in this article is the Siri Shortcuts system, not the Shortcuts app that allows users to make their own workflows. You can make Siri Shortcuts from shortcuts made in the Shortcuts app.
What I am focusing on is the mechanism by which apps, first or third-party, can expose shortcut actions to iOS that can then be ‘added into Siri’ and activated using your voice. The shortcuts can be presented to the user by each individual app or found in a list of recommendations in the Siri pane of the Settings app.
This means Siri can now be smarter by drawing on the capabilities of many more apps. You can order coffee. Control third-party audio apps like Overcast or Pandora. Plan travel itineraries with Kayak. All with your voice talking to your intelligent personal assistant.
Except that’s not really true. That is how Apple likes to market the feature, but it’s a twisted form of reality. Shortcuts are not making Siri smarter; in fact, they are dumber than pretty much anything Siri has done to date. Shortcuts put the burden on the user to do the legwork of synthesising data sources and integrating the apps into the voice service.
Shortcuts require registration and administration to do anything at all with Siri. The user has to pre-emptively search out every command available in a certain app and then add each in turn to Siri. Registration requires the user to think up the phrase they want to use to trigger the command on the spot. Siri can then trigger these actions when that same phrase is said back to it at a later date.
There is no intelligence here. Siri transcribes the user’s voice and looks for an exact text match of that phrase in the database of voice shortcut phrases that the user has generated off their own bat. If a match is found, it proceeds. Otherwise, failure.
There is no leeway in what can be asked. There is no flexibility in how a command can be formulated on the fly. Any sequence of words other than what was inputted when the shortcut was first registered is not understood. That’s the point: there is no understanding. The Siri interpreter has no understanding of semantics or meaning when you are interacting with an app shortcut. It is a dictionary text lookup and nothing more.
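To make the mechanism concrete, here is a minimal Swift sketch of the kind of exact-match lookup described above. All of the names here (`VoiceShortcutStore`, `register`, `match`) are hypothetical, purely for illustration; this is my reading of the behaviour, not Apple's actual implementation.

```swift
import Foundation

// Illustrative model of the Shortcuts matching behaviour:
// a flat dictionary from a registered phrase to a shortcut identifier.
struct VoiceShortcutStore {
    private var shortcuts: [String: String] = [:]

    // The user must pre-register every phrase themselves.
    mutating func register(phrase: String, shortcutIdentifier: String) {
        shortcuts[normalise(phrase)] = shortcutIdentifier
    }

    // Succeeds only on an exact match of the transcription.
    // No synonyms, no paraphrasing, no flexible grammar.
    func match(transcription: String) -> String? {
        return shortcuts[normalise(transcription)]
    }

    private func normalise(_ s: String) -> String {
        return s.lowercased().trimmingCharacters(in: .whitespacesAndNewlines)
    }
}

var store = VoiceShortcutStore()
store.register(phrase: "Coffee time", shortcutIdentifier: "order-usual-coffee")
store.match(transcription: "Coffee time")     // matches
store.match(transcription: "Order my coffee") // nil — same intent, no match
```

The second lookup failing despite expressing the same intent is exactly the failure mode described: a dictionary lookup, nothing more.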
It is disappointing that Apple is leaning so heavily on shortcuts as a mainstream way for customers to get more from Siri. It flies in the face of how you want a voice assistant to work and behaves differently from every other type of Siri interaction. When you ask Siri for the weather, you can say ‘What’s the weather?’ or ‘What’s the weather on Friday?’ or ‘tell me the forecast’ or ‘do I need to wear sunglasses?’ or just ‘weather’. The whole point is the user does not have to revise a set list of triggers. Apple has made entire ad campaigns to this effect, promoting the flexibility. Forget custom variables, the Shortcuts system cannot support multiple ways of saying the same thing. A truly good voice assistant does not require the user to remember something.
This philosophy is exactly what drove Apple to design the SiriKit API in the way they did. SiriKit abstracts away the parsing and semantics of a snippet of speech. It is the responsibility of Apple engineering to enable the understanding, across English locales and foreign languages. Under SiriKit, the third-party apps only supply the data for the response. All of the work Apple puts in to improve Siri’s understanding of commands automatically benefits every SiriKit app, and every SiriKit app of the same domain should respond consistently to the same commands. The downside to SiriKit is that it can only work with a subset of applications: those which Apple has done the legwork to create a domain for. So, a user has to know which of the apps on their phone works with Siri, but they don’t have to register a corpus of commands and can interact with the app through Siri with an order of magnitude more freedom.
By relying on the Shortcuts system so heavily, Apple risks breeding even more hatred for Siri than what already exists in the community, as people inevitably forget the exact words they used when they made the shortcut four months ago, and blame it on Siri being dumb instead. If you want to call Siri dumb, then Shortcuts is primordial in comparison.
I am worried that Siri Shortcuts has won so much favour in Apple’s product marketing groups that engineering resources for SiriKit have been sidelined. Siri Shortcuts certainly requires less effort on the part of Apple to maintain. It’s the lazy way out for them versus SiriKit, which necessitates continuous advancement and development of new domains. I care about the end user experience, and SiriKit provides superior results. I want them to invest in supporting more domains. At most, Apple should consider Siri Shortcuts a companion feature to Siri, not a substitute for SiriKit.
The distinction between iPhone and iPad support sticks out. In the age of Split View, iPad apps have to be ready to adapt to skinny-column, square, and full-screen layouts at any time. They are programmed to be continuously resizable. Even fixed-size iPhone apps on macOS would be alright; they’d just be small windows amongst your other windows.
In an attempt to decipher the Chinese whispers here, I think what this actually means is that this year’s Marzipan system will not let you shrink windows into single-column designs. At a technical level, apps running under Marzipan on macOS 10.15 will not transition to compact width size classes. In fact, this is exactly how News, Stocks, Voice Memos and Home work on Mojave. Try it right now. Although the windows are resizable, they have minimum size constraints in both width and height. It is not possible to compress the interface enough to collapse the sidebars.
I don’t know why they chose to enforce this restriction as many actual Mac apps, like Finder, do happily morph into skinny column layouts when their windows are made small, and I can’t think up a technical reason that would prevent it from being implemented.
I love Kuo. He can go silent for three months and then, out of the blue, drop a wide-reaching report about seemingly every major new Apple hardware product coming this year. On a Sunday.
Kuo believes there will be a new MacBook Pro update with a 16 to 16.5-inch display. This is apparently an “all-new design” which, in Kuo parlance, means a substantial change to the chassis. This isn’t going to be the same MacBook Pro chassis we know today made to accommodate a 16-inch display. The obvious direction is to make the bezels smaller. Kuo provides no more details, but let’s hope the keyboard is “all-new” too.
Interestingly, a 16-inch panel implies that the laptop will almost certainly get larger. Even if you removed 100% of the black frame surrounding the current MacBook Pro screen, you would only just reach a 16-inch diagonal. In reality, there is of course going to be some minimal expanse of bezel, and the report has enough wiggle room that the screen could be up to 16.5 inches, so the dimensions simply have to be getting a little longer and wider. Maybe this will be a MacBook Pro that does not tout thinner, lighter and smaller as one of its flagship improvements. That’s significant in itself.
The other morsel in the Mac category is details about Apple’s upcoming external display, expected to ship alongside the Mac Pro. Kuo describes it as a 31.6-inch ‘6k3k’ high-resolution panel with a backlight similar to Mini LED. Unlike typical LCD monitors, Mini LED backlights enable portions of the display to be turned on and off independently. This results in a similar effect to OLED, with high-contrast black levels. Mini LED lighting is also thinner and more power-efficient than standard LCD backlight components. All of these features firmly differentiate the Apple display from the LG UltraFine range. It certainly fits the bill of Apple’s promises that it would be a ‘high-end pro’ display. Also, expect a big bill when you buy it.
Kuo says that Apple will offer an Apple Watch ceramic body option after an Edition hiatus with the Series 4. He reiterates that AirPower, and the accompanying wireless charging AirPods case, are shipping in the first half of the year, so that’s good news. The 9.7-inch iPad is being bumped up to 10.2 inches, but it’s not clear if the screen bezel is shrinking with that product as Apple still has to contend with the Touch ID home button. The iPad Pro, iPad mini and iPod touch are getting the short end of the product marketing stick, with mere processor upgrades anticipated.
Regarding iPhones, the report says that the iPhones will have a new ‘frosted glass’ casing finish. This sounds like a similar material to the Pixel 3’s matte glass back. I haven’t actually handled a Pixel 3 in person, but the matte glass is cool to look at in photos. Kuo says the triple camera setup will mean the addition of an ultra-wide lens. A fair few Android phones offer ultra-wide cameras currently, but I don’t know how useful they are in the real world. I feel like it’s more niche than the zoom lens. Kuo calls out ‘larger batteries’ as a new feature. The XR embarrasses the iPhone XS (non-Max) so much when it comes to battery life it is actually insane. Hopefully, the 11 can bring those models closer together. The bilateral charging feature benefits from a bigger internal battery too, as you can use any excess juice to wirelessly charge up your AirPods in a pinch.
The other major thing Kuo highlights is support for ‘Ultra-Wide Band’ indoor positioning. UWB involves venues setting up little antennas in the corners of rooms, which the phone can talk to and locate itself with a very high degree of accuracy (often better than a 1 metre radius). Apple has been slowly expanding its indoor mapping features in Apple Maps, and I bet that this is the year they push it big time. Here comes augmented reality Maps navigation.
This is annoying, but it is not unusual. App Review loves to change its mind, and never more so than with the renewing subscription requirements. On this occasion, I had ‘nothing’ to do whilst I was waiting in the metaphorical App Store Connect line. It was a weekend.
For some reason, I channeled my frustration from the repeated delays into making a Mac app. I booted up a new Xcode project and tried to learn AppKit on the spot. I had approximately zero prior knowledge about Mac development, and it seemed like a terrible time to start given the turbulent state of macOS app frameworks, but I had nothing better to do.
AppKit is funky. I have now made a Mac app, but I admit I don’t know what I’m doing. When you get stuck with AppKit, you really do get stuck. Googling doesn’t really help because there simply aren’t many online posts and forum threads for Mac-specific topics. I basically followed UIKit idioms whenever the most obvious path was not apparent.
Some stuff came very naturally. Other stuff seemed inane, or at least I don’t have the context or breadth to understand the choices that have been made.
I thought it was really cool that behaviours like alternate menu items in the menubar are built-in automatic features of the AppKit framework. (This is the thing where you can hold down a modifier key, like Option, whilst in a menu and see how some of the items dynamically update to show an alternate-but-related action that depends on the modifier key being held.) As a developer, you just add two menu items with concomitant keyboard shortcuts, set a flag, and it shows and hides the secondary one automatically. In general, AppKit has a much larger library of controls and interface elements than UIKit such that the need for coding custom view subclasses is greatly reduced. Also, macOS standard elements look prettier than the stock iOS counterparts, so that helps a lot too.
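As a sketch of that "add two menu items, set a flag" flow: the `isAlternate` and `keyEquivalentModifierMask` properties are real AppKit API, but the menu titles and the reuse of `performClose(_:)` here are just illustrative placeholders, not taken from any particular app.

```swift
import AppKit

// Two items share the key equivalent "w" but differ in modifier flags.
let close = NSMenuItem(title: "Close Tab",
                       action: #selector(NSWindow.performClose(_:)),
                       keyEquivalent: "w")
close.keyEquivalentModifierMask = [.command]

let closeAll = NSMenuItem(title: "Close All Tabs",
                          action: #selector(NSWindow.performClose(_:)),
                          keyEquivalent: "w")
closeAll.keyEquivalentModifierMask = [.command, .option]
closeAll.isAlternate = true  // hidden until Option is held with the menu open

let fileMenu = NSMenu(title: "File")
fileMenu.addItem(close)
fileMenu.addItem(closeAll)   // AppKit swaps the two automatically
```

With the menu open, AppKit shows "Close Tab" by default and swaps in "Close All Tabs" whilst Option is held, with no extra code on the developer's part.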
In the ‘stupid stuff’ camp, it is ridiculous that you can’t zoom in or out of a Mac storyboard in Interface Builder. I was flummoxed when it came to making an action that could be executed with a keyboard shortcut without having an activatable control on the screen. I also struggled to work out how I was supposed to organise my Mac project. How do I funnel menu items from a storyboard into outlets for my window controller? The solution I found is to make outlets on the AppDelegate that I then expose as public properties. Yuck.
Despite the bumpy road, my ten years of familiarity with iOS app development meant that I could complete the work. I have made a single-window utility Mac app, and doing it was really fun. I originally made it as a personal proof of concept, but it turned out so well that I polished it up into a shipping app.
What is that app? Tabs to Links. It detects open windows and tabs in Safari and lets you export those tabs as a list of links. In one step, you can make a bulleted list in Notes or just send the links in an email. There’s also an integrated suffix algorithm, which allows the app to strip redundant text from the links. This comes up a lot as many websites include their name and/or slogan in the page title. Just check the ‘Trim titles’ box and the app finds such repetition and will not include it when sharing. Click the Share Links button to see all compatible sharing services. If Tabs to Links is frontmost, you can press Command+C to instantly copy to the clipboard, ready for pasting into any Mac app.
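To give a flavour of what 'Trim titles' does, here is a minimal Swift sketch of one way to find and strip a suffix shared by every tab title. This is my own guess at an approach, assuming a simple longest-common-suffix scan; the function name `trimmedTitles` is hypothetical and the app's actual algorithm may differ.

```swift
import Foundation

// Find the longest suffix shared by every title (e.g. " — Example Site")
// and strip it, so only the distinctive part of each title remains.
func trimmedTitles(_ titles: [String]) -> [String] {
    guard let first = titles.first, titles.count > 1 else { return titles }
    var suffix = ""
    // Grow the candidate suffix from the end of the first title.
    for character in first.reversed() {
        let candidate = String(character) + suffix
        // Every title must end with the candidate, and stripping it
        // must never delete a title entirely.
        if titles.allSatisfy({ $0.hasSuffix(candidate) && $0.count > candidate.count }) {
            suffix = candidate
        } else {
            break
        }
    }
    guard !suffix.isEmpty else { return titles }
    return titles.map { String($0.dropLast(suffix.count)) }
}

// ["Pricing — Example Site", "Blog — Example Site"] → ["Pricing", "Blog"]
```

The guard against stripping a whole title matters: without it, a tab whose title is exactly the shared suffix would end up empty in the shared list.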
You can get it now from the Mac App Store. Watch this 15-second demo video I made using my incredible (read: terrible) iMovie and QuickTime skills. It works with Safari only; I could make it work for Chrome too if there is demand.
Here’s an amusing aside. Years ago, I made this feature as a tabs-to-links Safari extension. It would take the open tabs and spit out an HTML list. Zac and I have used it for ages to create the show notes for the Happy Hour podcast.
Sod’s law is such that, upon completing the native Mac version, Safari Technology Preview added an API for app extensions to retrieve data for all windows and tabs. This means Tabs to Links could live as an app extension after all. In any case, the Mac app is way nicer to use. AppKit enables more features, a prettier UI, and access to system sharing services. Down the road, I could integrate a Safari extension into Tabs to Links as an alternative option.