You don’t have to take my word for it that AirPlay needs some love: just go searching for newly-launched AirPlay speaker systems. The only company we could find showing one at CES this year was Moshi (watch out for a review by Zac Hall shortly).
Whether it’s manufacturers deciding that AirPlay is too unreliable to invest in, or consumers unwilling to pay the price premium for a protocol that doesn’t give them everything they want, clearly there’s an issue. Apple needs to fix the reliability issues that appear to stem from flaws in the protocol itself, license it to manufacturers at a reasonable price, and then give it enough PR that mass-market consumers get to know about it.
The AirPlay audio streaming technology is based on old protocols reaching back into the AirTunes days … it needs a revamp. The official Apple answer is to buy your own speakers and hook up an AirPort Express, which is both clunky and prohibitively expensive.
The success of portable Bluetooth speakers says to me that there is a consumer desire for wireless music streaming in the home. I think Apple should re-engineer AirPlay as a WiFi protocol that requires an internet connection. This means it could also sync up with HomeKit and be controlled remotely, outside of the home network. Essentially, each AirPlay receiver would stream music from an iCloud / Apple Music server. This means individual devices do not have to manage the streaming work locally. A friend could stream a playlist at your house and leave with his iPhone in his pocket without interrupting the music. It’s the Chromecast model.
Making things based around an internet model simplifies so much. Apple could do some really intelligent things, like automatically silence speakers when you leave home. Couple a reworked protocol with some pretty speakers and you’ve got something good.
With this Apple Store refresh next week, Apple will be pulling these aging iPad 2s from most of its stores. Instead of Smart Signs, Apple will begin pre-loading price information and product details onto display iPhones, iPads, and Macs themselves. Apple will use the new space to install more devices that potential customers can test on the show floor. The recently launched iPad mini displays for the Apple Watch will not be going away. Apple will also showcase iPhones with redesigned white display docks.
Some larger Apple Stores will retain a few Smart Signs as Apple conducts this transition. Sources say that, beyond the desire to feature more products on store tables and to replace the aging iPad 2s, Apple is removing Smart Signs because it found that the setup is confusing for some customers. Retail employees note that some customers were unaware that the Smart Signs were simply informational items and became confused when they could not fully use the device as a demo iPad.
The Smart Signs were a concept that sounded a lot cooler than how they were realised in stores. The information they provided wasn’t very smart: little more than a static microsite of product specs. In contrast, the iPads that are attached to Apple Watch display units are truly smart, with the second-display information keeping in sync with the current views on the Watch. These plaque displays are not going away.
The Smart Signs were also flawed by their physical dimensions: iPads are not that small. They are pretty big. It made the Apple Store look cluttered and tied up space that could instead be used for more real demo devices. Having a Smart Sign iPad accompanying an actual iPad was comical. Per Gurman’s report, Apple’s new solution is boring but in the end achieves the same result without the downsides, all things considered.
The biggest change to this new design is the decision to remove the Store link from the main navigation bar at the top of the screen. The bar had become crowded in recent months with the addition of the Watch, but it’s an interesting decision to remove that store rather than one of the other products.
Instead, the ability to shop for Apple gear has been integrated into each product page. As before, a “buy now” button appears on the page for each model. A shopping bag button has been added beside the search button to allow customers to quickly manage their bag, favorite items, Apple Store account, and more.
Rather embarrassingly, I had never paid much attention to the separation between Apple’s website and its online store. I had always overlooked the amusing inelegance of a fragmented website for a company that prides itself on integration and simplicity.
Anyway, with no fanfare, Apple updated its site on Thursday to bring both parts together into one site. When you go to buy something now, you don’t get carted off into the ‘store’ subdomain to actually make a purchase. Everything happens from the same combined interface which means your basket is now visible from any page too, in the header. Neat and tidy.
On Twitter I quipped about what happens to the website on keynote day. How does the store go down when the whole site is the store? There’s a part of me that thinks this transition means the iconic ritual of ‘We’ll be back soon’ on new product day will never happen again. I can see how Ahrendts would think it was dumb and can it.
If you know you’re in my Skype contacts list feel free to park in front of my house with your Windows 10 PC. But you’ll have to bring your own Wi-Fi, because Wi-Fi Sense won’t let you connect to my network. That option is off by default for every network, as you can see by the Not Shared status message under each one.
And you have to very consciously enable sharing for a network. It’s not something you’ll do by accident.
For those not clued in, WiFi Sense is a feature in Windows 10 that shares WiFi network passwords with friends if you explicitly request to share the password of a particular network. WiFi Sense has been part of Windows Phone for a while but, naturally, nobody cared.
In his piece, Bott correctly points out that sharing does not happen without permission, contrary to what some tech sites reported when this blew up. However, it does have a security hole.
The person who gives permission does not have to be the WiFi network owner. Anyone who is (or has been) connected to the network can enable sharing. This is the opening for abuse, as control of the setting is heavily diluted amongst clients. It might not be obvious why this is a problem.
A mostly harmless example is a coffee shop that gives out the WiFi password when you buy something. With WiFi Sense, it is incredibly easy for someone to accidentally enable sharing, and then all Windows users can free-ride on the internet without paying the cafe a cent. Similarly, in a residential setting: a hypothetical friend comes round to use my WiFi on his Windows 10 laptop. My entire block could now access my internet without ever talking to me, and without me even being told it was happening. It’s not out of the question that this then blows through my monthly data cap and I get hit with a costly bill. I’m sure you can extrapolate to find some more criminal examples.
There are limits to who receives the shared password, usually limited to the person’s Skype or Facebook friends, so it’s not like the whole world can suddenly join in. Still though, it’s the principle. It’s just weird that the network owner does not get ultimate control over this.
Jeremy Clarkson, Richard Hammond and James May are reuniting to create an all-new car show, exclusively for Amazon Prime. The show will be produced by the trio’s long time executive producer Andy Wilman. On working with Amazon, Jeremy Clarkson said “I feel like I’ve climbed out of a bi-plane and into a spaceship.” The first show will go into production shortly and arrive exclusively on Amazon Prime in 2016.
Not a surprise that Clarkson, Hammond and May made a deal with an online network for a new series … but signing with Amazon is an unexpected twist. I really thought Netflix would want Top Gear as a flagship new media programme. I’m really interested to see whether this turns out to be good or not; I would guess Amazon can give them an even bigger budget than the BBC could. Hopefully, it’ll be the same show I know and love to watch, albeit dropping the Top Gear brand.
So in the coming months, a Google Account will be all you’ll need to share content, communicate with contacts, create a YouTube channel and more, all across Google. YouTube will be one of the first products to make this change, and you can learn more on their blog. As always, your underlying Google Account won’t be searchable or followable, unlike public Google+ profiles. And for people who already created Google+ profiles but don’t plan to use Google+ itself, we’ll offer better options for managing and removing those public profiles.
You’ll see these changes roll out in stages over several months. While they won’t happen overnight, they’re right for Google’s users—both the people who are on Google+ every single day, and the people who aren’t.
The only way I can read this is as an admission that Google+ didn’t take off in the way Google wanted it to, despite being mandatory for using other popular Google products, namely YouTube. Google is now focusing Plus on the small niche discussion communities where it is doing well (which are good) and leaving things like YouTube as independent, separate products.
Although the company has been discreetly signalling this transition for a while, like Google Photos being positioned as a standalone offering, this blog post is confirmation that the dream of Plus as the persistent glue that connects your Google life together is indeed over.
Today’s redesign moves half of the video player – specifically the controls – from Flash to HTML5 and Javascript. The video itself is still in Flash underneath the controls. However, this is an important step to releasing the much-anticipated full HTML5 player.
You’ll begin to see the new player on channel pages first. As previously mentioned, this is a gradual roll out. If you are not part of our initial pool of users, please be patient as we release the redesigned player at a steady pace.
This piecemeal milestone has no real-world advantage but at least signifies the start of a transition. What’s weird, though, is that Twitch will serve users HTML5 video today. Just visit a streamer’s page in Safari on iOS or Mac (with no Flash installed) and it shows a working <video> tag. Some of the nice Twitch-branded UI is missing, but the actual crucial activity, the playback of the stream, works flawlessly. It’s been like this for as long as I can remember.
These are so cool on iOS 9 (and watchOS 2). Rather than being prerecorded videos, the trophies are rendered in realtime as 3D objects so you can flick and swivel them around with your finger. The material lighting is fantastic and it’s super responsive — even on the Watch. Many of the achievements are also engraved with your name to add that splash of personalisation.
It also makes me more motivated to unlock all of the other possible achievements just so I can see what they look like as fully 3D objects. I’ve seen some argue that these achievements are inconsistent with the flat design of the operating system. Sometimes though, you have to break the rules. This is a nice dash of skeuomorphism to add some real flair to the experience. Imagine that these achievements were merely flat 2D drawings. It wouldn’t be as fun nor as endearing.
Apple® today introduced the best iPod touch® yet and unveiled a new lineup of colors for all iPod® models, including space gray, silver, gold, pink and blue. The ultra-portable iPod touch features a new 8 megapixel iSight® camera for beautiful photos, an improved FaceTime® HD camera for even better selfies, the Apple-designed A8 chip with 10 times faster graphics performance for a more immersive gaming experience, and even better fitness tracking with the M8 motion coprocessor.
The iPods are the runt of the litter nowadays. That being said, today’s iPod touch update uses up-to-date internals like an A8 processor and an 8 megapixel camera. Whilst it still isn’t getting the design attention from Apple it probably should (the body and case are unchanged), this iPod touch refresh is substantial enough that you don’t feel bad about buying one anymore. It is about as fast as an iPhone 6 and will be supported by the app ecosystem for many years.
I can’t praise Apple’s efforts on the Nano and Shuffle, though. Same internals, same software, different coloured cases. The Nano is a booby trap in Apple’s lineup primed for unwitting parents to buy as ‘nice’ Christmas gifts. The 16 GB Nano is overpriced for what it is, sold for $149 a pop. For $50 more, you can get a brand new 16 GB iPod touch. Seriously.
At least the Shuffle holds a distinct place in the range as a clip-on sports MP3 player. The same can’t be said for the iPod nano. The Nano is a mediocre imitation of the Touch in every way, yet priced almost as high as its more powerful sibling.
In Fall of the Designer Part III, I noted how Twitter apps were becoming visually homogenized to the point that they were virtually indistinguishable. I could not have imagined it could go further. Following a recent update by Twitter for their native iOS client, it seems all three apps might as well have been designed by the same person.
Schiff doesn’t say this explicitly but the implication is that flat design is so constricting that the only possible outcome is homogeneity in application design. Certainly, the era of flat design makes it really easy to be lazy. You can make a ‘flat’ app with very little work that gels with modern appearance expectations and looks good, relying on copious whitespace and large typography.
I hate how this has transpired. It is true that the current ecosystem includes a lot of apps that look similar. The important distinction is that it doesn’t have to be this way. The flat world is not constricting. You can make a wide variety of visual styles and colour schemes work. In the skeuomorphic world, it was incredibly easy to create a distinct visual style — just pick a different texture. It’s harder to do original, refreshing aesthetics in a flat world (because the tendency to use white is so great) but it is possible. Developers need to work harder to achieve it.
If you want hard examples, the first place to look is the system apps themselves. I think Apple does a pretty good job about offering diverse user experiences in its own apps. Camera uses yellow and black reels, Messages has chat callouts with vivid gradients, Game Center has weird bubbly things and Weather has a vast array of rich backgrounds. Apple isn’t perfect by any means — apps like Videos and Phone are quite lacking in differentiation.
One of the biggest offenders for blandness post-iOS 7 is apps that are basically white scrolling timelines with navigation bars where the only real self-expression comes through a single tint color for interactive elements. In a past life, developers could employ an arsenal of ornamentation (shadowing, gradients) to make these kind of apps stand out.
The Apple Beta Software Program lets you try pre-release software and provide feedback to help us make it even better. In this guide you will find information on the latest beta releases and how to get started. Check back regularly for updates.
The public beta program is still strange to me. The seeds as they stand are still quite buggy but in obvious self-evident ways, ways that don’t need hundreds of thousands of ordinary users to test the OS and report back. If Public Beta members do report bugs, I am sceptical that they bring up new issues that Apple’s internal QA team or the developer community hasn’t already found at this ‘early’ stage in the cycle (seed 3).
Also, I’d love to see data on how many people running the public betas even bother to use the Feedback Assistant at all. I think most people that join do so just to be part of the ‘cool’ club running prerelease operating systems. They don’t have the motivation nor technical knowledge to post useful bug reports. Most developers don’t either.
Running a public beta program is a huge endeavour, especially when you consider the additional costs on Apple Support when something goes wrong, for what I see as little upside.
Today, I am really proud to announce the new Bingo Machine for iOS 7. Well, not exactly. It’s been a long time coming. I began work on this version of the app in June 2013, after WWDC. I hated the new look I had designed; I wasn’t sure what I liked in the new world of flat design, so it took time for me to be happy with something. I also had other stuff to do, and other things got in the way.
Time eventually rolled into mid 2014. At this point, I decided I might as well skip iOS 7 entirely and wait for iOS 8. Crucially, Apple introduced live blurring for third party apps in iOS 8 … which then meant I could rework the design of Bingo Machine some more. I settled around September, but again delayed shipping as I prioritised some client projects and getting a Writing Aid widget out the door.
Naturally, Apple released WatchKit in November, so I messed about with that, putting off the finishing touches to Bingo Machine again. I ended up dropping the Watch app, though. There were also some holdups with getting assets together. However, it’s finally done.
Bingo Machine 3 finally escapes the world of fake gloss and textured linen with a design that centres on black and white colours without feeling sterile. I have kept a level of realism to the bingo balls, although still drastically simplified over their previous designs. I use a physics engine (Box2D implementation, as UIKit Dynamics does not support circular bodies) as a substitute for the visual realism. It also feels more native to modern iOS. Balls fall freely into a full-width canister. I experimented with accelerometer-controlled movement but it was too distracting to keep.
The new appearance also drastically simplifies the mental model of the app. The application now has just three screens. The canister, the overview grid and the settings toggles. The latter two of these are presented as light modal panels, so users never lose context of the current game.
Thanks to live blur backdrops, activating the grid view to review called balls no longer feels like initiating a separate state. It feels like a transient display that can be quickly toggled on or off by tapping the bottom-left button. I love these toolbar buttons by the way: they have transparent inscriptions and inverted button states.
The bottom-right button represents the application settings. The icon is dynamically generated, though, so it can double up as a numerical status as well. Most of the time it shows number of balls left but will also act as a countdown when the calling timer is enabled.
I simplified the settings page itself down to a simple series of segmented controls. I love the symmetry. Most of the time the user will see three sections, customising game type, the timer and speech mode. If an external display is connected, a TV-Out option is dynamically appended below.
Whilst version 3 is mainly about a much-needed redesign, there are a few new features. I’ve re-recorded the spoken human catchphrases (and added a female speaker) as well as offering a synthesised voice which can service many more languages adequately. I am also experimenting with localisation of the UI into French and Spanish.
As Bingo Machine is often used as a learning tool, I have also added a way to change the language of the catchphrases independent of the iOS system language. It’s currently exposed with a long-press on the calling button and pretty hidden. I may have to change this in an update if I get a lot of support emails. I’m banking on it being a power-feature to justify not exposing it in the UI explicitly.
The canister design is better suited to the state of iOS devices, which now span many screen sizes and ratios. For taller devices, I can simply display more balls on screen. For iPad, I now use a split-view presentation with the canister view adjacent to the called balls board. I never liked how Bingo Machine was presented on iPad; it’s still not perfect, but it is miles better.
My goal with Bingo Machine 3 was to make something that was obvious, that required no explanation. I also wanted to portray a modern-but-consistent visual style across the application, straying from iOS’ visual appearance where I felt it was inadequate. I like to think I achieved these aims.
There were options to “Show Apple Music” and use the “iCloud Music Library,” both of which were unchecked, for some reason. I checked the options, and after only a few server errors when trying to log in to my iTunes account, my copy of iTunes finally showed the “For You,” and “New” tabs that I had seen screenshots of.
My playlist finally showed up in the “Playlists” section. The song that I had spent so long trying to delete was nowhere to be seen in the “My Music” tab, though it still persists on my iPad. I started thinking about how I’m supposed to add songs to a playlist. In Spotify, the list of playlists constantly stays on the left so that I can easily search for songs and simply drag them into the playlists. Wondering what would happen, I searched for “Animals as Leaders.” Turns out that the search pop-up has an option to search either through your library or through “Apple Music.” It defaults to your library every time you click the search box, so you have to take an extra step to click “Apple Music” in the actually pretty ugly search dialog every time you want to search. Sure, whatever, just get me to the songs.
Many, granted not all, of the UI problems associated with Apple Music are because Apple decided to integrate the new features into their existing music apps. On iOS, they extended the interface for the existing Music app. On the Mac, they added even more tabs and settings to the bloated iTunes app.
I think Apple would have been better served to develop its streaming service as an independent piece of software. In the same way that enabling iCloud Photo Library is a clear line in the sand for how you manage photos on iOS, Apple Music should be a similarly separated experience. Right now, a lot of compromises are made so that both the new and old experiences can coexist.
To make this happen, they added a whole load more tabs to the tab bar: ‘For You’, ‘New’ and ‘Connect’. This filled up the available tab slots, so they had to find alternative ways of representing music filtering. For instance, they added a weird popup menu to switch between Albums, Artists and Songs. In the old app, these views were instantly available in the tab bar with one press. It doesn’t look like a hamburger navigation stack but that’s essentially what it is — a bundle of actions hidden behind a button.
Similarly, every table cell now has a ••• button which brings up a contextual menu. Unfortunately, what’s contained in this menu is pretty random and illogical. The lists vary from context to context, so you can’t develop a muscle memory for tapping on specific things; they often omit seemingly obvious common actions and sometimes duplicate things already on the screen. There are also just too many actions shown at one time, lacking focus.
Distinguishing the streaming Apple Music service from the current iTunes-backed music library with two different apps would help a lot. The Music app could still prompt people to upgrade to the subscription service, but the apps themselves would be distinct. An ‘Add To Library’ option in Apple Music would treat the song as if it came from iTunes, showing up in the local library as iTunes In The Cloud music does today. Naming would naturally need rethinking: having two apps simply named ‘Music’ and ‘Apple Music’ would be confusing.
Obviously, it is inelegant to have two apps, but Apple’s current half-and-half attempt causes other problems. The lesser of two evils, if you will. Maybe there is a way to incorporate everything into one app nicely — necessarily dropping the tab bar for stack-based navigation — but two apps surely simplify the mental model of ‘what is my music’ and ‘what is streaming’. Having two apps reeks of ugliness, but observe how Apple has kept the Music app separate from the iTunes Store since iOS began. There are advantages to not stuffing everything into one.
Now, let’s consider the Mac. More accurately, how Apple messed up in adding streaming features to the iTunes app (Mac and PC). Imagine instead if Apple created a separate app that handled all the streaming and recommendation features of Apple Music. The need for cross-platform compatibility isn’t a blocker: Apple could make the app in the same way it creates the multi-platform iTunes. This clarifies a lot of the required UI; it could be tailored for the streaming music experience, and intercommunication with the download-centric iTunes Store could be completely ignored. Everything syncs over iCloud anyway, so there is no need for iTunes’ syncing infrastructure to be ported.
It sounds messy but this is exactly the same setup as other streaming services: you have both the Music app and the Spotify app. The same is true with Beats Music (before Apple acquired it obviously).
The transition period is what makes this especially hard. With the move from iPhoto to Photos, Apple basically said all or nothing: you either move forward to the new app, or you stay in the past and iPhoto stops working as libraries fall out of sync. The same attitude can’t really be applied to an iTunes-to-Music split, as Apple can’t ask its customers to ‘upgrade’ to a new world where they have to pay monthly. There was no payment associated with the iPhoto to Photos migration.
Apple tried to put Apple Music into the existing iTunes interface and did an okay job. I am sure there are better unified designs possible than what Apple produced to integrate both the old and new worlds but clearly it is hard. Hard on creativity and hard on engineering. Accepting the disadvantages, building Apple Music as an independent app may have been a better strategic move.
It means that as Twitterrific displays media thumbnails in the timeline (pictures, videos, etc), the app tries to detect faces and frame the thumbnail so faces are always showing. In short, if Twitterrific sees a face in a tweet, it tries to make sure you see it too!
The effect when scanning through your list of tweets in the timeline can be dramatic. Previously Twitterrific always framed thumbnails on the center of images, but many times people’s faces aren’t in the middle, especially on portrait shots. Check out these before and after comparison screen shots to see the difference facial framing makes in the timeline:
Apple includes a load of APIs as standard in the SDK, ranging from language analysis and physics engines to facial feature recognition. As a developer, you are always looking for ways to develop new apps and features by applying these rather niche but rich frameworks. Often, this means creating an entirely new app off the back of such a framework (Pedometer++ is a great example), but Twitterrific have cleverly thought up a way of using Core Image detectors to enhance the experience of their existing app.
They use Apple’s face detection APIs to better frame cropped images in the timeline. Is it life changing? No. Is it nice? Yes. Ideally, the feature is invisible. This is why these frameworks exist: The Iconfactory can piggyback off the work done by Apple (which originally developed the face detectors for its own Camera app) and deliver improvements like this to users with far less effort than if they had to develop a face detection engine in house.
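The clever part is not the detection itself (Core Image hands you the face rectangles) but what you do with them: shifting the crop window so the face stays in view, rather than always cropping around the image centre. Here is a minimal sketch of that framing logic in plain geometry — my own illustration of the idea, not The Iconfactory’s actual code; the function name and coordinate conventions are assumptions.

```python
def face_aware_crop(image_w, image_h, crop_w, crop_h, face=None):
    """Return the (x, y) origin of a crop_w x crop_h window within the image.

    With no face detected, crop around the image centre (the old behaviour).
    With a face rect (fx, fy, fw, fh), centre the crop on the face instead,
    clamped so the window never extends past the image edges.
    """
    if face is None:
        cx, cy = image_w / 2, image_h / 2
    else:
        fx, fy, fw, fh = face
        cx, cy = fx + fw / 2, fy + fh / 2  # centre of the detected face

    # Clamp the window so it stays inside the image bounds.
    x = min(max(cx - crop_w / 2, 0), image_w - crop_w)
    y = min(max(cy - crop_h / 2, 0), image_h - crop_h)
    return x, y

# A portrait shot: the face sits near the top of a 600x1000 image,
# so the crop window slides up instead of chopping the head off.
print(face_aware_crop(600, 1000, 600, 400, face=(200, 100, 200, 200)))  # → (0.0, 0.0)
print(face_aware_crop(600, 1000, 600, 400))  # centre crop → (0.0, 300.0)
```

The portrait case is exactly the one the Iconfactory screenshots highlight: a centre crop of a tall photo lands on the subject’s torso, while the face-aware crop follows the face.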
We’re not in this business just to make money: all of us at the Iconfactory hope that our products will make people’s lives better. We’ve worked hard to make Twitterrific work well with the accessibility features in iOS. Hearing that these efforts make things easier for customers with disabilities is rewarding beyond words. (Listen to the podcast file in that last link to get a great idea of what life is like for a VoiceOver user.)
But now there’s another incentive for thinking about accessibility: helping others also helps your downloads.
This is a bit disingenuous. The sales boost from this Apple feature is almost certainly temporary, and there are no guarantees that Apple will run such a promotion again. Also note that most of the apps in the store with VoiceOver support were not featured in this list and received no benefit at all from their investment here.
Investing in VoiceOver support is hard to justify on a raw cost-benefit analysis; I haven’t seen it materially affect sales under normal circumstances. Luckily, very little work gets you very far with VoiceOver conformance. It’s a nice thing to do that makes your apps accessible to an even larger base of people. As a developer, taking some time to do this is incredibly rewarding in ways other than bottom-line profit. I love getting feedback from blind users of my apps. It makes me feel good that I’ve helped someone. That’s what makes adding VoiceOver worthwhile.
The original iPad mini has quietly disappeared from Apple’s web site, and is no longer available to purchase new from the Apple Store. Introduced in October 2012, the first iPad mini established the industrial design that was subsequently used in the iPad mini 2 and iPad mini 3, as well as the larger but otherwise nearly identical iPad Air and iPad Air 2. Apple notably continued to sell the 16GB iPad mini as an entry-level model alongside two of its sequels, dropping its price to $299 in October 2013, then $249 in October 2014.
It took five years, but finally every device in the currently sold iOS lineup has a Retina display. It also signals the end of the A5 chip, although developers can’t get too excited about ending support, as iOS 9 will still run on the original iPad mini. The fifth-generation iPod touch also includes an A5 … I think that will disappear soon. My guess is that it will be replaced — in contrast to the line just being killed — by a new non-cellular iPhone variant with a 4.7 inch display.
The Touch is a nice gateway drug into the iOS world so I don’t think Apple can drop the product entirely. The only way it could get completely axed, I believe, is if Apple can sell an iPhone at iPod touch price points. We aren’t there yet.