Twitch Starts Transition From Flash To An HTML5 Video Player

Twitch:

Today’s redesign moves half of the video player – specifically the controls – from Flash to HTML5 and Javascript. The video itself is still in Flash underneath the controls. However, this is an important step to releasing the much-anticipated full HTML5 player.

You’ll begin to see the new player on channel pages first. As previously mentioned, this is a gradual roll out. If you are not part of our initial pool of users, please be patient as we release the redesigned player at a steady pace.

This piecemeal milestone has no real-world advantage but at least signifies the start of a transition. What’s weird, though, is that Twitch will already serve users HTML5 video today. Just visit a streamer’s page in Safari on iOS or a Mac with no Flash installed and it shows a working <video> tag. Some of the nice Twitch-branded UI is missing but the crucial activity, the playback of the stream itself, works flawlessly. It’s been like this for as long as I can remember.

iOS 9 Apple Watch Interactive Animated Achievements 

These are so cool on iOS 9 (and watchOS 2). Rather than being prerecorded videos, the trophies are rendered in realtime as 3D objects so you can flick and swivel them around with your finger. The material lighting is fantastic and it’s super responsive — even on the Watch. Many of the achievements are also engraved with your name to add that splash of personalisation.

It also makes me more motivated to unlock all of the other possible achievements just so I can see what they look like as fully 3D objects. I’ve seen some argue that these achievements are inconsistent with the flat design of the operating system. Sometimes, though, you have to break the rules. This is a nice dash of skeuomorphism that adds some real flair to the experience. Imagine if these achievements were merely flat 2D drawings. It wouldn’t be as fun or as endearing.

The New iPods 

Apple:

Apple® today introduced the best iPod touch® yet and unveiled a new lineup of colors for all iPod® models, including space gray, silver, gold, pink and blue. The ultra-portable iPod touch features a new 8 megapixel iSight® camera for beautiful photos, an improved FaceTime® HD camera for even better selfies, the Apple-designed A8 chip with 10 times faster graphics performance for a more immersive gaming experience, and even better fitness tracking with the M8 motion coprocessor.

The iPods are the runt of the litter nowadays. That being said, today’s iPod touch update uses up-to-date internals like an A8 processor and an 8 megapixel camera. Whilst it still isn’t getting the design attention from Apple it probably should (the body and case are unchanged), this iPod touch refresh is substantial enough that you don’t feel bad about buying one anymore. It is about as fast as an iPhone 6 and will be supported by the app ecosystem for many years.

I can’t praise Apple’s efforts on the Nano and Shuffle, though. Same internals, same software, different coloured cases. The Nano is a booby trap in Apple’s lineup primed for unwitting parents to buy as ‘nice’ Christmas gifts. The 16 GB Nano is overpriced for what it is, sold for $149 a pop. For $50 more, you can get a brand new 16 GB iPod touch. Seriously.

At least the Shuffle holds a distinct place in the range as a clip-on sports MP3 player. The same can’t be said for the iPod nano. The Nano is a mediocre imitation of the Touch in every way, yet priced almost as high as its more powerful sibling.

On How Flat Design All Looks The Same

Eli Schiff:

In Fall of the Designer Part III, I noted how Twitter apps were becoming visually homogenized to the point that they were virtually indistinguishable. I could not have imagined it could go further. Following a recent update by Twitter for their native iOS client, it seems all three apps might as well have been designed by the same person.

Schiff doesn’t say this explicitly but the implication is that flat design is so constricting that the only possible outcome is homogeneity in application design. Certainly, the era of flat design makes it really easy to be lazy. You can make a ‘flat’ app with very little work that gels with modern appearance expectations and looks good, relying on copious whitespace and large typography.

I hate how this has transpired. It is true that the current ecosystem includes a lot of apps that look similar. The important distinction is that it doesn’t have to be this way. The flat world is not constricting. You can make a wide variety of visual styles and colour schemes work. In the skeuomorphic world, it was incredibly easy to create a distinct visual style — just pick a different texture. It’s harder to do original, refreshing aesthetics in a flat world (because the tendency to use white is so great) but it is possible. Developers need to work harder to achieve it.

If you want hard examples, the first place to look is the system apps themselves. I think Apple does a pretty good job of offering diverse user experiences in its own apps. Camera uses yellow and black reels, Messages has chat callouts with vivid gradients, Game Center has weird bubbly things and Weather has a vast array of rich backgrounds. Apple isn’t perfect by any means — apps like Videos and Phone are quite lacking in differentiation.

One of the biggest offenders for blandness post-iOS 7 is the app that is basically a white scrolling timeline with a navigation bar, where the only real self-expression comes through a single tint color for interactive elements. In a past life, developers could employ an arsenal of ornamentation (shadowing, gradients) to make these kinds of apps stand out.

The iOS and OS X Public Beta Program

Apple:

The Apple Beta Software Program lets you try pre-release software and provide feedback to help us make it even better. In this guide you will find information on the latest beta releases and how to get started. Check back regularly for updates.

The public beta program is still strange to me. The seeds as they stand are still quite buggy, but in obvious, self-evident ways, ways that don’t need hundreds of thousands of ordinary users to test the OS and report back. If Public Beta members do report bugs, I am sceptical that they bring up new issues that Apple’s internal QA team or the developer community hasn’t already found at this ‘early’ stage in the cycle (seed 3).

Also, I’d love to see data on how many people running the public betas even bother to use the Feedback Assistant at all. I think most people that join do so just to be part of the ‘cool’ club running prerelease operating systems. They don’t have the motivation nor technical knowledge to post useful bug reports. Most developers don’t either.

Running a public beta program is a huge endeavour, especially when you consider the additional costs on Apple Support when something goes wrong, for what I see as little upside.

Bingo Machine 3

Today, I am really proud to announce the new Bingo Machine for iOS 7. Well, not exactly. It’s been a long time coming. I began work on this version of the app in June 2013, after WWDC. I hated the new look I had designed; I wasn’t sure what I liked in the new world of flat design, so it took time for me to be happy with something. I also had other stuff to do, and that got in the way.

Time eventually rolled into mid 2014. At this point, I decided I might as well skip iOS 7 entirely and wait for iOS 8. Crucially, Apple introduced live blurring for third party apps in iOS 8 … which then meant I could rework the design of Bingo Machine some more. I settled around September, but again delayed shipping as I prioritised some client projects and getting a Writing Aid widget out the door.

Naturally, Apple released WatchKit in November so I messed about with that, putting off the finishing touches to Bingo Machine again. I ended up dropping the Watch app in the end, though. There were also some holdups with getting assets together. However, it’s finally done.

Bingo Machine 3 finally escapes the world of fake gloss and textured linen with a design that centres on black and white colours without feeling sterile. I have kept a level of realism in the bingo balls, although they are still drastically simplified over their previous designs. I use a physics engine (a Box2D implementation, as UIKit Dynamics does not support circular bodies) as a substitute for the visual realism. It also feels more native to modern iOS. Balls fall freely into a full-width canister. I experimented with accelerometer-controlled movement but it was too distracting to keep.
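For context, this is roughly what the naïve UIKit Dynamics starting point looks like. It's only a minimal sketch with hypothetical view names, not the shipping code: gravity plus collision behaviours get you tumbling tiles almost for free, but collisions are computed against rectangular bounds, which is exactly why round balls stack unconvincingly and why I went with Box2D instead.

```swift
import UIKit

final class CanisterViewController: UIViewController {
    // Hypothetical names; the real app's view hierarchy differs.
    let canisterView = UIView()
    var ballViews: [UIView] = []
    var animator: UIDynamicAnimator?

    func startPhysics() {
        let animator = UIDynamicAnimator(referenceView: canisterView)

        // Balls fall under gravity and collide with the canister's edges.
        let gravity = UIGravityBehavior(items: ballViews)
        let collision = UICollisionBehavior(items: ballViews)
        collision.translatesReferenceBoundsIntoBoundary = true

        animator.addBehavior(gravity)
        animator.addBehavior(collision)
        self.animator = animator

        // Caveat: UIKit Dynamics collides items as rectangles (their bounds),
        // so round balls settle unconvincingly. Box2D supports true circular
        // bodies, which is why the shipping app uses it instead.
    }
}
```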

The new appearance also drastically simplifies the mental model of the app. The application now has just three screens: the canister, the overview grid and the settings toggles. The latter two are presented as light modal panels, so users never lose context of the current game.

Thanks to live blur backdrops, activating the grid view to review called balls no longer feels like initiating a separate state. It feels like a transient display that can be quickly toggled on or off by tapping the bottom-left button. I love these toolbar buttons by the way: they have transparent inscriptions and inverted button states.
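For anyone curious about the mechanics, the backdrop is just the standard iOS 8 blur API. A minimal sketch, assuming the grid lives in a plain content view (the names here are illustrative, not lifted from the app):

```swift
import UIKit

// A minimal sketch of an iOS 8-style live blur backdrop. gridContentView
// is a hypothetical view containing the called-balls grid.
func makeGridPanel(containing gridContentView: UIView, over hostView: UIView) -> UIVisualEffectView {
    let panel = UIVisualEffectView(effect: UIBlurEffect(style: .light))
    panel.frame = hostView.bounds
    panel.autoresizingMask = [.flexibleWidth, .flexibleHeight]

    // Subviews go on contentView: they stay sharp while everything behind
    // the panel is blurred live by the system.
    gridContentView.frame = panel.contentView.bounds
    panel.contentView.addSubview(gridContentView)
    return panel
}
```

Toggling the panel is then just adding or removing it from the view hierarchy (or fading its alpha), which is why it reads as a transient overlay rather than a new screen.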

The bottom-right button represents the application settings. The icon is dynamically generated, though, so it can double up as a numerical status as well. Most of the time it shows the number of balls left, but it will also act as a countdown when the calling timer is enabled.
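Generating that icon is straightforward Core Graphics work. Here is a rough sketch of the technique with a hypothetical helper, not the app's actual drawing code: render the number into an image of bar-button size, then hand it to the button and regenerate it whenever the count changes.

```swift
import UIKit

// A rough sketch: draw a numeric status into a small image suitable for a
// bar button. The real app's artwork is more involved.
func statusIcon(showing text: String, diameter: CGFloat = 24) -> UIImage? {
    let size = CGSize(width: diameter, height: diameter)
    UIGraphicsBeginImageContextWithOptions(size, false, 0)
    defer { UIGraphicsEndImageContext() }

    // Outline circle for the button shape.
    UIColor.black.setStroke()
    UIBezierPath(ovalIn: CGRect(origin: .zero, size: size).insetBy(dx: 1, dy: 1)).stroke()

    // Centre the number inside the circle.
    let attributes: [NSAttributedString.Key: Any] = [
        .font: UIFont.systemFont(ofSize: diameter * 0.5, weight: .medium),
        .foregroundColor: UIColor.black
    ]
    let textSize = (text as NSString).size(withAttributes: attributes)
    let origin = CGPoint(x: (size.width - textSize.width) / 2,
                         y: (size.height - textSize.height) / 2)
    (text as NSString).draw(at: origin, withAttributes: attributes)

    return UIGraphicsGetImageFromCurrentImageContext()
}
```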

I simplified the settings page itself down to a simple series of segmented controls. I love the symmetry. Most of the time the user will see three sections, covering the game type, the timer and the speech mode. If an external display is connected, a TV-Out option is dynamically appended below.
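Detecting that external display is a small amount of UIKit plumbing. A sketch of the approach, assuming the settings screen reloads its sections when told to (the type and names here are mine, not the app's):

```swift
import UIKit

// Show the TV-Out section only while a second screen is attached.
final class ExternalDisplayObserver {
    private var tokens: [NSObjectProtocol] = []

    var hasExternalDisplay: Bool { UIScreen.screens.count > 1 }

    func startObserving(onChange: @escaping () -> Void) {
        let center = NotificationCenter.default
        tokens.append(center.addObserver(forName: UIScreen.didConnectNotification,
                                         object: nil, queue: .main) { _ in onChange() })
        tokens.append(center.addObserver(forName: UIScreen.didDisconnectNotification,
                                         object: nil, queue: .main) { _ in onChange() })
    }

    deinit {
        tokens.forEach { NotificationCenter.default.removeObserver($0) }
    }
}
```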

Whilst version 3 is mainly about a much-needed redesign, there are a few new features. I’ve re-recorded the spoken human catchphrases (and added a female speaker) as well as offering a synthesised voice which can service many more languages adequately. I am also experimenting with localisation of the UI into French and Spanish.

As Bingo Machine is often used as a learning tool, I have also added a way to change the language of the catchphrases independently of the iOS system language. It’s currently exposed with a long-press on the calling button, so it’s pretty hidden. I may have to change this in an update if I get a lot of support emails. I’m banking on it being a power feature to justify not exposing it in the UI explicitly.
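Mechanically, the hidden entry point is nothing more than a long-press recognizer on the calling button. A sketch with hypothetical outlet and selector names:

```swift
import UIKit

final class CallingViewController: UIViewController {
    @IBOutlet var callButton: UIButton!  // hypothetical outlet

    override func viewDidLoad() {
        super.viewDidLoad()
        // Hidden power feature: long-press the calling button to choose the
        // catchphrase language independently of the system language.
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(showLanguagePicker(_:)))
        callButton.addGestureRecognizer(longPress)
    }

    @objc private func showLanguagePicker(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began else { return }
        // Present the language selection UI here.
    }
}
```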

The canister design is better suited to the state of iOS devices, which now span many screen sizes and ratios. For taller devices, I can simply display more balls on screen. For iPad, I now use a split-view presentation with the canister view adjacent to the called balls board. I never liked how Bingo Machine was presented on iPad; it’s still not perfect, but it is miles better.

My goal with Bingo Machine 3 was to make something that was obvious, that required no explanation. I also wanted to portray a modern-but-consistent visual style across the application, straying from iOS’ visual appearance where I felt it was inadequate. I like to think I achieved these aims.

Bingo Machine 3 is still just $0.99 on the App Store. Tell me what you think and leave a review — it really helps.

Apple Music, Despite The New Icon, Is Not A New App

Cezary Wojcik:

There were options to “Show Apple Music” and use the “iCloud Music Library,” both of which were unchecked, for some reason. I checked the options, and after only a few server errors when trying to log in to my iTunes account, my copy of iTunes finally showed the “For You,” and “New” tabs that I had seen screenshots of.

My playlist finally showed up in the “Playlists” section. The song that I had spent so long trying to delete was nowhere to be seen in the “My Music” tab, though it still persists on my iPad. I started thinking about how I’m supposed to add songs to a playlist. In Spotify, the list of playlists constantly stays on the left so that I can easily search for songs and simply drag them into the playlists. Wondering what would happen, I searched for “Animals as Leaders.” Turns out that the search pop-up has an option to search either through your library or through “Apple Music.” It defaults to your library every time you click the search box, so you have to take an extra step to click “Apple Music” in the actually pretty ugly search dialog every time you want to search. Sure, whatever, just get me to the songs.

Many, granted not all, of the UI problems associated with Apple Music are because Apple decided to integrate the new features into their existing music apps. On iOS, they extended the interface for the existing Music app. On the Mac, they added even more tabs and settings to the bloated iTunes app.

I think Apple would have been better served to develop its streaming service as an independent piece of software. In the same way that enabling iCloud Photo Library is a clear line in the sand for how you manage photos on iOS, Apple Music should be a similarly separate experience. Right now, a lot of compromises are made so that the new and old experiences can coexist.

To make this happen, they added a whole load more tabs to the tab bar for ‘For You’, ‘New’ and ‘Connect’. This used up the available tab slots, so they had to find alternative ways of representing music filtering. For instance, they added a weird popup menu to switch between Albums, Artists and Songs. In the old app, these views were instantly available in the tab bar with one press. It doesn’t look like a hamburger navigation stack but that’s essentially what it is — a bundle of actions hidden behind a button.

Similarly, every table cell has a ••• button now which brings up a contextual menu. Unfortunately, what’s contained in this menu is pretty random and illogical. The list varies from context to context so you can’t develop a muscle memory for tapping on specific things; it often omits seemingly obvious common actions and sometimes duplicates things already on the screen. There are also just too many actions shown at one time, lacking focus.

Distinguishing the streaming Apple Music from the current iTunes-backed music library with two different apps would help a lot. The Music app could still prompt people to upgrade to the subscription service but the apps themselves would be distinct. An ‘Add To Library’ option in Apple Music would treat the song as if it came from iTunes and show it in the local library as iTunes In The Cloud music does today. Naming would naturally need rethinking: having two apps simply named ‘Music’ and ‘Apple Music’ would be confusing.

Obviously, it is inelegant to have two apps but Apple’s current half-and-half attempt causes other problems. The lesser of two evils, if you will. Maybe there is a way to incorporate everything into one app nicely — necessarily dropping the tab bar for stack-based navigation — but two apps surely simplify the mental model of ‘what is my music’ and ‘what is streaming’. Having two apps reeks of ugliness, but observe how Apple has kept the Music app and the iTunes Store separate since iOS began. There are advantages to not stuffing everything into one.

Now, let’s consider the Mac. More accurately, how Apple messed up in adding streaming features to the iTunes app (Mac and PC). Imagine instead if Apple created a separate app that handled all the streaming and recommendation features of Apple Music. The need for cross-platform compatibility isn’t really an obstacle — Apple could build the app in the same way it builds the multi-platform iTunes. This clarifies a lot of the required UI: it could be tailored for the streaming music experience, and intercommunication with the download-centric iTunes Store could be completely ignored. Everything syncs over iCloud anyway, so there is no need for iTunes’ syncing infrastructure to be ported.

It sounds messy but this is exactly the same setup as other streaming services: you have both the Music app and the Spotify app. The same is true with Beats Music (before Apple acquired it obviously).

The transition period is what makes this especially hard. With the move from iPhoto to Photos, Apple basically says all or nothing. You either move forward to the new app or you stay in the past and iPhoto stops working as libraries become out of sync. The same attitude can’t really be taken with an iTunes-to-Music split, as Apple can’t really ask its customers to ‘upgrade’ to a new world where they have to pay monthly. There was no payment associated with the iPhoto to Photos migration.

Apple tried to put Apple Music into the existing iTunes interface and did an okay job. I am sure there are better unified designs possible than what Apple produced to integrate both the old and new worlds but clearly it is hard. Hard on creativity and hard on engineering. Accepting the disadvantages, building Apple Music as an independent app may have been a better strategic move.

Twitterrific Adds Face Detection To Improve Image Crops 

The Iconfactory:

It means that as Twitterrific displays media thumbnails in the timeline (pictures, videos, etc), the app tries to detect faces and frame the thumbnail so faces are always showing. In short, if Twitterrific sees a face in a tweet, it tries to make sure you see it too!

The effect when scanning through your list of tweets in the timeline can be dramatic. Previously Twitterrific always framed thumbnails on the center of images, but many times people’s faces aren’t in the middle, especially on portrait shots. Check out these before and after comparison screen shots to see the difference facial framing makes in the timeline:

Apple includes a load of APIs as standard in the SDK, ranging from language analysis and physics engines to facial feature recognition in images. As a developer, you are always looking for ways to develop new apps and features by applying these — rather niche but rich — frameworks. Often, this means creating an entirely new app off the back of such a framework (Pedometer++ is a great example) but Twitterrific have cleverly thought up a way of using Core Image detectors to enhance the experience of their existing app.

They use Apple’s face detection APIs to better frame cropped images in the timeline. Is it life changing? No. Is it nice? Yes. Ideally, the feature is invisible. This is why these frameworks exist. The Iconfactory can piggyback off the work done by Apple (which originally developed the face detectors for its own Camera app) and deliver improvements like this to users with far less effort than if they had to develop the face detection engines in house.
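The core of the technique is a Core Image CIDetector. The Iconfactory hasn't published their implementation, but a rough sketch of the idea looks something like this: find the faces, then bias the thumbnail crop towards them instead of the image centre.

```swift
import UIKit
import CoreImage

// A sketch of face-aware cropping, not The Iconfactory's actual code.
// Returns a crop rect of the requested aspect ratio, centred on detected
// faces and falling back to the full image when none are found.
func faceAwareCropRect(for image: UIImage, aspectRatio: CGFloat) -> CGRect {
    let fullRect = CGRect(origin: .zero, size: image.size)
    guard let ciImage = CIImage(image: image),
          let detector = CIDetector(ofType: CIDetectorTypeFace,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyLow])
    else { return fullRect }

    let faces = detector.features(in: ciImage)
    guard let first = faces.first else { return fullRect }

    // Union of all detected face rectangles. (Core Image uses a bottom-left
    // origin, so a production version would flip the Y axis for UIKit.)
    let union = faces.reduce(first.bounds) { $0.union($1.bounds) }

    // Crop of the requested aspect ratio, centred on the faces and clamped
    // to the image bounds.
    let cropHeight = min(image.size.height, image.size.width / aspectRatio)
    let cropWidth = cropHeight * aspectRatio
    var crop = CGRect(x: union.midX - cropWidth / 2,
                      y: union.midY - cropHeight / 2,
                      width: cropWidth, height: cropHeight)
    crop.origin.x = max(0, min(crop.origin.x, image.size.width - cropWidth))
    crop.origin.y = max(0, min(crop.origin.y, image.size.height - cropHeight))
    return crop
}
```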

Adding VoiceOver To Apps

The Iconfactory:

We’re not in this business just to make money: all of us at the Iconfactory hope that our products will make people’s lives better. We’ve worked hard to make Twitterrific work well with the accessibility features in iOS. Hearing that these efforts make things easier for customers with disabilities is rewarding beyond words. (Listen to the podcast file in that last link to get a great idea of what life is like for a VoiceOver user.)

But now there’s another incentive for thinking about accessibility: helping others also helps your downloads.

This is a bit disingenuous. The sales boost from this Apple feature is almost certainly temporary and there are no guarantees that Apple will run such a promotion again. Also note that most of the apps in the store with VoiceOver support were not featured in this list and received no benefit at all from their investment here.

Investing in VoiceOver support is hard to justify on a raw cost-benefit analysis; I haven’t seen it materially affect sales under normal circumstances. Luckily, very little work gets you very far with VoiceOver conformance. It’s a nice thing to do that makes your apps accessible to an even larger base of people. As a developer, taking some time to do this is incredibly rewarding in ways other than bottom-line profit. I love getting feedback from blind users of my apps. It makes me feel good that I’ve helped someone. That’s what makes adding VoiceOver worthwhile.
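To put ‘very little work’ in concrete terms: standard UIKit controls are accessible out of the box, and most custom views only need a label and a trait. A hypothetical example, not taken from any particular app:

```swift
import UIKit

// Standard controls speak for themselves; custom drawing needs a little help.
final class BallView: UIView {
    var number: Int = 0 {
        didSet {
            isAccessibilityElement = true
            accessibilityTraits = .staticText
            accessibilityLabel = "Ball \(number)"
        }
    }
}

// Image-only buttons just need a label so VoiceOver has something to read:
// settingsButton.accessibilityLabel = "Settings"
```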

iPad Mini No Longer Sold On Apple.com

9to5Mac:

The original iPad mini has quietly disappeared from Apple’s web site, and is no longer available to purchase new from the Apple Store. Introduced in October 2012, the first iPad mini established the industrial design that was subsequently used in the iPad mini 2 and iPad mini 3, as well as the larger but otherwise nearly identical iPad Air and iPad Air 2. Apple notably continued to sell the 16GB iPad mini as an entry-level model alongside two of its sequels, dropping its price to $299 in October 2013, then $249 in October 2014.

It took five years but finally every device in the currently sold iOS lineup has a Retina display. It also signals the end of the A5 chip, but developers can’t get too excited about dropping support as iOS 9 will still run on the original iPad mini. The fifth-generation iPod touch also includes an A5 … I think that will disappear soon. My guess is that it will be replaced — in contrast to the line just being killed — by a new non-cellular iPhone variant with a 4.7 inch display.

The Touch is a nice gateway drug into the iOS world so I don’t think Apple can drop the product entirely. The only way it could get completely axed, I believe, is if Apple can sell an iPhone at iPod touch price points. We aren’t there yet.

Making The Original Crash Bandicoot

Quora:

By far the best part in retrospect—and the worst part at the time—was getting the core C/assembly code to fit. We were literally days away from the drop-dead date for the “gold master”—our last chance to make the holiday season before we lost the entire year—and we were randomly permuting C code into semantically identical but syntactically different manifestations to get the compiler to produce code that was 200, 125, 50, then 8 bytes smaller. Permuting as in, “for (i=0; i < x; i++)”—what happens if we rewrite that as a while loop using a variable we already used above for something else? This was after we’d already exhausted the usual tricks of, e.g., stuffing data into the lower two bits of pointers (which only works because all addresses on the R3000 were 4-byte aligned).

Ultimately Crash fit into the PS1’s memory with 4 bytes to spare. Yes, 4 bytes out of 2097152. Good times.

These anecdotes are so cool. 2097152 bytes is nothing today, but back then it was the entire memory space available for a game. It’s hard to conceive how restrictive this is given today’s abundance of available memory and storage. 2097152 bytes is equivalent to 2.1 megabytes. Can you think of any modern-day media that fits in 2 megabytes?

iOS 9 Picture In Picture Takes Desktop Concepts To Mobile And Does It Better

iPad multitasking was sorely needed. What Apple did matches up really well with what I asked for way back in January 2014. In iOS 9, Apple implemented both of my feature requests, Panels (which Apple calls Split View) and Popovers (which Apple calls Slide Over). The nomenclature is different but the feature descriptions are almost identical. The iPad Pro element of the puzzle will no doubt show itself later in the year.

iPad multitasking is hard to verbalise and describe well as it’s really an umbrella term for a class of different modes and behaviours. I’ve tested it and everything is pretty straightforward and intuitive to actually use, way easier than trying to write up an explanation of every interaction. I want to focus on one element of iPad multitasking in particular: picture in picture.

The thing about the iPad picture-in-picture implementation is that it’s actually better than how one would handle such a task on a Mac. On a Mac, trying to play a video in the corner whilst getting on with your work is difficult. Let’s take a video on YouTube playing in Safari. To play this in a corner of the screen on a Mac, you have to pull the tab out into its own window. Then, you have to manually drag the corners of the window to resize it and do your best to clip out all the unnecessary surrounding UI by hand. No doubt the window has a toolbar so you’ll probably have to do some awkward keyboard shortcut or hidden menu command to hide that as well.

Then you have to actually manage the window as you go on with your work. What do I mean by this? Well, with every other task you open you also have to make sure it doesn’t occlude the video playback window by dragging it out of the way. The video can’t stay foremost so it’s actually really easy to lose the video amongst your other windows.

If you ever want to move the video from one corner to another, not only do you have to position the video on the screen, you also have to move all your other windows back over to the other side.

What if you want to make the video view a bit bigger? Drag the corners right? Nope. Dragging the corners of Safari just makes the viewport bigger showing the content on the webpage that surrounds the video. To actually make the video bigger, you have to zoom into the page and then readjust the window region to again fit the new video bounds. It’s a mess and the iPad implementation should embarrass the Mac team.

On the iPad, you play a video. With the video still playing, you press the Home button and the video scales down into a floating always-on-top chromeless panel. There’s a subtle drop shadow to indicate layering but nothing overbearing. You can make it bigger or smaller with a quick two-finger pinch and use one finger to flick it to any corner where it snaps into place. It’s so much simpler. There’s nothing to manage.

Just compare how many words I needed to describe the required interactions on the Mac and the iOS 9 iPad to achieve the same result. The drastically-simplified iPad implementation puts the Mac to shame. iOS 9 picture-in-picture is really great.

To seal the deal further, this behaviour also works with FaceTime calls. On iOS 9, with the same single press of the Home button, you can now multitask videoconferencing and any other app. This is a massive boon for business customers but it also benefits normal people: my Mum always switches out to other apps whilst talking to me on a FaceTime call, and having a persistent video stream of my face is a perfect reassurance that I am still ‘there’.

Picture-in-picture really is a fantastic feature with fantastic design. It’s an incredible translation of a desktop metaphor to a tablet. It’s so much simpler than a typical window manager with almost no compromise. It’s just better. Why doesn’t the Mac work this way?
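As an aside for developers: third-party video apps don’t have to reinvent any of this. On iOS 9, video played through AVKit can opt straight into the system picture-in-picture behaviour. A rough sketch, assuming playback through an AVPlayerLayer (apps using AVPlayerViewController get it with even less work):

```swift
import AVKit
import AVFoundation

// A sketch of opting a video app into iOS 9 picture in picture: run a
// playback audio session and hand the player layer to the system.
func makePiPController(for playerLayer: AVPlayerLayer) -> AVPictureInPictureController? {
    guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }

    try? AVAudioSession.sharedInstance().setCategory(.playback,
                                                     mode: .moviePlayback,
                                                     options: [])

    // The system owns the floating panel, its resizing and corner-snapping;
    // the app only starts or stops PiP and responds to delegate callbacks.
    return AVPictureInPictureController(playerLayer: playerLayer)
}
```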

ReplayKit Screen Recording In iOS 9

9to5Mac:

With ReplayKit, developers will be able to offer users the ability to screen record gameplay or other apps automatically or manually with a single tap. Users will then be able to share recorded content through an iOS share sheet directly to social networks and video sharing sites. Apple pauses all incoming notifications and anything that might ruin the gameplay video experience, and only users will have access to the recorded videos.

It’s cool technology. The lacklustre part of this jigsaw is the sharing element. Apple’s answer to this problem is to present the system share sheet and delegate all responsibility to third-parties. This feels like a cop out. Apple should integrate this feature with Game Center so users could upload their clips to Apple’s servers and friends could visit Game Center profiles and watch the highlights. The videos would ideally be tagged with the app they originate from, helping with marketing.
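For reference, the developer-facing side is pleasantly small. A sketch of the basic ReplayKit flow described above: record, stop, then present Apple’s preview controller, which is where the share sheet lives.

```swift
import ReplayKit
import UIKit

// A sketch of the ReplayKit flow: record, stop, present the preview.
final class GameViewController: UIViewController, RPPreviewViewControllerDelegate {
    let recorder = RPScreenRecorder.shared()

    func beginRecording() {
        guard recorder.isAvailable else { return }
        recorder.startRecording { error in
            if let error = error { print("Could not start recording: \(error)") }
        }
    }

    func endRecording() {
        recorder.stopRecording { [weak self] previewController, error in
            guard let preview = previewController else { return }
            preview.previewControllerDelegate = self
            self?.present(preview, animated: true)
        }
    }

    func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
        previewController.dismiss(animated: true)
    }
}
```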

Mike Beasley Mocks Up iOS 9 App Store App With San Francisco Font

Mike Beasley, 9to5Mac:

A second factor that helped influence these design changes (to a lesser degree) is the rumor that iOS 9 will change the system font to San Francisco, the typeface created for the Apple Watch. All of these mockups use that font.

On the Happy Hour podcast, I tried to make an analogy between swapping fonts in UI and introducing a new piece of modern furniture into a room. I say tried because I’m pretty sure I failed to make my point.

Let me try again in words. You can’t just replace all instances of Helvetica with San Francisco if you want the design to look good. The rest of the iOS UI is dependent on Helvetica’s sizing and aesthetics. I think Beasley’s mockups show this well. On toolbars, the angular, square letterforms of San Francisco just don’t mesh with the nicely rounded bar button icons like the Search magnifying glass. These icons are rounded like Helvetica (and use the same line width), not like San Francisco.

To do this properly, you would have to rework the bar button icons and other details (like navigation bar height).

Font choices are an integral part of the UI and the whole design is interconnected with the choice of font. For the Watch, Apple made different variants of San Francisco for different parts of the interface. It’s designed with the mannerisms and characteristics of the font in mind, so it works well and looks good.

For iOS 9 and OS X 10.11, I hope Apple will apply the same level of care although I fear they haven’t. Redesigning UI and drawing new icons is a big project and the rumours haven’t pointed to this happening. One semi-solution might be to use the rounded San Francisco typefaces. They will — naturally — be a better fit as a substitute for the (also rounded) Helvetica.

Unfortunately, the historical precedents suggest Apple will not apply this level of care. With Yosemite, Apple willingly switched out Lucida Grande for Helvetica on the Mac and I think the result is not ugly, but not beautiful. With just two days until the reveal, I can only hope that Apple has done a better job this time around.

Google Photos 'Free' Unlimited Photo Storage

Google:

Google Photos gives you a single, private place to keep a lifetime of memories, and access them from any device. They’re automatically backed up and synced, so you can have peace of mind that your photos are safe, available across all your devices.

And when we say a lifetime of memories, we really mean it. With Google Photos, you can now backup and store unlimited, high-quality photos and videos, for free. We maintain the original resolution up to 16MP for photos, and 1080p high-definition for videos, and store compressed versions of the photos and videos in beautiful, print-quality resolution. For all the storage details, visit our help center.

When Google announced this and I tweeted how embarrassing it is for Apple’s iCloud pricing, I was barraged with feedback that it isn’t really free. If you are not paying for the product, you are the product. Yes, I know how Google works. I’m sure that somewhere in the terms of service it says Google can use data extracted from my photo library to serve ads.

It doesn’t matter. What matters is that for almost-free, Google is offering unlimited photo storage. Apple’s almost-free plan gives you a paltry 20 GB. When I say it’s embarrassing, I’m not saying Apple should store all your photos for free. I’m saying Apple should store all your photos for a lot cheaper than they do. With such high margins on hardware, maybe they could do it for truly free.

If not, they can certainly slash their iCloud prices to better compete. Google gives you 20 GB for free and offers a terabyte for $10/month. Apple offers 5 GB for free and a terabyte for $20 per month. The real kicker is when you realise Google offers that to people who have never bought anything else from Google. Apple, by contrast, has already extracted hundreds of dollars of profit from its users in hardware sales and still charges comparatively exorbitant prices for cloud storage.

'Proactive' To Bring Google Now Features To iOS 9 

9to5Mac:

Below the search bar will sit a new user interface that automatically populates with content based around three key parts of iOS: Apps, Contacts, and Maps, effectively a considerably upgraded version of Siri’s existing “digital assistant” functionality. For example, if a user has a flight listed in her Calendar application and a boarding pass stored in Passbook, a bubble within the new Proactive screen will appear around flight time to provide quick access to the boarding pass. If a user has a calendar appointment coming up, a map view could appear with an estimated arrival time, directions, and a time to leave indicator based on traffic. Proactive will also be able to trigger push notifications to help the user avoid missing calendar events.

Beyond Calendar integration, the feature will be able to integrate with commonly used apps. For example, if an iPhone user typically opens the Facebook app when he wakes up around 9AM, a button to access Facebook will start to appear for the user around 9AM. If the user calls his mother every Tuesday at 5PM, a bubble to “Call Mom” could appear around that time every Tuesday. As this feature integrates deeply with a user’s contact list, it is likely that the Recent Contacts menu introduced to the top of the Multitasking pane in iOS 8 will be relocated to Proactive’s interface. Lastly, Proactive will be able to display restaurant suggestions and ratings around breakfast, lunch, and dinner times in Proactive, changing based on the user’s location.

First and foremost, Apple is moving the Spotlight search view to a new place in the UI (next to the first page of the home screen), which is a huge improvement for visibility compared to its current hidden location, only revealed by a disconnected downward swipe at the screen. Having no visual indicators for the feature is rough, but it was doubly complicated by the discrepancy of swiping from the edge to open Notification Center and swiping from ‘not-the-edge’ to activate Spotlight. Explaining this to family was tough and they still forget about Spotlight a few weeks later. Returning Spotlight to its pre-iOS 7 position is the right move for user experience and it’s reassuring to see that Apple is not stubbornly against reusing some elements of the ‘old’ iOS era.

The wholly new Proactive features sound cool. From Gurman’s description, it sounds very similar to Google Now but focused on areas where Apple can collect good data. For example, Proactive will notice when users habitually open certain apps and surface them at the appropriate times, such as showing the Starbucks app in the mornings for users who take a daily commute.

The places where Android will continue to surpass Apple are the areas where Apple cannot collect the same level of data. Few people use iCloud Mail, so crawling that for contextual information is out of the question; Google, with its dominance through Gmail, has no such problem. In the future, I think Apple could potentially get around this problem by looking at your email through the Mail.app databases. It doesn’t necessarily have to host your email to get features like this, but obviously this is a big undertaking that is out of scope for the initial release.

In fact, Gurman reports that Vorrath is hesitant to launch Proactive in any ‘full’ capacity — preferring an iterative sequence of releases. This seems like risk-averse damage limitation. Just like Siri and Maps, Proactive is one of those fuzzy-logic, big-data features that can’t possibly work flawlessly every time. By starting small, they contain the inevitable negative PR backlash.

When iOS 9 ships, it will be interesting to observe whether people actually latch on to these new Proactive features and use them. Old habits die hard and tapping on application icons is deeply ingrained into user behaviour. Android has had Google Now features for years now and — anecdotally — they are underused too.