How Are Machine Learning and Artificial Intelligence Used by Apple?

Yugal Choubisa
8 min read · Nov 12, 2020

Machine Learning and AI

Create intelligent features and enable new experiences for your apps by leveraging powerful on-device machine learning. Learn how to build, train, and deploy machine learning models into your iPhone, iPad, Apple Watch, and Mac apps.
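
To make that concrete, here is a minimal Swift sketch of on-device inference with Core ML and Vision. The model name FlowerClassifier and the image path are hypothetical placeholders; any compiled .mlmodelc image classifier produced by Create ML or coremltools would slot in the same way.

```swift
import CoreML
import Vision

// Load a compiled Core ML model from the app bundle.
// "FlowerClassifier" is a hypothetical model name; any .mlmodelc
// produced by Create ML or coremltools works the same way.
guard let modelURL = Bundle.main.url(forResource: "FlowerClassifier",
                                     withExtension: "mlmodelc"),
      let coreMLModel = try? MLModel(contentsOf: modelURL),
      let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
    fatalError("Could not load the bundled model")
}

// Wrap the model in a Vision request so Vision handles image
// scaling, cropping, and pixel-format conversion for us.
let request = VNCoreMLRequest(model: visionModel) { request, _ in
    guard let results = request.results as? [VNClassificationObservation],
          let best = results.first else { return }
    print("Prediction: \(best.identifier) (\(best.confidence))")
}

// Run the request on an image file (the path is a placeholder).
let imageURL = URL(fileURLWithPath: "/path/to/photo.jpg")
let handler = VNImageRequestHandler(url: imageURL)
try? handler.perform([request])
```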

The work is innovative.
The experience is magic.

The people working here in machine learning and AI are building amazing experiences into every Apple product, allowing millions to do what they never imagined. Because Apple fully integrates hardware and software across every device, these researchers and engineers collaborate more effectively to improve the user experience while protecting user data. Come make an impact with the products you create and the research you publish.

How does Apple use machine learning today?

Apple has made a habit of crediting machine learning with improving certain features of the iPhone, Apple Watch, and iPad in its recent marketing presentations, but it rarely goes into much detail — and most people who buy an iPhone never watch those presentations anyway.

There are numerous examples of machine learning being used in Apple’s software and devices, most of them new in just the past couple of years.

Machine learning helps the iPad’s software distinguish between a user accidentally resting a palm on the screen while drawing with the Apple Pencil and an intentional press meant as input. It is used to monitor usage habits in order to optimize battery life and charging, both to extend the time between charges and to protect the battery’s long-term health. It is also used to make app recommendations.

Then there’s Siri, perhaps the one thing any iPhone user would immediately recognize as artificial intelligence. Machine learning drives several aspects of Siri, from speech recognition to its attempts to offer useful answers.

Siri (Apple’s voice assistant)

Savvy iPhone owners might also notice that machine learning is behind the Photos app’s ability to automatically sort pictures into pre-made galleries, or to accurately give you photos of a friend named Jane when her name is entered into the app’s search field.

In other cases, few users may realize that machine learning is at work. For example, your iPhone may take multiple pictures in rapid succession each time you tap the shutter button. An ML-trained algorithm then analyzes each image and can composite what it deems the best parts of each image into one result.
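
Apple hasn’t published the details of that pipeline, but the core idea (score each burst frame and keep the best material) can be sketched in a few lines. The Swift toy below is an illustration, not Apple’s algorithm: it rates each grayscale frame by the strength of its horizontal gradients, a crude sharpness proxy, and picks the single sharpest frame, whereas a real pipeline composites regions from several frames.

```swift
// Toy sketch of burst-frame selection: score each frame with a crude
// sharpness metric and keep the winner. Real pipelines composite
// regions from several frames; this just picks one.

struct Frame {
    let width: Int
    let height: Int
    let luminance: [Double]   // row-major grayscale pixels, 0...1
}

// Sharpness proxy: mean squared difference between horizontally
// adjacent pixels. Blurry frames have weak gradients and score low.
func sharpness(of frame: Frame) -> Double {
    var total = 0.0
    for y in 0..<frame.height {
        for x in 0..<(frame.width - 1) {
            let i = y * frame.width + x
            let d = frame.luminance[i + 1] - frame.luminance[i]
            total += d * d
        }
    }
    return total / Double(frame.width * frame.height)
}

func bestFrame(in burst: [Frame]) -> Frame? {
    burst.max(by: { sharpness(of: $0) < sharpness(of: $1) })
}
```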

Phones have long included image signal processors (ISPs) to improve photo quality digitally and in real time, but Apple accelerated the process in 2018 by making the iPhone’s ISP work closely with the Neural Engine, the company’s recently added machine-learning-focused processor.

Apple performs machine learning tasks locally on the device, on hardware like the Apple Neural Engine (ANE) or on the company’s custom-designed GPUs (graphics processing units).
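
Developers get some control over where that work runs. Here is a short Swift sketch using Core ML’s MLModelConfiguration; the model path is a placeholder.

```swift
import CoreML

// Ask Core ML to schedule work across the CPU, GPU, and Neural
// Engine; the framework picks the best available unit for each layer.
let config = MLModelConfiguration()
config.computeUnits = .all

// Restricting to .cpuOnly or .cpuAndGPU can be useful for
// debugging or for reproducible benchmarking:
// config.computeUnits = .cpuAndGPU

// "model.mlmodelc" is a placeholder path to a compiled model.
let url = URL(fileURLWithPath: "model.mlmodelc")
let model = try? MLModel(contentsOf: url, configuration: config)
```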

Apple dropped $200 million in early 2020 on a company that makes lightweight artificial intelligence. It’s all about keeping an edge in AI … by adding more AI to the edge.

If you’re an iPhone user, you’ve come across Apple’s AI, and not just in Siri’s improved acumen in figuring out what you ask of her. You see it when the phone identifies a caller who isn’t in your contact list (but did email you recently). Or when you swipe on your screen to get a shortlist of the apps that you are most likely to open next. Or when you get a reminder of an appointment that you never got around to putting into your calendar. Or when a map location pops up for the hotel you’ve reserved, before you type it in. Or when the phone points you to where you parked your car, even though you never asked it to. These are all techniques either made possible or greatly enhanced by Apple’s adoption of deep learning and neural nets.

Yes, there is an “Apple brain” — it’s already inside your iPhone.

Machine learning, my briefers say, is now found all over Apple’s products and services. Apple uses deep learning to detect fraud on the Apple Store, to extend battery life between charges on all your devices, and to help it identify the most useful feedback from thousands of beta tester reports. Machine learning helps Apple choose news stories for you. It determines whether Apple Watch users are exercising or simply perambulating. It recognizes faces and locations in your photos. It figures out whether you would be better off leaving a weak Wi-Fi signal and switching to the cell network. It even knows what good filmmaking is, enabling Apple to quickly compile your snapshots and videos into a mini-movie at the touch of a button. Apple’s competitors do many similar things but, say its executives, none of them can pull those things off while protecting privacy as closely as Apple does. And, of course, none of them make Apple products.

AI isn’t new to Apple: as early as the 1990s it was using some machine learning techniques in its handwriting recognition products. Remnants of those efforts are still found in today’s products that convert hand-scrawled Chinese characters into text or recognize the letter-by-letter input of an Apple Watch user finger-“scribbling” a custom message on the watch face. (Both of those features were produced by the same team of ML engineers.) Of course, in earlier days machine learning was more primitive, and deep learning hadn’t even been buzzworded yet. Today those AI techniques are all the rage, and Apple bristles at the implication that its learning is comparatively shallow. In recent weeks, CEO Tim Cook has made it a point to mention that the company is on it. And now its top leaders are elaborating.

“Our devices are getting so much smarter at a quicker rate, especially with our Apple-designed A-series chips. The back ends are getting so much smarter, faster, and everything we do finds some reason to be connected. This enables more and more machine learning techniques, because there is so much stuff to learn, and it’s available to [us].”

Though Apple wasn’t explaining everything about its AI efforts, I did manage to get resolution on how the company distributes ML expertise around its organization. The company’s machine learning talent is shared throughout the entire company, available to product teams who are encouraged to tap it to solve problems and invent features on individual products. “We don’t have a single centralized organization that’s the Temple of ML in Apple.”

Apple’s AI infrastructure allows it to develop products and features that would not be possible by earlier means. It’s altering the company’s product road map. “Here at Apple there is no end to the list of really cool ideas,” says Schiller. “Machine learning is enabling us to say yes to some things that in past years we would have said no to. It’s becoming embedded in the process of deciding the products we’re going to do next.”

One example of this is the Apple Pencil that works with the iPad Pro. In order for Apple to include its version of a high-tech stylus, it had to deal with the fact that when people wrote on the device, the bottom of their hand would invariably brush the touch screen, causing all sorts of digital havoc. Using a machine learning model for “palm rejection” enabled the screen sensor to detect the difference between a swipe, a touch, and a Pencil input with a very high degree of accuracy. “If this doesn’t work rock solid, this is not a good piece of paper for me to write on anymore — and Pencil is not a good product.” If you love your Pencil, thank machine learning.
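
Apple’s production palm-rejection model is private and runs deep in the touch pipeline, but the flavor of the problem is visible even at the UIKit level, where each touch reports its type and contact size. Below is a deliberately simplified heuristic sketch; the 20-point radius threshold and the addStroke helper are invented for illustration.

```swift
import UIKit

// Simplified palm-rejection heuristic at the UIKit level.
// Apple's real model is learned and runs far deeper in the stack;
// the 20-point radius threshold here is an invented example value.
enum TouchKind {
    case pencil, finger, palm
}

func classify(_ touch: UITouch) -> TouchKind {
    // Apple Pencil touches arrive already tagged by the system.
    if touch.type == .pencil {
        return .pencil
    }
    // A resting palm contacts a much larger area than a fingertip.
    if touch.majorRadius > 20 {
        return .palm
    }
    return .finger
}

// In a drawing view you might ignore palm touches entirely
// (addStroke is a hypothetical helper):
// override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
//     for touch in touches where classify(touch) != .palm { addStroke(for: touch) }
// }
```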

Sprinkled throughout Apple’s announcements about iOS, iPadOS, and macOS were a number of features and updates that have machine learning at their heart. Some weren’t announced onstage, and some features that almost certainly use AI weren’t identified as such, but here’s a quick recap of the more prominent mentions that we spotted:

  • Facial recognition for HomeKit. HomeKit-enabled smart cameras will use photos you’ve tagged on your phone to identify who’s at your door and even announce them by name.
  • Native sleep tracking for the Apple Watch. This uses machine learning to classify your movements and detect when you’re sleeping. The same mechanism also allows the Apple Watch to track new activities like dancing and…
  • Handwashing. The Apple Watch not only detects the motion but also the sound of handwashing, starting a countdown timer to make sure you’re washing for as long as needed.
  • App Library suggestions. A folder in the new App Library layout will use “on-device intelligence” to show apps you’re “likely to need next.” It’s small but potentially useful.
  • Translate app. This works completely offline, thanks to on-device machine learning. It detects the languages being spoken and can even do live translations of conversations.
  • Sound alerts in iOS 14. This accessibility feature wasn’t mentioned onstage, but it will let your iPhone listen for things like doorbells, sirens, dogs barking, or babies crying.
  • Handwriting recognition for iPad. This wasn’t specifically identified as an AI-powered feature, but we’d bet dollars to donuts it is. AI is fantastic at image recognition tasks, and identifying both Chinese and English characters is a fitting challenge (see the Vision sketch after this list).
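
Apple hasn’t said which API powers the iPad’s handwriting features, but the public face of on-device text recognition is the Vision framework. Here is a minimal sketch using VNRecognizeTextRequest; the image path is a placeholder, and on-device language support varies by OS release.

```swift
import Vision

// Recognize handwritten or printed text in an image with Vision.
// The file path is a placeholder; language availability depends on
// the OS version (Chinese support arrived later than English).
let request = VNRecognizeTextRequest { request, _ in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        // Each observation carries ranked candidate strings.
        if let candidate = observation.topCandidates(1).first {
            print("\(candidate.string) (confidence \(candidate.confidence))")
        }
    }
}
request.recognitionLevel = .accurate
request.recognitionLanguages = ["en-US"]

let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "/path/to/note.png"))
try? handler.perform([request])
```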

There are absences in this list — most notably Siri, Apple’s perennially disappointing digital assistant. Although Siri is AI-heavy, it mostly got cosmetic updates this year (oh, and “20 times more facts,” whatever that means). A new interface is a welcome change for sure, but it’s small fry when you compare Siri’s overall performance with other AI assistants.

Machine Learning will transform every part of the Apple experience in the coming years.

— John Giannandrea (Apple’s senior vice president for AI and Machine Learning)

I hope this helps you.

Thank you, and stay safe.


Yugal Choubisa

Engineer and technical content writer who loves to share knowledge.