Apple is getting dangerously close to making a great deal of science fiction a reality, with machine learning and computer vision at the center.
One of the things I learned very early on in my limited relationship with Steve Jobs was that he was a control freak. And while this got him fired from Apple in 1985, it served him well in one key area: manufacturing and the supply chain.
His desire to control the process drove him and his team to develop the iPhone’s processor in-house, an area of expertise Apple has since expanded to other products. Jobs’ philosophy was that if Apple bought components off the rack, it would never outdo its competitors.
I have been impressed with Apple’s semiconductor chops; its design work has created a library of IP cores it can build upon for years to come. Apple still relies on Intel for the Mac’s core processor, but I believe that will change in the next two years.
Last week, Apple added another upgrade to the iPhone’s A-Series processor with the A12 Bionic.
This chip is very different from previous iterations. In the A11 Bionic, the neural engine took a much smaller part of the overall SoC block and was integrated with some other components. It was capable of 600 billion operations per second and was a dual-core design.
The neural engine in the A12 Bionic now has a dedicated block in the SoC, has jumped from two to eight cores, and is capable of 5 trillion operations per second. But it all comes together in the software, where Apple is letting developers use Core ML to make apps we have never experienced before.
Apple is getting dangerously close to making a great deal of science fiction a reality, with machine learning and computer vision at the center. Until recently, this technology was relegated to highly controlled experiences. But it’s now front and center in the automotive industry as autonomous cars take to the streets. Software that detects and recognizes objects through a smartphone camera is another impressive example of computer vision.
Now, with the A12 Bionic and rich APIs in developers’ hands, it’s exciting to think about what’s to come on the app front. If you have not seen it, I encourage you to watch the Homecourt demo from Apple’s Sept. 12 event (at the 59:45 mark). The app did real-time video analysis of a basketball player: how many shots he made or missed, where on the court he made and missed them as a percentage of his shots, and even his form down to the legs and wrist, to look for patterns. It was an incredible demonstration with real-world value, yet it only scratches the surface of what developers can do with this new era of iPhone software.
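To give a sense of what this kind of app involves, here is a minimal sketch of how a developer might feed video frames to a Core ML model through Apple’s Vision framework. The `HoopDetector` model is a hypothetical placeholder (an object-detection model you would train or obtain); the Vision types used here (`VNCoreMLModel`, `VNCoreMLRequest`, `VNImageRequestHandler`) are the real APIs Apple provides for this workflow.

```swift
import Vision
import CoreML

// Sketch: run a (hypothetical) object-detection model on one video frame.
// "HoopDetector" stands in for a real Core ML model compiled into the app.
func detectObjects(in frame: CVPixelBuffer) throws {
    let model = try VNCoreMLModel(for: HoopDetector().model)

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
        for observation in results {
            // Each observation carries a bounding box and ranked class labels.
            let label = observation.labels.first?.identifier ?? "unknown"
            print("\(label) at \(observation.boundingBox)")
        }
    }

    // Hand the frame to Vision, which runs the model (on the neural engine
    // where available) and calls the completion handler with the results.
    try VNImageRequestHandler(cvPixelBuffer: frame).perform([request])
}
```

An app like Homecourt would run something like this on every frame of a live camera feed, then layer its own logic (tracking the ball, scoring makes and misses, analyzing body pose) on top of the raw detections.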
Machine Learning and AI as the New Software Architecture
When it comes to this paradigm change, machine learning and artificial intelligence will enable a new era of modern software.
I can’t overstate how vital semiconductor innovation is to this effort. We have seen it in cloud computing, as many Fortune 500 companies are now deploying cloud-based machine learning software thanks to innovations from AMD and Nvidia. But client-side processing for machine learning has, until now, been well behind the capabilities of the cloud. Apple has brought a real machine learning powerhouse to its customers’ pockets and opened it up to the largest and most creative developer community of any platform.
Even more interesting is that Apple’s vertical integration makes it hard for competitors to keep up. Samsung does a pretty good job competing at the semiconductor level, and its mobile division can take advantage of various divisions in Samsung Corporate. But even here, Apple has a pretty solid edge in the design process, since its teams are part of one larger team that creates any new product. Samsung must tap into individual divisions within Samsung, and as far as I can tell, it doesn’t integrate its semiconductor team into the overall product R&D.
I look for Apple to use even more homegrown IP in semiconductors and perhaps other components in the future; its role as a product and service powerhouse is integral to its future.