No explanation was provided
These days, many of the most important new smartphone features and performance gains involve AI-powered workloads. So it wasn't a surprise when Google and Qualcomm announced last year that they planned to make the Neural Networks API that powers those features on Android updateable via Google Play Services, one small part of Android's growing update modularity. Unfortunately, it looks like those plans have been abandoned.
According to changes recently pushed to the AOSP Gerrit, spotted by Esper.io's Mishaal Rahman, Google is giving up on the updateable NNAPI. It would have worked through a Play Store-provided APEX module, similar to the Mainline modules some of our readers might be familiar with, with separately updateable hardware drivers delivered through Play Services.
Google did not explain why it abandoned the change, but an updateable API would have clear benefits. Updating components separately from the system itself makes it easier to patch security holes promptly without waiting on manufacturers to deliver full updates, and it adds standardization to how those components are implemented. It also means that new features and improvements can be backported to older hardware more easily, teaching old smartphone dogs new tricks, as it were.
For a quick bit of context, APIs are how developers interact with the operating system, other applications, and the underlying hardware. The days of writing everything directly in machine-understood assembly code are long gone for most software developers; hardware now sits beneath several abstraction layers that make development simpler and more portable while also improving security and performance. Android's Neural Networks API (NNAPI) lets developers take advantage of heterogeneous compute (a fancy way of saying different pieces of hardware that compute things in different ways) for AI-specific workloads.
The Neural Networks API means app-makers don’t have to figure out which task needs to go where.
See, AI and machine learning workloads usually don't run at their best on a typical CPU. Sometimes they need to perform a huge number of low-precision calculations in parallel all at once; in circumstances like that, a GPU, ISP (image signal processor), or another specialized piece of hardware is a better fit. Many recent chipsets even include dedicated components built just for machine learning workloads. The NNAPI means developers don't have to worry about any of that: when they need to accomplish a task, the API and the software beneath it route the work to whichever hardware can do it best, without any fuss.
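In practice, most app developers reach NNAPI indirectly, for instance through TensorFlow Lite's NNAPI delegate, which hands supported model operations off to the API described above. A minimal Kotlin sketch of that pattern (assuming the standard `org.tensorflow:tensorflow-lite` dependency; the function name and the caller-supplied model buffer are our own illustration, not code from Google's announcement) might look like this:

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.nio.MappedByteBuffer

// Hypothetical helper: wrap a TensorFlow Lite model so that NNAPI,
// rather than app code, decides which hardware runs each operation.
fun buildNnapiInterpreter(model: MappedByteBuffer): Interpreter {
    val options = Interpreter.Options().apply {
        // The delegate forwards supported ops to NNAPI; the system's
        // drivers then pick the best available accelerator (GPU, DSP,
        // NPU, or CPU fallback) at runtime.
        addDelegate(NnApiDelegate())
    }
    return Interpreter(model, options)
}
```

The app never names a specific accelerator; that dispatch decision is exactly the part of the stack the abandoned APEX module would have made updateable through the Play Store.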
This change does not necessarily mean that Google won't make machine learning APIs an updateable part of the system later via some other means; it just means that the previously planned approach isn't happening, for whatever reason.