More and more applications are integrating machine learning on the device itself to offer lower prediction latency, more efficient battery use and, above all, independence from a network connection, but this approach also brings problems that Google now wants to solve.
Google has found that developers who implement machine learning in their applications run into application size limits, performance differences between devices, and difficulty adopting the latest advances in machine learning. To solve these three problems, Google will integrate TensorFlow Lite machine learning into Android.
TensorFlow Lite for Android
Currently, most applications that use machine learning bundle the TensorFlow Lite library inside their APK, increasing the application's size, so Google has decided to integrate TensorFlow Lite directly into Google Play Services.
Having TensorFlow Lite available on all compatible phones means applications no longer have to ship the library themselves, slightly reducing their size, and it also brings two other important improvements.
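As a rough sketch of what this change looks like for developers, the snippet below uses the TensorFlow Lite in Google Play services API to create an interpreter backed by the shared, system-provided runtime instead of a copy bundled in the APK. The Gradle artifact version and the runModel helper are illustrative assumptions; the TfLite.initialize and InterpreterApi calls follow the published Play services API.

```kotlin
// build.gradle (app module), illustrative artifact, version may differ:
// implementation 'com.google.android.gms:play-services-tflite-java:16.+'

import android.content.Context
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
import java.nio.ByteBuffer

// Hypothetical helper: run a .tflite model using the runtime provided by Google Play services.
fun runModel(context: Context, model: ByteBuffer, input: Any, output: Any) {
    // Ask Google Play services to initialize the shared TensorFlow Lite runtime.
    TfLite.initialize(context).addOnSuccessListener {
        // Use only the system-provided runtime, not one shipped inside the APK.
        val options = InterpreterApi.Options()
            .setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
        val interpreter = InterpreterApi.create(model, options)
        // input/output are placeholders; their shapes and types depend on the model.
        interpreter.run(input, output)
        interpreter.close()
    }
}
```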
This integration will also allow Google to optimize the library's performance across all devices. On some devices, machine learning will be able to use hardware acceleration when it is available, so that certain artificial intelligence tasks run in less time.
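To illustrate what opting into hardware acceleration means in practice, here is a minimal sketch using TensorFlow Lite's standalone GPU delegate API, which adds GPU acceleration only when the device supports it; the Play services runtime exposes an equivalent opt-in mechanism. The model buffer parameter is a placeholder assumption.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.CompatibilityList
import org.tensorflow.lite.gpu.GpuDelegate
import java.nio.ByteBuffer

// Build an interpreter that uses GPU acceleration only when the device supports it.
fun createInterpreter(model: ByteBuffer): Interpreter {
    val options = Interpreter.Options()
    val compatList = CompatibilityList()
    if (compatList.isDelegateSupportedOnThisDevice) {
        // Offload supported operations to the GPU with device-tuned settings.
        options.addDelegate(GpuDelegate(compatList.bestOptionsForThisDevice))
    }
    // If no GPU delegate is added, TensorFlow Lite simply runs on the CPU.
    return Interpreter(model, options)
}
```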
By delivering the TensorFlow Lite API through Google Play Services, it will receive regular updates on all versions of Android 4.4 or higher, covering almost all active devices on the market. In addition, Google is working with chip vendors to update hardware drivers through Google Play, with Qualcomm as its first partner.