Google Introduces Machine Learning to Mobile with TensorFlow Lite

November 22, 2017

Artificial intelligence (AI) has embedded itself in a wide array of technologies, often without most consumers realizing it. One important branch of AI is machine learning, which handles tasks such as pattern recognition. Adoption of machine learning models has grown rapidly over the last few years, creating a need to run them on mobile and embedded devices. Google has now launched a lightweight version of its open-source TensorFlow machine learning library for mobile platforms: TensorFlow Lite, which gives app developers the ability to deploy AI on mobile devices.


More about TensorFlow and TensorFlow Lite 

Google’s TensorFlow has been a popular framework since its release in 2015. It is deployed everywhere from enormous server racks to tiny IoT (Internet of Things) devices. At its core it is a library for building and running machine learning models; in Google’s data centers, heavy workloads are handled by the Cloud Tensor Processing Units (TPUs) powering its servers.

TensorFlow made designing, training, and deploying deep learning models easier, and it is used by both researchers and application developers. As the use of machine learning models grew, so did the need to adapt the library for mobile devices.

TensorFlow Lite is an evolved version of Google’s TensorFlow open source library, designed specifically for mobile devices.

The existing TensorFlow Mobile API will remain operational, but it is no longer Google’s preferred solution for mobile AI; the company advises developers to start using TensorFlow Lite.


Areas of focus

TensorFlow Lite was redesigned to focus mainly on three areas:

•  Lightweight: it was rebuilt to be as small as possible, enabling on-device inference of machine learning models with a small binary size.

•  Cross-platform: its runtime is designed to run on many different platforms, starting with Android and iOS.

•  Fast: it is optimised for mobile devices, with improved model loading times and support for hardware acceleration.

Mechanism

TensorFlow Lite runs AI models in a highly optimised form on the device’s CPU. It initially supports a handful of pre-trained and tested models, including the following:

•  MobileNet: a family of vision models able to identify images across 1,000 different object classes, designed specifically for efficient execution on mobile and embedded devices.

•  Inception v3: a larger image recognition model that offers higher accuracy than MobileNet.

•  Smart Reply: an on-device model that provides one-touch replies to incoming conversational chat messages.

The company said that more models and features will be added in future updates, based on users’ requirements.
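
Classification models such as MobileNet and Inception v3 emit one raw score per object class; turning those scores into a readable prediction is a small post-processing step the app performs after inference. A minimal pure-Python sketch of that step (the labels and scores below are made up for illustration, not real model output):

```python
import math

# Toy stand-in for a MobileNet-style classifier output: one raw score
# ("logit") per object class. A real model emits 1,000 of these; three
# labels are enough to show the post-processing step.
LABELS = ["tabby cat", "golden retriever", "coffee mug"]  # hypothetical labels
logits = [1.2, 3.4, 0.3]                                  # hypothetical scores

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    # Subtract the max for numerical stability before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def top_k(scores, labels, k=2):
    """Return the k most likely (label, probability) pairs."""
    probs = softmax(scores)
    ranked = sorted(zip(labels, probs), key=lambda p: p[1], reverse=True)
    return ranked[:k]

for label, prob in top_k(logits, LABELS):
    print(f"{label}: {prob:.3f}")
```

The same ranking logic applies regardless of which on-device model produced the scores.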

 

Advantages of TensorFlow Lite

•  It is a lightweight, easy-to-deploy solution for mobile and embedded devices with a small footprint: it occupies less than 300 KB when all supported operators are linked in, and less than 200 KB when only the operators needed for InceptionV3 and MobileNet are used.

•  It enables on-device machine learning inference with low latency and a small binary size. TensorFlow Lite also supports custom operators: developers can write their own operations and use them in their models.

•  It supports hardware acceleration through the Android Neural Networks API, which helps developers run neural networks efficiently on low-power devices.

•  Its kernels are optimised for mobile, making models smaller and faster, and a new mobile-optimised interpreter keeps apps lean by speeding up on-device execution.

•  It introduces a new, compact FlatBuffers-based model file format.

•  A converter tool translates TensorFlow-trained models into the TensorFlow Lite format.

•  The platform lets developers deploy AI directly on mobile devices, and it works on both Android and iOS.

•  It makes developing machine learning apps on mobile devices much easier.
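
One reason for the small footprint and fast model loading listed above is the FlatBuffers-based file format: a memory-mapped FlatBuffers file can be read in place, with no parsing or unpacking step at load time. FlatBuffers itself uses schema-generated accessors; the standard-library sketch below uses an invented file layout (not the real .tflite schema) purely to illustrate that zero-copy idea:

```python
import mmap
import os
import struct
import tempfile

# Write a toy "model file": a 4-byte weight count followed by float32
# weights. (This layout is invented for illustration; real .tflite files
# follow a FlatBuffers schema.)
weights = [0.5, -1.25, 3.0]
with tempfile.NamedTemporaryFile(delete=False, suffix=".bin") as f:
    f.write(struct.pack("<I", len(weights)))
    f.write(struct.pack(f"<{len(weights)}f", *weights))
    path = f.name

# Memory-map the file and read individual values in place: nothing is
# copied or deserialized up front, which is what keeps model loading
# fast in a flat, mappable format.
with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        (count,) = struct.unpack_from("<I", mm, 0)
        (third,) = struct.unpack_from("<f", mm, 4 + 2 * 4)  # weight at index 2
os.unlink(path)
print(count, third)
```

Reading one value touches only the bytes it needs, instead of deserializing the whole file the way a parsed format would.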

 

Conclusion

Google, which started out as a search engine and expanded into the machine learning and AI fields, envisions a future revolving around both. The company may seem to have diversified in recent years, exploring everything from self-driving cars to smartphones, but machine learning sits at the core of nearly every one of its operations.

TensorFlow Lite is still under active development. With further work, it should simplify the experience of targeting models at small devices, and once the complete library is available, it will help developers build apps equipped with much smarter machine learning.
