
Google Open Sources Its Machine Learning Platform, TensorFlow

Ryan Kasso

Google recently open-sourced its TensorFlow machine learning library, which aims to bring large-scale, distributed machine learning and deep learning to everyone.

Google’s existing machine-learning infrastructure, known as DistBelief, has hitherto been used internally at Google to do things like identify and automatically label items contained within YouTube videos and photos, as well as improve speech recognition in Google apps. But DistBelief was limited, insofar as it was “narrowly targeted to neural networks, it was difficult to configure, and it was tightly coupled to Google’s internal infrastructure — making it nearly impossible to share research code externally,” according to a blog post by Jeff Dean, senior Google fellow.

TensorFlow is Google's second-generation machine-learning system, one the company claims is twice as fast as DistBelief, more flexible, and able to run on anything from an individual smartphone to entire data centers.

Nowadays, more and more Internet giants share the software sitting at the heart of their online operations, and open source accelerates the progress of technology. By open-sourcing its TensorFlow AI engine, Google gains an edge in AI development over other tech companies. TensorFlow is arguably the most powerful AI tool on the market right now, and those interested in technology and AI will likely explore it. Google can feed all sorts of machine-learning research outside the company, and in many ways that research will feed back into Google.

Machine learning is an integral part of what powers our online existence. It is the technology behind Facebook friend suggestions and language translation. TensorFlow performs best on machines equipped with GPUs, or graphics processing units: chips that were originally designed to render graphics for games and the like, but that have proven adept at the heavy numerical work behind neural networks. Cards such as Nvidia's Titan X, for example, power AMAX's Deep Learning Engines. AI is playing an increasingly important role in the world's online services, and chip architectures are playing an increasingly important role in AI.
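As a rough illustration of how TensorFlow targets GPUs, here is a minimal sketch using the current TensorFlow 2 Python API (which postdates the release described in this post). It checks which GPUs are visible and pins a small computation to one; the matrices are arbitrary example values.

```python
import tensorflow as tf

# List any GPUs TensorFlow can see on this machine (an empty list means CPU only).
print(tf.config.list_physical_devices('GPU'))

# Fall back to the CPU gracefully if no GPU is actually present.
tf.config.set_soft_device_placement(True)

# Pin a small matrix multiplication to the first GPU.
with tf.device('/GPU:0'):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[1.0, 0.0], [0.0, 1.0]])
    print(tf.matmul(a, b))
```

The same code runs unchanged on a laptop CPU or a GPU-equipped server; TensorFlow decides where each operation executes.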

TensorFlow deals in a form of AI called deep learning. With deep learning, you teach systems to perform tasks such as identifying spoken words (as Apple's Siri does), recognizing images, and even understanding natural language by feeding data into vast neural networks, networks of connected machines that approximate the web of neurons within the human brain. If you feed photos of flowers into a neural net, you can teach it to recognize flowers. If you feed it conversational data, you can teach it to carry on conversations. In our other blog post, "Deep Learning Smarts up Your Smart Phone," you can read more about how machine learning works, the shift to deep learning, and the differences between the two.
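To make the flower example concrete, here is a minimal sketch using today's tf.keras Python API (which also postdates the 2015 release covered here). The 64x64 image size, the layer sizes, and the five flower classes are illustrative assumptions, and the training data is a hypothetical dataset you would supply yourself.

```python
import tensorflow as tf

# A toy convolutional network for classifying small flower photos.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation='relu', input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(5, activation='softmax'),  # e.g. 5 flower species
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# train_images: float array of shape (num_photos, 64, 64, 3), scaled to [0, 1]
# train_labels: integer class IDs, one per photo (hypothetical dataset)
# model.fit(train_images, train_labels, epochs=10)
```

Feed it labeled flower photos and it learns to tell the species apart; feed a similar network conversational data and it learns patterns of dialogue instead.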