TensorFlow Content on InfoQ
-
Google Introduces New Metrics for AI-Generated Audio and Video Quality
Google AI researchers published two new metrics for measuring the quality of audio and video generated by deep-learning networks, the Fréchet Audio Distance (FAD) and Fréchet Video Distance (FVD). The metrics have been shown to have a high correlation with human evaluations of quality.
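Both metrics compare statistics of embeddings of real and generated samples using the Fréchet distance between fitted Gaussians. The following is a minimal sketch of that shared computation, assuming embeddings have already been extracted with an appropriate feature network; the function and variable names are illustrative, not taken from the paper.

    import numpy as np
    from scipy import linalg

    def frechet_distance(real_embeddings, generated_embeddings):
        # Fit a Gaussian (mean and covariance) to each set of embeddings,
        # each of shape (num_samples, embedding_dim).
        mu_r = real_embeddings.mean(axis=0)
        mu_g = generated_embeddings.mean(axis=0)
        cov_r = np.cov(real_embeddings, rowvar=False)
        cov_g = np.cov(generated_embeddings, rowvar=False)
        # Matrix square root of the covariance product; tiny imaginary parts
        # from numerical error are discarded.
        cov_sqrt, _ = linalg.sqrtm(cov_r @ cov_g, disp=False)
        cov_sqrt = cov_sqrt.real
        return float(np.sum((mu_r - mu_g) ** 2)
                     + np.trace(cov_r + cov_g - 2.0 * cov_sqrt))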
-
Machine Learning on Mobile and Edge Devices with TensorFlow Lite: Daniel Situnayake at QCon SF
At QCon SF, Daniel Situnayake presented "Machine learning on mobile and edge devices with TensorFlow Lite". The talk focused on TensorFlow Lite, a production-ready, cross-platform framework for deploying ML on mobile devices and embedded systems.
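As a rough sketch of the deployment flow (the file paths are placeholders, not from the talk), a trained Keras model is converted to the TensorFlow Lite flat-buffer format and then run with the on-device interpreter:

    import tensorflow as tf

    # Convert a trained Keras model (path is a placeholder) to TF Lite format.
    model = tf.keras.models.load_model("trained_model_dir")
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()

    with open("model.tflite", "wb") as f:
        f.write(tflite_model)

    # On-device (or for local testing), the flat buffer is run by the interpreter.
    interpreter = tf.lite.Interpreter(model_content=tflite_model)
    interpreter.allocate_tensors()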
-
Google Introduces TensorFlow Enterprise in Beta
In a recent blog post, Google announced TensorFlow Enterprise, a cloud-based TensorFlow machine learning service that includes enterprise-grade support and managed services.
-
PyTorch and TensorFlow: Which ML Framework is More Popular in Academia and Industry
An article recently published on The Gradient examines the state of machine-learning frameworks in 2019. Using several metrics, it argues that PyTorch is quickly becoming the dominant framework for research, while TensorFlow remains the dominant framework for industry applications. This article dives into their differences.
-
Databricks' Unified Analytics Platform Supports AutoML Toolkit
Databricks recently announced the Unified Data Analytics Platform, including an automated machine learning tool called AutoML Toolkit. The toolkit can be used to automate various steps of the data science workflow.
-
Facebook Open-Sources RoBERTa: an Improved Natural Language Processing Model
Facebook AI open-sourced a new deep-learning natural-language processing (NLP) model, the Robustly optimized BERT approach (RoBERTa). Based on Google's BERT pre-training model, RoBERTa includes additional pre-training improvements that achieve state-of-the-art results on several benchmarks, using only unlabeled text from the web, with minimal fine-tuning and no data augmentation.
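For readers who want to experiment, the released model can be loaded through PyTorch Hub. A minimal sketch, assuming the fairseq release is reachable via torch.hub, its dependencies are installed, and an internet connection is available to download the weights:

    import torch

    # Download and load the pre-trained RoBERTa base model released by Facebook AI.
    roberta = torch.hub.load('pytorch/fairseq', 'roberta.base')
    roberta.eval()  # disable dropout for inference

    # Encode a sentence with RoBERTa's byte-pair-encoding tokenizer and
    # extract contextual token embeddings.
    tokens = roberta.encode('RoBERTa builds on BERT pre-training.')
    features = roberta.extract_features(tokens)
    print(features.shape)  # (1, num_tokens, 768) for the base model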
-
Denis Magda on Continuous Deep Learning with Apache Ignite
At the recent ApacheCon North America, Denis Magda spoke on continuous machine learning with Apache Ignite, an in-memory data grid. Ignite simplifies the machine-learning pipeline by performing training and hosting models in the same cluster that stores the data, and can perform "online" training to incrementally improve models when new data is available.
-
Waymo Shares Autonomous Vehicle Dataset for Machine Learning
Waymo, the self-driving technology company, released a dataset containing sensor data collected by their autonomous vehicles during more than five hours of driving. The set contains high-resolution data from lidar and camera sensors collected in several urban and suburban environments in a wide variety of driving conditions and includes labels for vehicles, pedestrians, cyclists, and signage.
-
New Technique Speeds up Deep-Learning Inference on TensorFlow by 2x
Researchers at North Carolina State University recently presented a paper at the International Conference on Supercomputing (ICS) on their new technique, "deep reuse" (DR), that can speed up inference time for deep-learning neural networks running on TensorFlow by up to 2x, with almost no loss of accuracy.
-
Google Releases Post-Training Integer Quantization for TensorFlow Lite
Google announced new tooling for their TensorFlow Lite deep-learning framework that reduces the size of models and latency of inference. The tool converts a trained model's weights from floating-point representation to 8-bit signed integers. This reduces the memory requirements of the model and allows it to run on hardware without floating-point accelerators and without sacrificing model quality.
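A minimal sketch of how such a conversion looks with the TensorFlow Lite converter; the SavedModel path, input shape, and calibration loop below are placeholders, and a real representative dataset should come from training or validation data:

    import numpy as np
    import tensorflow as tf

    def representative_data_gen():
        # Yield a small number of realistic input batches so the converter can
        # calibrate the activation ranges used for integer quantization.
        for _ in range(100):
            yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data_gen
    quantized_model = converter.convert()

    with open("model_quantized.tflite", "wb") as f:
        f.write(quantized_model)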
-
Google Releases TensorFlow.Text Library for Natural Language Processing
Google released TensorFlow.Text, a new text-processing library for their TensorFlow deep-learning platform. The library allows several common text pre-processing activities, such as tokenization, to be handled by the TensorFlow graph computation system, improving consistency and portability of deep-learning models for natural-language processing.
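A small illustration of in-graph tokenization; the example sentence is arbitrary, and the library offers several tokenizers beyond the whitespace one shown here:

    import tensorflow as tf
    import tensorflow_text as text

    # The tokenizer is a TensorFlow op, so the same pre-processing ships
    # inside the saved model rather than living in separate Python code.
    tokenizer = text.WhitespaceTokenizer()
    tokens = tokenizer.tokenize(["TensorFlow Text keeps preprocessing in the graph."])
    print(tokens.to_list())  # ragged tensor of UTF-8 token strings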
-
Google Announces TensorFlow Graphics Library for Unsupervised Deep Learning of Computer Vision Models
At a presentation during Google I/O 2019, Google announced TensorFlow Graphics, a library for building deep neural networks for unsupervised learning tasks in computer vision. The library contains 3D rendering functions written in TensorFlow, as well as tools for learning with non-rectangular mesh-based input data.
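As a small, hedged illustration of the differentiable-graphics idea (assuming the tensorflow_graphics package and its rotation_matrix_3d module are installed as released), a 3D rotation can be built as a regular TensorFlow op so gradients flow through it:

    import numpy as np
    import tensorflow as tf
    from tensorflow_graphics.geometry.transformation import rotation_matrix_3d

    # Build a differentiable rotation matrix from Euler angles (radians):
    # here a 90-degree rotation about the z-axis.
    angles = tf.constant([[0.0, 0.0, np.pi / 2.0]])
    rotation = rotation_matrix_3d.from_euler(angles)

    # Rotate a point; because everything is a TensorFlow op, the operation can
    # sit inside a network and be trained end to end.
    point = tf.constant([[1.0, 0.0, 0.0]])
    rotated = tf.linalg.matvec(rotation, point)
    print(rotated.numpy())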
-
Google's Cloud TPU V2 and V3 Pods Are Now Publicly Available in Beta
Recently, Google announced that its second- and third-generation Cloud Tensor Processing Unit (TPU) Pods, its scalable cloud-based supercomputers with up to 1,000 of its custom TPU chips, are now publicly available in beta. With these Pods, machine learning (ML) researchers, engineers, and data scientists can reduce the time needed to train and deploy machine-learning models.
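As a rough sketch of how a Pod slice is addressed from TensorFlow 2.x (the TPU name below is a placeholder; earlier TensorFlow versions expose the strategy under tf.distribute.experimental instead):

    import tensorflow as tf

    # Connect to a Cloud TPU (the name "my-tpu-pod" is a placeholder) and
    # create a distribution strategy that replicates work across its cores.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu-pod")
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)

    with strategy.scope():
        # Variables created here are mirrored across the TPU cores.
        model = tf.keras.Sequential([tf.keras.layers.Dense(10, activation="softmax")])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")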
-
Google Scales Weak Supervision to Overcome Labeled Dataset Problem
Google recognizes that the need for labeled data in machine learning (ML) is a significant bottleneck and recently adapted the open-source Snorkel framework to overcome the problem at scale. Google enhanced Snorkel by integrating it with TensorFlow, using the file system for sharing data instead of a database, and creating separate executables for labeling functions.
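To give a flavour of what a labeling function looks like in the open-source Snorkel library (this sketch uses the public Python API, not Google's internal adaptation; the heuristic, labels, and example data are made up for illustration):

    import pandas as pd
    from snorkel.labeling import labeling_function, PandasLFApplier

    ABSTAIN, SPAM, NOT_SPAM = -1, 1, 0

    @labeling_function()
    def lf_mentions_free_offer(x):
        # A weak heuristic instead of a hand-assigned label; many such
        # functions are combined and denoised by Snorkel's label model.
        return SPAM if "free offer" in x.text.lower() else ABSTAIN

    df = pd.DataFrame({"text": ["Claim your free offer today!", "Lunch at noon?"]})
    applier = PandasLFApplier([lf_mentions_free_offer])
    label_matrix = applier.apply(df)
    print(label_matrix)  # one row per example, one column per labeling function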
-
Teaching the Computer to Play the Chrome Dinosaur Game with TensorFlow.js Machine Learning Library
A simple yet entertaining and educational application of machine learning was recently published on Fritz's Heartbeat Medium publication: Google's TensorFlow.js library is used in the browser to teach the computer to play the Chrome Dinosaur Game.