
This article was published on August 3, 2021

All you need to know about Tensor, Google’s own processor

AI galore



Google unveiled its Pixel 6 phones for 2021 last night. While the devices won't arrive until later this year, one of the highlights of the announcement was the company's in-house chip, called Tensor.

The company’s CEO, Sundar Pichai, tweeted a picture of the chip and said it has been “4 years in the making,” building on two decades of the firm’s computing experience.

The chip’s announcement wasn’t totally surprising, though. We heard rumors back in April that the company was developing its own processor under the ‘Whitechapel’ codename.

While the company didn’t tell us much about the specifications of this new processor, it did mention it’s based on ARM architecture — just like Apple’s M1 chip.

Google has put its own chips in Pixel phones before: there’s the Pixel Visual Core for photography processing, the Pixel Neural Core for AI tasks, and the Titan M for security. However, the Pixel 6 will be the first phone to use a Google chip, Tensor, as its main processor, rather than borrowing one from Qualcomm.

With a name like Tensor, a nod to Google’s open-source machine learning framework TensorFlow, you can expect some serious AI chops on the phone. In a tweet thread about the phones, the company said that this new processor will get you a “transformed experience” in photography and speech recognition.

What will Tensor offer on the Pixel 6?

In an interview with Engadget, Rick Osterloh, the company’s SVP of devices and services, said that Tensor has a redesigned Image Signal Processor (ISP). He said there are a “few points in the ISP where we can actually insert machine learning, which is new.”

Marques Brownlee, who runs the YouTube tech channel MKBHD, said that in a closed demo, Google showed off how the processor’s machine learning chops could improve the Pixel 6’s video capture. Basically, the company has finally managed to apply its much-touted HDRnet image model to video, which means you can expect capabilities like noise reduction and HDR support in your footage.
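
Google hasn’t shared how HDRnet actually runs on Tensor, but conceptually, applying an image-enhancement model to video just means running inference on every single frame. Here’s a minimal Kotlin sketch using the public TensorFlow Lite Interpreter to show the idea; the model file name and buffer sizes are hypothetical placeholders, not anything Google has confirmed about the Pixel 6.

```kotlin
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Rough sketch only: run an HDRnet-style enhancement model on each video frame.
// "hdrnet_like.tflite" and the buffer layout are made-up placeholders.
class FrameEnhancer(modelFile: File) {
    private val interpreter = Interpreter(modelFile)

    // Takes one raw frame as a direct ByteBuffer and returns an enhanced frame
    // of the same size (e.g. tone-mapped, noise-reduced).
    fun enhance(frame: ByteBuffer): ByteBuffer {
        val output = ByteBuffer
            .allocateDirect(frame.capacity())
            .order(ByteOrder.nativeOrder())
        interpreter.run(frame, output) // one inference per frame
        output.rewind()
        return output
    }
}
```

On an actual phone, that per-frame inference would presumably be offloaded to the chip’s dedicated ML hardware rather than a stock interpreter loop, which is exactly where a purpose-built processor like Tensor would earn its keep.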

[Image: The Pixel 6 and the Pixel 6 Pro]

Engadget also noted that the new phones can unblur shaky photos of toddlers or pets. I definitely want to try this on my cats.

During its Android 12 presentation at Google I/O in May, the Big G mentioned that it’s also working on improving selfies across different skin tones. I wonder if the AI models running on the Tensor chip could give us a first taste of that.

Google has noted that it’ll use a new Speech On Device API to power language-based features, including composing messages by voice and transcribing and translating audio clips.
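
Google hasn’t published documentation for that Speech On Device API yet, so as a rough stand-in, here’s how an app can already ask Android’s existing SpeechRecognizer to prefer offline, on-device recognition. The function name and setup are illustrative, not Google’s new API.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Illustrative stand-in for on-device dictation using the existing Android
// SpeechRecognizer API. The app needs the RECORD_AUDIO permission.
fun startOnDeviceDictation(context: Context) {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)

    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onResults(results: Bundle) {
            // Take the top transcription candidate.
            val text = results
                .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
            println("Transcribed: $text")
        }
        // Remaining callbacks left empty for brevity.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(error: Int) {}
        override fun onPartialResults(partialResults: Bundle?) {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })

    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(
            RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
        )
        // Ask the system to keep recognition on the device when it can.
        putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)
    }
    recognizer.startListening(intent)
}
```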

I’ve used Google’s voice recorder app on Pixel phones, which has an amazing transcription feature, so I’m really stoked about trying out the language features on the Pixel 6.

For years, there has been an argument that Apple’s iPhones are better at certain things, like RAM management and multitasking, because the company makes its own chips and can have its hardware and software play well together. The Tensor chip gives Google the chance to show off its AI prowess and what Android can do on a purpose-built processor. This should be exciting.
