Google Assistant 2.0 isn’t just a minor evolution. It’s a game-changing upgrade


Folding devices like the Galaxy Fold and Huawei Mate X represent the next major shift in phone design, but what about the one after that? What will change the way we interact with our phones once we're done folding them in half?

Google gave us a teaser during the Google I/O 2019 keynote presentation, as it demonstrated the prowess of Google Assistant when the masses of data it requires to operate are shifted from the cloud to the device. Voice control has been part of our smartphone experience for a while, but the speed, versatility, and accuracy of this advanced system could be a game-changer.


Meet Google Assistant 2.0

What did Google announce? A next-generation version of the Google Assistant we currently know and love from our Android phones, Google Nest products, and even Android Auto. Google Assistant uses three complex algorithms to understand, predict, and act upon what we're saying, which together require 100GB of storage and a network connection to operate. Google announced it has used deep learning to combine and shrink those algorithmic models down to 500MB, which means they fit happily on our phones and network latency no longer slows responses and actions down.
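
Google hasn't said exactly how it compressed those models, but the general technique of shrinking a trained model so it fits and runs on a phone can be sketched with post-training quantization in TensorFlow Lite. This is a minimal illustration only, assuming a hypothetical speech model exported at a made-up SavedModel path; it is not Google's actual Assistant pipeline.

```python
import tensorflow as tf

# Hypothetical path to a trained speech model exported in SavedModel format;
# Google's actual Assistant models are not publicly available.
converter = tf.lite.TFLiteConverter.from_saved_model("speech_model_savedmodel/")

# Post-training quantization: store weights as 8-bit integers instead of
# 32-bit floats, for roughly a 4x reduction in on-disk and in-memory size.
converter.optimizations = [tf.lite.Optimize.DEFAULT]

tflite_model = converter.convert()

# The resulting .tflite file ships inside the app and runs entirely on the
# device, so no network round trip is needed for each request.
with open("speech_model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```

Quantization alone wouldn't get from 100GB to 500MB; Google says it also combined its models. The destination is the same, though: weights small enough to live on the handset, so no request has to wait on the network.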

Google Assistant demonstrated at Google I/O 2019. Julian Chokkattu/Digital Trends

Google CEO Sundar Pichai said using the next-generation Assistant is so fast it’ll make tapping the screen seem slow.

“I think this is going to transform the future of the Assistant,” Pichai said.

Hyperbole? No. The demo was mind-blowing. The verbal commands came back-to-back, and included setting timers, opening apps, performing searches, carrying out basic phone operations, and even taking a selfie. A second demo showed how Assistant could quickly and easily generate message and email replies using Google Photos and Search. It all used continuous conversation, without the Hey Google wake word, along with natural commands, often across multiple apps.

Next Generation Google Assistant: Demo 2 at Google I/O 2019

Scott Huffman, Google’s vice president of engineering for Google Assistant, summed up what the new Assistant could do, saying: “This next generation Assistant will let you instantly operate your phone with your voice, multi-task across apps, and complete complex actions, all with nearly zero latency.”

Simply put, Google is giving you the tools to confidently speak to your phone, and have it work faster than when you touch it. This has the potential to transform the way we use our devices, and even the overall design of the software and hardware, in the future.

Transformative

Integrating a reliable, fast version of Google Assistant into our phones without the need for a network connection is the final hurdle for creating a truly voice-operated device. Voice-controlled software like this needs to be genuinely helpful for us to use it, and until it can do everything we ask with little or no alteration to the way we naturally speak, it won't become indispensable. The on-device Assistant is a massive step toward this.

Recently, Google has pushed for changes in how we summon Assistant on our phones, with many new devices using a short press of the sleep/wake key to open Assistant rather than an on-screen action, and many now shipping with a dedicated Google Assistant button. This walkie-talkie-style action makes it easier to call up the Assistant without looking at the phone, ready for verbal control through a pair of headphones, and it's crucial for speeding up and simplifying the launch process.

Removing the need for a wake word, such as Hey Google, and introducing continuous conversation is also key. Continued conversation is already part of Google Home but not of Assistant on our phones, and without it the speed required for truly seamless voice control wouldn't be possible. All of this combined gives you a look at Google's plan to make Assistant part of our regular phone routine.

Speed is everything, because with it comes convenience. Without it, there's only frustration. You can reply to messages now using dictation, but you have to go through a series of steps first, and Assistant can't always help. Using voice is faster, provided the software is accurate and responsive enough. Google Assistant 2.0 looks like it will achieve this goal, and using our phones for something more than just basic, often-repeated tasks may be about to become a quicker, less screen-intensive process.

Scenarios

Less screen-intensive? Definitely. If we trust the software to do what we ask, even in the most basic situations, we will look at our phones less. We can carry out simple tasks now using Assistant and our voice, but not with the level of accuracy, versatility, and speed shown at Google I/O.

It’s the versatility that shouldn’t be overlooked. Performing multiple tasks, all in succession, without manually flicking through apps or making multiple gesture-based selections, will make our phones more natural to use. It’s the way we perform tasks in the real world, and how we tell others what we want them to do, or communicate what we’re about to do. It’s all very natural.

However, the concept of a voice-controlled phone isn't without problems. First, doing all this will take some practice. Understanding how to use voice, from which commands it accepts to how to end a conversation, requires patience, and retraining our brains not to resort to a finger or a gesture on our phones will take some time.

Not only that, it will require us to become more comfortable using voice for control, particularly outside the home. It will also require accepting that Google will know more about us, and that careless talk to a phone in public could open up privacy problems. We'll all have to be more vigilant about what we share with Google, and which actions we carry out in public, once we start using voice more often.

Google’s not the first

The on-stage Assistant demo was easily the most comprehensive and relatable example of how voice can transform our phone use that we've seen so far, but Google isn't the first to try to harness the power of speech for device control, or to explore the speed of on-device A.I. processing.

Huawei made excellent use of on-device A.I. for image recognition and other camera-related features when it introduced the Kirin 970 processor, which had a Neural Processing Unit (NPU) onboard, ready to take the A.I. strain rather than leave the processing in the hands of a cloud-based system. The speed benefits were enormous, and unique at the time. Huawei has since gone on to demonstrate the ability of the NPU in interesting ways and to outline how it sees A.I. shaping the future, while some other manufacturers have soldiered on with cloud-driven A.I., with poor results.

Huawei's Kirin 970 A.I. chip. Huawei

When Samsung launched its own virtual assistant, Bixby, in 2017, the goal was to create an assistant that could cover everything we'd normally do with a touch command. Samsung's Injong Rhee told Digital Trends at the time, “What we're looking at is revolutionizing the interface.” Bixby isn't the best example of a capable voice assistant, but Samsung's prediction that voice, done correctly, would be revolutionary was accurate.

When will it happen?

What we're on the cusp of here, now that Google has found a way to squeeze 100GB of complex data modeling into 500MB, is the development of phone interfaces, apps, and potentially even hardware designs that rely on us looking and touching less, and speaking more. Pichai wasn't exaggerating when he called this breakthrough a “significant milestone.”

We won't even have to wait long before it's possible to try it out. Huffman promised that the next-generation Assistant will first come to the new Pixel phones, meaning the Pixel 4, later in 2019. Assistant is available on the vast majority of Android smartphones, and although the feature will debut on the new Pixel and Android Q software, more phones will almost certainly get it in the future.

The question is, are you ready to use voice as often as you use touch to control your phone?

Andy Boxall