What Does Security Look Like in the Always-On Environment?


Voice-activated virtual helpers like Alexa, Siri and the Google Assistant are supposed to be useful but not overbearing. In practice, that means they stay silent until their owners say the "wake word," yet remain ready to pick up the next command at any moment.

However, that kind of functionality means these gadgets are always on and listening. In one case, a software defect even caused the Google Home Mini to record constantly, without any wake word at all.

As a result, some concerned consumers wonder what could happen if their trusty gadgets pick up on things they shouldn't hear, and what becomes of the voice input they capture. Is it stored somewhere or transmitted to an unknown party? Questions like these understandably make people realize they need to stay aware of possible security risks and take steps to minimize them.

Smart Speakers Can Facilitate Unauthorized Purchases

Amazon's Alexa-powered speakers let people add items to virtual shopping lists and then purchase them, all using only their voices. Recently, Amazon added the option to train its smart speakers to recognize individual voices, presumably to keep pace with Google Home, which already offered that capability.

The improved voice recognition is promising, but it may not do its job in every case. One known security weakness of smart speakers is that children have tricked the gadgets into buying things without their parents' knowledge; in another case, even a talking parrot placed an order. People concerned about those possibilities should consider disabling voice-activated purchases in Alexa entirely. Alternatively, Alexa can be set to authorize purchases only for people who correctly provide a four-digit confirmation code, as sketched below.
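To make that idea concrete, here is a minimal, hypothetical Python sketch of PIN-gated purchasing logic. It is not Alexa's actual implementation; the salt value, the stored code and the purchase_allowed helper are assumptions used purely to illustrate the concept of confirming a purchase with a short code.

    import hashlib
    import hmac

    # Store only a salted hash of the four-digit code, never the code itself.
    SALT = b"household-salt"  # hypothetical per-household value
    STORED_HASH = hashlib.pbkdf2_hmac("sha256", b"4829", SALT, 100_000)

    def purchase_allowed(spoken_code: str) -> bool:
        """Approve a voice purchase only if the spoken code matches the stored one."""
        candidate = hashlib.pbkdf2_hmac("sha256", spoken_code.encode(), SALT, 100_000)
        return hmac.compare_digest(candidate, STORED_HASH)

    print(purchase_allowed("4829"))  # True: the purchase proceeds
    print(purchase_allowed("1234"))  # False: the purchase is blocked

The design point is that the device stores a hash rather than the code itself and compares it in constant time, so even if the stored value leaked, the code would not.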

The Possible Threat to Privacy

Companies like Google and Amazon, which offer voice-activated technologies, aren't specific about what they do with the speech snippets they collect. As a result, some people wonder whether compromised privacy is the cost of convenience.

Fortunately, both companies offer ways to review the captured voice data and delete it. Amazon says it stores voice recordings to improve the quality of its services, but it also explains how to delete them through a multi-step process. Google offers a similar option, and people can even browse their saved data by date.

People who feel anxious about what companies may hear through voice-activated technologies may find it easier to relax if they periodically visit the respective service's website to see what's there and delete the data.

Smart Speakers Aren't Locked Down Against Hackers

Statistics published near the end of 2017 found that sales of smart speakers tripled in 2017 and totaled nearly 25 million units. As their popularity grows, smart speakers become more alluring to cybercriminals who want to break into them and take control. Unfortunately, that scenario isn't so far-fetched.

A British security researcher revealed that a shortcoming in Amazon Echo speakers sold before 2017 allows an attacker to install malware and turn the gadgets into listening devices.

If a hacker successfully did that, then heard an owner say a string of commands to order an Uber and turn off the lights in a house, they could assume the residence was about to become empty - and choose a time shortly after that to burglarize it.

The Apple HomePod Has an Interesting Security Feature

With the recent release of Apple's HomePod speaker, some reviewers noted that its sound quality was excellent but wondered if the market was already too dominated by Amazon's successful products to make the HomePod a serious competitor. However, the product has another notable perk besides superb sound.

It features end-to-end encryption, which means that once a user talks to Siri and the voice data is transmitted to Apple, the company can't connect it back to that individual or even view it. In contrast, Amazon's representatives have only said that its data is "securely stored," without elaborating.
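As a rough illustration of the general idea, and not Apple's actual protocol, the Python sketch below assumes a symmetric key that never leaves the user's device. Whatever the service receives is ciphertext it cannot read or tie back to a person without that key.

    from cryptography.fernet import Fernet

    # The key is generated and kept on the user's device; the service never sees it.
    device_key = Fernet.generate_key()
    cipher = Fernet(device_key)

    # What leaves the device is ciphertext the provider cannot inspect.
    voice_snippet = b"turn off the living room lights"
    ciphertext = cipher.encrypt(voice_snippet)

    # Only the key holder (the device) can recover the original request.
    assert cipher.decrypt(ciphertext) == voice_snippet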

In 2014, Google began working on end-to-end encryption for Gmail, but that project apparently stalled. Unlike Apple, the company has not come forward to assert that its speakers use end-to-end encryption.

A New, Low-Power Chip Could Make Future IoT Devices More Secure

Besides the step taken by Apple with its HomePod, there's also evidence that upcoming gadgets connecting to the Internet of Things (IoT) could become much less prone to security flaws. That's because researchers at MIT are working on a new kind of chip that performs elliptic-curve cryptography in hardware, using mathematics to lock devices down.

It executes its cryptographic computations about 500 times as fast as a software-based implementation. The scientists working on the project say that this speed, combined with the reduced energy usage, makes the chip a promising way to meet future data encryption needs.
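For readers curious about what elliptic-curve cryptography actually does, here is a small software sketch using Python's widely used cryptography library. It shows an elliptic-curve Diffie-Hellman key agreement, the kind of public-key math a chip like MIT's would run in hardware; the specific curve and the "iot-session" label are illustrative choices, not details of the MIT design.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each side generates an elliptic-curve key pair.
    device_private = ec.generate_private_key(ec.SECP256R1())
    server_private = ec.generate_private_key(ec.SECP256R1())

    # Each side combines its own private key with the other's public key
    # and arrives at the same shared secret without ever transmitting it.
    device_secret = device_private.exchange(ec.ECDH(), server_private.public_key())
    server_secret = server_private.exchange(ec.ECDH(), device_private.public_key())
    assert device_secret == server_secret

    # Derive a fixed-length session key from the shared secret.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"iot-session",
    ).derive(device_secret)

Doing these curve operations in dedicated silicon rather than in software, as above, is what gives the MIT chip its speed and power advantage.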

Always-on devices do pose security risks that make some users understandably wary. However, the fact that people are becoming more aware that they don't know what happens to their data after they say things to a smart speaker could make device manufacturers more willing to use end-to-end encryption - especially if their dominance in the marketplace depends on it.

##

About the Author

Kayla Matthews is a tech-loving blogger who writes and edits ProductivityBytes.com. Follow her on Twitter to read all of her latest posts! 
Published Thursday, March 08, 2018 7:34 AM by David Marshall