
Why I Don’t Want a Virtual Assistant in My Home

I’m always amazed at how quickly people will buy and completely trust a virtual assistant even when privacy threats seem obvious. 

Virtual assistants, devices with names like Alexa and Echo, are appealing to a lot of people. I wouldn’t try to deny that.

I’ve seen Alexa in person: we have one at the office that we use to monitor a feed we provide. From time to time, a co-worker will ask it to play some pop song, much to the chagrin of his or her colleagues. (The colleagues, sooner or later, will ask for a song of their own for revenge.)

But what bothers me about devices like these is that they’re always listening. Always. Even when we assume they aren’t. 

And it sure is easy to forget they’re in the room when you’re doing other things! 

Yes, they’re listening to you.

A recent article from Bloomberg confirms what many already suspected: people are paid to listen to your conversations with your virtual assistant.

On the one hand, it’s perfectly understandable. They’re looking for ways to improve their service by listening to what’s being asked for and comparing it to what’s actually being provided. In some cases, they’re listening for proper names, like those of musicians, so they can make sure they’re matching requests to the right body of work.

But on the other hand, it should give people pause. Even when the intent is innocent, it should scare the hell out of everyone that someone could be hearing what you’re saying when you aren’t even aware of it. Many of these devices have a specific word combination that activates them. “Hey Siri” is what Apple users say to summon that company’s virtual assistant. From what I’ve seen, for Alexa, you just call its name and wait for the hockey-puck-shaped device to light up.

But the article points out that some of the content these employees are paid to listen to has been “disturbing.”

At least two workers have reported hearing what they believe was a “sexual assault.” If that’s actually what they heard, does anyone believe the assaulter called out for Alexa to listen in? The article doesn’t say the possible victim called Alexa’s name so the device could dial 911. I’d like to believe a detail like that would have been made clear, but maybe it wouldn’t have been.

Without knowing the specific circumstances of how a possible sexual assault came to be recorded, we have to wonder whether recordings are happening even when we aren’t calling the device’s name.

And if that’s a genuine possibility, imagine this scenario: You call your bank to question a charge on your credit card. You recite your credit card number to the bank’s automated phone system. Then, when you eventually reach an actual human, you have to give security answers and potentially other information hackers could use to gain access to your account. All of it spoken out loud, within earshot of a device that might be listening.

Maybe I’m paranoid. Or maybe I just sound paranoid.

But if you don’t know who’s listening, how can you be sure they won’t compromise sensitive conversations they overhear, conversations you didn’t even know were being recorded in the first place?

Do you use a virtual assistant device? Do you ever worry about who might be listening?

