
[–] vandilx 3 points 39 points (+42|-3) ago 

If you purposely buy an internet-connected speaker whose advertised job is to listen for you to call its name, it's sending everything it hears to a remote server for analysis to determine whether its name was called. Where that data goes or is stored is unknown.

Same with leaving your iPhone set to respond to "Hey Siri."

If you do any of this, you have fucked yourself. Even if your life is uninteresting and you commit no crimes, you have bugged your own home and self.

[–] BlueBerryPie [S] 2 points 14 points (+16|-2) ago  (edited ago)

For the past 5 years, every Samsung "Smart TV" has had a mic and has been sending everything it hears home to MOM, 24/7, as long as it's plugged into the wall. I unplug every TV made after 2007 that I'm not watching at that moment.

Even when the TV is "OFF", all modern TVs automatically connect to WiFi and stream all audio home. The stated justification is AI training, but in reality, as in this article, the purpose is life-long law enforcement, or more like the movie "V for Vendetta", where governments all over the world want to listen in and gauge the mental state of their captives.


This shit will always be used against you. Insurance companies will demand the audio to deny claims, and any good lawyer can fabricate anything from silence to convince a jury that you were beating the meat while driving, and that the accident was your FAULT.

[–] Norm85 0 points 9 points (+9|-0) ago 

My Samsung "smart TV" is blocked on my router and is not allowed access to the home LAN.

[–] tribblepuncher 1 points 3 points (+4|-1) ago  (edited ago)

This shit will always be used against you. Insurance companies will demand the audio to deny claims

This is probably the point where the idiots who say "you're paranoid, lol" to so many privacy concerns get a nice, big slap in the face.

It seems like one of the few real psychological lines is the pocketbook, and this is where it goes squarely into that. Unfortunately by the time it happens, it may be too late to do much about it, and even if it isn't, it's going to be a very, very tough battle.

Even if you somehow are a perfect angel at home, your kids are going to do something to raise your premiums while your back is turned.

[–] Drunkenmoba 0 points 1 points (+1|-0) ago 

If they don't have WiFi, it's not an issue. But different TVs

[–] Longsword 0 points 0 points (+0|-0) ago 

I let my LG OLED connect to WiFi for streaming Netflix etc., but voice/microphone had a separate user agreement that I never accepted, so I can't use it, and they can't use it.

[–] advan 1 points 11 points (+12|-1) ago 

It doesn't send everything it hears to a server to determine the activation keyword. Keyword activation is done at the hardware level, and once activated, it sends anything that follows to the server.

It wouldn't work to stream audio 24/7 to a server for analysis; it's entirely too costly.
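The two-stage design described above can be sketched roughly like this. This is purely illustrative, not any vendor's actual firmware: `detect_wake_word` stands in for the small on-device detector, and `upload` stands in for the network call. The point is that audio only leaves the device after the local trigger fires.

```python
# Hypothetical sketch of a two-stage wake-word pipeline.
# detect_wake_word() is a stand-in for an on-device model; real ones
# run a tiny DSP/neural detector over a rolling audio buffer.
from collections import deque

WINDOW_FRAMES = 50  # size of the rolling buffer the local detector scores


def detect_wake_word(frames):
    """Stand-in detector: fires when the wake token appears in the window."""
    return "hey_device" in frames


def run_pipeline(mic_frames, upload):
    buffer = deque(maxlen=WINDOW_FRAMES)
    triggered = False
    for frame in mic_frames:
        if triggered:
            upload(frame)  # only post-trigger audio is sent to the server
        else:
            buffer.append(frame)  # pre-trigger audio stays on the device
            triggered = detect_wake_word(buffer)


sent = []
run_pipeline(["noise", "noise", "hey_device", "what", "time"], sent.append)
print(sent)  # ['what', 'time']
```

Note that the "noise" frames and the wake word itself never reach `upload`; only the query that follows does.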

[–] weezkitty 0 points 5 points (+5|-0) ago 

The bandwidth consumption would also become obvious
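Some back-of-envelope numbers behind this point. The format (16 kHz, 16-bit mono) is an assumption on my part, just a common speech-capture setting, but it gives a sense of the volume a 24/7 stream would add to a connection:

```python
# Rough daily data volume for a continuous 24/7 audio upload,
# assuming 16 kHz sample rate, 16-bit (2-byte) mono samples.
SAMPLE_RATE = 16_000        # samples per second
BYTES_PER_SAMPLE = 2        # 16-bit audio
SECONDS_PER_DAY = 86_400

raw_per_day = SAMPLE_RATE * BYTES_PER_SAMPLE * SECONDS_PER_DAY
print(f"uncompressed: {raw_per_day / 1e9:.2f} GB/day")   # ~2.76 GB/day

# A voice codec at ~24 kbit/s shrinks it, but it's still a steady trickle:
CODEC_BITRATE = 24_000      # bits per second
coded_per_day = CODEC_BITRATE / 8 * SECONDS_PER_DAY
print(f"compressed:   {coded_per_day / 1e6:.0f} MB/day")  # ~259 MB/day
```

Even compressed, that's a constant upstream flow from a device that is supposedly idle, which is exactly the kind of thing a router traffic graph would show.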

[–] VOTINGUP 1 points 3 points (+4|-1) ago 

Point taken, though if one becomes of interest to the fake authorities, doesn't it give them the technology to target certain individuals and stream 24/7 audio?

[–] TheCuckFather 0 points 0 points (+0|-0) ago  (edited ago)

Nothing is done at the hardware level; it's all software. Perhaps you're not aware that WiFi chips largely run microcode, which is software.

They don't have to stream audio 24/7. They just have to record live, compress to 4 kHz bandwidth, and every 5 minutes send the 5-minute history home to MOM. It takes the server only a moment to search for 'keywords' (like bomb, ...), then the stream gets stored according to your 'fingerprint' (a unique ID derived from your browser and hardware).
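For scale, the scheme this comment imagines would be quite small per chunk. This is arithmetic on the commenter's own hypothetical numbers (4 kHz audio bandwidth implies an 8 kHz sample rate, telephone quality), plus a toy keyword scan; none of it describes any real device's behavior:

```python
# Size of one 5-minute chunk under the hypothetical scheme above:
# 8 kHz sample rate (~4 kHz audio bandwidth), 16-bit mono.
SAMPLE_RATE = 8_000
BYTES_PER_SAMPLE = 2
CHUNK_SECONDS = 5 * 60

raw_chunk_bytes = SAMPLE_RATE * BYTES_PER_SAMPLE * CHUNK_SECONDS
print(f"raw 5-min chunk: {raw_chunk_bytes / 1e6:.1f} MB")  # 4.8 MB

# A ~13 kbit/s speech codec would shrink each chunk further:
CODEC_BITRATE = 13_000
coded_chunk_bytes = CODEC_BITRATE / 8 * CHUNK_SECONDS
print(f"coded 5-min chunk: {coded_chunk_bytes / 1e6:.2f} MB")  # ~0.49 MB

# Toy server-side keyword scan over a transcript (pure illustration):
KEYWORDS = {"bomb"}

def flag(transcript):
    return any(word in KEYWORDS for word in transcript.lower().split())
```

Whether anyone actually does this is the disputed point in this thread; the numbers only show it wouldn't be ruled out by chunk size alone.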

[–] dontreadit 0 points 8 points (+8|-0) ago 

That's not completely true; the "Hey Siri" detection works on the device itself. After that, however, the recordings are sent to Apple.

Creepy side note though: for cases where detection was reported as wrong, they have people listen to the recordings and type out what was actually meant, to train the AI.

[–] tossbow 0 points 0 points (+0|-0) ago 

It's outsourced to random people, too; it's not even done fully in-house. The recordings are reviewed anonymously, at least.

[–] trazzz 0 points 2 points (+2|-0) ago 

Why would someone buy an internet-connected speaker? I don't even trust the microphone on my phone...

[–] advan 0 points 4 points (+4|-0) ago  (edited ago)

For the same reason you own a smart phone.