technology

In the future, your home will feel as safe from intruders as a state-of-the-art military base.

Cameras and sensors surveil the perimeter, scanning bystanders’ faces for potential threats. Drones from a “deterrence pod” scare off trespassers by projecting a searchlight over any suspicious movements. A virtual view of the home is rendered in 3D and updated in real time, just like a Tesla’s digital display. And private security agents monitor alerts from a central hub.

This is the vision of home security pitched by Sauron, a Silicon Valley start-up boasting a waiting list of tech CEOs and venture capitalists.

Couldn't find a non-paywall version. Edit: found one thanks to @ireticent: https://archive.is/roxCM


This is strange to me. Did the students create the deepfake nudes, or did the software create them? A normal image editor won't produce explicit material on its own; the user has to make it with a mouse, tablet, or whatever. But AI algorithms will. Even if the kids were giving the instructions to the AI, why isn't the software and/or the company that runs it at least somewhat liable for creating child porn?

Suppose the students drew the nudes themselves, but they were really bad stick-figure drawings with a name beside each one to say who it's supposed to be. Is that illegal? What if they were really good artists and the drawings looked like photos? At what point is the art considered good enough to be illegal?

"...in a demonstration of the risks of connected, autonomous AI ecosystems, a group of researchers has created one of what they claim are the first generative AI worms—which can spread from one system to another, potentially stealing data or deploying malware in the process."
