
AI, IP Cameras and Privacy Legislation

Concern

I have today finished reading an ebook called 2062: The World That AI Made, by Toby Walsh, an Australian who specialises in Artificial Intelligence (AI).

You can borrow the ebook online from the Logan library.

The last chapter basically says that our current laws, society, and economy need to change to take into account what artificial intelligence is currently doing and will do in the future.

IP Cameras


“IP cameras are considered part of the Internet of Things (IoT) and have many of the same benefits and security risks as other IP-enabled devices”

Source: https://en.m.wikipedia.org/wiki/Closed-circuit_television
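
To make the "IP-enabled device" point concrete, here is a minimal sketch in Python, using the OpenCV library, of pulling a single frame from an IP camera over the network. The stream address and credentials are placeholders, not a real camera; the point is only that the footage is reachable by anything that knows the URL.

```python
# Minimal sketch: an IP camera is just another networked device.
# Anything that knows the stream URL and credentials can read its
# video over the network. The address below is a placeholder.
import cv2

STREAM_URL = "rtsp://user:password@192.0.2.10:554/stream1"  # hypothetical camera

cap = cv2.VideoCapture(STREAM_URL)   # open the camera's network stream
ok, frame = cap.read()               # grab one frame across the network
if ok:
    cv2.imwrite("frame.jpg", frame)  # the footage is now ordinary data on another machine
cap.release()
```

The same openness that lets an owner view their camera from anywhere is what makes these devices attractive targets, which is the security risk the quote refers to.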

Privacy legislation

Privacy legislation does not, in itself, provide privacy. It only regulates what is regarded as private, and how that information is collected, used, stored and disseminated. Importantly, it also sets out what is exempt from the legislation, such as matters of “national security”.

Non-personal data

Because of AI and recognition software, it is probable that a piece of data that is not a digital photo, and so not obviously personal, would not be regarded as personal information under current legislation. However, even though it carries no identifying information on its own, that same data can be combined with other non-identifying data to identify a person. Recognition data can also be used, without any identification at all, to infer factors like a person's sexual orientation or the likelihood of them being a criminal. To do this, though, the machine-learning software needs something to learn from, and we humans have to provide that data. We can, have, and most likely will make mistakes about that data, at the expense of people's lives.
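
The re-identification point can be shown with a small, entirely invented example in Python: two datasets that each look harmless on their own, one of "anonymous" camera records and one an ordinary customer list, are joined on shared attributes (postcode, age range, gender) to put a name to a camera record. All names, fields and values below are made up for illustration.

```python
# A minimal sketch of a "linkage attack": combining non-identifying data
# from different sources until only one person fits. All data is invented.

camera_log = [  # "anonymous" recognition output from an IP camera
    {"postcode": "4114", "age_range": "30-39", "gender": "F", "gait_id": "G-102"},
    {"postcode": "4114", "age_range": "60-69", "gender": "M", "gait_id": "G-233"},
]

loyalty_db = [  # a separate dataset with names but no camera data
    {"name": "Jane Citizen", "postcode": "4114", "age_range": "30-39", "gender": "F"},
    {"name": "John Smith", "postcode": "4114", "age_range": "60-69", "gender": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "age_range", "gender")

for record in camera_log:
    matches = [p for p in loyalty_db
               if all(p[k] == record[k] for k in QUASI_IDENTIFIERS)]
    if len(matches) == 1:  # a unique match re-identifies the "anonymous" record
        print(f"gait profile {record['gait_id']} is probably {matches[0]['name']}")
```

Neither dataset contains a photo or anything that legislation written around "identifying information" would obviously catch, yet together they name a person.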

AI can create images of people that don't exist

This is an example of what can already be done. Like all new technology, it can be used for good and also for evil, and in the wrong hands it could be very powerful.
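
Images like these are typically produced by a generative adversarial network (GAN), which learns to turn random noise into photorealistic faces. The sketch below, in Python with PyTorch, shows only the mechanism: the generator here is untrained, so its output is noise rather than a face, but the same kind of network trained on millions of face photos (as in StyleGAN) produces convincing pictures of people who have never existed.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """DCGAN-style generator: maps a random latent vector to an image."""
    def __init__(self, latent_dim=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh(),   # 3-channel 32x32 output
        )

    def forward(self, z):
        return self.net(z)

z = torch.randn(1, 100, 1, 1)   # the "person" starts as nothing but random numbers
image = Generator()(z)          # untrained weights, so the output is noise;
print(image.shape)              # torch.Size([1, 3, 32, 32]) - a trained GAN yields a face
```

The key point is that the face is synthesised from nothing but random numbers; there is no real person behind it.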

AI can create fake videos of existing people

Deepfake technology can be used to make a video, complete with a person's voice, showing that person doing whatever you may want them to be doing.

Here is an example featuring Tom Cruise:

https://youtu.be/iTR-_zWOuA4

But as the report says, it is not Tom Cruise; it is deepfake technology used to make it look like Tom Cruise.

Moving on to the next logical step, it could be possible for a criminal to produce a deepfake video showing police fabricating a deepfake of the criminal. Where does that leave the use of video as evidence?
