author     Nicholas Johnson <nick@nicholasjohnson.ch>  2023-02-16 00:00:00 +0000
committer  Nicholas Johnson <nick@nicholasjohnson.ch>  2023-02-16 00:00:00 +0000
commit     35ba46d61d52632444a380eb7ed7c1429e0281e6b6e8c67948b9ce008fa45fe1 (patch)
tree       881fde9ef9523f699987e826cf4dc1f40bb50b30b6b04d074dae1fc907a84803
parent     29500930e4a6a2dcd3eac97435e94ba7e5c644ac2a12dc1e1acc8ae095504b68 (diff)
Convert refs: implications-of-synthetic-media
-rw-r--r--  content/entry/implications-of-synthetic-media.md  33
1 file changed, 9 insertions(+), 24 deletions(-)
diff --git a/content/entry/implications-of-synthetic-media.md b/content/entry/implications-of-synthetic-media.md
index 1488ccd..43922f9 100644
--- a/content/entry/implications-of-synthetic-media.md
+++ b/content/entry/implications-of-synthetic-media.md
@@ -2,13 +2,12 @@
title: "Implications of Synthetic Media"
date: 2022-04-24T00:00:00
draft: false
-makerefs: false
---
-A few months ago, I wrote an entry titled "The Privacy Implications of Weak AI".[1] This entry is a continuation of my thoughts about AI, specifically synthetic media.
+A few months ago, I wrote an entry titled "[The Privacy Implications of Weak AI](/2021/11/10/the-privacy-implications-of-weak-ai/)". This entry is a continuation of my thoughts about AI, specifically synthetic media.
A.I. and automation are subjects people avoid thinking about because they're scary. I can't fault anybody for that, because they're right to be scared. The way weak AI is already being used is extremely worrying. It doesn't bode well for the future, but we can't find solutions without discussing the problem. So today, I thought I'd explore another way weak A.I. might disrupt society.
-In case you're not familiar with the term "deepfake", it refers to AI-generated media[2] (synthetic media) where a person in a picture or video is digitally replaced with somebody else. The goal is for the replacement to be so seamless that it's impossible to tell the difference. Right now, deepfakes[3] are pretty good and they're getting better all the time. This has huge implications.
+In case you're not familiar with the term "deepfake", it refers to [AI-generated media](https://www.wikipedia.org/wiki/Synthetic_media) (synthetic media) where a person in a picture or video is digitally replaced with somebody else. The goal is for the replacement to be so seamless that it's impossible to tell the difference. Right now, [deepfakes](https://www.wikipedia.org/wiki/Deepfake) are pretty good and they're getting better all the time. This has huge implications.
# Plausible Deniability
## Blackmail
@@ -26,19 +25,19 @@ Deepfakes change the game by reducing the cost of creating fakes. In the future,
## Nudes
This one's just a hunch, but I predict sending nudes will become more common given that the nudes will be deniable if they end up in the wrong hands. The original recipient may know that the nudes are real, but will anybody else believe them? So I think the deniability will increase people's willingness to send intimate media.
-The software for faking nudes already exists.[4]
+[The software for faking nudes already exists.](https://www.wikipedia.org/wiki/Deepfake_pornography#DeepNude)
# Social Engineering
-But there's more than just increased plausible deniability. Deepfakes will change the social engineering[5] game.
+But there's more than just increased plausible deniability. Deepfakes will change the [social engineering](https://www.wikipedia.org/wiki/Social_engineering_%28security%29) game.
I imagine it like that scene in the first Terminator movie where terminators can fake people's voices after hearing them once. You can just record someone's voice, then train an A.I. to replicate it. Unless there's a law against it, police might use this to trick suspects and obtain information from them.
-On the other side of the law, black hat hackers will certainly use deepfakes to social engineer corporations and institutions. In fact, it already happened when a voice deepfake was used to scam a CEO out of $243,000.[6]
+On the other side of the law, black hat hackers will certainly use deepfakes to social engineer corporations and institutions. In fact, it already happened when [a voice deepfake was used to scam a CEO out of $243,000](https://www.forbes.com/sites/jessedamiani/2019/09/03/a-voice-deepfake-was-used-to-scam-a-ceo-out-of-243000/).
# The Infopocalypse
The central subject which we seem to be orbiting is the infopocalypse. That is, when sock puppets and deepfakes become absolutely pervasive everywhere on the internet. And I have to mention sock puppets because they go hand in hand with deepfakes in an important way.
-Right now, what prevents bots from overtaking the internet is mainly CAPTCHA[7], phone registration, and bot detection systems. CAPTCHA is a technique to tell humans and computers apart. As A.I. improves, bots will eventually be able to do all the things that humans can do, including passing CAPTCHA. They'll also be able to bypass bot detection and, with some money, buy phone numbers.
+Right now, what prevents bots from overtaking the internet is mainly [CAPTCHA](https://www.wikipedia.org/wiki/CAPTCHA), phone registration, and bot detection systems. CAPTCHA is a technique to tell humans and computers apart. As A.I. improves, bots will eventually be able to do all the things that humans can do, including passing CAPTCHA. They'll also be able to bypass bot detection and, with some money, buy phone numbers.
We have to assume that, as time passes, it will take fewer and fewer resources for anyone to create their own personal army of convincing bots. Combining this with deepfakes will make it nearly impossible to tell human from machine. Unless new techniques for bot prevention are developed, online platforms may run rampant with spam, disinformation, and sock puppets.
@@ -51,28 +50,14 @@ Now, broadening the subject even more to synthetic media as a whole, not just de
Maintaining relationships with real people takes effort. With synthetic media and convincing chat bots, a lot of people will probably opt for relationships with synthetic, digital A.I. systems instead of other human beings. This could be really destructive to the social fabric. The word "loner" will take on a whole new meaning.
-What worries me the most is how addictive these A.I. chat bots could potentially be. We've already seen how bad social media and smartphone addiction is. Maybe it's too early to worry about this, but if A.I. chat bots pass the Turing test[8] and become capable of real-time audio and video calls, there will probably be less human connection in society.
+What worries me the most is how addictive these A.I. chat bots could potentially be. We've already seen how bad social media and smartphone addiction is. Maybe it's too early to worry about this, but if A.I. chat bots pass the [Turing test](https://www.wikipedia.org/wiki/Turing_test) and become capable of real-time audio and video calls, there will probably be less human connection in society.
-If you're looking for some inspiration, two good films depicting human-bot relationships are Her[9] and Ex Machina[10]. Those films both depict A.I. taking human form, which goes a bit outside the scope of synthetic media, but synthetic media by itself probably wouldn't make good film.
+If you're looking for some inspiration, two good films depicting human-bot relationships are [Her](https://www.wikipedia.org/wiki/Her_%28film%29) and [Ex Machina](https://www.wikipedia.org/wiki/Ex_Machina_%28film%29). Those films both depict A.I. taking human form, which goes a bit outside the scope of synthetic media, but synthetic media by itself probably wouldn't make good film.
# Art and Self-Expression
Synthetic media will also revolutionize art and self-expression. Imagine online gaming where your face, body, and mannerisms are superimposed onto your avatar. Imagine going to see a movie with you and your friends as stars of the show. Imagine more interactive art.
-I don't think synthetic media used for self-expression is necessarily a net good though. Giving people new ways to express themselves is good, but not if they use it as a means of escaping the world like in the movie Ready Player One[11]. We don't want to give people yet another way to be bought off by extreme capitalists and distracted from the problems happening in the real world.
+I don't think synthetic media used for self-expression is necessarily a net good though. Giving people new ways to express themselves is good, but not if they use it as a means of escaping the world like in the movie [Ready Player One](https://www.wikipedia.org/wiki/Ready_Player_One_%28film%29). We don't want to give people yet another way to be bought off by extreme capitalists and distracted from the problems happening in the real world.
# Conclusion
Predicting the future is somewhat of a fool's errand. We'll only know for sure how synthetic media is going to transform society as time passes. But, I believe I've made some good predictions, and I hope I at least get more people thinking about it. Thanks again for reading.
-
-
-Link(s):
-[1: The Privacy Implications of Weak AI](/2021/11/10/the-privacy-implications-of-weak-ai/)
-[2: Synthetic Media](https://www.wikipedia.org/wiki/Synthetic_media)
-[3: Deepfake](https://www.wikipedia.org/wiki/Deepfake)
-[4: Deepnude](https://www.wikipedia.org/wiki/Deepfake_pornography#DeepNude)
-[5: Social Engineering](https://www.wikipedia.org/wiki/Social_engineering_(security))
-[6: Voice Deepfake Scam](https://www.forbes.com/sites/jessedamiani/2019/09/03/a-voice-deepfake-was-used-to-scam-a-ceo-out-of-243000/)
-[7: CAPTCHA](https://www.wikipedia.org/wiki/CAPTCHA)
-[8: Turing Test](https://www.wikipedia.org/wiki/Turing_test)
-[9: Her (Film)](https://www.wikipedia.org/wiki/Her_(film))
-[10: Ex Machina](https://www.wikipedia.org/wiki/Ex_Machina_(film))
-[11: Ready Player One](https://www.wikipedia.org/wiki/Ready_Player_One_(film))