{"id":37561,"date":"2022-06-01T08:30:00","date_gmt":"2022-06-01T05:30:00","guid":{"rendered":"https:\/\/forklog.com\/en\/?p=37561"},"modified":"2025-08-29T18:29:36","modified_gmt":"2025-08-29T15:29:36","slug":"what-is-a-deepfake","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/what-is-a-deepfake\/","title":{"rendered":"What is a deepfake?"},"content":{"rendered":"<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">Key points<\/h2>\n<ul class=\"wp-block-list\">\n<li>A deepfake is a method of creating photos and videos with deep-learning algorithms. The technology allows a person\u2019s face in existing media to be replaced with someone else\u2019s.<\/li>\n<li>At least 85,000 AI-made forgeries have been found online. Experts say the tally doubles every six months.<\/li>\n<li>Beyond disinformation, intimidation and harassment, deepfakes are used in entertainment, synthetic-data generation and voice restoration.<\/li>\n<\/ul>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">What is a deepfake?<\/h2>\n<p>A deepfake is a technology for synthesising media in which a face in an existing photo or video is replaced with another person\u2019s face. The forgeries are created using artificial intelligence, machine learning and neural networks.<\/p>\n<p>The term combines \u201cdeep learning\u201d and \u201cfake\u201d.<\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">Why are deepfakes made?<\/h2>\n<p>Many deepfakes are pornographic. By late 2020, Sensity had found 85,000 forgeries online created with AI methods. Some 93% were pornographic, the vast majority featuring the faces of famous women.<\/p>\n<p>New methods let unskilled users make deepfakes from just a few photos. Experts say this content doubles every six months. 
Fake videos are likely to spread beyond celebrities to fuel <a href=\"https:\/\/ru.wikipedia.org\/wiki\/%D0%9F%D0%BE%D1%80%D0%BD%D0%BE%D0%BC%D0%B5%D1%81%D1%82%D1%8C\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">revenge porn<\/a>.<\/p>\n<p>Deepfakes are also used for information attacks, parody and satire.<\/p>\n<p>In 2018 American director Jordan Peele and BuzzFeed <a href=\"https:\/\/www.buzzfeed.com\/craigsilverman\/obama-jordan-peele-deepfake-video-debunk-buzzfeed\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">published<\/a> a purported address by former US president Barack Obama in which he called Donald Trump \u201can asshole\u201d. The clip was made with FakeApp and Adobe After Effects. The director and journalists wanted to show how fake news might look.<\/p>\n<p><iframe loading=\"lazy\" width=\"1077\" height=\"606\" src=\"https:\/\/www.youtube.com\/embed\/cQ54GDm1eL0\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen=\"\"><\/iframe><\/p>\n<p>In 2022, after Russia\u2019s full-scale invasion of Ukraine, a fake video of president Volodymyr Zelenskyy circulated on social media in which he \u201ccalls on the people to surrender\u201d. Users quickly spotted the forgery, and Zelenskyy recorded a rebuttal.<\/p>\n<p>In May of the same year, scammers spread a deepfake of Elon Musk in which he \u201curges people to invest\u201d in an obvious <a href=\"https:\/\/forklog.com\/en\/news\/what-is-a-scam\">scam<\/a>. The YouTube channel that posted it had over 100,000 subscribers, and before the account was deleted the video had more than 90,000 views. How many people fell for the scam is unknown.<\/p>\n<p>For fun, there are many deepfake apps. 
In early 2020, <a href=\"https:\/\/thenextweb.com\/news\/new-deepfake-app-pastes-your-face-onto-gifs-in-seconds\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Reface<\/a> became widely known for using deepfake technology to create short videos that overlay virtually any face onto a wide range of videos and GIFs.<\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">What can be faked?<\/h2>\n<p>Deepfake technology can produce not only convincing videos but entirely fabricated photos from scratch. In 2019 a certain Maisy Kinsley <a href=\"https:\/\/www.fastcompany.com\/90332538\/how-to-spot-the-creepy-fake-faces-who-may-be-lurking-in-your-timelines-deepfaces\">set up<\/a> LinkedIn and Twitter profiles claiming to be a Bloomberg journalist. The \u201cjournalist\u201d contacted Tesla employees to fish for information.<\/p>\n<p>It later emerged this was a deepfake. The social profiles contained no convincing evidence linking her to the publication, and the profile photo was clearly generated by AI.<\/p>\n<p>In 2021 ForkLog HUB <a href=\"https:\/\/hub.forklog.com\/kak-nejroset-stala-odnim-iz-avtorov-platformy-hub\/\" target=\"_blank\" rel=\"noreferrer noopener\">ran<\/a> an experiment, creating a virtual character, N. G. Adamchuk, who \u201cwrote\u201d musings about the cryptocurrency market for the platform. The texts were generated with the GPT-2 language model. The avatar was created with <a href=\"https:\/\/thispersondoesnotexist.com\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">This Person Does Not Exist<\/a>.<\/p>\n<p>Audio can also be forged to create \u201cvoice clones\u201d of public figures. In January 2020, fraudsters in the UAE faked the voice of a company executive and convinced a bank employee to transfer $35m to their accounts.<\/p>\n<p>A similar case occurred in 2019 with a British energy company. 
Scammers <a href=\"https:\/\/gizmodo.com\/scammer-successfully-deepfaked-ceos-voice-to-fool-under-1837835066\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">stole<\/a> about $243,000 by impersonating the firm\u2019s chief executive with a fake voice.<\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">How are deepfakes made?<\/h2>\n<p>Creating deepfakes requires a large dataset of two people\u2019s faces. An AI encoder analyses the dataset, finds similarities and compresses the images.<\/p>\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"517\" src=\"https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p3-1024x517.png\" alt=\"\u0427\u0442\u043e \u0442\u0430\u043a\u043e\u0435 \u0434\u0438\u043f\u0444\u0435\u0439\u043a?\" class=\"wp-image-174628\" srcset=\"https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p3-1024x517.png 1024w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p3-300x152.png 300w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p3-768x388.png 768w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p3.png 1200w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>Creating datasets of two faces and then encoding them. Source: ForkLog.<\/figcaption><\/figure>\n<p>The decoder is then trained to reconstruct faces from the compressed frames. A separate algorithm is used for each person. To swap faces, the compressed data are fed to the \u201cwrong\u201d decoder.<\/p>\n<p>For example, images of person A are fed into a decoder trained on person B. 
The algorithm reconstructs B\u2019s face with A\u2019s facial expression.<\/p>\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"517\" src=\"https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p1-1024x517.png\" alt=\"\u0427\u0442\u043e \u0442\u0430\u043a\u043e\u0435 \u0434\u0438\u043f\u0444\u0435\u0439\u043a?\" class=\"wp-image-174629\" srcset=\"https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p1-1024x517.png 1024w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p1-300x152.png 300w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p1-768x388.png 768w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p1.png 1200w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"517\" src=\"https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p2-1024x517.png\" alt=\"\u0427\u0442\u043e \u0442\u0430\u043a\u043e\u0435 \u0434\u0438\u043f\u0444\u0435\u0439\u043a?\" class=\"wp-image-174630\" srcset=\"https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p2-1024x517.png 1024w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p2-300x152.png 300w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p2-768x388.png 768w, https:\/\/forklog.com\/wp-content\/uploads\/deepfake_p2.png 1200w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>Feeding compressed face images into the \u201cwrong\u201d decoder. Source: ForkLog.<\/figcaption><\/figure>\n<p>For high-quality results in video, the algorithm must process every frame this way.<\/p>\n<p>Another way to create deepfakes is with generative adversarial networks (GANs). 
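The encoder\u2013decoder swap described above can be sketched in code. This is a deliberately simplified toy, not a real deep-learning model: the "latent code" here is just the expression, and the `Face`, `encode` and `make_decoder` names are invented for illustration. The point is the wiring \u2014 one shared encoder, one decoder per person, and a swap achieved by routing one person's code into the other person's decoder.

```python
from dataclasses import dataclass


@dataclass
class Face:
    """A toy stand-in for a video frame of a face."""
    identity: str    # whose face it looks like
    expression: str  # pose/expression captured in the frame


def encode(face: Face) -> str:
    """Shared encoder: compress a frame to an identity-free latent code.

    In a real model this is a neural network; here we just keep the
    expression and discard the identity.
    """
    return face.expression


def make_decoder(identity: str):
    """Build a decoder 'trained' on one person.

    A real decoder learns to reconstruct that person's face from any
    latent code; this toy simply stamps the identity back on.
    """
    def decode(latent: str) -> Face:
        return Face(identity=identity, expression=latent)
    return decode


decoder_b = make_decoder("person B")
frame_of_a = Face(identity="person A", expression="smiling")

# The swap: feed A's compressed frame into the "wrong" decoder.
swapped = decoder_b(encode(frame_of_a))
print(swapped)  # → Face(identity='person B', expression='smiling')
```

For video, the same routing runs on every frame, which is why the article notes that a high-quality result requires processing each frame this way.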
This approach is used by services like This Person Does Not Exist.<\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">Who makes deepfakes?<\/h2>\n<p>Deepfakes are created by academic and commercial researchers, machine-learning engineers, hobbyists, visual-effects studios and directors. Thanks to popular apps like Reface or FaceApp, anyone with a smartphone can make a fake photo or video.<\/p>\n<p>Governments can use the technology too, for instance as part of online strategies to discredit and undermine extremist groups or to contact target individuals.<\/p>\n<p>In 2019 journalists <a href=\"https:\/\/apnews.com\/article\/ap-top-news-artificial-intelligence-social-platforms-think-tanks-politics-bc2f19097a4c4fffaa00de6770b8a60d\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">found<\/a> a LinkedIn profile for a certain Katie Jones that turned out to be a deepfake. America\u2019s National Counterintelligence and Security Center said foreign spies regularly use fake social-media profiles to monitor US targets. The agency accused China of conducting \u201cmass\u201d espionage via LinkedIn.<\/p>\n<p>In 2022 South Korean engineers created a deepfake of presidential candidate Yoon Suk-yeol to attract young voters ahead of the 9 March election.<\/p>\n<p><script async=\"\" src=\"https:\/\/telegram.org\/js\/telegram-widget.js?19\" data-telegram-post=\"forklogAI\/1686\" data-width=\"100%\"><\/script><\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\"><strong>What technologies are used to make deepfakes?<\/strong><\/h2>\n<p>Most forgeries are created on high-performance desktop computers with powerful graphics cards or in the cloud. Experience is also required, not least to polish finished videos and remove visual defects.<\/p>\n<p>However, many tools now help people make deepfakes both in the cloud and on smartphones. 
In addition to Reface and FaceApp, there is Zao, which overlays users\u2019 faces onto film and TV characters.<\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">How to spot a deepfake?<\/h2>\n<p>Most fake photos and videos are low quality. Telltales include unblinking eyes, poor lip\u2013speech synchronisation and blotchy skin tones. Along the edges of transplanted features there may be flicker and pixelation. Fine details such as hair are especially hard to render well.<\/p>\n<p>Poorly transferred jewellery and teeth can also betray a fake. Watch for inconsistent lighting and reflections on the iris.<\/p>\n<p>Big tech firms are fighting deepfakes. In April 2022 Adobe, Microsoft, Intel, Twitter, Sony, Nikon, the BBC and ARM formed the C2PA alliance to detect fake photos and videos online.<\/p>\n<p>In 2020, ahead of the US elections, Facebook <a href=\"https:\/\/www.bbc.com\/news\/technology-51018758\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">banned<\/a> deepfake videos that could mislead users.<\/p>\n<p>In May 2022 Google <a href=\"https:\/\/www.unite.ai\/google-has-banned-the-training-of-deepfakes-in-colab\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">restricted<\/a> the training of deepfake models in its Colab cloud environment.<\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\"><strong>What are the risks of deepfakes?<\/strong><\/h2>\n<p>Beyond disinformation, harassment, intimidation and humiliation, deepfakes can undermine public trust in specific events.<\/p>\n<p>According to Lillian Edwards, a professor and internet-law expert at Newcastle University, the problem is less the fakes themselves than the denial of real facts.<\/p>\n<p>As the technology spreads, deepfakes may threaten justice, where falsifiable events can be passed off as real.<\/p>\n<p>They also pose risks to personal security. 
Deepfakes can already imitate biometric data and fool face-, voice- or gait-recognition systems.<\/p>\n<p><script async=\"\" src=\"https:\/\/telegram.org\/js\/telegram-widget.js?19\" data-telegram-post=\"forklogAI\/2159\" data-width=\"100%\"><\/script><\/p>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\">How can deepfakes be useful?<\/h2>\n<p>Despite the risks, deepfakes can be useful. The technology is widely used for entertainment. For example, London startup Flawless <a href=\"https:\/\/www.hollywoodreporter.com\/behind-screen\/new-ai-tool-offers-subtitling-and-dubbing-alternative\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">developed<\/a> AI to synchronise actors\u2019 lips with audio tracks when dubbing films into other languages.<\/p>\n<p>In July 2021 the makers of a documentary about Anthony Bourdain, who died in 2018, used a deepfake to voice his quotes.<\/p>\n<p>The technology can also help people <a href=\"https:\/\/www.projectrevoice.org\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">regain a voice<\/a> lost to illness.<\/p>\n<p>Deepfakes are also used to create synthetic datasets, sparing engineers from collecting photos of real people and obtaining permissions for their use.<\/p>\n<div class=\"wp-block-text-wrappers-keypoints article_keypoints\">\n<p>Subscribe to ForkLog on Telegram: <a href=\"https:\/\/t.me\/forklogAI\" target=\"_blank\" rel=\"noreferrer noopener\">ForkLog AI<\/a> \u2014 all the news from the world of AI!<\/p>\n<\/div>\n<\/div>\n<div class=\"wp-block-text-wrappers-cards single_card\">\n<h2 class=\"card_label\"><strong>Further reading<\/strong><\/h2>\n<p><a href=\"https:\/\/forklog.com\/en\/news\/artificial-intelligence-what-it-is-and-how-it-works\">What is artificial intelligence?<\/a><\/p>\n<p><a href=\"https:\/\/forklog.com\/en\/news\/what-is-machine-learning\">What is machine learning?<\/a><\/p>\n<p><a 
href=\"https:\/\/forklog.com\/en\/news\/what-is-a-neural-network\">What is a neural network?<\/a><\/p>\n<p><a href=\"https:\/\/forklog.com\/en\/news\/what-is-computer-vision-machine-learning\">What is computer vision? (machine learning)<\/a><\/p>\n<p><a href=\"https:\/\/forklog.com\/en\/news\/what-are-transformers-machine-learning\">What are transformers? (machine learning)<\/a><\/p>\n<p><a href=\"https:\/\/forklog.com\/en\/news\/what-is-natural-language-processing\">What is natural language processing?<\/a><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>What is a deepfake? How are they made and where are they used? How do you spot an AI fake? Explained in cards.<\/p>\n","protected":false},"author":1,"featured_media":37562,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"2","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[2113],"tags":[2130,438],"class_list":["post-37561","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-cryptorium","tag-101-artificial-intelligence","tag-artificial-intelligence"],"aioseo_notices":[],"amp_enabled":true,"views":"96","promo_type":"1","layout_type":"1","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/37561","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/comments?post=37561"}],"version-history":[{"count":1,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/37561\/revisions"}],"predecessor-version":[{"id":37563,"href":"https:\/\/fork
log.com\/en\/wp-json\/wp\/v2\/posts\/37561\/revisions\/37563"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media\/37562"}],"wp:attachment":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media?parent=37561"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/categories?post=37561"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/tags?post=37561"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}