Deepfakes could further complicate social media’s job during election season

A fledgling, fast-evolving technology threatens the best efforts of social media’s biggest companies to tamp down disinformation during political season. Need proof? Seeing and hearing is disbelieving.

Video and audio deepfakes are an intriguing layer of subterfuge and digital hocus-pocus that could complicate things for Facebook Inc. FB, -0.55%, Twitter Inc. TWTR, -0.05%, Snap Inc. SNAP, +3.77% and others working overtime to rid their vast social-media platforms of misleading content and outright lies before the Nov. 3 elections.

See more: Facebook and Twitter are concerned about what is going to happen after Election Day

The artificial-intelligence-powered technology that helps generate so-called deepfake images and speech could magnify online disinformation; one doctored video famously, and misleadingly, showed Speaker of the House Nancy Pelosi appearing to slur her words.

“Repercussions for political candidates, business leaders, it is real,” says Raymond Lee, founder of Fakenet AI, a startup building a social-media scanning tool to detect fraudsters and low-credibility websites. Lee told MarketWatch that his company has talked with TikTok and with a third-party fact-checking service that works with Facebook about using Fakenet AI’s technology.

Additionally, Fakenet AI has a pilot program with a publicly traded, multibillion-dollar financial institution using its product.

Major businesses and political parties are taking a closer look at digital options as deepfakes grow in sophistication, according to Lee and others.

Until recently, most deepfakes have been low quality and easy to spot. But as AI tools make them slicker, security researchers like Lee worry that ersatz videos and audio will deceive more people. Members of both parties in Congress have voiced concerns that deepfakes could lead to widespread political interference.

It’s already happened in the business community. Fraudsters used AI-based software to impersonate the voice of a chief executive at an energy firm in Germany, demanding and receiving a $243,000 transfer in March, according to reports in The Wall Street Journal and elsewhere. The “caller” said the request was urgent, directing a lower-level executive to pay within an hour, according to the company’s insurer, Euler Hermes Group SA.

Euler Hermes declined to name the energy company.
