ESET Expert

RSA – Spot the real fake


How erring on the side of privacy might ultimately save you from chasing down a virtual rendition of you doing the bidding of a scammer


At the RSA Conference 2022, the techno-geekery center of the security universe, the halls once more pulse with herds of sore-footed attendees slurping up whatever promises to be the Next Big Thing. If the economy is softening, you couldn’t tell it here: basically no one is planning on spending less on security. But the landscape is morphing to creepy new levels.


Following a session on deepfakes (which we wrote about a while back), it’s easy to wonder how long it will be until deepfakes as a service (DFaaS, pronounced “deface”, I guess?) hits the pseudo-legit market in the form of very-difficult-to-detect account hacks for rent.


Let’s say you want to get into a jilted partner’s insurance policy and file a fake claim. Just assemble a combination of voice and video of “them” convincing enough to trick the company into issuing a hefty payout for a car that was never wrecked. They have that here.


Should insurance companies get better at spotting fakes, there’s a host of open-source projects to help your rent-a-fake improve. Right now it’s not obvious this kind of thing would get flagged, especially when it’s paired with reasonable social engineering chops.


Not that you have the kind of time it takes to make a good fake: while the good ones are getting very good indeed, the bad ones won’t convince anyone. But someone sitting in a warehouse packed with GPUs and cheap electricity (say, the ones no longer mining certain digital currencies in a soft market) could slog through the heavy number crunching it takes to build a “better you” and bang out digital identities quite quickly.


How would your insurance company dispute that it was you if it has a voice and video chat session on record? That kind of thing would be very tricky to disprove.


I recently heard of an upstart social platform that all but required facial recognition to keep bots from signing up. But whose face is it really? With deepfakes, you can train your fake against real facial recognition apps until you get it right.


On underground forums, vendors offer sample credit cards so you can test the quality of their wares without burning them on real transactions. This method of building reputation could be ported to DFaaS if you wanted to try the product before purchase.


One Achilles’ heel: these models need a lot of data to work. Luckily for the scammers, some people have been digitally vomiting selfies onto public-facing platforms for years now, so that will do fine. If, on the other hand, you keep your images and voice a little more private, the models will have a much harder time.


I guess this is one more reason to err on the side of privacy. On top of all the other excellent reasons, it might now save you from chasing down a really good virtual rendition of you doing the bidding of a scammer.


Also, if the “CEO” calls and asks you to urgently wire money to a far-flung location, you should still be extremely skeptical. Don’t get (deep-)faked out!
