While browsing Reddit, I came across subreddits like r/rateme and r/amiugly, where people post pictures of themselves so others can give feedback on their looks and how to improve them. But there's a catch…

To combat fake profiles, moderators of these subreddits require at least one of your pictures to show you holding a piece of paper with your username, the date, and sometimes the subreddit name written on it.

There are dozens of subreddits that require uploading such pictures to distinguish genuine users from catfish or bot accounts.

Recently I signed up for an online banking service that required me to upload a picture of myself along with identity documents. At the end of the process, I was asked to upload a selfie holding a government-issued ID card for verification.

As someone who has spent months exploring exposed S3 buckets, Azure blobs, and databases, I've come across a few KYC (Know Your Customer) image databases belonging to small fintech and crypto companies. Most of them used images in a similar format for KYC verification.

Verification pictures from these subreddits can easily be manipulated to look like the verification pictures online services require during KYC.

Most countries' government-issued ID cards, driving licenses, and passports exist as easily editable PSD files that have been available on underground markets for years.

Various countries' editable ID card PSD files on sale

All it takes is a person of average skill to photoshop one of those edited PSD cards in place of the username paper. Add proper lighting and glare, and the picture should pass as legitimate.

We have already seen North Korean actors photoshopping faces and ID cards onto the same images to pass KYC checks at crypto exchanges.

They got caught in this instance because they reused the same body wearing the same t-shirt. But had they used pictures of random individuals scraped from these subreddits or similar sources, they would have gotten away with it.

Felixo Token detected similar incidents from KYC fraudsters in the past. The attackers used photoshopped images to create thousands of accounts and earn referral bonuses.

These subreddits provide an endless daily supply of images for KYC fraudsters to work with.

Use cases for manipulated images

Such manipulated images are used by fraudsters to generate synthetic identities, which they use to create accounts on different crypto exchanges and other platforms.

These accounts are then either sold on underground markets and forums to people who might use them to launder money or receive funds from illicit activities, or they are used in large-scale coordinated fraud, as in the case of the Felixo token.

A Telegram post advertising verified crypto exchange accounts for sale

Marketplaces for such images

The image format of a person holding a piece of paper or a card in their hand is the base template of KYC verification images. Some fake-ID marketplaces have started selling selfies with an editable ID card in hand and a swappable background.

A marketplace seller showcasing an editable selfie with ID card
A seller offering face-swap and selfie-making services for sale

What can be done to stop potential abuse of these images

On Reddit's side: Reddit should make such verification private, so that users verify their identity by sending these pictures in private messages to moderators rather than in public posts. Still, malicious moderators could abuse the process, since they are all volunteers and not bound by any agreement.

On the company's side: static images of a person holding an ID card should not be the only means of KYC verification. Major crypto exchanges have custom live-video solutions and strict AI/ML checks in place, but some smaller companies still rely on old-fashioned static images. This puts smaller exchanges and fintech apps at higher risk of KYC fraud. If small exchanges can't develop in-house capabilities for such checks, they should use a third-party solution that performs liveness checks and other AI-based analysis to detect photoshopped or otherwise manipulated images.
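As a rough illustration of the kind of automated manipulation check mentioned above, here is a minimal sketch of error level analysis (ELA), a classic forensic technique: the image is recompressed as JPEG and diffed against the original, since regions pasted in from another source (such as a swapped ID card) often recompress differently and stand out in the residual. This is only one signal among many a real KYC vendor would use, and the function name and parameters here are illustrative, not from any specific product.

```python
import io

from PIL import Image, ImageChops  # Pillow


def error_level_analysis(path, quality=90):
    """Return an amplified difference image between the original
    and a re-saved JPEG copy (error level analysis).

    Spliced regions tend to show a different error level than the
    rest of the picture because they have a different compression
    history.
    """
    original = Image.open(path).convert("RGB")

    # Re-save the image as JPEG at a fixed quality into memory.
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)

    # Per-pixel absolute difference between original and re-saved copy.
    diff = ImageChops.difference(original, resaved)

    # Amplify the residual so suspicious regions are visible to the eye.
    extrema = diff.getextrema()
    max_diff = max(channel[1] for channel in extrema) or 1
    scale = 255.0 / max_diff
    return diff.point(lambda px: min(255, int(px * scale)))
```

In practice an analyst (or a downstream classifier) would look for areas of the ELA output whose brightness differs sharply from the rest of the image; uniform error levels suggest a single compression history, while a bright, sharply bounded region around the ID card would be a red flag.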