Ads that offered the opportunity to “Make $$$ working from home” once were stapled to telephone poles. Today, they’re posted in internet chats, online job boards and social media. They promise big payouts from a few hours of work. The payouts are real, but the work isn’t. The ads seek money launderers, known in the world of cybercrime as “money mules.” The era of the amateur criminal has begun.
Big theft is no longer the sole territory of art forgers, safecrackers or even seasoned hackers. In the 2018 film “Ocean’s 8,” a crew of sophisticated criminals plans an elaborate scheme to steal a $150 million diamond necklace from a fundraising gala at the Metropolitan Museum of Art. In real life, last July the U.S. charged Ramon Abbas, a Nigerian Instagrammer known as “Hushpuppi,” with participating in a plot to steal $124 million.
Mr. Abbas didn’t have an obvious background in computer technology. He’s not alleged to have hacked computers or stolen data or money. He had none of the “skills” we typically associate with criminal masterminds like those in “Ocean’s 8.” His alleged contribution to the criminal team was to serve as the “cash out” leader. According to the Federal Bureau of Investigation, he recruited and maintained a global network of people with bank accounts to launder money at his direction. He is currently awaiting trial in federal court in California.
With rising unemployment during the pandemic, criminal gangs are recruiting vulnerable and desperate workers to open bank accounts, or use their pre-existing accounts, and move cash via electronic transfer. The number of digital money mules is increasing; in early December, Europol and authorities from 26 countries identified more than 4,000 money mules working around the world.
Some participate in these schemes knowingly, but many are unwitting—making it hard to distinguish between victim and criminal. Acting as a money mule is illegal, and perpetrators can be prosecuted even if they aren’t aware they are committing a crime, although typically the FBI only issues warnings to those it concludes are unwitting conspirators.
Many unknowing money mules are tricked into “lending” money through online dating sites. Last year, Oregon resident Rodney Gregory pleaded guilty to opening multiple business bank accounts that received money from people who thought they were sending it to an online romantic interest. In other cases, fake companies recruit “finance officers” or “money processing agents,” who often believe they are representing legitimate companies. In one case tracked by the FBI, a retired advertising professional found a work-at-home assignment to create a business facilitating imports and exports. He opened a bank account for the business, where he received a series of deposits and then transferred the money to other accounts. He did not realize he was actually fronting a criminal enterprise until the FBI contacted him.
Online human-resources schemes in which criminals pose as potential employers have proliferated during the pandemic. And while elder fraud has typically been linked to money-mule schemes, cybercriminals are increasingly targeting younger mules. “The number of cases of 14 to 18-year-olds who have allowed their bank accounts to be used to divert funds has grown by 73% in two years,” the BBC reported in 2019. Young people are targeted by social media posts featuring hashtags like #PayPalFlip, #EasyMoney, and #legitmoneyflips. In September, the Secret Service seized $140,000 from a 19-year-old for acting as a money mule for a criminal ring.
Money-mule schemes are also becoming less detectable, because of the diversification of labor in how stolen money is moved through payment systems and banks. Data-privacy rules make it difficult to trace money once it has left one bank for another, and no single financial institution can see an entire end-to-end payment as it flows through the banking network.
What can be done? Financial institutions must use artificial-intelligence and machine-learning technology to analyze publicly available information that their typical screening tools don’t search. This will enable better identity verification and should help banks spot account holders with histories of fraudulent activities.
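The screening described above might, at its simplest, amount to matching prospective account holders against public records of fraudulent activity. The sketch below illustrates the idea with fuzzy name matching; the watchlist names and the similarity threshold are purely illustrative assumptions, not any bank's actual data or tooling.

```python
from difflib import SequenceMatcher

# Hypothetical watchlist drawn from public fraud records -- illustrative
# names only, not real data.
FRAUD_WATCHLIST = ["John Q. Fraudster", "Jane Mule Smith"]

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity score between two normalized names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def screen_applicant(name: str, threshold: float = 0.85) -> list:
    """Return watchlist entries that closely match the applicant's name."""
    return [entry for entry in FRAUD_WATCHLIST
            if similarity(name, entry) >= threshold]

# A near-miss spelling still triggers a watchlist hit.
hits = screen_applicant("Jon Q. Fraudster")
```

A production system would combine many such signals (addresses, account histories, device data) with a learned model, but the principle is the same: flag close matches for human review rather than relying on exact-name lookups.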
Technology can also help identify unwitting money mules. Many people realize they are doing something wrong but don’t understand the extent of it. As they grow suspicious or uncomfortable, they often don’t turn to the police but instead report their suspicions on social media. When my company was doing work to identify sex-trafficking victims, we used natural-language processing to search “johns’ boards” for messages from customers who suspected their “date” was coerced or otherwise trafficked. Similarly, money mules might post in online forums that something about a job makes them feel “uncomfortable.”
Artificial-intelligence and machine-learning technology can lead investigators to the masterminds behind money-mule schemes while preserving privacy. Such collaboration among banks has only recently become possible thanks to technology allowing institutions to share algorithms but keep data private. As algorithms learn from these collaborations, risk and compliance professionals can more efficiently track and deter the masterminds, and deny the facilitators access to legions of unsuspecting digital money mules.
Mr. Shiffman, a former chief of staff of U.S. Customs and Border Protection, is an adjunct professor at Georgetown’s Center for Security Studies. He is the founder of Giant Oak and Consilient and author of “The Economics of Violence: How Behavioral Science Can Transform Our View of Crime, Insurgency and Terrorism.”
Copyright ©2020 Dow Jones & Company, Inc. All Rights Reserved.