
In 2023, ransomware attacks, child sexual exploitation (CSE) and online fraud remained the most threatening manifestations of cybercrime in the European Union (EU). This new ‘Internet Organised Crime Threat Assessment’ (IOCTA) report from Europol also provides interesting insights into AI and cybercrime. A summary of parts of the report is provided below.
Ransomware
Ransomware groups increasingly target small and medium-sized businesses, because these tend to have weaker cyber defences. Most ransomware operators choose their targets based on their size, the likelihood of a pay-out and the effort required to compromise the target’s systems. This means that attackers seek out publicly accessible systems and services within the infrastructure (reconnaissance) and assess which of them can be compromised most easily. Initial access is typically gained through stolen credentials or by exploiting vulnerabilities in public-facing technologies. Bitcoin is still the cryptocurrency most abused by criminals, but the use of alternative coins (altcoins, such as Monero) appears to be growing.
Similar to previous years, ransomware operators are continuing to deploy multi-layered extortion tactics. Although attackers still tend to encrypt the compromised systems, the threat of publishing or auctioning the stolen data has become the most effective pressure point against victims, since many organisations now back up their systems regularly.
The report provides an excellent overview of ransomware actors and Europol operations (p. 16-17). The authors explain that recent law enforcement operations and leaks of ransomware source code (e.g. Conti, LockBit and HelloKitty) have led to a fragmentation of active ransomware groups and available variants.
Child sexual exploitation
Self-generated sexual material now constitutes a significant and growing share of the child sexual abuse material (CSAM) detected online. This content is created by and depicts children, especially teenagers. In many cases it is the result of voluntary exchanges among peers, but it can be classified as CSAM once it is disseminated to a third party without the consent of the person who first sent it. Self-generated sexual material is also often the result of online grooming and sexual extortion. In this setting, the perpetrator identifies the victim online, often on gaming platforms or social media, and, after gaining the victim’s trust through grooming, obtains sexually explicit material and uses it as leverage for extortion. Shame and the hope that the threats might stop often lead victims to produce more self-generated material. In addition to extorting new CSAM, some offenders also extort money from their victims.
Live-distant child abuse (LDCA) is a persistent threat, where offenders watch child sexual abuse on demand with the support of one or more facilitators who perpetrate the abuse on the victim(s) in exchange for payment.
Forums and chatrooms remain essential networking environments for CSE offenders, who exchange CSAM and discuss the abuse they have perpetrated, their fantasies, how to acquire original CSAM, techniques to groom children and OpSec tips. More proficient offenders usually network in dark web forums that appear increasingly specialised and tailored to sexual preferences. These offenders have increasingly high levels of technical knowledge and take measures to conceal their traces. The forums have dedicated sections for technical and OpSec-related matters, with tips and training options. As these digital environments are often subject to law enforcement (LE) takedowns, technical vulnerabilities and Distributed Denial of Service (DDoS) attacks, they usually do not have a lifespan longer than two years. To overcome such issues, forum administrators create mirror sites holding a copy of the content and, whenever their site is taken down, quickly recreate it at a new address. End-to-end encrypted (E2EE) communication platforms are increasingly being used by offenders to exchange CSAM and to communicate.
Darknet markets
The main business in dark web markets remains illicit drugs, although there has been a noticeable rise in the volume of prescription drug sales in 2023. Fraudulent shops and services are also increasingly common, offering both fake drug sales and bogus hitman services.
The past year has seen the continued emergence of smaller and much more specialised single-vendor shops. These shops allow vendors to avoid the per-transaction fees imposed by traditional marketplaces, while still maintaining a presence on several markets at the same time.
The dark web continues to be a key enabler for cybercrime, allowing offenders to share knowledge, tools and services in a more concealed way. It is nevertheless unstable as the fragmentation of marketplaces continues, hand in hand with a surge in exit scams. As a result, the lifecycle of criminal sites has become shorter and mirror sites are springing up rapidly to counter takedowns. The Tor network remains the most popular platform for cybercriminals to access the dark web.
In the aftermath of the German LE takedown of Monopoly Market’s criminal infrastructure in December 2021, last year a coordinated operation by Europol and authorities in nine countries led to the arrest of 288 suspects involved in buying or selling drugs on the marketplace. Close to EUR 51 million in cash and virtual currencies, 850 kg of drugs, and 117 firearms were seized. The arrested vendors were also active on other marketplaces.
Europol states that Exploit, XSS and BreachForums were among the most active cybercrime forums on the dark web in 2023. Cybercriminals were seen sharing hacking knowledge and trading in stolen data, hacking tools and cybercrime services on Exploit and XSS, with both forums also serving as platforms for initial access brokers (IABs).
Exploit is primarily Russian-speaking and is accessible via both the clear and dark web with an entry fee or a vetted reputation. XSS offers security features for user anonymity and has both free and premium membership options. BreachForums is an English-language forum that functions both as a forum and as a marketplace for cybercriminals globally, facilitating the trade in leaked databases, stolen banking cards and corporate data. In May 2023, one of the forum administrators was arrested and the forum was shut down. Three months later, the hacker group ShinyHunters resurrected the forum. In May 2024, it was taken down again in an international law enforcement operation.
CryptBB and Dread are other well-known forums that saw increased activity in 2023. CryptBB is a closed forum for cybercriminals, including hackers, carders and programmers, from beginners to experts (its admins promote it as the most suitable forum for cybercrime beginners). It offers a range of cybercrime services, including remote desktop protocol (RDP) access sales, ‘hackers for hire’, penetration testing and bug reporting services for marketplaces. Dread is a forum launched in 2018 that hosts a wide variety of content, from hacking to drug trafficking and personally identifiable information (PII). With over 400 000 users, it is considered one of the most popular forums on the dark web. The forum was knocked offline by a DDoS attack in November 2022 but resurrected in February 2023. It then introduced a rotating onion address service called Daunt to protect hidden services from DDoS attacks.
As for marketplaces, RAMP, Russian Market and the WWH-Club were the most prolific in 2023, besides Genesis, which, although taken down in April 2023, remained one of the most active markets of the year. RAMP was a prominent drug marketplace for Russian speakers between 2012, when it began, and 2017, when the Russian Ministry of Internal Affairs seized the site. In 2021, a new RAMP appeared with a focus on ransomware. The new RAMP was no longer Russian-speaking only and was open to Mandarin and English speakers. It has a closed forum with strict access criteria. Russian Market is an English-language marketplace known for trading in PII and other illicit digital goods, such as RDP access and stolen credit card data.
AI and cybercrime
Artificial intelligence (AI)-based technologies are making social engineering even more effective. Malicious large language models (LLMs) are becoming prominent tools in the cybercrime market. Services are already offered on the dark web that help online fraudsters develop scripts and create phishing emails. LLMs are also being used in sexual extortion cases, where these tools can help offenders refine their grooming techniques. Europol explains that this trend might also be perpetuated by the wider availability and increased quality of AI tools that lack prompt filtering, which cybercriminals can use to quickly assemble and debug their code.
The use of deepfakes is another area of concern, as they are a powerful addition to the cybercriminal toolbox. In online fraud, deepfakes are used to mimic voices, for instance in Chief Executive Officer (CEO) fraud attempts and shock calls, and their popularity is set to increase. The existence of a dark web service called ‘Only Fake’ has already been reported: it sells AI-generated fake IDs that can be used to open accounts with online financial services, bypassing Know Your Customer (KYC) procedures. Europol expects more AI-generated advertisements luring potential victims into online fraud.
In the area of child sexual exploitation (CSE), cases of AI-assisted and AI-altered child sexual abuse material (CSAM), as well as fully AI-generated CSAM, were already being reported in 2023 and are expected to become more prominent in the near future. Abuse of LLMs might allow criminals to overcome language barriers, enabling sex offenders to groom victims in virtually any language, impersonating peers and interacting in a way the victim perceives as natural and believable.
Europol explains that, even in cases where the content is fully artificial and no real victim is depicted, AI-generated CSAM still contributes to the objectification and sexualisation of children. The generation of such artificial images increases the amount of CSAM in circulation and makes it harder to identify both victims and perpetrators. The production process is also widely available and does not require a high level of technical expertise, potentially broadening the number and spectrum of perpetrators. These files can easily be used for cyberbullying or sexual extortion. The greater the volume of artificial CSAM in circulation, the more difficult it will become to identify offenders or victims through image recognition. To counter such emerging challenges, specialised CSE investigators will have to find new investigative pathways and tools.