The Korea Communications Standards Commission announced on the 12th that it issued 23,107 corrective requests for deepfake sexual crime videos last year, an increase of about 3.2 times over the previous year (7,187).
By year, the figures are 473 cases in 2020, 1,913 in 2021, 3,574 in 2022, 7,187 in 2023, and 23,107 in 2024. The commission said it has responded actively, including by establishing a hotline with Telegram, to block the distribution of deepfake sexual crime videos that misuse generative artificial intelligence (AI) technology at an early stage.
In addition, corrective requests for all digital sexual crime content last year totaled 94,185, up about 41% from the previous year (66,929).
The commission noted continuing concern over the distribution of deepfake sexual crime videos and illegal filming that exploit children and adolescents. It said it will pursue a range of measures going forward, including strengthened monitoring, encouraging self-regulation through cooperation with business operators, requesting investigations into malicious content, ongoing consultations, and collaboration with overseas operators and related organizations.