
HSE Researchers Assess Creative Industry Losses from Use of GenAI

© IPQuorum.Music

Speaking at the IPQuorum.Music forum on October 15, Leonid Gokhberg, HSE First Vice Rector, and Daniil Kudrin, an expert at the Centre for Industry and Corporate Projects of HSE ISSEK, presented the findings of the first study in Russia on the economic impact of GenAI on creative professions. The analysis shows that creators’ potential losses could reach one trillion roubles by 2030.

The pace of technological innovation is accelerating every year: while it took the telephone 75 years to 'take over the world,' DeepSeek reached 100 million users in just two weeks. Today, GenAI models are rapidly gaining popularity, enabling users to generate text, video, images, and audio. There are now at least 1,000 AI models on the market designed for creative tasks and capable of competing with human creators. 

Experts at HSE ISSEK have developed a unique methodology for quantifying the losses faced by authors and copyright holders as a result of the adoption of AI. The study focuses on 12 creative professions that are most vulnerable to GenAI. For the first time, researchers have estimated the share of generative content in different types of creative work and assessed the income decline experienced by 'traditional' creators who do not use neural networks. Notably, the study deliberately excludes the potential positive effects of AI, such as increased productivity and reduced barriers to entry.

The findings indicate that total income losses for members of creative professions could reach 1 trillion roubles by 2030. Programmers and developers, as well as advertising and marketing specialists, are expected to be hit the hardest. Generative models have brought new players into the creative industries, including both institutional actors—such as IT companies, developers, and digital platforms—and individuals, such as amateur creators. The production and distribution of AI-generated content by these new players pose a serious threat to the revenues of writers, translators, journalists, and composers: according to experts, AI penetration in these professions will exceed 25%.
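
The article does not publish the calculation behind these figures. Purely as an illustration of the logic described above, the sketch below multiplies a hypothetical annual income pool for each profession by an assumed GenAI penetration share and a displacement factor; the profession groupings, income pools, penetration shares, and displacement rate are placeholder assumptions, not values from the HSE ISSEK study.

# Illustrative sketch only: not the HSE ISSEK methodology.
# All income pools, penetration shares, and the displacement rate are hypothetical.

professions = {
    # profession: (annual income pool, bn roubles; assumed GenAI penetration by 2030)
    "writers_translators": (120, 0.27),
    "journalists": (150, 0.26),
    "composers": (80, 0.25),
    "advertising_marketing": (400, 0.35),
}

# Assumed share of income in the AI-penetrated segment that shifts away from
# 'traditional' creators who do not use neural networks (hypothetical).
DISPLACEMENT_RATE = 0.6

def estimated_loss(income_pool_bn: float, penetration: float) -> float:
    """Projected income loss for one profession, in billions of roubles."""
    return income_pool_bn * penetration * DISPLACEMENT_RATE

total = sum(estimated_loss(pool, pen) for pool, pen in professions.values())
for name, (pool, pen) in professions.items():
    print(f"{name}: ~{estimated_loss(pool, pen):.0f} bn roubles")
print(f"Total (these professions only): ~{total:.0f} bn roubles by 2030")

The actual study presumably works with far more detailed parameters per profession; the sketch only shows how penetration shares and income pools combine into an aggregate loss estimate.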

Leonid Gokhberg
© IPQuorum.Music

'We live in an era of exponential digital development,' says Leonid Gokhberg, First Vice Rector of HSE University. 'Innovation cannot be stopped, but it is crucial to foster a balanced dialogue between creators, technology companies, and the government. GenAI should not become a threat to creative professions. Rather, it should be viewed as a tool that complements human imagination, not replaces it.'

The researchers emphasise that current AI solutions are still far from fully replacing humans, as generative models are prone to 'hallucinations' and errors, and can pose reputational risks. In practice, AI often serves as an assistant, helping creators with editing, reference selection, translation, and technical problem-solving.

The HSE ISSEK study represents the first step toward a systematic analysis of GenAI’s impact on the intellectual property market and toward developing solutions that balance innovation with the protection of creators’ interests.

A full recording of the plenary session 'Authors in the Service of AI' is available here.

See also:

HSE AI Research Centre Simplifies Particle Physics Experiments

Scientists at the HSE AI Research Centre have developed a novel approach to determining robustness in deep learning models. Their method works eight times faster than an exhaustive model search and significantly reduces the need for manual verification. It can be applied to particle physics problems using neural networks of various architectures. The study has been published in IEEE Access.

Educational Programmes on Robotics and Neural Network Technologies Launch at HSE University’s Faculty of Computer Science

Every year, in response to IT industry demands, the Higher School of Economics Faculty of Computer Science launches new educational programmes while updating existing ones. In 2026, the faculty is launching Bachelor’s and Master’s degree programmes in robotics for the first time.

Scientists Show That Peer Influence Can Be as Effective as Expert Advice

Eating habits can be shaped not only by the authority of medical experts but also through ordinary conversations among friends. Researchers at HSE University have shown that advice from peers to reduce sugar consumption is just as effective as advice from experts. The study's findings have been published in Frontiers in Nutrition.

‘Policymakers Should Prioritise Investing in AI for Climate Adaptation’

Michael Appiah, from Ghana, is a Postdoctoral Fellow at the International Laboratory of Intangible-Driven Economy (IDLab) at HSE University–Perm. He recently spoke at the seminar ‘Artificial Intelligence, Digitalization, and Climate Vulnerability: Evidence from Heterogeneous Panel Models’ about his research on ‘the interplay between artificial intelligence, digitalisation, and climate vulnerability.’ Michael told the HSE News Service about the academic journey that led him to HSE University, his early impressions of Perm, and how AI can be utilised to combat climate change.

HSE University to Host Second ‘Genetics and the Heart’ Congress

HSE University, the National Research League of Cardiac Genetics, and the Central State Medical Academy of the Administrative Directorate of the President will hold the Second ‘Genetics and the Heart’ Congress with international participation. The event will take place on February 7–8, 2026, at the HSE University Cultural Centre.

HSE University Develops Tool for Assessing Text Complexity in Low-Resource Languages

Researchers at the HSE Centre for Language and Brain have developed a tool for assessing text complexity in low-resource languages. The first version supports several of Russia’s minority languages, including Adyghe, Bashkir, Buryat, Tatar, Ossetian, and Udmurt. This is the first tool of its kind designed specifically for these languages, taking into account their unique morphological and lexical features.

HSE Scientists Uncover How Authoritativeness Shapes Trust

Researchers at the HSE Institute for Cognitive Neuroscience have studied how the brain responds to audio deepfakes—realistic fake speech recordings created using AI. The study shows that people tend to trust the current opinion of an authoritative speaker even when new statements contradict the speaker’s previous position. This effect also occurs when the statement conflicts with the listener’s internal attitudes. The research has been published in the journal NeuroImage.

Language Mapping in the Operating Room: HSE Neurolinguists Assist Surgeons in Complex Brain Surgery

Researchers from the HSE Center for Language and Brain took part in brain surgery on a patient who had been seriously wounded in the SMO. A shell fragment approximately five centimetres long entered through the eye socket, penetrated the cranial cavity, and became lodged in the brain, piercing the temporal lobe responsible for language. Surgeons at the Burdenko Main Military Clinical Hospital removed the foreign object while the patient remained conscious. During the operation, neurolinguists conducted language tests to ensure that language function was preserved.

HSE MIEM and AlphaCHIP Innovation Centre Sign Cooperation Agreement

The key objectives of the partnership include joint projects in microelectronics and the involvement of company specialists in supervising the research activities of undergraduate and postgraduate students. Plans also focus on the preparation of joint academic publications, the organisation of industrial placements and student internships, and professional development programmes for the company’s specialists.

AI Overestimates How Smart People Are, According to HSE Economists

Scientists at HSE University have found that current AI models, including ChatGPT and Claude, tend to overestimate the rationality of their human opponents—whether first-year undergraduate students or experienced scientists—in strategic thinking games, such as the Keynesian beauty contest. While these models attempt to predict human behaviour, they often end up playing 'too smart' and losing because they assume a higher level of logic in people than is actually present. The study has been published in the Journal of Economic Behavior & Organization.