Gaming industry’s new boss battle: data protection authorities level up
In the high-stakes world of online gaming, a new boss battle has emerged – and it’s not part of the gameplay. Data Protection Authorities (DPAs) are flexing their muscles, imposing hefty fines and reshaping the industry’s approach to user data.
Data has become an important resource for the gaming industry. Games collect data on users through hardware and gameplay. This data serves a range of purposes, from improving games and personalising the gaming experience to marketing and monetising the unique datasets collected through sensors embedded in the devices.
The gaming industry’s data-driven success has caught the attention of regulators worldwide. The 2022 case against Voodoo, resulting in a €3m fine for GDPR violations, served as a stark warning that compliance is non-negotiable for all companies serving EU players. In Spain, the DPA imposed a €90,000 fine on a Chinese company for data minimisation and security breaches in game forum moderation, suggesting that the European DPAs have their eye on the gaming industry.
The crackdown is global. In the US, Microsoft faced a $20m penalty for illegally collecting children’s data on Xbox, while Epic Games settled for a staggering $520m over children’s privacy violations. Meanwhile, Nordic DPAs are joining forces, adopting joint principles on children and online gaming to ensure GDPR compliance across the region.
This regulatory surge demands a strategic shift. Gaming companies must prioritise data protection compliance while still harnessing the value of user information. Those who successfully navigate this landscape will find themselves with a competitive edge, building trust with players and avoiding costly penalties. In this new era, responsible data practices aren’t just a legal requirement – they’re a business imperative for sustainable growth in the dynamic global gaming market.
AI in gaming: copyright minefield or creative goldmine?
Generative AI is revolutionising the gaming landscape, promising dynamic storylines, lifelike non-player characters (NPCs) and personalised content. But as this technology reshapes the industry, it’s also raising complex legal questions about copyright and ownership.
The integration of large language models (LLMs) and generative AI systems in NPCs is transforming game development. Companies like Convai are fine-tuning existing LLMs to create believable human-like NPCs with unique personalities, facial expressions and behaviours. This technology offers unprecedented opportunities for immersive gameplay and dynamic storytelling.
The legal world is scrambling to keep pace. Recent court rulings paint a confusing picture: Beijing’s Internet Court granted copyright to an AI-generated image, while Czech courts balked at such protection. The European Commission proposes a four-step test for AI-assisted output, emphasising human creativity. At the heart of the debate: can we copyright AI-generated content? If it is not protected, what does that mean for licensing and copying? And what about potential infringement in machine learning datasets?
Furthermore, new regulation requires transparency about the use of AI-generated content. Does this also apply to AI-generated content used in a game? Video Games Europe advocates for a risk-based approach, arguing that synthetic media concerns are less relevant in fictional game worlds.
AI agents may be artificial, but the relationships we build with them are real, and so is the potential for ethical disaster. The integration of AI into gameplay raises legitimate concerns about the potential for AI to exploit human emotions and vulnerabilities. For instance, AI-driven NPCs could be designed to manipulate player behaviour, encouraging excessive gameplay or in-game purchases. This raises questions about the ethical boundaries of game design and the responsibility of developers.
Moreover, as AI-driven relationships in games become more sophisticated, we must consider the psychological impact on players. How do we address the blurring lines between real and virtual relationships? What are the implications for player mental health, especially for vulnerable groups?
There’s also the issue of bias in AI systems. If not carefully designed and monitored, AI in games could perpetuate harmful stereotypes or discriminatory behaviour. This calls for diverse development teams and rigorous testing protocols.
Game developers who navigate this complex legal terrain while harnessing AI’s potential stand to gain significant competitive advantages. By responsibly integrating AI, companies can create more engaging, personalised gaming experiences that drive player retention and open new revenue streams. Those who successfully balance innovation with legal compliance will be best positioned to capitalise on the creative and commercial opportunities ahead, shaping the future of interactive entertainment.
Yet, the question remains: will AI prove to be a creative goldmine or a legal and ethical minefield for gaming? Only time will tell, as more landmark cases challenge the boundaries of ownership and creativity. One thing is certain: the game is changing, and the rules are still being written.
Child’s play no more: gaming industry to step up protections for minors
Data protection authorities are not the only ones looking out for children. The gaming industry is facing a slew of stricter requirements for minors. As authorities worldwide level up their enforcement, gaming companies face their toughest challenge yet – protecting their youngest users or risking game over.
Recent penalties highlight the stakes: a €1.125m fine imposed on Epic Games for pressuring children into purchases, and, in the US, a $500,000 settlement accepted by Tilting Point Media for inadequate age verification, inappropriate advertising to children and selling children’s data collected without parental consent through the company’s “SpongeBob: Krusty Cook-Off” mobile app.
The UK’s Online Safety Act targets platforms with user-to-user functionality, potentially affecting age ratings for games with features like in-game chat. Meanwhile, the European Commission is investigating various potential Digital Services Act breaches regarding minor protection.
Gaming companies must revamp age verification methods to meet new legal obligations. Simple one-click verification no longer suffices. In this new regulatory era, proactive compliance and robust child protection measures are essential. Forward-thinking companies are turning this challenge into an opportunity, building trust with players and parents alike.
Dark patterns in gaming: the hidden boss players never knew they were fighting
Free-to-play games have revolutionised the industry, but they’ve brought a sinister companion: dark patterns. These manipulative design tactics, lurking in game mechanics, push players towards excessive spending and potentially harmful addiction.
One such example is the “pay-to-win” model, which – when taken to extremes – creates an uneven playing field where high-spending players gain significant advantages. For instance, the iPhone game Game of War, Fire Age received criticism for being impossible to beat unless you spend money – despite being marketed as “free to play”. In games like FIFA Ultimate Team, players can buy packs with real money that contain random player cards, giving paying players a better chance of building stronger teams. Similarly, in Genshin Impact, the most powerful characters and items are often only obtainable through paid “gacha” systems, creating a significant power gap between paying and non-paying players. This approach not only raises ethical concerns but also risks violating consumer protection laws, blurring the line between gaming and gambling.
Recent regulatory actions highlight the issue’s importance. As briefly mentioned above, in May 2024, the Netherlands Authority for Consumers and Markets (ACM) fined Epic Games €1.125m for pressuring young Fortnite players through aggressive ads and misleading countdown timers. The ACM characterised design features such as these as ‘dark patterns’.
The case is reflective of current regulatory concerns. In June 2022, the EU’s Consumer Protection Cooperation Network endorsed five key principles for fair advertising to children, emphasising consideration of their vulnerabilities and avoiding deceptive practices. In August 2023, the UK’s Information Commissioner’s Office and the Competition and Markets Authority issued a joint position paper on harmful design in digital markets, outlining four expectations businesses must meet to support good Online Choice Architecture (OCA) practices. While not specifically targeting games, these principles could apply to practices seen in some free-to-play games like Candy Crush Saga, where time-limited offers and complex in-game economies might pressure players into impulsive purchases.
Game developers face a challenging balancing act. While some design choices can boost short-term revenues, they risk long-term player trust and brand reputation if the designs turn to ‘dark patterns’. Companies that venture on the wrong side of this fine line may find themselves facing hefty fines and legal battles.
The gaming industry faces complex challenges in balancing monetisation with player experience and autonomy. While dark patterns exist, many developers strive for ethical design. Moving forward, the focus should be on creating games where the main challenges lie in gameplay, not in resisting manipulative tactics. By embracing transparency and fair play, developers can level up player trust while maintaining commercial success. As games often challenge players to make meaningful choices, so too can the industry choose paths that prioritise both player enjoyment and ethical business practices.
From pixels to podiums: IOC’s Olympic Esports Games spark legal revolution
The Olympics are about to go digital as the International Olympic Committee (IOC) gives the green light to the inaugural Olympic Esports Games in Saudi Arabia in 2025. This groundbreaking move isn’t just a game-changer – it’s a whole new playing field.
Esports, which covers competitive video gaming and virtual sports, has undergone a remarkable transformation, evolving from a game for hobbyists into a professional electronic sport that is reshaping the competitive landscape of the gaming industry. Live tournaments now command audiences of millions, with top esports athletes achieving celebrity status, complete with lucrative sponsorships and devoted fan bases. The industry has grown to include professional leagues, dedicated arenas and significant prize pools, often rivalling traditional sports in terms of viewership and revenue generation.
The IOC’s endorsement not only legitimises esports but aims to engage younger generations, potentially redefining the very concept of competitive sports. However, this digital gold rush brings a host of legal challenges.
Intellectual property issues present significant hurdles, with complex rights negotiations between game developers, tournament organisers and players. The professionalisation of esports demands robust contract law frameworks governing player agreements, sponsorships and broadcasting rights. Moreover, esports will need to establish new governing bodies or seek sanctioning by international sports federations and/or the IOC.
The IOC’s endorsement marks a pivotal moment for esports, but blurred lines between virtual and physical competition necessitate new legal frameworks to support this digital sporting revolution. The game has changed – now it’s time for the rules to catch up.
Reign Lee is the head of strategy at Van Bael & Bellis in London. She has significant experience working with brands, funds, online marketplaces as well as creative industries (particularly music), and specialises in UK/EU digital regime counselling and compliance.
Thibaut D’hulst is the head of the data protection and intellectual property practices at Van Bael & Bellis in Brussels. He regularly advises clients on all aspects of intellectual property law, including strategies to protect trademarks, databases and other intellectual property, as well as litigation and technology projects related to compliance with intellectual property, data protection and/or pharmaceutical laws.
Ossama M’Rini is an associate in the Van Bael & Bellis commercial team, based in Brussels. He advises domestic and international clients on a wide range of commercial law issues, with a focus on IT/IP and data protection. He is a member of the firm’s AI taskforce, where he examines issues at the intersection of law and technology.