Baroness Beeban Kidron has spent her career exploring how film can improve children’s lives. As a director she has divided her time between London and Hollywood, with films including Bridget Jones: The Edge of Reason (2004) and Oranges Are Not the Only Fruit (1989); these days, as a cross-party peer, much of her time is spent travelling between London and Washington, meeting legislators and tech companies to encourage a more thoughtful, responsible approach to how children are treated in the digital space.
The FilmClub charity she co-founded, later renamed Into Film, uses film as a way of educating young people; now, with the 5Rights Foundation, she is campaigning for the introduction of new standards for age-appropriate digital services. Created in collaboration with the Institute of Electrical and Electronics Engineers (IEEE), the IEEE 2089-2021 Standard for an Age Appropriate Digital Services Framework introduces practical steps that companies can follow when designing digital products. It is based on five key principles: presenting information in an age-appropriate way; upholding children’s rights; offering fair terms for children; recognising childhood; and putting children ahead of commercial interests.
I meet Kidron in the cafeteria of the House of Lords, where she begins by explaining how her interest in children’s rights stems from her film work. “I always went between making features and documentaries,” she says. “This came to a crossroads in 2012, when a smartphone became cheap enough for a parent to give it to their child. I noticed that all the kids around me suddenly changed and I thought, ‘that’s interesting’.”
Kidron’s documentary InRealLife (2013) examined how the lives of teenagers in the UK have been affected by the internet and, specifically, social media. “In the course of making the film, I realised that something enormous was happening that the world wasn’t watching,” she says. “I felt it was a generational injustice.
“I was struck by the fact that the kids went quiet. I always think of groups of young people as being both here and somewhere else. And I wondered what that split might feel like in the long run. In the course of making the film, I looked at things about data, surveillance and profiling, and many of them are unfair commercial practices, in my view.
“It was absolutely extraordinary – the internet seemed to have some sort of exemption from the rules, including children’s rights, which are very well established and widely adhered to elsewhere. If you don’t have rights online, it follows that you don’t have rights at all.
“I didn’t realise at the time that I would start walking in a direction of legislation and policy, but that was the trigger.”
The pressure placed on children by social media – and the urgency of dealing with the problem – was brought into sharp focus recently by the death of British teenager Molly Russell. In a landmark verdict at the inquest in September, coroner Andrew Walker concluded that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content”. Her father, Ian Russell, accused Facebook and Instagram owner Meta of “monetising misery”.
In 2019, Kidron chaired a UN committee that applied children’s digital rights to the Convention on the Rights of the Child. She was also behind a UK data protection law that “places responsibility on companies to provide well-designed services, rather than responsibility on parents and children to behave well, under circumstances in which they were being pushed to behave badly”.
Kidron is keen to stress that the fight for improved digital rights for children is not just about the content available to them; it is about a whole range of harmful and disruptive commercial practices.
“I am talking about direct messaging to kids; I’m talking about nudging kids to be awake through the night so they can’t wake up in the morning; I’m talking about keeping them in loops of anxiety about being popular, endlessly texting them, the social pressure to look like the most beautiful people in the world,” she says.
What has been the response from the tech industry itself to Kidron’s activities? “In the beginning, they were very dismissive,” she says. “I’ve certainly been the subject of ad hominem attacks and intensive lobbying by industry groups.
“However, I have always wanted to work with industry. My point has always been that government should not set the technical standard; they should set the principles by which they’re going to judge industry, and industry has the responsibility, skills and imagination to innovate. In the places where we’ve passed the law, and they have innovated to meet that law, they have been remarkable and brilliant. We’ve seen really creative solutions.
“But there’s still too much drag, too much resistance. I’d like to see them hurtling forward, rather than resisting at all points.

“I’ve also learned a great deal from them, in trying to understand what is difficult for them. Sometimes we can think of it in a more creative or pragmatic way without actually lessening the purpose. That’s a very important conversation to have as a policymaker.”
There is perhaps a common perception that ‘Silicon Valley culture’ is antipathetic towards regulation, favouring a more libertarian approach. Does that hold true, I wonder? “It’s definitely true historically, and it’s particularly visible in some of the household names and early adopters,” Kidron believes.
“Now, though, you have to be a bit more nuanced; there are hundreds of thousands of people working in this sector. Many of them are parents, and many of them are worried or embarrassed, and don’t actually want to be part of an industry that is greedy and toxic. And many of them are talking to me, trying to build better products, trying to persuade people within their own companies that this is the right thing to do.
“Then there are corporate naysayers and a lot of people who are paid handsomely to make sure nothing good happens,” she adds. “It’s foolish to tar everyone with the same brush though – ultimately, we must judge the sector by what it actually does. If we can’t see the different shades of grey, maybe we miss an opportunity.”
Finding the floor
To ensure that children’s rights are built into the digital space from the ground up, Kidron believes it is necessary to establish a ‘floor’ – a benchmark for practices which becomes the industry standard. “We want to see that floor raised when it comes to behaviour,” she says. “That goes across all of the technology and all of the companies.
“Children should be treated according to their rights and their development needs and, therefore, according to their age. What that really means is what we call the ‘four Cs’: content, contact, conduct, and commercial arrangements.
“When I started this, 75% of the social media companies enabled a stranger – an adult – to direct-message a child. That’s a contact issue. That’s a feature, it’s not a bug. It’s what they were trying to make happen, rather than preventing. If you ask any minister, any parent or indeed anyone in this room when is it a good idea to automate strange adults meeting underage children, you come up with only one answer.
“A floor never means you can’t do better. There is no ceiling to creating supportive, exciting, successful products that children use. We are very keen on default child-centred design, data protection and privacy, and taking a rights-based approach to create a broadly held understanding of what children deserve in the real world, and push that into the minds and practice of the sector.”
There are certain digital platforms or products, I suggest, that exploit children by design. How do they fit into the responsible culture the 5Rights Foundation is trying to promote?
“I do think that there are some companies whose sole purpose is either gathering data to sell on, or whose sole purpose is to sell children unnecessary assets that do not really add value to their lives,” Kidron says.
“However, some things that might be bad for you in larger amounts are not necessarily bad for you in small quantities. If I have an ice cream once a week, for example, that’s not really a problem. But some of these products are very intense, and they push kids into behaviours that are sort of endless.
“Whenever you’re dealing with a kid, you shouldn’t be selling them certain things; if you’re dealing with a kid who seems to have a credit card, you’d better check with the credit card holder; if you’re dealing with a kid and you’re just scraping their data, it’s not in their best interest.
“You start very, very quickly to say, actually, that particular service needs to redesign itself for the kid. It doesn’t mean it has to be totally different for all users. I want to emphasise that this should be proportionate and risk-based. I’m very careful about not demonising one product over another, but actually demanding a better experience for the user.”
In the same way that Kidron adopts a collaborative approach to working with industry, she is keen to avoid politicising her efforts. “This is a bipartisan issue,” she says. “I have never on my journey found that people divide on political, ideological grounds on this. The more people know, the more worried they are.
“My message to my colleagues and to other jurisdictions is, we have got to get it right. It mustn’t be allowed to become a battlefield. It is a responsibility, in a world so dominated by digital, to make sure that kids get a fair shake. Don’t turn them into rats in the experiment.”
Its emphasis on fairness, protecting the vulnerable and encouraging good behaviour on the part of business places Kidron’s work squarely within Trading Standards’ general ethos. So what implications does it have for those working in consumer protection?
“If you think about Trading Standards as the ‘real-world’ version of what we’re seeking to do in the digital world, it’s an inspiration,” Kidron says.
“In the real world, Trading Standards sets a floor of behaviour for the community. If we have decided that children should not smoke cigarettes, Trading Standards then sets the bar by which you are judged to have achieved or not achieved that goal. The same applies to whether something is safe and fit for purpose, whether that is food or any other product.
“Essentially, we’re talking about product safety here. I think the biggest obstacle to having in the digital world what most people would like to see, has been the industry’s claim of exceptionality. But actually, if you just look at it as a product, then the relevance of Trading Standards becomes obvious.
“I find the Trading Standards approach pragmatic, fair and successful. Proportionate regulation is what I believe in; I don’t think we have to make everybody on the planet a good person. I think we just have to take away bad actors.”
Finally, I ask what the role of parents is in all this. “You have to be careful, because you should never give people responsibility for things that they can’t really impact,” cautions Kidron. “If a parent goes to work, they come home and cook the dinner, they go to sleep; can they also be 24/7 on a kid’s phone? Secondarily, why should we make parents responsible for bad business practices?
“I think that the best way of looking at it, is that we have a vehicle at the top of the hill with no brake, no rearview mirror, and a kid at the wheel – and then we tell the parent to stand in front of that vehicle and make sure the kid’s all right. That’s not okay. And that’s not where we should put parents.”