[00:00:00] Speaker 1: The UK is looking to regulate artificial intelligence to protect children using internet services. The government is pledging to respond more quickly with legislation to close loopholes in existing laws. The Prime Minister says no online platform will get a free pass over the issue of children's online safety. Campaigners and bereaved parents have welcomed the news. Our technology editor Zoe Kleinman reports.
[00:00:26] Speaker 2: Ellen Room's son Jules was 14 when he died in 2022. She believes that he was trying an online challenge which went wrong, but she's never been able to access his data to prove it. Under current rules, a child's data must be requested from tech companies within 12 months of their death by either a coroner or the police. But bereaved parents say often by the time that happens, this record of what their child has been doing online has already been deleted. Under new rules, coroners will have to tell Ofcom as soon as possible about the death of every child aged 5 to 17, so that the regulator can order tech companies to preserve their data if it might be relevant to how they died.
[00:01:10] Speaker 3: I always said that if I could try and make something positive out of the loss of Jules's life, then I would, and this going forward will help other bereaved families. What we now need to do is stop the harm happening in the first place. This is really relevant when a child dies, but we need to stop children dying in the first place.
[00:01:26] Speaker 2: The government's already announced that it's launching a public consultation in March. It's seeking opinions about options like banning social media for under 16s, restricting their access to AI chatbots and limiting infinite scrolling for children, otherwise known as doomscrolling. It says it will act quickly on the results, but critics are concerned this might mean the government giving itself powers to change laws without due process. A newly enforced rule banning the creation of AI deepfakes was rushed into action 10 days ago, following condemnation of Elon Musk's chatbot Grok for creating and sharing explicit images of real women on the social network X. AI chatbots like ChatGPT didn't exist when the Online Safety Act was being written. The government says it intends to close loopholes so that the tech is now included. Today's announcement is a clear sign from the government that children's online safety continues to be a top priority. And while it may be based on wheels that were already in motion, it's been welcomed by campaigners and bereaved parents, some of whom have spent years tirelessly fighting for change. Zoe Kleinman, BBC News.
[00:02:37] Speaker 1: Well, the Australian government's social media ban for under-16s has been in place since December last year. Let's speak to Professor of Communications at Edith Cowan University in Perth, Leila Green, to get more on this. Hi, Professor, good to have you with us. So how effective is this ban on social media for under-16s proving so far?
[00:02:57] Speaker 4: So far, the anecdotal evidence is that most teenagers have found workarounds. Certainly, it is mainly anecdotal at this point. So the social media platforms are telling us how many hundreds of thousands of accounts have been closed. But the young people that I've spoken to aged between 13 and 16 haven't noticed a difference. They had already prepared to move on to other platforms that weren't regulated. But they found that that's unnecessary because they managed to evade most of the age verification software so far.
[00:03:33] Speaker 1: Yeah, because the onus is on the tech companies, isn't it, to take reasonable steps to keep kids off their platforms. But would you suggest that's not working because they're not being strict enough?
[00:03:47] Speaker 4: I think that there's a lot of people trying to do a lot of things very quickly. But certainly, there's no sign that the tech companies are particularly motivated to go out of their way in this respect. It's good that it's a tech company responsibility. We want parents and children to continue to be able to talk to each other. Because actually, a lot of the harms have less to do with outside influences and more to do with the social circles that young people move in, and that's where social media does its damage. Not because it's a technical thing, but because it means you can't get away from your local social circle. Something horrible about you could be posted to a whole social circle immediately, and you don't know how to respond to that, because everyone you've ever spoken to suddenly gets told either erroneous information or information that you never intended to be made public. So some of those things are really difficult.
[00:04:47] Speaker 1: Now, I was going to ask, given the experience, and it's just six weeks or so, that's not very long, is it? But what lessons is Australia learning that the UK and other countries can take as they move forward with their policies on social media? There's a consultation coming next month here, and the government wants to move quickly off the back of that. What do you think other countries can learn about how to be more effective?
[00:05:15] Speaker 4: Well, it seems to me that it would be better if the countries got together and sorted out how they want the social media companies to regulate themselves better. Not so much locking people out of the platforms, but making the platforms safer for young people to be on. And we don't want to close down those channels of communication, because parents are a child's best place of safety if something goes wrong. Children need to be able to keep talking to their parents about things that happen to them online, without risking being punished because they shouldn't have been online in the first place. Otherwise people tend to react with "you shouldn't have been online" rather than "how can I support you through this? What do you think we should be doing about this particular challenge? What are your ideas?" What we're doing instead is closing down conversations.
[00:06:09] Speaker 1: What do you think safer looks like given that, you know, kids are taught about the dangers of posting inappropriate things on social media, and that that stays with them forever?
[00:06:24] Speaker 4: Yes, they're taught about that. But as we now know with AI, what's real and what's pretend, what's fabricated and what's authentic, is very difficult to know, very difficult to defend yourself against and very difficult to be genuine about. So we are in a new area here. Taking young people off these platforms is just going to delay things and possibly make things worse, because then they don't have channels of safety to talk to adults and trusted others about things that are going wrong. And a child's best response is still to get support from people, to talk through the options and to develop their own critical reasoning and response skills, with support from those who care for them.
[00:07:18] Speaker 1: Professor Leila Green of Edith Cowan University, thanks very much for joining us and for telling us more about what's happening in Australia. Well, let's speak to our political correspondent, Joe Pike, now, who is in Westminster. Hi, Joe. So tell us more about what's on the table for the government when it comes to their social media strategy.
[00:07:31] Speaker 5: Well, Samantha, the government have been accused by opposition parties of moving sluggishly. Of course, the challenge for politicians across the world is that technology, especially AI technology, is moving far faster than legislative bodies are. So what the government are now doing is announcing a consultation, starting next month, on a social media ban and other related issues. And then before the summer, so maybe June or July, they will set out their plans. They also want to close a couple of loopholes. One is on AI chatbots, where young people can have some pretty concerning conversations, conversations that in some prominent cases have been linked to suicide. The government want to ensure that tech companies are far more responsible there. The other change is around preserving the right data when a child has died: coroners in England and Wales will now need to report every child's death so that social media companies can be ordered to preserve data, in case a family wants to look into whether it contributed to the death.
[00:08:45] Speaker 1: And how quickly could all of this happen? The government want to create new powers to try and push it through, but that's controversial as well.
[00:08:52] Speaker 5: That's controversial because they want powers to bring in rules through small pieces of secondary legislation. It's effectively the power to tweak things without the full input of MPs. The government's argument is: look, this is moving so fast, this is changing so fast, we need the power to close loopholes without waiting for long legislative processes. However, of course, that does change the democratic role of MPs and peers. Some argue the evidence from Australia is pretty strong, but it's also worth pointing out there are other prominent campaigners who argue against a ban, some in particular worried about a possible cliff edge at 16, when suddenly 16-year-olds would have access to so many different social media platforms. There's criticism too from the Lib Dems and the Conservatives here in the UK about the pace of change, the Tories saying the government has chosen inaction. Meanwhile, Samantha, the Liberal Democrats have accused the government of continuing to kick the can down the road.
[00:09:55] Speaker 1: Yeah, and we were just hearing there from one expert in Australia saying their social media ban anecdotally doesn't necessarily seem to be working, that kids are finding their way around it, and calling for a coalition of countries to come together to work out how to talk to the tech companies. Is that something that the government is looking at?
[00:10:17] Speaker 5: I think they're certainly communicating with other ministers and trying to learn lessons from other countries. I think part of what Keir Starmer realises is that people really, really care about this. So he needs to be seen to be taking it seriously and empathising. That's why he'll be speaking in the next few hours alongside the Mayor of London, in particular arguing that he understands, because he has two teenagers himself. But one of the challenges facing countries across the world, of course, as you allude to, Samantha, is that arguably some of these big tech companies are more powerful than many nations, and encouraging them to change their practice and put in safeguards can be really challenging, especially when those apps and programmes and websites are shifting themselves pretty fast.
[00:11:07] Speaker 1: Okay, Joe, thank you very much for now.