Service pricing & terms
API pricing & terms
Estimate by minutes & options
Earn lifetime discounts
Savings for students & educators
Savings for charities & NGOs
Savings for climate orgs
Speed up research · 10% education discount
Compliant and confidential
Court‑ready transcripts
HIPAA‑compliant accuracy
Expand capacity and revenue
Evidence‑ready transcripts
Streamline team communications
Turn sessions into insights
Ready‑to‑publish transcripts
Education
Our story & mission.
Meet the people behind GoTranscript
Services across 140+ languages
How‑to guides & industry insights.
Open roles & culture.
Security & compliance overview.
Customer success stories.
Integrations, resellers & affiliates.
Find answers and get support, 24/7.
Schedule a call, confirmation within 24 hours.
Speak with a specialist about pricing and solutions.
High volume, API, labeling for AI.
Help with order status, changes, or billing.
Ask anything about GoTranscript.
Explore open roles and apply.
PO setup, Net‑30 terms, and .edu discounts.
30,000+ Professional Language Experts Ready to Help. Expertise in a variety of niches.
Unmatched expertise at affordable rates tailored to your needs. Our services empower you to boost your productivity.
GoTranscript is the chosen service for top media organizations, universities, and Fortune 50 companies.
One of the Largest Online Transcription and Translation Agencies in the World. Founded in 2005.
We're with you from start to finish, whether you're a first-time user or a long-time client.
Give Support a Call
+1 (831) 222-8398
[00:00:00] Speaker 1: Tonight, the mother of one of Elon Musk's children is speaking out and sounding the alarm about his AI tool named Grok. Ashley St. Clair says Grok was used to create deepfake sexual images of her, including photos of her as a child. And St. Clair is not the only one reporting these types of disturbing images. It's impacting hundreds of thousands of women worldwide after people started flooding Grok with requests to take photos of mostly everyday women and put them in bikinis or remove their clothing. Musk is now facing growing international pressure to respond. Grok was just blocked in Indonesia and Malaysia. The United Kingdom is threatening to ban the social media platform. California's attorney general just announced an investigation. So far, though, Musk is taking no responsibility, posting today that he's, quote, not aware of any naked underage images generated by Grok, literally zero. Ashley St. Clair is out front now. And Ashley, I really appreciate your taking the time to come on and talk about this, because this is crucial for women, for children, for families around the world. So Musk says Grok generated zero naked underage images. But you say it undressed you in several photos, including one of when you were 14 years old.

Speaker 2: That's correct.
[00:01:14] Speaker 2: And what he said is deceptive at best, because while maybe there weren't actual nude images, it was pretty close to it. And the images that I saw, not only of myself, but of, I don't know whose children, who were undressed and covered in various fluids. The abuse was so widespread and so horrific. And it's still allowed to happen. They just released restrictions that are based on where it's illegal.
[00:01:39] Speaker 1: OK, so Musk has promoted and touted what he's called Grok's spicy mode. At one point he shared a post that said Grok can put it on everything. And it was a toaster. So we can see the bikini on the toaster there. And his statement today, as I said, said nothing more about naked underage images than that he hadn't seen them. And then he said, obviously, Grok does not spontaneously generate images. It does so only according to user requests. When asked to generate images, it will refuse to produce anything illegal. What do you see between those lines, Ashley?
[00:02:14] Speaker 2: That's not what I saw at all. The images I saw do seem to be illegal. And even them coming out and now trying to place safeguards afterwards seems like an admission that they know there has been an issue. That it has been creating nonconsensual sexually explicit images of women and children. He is saying that people are making this up. And meanwhile, Ireland is probing over 200 cases of child sexual abuse material produced by Grok. 200? 200. And that's just in Ireland? And that's just Ireland. And it's not only that, it's that he is placing the blame on the victims, so that if this happens to you, you have to go to your local law enforcement and take their resources and see if they can find this anonymous account, instead of just turning the faucet off. This is what's wrong, because they're handing a loaded gun to these people, watching them shoot everyone, and then blaming them for pulling the trigger.
[00:03:08] Speaker 1: And I mean, what you're describing, I mean, I obviously haven't seen the images, but you're talking about images of children covered in fluids. I don't need to say any more, but what you're describing is deeply, deeply disturbing even to hear. Now look, you are choosing to speak out, okay, because you have a voice on this. Because people can hear you. People listen to you. But you've faced backlash since you chose to do that. You were banned from Twitter Premium, right? So you say there have been some things that have happened to you in terms of that. You've had harassment. You've had attacks on X, right? The platform where Grok lives, or whatever the right word may be. Why is it so important to you, Ashley, to take this extra step to put your face out here like you are now?
[00:03:48] Speaker 2: Because it's not just about me. It's about building systems and AI systems which can produce at scale. and abuse women and children without repercussions. And there's really no consequences for what's happening right now. They are not taking any measures to stop this behavior at scale. They are saying, we're going to make it illegal where it's illegal. That is absent all morality. And guess what? If you have to add safety after harm, that is not safety at all. That is simply damage control. And that's what they're doing right now.
[00:04:18] Speaker 1: And also, I guess in this context, it is disturbing when you think about, you know, the post that Elon Musk had originally put up about Grok can put a bikini on everything. Now, he was talking about a toaster. But sort of making light of all of it, given where it's gone, you would think it would be something where you'd want to take responsibility and say, I'm outraged, we're going to stop this. That's not been the response.
[00:04:41] Speaker 2: They're not. And people need to start asking questions not only to X and Elon, but also to the investors. Because amidst the scandal, they raised $20 billion. All these women were in my DMs telling me they feel hopeless, they are distressed, and they don't know how to get certain images removed. Because you can't even get them removed.
[00:05:00] Speaker 1: Once it lives, it lives. All right. So Elon Musk has posted on X this week about you. Obviously, you're out talking about this and trying to raise awareness for this. He posted about you, I'll be filing for full custody today, given her statements implying that she might transition a one-year-old boy. Now he's, I believe, talking about your child. Yeah. So obviously, you did not imply that in the post that you posted that he's responding to, which I want to read to everybody so they can hear it for themselves. You are apologizing for past comments that you'd made about the transgender community. You wrote, I feel immense guilt for my role and even more guilt that things I have said in the past may have caused my son's sister more pain. Your son's sister is another one of Musk's children who has transitioned. What's your response when that's the post this week?
[00:05:49] Speaker 2: I'm not at liberty to discuss it further. But I think anyone with more than a third-grade reading comprehension level knows what I was saying there.
[00:05:57] Speaker 1: Yeah. I mean, you, I guess, so many questions to ask you. And I know that you're not at liberty to talk about things like custody and that. But I guess I'm curious, in all of this, how hard it must be for you. I mean, do you want your child to have a relationship with his father? I cannot discuss that right now. Right. You can't even talk about that. Thank you. Yeah. And I guess in the context of this, what do you do now?
[00:06:25] Speaker 2: I'd like to keep the focus on this issue at hand. Because I think there's a lot of people, especially with very tabloidy tweets, that are trying to distract from this issue. Because it is a big issue. They are facing immense backlash from multiple countries. And they are trying to move something to the second page of Google because they want it to go away. And I don't think we should make it go away. I think this needs to stay center focus until there are regulations and protections and safety for people being abused by these very new platforms, these new technologies. And you can't brag and take credit for being a disruptor in an industry and then absolve yourself of all liability once this damage and harm is done.
[00:07:04] Speaker 1: All right. Well, Ashley, thank you very much. I appreciate your time. Thank you. And your speaking out.