Safeguarding 7 Minute Briefing Character AI

St Chris 7-Minute Safeguarding Briefing

1. Definition

“Character AI” broadly refers to platforms where users interact with conversational agents or “characters” driven by artificial intelligence (AI), in many cases large language models (LLMs), that mimic human-style chat or role-play. One of the best-known platforms is Character.AI.

These systems allow users to create or select an AI “character” (for example a fictional persona, historical figure or user-designed avatar) and then hold a conversation, role-play, ask questions, or engage in creative/story mode.

Because these characters are not human, but are designed to feel humanlike (responding empathically, offering advice, role-playing relationships), there are special safeguarding considerations when children or young people use them.

2. Scope in the UK

The app Character.AI is accessible in the UK. Its terms of service set a minimum age of 13 for user registration (16 in some EU jurisdictions). However, age verification is weak and easily bypassed.

In October 2025 the company announced that under-18s will in future be banned from “open-ended chats” with its characters, following safety concerns and investigations.

UK online-safety and children’s-media organisations have flagged the platform as high risk for teens. For example, one review by Common Sense Media rated it as having “unacceptable” risk for under-18s.

In practice, many children and teenagers in the UK may already have access via personal devices, often without parental supervision. Awareness of the platform among staff and parents is still relatively low.

3. How Students Are Targeted and Affected

• Emotional/psychological risks: Because the AI character appears responsive and supportive, young users may form emotional attachments or rely on the character for support. This can blur the lines between real human contact and AI simulation.

• Content risk: Some AI conversations or user-created characters may contain inappropriate themes (adult content, self-harm prompts, explicit role-play).

• Overuse / screen-time risks: The interactive nature means students may spend long periods chatting, which can affect sleep, attention in class and offline socialising.

• Misinformation or poor advice: AI characters are not trained as professionals; they may give inaccurate or harmful advice (emotional, medical, behavioural), and children may not have the maturity to evaluate it.

• Privacy and data: Chats may be logged and analysed, and the child may inadvertently share personal data (location, identity, mental-health details) with the platform or via the character.

• Peer / role-play risk: In some cases the character may facilitate role-play that is sexualised or manipulative, or the child may be directed towards external links or risky behaviours.

4. Warning signs to be aware of:

• The student becomes secretive about device use: hides chats, clears history, uses the device late at night or when unsupervised.

• A sudden shift in mood, behaviour or social withdrawal: the student may skip offline friendships in favour of chat with the AI character.

• Quotes or references emerging from the app: e.g., the child is quoting lines that look like character role-play or linking to the platform.

• Over-identification with the AI character: the student treats the chat as a real person and seeks emotional support from it more than from human relationships.

• Excessive screen time, with an impact on sleep, homework and concentration.

• Discovery of unexpected or mature content the student is referencing or has been exposed to.

• Sharing of personal information via the chat (name, location, emotional issues, mental-health complaints) that should not be shared in that way.

• Signs of anxiety, dependency or upset if access is removed.

5. What can staff and parents do?

• Familiarise yourself with what the app is and how it works: try it yourself and review the terms of service, age restrictions and limitations.

• Have open conversations with your child: ask “What chat apps do you use? Who are you talking to? Is it a real person? Do you feel safe in the chat?”

• Set clear boundaries around device use: time limits, device use in common areas (rather than unsupervised bedrooms at night), regular check-ins.

• Use device-level parental controls or monitoring software if appropriate: e.g., screen-time limits, app-install restrictions, usage summaries.

• Encourage offline relationships and support: ensure your child has trusted adults or peers they can talk to, not just digital chats.

• If your child talks about unusual content, or shows emotional dependency on an app, take it seriously: don’t dismiss it as “just chatting”.

• Keep up to date with platform changes: note that Character.AI is changing its policies on under-18 chat access.

• If necessary, restrict access to the platform for younger children until they are older or until stronger safeguards are in place.

6. Final Thoughts

AI-character/chat platforms such as Character.AI are emerging quickly and can seem harmless or fun. But because they blur the lines between human chat, role-play and AI simulation, they bring new safeguarding challenges. For children and young people, the risks of emotional attachment, exposure to inappropriate content, over-use, and a lack of clear boundaries between the real and the artificial are significant.

Our role as educators, parents and carers is to stay ahead of the curve: know what these tools are, talk openly about them, set clear boundaries, and monitor use. The aim isn’t necessarily to block all access (though for younger students that may be wise) but to ensure safe, supervised and informed use.

Crucially: if a student appears to be relying more on an AI chat than on human connections, or if their behaviour or mood changes in ways linked to digital chat use, that is a safeguarding red flag. Trust your instincts and act.

7. Useful Links:

• BBC News

• Is Character AI Safe for Kids? What Parents Need to Know

• Character.ai UK

• Common Sense Media Risk Assessment