Google is testing a new “Gemini for Kids” AI assistant aimed specifically at children under 13, according to a new report. The news comes as the Children’s Commissioner for England, Dame Rachel de Souza, warns against children turning to online chatbots for answers instead of their parents.
April 5 Update below: Child safety concerns intensify as U.S. senators push for information from AI chatbot makers
Children are already using chatbots for advice: According to the Commissioner, “If we want children to experience the vivid technicolour of life, the joy of childhood, the innocence of youth, we have to prove that we will respond more quickly to them than Chat GPT.”
Google’s plan to make a child-friendly version of Gemini could therefore be seen as a responsible alternative to banning kids from AI-powered assistants altogether, since it puts parents in control.
The Switch To Google Gemini Is Unavoidable
Google is in the process of replacing its original (non-AI) Google Assistant with Gemini and already has policies in place that allow parents to restrict their children’s use of the original assistant. However, when the original assistant is gone, there will be no alternative but to use its AI-powered replacement. This means younger users will need similar protections in place as they switch over to Gemini.
Gemini is, however, radically different from Google Assistant and considerably more powerful. Gemini functions more like chatting with a real person than with a pre-programmed robot, so the potential for misinformation and inappropriate content is considerably higher. This means Google must put extra safeguards in place.
As detailed below, proposed changes to the Google app on Android include an explicit warning for children, stating “Gemini isn’t human and can make mistakes, including about people, so double-check it.” The crucial question here is whether children are equipped with the necessary critical thinking skills to check Gemini’s responses. This was less of a concern with Google Assistant.
Google Prepares Gemini AI Chatbot Specifically For Children
Google’s plans for “Gemini for Kid Users” were recently discovered by Android app specialist AssembleDebug working with Android Authority. The report found inactive code within the latest version of the Google app for Android containing the following text (reformatted for clarity):
- Assistant_scrappy_welcome_screen_title_for_kid_users: “Switch to Gemini from Google Assistant”
- Assistant_welcome_screen_description_for_kid_users: “Create stories, ask questions, get homework help, and more.”
- Assistant_welcome_screen_footer_for_kid_users: “Google Terms apply. Google will process your data as described in the Google Privacy Policy and the Gemini Apps Privacy Notice. Gemini isn’t human and can make mistakes, including about people, so double-check it.”
Gemini For Kids: A Vital First Step
As “Gemini for Kids” isn’t yet released to the public, it’s too early to say how effective, or otherwise, Google will be in implementing these important safeguards. However, giving parents control is an essential first step and integrating these safeguards with Google’s established system of parental controls will give Gemini an advantage over competing chatbots, such as ChatGPT, which currently lacks this capability.
The increasing prevalence of AI means it likely won’t be possible to keep kids away from it entirely, but Google’s efforts should be seen as a positive step in avoiding some of the scenarios highlighted by Dame Rachel de Souza.
You can find the full Gemini apps privacy notice in Google’s Gemini Apps Privacy Hub.
Google Gemini For Kids: Growing Concerns
April 5 Update: Google’s planned “Gemini for Kids” update proves both timely and essential in the light of recent developments.
The need for Google’s kid-friendly Gemini update intensifies as United States senators Alex Padilla and Peter Welch formally request that AI chatbot sites Character.ai and Replika.com provide information on their safety measures, according to CNN, writing:
“We write to express our concerns regarding the mental health and safety risks posed to young users of character- and persona-based AI chatbot and companion apps.”
Their demands come after Character.ai came under scrutiny from parents who believe their children have come to serious harm following use of the chatbot, says CNN.
As a general-purpose AI assistant, Google’s Gemini is in a different category from products like Character.ai and Replika, which are designed to imitate personalized, humanlike characters and companions. However, safeguards for children remain vital: adults and children alike are exposed to increasing amounts of AI-generated information that places responsibility on the user to perform their own diligent fact-checking, something children are ill-equipped to do.
Note that Character.ai expressly prohibits use by minors under 13 years of age (under 16 years of age for EU citizens) and has since removed offending chatbots from the platform. The Replika app has an age restriction of 18+ years. However, both apps currently show only a Parental Guidance rating in the Android Play Store, with over 10 million downloads each.
Follow @paul_monckton on Instagram.