Publisher's note: This story was sent to me by Ben Shapiro of the Daily Wire, and since it concerns one of the most topical news events, it is being published on BCN. The author of this post is Amanda Prestigiacomo.
A "fake employee" of the popular kids' app Gacha Life reportedly instructed a 10-year-old girl to send a nude photo of her "bare chest" to verify her sex and age, threatening that she would be "permanently banned" from the app if she did not comply.
According to screenshots provided to the U.K.'s The Sun, 10-year-old Cahla McGarry was sent private messages making the request.
"Welcome to this amino! My name is Mandy and i [sic] work with amino!" the first message from the app says. "So if you didn't know. This is a safe space for young girls. We require users to be 15 and younger. If you fit these requirements you can be here."
"We also have to make sure that all members here are girls. To verify this I will need from you a photo of your bare chest (with a bra on if you feel uncomfortable) and your age. This is just an extra security feature but all members must do this," the message continued.

"Users that refuse to do this will be permanently banned," the "fake employee" threatened.
The 10-year-old alerted her mother, 41-year-old Nicola McGarry, to the messages.
"I was scared. It felt as if they had intruded into our home," said Nicola. "Although it's probably someone thousands of miles away, it really felt like an intrusion into our privacy."
"The schoolgirl loves the game so much, she wanted to chat to other players and share her character creations so she joined a Gacha Life forum on the Amino app," noted The Sun. "Amino is an app dedicated to communities, chat, forums, and fan groups so people with shared interests can connect."
Amino Apps condemned the incident, saying in a statement that it has "zero-tolerance for any type of inappropriate contact with minors."

"Keeping Amino safe is our top priority. We have zero-tolerance for any type of inappropriate contact with minors," a spokesperson for Amino Apps said. "Our team works 24/7 across seven supported languages to remove content and users in violation of our policies."
"We automatically remove nude imagery in order to prevent inappropriate content from being sent or received here, and we deploy the most advanced technologies available to assist with the enforcement of these rules," the statement continued. "While we are unable to provide any information about this particular case, we report all incidents of child sexual exploitation to the National Center for Missing and Exploited Children (NCMEC), and we fully cooperate with law enforcement agencies if we receive requests for information."

"We are absolutely dedicated to maintaining the trust and safety of the Amino platform," the spokesperson added.
The technological risks facing children were also highlighted by the media in February, after a mother found a disturbing video on YouTube Kids teaching children how to kill themselves.
"The suicide instructions are sandwiched between clips from the popular Nintendo game Splatoon and delivered by a man speaking in front of what appears to be a green screen - an apparent effort to have him blend in with the rest of the animated video," reported CBS News. "'Remember kids, sideways for attention, longways for results,' the man says, miming cutting motions on his forearm. 'End it.'"