Smartphone apps are more powerful than ever. From setting up dates to playing games with people around the world, users are more connected than ever before.
But when it comes to kids, how far is too far for these apps?
According to a report published by research firm Influence Central, the average kid receives their first smartphone at 10 years old. The report also found the average child gets their first social media account at 11 years old.
Two new platforms are testing those limits.
One of those apps is Roblox. It’s a gaming app built on user-created content, so players can sign up and immerse themselves in countless evolving worlds.
The concern comes from the chat feature. As NBC10’s Keith Jones discovered, it only took a couple of minutes to make a username, set a password and enter a random birthday. Just like that, he had the ability to message the 50-million-plus people who use Roblox every month.
The issue becomes more complex with the rise of Yellow. Designed to help users find and make new friends on Snapchat, the app has been criticized as being “Tinder for kids.”
Like the popular dating app, kids swipe right or left to decide if they want to “match” with someone and start a new friendship. Yellow is rated 17+ in the app store, but accepts any age above 13.
"It leaves them out to predators. Someone could say that they're their age, I'm 11, I'm 12, they could be 35-years-old," Anthony Carter Sr. said.
Carter works in cybersecurity, and his 11-year-old son Anthony is debating whether to use the app. The younger Anthony says it’s scary to know that everyone’s ages and locations are out in the open.
"You could like talk to them and it also shows their age and where they live at, their area,” Anthony said.
NBC10 talked to Marine intelligence officer Kevin Hyde, who used to work for the NSA. He says that when it comes to kids using these apps, parents need to be involved.
"Your child should have the expectation that you're going to supervise their activity, that they're going to ask you,” Hyde said. “They shouldn't have an expectation of privacy.”
NBC10 reached out to Yellow with questions about the safety of their app. They stressed that safety is their “top priority” and are exploring ways to prevent people from lying about their age.
Their safety efforts also include using software to detect fake pictures and profiles, and they are using a team of human moderators to review reported content. Their full statement can be found at the bottom of the page.
Cell phone carriers are also playing a role in online safety. Every major carrier has a plan, at an extra cost, which helps parents monitor or limit their child’s activity.
How can parents step in? Experts say open communication is vital, but turning off the device’s real-time geolocation is one way to preserve privacy. For parents still feeling uneasy, experts also suggest simply deleting the app.
Yellow's Full Statement
I'm Marc-Antoine Durand, COO at Yellow, in charge of safety.
As Yellow is a social media platform for making friends online, with more than 10 million users, safety is a topic that really matters to us.
We take our users' safety very seriously and are constantly investing in R&D to enhance our detection of wrongdoing. So this is definitely a topic that matters to us at Yellow!
Yellow is a member of the ICT Coalition for Children Online which helps us shape our approach to these important issues.
Here are some points we would like to explain, following your questions about fake identities. We are definitely willing to answer any other questions you may have about this topic!
Users are required to register for the service, and their mobile number is recorded and verified as part of the registration process. When users register for the service, they receive a clear statement of the Community Rules that inappropriate pictures or videos will not be tolerated, and users are encouraged to report any suspicious or abusive behavior or concerns.
Regular Alerts to users about ‘faking who they are'
Users receive an alert at regular intervals reminding them that if they share inappropriate content or misrepresent who they are, such as being younger or older than they claim, their account will be removed from Yellow. Yellow has also created software technology to detect fake pictures.
Age of Users
Like all other social media services, Yellow relies on users giving their real age; this is critical for the tools and processes that social media companies like Yellow put in place to be effective. Yellow follows the industry standards already set in this area.
Minors cannot see adults on Yellow, and adults cannot see minors. There are two separate apps.
Verifying users who change their profile to over or under 18 years
Any user wishing to change his or her age after registering with Yellow has to send official proof of ID to the company for verification: if they are under 18 and want to change their age to 18 or over, and similarly for anyone registered as 18 or over wanting to change it to under 18. This helps keep fake profiles, and users pretending to be someone they are not, off Yellow.
Reporting and Blocking
Yellow, like all social media sites such as Facebook and Twitter, relies on users reporting concerns or difficulties they are experiencing.
A simple and effective abuse-reporting feature is embedded in every profile so users can report abuse or concerns directly to Yellow, and these reports are responded to. Users can also ‘unfriend’ other users who are bothering them in any way, as on other social media services.
Responding to parents' concerns
If a parent contacts Yellow with concerns through the email address in the settings feature of the app (firstname.lastname@example.org) or the website www.yellw.co, they will receive a dedicated contact form so they can describe any concerns they have, and these reports are prioritized by the Yellow team.
We answer every parent in less than 24 hours.
Yellow uses technology to detect fake images. A special form has been created to allow users to inform Yellow that someone has impersonated someone else on the app. Users can also report fake profiles while using the app.
All reported content is analyzed by the Yellow moderation team and by its software.
Moderation - technical
Yellow uses software to automatically block inappropriate content. Yellow has also created a database of fake pictures and built software to detect these pictures in profiles and block them.
Moderation - Human
Yellow has a team of human moderators who, 24/7, review reported content as well as user profiles for suspicious behavior, particularly profiles that may be fake, where users are not who they say they are. Users who create fake profiles or share inappropriate content are blocked and removed from Yellow.
Yellow Safety Centre - available in the settings section of the App.
Yellow, like other social media services and in line with good practice in this area, is developing a safety centre with clear and simple information for users, parents and educators. Also in development is a law enforcement guide to help with data requests when investigating crime.
Please find attached a number of screenshots illustrating the App and how to access the safety tools and safety centre, which is done directly through the settings feature on the app, similar to other app providers such as Instagram.