A little over a year ago, social media companies were put on notice about how well they protect, or fail to protect, their youngest users.
In a series of congressional hearings, executives from Facebook, TikTok, Snapchat, and Instagram faced tough questions from lawmakers about how their platforms can direct young users to harmful content, damaging their mental health and body image, particularly among teenage girls, and about the lack of sufficient parental controls and safeguards to protect teens.
The hearings, which followed whistleblower Frances Haugen’s disclosure of what became known as the “Facebook Papers” about Instagram’s impact on teens, pressured the companies to change. Since then, the four social networks have introduced more tools and parental control options aimed at better protecting young users. Some have also made changes to their algorithms, such as defaulting teens into viewing less sensitive content, and stepped up their moderation efforts. But some lawmakers, social media experts and psychologists say the new solutions are still limited, and that more needs to be done.
Senator Richard Blumenthal, chairman of the Senate Consumer Protection Subcommittee, told CNN Business: “Trust in Big Tech is long gone and we need real rules to keep kids safe online.”
Michela Menting, director of digital security at market research firm ABI Research, agreed that social media platforms offer little of substance to address the problems their services create. She said their solutions put the onus on parents and guardians by enabling a range of parental controls, such as those aimed at filtering, blocking and restricting access, along with more passive options, such as monitoring tools that run in the background.
New York City-based clinical psychologist Alexandra Hamlet recalls being invited to a roundtable discussion roughly 18 months ago on ways to improve Instagram, particularly for younger users. “I don’t think many of our ideas have been implemented,” she said. She added that social media platforms must keep working to improve parental controls, protect young people from targeted advertising, and remove objectively harmful content.
The social media companies featured in this article either declined to comment or did not respond to requests for comment on criticism that more needs to be done to protect young users.
For now, the burden falls on parents to learn how to use these tools themselves. Here’s a closer look at what they can do to help keep their kids safe online.
In the wake of the leaked documents, Meta-owned Instagram paused its much-criticized plan to release a version of Instagram for kids under 13 and focused instead on making its main service safer for young users.
Since then, the company has introduced an educational hub for parents with resources, tips and articles from user-safety experts, along with a tool that lets parents see how much time their kids spend on Instagram and set time limits. Parents can also receive updates on which accounts their teens follow and which accounts follow them, and be notified if their child updates their privacy and account settings. Parents can see which accounts their teens have blocked as well. The company also offers video tutorials on how to use the new supervision tools.
Another feature encourages users to take a break from the app, such as by suggesting they take a deep breath, write something down, check their to-do list or listen to a song after a predetermined amount of time. Instagram also said it is taking a stricter approach to the content it recommends to teens, and will actively nudge them toward different topics, such as architecture or travel destinations, if they have been dwelling on any one type of content for too long.
Facebook’s Safety Center offers supervision tools and resources, including articles and advice from leading experts. “Our vision for the Family Center is to ultimately allow parents and guardians to manage their teen’s entire experience across Meta technologies, all from one place,” Meta spokeswoman Liza Crenshaw told CNN Business.
The hub also offers a guide to Meta’s VR parental supervision tools from ConnectSafely, a nonprofit dedicated to helping kids stay safe online, to help parents discuss virtual reality with their teens. Parents can see which accounts their teens have blocked and access supervision tools; approve their teen’s download or purchase of an app that is blocked by default based on its rating; and block specific apps that may be inappropriate for their teen.
In August, Snapchat introduced a parents’ guide and hub aimed at giving parents more insight into how their teens use the app, including who they have talked to within the last week (without divulging the content of those conversations). To use the feature, parents must create their own Snapchat account, and teens have to opt in and give permission.
While this was Snapchat’s first formal foray into parental controls, the platform already had some safety measures in place for younger users, such as requiring teens to be mutual friends before they can start communicating with each other, and prohibiting them from having public profiles. Teens can also choose to share their real-time location with friends or family members, while the Friend Check Up tool encourages Snapchat users to review their friend lists and make sure they still want to be in touch with particular people.
Snap previously said it was working on more features, such as the ability for parents to see which new friends their teens have added and to confidentially report accounts that may be interacting with their child. It is also developing a tool that would give younger users the option of notifying their parents when they report an account or piece of content.
The company told CNN Business it will continue to build out its safety features and consider feedback from the community, policymakers, safety and mental health advocates, and other experts to improve its tools over time.
In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos detected as potentially containing mature or complex themes. The company also rolled out a tool to help users decide how much time they want to spend on TikTok. The tool lets users set regular screen-time breaks and provides a dashboard that details how many times they opened the app, a breakdown of daytime and nighttime usage, and more.
The popular short-form video app also offers a Family Pairing hub, which allows parents and teens to customize their safety settings. A parent can link their TikTok account to their teen’s app and set parental controls, including how long the teen can spend on the app each day; restrict exposure to certain content; decide whether teens can search for videos, hashtags or live content; and choose whether their teen’s account is private or public. TikTok also offers a Guardian’s Guide that highlights how parents can best protect their kids on the platform.
In addition to parental controls, the app restricts access to some features for younger users by default, such as livestreaming and direct messaging. A pop-up also appears when teens under 16 are ready to publish their first video, asking them to choose who can watch it. Push notifications are curbed after 9 p.m. for account holders ages 13 to 15, and after 10 p.m. for those ages 16 and 17.
The company plans to do more in the coming days and months to raise awareness of its parental control features.
Discord did not appear before the Senate last year, but the popular messaging platform has faced criticism over the difficulty of reporting problematic content and the ability of strangers to get in touch with young users.
In response, the company recently refreshed its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs about how Discord works, and tips on how to talk to teens about online safety. Some existing parental control tools include an option to prohibit a minor from receiving a friend request or a direct message from someone they don’t know.
Still, minors can connect with strangers on public servers or in private chats if the stranger was invited into the room by someone else, or if a channel link is dropped into a public group the user is in. By default, all users, including those ages 13 to 17, can receive friend invitations from anyone in the same server, which then opens up the ability to send private messages.
™ & © 2022 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
https://www.mercurynews.com/2022/11/13/a-guide-to-parental-controls-on-social-media/ A Guide to Parental Controls on Social Media