
A Guide to Parental Controls on Social Media

(CNN Business) — A little over a year ago, social media companies were called out for how they protect, or fail to protect, their youngest users.

In a series of congressional hearings, executives from Facebook, TikTok, Snapchat and Instagram faced tough questions from lawmakers over how their platforms can steer younger users to harmful content, damage mental health and body image (particularly among teenage girls), and lack sufficient parental controls and safeguards to protect teens.

Those hearings, which followed whistleblower Frances Haugen’s revelations about Instagram’s impact on teens in what’s known as the “Facebook Papers,” prompted companies to commit to change. Since then, all four social networks have introduced more parental control tools and options to better protect younger users. Some have also introduced changes to their algorithms, such as defaulting to less sensitive content for teens and increasing their moderation efforts. But some lawmakers, social media experts and psychologists say new solutions remain limited and more needs to be done.

“More than a year after the Facebook Papers dramatically exposed abuses by Big Tech, social media companies have taken only small, slow steps to clean up their act,” Senator Richard Blumenthal, who chairs the Senate consumer protection subcommittee, told CNN Business. “Trust in Big Tech is long gone and we need real rules to keep kids safe online.”

Michela Menting, digital security director at market research firm ABI Research, agreed that social media platforms “offer very little of substance to counteract the ills their platforms incur.” Their solutions, she said, put the onus on guardians to activate various parental controls, such as those aimed at filtering, blocking and restricting access, and more passive options, such as monitoring and surveillance tools that run in the background.


Alexandra Hamlet, a New York-based clinical psychologist, recalls being invited to a panel discussion about 18 months ago on ways to improve Instagram, particularly for younger users. “I don’t see many of our ideas being put into practice,” she said. Social media platforms, she added, must work to “continue to improve parental controls, protect young people from targeted advertising and remove objectively harmful content.”

The social media companies mentioned in this article declined to comment or did not respond to a request for comment on criticism that more needs to be done to protect young users.

For now, guardians need to learn how to use parental controls, while being aware that teens can often circumvent those tools. Here’s what parents can do to keep their kids safe online.

Instagram

Following the fallout from the leaked documents, Meta-owned Instagram put on hold its much-criticized plan to launch a version of Instagram for kids under 13 and focused on making its core service safer for young users.

Since then, it has introduced an educational hub for parents with resources, tips and articles from user safety experts, and launched a tool that allows guardians to see how much time their teens spend on Instagram and to set time limits. Parents can also receive updates about the accounts their teens follow and the accounts that follow them, and can see and be notified if their teen changes their privacy and account settings. Parents can also see which accounts their teens have blocked. In addition, the company offers video tutorials on how to use the new supervision tools.

Another feature encourages users to take a break from the app after a predetermined amount of time, for example by suggesting they take a deep breath, write something down, check a to-do list or listen to a song. Instagram has also said it is taking a “tougher approach” to the content it recommends to teens and will actively nudge them toward other topics, such as architecture and travel destinations, if they have been dwelling on any one type of content for too long.

Facebook

Facebook’s Family Center offers supervision tools and resources, including articles and advice from leading experts. “Our vision for the Family Center is to enable parents and guardians to help their teens manage their experiences on Meta technologies, all from one place,” Liza Crenshaw, a spokesperson for Meta, told CNN Business.

The center also offers a guide to Meta’s VR parental supervision tools, produced by ConnectSafely, a nonprofit that aims to help kids stay safe online, to help parents talk to their kids about virtual reality. Guardians can see which accounts their teen has blocked and access supervision tools, as well as approve their teen’s download or purchase of an app that is blocked by default based on its rating, or block apps that may be inappropriate for their teen.

Snapchat

In August, Snapchat introduced a parent guide and hub for guardians to learn how their teens use the app, including whom they have talked to within the last week (without divulging the content of those conversations). To use the feature, parents must create their own Snapchat account, and teens have to opt in and grant permission.

Although this is Snapchat’s first formal foray into parental controls, it already had some safety measures in place for young users, such as requiring teens to be mutual friends before they can start communicating with each other, and banning public profiles. Teen users have Snap Map’s location-sharing tool turned off by default, but they can use it to share their real-time location with a friend or family member, even while the app is closed, as a safety measure. Meanwhile, the Friend Check Up tool encourages Snapchat users to review their friend lists and make sure they still want to be in touch with certain people.

Snap previously said it is working on more features, such as the ability for parents to see which new friends their teens have added and to confidentially report accounts that may be interacting with their kids. It is also working on a tool that would give younger users the option to notify their parents when they report an account or a piece of content.

The company told CNN Business that it will continue to build out its safety features and will take feedback from the community, policymakers, safety and mental health advocates, and other experts into account to improve the tools over time.

TikTok

In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos detected as potentially containing mature or complex themes. It also rolled out a tool to help users decide how much time they want to spend on TikTok. The tool lets users set regular screen-time breaks and offers a dashboard detailing how many times they opened the app, a breakdown of daytime and nighttime usage, and more.

The popular short-form video app currently offers a Family Pairing hub, which allows parents and teens to customize their safety settings. A parent can link their TikTok account to their teen’s app and set parental controls, including how much time their teen can spend on the app each day; restrict exposure to certain content; decide whether teens can search for videos, hashtags or Live content; and choose whether their teen’s account is private or public. TikTok also offers its Guardian’s Guide, which highlights how parents can best protect their kids on the platform.

In addition to parental controls, the app restricts access to some features for younger users, such as Live and direct messaging. A pop-up also appears when teens under 16 are ready to publish their first video, asking them to choose who can watch it. Push notifications are curbed after 9 p.m. for users ages 13 to 15, and after 10 p.m. for users ages 16 and 17.

The company said it will make more efforts to publicize its parental control features in the coming days and months.

Discord

Discord did not appear before the Senate last year, but the popular messaging platform has faced criticism due to the difficulty of reporting problematic content and the possibility of strangers contacting young users.

In response, the company recently revamped its Safety Center, where parents can find guidance on how to activate safety settings, answers to frequently asked questions about how Discord works, and tips on how to talk to teens about online safety. Some existing parental control tools include an option to prohibit a minor from receiving a friend request or a direct message from someone they don’t know.

Even so, it is possible for minors to connect with strangers on public servers or in private chats if a stranger is invited by someone else in the room, or if a channel link is dropped into a public group that the user has joined. By default, all users, including those ages 13 to 17, can receive friend invitations from anyone in the same server, which then opens the door for them to send private messages.


