TikTok became the latest tech company to implement greater protections for minors on its platform in the wake of increased regulatory scrutiny. The company says it will introduce a series of product changes for teen users ages 13 to 17, with the goal of making their TikTok experience more private, safer and less addictive. TikTok’s news follows similar moves announced recently by other tech companies that cater to teens, including Google, YouTube and Instagram.

The changes TikTok plans to implement in the coming months will address in-app messaging, the public nature of users’ videos, default download settings for videos and the use of push notifications.

This expands on changes to privacy settings and defaults for users under the age of 18 that TikTok incorporated in January. At that time, TikTok introduced stricter rules for 13- to 15-year-olds and slightly more permissive settings for 16- to 17-year-old users focused on default account types, comments and use of TikTok’s interactive features, such as Stitch and Duet.

Now, TikTok says Direct Messages will be set to “No one” by default for new users ages 16 to 17, and existing users will be asked to review and confirm their settings the next time they use the messaging feature.

The company will not prevent teens from using Direct Messages, but they will have to make a more explicit decision to opt in.

The app will now also display a pop-up message when a teen under 16 posts their first video, asking them to choose who can see their content: their followers or only themselves. (The “Everyone” option is disabled.) Previously, TikTok’s default settings had limited the visibility of under-16 accounts to followers they approved. The new prompt pushes teens more directly to choose how public they want their content to be, and they must make that choice before the video is posted.

TikTok also said it will disable Duet and Stitch for users under 16, though this is not new; it was part of the privacy changes implemented in January.

Separately, teens ages 16 to 17 will now be asked to decide whether others can download their videos. TikTok will not prevent these teens from making their content downloadable, but a box will appear asking them to confirm their choice and reminding them that downloaded videos can be shared to other platforms. Downloads remain disabled for users ages 13 to 15.
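To keep the age bands straight, here is a minimal sketch in Python of how the defaults described so far map onto the two teen groups. The names and structures are hypothetical, purely for illustration, and are not TikTok’s actual configuration or API.

```python
# Hypothetical sketch only: these names are illustrative, not TikTok's
# real settings model. They summarize the teen defaults described above.

from dataclasses import dataclass

@dataclass
class TeenDefaults:
    direct_messages: str        # who may send DMs
    downloads_enabled: bool     # whether others may download videos
    duet_and_stitch: bool       # interactive features
    must_pick_audience: bool    # prompted to pick an audience on first post

DEFAULTS = {
    "13-15": TeenDefaults(
        direct_messages="unavailable",   # DMs are restricted to users 16 and up
        downloads_enabled=False,         # downloads remain disabled
        duet_and_stitch=False,           # disabled for under-16s
        must_pick_audience=True,         # "Followers" or "Only me"; no "Everyone"
    ),
    "16-17": TeenDefaults(
        direct_messages="no_one",        # default "No one"; the teen can opt in
        downloads_enabled=False,         # assumption: off until explicitly confirmed
        duet_and_stitch=True,            # available; defaults not detailed here
        must_pick_audience=False,
    ),
}

def defaults_for(age: int) -> TeenDefaults:
    """Return the illustrative defaults for a teen's age band."""
    if 13 <= age <= 15:
        return DEFAULTS["13-15"]
    if 16 <= age <= 17:
        return DEFAULTS["16-17"]
    raise ValueError("this sketch only covers ages 13-17")
```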

The final change is perhaps the most interesting because it’s something neither YouTube nor Instagram introduced: TikTok will limit push notifications.

Younger teens ages 13 to 15 will not receive push notifications after 9 PM, while 16- and 17-year-olds will not receive notifications after 10 PM. Notifications resume at 8 AM the following morning.
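As a rough illustration of that curfew logic, here is a short Python sketch. It is hypothetical and not TikTok’s implementation; the function name and the way ages map to cut-off times are assumptions made only to restate the rules above.

```python
from datetime import time

# Hypothetical sketch of the notification curfew described in the article:
# no pushes after 9 PM for ages 13-15, after 10 PM for ages 16-17,
# resuming at 8 AM the next morning. Not TikTok's actual code.

CURFEW_START = {range(13, 16): time(21, 0),   # 9 PM for 13-15
                range(16, 18): time(22, 0)}   # 10 PM for 16-17
CURFEW_END = time(8, 0)                       # notifications resume at 8 AM

def push_allowed(age: int, local_time: time) -> bool:
    """Return True if a push notification may be delivered right now."""
    start = next((t for ages, t in CURFEW_START.items() if age in ages), None)
    if start is None:          # other age groups are out of scope here
        return True
    # Quiet hours wrap past midnight: blocked from `start` until 8 AM.
    in_quiet_hours = local_time >= start or local_time < CURFEW_END
    return not in_quiet_hours

# Example: a 14-year-old at 9:30 PM local time gets no push,
# while a 17-year-old still can until 10 PM.
assert push_allowed(14, time(21, 30)) is False
assert push_allowed(17, time(21, 30)) is True
```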

This part of the update reflects TikTok’s global mindset and the Chinese roots of its parent company. China is currently in the midst of a sweeping regulatory offensive against its technology sector, one that encompasses antitrust rules, data security practices, tech business models and even social mores, such as the addictive nature of video games, which state media compared to a drug, calling them “opium.”

TikTok has also been labeled one of the most addictive social apps on the market, thanks to its advanced personalization technology, interactive design, simple interface and psychological tricks that activate the pleasure centers of users’ brains. The company already inserts “take a break” videos into its main feed because users were losing hours scrolling through the app. Its decision to limit notifications is another acknowledgment of the app’s ability to lead users, particularly younger ones, into negative digital media habits. By suppressing notifications during certain hours, TikTok can also point regulators to a concrete feature that shows it is doing something to address the problem.

The changes come amid a broader industry shift in how technology companies cater to younger users, as concerns about screen time, addiction, online abuse, data collection, privacy and more come to light.

In the U.S., Congress has pressured companies to do more to protect younger users from the most harmful and negative impacts of technology.

A key piece of legislation in the works is an update to the decades-old children’s privacy law, COPPA (the Children’s Online Privacy Protection Act). The new bill would expand COPPA to cover teens under 18 and would bar technology companies from serving them targeted advertising, among other things.

As a result, tech companies have been revamping their products to make the teen experience more private and to tighten protections around how advertisers can collect and use teen data.

TikTok was pushed to act on teen protections earlier than many of its peers by the U.S. Federal Trade Commission’s multi-million dollar fine for violations of children’s privacy laws, part of a government crackdown that later extended to YouTube. Beyond this year’s privacy changes and the take-a-break reminders, TikTok also led the market in bundling parental controls into its app with its Family Pairing feature. The company offers other resources for parents as well, including safety education videos and parenting guides, and it brought in outside experts to advise on policy creation with the introduction of the TikTok Content Advisory Board.

“Our priority is to ensure that teens on TikTok have a safe and age-appropriate experience while creating and sharing on our platform,” Tracy Elizabeth, TikTok’s Global Underage Safety Policy Lead, said in a statement about the changes. “This announcement builds on our industry-leading efforts to make all accounts under the age of 16 private by default, age-restricted features like direct messaging and empowering parents with Family Pairing,” she said.

TikTok said the features will roll out globally but did not share a specific timeline.
