Popular social media network TikTok is set to launch a new set of parental controls for users globally. Referred to as “Family Pairing”, the new features will allow parents to set controls on Restricted Mode, Screen Time Management and Direct Messages for their teenage children. Additionally, TikTok will disable direct messaging for users under 16 years of age in all markets.
The worldwide rollout of “Family Pairing” will take place in the coming months.
In February, a similar set of features was launched in the U.K., in keeping with European laws and regulations. Those features were known as “Family Safety Mode”.
To use the new controls, parents of a teenage user aged 13 or older will have to set up their own TikTok account and then link it with their child’s. This will allow the parent to control who the child can exchange direct messages with and how long the child can use the TikTok app. Parents can even turn on TikTok’s “restricted mode” for the child’s account to limit inappropriate content.
TikTok has not explained “restricted mode” in much detail, but its basic intention is to flag inappropriate videos children come across. However, parents should note that this feature is not the same as setting parental controls on Netflix or restricting what a child can download from the App Store. In short, some adult material or inappropriate content could still slip through.
As of now, TikTok users can set for themselves both Screen Time Management and Restricted Mode via the app’s Digital Wellbeing section. However, with Family Pairing, parents will be able to set these controls for their child.
TikTok has previously offered several Direct Messaging controls that allowed users to restrict the audience, limit messages to approved followers only, or disable direct messages completely. Additionally, TikTok blocks images and videos in messages to prevent other safety issues.
With Family Pairing, however, parents may be able to choose to what extent teens can message privately on the platform at all.
TikTok has decided to automatically disable Direct Messages for all registered accounts belonging to users under the age of 16 on April 30.
The new changes are likely to give parents far more control over their child’s use of TikTok than any other social media app offers (except those designed exclusively with children and families in mind). However, the parental controls are only a subset of the controls users can set for themselves. For instance, users can choose to turn off comments, control who can duet with them, make their accounts private, etc.
These options may help relieve some parents’ stress over how addictive the TikTok app has become. Teen usage is so heavy that TikTok launched its own in-app PSA encouraging users to “take a break” from their phones.
In addition, TikTok offers a host of other resources for parents such as educational safety videos and parental guides.
TikTok’s decision to launch screen time-limiting features and other restrictions amid the COVID pandemic may surprise many, especially when teens are stuck at home with little to do. However, with families at home together, it may be the best time to have a conversation about how much social media is too much.
According to TikTok’s director of Trust & Safety, Jeff Collins, more families are turning to TikTok to stay informed, entertained and connected during the social distancing phase. He adds that TikTok is providing families with joint tools to express their creativity and show support for their communities, helping families navigate the digital space together while ensuring a safe experience.
The changes come after increased scrutiny by government regulators of TikTok owner ByteDance, including a $5.7 million fine levied against Musical.ly by the FTC in 2019 for violating COPPA (the U.S. children’s privacy law).
In response, TikTok introduced its Content Advisory Council, published its first Transparency Report, released new Community Guidelines, hired a global General Counsel, expanded its Trust & Safety hubs in Ireland, Singapore and the U.S., and launched a Transparency Center open to experts who want to review its moderation practices.