TikTok diversifies its ‘For You’ feed by allowing users to select which topics they wish to avoid


TikTok announced this morning that it is changing how its “For You” feed works. The “For You” feed is the short-form video app’s main feed, powered by algorithmic recommendations that draw on users’ engagement patterns within the app, a system the company has previously described in detail. However, TikTok acknowledges that too much of one type of content can be “problematic.” The company says it is working on new technology to interrupt “repetitive patterns” on its app, as well as a tool that will give users a say in the matter by letting them choose which topics they want to avoid seeing on the platform.


According to the company, “too much of anything, whether it’s animals, fitness tips, or personal well-being journeys, gets in the way of the diverse discovery experience we’re aiming for.” While users may grumble about seeing too many cute puppy videos on TikTok, the company’s move to diversify its algorithm is better understood as a response to regulators cracking down on the tech industry and raising concerns about the harmful effects of unchecked recommendation algorithms, particularly on adolescent mental health.


Executives from Facebook, Instagram, and other social media platforms have been hauled before Congress and questioned about how their apps direct users to potentially harmful content, such as pro-anorexia and eating disorder material.


The company’s announcement specifically calls out certain types of videos that should not be viewed in excess, including those about “extreme dieting or fitness,” “sadness,” and “breakups.” A user may genuinely be interested in such videos, but the algorithm is not yet intelligent enough to recognize that repeatedly steering them toward more of the same can be harmful. This problem is not unique to TikTok, of course. It is becoming increasingly clear that systems designed solely to maximize user engagement through automated means will do so at the expense of users’ mental health. Beyond the impact on children and adolescents, some studies have suggested that unchecked recommendation algorithms may also contribute to radicalizing users who are drawn to extreme viewpoints.


TikTok also plans to examine how users engage with potentially harmful types of videos, and to test new approaches that avoid presenting a user with a long series of similar videos after they have watched and interacted with one. It did not, however, provide a comprehensive list of the video types it would limit in this way, only examples.


The company also announced that it is developing technology to detect when a user’s “For You” page lacks diversity. While the user may not be watching videos that violate the company’s policies, TikTok said, viewing “extremely limited types of content…could have a negative effect if it constitutes the majority of what someone watches, such as content about loneliness or weight loss.”
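TikTok has not described how this detection works, but the general idea of flagging a feed dominated by one topic can be sketched in a few lines. The function name, data shape, and 60% threshold below are all hypothetical, purely for illustration:

```python
from collections import Counter

def feed_is_too_narrow(watched_topics, threshold=0.6):
    """Flag a watch history in which a single topic dominates.

    watched_topics: list of topic labels, one per video watched (hypothetical shape).
    threshold: maximum share any one topic may hold (hypothetical value).
    """
    if not watched_topics:
        return False
    # Find the most common topic and its share of the history.
    _, top_count = Counter(watched_topics).most_common(1)[0]
    return top_count / len(watched_topics) > threshold

# Example: 7 of the last 10 videos were about weight loss.
history = ["weight-loss"] * 7 + ["cooking", "travel", "music"]
print(feed_is_too_narrow(history))  # True: 70% exceeds the 60% threshold
```

A production system would presumably operate on classifier-assigned topic labels and a rolling window of recent views, but the core check, comparing the dominant topic’s share against a threshold, is the same.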


Another strategy TikTok plans to pursue is a new feature that lets users direct the algorithm themselves. Users would enable the feature to exclude specific words or hashtags from their “For You” feed. It would complement TikTok’s existing tools, such as the “Not Interested” button, which lets you flag videos you don’t like.


To be clear, today’s announcement is a roadmap of future changes and features, not an actual launch. More than anything, it is an attempt to head off further regulatory scrutiny of the company’s app and the potential harm it may cause. The types of questions TikTok fielded during its own Congressional hearing, and those its competitors faced, most likely shaped this strategy.


TikTok cautions that the actual implementation may take time and iteration to get right.


Specifically, the company stated that it would “continue to examine ways to ensure that our system generates a diverse set of recommendations.”

