New Study Highlights Failure of Instagram Policies to Restrict Drug-Related Content from Young Teens / Digital Information World

Drugs and teens don’t mix, yet social media gives young people a new route to drug-related content. Scrutiny from parents of young teens has put pressure on these platforms to respond. However, despite Instagram’s safety features, it seems teens can still easily access drug-related content on the platform.

The Tech Transparency Project has conducted independent research on Instagram revealing that accounts dedicated to drug-related content remain available on the platform. Their main motivation is to sell drugs, and some even offer MDMA, a party drug commonly known as ecstasy. The platform introduced safety features to limit access to drug-related content; however, according to the research, finding such content takes nothing more than a simple hashtag search. Instagram’s algorithm relies heavily on hashtags: search for a drug with the hashtag symbol in front of it, and the platform surfaces a number of accounts selling that drug. Swap in the name of another drug, and it yields similar results.

Even though the platform does not allow the sale of drugs, young teens can still get in touch with sellers and make purchases on Instagram. The platform tried to counter the issue by introducing a warning that appears before a user searches drug-related hashtags; users who interact with the warning are directed to a website specializing in addiction support. The researchers said these efforts were not enough, and criticized Instagram for not implementing any active measures to solve the problem. They argue that stricter enforcement could reduce the average time users spend on the platform, an outcome Instagram does not want.

An Instagram spokesperson responded with a statement saying the platform removes most drug-related activity and content before anyone reports it; according to the platform, 96% of content that violates its drug-related policies is removed automatically. The group’s research suggests that while the platform may have succeeded in cleaning drug content from its main feeds, young users can still reach illegal content. Instagram has also recently come under heavy criticism over leaked internal Facebook documents, which suggested the platform’s effects on the mental health of young people are damaging.

To carry out the research project, the team registered fake accounts on the platform, commonly called dummy accounts. These were registered as teen users so that the protection features could be tested. The project produced notable findings on how the platform actually restricts drug-related content. If a user searches for fentanyl, the platform produces no results; however, if the user puts the name of a city in front of the drug, the search yields many results, and some of those accounts may actually be selling the opioid. The app’s own recommendation features also seemed to work against its terms and conditions: when a dummy account followed an account that sold drugs, the platform would recommend similar accounts, and the suggested accounts often sold the same drug or even multiple varieties.

Instagram’s policies regarding drug content on its platform remain questionable. While spokespersons maintain that the platform automatically removes drug-related content, the research suggests otherwise.

Read next: Instagram mimics TikTok’s success formula for recommending videos