Instagram has defended its new features targeting the sextortion of teenagers on the platform, despite criticism that it does not do enough to protect users.
Meta said on Thursday that the new features, which include blocking the screenshotting or screen recording of disappearing images and videos, are part of its continued efforts to combat sextortion, in which criminals coerce teenagers into sharing intimate images with them.
The NSPCC welcomed the measures as positive steps in the right direction. However, Arturo Béjar, a former Meta employee turned whistleblower, said there were more direct ways Instagram could shield young people from such experiences.
“The greatest thing they could do is just enable a teen to mark when they feel the account that is asking them to follow them is not a teen,” Mr Béjar said. “The structure of the product means that by the time they need to report for sextortion the harm has been done.”
Meta said its reporting tools, developed with input from users, give teenagers clear and direct ways to respond to any form of sexual misconduct or harassment.
It said it also offers dedicated options for reporting unwanted nudity, and that such reports are treated as high priority. While there is no option to report an account specifically for posing as a teenager, users can report it for fraud or scams.
Richard Collard, the NSPCC’s associate head of child safety online policy, questioned why Meta was not deploying similar safeguards across all its platforms, including WhatsApp, where grooming and sextortion also occur in large numbers.
Ofcom, the UK’s communications regulator, has said social media companies will face penalties if they fail to protect children.
What is sextortion?
Sextortion, in which victims are coerced into sharing sexually explicit images and then threatened with their publication, is now a common form of intimate image abuse.
Police forces worldwide have reported a rise in sextortion scams on popular social media platforms, most often targeting teenage boys.
The UK’s Internet Watch Foundation revealed in March that of all the sextortion reports it received in 2023, 91% involved boys.
Some victims of sextortion have taken their own lives after being threatened that their images would be posted online unless they paid their blackmailers.
Parents of teenagers who died after being targeted on Facebook and other social media platforms have urged the companies to do more to prevent such crimes.
The mother of 16-year-old Murray Dowey, who died by suicide in 2023 after being targeted by a sextortion gang on Instagram, said Meta was not doing “nearly enough to safeguard and protect our children when they use their platforms”.
‘Built-in protections’
Meta said the new safety features and awareness campaign build on the existing protections that teenagers and parents already use on the app.
In a statement, Meta’s head of global safety, Antigone Davis, said the new Instagram campaign aims to educate teenagers and parents about sextortion scams, in case scammers evade the company’s detection tools.
She added that sextortion is an adversarial crime, and that whatever measures are put in place, scammers will look for ways to circumvent them.
Instagram will also hide follower and following lists from accounts suspected of sextortion, and notify teenagers when they are messaging someone based in another country.
In May, sextortion specialist Paul Raffile said scammers find teenagers’ accounts by searching for high schools and youth sports teams on the platforms, then trawling the follower and following lists.
Instagram will also block screenshots of images and videos sent in direct messages with its “view once” or “allow replay” options, which a user can select when sharing an image or video with another person.
Such media will not open at all on Instagram’s web version.
However, Mr Béjar said the feature could give users a false sense of security, because attackers could photograph the image on the screen with another device.
Meta said that, unlike some other social media platforms, the feature does not merely notify users when someone takes a screengrab of their photos or videos but blocks the screenshot itself.
Mr Béjar – who has called on the platform to create a button that lets teens straightforwardly report inappropriate behaviour or contact – also said nude images sent to younger teens should be blocked, not just blurred.
He added that younger users should have clearer, stronger warnings about sending such images than those currently offered.
Meta says its nudity protections were designed in liaison with child protection experts to educate people about the risks of seeing and sharing such images in a way that does not shame or scare teens by disrupting conversations.
The company is currently moving under-18s into Teen Account experiences on Instagram with stricter settings turned on by default – with parental supervision required for younger teens to turn them off.
But some parents and experts have said safety controls for teen accounts shift the responsibility for spotting and reporting potential threats onto parents.
Dame Melanie Dawes, the chief executive of the regulator Ofcom, said it was the responsibility of the firms – not parents or children – to keep people safe online, ahead of the implementation of the Online Safety Act next year.