The Online Safety Act May Not Be Too Little, But It’s Surely Too Late

Reading the headlines about the Online Safety Act may give you the impression that it has been significantly weakened.

Five years after the bill was first introduced, Culture Secretary Michelle Donelan has produced a new version – version three, as far as I can tell – and removed a key element: the so-called “legal but harmful” content provisions.

Campaigners and charities have accused Ms Donelan of watering down the bill, and on its face, the criticism seems fair.

Without rules on “legal but harmful” content, those who are abused online in the most egregious circumstances – for example, after falling victim to a terrorist attack – will have no protection.

Admittedly, the bill is weaker than before.

However, as Ms Donelan has tried to explain, that weakness has to be set against the strength of the original brew. Yes, the bill has been diluted – but given the potency of the original, that does not mean it is now worthless and watery.


To see just how powerful the Online Safety Act really is, consider the “legal but harmful” provision itself.


It is a new category created especially for the bill, covering things that are legal to say but deemed harmful to the person who encounters them online.

That sounded like a good idea – until civil servants tried to define exactly what harm meant. Does it mean hurt feelings? A physical reaction? Harm to one person, or to many? What about jokes? Or journalism?

Even after years of work, no one is quite sure, and the attempts to avoid unintended consequences have reassured few.

“A Recipe for Trouble”

If that sounds like a recipe for trouble, wait until you hear how “legal but harmful” will be enforced on the ground.

Not by the police, nor by civil servants. Lacking the technical and human resources needed to police social media at scale, the government decided to leave the onus for identifying legal but harmful content with the tech giants themselves.

Companies like Facebook and Twitter suddenly find themselves policing this nebulous concept, with hefty fines for failing to comply.

Many people think those companies would over-enforce, shutting down any conversation that even looked harmful – but no one really knows for sure. Even by the standards of new legislation, that is a dangerous level of uncertainty.


Removing the legal but harmful provisions reduces that risk. But there is a catch: the change removes them only for adults. The bill still requires platforms to protect children from harmful material.

That means all the initial difficulties remain. How will companies work out which users are children? Presumably they will need extensive age verification systems, perhaps using artificial intelligence to identify which users are children.

How would that work in practice? How will they distinguish 17-year-olds who need protection from 18-year-olds who don’t? And what is the penalty if they get it wrong?

How is “harm” defined?

Then there is the question of how to define harm. As it stands, MPs will draw up a list of what they deem harmful, and the platforms will have to interpret it. That won’t be easy.

Even the seemingly simple measures in the bill are fraught with difficulty. It was recently announced that the updated legislation will prohibit the encouragement of self-harm. But what exactly does that mean? Does it include algorithmic encouragement, or just the posting of certain types of content?

There are also genuine concerns among privacy campaigners that the bill could force companies to scan people’s messages on apps such as WhatsApp, undermining the end-to-end encryption that protects them.

There are many areas of the bill left untouched here. But if it passes – and given the strength of feeling in the House of Lords, that is far from certain – it will be a sweeping, complex law of enormous significance.

Activists and charities will not be pleased. Neither will advocates of free speech, who see a charter of censorship in the bill.

But the bill would extend the rule of law to many areas that are currently appallingly unregulated, especially those involving children.

More importantly, it will give this government and future governments the chance to learn what works and what doesn’t. That is not a popular way of looking at legislation, but learning by doing is crucial in this new field.

The government wasted five years when it could have collected data and learned how to regulate the online space. During that time, the lives of many children were irreparably damaged or even lost.

It is sometimes said that justice delayed is justice denied. The same goes for legislation. The bill may not be too little, but it is certainly too late.
