Video Extras

Microsoft Has Sacked Its Journalists And Replaced Them With AI

A sign of things to come?

Microsoft has sacked around 27 people employed by PA Media, who were told they were getting the boot in a month’s time as they’re being replaced by artificial intelligence to curate the news on its homepages. With the news feeds automated, the AI will now select, edit and curate news articles. So… can we expect an uptick in computer stories and good deals on RAM?

One ex-employee isn’t sure the AI is up to the human task of determining and sticking to “very strict editorial guidelines”. After all, can a computer make the very human call of whether it’s looking at something subtly violent or morally inappropriate? Newsrooms constantly wade through grey areas that aren’t obvious, and judgments often have to be made on a case-by-case basis. There is a very real and important difference between considered reporting and outright showing terrible events.

Plus, I don’t know about you, but I’ve never once seen a laptop wear one of those reporter hats or break a scoop. All I ever get is 40-minute automatic updates right as I’m trying to start work for the day.

Microsoft’s move will no doubt shake and rattle other news sites across the globe. But what are the consequences? Human biases are often coded into computer programs. In 1988 the UK Commission for Racial Equality found a British medical school guilty of discrimination, after the program it was using to screen applications was found to be biased against women and applicants with non-European names.

And then there’s this hand sanitizer dispenser:


When awareness of who controls the news has never been more important, should we be handing the keys to a program that’s never had to live through or witness the complexities of the events it’s showing? Maybe let’s try it with a cute-animals curation list first and see how many times it gets it wrong by showing reptiles. And that, my friends, is what we call a human editorial bias.