Yesterday I wrote about the human labour that goes into training today’s machine learning algorithms. It’s often boring and disturbing, but whatever the nature of the work it underlines the fact that even the most efficient and ostensibly powerful technologies are built on human labour.
Something I was thinking about while writing yesterday was the story that Facebook is turning to journalists to curate its new ‘news feed’ tab (which I wrote about earlier this month).
This is, I think, really just the other side of the machine learning coin. This story, just like the stories we saw yesterday about the human data labelling workforce, is evidence of what algorithms lack - context.
This might seem obvious, but it isn’t uncommon to hear people talking about algorithms as some kind of magical, objective tool, free from the messy realities of human life in which error, bias, and context cause problems.
That’s all nonsense. Algorithms can’t escape this messy reality - they were built from it and they shape it too.
From the data on which they’re trained (the stuff people have to label and sort), to the output at the other end - a set of popular content, perhaps - algorithms are fundamentally useless without human support.
They need human intervention to provide the contextual understanding that even the most sophisticated artificial intelligence system does not possess.
True, that intervention can sometimes be minimal. But with artificial intelligence finding its way into a huge range of different domains - many of which are deeply human and highly contextual - humans will be required if these systems are to be worth using at all.
Retreading old ground
It’s worth pointing out that Facebook’s plan to hire journalists and editors isn’t new. It seems that however much it tries to rely on algorithmic power, the company can’t help but return to humans.
It did it before back in 2016 with its ‘trending topics’ tab, which was closed down in June last year. A number of different reasons have been cited for its closure, including a suggestion that editors were suppressing conservative content, but it seems the real reason was that users just didn’t care for it (apparently it accounted for only 1.5% of clicks to news publishers).
With Facebook also making algorithmic and branding changes to focus on personal relationships rather than content from publishers, it was really just another step in that narrative arc.
I don’t know whether the move will work, and I’m cynical about Facebook’s ability to do news properly.
However, I do think accusations of bias are unfair. On the one hand, they ignore the fact that the algorithm will necessarily be biased one way or another; on the other hand, bringing in the transparency of human decision-making can only be a good thing.
The debate about whether Facebook is merely a platform has gone on long enough - taking editorial responsibilities seriously is a step towards acknowledging its reality as a content publisher.