At one point in time, I made the ridiculous statement to a large gathering of my peers that I was an “algorithm engineer.”
This was a long time ago, when I was younger and more naïve than I am now. The statement itself is absurd. An algorithm engineer? Isn’t that what all engineers do in one form or another? Why would my job be special?
In reality, my job wasn’t special. This was one of those silly statements that a young engineer makes when trying to justify their importance to the senior staff. Unfortunately, I lacked one very important realization. To a group of experienced engineers, many of whom joined the company before I was even born, I really wasn’t very important.
Fast forward 20 years or so, and the term “algorithm engineer” starts to take on a genuine meaning. In addition to web pages, online commerce and social content, the internet brings us an almost countless number of algorithms. From Google Search to Facebook’s Likes to Amazon’s “People Who Viewed This,” algorithms guide our everyday online experience.
And by extrapolation, a legion of anonymous “algorithm engineers” exert tremendous influence over what we see and, more importantly, over what we perceive to be important.
The recent news regarding Facebook’s trending items highlights the impact of news automation. Former Facebook employees revealed that Facebook routinely suppressed news stories on conservative topics. While denying “systematic political bias,” Facebook acknowledged the substance of the charges. To its credit, the organization is retraining its staff to be sensitive to conservative issues.
Many want to characterize this as a liberal bias issue. While this might be a case of liberal bias, I don’t believe that’s the real problem. Everyone knows the mainstream news has a liberal bias. It always has and it always will. Those who care learn to adapt.
The Facebook issue illustrates a more insidious problem. Social media users are led to believe that the various “trending topics” lists are targeted based on our viewing history. Social media touts this as one of its greatest advantages: a simple, deterministic algorithm will steer us toward content that we should like.
If implemented correctly, the algorithms hidden in the fabric of social media enable exposure to a broad spectrum of media that users would not find on their own.
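To make the idea concrete, here is a minimal sketch of what such a deterministic, editorially neutral ranking might look like. This is purely illustrative and assumed for the sake of argument; it is not Facebook’s actual algorithm, and the function name, event format, and sample data are all hypothetical.

```python
from collections import Counter

def trending_topics(events, top_n=5):
    """Rank topics purely by interaction counts -- no editorial input.

    `events` is a hypothetical list of (user_id, topic) interaction
    records. The same input always produces the same ranking, which is
    what makes the algorithm deterministic and impartial.
    """
    counts = Counter(topic for _user, topic in events)
    return [topic for topic, _count in counts.most_common(top_n)]

# Hypothetical sample interactions
events = [
    ("u1", "election"), ("u2", "election"), ("u3", "sports"),
    ("u1", "science"), ("u4", "election"), ("u2", "sports"),
]
print(trending_topics(events, top_n=2))  # -> ['election', 'sports']
```

The point of the sketch is that every ranking decision is traceable to the data itself; the moment a human editor reorders or removes entries from the output, the system stops being the impartial algorithm users were promised.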
Unfortunately, Facebook substituted editorial judgment for its algorithm.
Should we be surprised? Yes. Should we be angry? Yes. The social media experience, at least in part, is about connecting people with similar interests. Instead of receiving data from an impartial algorithm designed to highlight interesting items, Facebook provided content based on the opinion and likes of its editorial staff.
We trade a huge amount of personal information in exchange for access to these “free” services. If companies like Facebook want continued access to our data, they need to hold up their end of the bargain — ensure that content selection comes from an apolitical, impartial, and unbiased algorithm.