Algorithms, social media, and HRC’s defeat.

In her concession speech, Hillary Clinton indirectly thanked “secret, private” Facebook support groups like Pantsuit Nation (see, for example, Engadget).

This was a rather backhanded compliment: she added ‘I want everybody to come out from behind that and make sure your voices are heard’.

An article in Vox subsequently explained how Pantsuit Nation works: “the group is set to ‘secret,’ meaning members must be invited to join by another member. The idea is to make admission tougher for Clinton critics who might harass other members or start debates. Those who make it in are encouraged to focus on ‘positive, personal’ posts, and moderators won’t approve rule-breakers. They’re also quick to delete negative comments about either candidate. … The enthusiasm, despite these restrictions, is telling. It’s a highly curated oasis of positivity that stands in stark contrast to the mood of political season where half of Americans say the election is a ‘very or somewhat important source of stress’ in their lives.”

It’s an interesting example of confirmation bias, where people seek out arguments that support their own views and avoid those that contradict them.

Some argue that this is built in to the media systems themselves: for example, Time claimed last year that “Facebook says the average user has access to about 1,500 posts per day but only looks at 300. … To ensure that those 300 posts are more interesting than all the rest, Facebook says it uses thousands of factors to determine what shows up in any individual user’s feed.” These include:
– How close you are to a person
– The post type
– How much engagement the content has attracted
Much of this seems likely to feed confirmation bias: you hear from your friends and from people who are popular with your friends.
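To make the mechanism concrete, here is a minimal sketch of how such a ranker might combine those factors. The class, the weights, and the formula are all invented for illustration; Facebook’s real feed ranking reportedly uses thousands of signals, not three.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # 0..1: how close you are to the author
    type_weight: float       # e.g. photo vs. link vs. plain status update
    engagement: int          # likes, comments, and shares so far
    age_hours: float         # how old the post is

def score(post: Post) -> float:
    """Hypothetical EdgeRank-style score: affinity x type x engagement, decayed by age."""
    return (post.author_affinity
            * post.type_weight
            * (1 + post.engagement) ** 0.5
            / (1 + post.age_hours))

# Rank the ~1,500 candidate posts and keep the 300 that actually get shown.
candidates = [Post(0.9, 1.2, 40, 2.0), Post(0.2, 1.0, 500, 12.0), Post(0.7, 0.8, 3, 1.0)]
feed = sorted(candidates, key=score, reverse=True)[:300]
```

Because affinity and prior engagement dominate any score of this shape, you mostly see posts from close friends and content your friends have already endorsed, which is exactly the dynamic that reinforces confirmation bias.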

Time adds that “Engineers are also continually running multiple experiments with about 1% of Facebook users in an attempt to boost engagement.” Such an experiment in 2014 led to some controversy when the NY Times, for example, said “last week, Facebook revealed that it had manipulated the news feeds of over half a million randomly selected users to change the number of positive and negative posts they saw. … The company says users consent to this kind of manipulation when they agree to its terms of service.”

The NY Times adds: “The goal of all of this, Facebook says, is to give you more of what you want so that you spend more time using the service — thus seeing more of the ads that provide most of the company’s revenue.”

The published paper claims the experiment showed that “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. … In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion…”
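As a rough sketch of the design being described (not the authors’ code: the word list, the drop probability, and the 1% bucketing borrowed from the Time quote above are all invented, and the real study used LIWC sentiment dictionaries over millions of posts):

```python
import hashlib
import random

POSITIVE_WORDS = {"great", "happy", "love", "awesome", "wonderful"}

def in_treatment(user_id: str, fraction: float = 0.01) -> bool:
    """Deterministically place a small, random-looking slice of users in the experiment."""
    h = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return (h % 10_000) / 10_000 < fraction

def filter_feed(posts, user_id, drop_prob=0.5):
    """For treated users, withhold a share of positive posts before the feed is built."""
    if not in_treatment(user_id):
        return posts
    rng = random.Random(user_id)
    return [p for p in posts
            if not (set(p.lower().split()) & POSITIVE_WORDS and rng.random() < drop_prob)]

def positivity(posts):
    """Fraction of words that are 'positive' -- a crude stand-in for the LIWC counts the paper used."""
    words = [w for p in posts for w in p.lower().split()]
    return sum(w in POSITIVE_WORDS for w in words) / max(len(words), 1)

# The paper's finding, in these terms: positivity() of posts *written* by treated users
# drops relative to the control group -- contagion without any face-to-face contact.
```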

Pantsuit Nation supporters see safety and comfort rather than bias in the group’s ‘secret’ setting and its strict moderation (see the Vox description quoted above). But it still sounds like confirmation bias to me, plus a deliberate attempt (similar to the Facebook experiment) to ‘talk up’ morale. Alas for HRC, it didn’t result in enough votes for her, however happy it made her supporters in the run-up to the election: you could argue that in the medium term it was self-defeating.

A report in the Washington Post says that the Clinton campaign relied heavily on an algorithm named Ada (of course!) which “operated on a separate computer server than the rest of the Clinton operation as a security precaution, and only a few senior aides were able to access it.” The report continues: “According to aides, a raft of polling numbers, public and private, were fed into the algorithm, as well as ground-level voter data meticulously collected by the campaign. Once early voting began, those numbers were factored in, too. … What Ada did, based on all that data, aides said, was run 400,000 simulations a day of what the race against Trump might look like. A report that was spit out would give campaign manager Robby Mook and others a detailed picture of which battleground states were most likely to tip the race in one direction or another — and guide decisions about where to spend time and deploy resources.”
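The report does not describe Ada’s internals, but “run 400,000 simulations a day” strongly suggests a Monte Carlo approach. A minimal sketch of that idea, with invented states, margins, and uncertainties:

```python
import random

# Invented battleground numbers: electoral votes, polled Clinton margin, polling uncertainty.
BATTLEGROUNDS = {
    "FL": (29, 0.01, 0.04),
    "PA": (20, 0.03, 0.04),
    "MI": (16, 0.04, 0.04),
    "WI": (10, 0.05, 0.04),
}
SAFE_CLINTON_EV = 232   # invented 'safe' electoral-vote baselines
SAFE_TRUMP_EV = 231
TO_WIN = 270

def simulate_once() -> int:
    """One simulated election: draw each state's result from a normal around its polled margin."""
    ev = SAFE_CLINTON_EV
    for votes, margin, sd in BATTLEGROUNDS.values():
        if random.gauss(margin, sd) > 0:
            ev += votes
    return ev

def win_probability(n: int = 400_000) -> float:
    return sum(simulate_once() >= TO_WIN for _ in range(n)) / n

print(f"Simulated Clinton win probability: {win_probability():.1%}")
```

Tallying, per simulation, which state pushed the winner over 270 would give exactly the “which battleground states were most likely to tip the race” picture the aides describe, and hence where to deploy resources.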

A recent analysis of the Presidential Primaries points out the many statistical anomalies in the US election process. Given how badly the polls got it wrong, the maths are clearly not as simple as they might seem. As Kate Crawford put it in a tweet: “Mo’ data, mo’ problems”.
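One way to read that quip (my gloss, not Crawford’s): more polls only help when their errors are independent. If every poll shares the same systematic bias, say a sampling frame that misses certain voters, no amount of averaging removes it. A toy illustration with invented numbers:

```python
import random

def poll_average(true_margin, n_polls, sampling_sd, shared_bias):
    """Average n polls that all share the same systematic bias (e.g. a skewed sample frame)."""
    return sum(random.gauss(true_margin + shared_bias, sampling_sd)
               for _ in range(n_polls)) / n_polls

random.seed(1)
# Suppose the true margin is -1 point (Trump ahead) but every poll leans +3 points to Clinton.
for n in (5, 50, 500):
    print(n, "polls ->", round(poll_average(-0.01, n, 0.03, 0.03), 3))
# The random noise shrinks as n grows, but the shared +3-point bias never averages away.
```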

In an NY Times article, Prof Crawford presents examples of algorithms which seem to show that “Sexism, racism and other forms of discrimination are being built into the machine-learning algorithms”. These systems tend to favour whites or males because “This is fundamentally a data problem. Algorithms learn by being fed certain images, often chosen by engineers, and the system builds a model of the world based on those images. If a system is trained on photos of people who are overwhelmingly white, it will have a harder time recognizing nonwhite faces.” She makes the same points, with other examples, in a Nature article.
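Her “data problem” point can be demonstrated with a deliberately artificial example: synthetic feature vectors standing in for photos, a classifier trained almost entirely on one group, then tested on both. The numbers below are invented and show only the mechanism she describes, not any real system:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, center):
    """Synthetic 'image' feature vectors for one demographic group (stand-ins for real photos)."""
    X = rng.normal(center, 1.0, size=(n, 5))
    y = rng.integers(0, 2, size=n)   # the label the system is meant to predict
    X[y == 1] += 0.75                # the signal separating the two classes
    return X, y

# Training data drawn overwhelmingly from group A, with group B barely represented.
Xa, ya = make_group(5000, center=0.0)
Xb, yb = make_group(100, center=3.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on fresh, balanced samples from each group.
Xa_test, ya_test = make_group(2000, center=0.0)
Xb_test, yb_test = make_group(2000, center=3.0)
print("accuracy on group A:", model.score(Xa_test, ya_test))
print("accuracy on group B:", model.score(Xb_test, yb_test))
```

The decision boundary ends up fitted to the majority group’s feature distribution, so the model performs close to chance on the group it rarely saw during training.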

Maybe so, but in this case the algorithms seem to have under-counted the white males who apparently voted for Trump, and over-counted the others who apparently voted for Clinton.
