Cambridge Analytica and the Next Australian Election

Last week’s revelations about Cambridge Analytica and Facebook are continuing to reverberate around the world. With an Australian federal election due within a year, the impact could hit very close to home.

Cambridge Analytica’s services have been linked to the Brexit vote and the American election. On one level this is understandable: elections are about deploying messaging to psychologically influence voters. Micro-targeting slices of the electorate on niche issues has long been part of most political parties’ campaign strategies. Why should using the services of Cambridge Analytica be any different? And yet it is, by a very wide margin.

Even if we side-step the harvesting of 50 million Facebook profiles, and the associated implications of this colossal ethical and privacy breach, there are far deeper and much more insidious issues at play. The past few years have seen democratic elections become geopolitical battlegrounds. Foreign powers, and Russia in particular, have conducted information operations designed to undermine democratic processes in pursuit of their national interests.

The work of Cambridge Analytica not only monetises these kinds of operations, it shows how willing and able entities like Facebook are to influence elections. This is just the beginning, an indication of the scale and scope of this emergent capability. The structural disparities between information flows and the landlocked ability to vote have created an unprecedented opportunity for international political arbitrage. This is an attractive value proposition for those states that have the capability to exploit it, but also for political parties that can afford the services of an entity like Cambridge Analytica.

Without doubt, the biggest loser in this is the individual voter. Where elections were once contests fought by local personalities with persuasive rhetoric, these contests are now underpinned by micro-targeted political advertising through social media channels. They are contests that marshal the resources of multinational corporations to collect, analyse and appeal to the deep-seated cognitive biases of the voting-age electorate; biases that the electorate may not even be aware of, and that may have no actual link to the policy platforms of the competing parties.

This is not what democracies were designed for; it’s not what the print media were designed for; it’s not even a domain the broadcast media cope with very well. They are too slow, and their audience’s location, credit card purchases and digital history can’t be number-crunched by purpose-built artificial intelligence. They are, by any normal measure, out of date and out of touch with their audience’s key data points.

This is why, over the coming weeks, these 20th-century channels of communication will go for the throat of Cambridge Analytica, and it’s why they will aim for the Achilles heel of Facebook. But none of this will matter if the underpinning legislative framework of democratic societies like Australia isn’t able to defang the emerging capability of micro-targeted psychological influence. It represents a genuine political risk for democracies, and Australia will soon be in the firing line.

Yet the Australian political class stands to gain from deploying these capabilities in the next election. Social media micro-targeting solves the problem of reaching the young demographic that lives in an apartment with no landline, a demographic that can’t be door-knocked or cold-called. Even though allowing this kind of advertising is known to open the door to foreign actors, odds are our political parties will use it while simultaneously affirming their national security credentials.

This calls into question where the real threat of foreign influence is coming from. Australians from every demographic have uploaded their tastes and preferences into the cloud, but the 18–35 bracket has done this more than most. With each new election a host of digital natives reach voting age. Why should their childhood and adolescent data points be used to influence them? Shouldn’t these be off limits? They have used these social media tools in good faith, yet their data is now being used against them and the countries they vote in.

This is not just an ethical question; it’s a question of national security. If left unchecked, these tools will allow foreign actors to undermine Australian democratic processes in the next election and the election after that. Entities such as Cambridge Analytica and Facebook are enabling this threat, profiting from it and then being obstructive during the investigation of it. Australia has advance warning of what is coming, yet it remains an open question what the country will do about it. The ways of the world have changed; it’s time to acknowledge that and provide for the adequate defence of the 2019 election.

What does defending the underpinning processes of a democratic election even look like? At its core, it means treating all communication channels consistently, and that means regulating all channels consistently. The idea of Facebook being regulated as a media channel has long been anathema to the organisation, yet by any measure it is the largest media organisation on the planet. Arguments that “the channel is a platform” should be dismissed as the tall-tale sophistries they really are.

Australia also needs to interrogate the rules (or lack thereof) governing micro-targeted psychologically influential advertising on social media. That means opening up the ethical Pandora’s box of whether we want or need artificially intelligent agents that can decode and influence our cognitive biases. This technology is still in its infancy and, more importantly, the pool of data it feeds on to grow and know is also in its infancy.

Every year that this problem goes unresolved is another year of data these algorithms can crunch, can learn from, can imitate. That’s the thing about artificial intelligence: it needs to feed, and every election that comes and goes is but another meal. How long before a corporation uses this capability because a political party’s policies represent a significant risk to its interests? How long before these algorithms cross an inflexion point of precision and persuasion that gives their wielders an undemocratic amount of influence?

Meanwhile, we already know much about how Russia engages in this domain. We know how it cannibalises the democratic capabilities of social media to promote its interests. It’s not the only state that stands to gain from exploiting the political arbitrage inherent in free-flowing information. Capabilities can take decades to build, but intentions can change overnight. We need to act now to ensure it is as difficult as possible for other states to digitally intrude on and influence the outcomes of Australian elections.

Instead of a race to the bottom enabled by the short-sighted quest for a quick win at the next election, Australia needs considered and measured bipartisan reform in the way our elections are run. We need reforms that protect an individual’s data profile and ensure their subconscious biases aren’t preyed on by agents of foreign influence. We need to ensure that the revelations about Cambridge Analytica aren’t repeated domestically after the 2019 election.
