We’ve all heard the term ‘fake news’. It has become a buzzword in recent years, especially since the 2016 US presidential election. Fake news is information created in bad faith to deceive the public, for any of a myriad of reasons. What is important to understand about fake news is who the perpetrators are, or who stands to gain. Oftentimes the perpetrators are simply online trolls looking to go as viral as they can with their deceit. The more malicious motive, however, is political gain, and this is what the term fake news is most readily associated with.

The future of fake news is clouded, mired in uncertainty about rapidly developing technology and about the steps that can be taken to counteract it. The present, meanwhile, is worse than ever. The spread of fake news has become increasingly rampant as media has diversified and grown, and there are now more places to get your news than ever before. More and more people get their news from Facebook, Twitter or other social media sites, and that kind of information is hard to control. There’s too much out there to fact-check everything, and people are inclined to believe what they read. It’s also much easier than ever to hit a few keys and share something, so fake news can become widespread rapidly. Most people who read fake news will never find out it wasn’t true.

Media ethicist Stephen Ward says that the increasing polarization of politics and the growth of global media in journalism have led to the fake news explosion. He also points to extreme populism, with its deliberate and divisive appeal to the masses, and specifically its utilization by the far right. “This is demagoguery going back to ancient Greece,” he said of right-wing groups abusing populism. “Only now they have the power of global media and social media to infiltrate and spread conspiracy theories and fake news.”

Ward has proposed solutions to what he says is a much more complex problem than it seems. The fake news crisis will not just be solved by mainstream journalists taking more responsibility and becoming more ethical. “Journalists have to start joining and participating in projects with other civic groups to make people aware of how to analyze media, fight back, and detox the public sphere,” Ward said.

Teaching media ethics literacy at a younger age, such as in middle and high schools, is another solution Ward suggests. “[Such education would involve] how the media operates, how we as citizens use our media, and how messages get circulated,” said Ward. “This would be an institutional, structural change.” He added that journalists need to go to schools and hold seminars if teachers are not available.

Ward also points out that more globally minded teaching and a stronger code of ethics need to be put in place by journalism schools. “We can’t just keep giving students practical toolkits,” he said. “We need to give them histories of religion, histories of terrorism in different cultures, that sort of knowledge of the world. Also, knowledge of populism, and how to identify these different groups.” He also said there is “very little written in ethics about how we are to report on a guy like Donald Trump, or how we’re supposed to report on populist groups.”

Technology is central to both the current and future state of fake news because of how it can help us counteract it. Weichang Du, a computer science professor at UNB, explained what is currently available for that purpose. Software to detect fake news does exist, Du said, but it has a problem.

“These programs need to start with a training set designed by a human to learn what they’re looking for, and the problem is that humans are biased,” Du said. “This means that oftentimes, the programs are only as good as the human designing the training set.” He recommended programs that can pull data from an organization’s site and tell you how reliable it is based on its track record. This can be used to guide your reading and can tell you if you should check another source.
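Du’s point about biased training data can be illustrated with a toy sketch. The Python below (with entirely made-up headlines and labels, not any real detection tool) trains a minimal word-count classifier from human-labeled examples; whatever the annotator decided was “fake” is exactly what the model learns, no more and no less.

```python
from collections import Counter

def train(examples):
    """Learn per-label word counts from human-labeled (headline, label) pairs.

    The model knows nothing beyond these labels: whatever biases the
    human annotator had are baked directly into the counts.
    """
    counts = {}
    for text, label in examples:
        bag = counts.setdefault(label, Counter())
        bag.update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label whose training vocabulary overlaps the headline most."""
    words = text.lower().split()
    return max(counts, key=lambda label: sum(counts[label][w] for w in words))

# Hypothetical labeled data: the annotator's judgment, not ground truth.
labeled = [
    ("miracle cure doctors hate revealed", "fake"),
    ("shocking secret they refuse to tell you", "fake"),
    ("city council approves new transit budget", "real"),
    ("university releases enrolment figures for fall term", "real"),
]

model = train(labeled)
print(classify(model, "shocking miracle cure revealed"))    # → fake
print(classify(model, "council approves budget increase"))  # → real
```

If the annotator had mislabeled legitimate stories as fake, the classifier would faithfully reproduce that mistake, which is the bias problem Du describes.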

The future of technology relating to fake news is murky, Du said, but the general direction is to improve machine learning so it doesn’t have to rely on human biases. This would involve using big data: extremely large data sets that machines can analyze at a fast rate. “Data science can be used to identify percentages on how likely it is that something is fake. The problem with big data is that it’s hard to draw specific conclusions from it, which is where the percentage comes in. Research is being done now to continually improve the percentage you get.”
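The percentages Du describes can be pictured with a small hypothetical calculation (not any particular research tool): raw evidence scores for “fake” and “real” are smoothed and normalized so the output is a likelihood rather than a hard verdict.

```python
def fake_percentage(fake_score, real_score):
    """Turn two raw evidence scores into a likelihood percentage.

    Add-one smoothing keeps the estimate away from a false 0% or 100%
    certainty when evidence is thin -- the output guides the reader,
    it doesn't deliver a verdict.
    """
    return 100.0 * (fake_score + 1) / (fake_score + real_score + 2)

print(fake_percentage(8, 2))  # strong evidence of fakery → 75.0
print(fake_percentage(0, 0))  # no evidence either way → 50.0
```

A system like this never says “this is fake,” only “this looks 75 per cent likely to be fake,” which is the kind of hedged conclusion Du says big data supports.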

The Brunswickan strives to provide its readers with accurate and ethical reporting, but as Ward said, this is only the start of fixing the fake news crisis. Data science and technology may be able to inform our news consumption, but they aren’t perfect and won’t be for the foreseeable future. It is a complicated problem that requires a multi-faceted solution. “Our democracy as I understand it is at risk,” Ward said. “We have to defend ourselves as individuals from being manipulated.”