SYRACUSE, N.Y. (WSYR-TV) — You’ve probably heard the saying, “you can’t believe everything you see on the internet,” and now two professors from Syracuse University are collaborating with researchers across the country to build tools that help people tell fact from fiction online.
Syracuse University Professor and Director of New Media Management Stephen Masiclat said, “The lie will go halfway around the world before the truth gets its boots on.”
Stephen Masiclat and Regina Luttrell, two professors at Syracuse University’s Newhouse School of Public Communications, are working with a team of researchers from across the country to develop an artificial intelligence system that detects fake news and misinformation, a growing problem online.
“So, the technology is going to be important simply because the volume of information is increasing all the time. And as the total volume of information increases, the volume of false information also increases,” Masiclat said. “So, people need tools to help separate the wheat from the chaff.”
The team at Syracuse is working with engineers and mathematicians to develop an algorithm that identifies when something is false. The algorithm will examine things like the style of writing, how fast the information is shared and other elements of the news article, like pictures or videos.
“You can’t look for the one signal that something is fake. You have to look up and down at multiple signals,” Masiclat said. “It would not be fair of me to talk about this without talking about an engineering professor at SU, Reza Zafarani. He has published papers about looking at multiple levels to detect fake news, so he’s done work in early detection. So the issue is, how can we take his work and possibly narrow it down to the news industry?”
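Masiclat’s point about weighing multiple signals rather than one can be sketched, very loosely, in code. The signal names, weights, and logistic combination below are invented for illustration only; they are not the team’s actual features or model:

```python
import math

# Hypothetical multi-signal scorer: each signal is a score in [0, 1],
# and a weighted sum is squashed through a logistic function into a
# single "likely misinformation" probability. All names and weights
# here are made up for illustration.

WEIGHTS = {
    "writing_style": 1.5,   # stylistic anomalies in the text
    "share_velocity": 2.0,  # how fast the item is spreading
    "media_mismatch": 1.0,  # images/videos inconsistent with the text
}
BIAS = -2.0  # shifts the default toward "not flagged"

def misinformation_score(signals: dict) -> float:
    """Combine per-signal scores (each in [0, 1]) into one probability."""
    z = BIAS + sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash to (0, 1)

article = {"writing_style": 0.8, "share_velocity": 0.9, "media_mismatch": 0.4}
print(round(misinformation_score(article), 3))  # → 0.802
```

The point of the sketch is that no single feature decides the outcome: a fast-spreading article with ordinary writing scores lower than one that trips several signals at once.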
It’s one thing to identify misinformation; holding its authors accountable so it doesn’t happen again is another.
“Short of legislative penalties, it’s going to be really tough,” Masiclat said. “We have constitutional protections for speech, and that includes misinformed speech. Then you add to that, the complexity of the fact that the primary way people get misinformation is using privately-owned platforms, like Facebook and Twitter.”
Recently, Twitter temporarily banned President Trump’s campaign account after it tweeted a video in which Trump falsely claimed children are “almost immune” to COVID-19.
The government has very little control over Twitter and other social media platforms where a lot of misinformation is spread.
However, the project Masiclat and his team are working on goes far beyond social media.
“This project is funded by the Department of Defense, under the office of DARPA, the Defense Advanced Research Projects Agency. So, they’re interested in broad strategies for detecting all different kinds of levels of misinformation,” Masiclat said. “So, it might include things like people inadvertently making a mistake, to very sophisticated state-sponsored actors, who are trying to inject misinformation.”
Although this multi-million-dollar project is working to combat fake news head-on, everyday people have a responsibility as well, with social media becoming more powerful every day.
Masiclat said, “A former dean of the Newhouse School and my friend, David Rubin, used to say ‘the antidote to false speech is more true speech.’ I’m paraphrasing him a little bit. In America, that’s one of the best tools we have.”
The $11.9 million project is expected to be completed in about three to four years. So, until then, and beyond, be mindful of what you post and share on social media.