Election Analytics: Predicting Election Outcomes
It is that time again when politics becomes an avid spectator sport. News channels are filled with images of candidates giving speeches, opponents giving opposing speeches, and pundits trying to tell us what it all means.
More and more, an integral part of this entire undertaking is polling. Across the country, people at home hear their phones ring, answer them, and then get asked a series of polling questions. Well, what do those polls actually tell us? Individually, they can point one way, while the final outcome of the election is actually something else.
CS Professor Sheldon Jacobson has been working with polling data for over a decade, trying to find ways to uncover pertinent and telling information.
Since 2008, Jacobson and his students have gathered polling data from a number of different sources and developed a website to analyze that data and make predictions.
Called Election Analytics, the site presents forecasts for election outcomes—if they were held today—of Senate races across the United States.
“We’re trying to weed out insights that other people cannot necessarily see,” said Jacobson. “You can find some of the trends, but when you’re trying to combine so much information and package it in a manner that you can wrap your mind around and really get a feel for—it’s really hard to do. So on the website, we try to make it user friendly.”
According to the website, the algorithm “employs Bayesian estimators that use available state poll results . . . to determine the probability that . . . each political party will win the Senate race in each state.”
The site weights polling data based on several factors. One is recency: more recent polls carry more weight than older ones. Another is sample size: a poll with 1,000 respondents is weighted more heavily than one with 500.
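As a rough illustration of how these ideas can fit together, here is a minimal sketch of a Bayesian poll aggregator. It is an assumption-laden stand-in, not the site's actual algorithm: the exponential recency decay, the uniform prior, and the half-life value are all illustrative choices. Each poll is treated as weighted binomial evidence folded into a Beta posterior, so larger samples naturally pull the estimate harder, and the win probability is the posterior chance that true support exceeds 50%.

```python
import random

def win_probability(polls, half_life_days=14.0, draws=100_000, seed=42):
    """Estimate P(candidate's true support > 50%) from a list of polls.

    Each poll is a tuple (days_old, sample_size, share_for_candidate).
    Hypothetical sketch: decay rate and prior are illustrative, not the
    Election Analytics site's actual parameters.
    """
    rng = random.Random(seed)
    alpha, beta = 1.0, 1.0  # uniform Beta(1, 1) prior on true support
    for days_old, n, share in polls:
        # Newer polls get more weight via exponential decay.
        w = 0.5 ** (days_old / half_life_days)
        # A larger sample contributes more effective respondents,
        # so it moves the posterior more than a small one.
        alpha += w * n * share
        beta += w * n * (1.0 - share)
    # Monte Carlo estimate of P(support > 0.5) under Beta(alpha, beta).
    wins = sum(rng.betavariate(alpha, beta) > 0.5 for _ in range(draws))
    return wins / draws

# Three hypothetical polls: (days old, respondents, candidate's share).
polls = [(2, 1000, 0.52), (10, 500, 0.49), (21, 800, 0.51)]
print(win_probability(polls))
```

In this toy version, a fresh 1,000-person poll at 52% outweighs an older, smaller poll at 49%, so the forecast leans toward the first candidate without ignoring the conflicting data.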
The analytics and algorithms that undergird the site are what Jacobson sees as setting this election projection site apart from other sites that try to do projections. “Our stance is that with our methodology, in close races we will be able to provide better insights than others. This is the year to test that, because it’s going to be close,” he said.
So why does the site focus solely on this year’s Senate races? Data quality. “The strength of what we do depends on the strength of the polls and the quality of the polls,” Jacobson said. “There’s not enough quality data with the House, so we eliminated those races this year.”
This site would not exist without the work of a dedicated group of students. This year, Jacobson’s work is assisted by CS grad student Jason Sauppe, who is the senior project advisor, and CS undergraduates Taylor Fairbank, undergraduate team leader, and Dimitriy Zavelevich, web developer.
“I got involved in the project three years ago in the fall of 2011, when Dr. Jacobson was beginning to put together a team for the 2012 elections,” said Sauppe. “The biggest challenge in working on this project has been data management. We need to keep track of a great deal of information across many different races, and every day we have to check for new polling data in order to update our forecasts.”
“I geek out about the code,” said Fairbank. “Election Analytics has been a great way to improve my full stack web development skills on a real project. I learn best by doing: the Election Analytics project allowed me to apply knowledge from classes, such as database design and testing.”
Jacobson is as excited about these students as he is about the project on which they are working. “I have a superb group of students,” he said. “They are very attentive to detail, and smart.”
Already, Jacobson and his team are working on improvements for 2016, when the next presidential election comes around. “One thing we want to add that we didn’t have ready this year—and it will be ready for 2016—is that people will be able to choose which polls to use,” Jacobson said. “This is a capability that no other site has.”