Aniko Hannak

About me

I recently started as a postdoc at the Center for Network Science at the Central European University. Before CEU, I was a PhD student in the College of Computer & Information Science at Northeastern University, advised by Alan Mislove and David Lazer.

Broadly, my work investigates a variety of content-serving websites, such as search engines, online stores, job search sites, and freelance marketplaces. In this quickly changing online ecosystem, companies track users' every move and feed the collected data into big-data algorithms in order to match users with the most interesting and relevant content. Since these algorithms learn from human data, they are likely to pick up on social biases and unintentionally reinforce them. In my PhD work I developed a methodology called Algorithmic Auditing, which tries to uncover the potential negative impacts of large online systems. Examples of such audits include examining the "filter bubble" effect in Google Search, online price discrimination, and inequalities in online labor markets.

For my detailed resume, please see my CV.
And here is my PhD thesis (I promise it is an easy read! :)

Email: ancsaaa at ccs dot neu dot edu


11/01/16  Our paper on Discrimination in Online Freelance Marketplaces is out.
29/06/16  Defense done!
22/07/15  Got some attention in the Hungarian media as well:) Interview about tracking, algorithms and personalization.
22/07/15  Going to Japan to present our paper on Location based Personalization in Google Search!
02/05/15  Our work on uncovering personalization algorithms is part of this week's cover story in New Scientist!
02/05/15  Our price discrimination study received press coverage in the Wall Street Journal, Good Morning America, and the CBS Evening News!
02/05/15  Our new site dedicated to our research on personalization algorithms, the Filter Bubble, and algorithmic society is now online. Compliments to Gary for the swanky design!



Posters and presentations

Current Projects

Measuring Bias in Online Labor Markets

The labor economy has undergone significant structural changes in recent years. People use various online services to find employment, advertise freelance services, collaborate on projects, outsource work, and more. These sites offer innovative mechanisms for organizing employment and hiring, and they may alter many of the social forces known to cause inequality in traditional labor markets. While policies protecting people in the traditional labor economy have been developed over hundreds of years, we are at the early stages of this process in the online context. Paradoxically, while meaningful policy making requires a good understanding of the mechanisms that create or reinforce inequalities, without regulations enforcing audits or some form of transparency it is very difficult to learn about these systems.

In my work I investigate the mechanisms that emerge in this new ecosystem and their potential for creating or reinforcing gender and racial biases. I am especially interested in the impact of the tools that differentiate new online services from traditional labor markets, e.g., public social feedback or the use of big-data algorithms in search and recommendation. Quantifying bias is a challenge in itself: obtaining data, defining the right baselines to compare against, and developing tools for detecting inequalities. Beyond finding inequalities, my work places emphasis on determining the real-world effects of these differences and exploring possible mitigation strategies. These questions, however, lead to even more methodological challenges: how do we disentangle the effects of algorithms, self-presentation, network processes in the underlying social network, and review systems?
To answer these questions I combine a variety of methods, including online data collection and empirical analysis, online and field experiments, and survey-based data collection.
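To give a flavor of what "detecting inequalities" can mean in practice, here is a minimal illustrative sketch (not the actual method from any of our papers, and the ratings below are hypothetical): a permutation test that asks whether an observed gap in mean feedback ratings between two groups of workers could plausibly arise by chance.

```python
# Illustrative sketch: permutation test for a gap in mean ratings
# between two groups. All data and names here are hypothetical.
import random

def permutation_test(group_a, group_b, n_perm=10000, seed=0):
    """Two-sided p-value for the observed difference in group means."""
    rng = random.Random(seed)
    mean = lambda xs: sum(xs) / len(xs)
    observed = abs(mean(group_a) - mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        # Randomly reassign ratings to the two groups and recompute the gap
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Hypothetical 1-5 star ratings for two groups of freelancers
ratings_a = [5, 4, 5, 4, 5, 5, 4, 5]
ratings_b = [4, 3, 4, 4, 3, 4, 4, 3]
p = permutation_test(ratings_a, ratings_b)
```

A small p-value only says the gap is unlikely under random assignment; attributing it to discrimination still requires the careful baselines and confound analysis described above.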

Fact-Checking Interventions on Online Social Networks

The prevalence of misinformation within social media and online communities can undermine public security and distract attention from important issues. Fact-checking interventions, in which users cite fact-checking websites such as Snopes.com and Factcheck.org, are a strategy users can employ to refute false claims made by their peers. We use data from online social networks such as Twitter to find these conversations and to examine the contexts and consequences of fact-checking interventions. Our preliminary results suggest that although fact-checking interventions are most commonly issued by strangers, they are more likely to draw user attention and responses when they come from friends.


Discrimination in Online Freelance Markets

Price Discrimination

Filter Bubble

Hungarian Press