Alfredo Kalaitzis
Alfredo is a Research Engineer in the AI for Good lab at Element AI in London. He is one of the primary co-authors of the first technical report produced in partnership with Amnesty International: a large-scale study of online abuse against women on Twitter, based on crowd-sourced data.
His research interests lie in Machine Learning, Computational Statistics, and their applications for social good.
Prior to joining Element AI, he was a Senior Data Scientist at Digital Shadows, specialising in cyber-security and digital risk management, and a consulting Data Scientist in Microsoft's Xbox EMEA team, where he also collaborated with Microsoft Research Cambridge.
He has been a research scientist with the Department of Statistical Science at University College London, working on probability models to better understand ordinal survey data, and later on the interface of Machine Learning and Signal Processing to detect faults in the low-voltage power-line grid. During his time at UCL, his team won the first data challenge competition organised by the Royal Statistical Society, for which he designed and developed his team's algorithm for the analysis of resting-state fMRI time-series data.
He earned his MSc in Artificial Intelligence from the University of Edinburgh, and his PhD in Machine Learning from the University of Sheffield under the supervision of Professor Neil Lawrence. His PhD research contributed probabilistic methods for the dimensionality reduction of data, as well as methods for analysing gene-expression time-series to discover genetic factors of disease.
Can you give us a brief overview of the project you have been working on?
The first thing I did in my current role at Element AI was to help Amnesty International quantify the amount and type of online abuse that a particular cohort of women politicians and journalists in the US and UK endured on Twitter during 2017. This involved a combination of crowdsourcing, statistics, and machine learning. Through Amnesty's Decoders platform, online volunteers helped to identify cases of online abuse in a sample of tweets that we showed them. Through statistical methods, we extrapolated the amount of abuse against the same group of women for all of 2017. Our findings were not surprising, but the one that stood out the most was the disproportionate targeting of women of colour. The full report can be read at https://decoders.amnesty.org/projects/troll-patrol/findings.
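To make the extrapolation step concrete, here is a minimal sketch of the general idea: estimate the proportion of abusive tweets in a randomly drawn, crowd-labelled sample and scale it up to the full collection of tweets, with a simple confidence interval. This is an illustration only, not the methodology used in the report; the function name, the figures, and the simple-random-sample assumption are all hypothetical.

```python
# Illustrative sketch only: extrapolating the volume of abusive tweets from a
# crowd-labelled sample to the full collection. All numbers below are made up.
import math

def extrapolate_abuse(labelled_abusive, labelled_total, population_total, z=1.96):
    """Estimate the total number of abusive tweets in the population.

    Assumes the labelled sample is a simple random sample of the population.
    Returns a point estimate and a normal-approximation confidence interval.
    """
    p_hat = labelled_abusive / labelled_total             # sample proportion of abuse
    se = math.sqrt(p_hat * (1 - p_hat) / labelled_total)  # standard error of proportion
    estimate = p_hat * population_total                   # scale up to the population
    margin = z * se * population_total
    return estimate, (estimate - margin, estimate + margin)

# Hypothetical figures, purely for illustration.
estimate, (lo, hi) = extrapolate_abuse(
    labelled_abusive=7_000, labelled_total=100_000, population_total=14_000_000
)
print(f"Estimated abusive tweets: {estimate:,.0f} (95% CI {lo:,.0f} - {hi:,.0f})")
```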
What was your favourite aspect of studying in the Department of Computer Science?
I was free to audit lectures outside of my immediate field of interest. I had the privilege of working towards my PhD in the computational biology and machine learning group located in SITraN (the Sheffield Institute for Translational Neuroscience). At SITraN I was exposed to a very different kind of science, with an immediate impact on patients' quality of life.
What did you take from your time in Sheffield that has helped you in your career?
The quality of research, my former colleagues, and the social culture that developed inside my lab helped me appreciate and prioritise those elements throughout my career. Also, having worked at the intersection of computational biology and machine learning, the immediacy of societal impact still resonates with me.
What is your fondest memory of studying and living in Sheffield?
As a Sheffield resident, I loved the music scene and, in the summertime, having picnics with friends in the park. As a PhD candidate, I learned to teach classes and tutor students from a wide range of cultural backgrounds.
What has been your career highlight?
Collaborating with Amnesty International's human rights experts demonstrated to me the potential for impact when technical and domain expertise are married in work on a social issue. Since its publication, our study has been covered by many outlets (New York Times, BBC, Le Monde, FT, Reuters, TechCrunch) in various parts of the world (Business Today Kenya, Jakarta Post, Daily Times (Pakistan), El Nuevo Dia (Puerto Rico), Youm7 (Egypt), Sina (Taiwan)). The media coverage led to Twitter and Amnesty International sitting down together to discuss greater transparency around online abuse, something that Amnesty has been pursuing for years.
Anything else you want to say?
To students - explore new topics. Attend seminars and chat to students outside of your department. Their perspectives on a problem might each seem fractional, but they add up to novelty. Reach out to your colleagues and supervisor when you feel stuck; research is not a solitary endeavour.
To recent graduates - look for signs of a healthy culture in a prospective workplace.