We are a research team led by Dr Luc Rocher, based at the Oxford Internet Institute within the University of Oxford. We conduct human-centred computing research to understand how data, digital infrastructure, and algorithms impact society. We are interested in making digital power visible to the public and guiding the development of accountable, sustainable, and safe algorithms that better serve the public interest.

Synthetic Society refers to the increasingly algorithm-mediated world we inhabit, where AI systems and digital technologies shape our social interactions, economic systems, and political processes. As a group, we collaborate on producing rigorous independent scientific evidence on how real-world technologies function. Our members come from diverse backgrounds and complementary disciplines, including mathematics, public policy, political science, complex systems, and human-computer interaction.

News

2025.02 Congratulations to Juliette for her paper on data access for algorithm audits, accepted at CHI.

2025.01 Welcome to Francisco who is joining us for his MSc thesis.

2025.01 New research article in Nature Communications proposing a scaling law to predict the success of identification techniques.

2025.01 We are hiring PhD students and postdoctoral researchers.

2024.12 Sofia presented our work on gendered biases in language models at NeurIPS.

2024.09 We are hiring one Research Assistant to join the team.

2024.09 Luc Rocher has been awarded a £2M UKRI Future Leaders Fellowship to fund the team for the next four years.

Primary research areas

Improving researcher data access with privacy-enhancing technologies.

Facilitating data access while protecting sensitive data is a significant challenge for public interest research. Some privacy-enhancing techniques—such as injecting noise into data or creating ‘synthetic’ datasets—can fundamentally distort data in unknown and potentially harmful ways. For example, rare diseases may be suppressed in synthetic data, or vulnerable communities may be further marginalised.
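To make the distortion concrete, here is a minimal, self-contained sketch of one such technique, the Laplace mechanism from differential privacy, applied to category counts. The dataset, condition names, and privacy budget are purely illustrative assumptions, not drawn from our research: at a small epsilon, the noise added to each count is large relative to a rare category, so rare conditions can be amplified, distorted, or erased entirely while common ones are barely affected.

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

# Hypothetical diagnosis counts in a small health dataset:
# one common condition and one rare condition.
true_counts = {"hypertension": 950, "rare_disease_x": 3}

# Laplace mechanism with privacy budget epsilon = 0.1:
# add noise with scale 1/epsilon to each count, then clamp
# negatives to zero before release.
epsilon = 0.1
noisy_counts = {
    k: max(0, round(v + laplace_noise(1 / epsilon)))
    for k, v in true_counts.items()
}

print(noisy_counts)
```

The noise scale (here 10) is small next to 950 but dwarfs a true count of 3, which is exactly how rare diseases can vanish from, or be inflated in, a privatised release.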

We study how modern secure data sharing with strong privacy guarantees affects reproducibility and research integrity, and we build open-source tools that help the public understand how their data is used.

Meaningful data access for algorithm audits [CHI, 2025]

A scaling law to model the effectiveness of identification techniques [Nature Communications, 2025]

Anonymization: The imperfect science of using data while preserving privacy [Science Advances, 2024]

Evaluating interactions between humans and machines.

Studying humans or algorithms in isolation reveals little about the real-world implications of data-driven technology. These systems cannot be studied by abstracting out interfaces, interactions, or affordances of real-world deployments. Societal harms often arise at these interaction layers, and thus cannot be identified without accounting for how humans use and interact with a system in practice.

Through user studies and simulations of user-AI interactions, we aim to propose better ways to conduct sociotechnical evaluations.

Investigating opaque platform algorithms.

Automated algorithms set prices for consumers in online markets, match humans in dating apps, and recommend posts on social media feeds. These algorithms affect the prices we pay, the people we date, and the information feeds we consume. However, many of these algorithms are proprietary and opaque, not just to the public but to researchers and regulators as well.

We work to uncover the effects of such algorithms, focusing on how they serve or undermine the public interest. We also develop tools and methods to help users better control how algorithms affect them.

Adversarial competition and collusion in algorithmic markets [Nature Machine Intelligence, 2023]

Get in touch

If you are interested in collaborating with us, please contact Luc Rocher. We have regular opportunities for PhD students, postdoctoral researchers, and visiting scholars. We also welcome enquiries from motivated students based at the University of Oxford who are interested in conducting research with us as part of their studies.