Machine-learning algorithms are increasingly used to assist humans in high-stakes decision-making. For example, loan officers draw on algorithmic credit scores to inform lending decisions, HR managers use data-driven predictions to screen applicants, and judges turn to recidivism risk tools when setting bail. Despite their pervasiveness, there are growing concerns that such predictive tools may discriminate against certain groups, which has prompted numerous efforts to exclude information about protected group membership (e.g., race) from the input data. While such interventions can, technically, increase overall fairness, there is little evidence on how the human decision-makers who take these predictions as input ultimately react to them. Do they take the exclusion of protected characteristics from algorithmic predictions into account when making decisions about others?
To address this question, I conduct a lab experiment in which subjects predict the other-regarding behaviour of other participants in an economic game. Subjects receive (i) information about the other participants' social identities and (ii) an algorithmic prediction of the other participants' behaviour, based on previous experimental data. I vary the algorithm's fairness properties, i.e. whether or not the prediction uses protected social identity variables, and communicate this to the subjects. Moreover, I explore how reactions to these fairness properties may be shaped by subjects' biased beliefs about differences in other-regarding behaviour across protected groups.
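The intervention referred to above, excluding protected group membership from the input data, is often called "fairness through unawareness". A minimal sketch of that idea is given below; the tabular dataset, the column names ("race", "repaid") and the use of scikit-learn's LogisticRegression are illustrative assumptions, not details taken from the study.

```python
# Sketch of "fairness through unawareness": fit a predictive model with or
# without the protected attribute among its input features.
# Column names ("race", "repaid") are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression


def fit_score_model(df: pd.DataFrame, drop_protected: bool) -> LogisticRegression:
    features = df.drop(columns=["repaid"])            # all candidate predictors
    if drop_protected:
        features = features.drop(columns=["race"])    # exclude the protected attribute
    features = pd.get_dummies(features)               # one-hot encode categorical columns
    model = LogisticRegression(max_iter=1000)
    model.fit(features, df["repaid"])                 # predict the outcome of interest
    return model
```

Note that dropping the explicit attribute does not remove information that is merely correlated with it; other features can still act as proxies.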