In the criminal justice system, risk assessments of an individual's propensity to offend or reoffend are made to determine what counts as 'acceptable risk': the point at which depriving an individual of liberty is justified to prevent future harm to society. Traditionally, these decisions have been made by humans, assisted by in-depth knowledge and clinical assessment of the individual, true to the principle of individualised justice. However, in recognition of prejudice in human decision-making, and a desire for accuracy and expediency in risk assessments, algorithms now appear at every stage of the criminal justice system. But can these tools ever offer us a judgement that is transparent and free from prejudice, or are we simply automating bias against marginalised groups in our society?
In this talk, I'll discuss the ethical implications of risk assessment tools currently used to make decisions in the criminal justice system, and consider the 'ethical debt' to society created by the rapid development of tools that disadvantage minoritised groups.
Joining link: medsci.zoom.us/j/91251667656