Human judgment is flawed and limited. Our reasoning degrades when we are tired or hungry; we are capable of thinking only in low dimensions; we process many forms of information slowly; and so on. Algorithmic judgment promises to correct our flaws and exceed our limits. Algorithms do not get hungry or tired; they can "think" in high dimensions; they process many forms of information at breakneck speed; and so on.

This paper is concerned with a particular flaw in human judgment: noise, understood in Kahneman et al.'s (2021) sense as unwanted variability in judgment. A judge is noisy, for example, if she sometimes hands down harsh sentences and sometimes lenient sentences, with no particular rhyme or reason, to defendants who ought to receive the same sentences. Her judgment exhibits unwanted variability. We ask: are algorithmic systems susceptible to noise? At first glance, the answer is no, and indeed Kahneman et al. argue that it is no, since many algorithmic systems compute the same function every time, and so by definition are free from a certain kind of variability. This first glance, we argue, is misleading. The kind of variability that algorithms are free from can, and often does, come apart from the kind of variability that is unwanted in cases of noise. Algorithms are susceptible to noise, just like we are.