This seminar will be held on Zoom; please register here: medsci.zoom.us/meeting/register/tJEod-mqrjkuGdY2kFuVSi9-gwBFYWUqOdDc
Technological developments have undoubtedly changed our world and, as a result, questions concerning moral responsibility are increasingly perplexing. How should we think of responsibility when humans collaborate with technology, for example, where our devices remind us of important appointments, or where clinics employ artificially intelligent decision support systems? Who (or what) can be held responsible when harms or benefits appear to result from technology alone? Some authors suggest a problematic “responsibility gap”, while others promote various “bridging” strategies, from pinning responsibility onto proxy individuals or corporations, to locating novel sorts of group agency in human-machine composites. Few, however, have seriously considered more direct strategies, namely how we might hold machines responsible. Although it may sound far-fetched, seeing machines as responsible can depend less on seeking sophisticated capacities, like consciousness, and more on rethinking the nature of responsibility itself. Focusing on the latter approach, I put forward a pragmatic account of moral responsibility for the technological world. In short, we stand only to gain by rethinking responsibility in ways that accommodate sophisticated technologies and that help us adjust to one another amidst our increasing reliance upon technology.