The late Professor Stephen Hawking once said:
“The development of full artificial intelligence could spell the end of the human race.”
He went on to advocate research into precautionary measures to ensure that future super-intelligent machines remain under human control. However, an AI apocalypse does not necessarily mean robots marching down the street; there are a number of subtler scenarios. So, what is the risk of an AI apocalypse, and can we calculate its probability? Furthermore, could we devise a strategy to minimise that probability? In this talk, we will consider the scenario of AI taking over the world economy, and show how mathematical modelling can be used to investigate it.
Nira will take you through a mathematical model of the complexities of human behaviour that caused the world economic crash. He will then show how the same model can be used to investigate how to minimise the probability of an artificial intelligence takeover.
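To give a flavour of the kind of question the talk addresses, here is a minimal Monte Carlo sketch of how one might estimate a "takeover" probability under a toy model. This is not the speaker's model: the growth dynamics, the `control` damping parameter (standing in for something like regulatory friction), and all numbers below are hypothetical, chosen only to illustrate how simulation lets you both estimate a small-probability event and see how a policy lever changes it.

```python
import random

def simulate(control=0.0, steps=100, seed=None):
    """One run of a toy market-share random walk.

    `control` is a hypothetical damping parameter (e.g. regulatory
    friction) that slows the AI agents' growth. Purely illustrative;
    not drawn from the talk.
    """
    rng = random.Random(seed)
    share = 0.01  # AI agents start with 1% of the market
    for _ in range(steps):
        # Growth is proportional to current share, with noise,
        # minus the damping effect of the control parameter.
        growth = share * (rng.gauss(0.05, 0.10) - control)
        share = min(1.0, max(0.0, share + growth))
    return share

def takeover_probability(control=0.0, threshold=0.5, trials=10_000):
    """Monte Carlo estimate of P(final AI market share > threshold)."""
    hits = sum(simulate(control, seed=i) > threshold for i in range(trials))
    return hits / trials

if __name__ == "__main__":
    # Sweep the control parameter to see how the estimated
    # takeover probability responds.
    for control in (0.0, 0.02, 0.05):
        p = takeover_probability(control)
        print(f"control={control:.2f}  P(takeover) ~ {p:.3f}")
```

Even in this crude sketch, sweeping `control` shows the probability falling as the damping term cancels out the agents' mean growth rate, which is the shape of the strategy question the abstract raises: find the cheapest intervention that drives the takeover probability acceptably low.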