We propose an algorithm for machine learning of conditional distribution functions for a dependent variable (Y) with continuous support. The algorithm produces a complete description of the conditional distribution function at all observed points in the covariate (X) space, and provides a similar estimate for other possible covariate values. The descriptions it provides are quite general and are globally valid conditional densities.

The algorithm is multilayered and feed-forward. Each layer has the same statistical interpretation: layer k takes a vector e(k-1) that is nearly perfectly marginally Gaussian and makes it more marginally Gaussian and more independent of X. It does this by applying a continuous monotonic transformation that varies depending on an observation's X value. Each layer is estimated by an elastic net regularization of maximum likelihood. We demonstrate Wilks' phenomenon for the composite algorithm and show how to calculate the algorithm's effective dimension.
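To make the layer mechanics concrete, here is a minimal sketch of a single such layer, under simplifying assumptions of my own (a scalar covariate, an affine-in-X log-scale/shift monotone transform, and plain gradient descent): the layer maps e to exp(a(x))·e + b(x) and is fit by maximizing Gaussian likelihood (the a(x) term is the log-Jacobian of the transform) with an elastic net penalty on the X-dependent coefficients. This is an illustration of the idea, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: residuals whose scale depends on x (heteroskedastic),
# hence dependent on x and not marginally standard Gaussian.
n = 2000
x = rng.uniform(-1.0, 1.0, n)
e = (1.0 + 0.6 * x) * rng.standard_normal(n)

def fit_layer(x, e, lam=1e-3, l1_ratio=0.5, lr=0.05, steps=800):
    """Fit one layer e_out = exp(a(x))*e + b(x), with a, b affine in x,
    by elastic-net-penalized Gaussian maximum likelihood.

    Per-observation negative log-likelihood (up to a constant):
        0.5 * e_out**2 - a(x),
    where a(x) is the log-Jacobian of the monotone transform.
    """
    a0 = a1 = b0 = b1 = 0.0
    for _ in range(steps):
        a = a0 + a1 * x
        b = b0 + b1 * x
        out = np.exp(a) * e + b
        # Analytic gradients of the penalized NLL.
        g_a = out * np.exp(a) * e - 1.0          # dNLL/da per observation
        ga0 = g_a.mean()
        ga1 = (g_a * x).mean() + lam * (l1_ratio * np.sign(a1) + (1 - l1_ratio) * a1)
        gb0 = out.mean()
        gb1 = (out * x).mean() + lam * (l1_ratio * np.sign(b1) + (1 - l1_ratio) * b1)
        a0 -= lr * ga0; a1 -= lr * ga1
        b0 -= lr * gb0; b1 -= lr * gb1
    e_out = np.exp(a0 + a1 * x) * e + (b0 + b1 * x)
    return (a0, a1, b0, b1), e_out

params, e_out = fit_layer(x, e)

# Crude dependence check: |corr(e^2, x)| measures the heteroskedasticity
# the layer is supposed to remove.
print("std before/after:", e.std(), e_out.std())
print("|corr(e^2, x)| before/after:",
      abs(np.corrcoef(e**2, x)[0, 1]),
      abs(np.corrcoef(e_out**2, x)[0, 1]))
```

In the full algorithm, layers like this are stacked, each one applying a further X-dependent monotone correction to the previous layer's output; the composite transform then yields the conditional density by the change-of-variables formula.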
Please sign up for meetings here: docs.google.com/spreadsheets/d/1qPQqXivNYBDNJY_0OdHcZfjslHLu5UtSVQMd0LpETqc/edit#gid=0