We propose an algorithm for machine learning of conditional distribution functions for a dependent variable (Y) with continuous support. The algorithm produces a complete description of the conditional distribution function at all observed points in the covariate (X) space, and provides a similar estimate for other possible covariate values. The descriptions it provides are quite general and are globally valid conditional densities.

The algorithm is multilayered and feed-forward. Each layer has the same statistical interpretation: layer k takes a vector e^(k-1) that is nearly perfectly marginally Gaussian and makes it more marginally Gaussian and more independent of X. It does this by applying a continuous monotonic transformation that varies depending on an observation's X value. Each layer is estimated by an elastic net regularization of maximum likelihood. We demonstrate Wilks' phenomenon for the composite algorithm and show how to calculate the algorithm's effective dimension.
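To illustrate the per-layer idea, here is a minimal sketch of one such layer, under a strong simplifying assumption: the X-dependent monotone transformation is restricted to an affine map e → (e − m(X)) / exp(s(X)), with m and s linear in X and fitted by elastic-net-penalized Gaussian maximum likelihood. The paper's transformations are more general; the function names and training details below are illustrative, not the authors' implementation.

```python
import numpy as np

def fit_layer(e, X, alpha=0.01, l1_ratio=0.5, lr=0.05, n_iter=2000):
    """Fit one layer: an X-dependent affine monotone map
    e -> (e - m(X)) / exp(s(X)), with m(X) and s(X) linear in X,
    by gradient descent on the elastic-net-penalized Gaussian
    negative log-likelihood. (A simplified sketch, not the
    paper's general monotone transformation.)"""
    n, p = X.shape
    Z = np.hstack([np.ones((n, 1)), X])   # intercept + covariates
    beta_m = np.zeros(p + 1)              # coefficients of the mean m(X)
    beta_s = np.zeros(p + 1)              # coefficients of the log-scale s(X)
    for _ in range(n_iter):
        m = Z @ beta_m
        s = Z @ beta_s
        r = (e - m) * np.exp(-s)          # standardized residual
        # gradients of the average Gaussian negative log-likelihood
        g_m = -(Z.T @ (r * np.exp(-s))) / n
        g_s = -(Z.T @ (r ** 2 - 1.0)) / n
        # elastic net subgradient, not applied to the intercepts
        pen_m = alpha * (l1_ratio * np.sign(beta_m) + (1 - l1_ratio) * beta_m)
        pen_s = alpha * (l1_ratio * np.sign(beta_s) + (1 - l1_ratio) * beta_s)
        pen_m[0] = pen_s[0] = 0.0
        beta_m -= lr * (g_m + pen_m)
        beta_s -= lr * (g_s + pen_s)
    return beta_m, beta_s

def apply_layer(e, X, beta_m, beta_s):
    """Apply the fitted monotone map; the output should be closer to
    marginally standard Gaussian and less dependent on X."""
    Z = np.hstack([np.ones((len(e), 1)), X])
    return (e - Z @ beta_m) * np.exp(-(Z @ beta_s))
```

Stacking such layers, each one removing X-dependent location and scale from the current residual vector, mirrors the feed-forward structure described above: the output of layer k becomes the input e^(k) to layer k+1.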
Please sign up for meetings here: docs.google.com/spreadsheets/d/1qPQqXivNYBDNJY_0OdHcZfjslHLu5UtSVQMd0LpETqc/edit#gid=0