The aim of the workshop is to bring together Australian researchers and practitioners as well as key international academics in areas related to data science, e.g., mathematics, statistics and computer science, to discuss recent advancements, share ideas and foster new local and international collaborations. The workshop consists of two integral parts:
We intend to organise ongoing yearly workshops, with a specific underlying theme for each year's boot camp. In our inaugural year, 2019, the theme of the boot camp is 'Randomised Numerical Linear Algebra' (RandNLA), a subject of ongoing intense study. Invited speakers who are pioneers and leading scholars in the field will bring their expertise in RandNLA and introduce early career researchers and graduate students to the relevant background topics. The second half of the workshop aims to explore not only the interactions between the boot camp theme (RandNLA for 2019) and machine learning, but also a diverse range of topics in data science. For this part, speakers from various areas related to the mathematics of data science will present and discuss their cutting-edge research results.
To become acquainted with RandNLA, please see the introductory monograph "Lectures on Randomized Numerical Linear Algebra". The suggested prerequisites for the Boot Camp are an introductory knowledge of probability and linear algebra.
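As a flavour of what RandNLA is about, here is a minimal sketch-and-solve illustration (our own example, not taken from the monograph): a tall least-squares problem is compressed with a random Gaussian sketch and solved at a fraction of the size, yet the sketched solution is close to the exact one with high probability. The problem sizes and sketch dimension below are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem: minimise ||Ax - b|| with n >> d.
n, d = 10_000, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Exact solution for comparison.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Sketch-and-solve: compress the rows with a Gaussian sketch S (m x n, m << n)
# and solve the much smaller m x d problem min ||S(Ax - b)||.
m = 500
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# With high probability the sketched solution is close to the exact one.
rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
print(rel_err)  # small relative error despite a 20x row compression
```

Dense Gaussian sketches are the simplest to state; much of the field concerns faster sketches (subsampled randomised Hadamard transforms, CountSketch, leverage score sampling) with comparable guarantees, which the boot camp lectures cover in depth.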
Other dates will be added as they are decided.
The venue for the workshop is Fort Scratchley Function Centre, Nobbys Road, Newcastle.
The workshop will involve a dinner, to be held on Tuesday 10th December.
**Sunday, 8 December 2019**

| Time | Speaker | Title |
|---|---|---|
| 12:00-12:45 | Kenneth L. Clarkson | |
| 14:15-15:00 | Kenneth L. Clarkson | |
| 15:00-15:45 | Kenneth L. Clarkson | |
| 16:15-17:00 | David P. Woodruff | |

**Monday, 9 December 2019**

| Time | Speaker | Title |
|---|---|---|
| 9:00-9:45 | David P. Woodruff | Boot Camp |
| 9:45-10:30 | David P. Woodruff | |
| 11:00-11:50 | Michael Mahoney, Kenneth L. Clarkson, and David P. Woodruff | Boot Camp Wrap-Up |
| 11:50-12:40 | Michael Mahoney | Minimax Experimental Design, Exact Expressions for Double Descent, and Implicit Regularization in RandNLA Algorithms |
| 14:00-14:50 | David P. Woodruff | Towards a Zero-One Law for Column Subset Selection |
| 14:50-15:40 | Kenneth L. Clarkson | Quantum-inspired Algorithms for Data Analysis |
| 16:10-17:00 | Michael E. Houle | Local Intrinsic Dimensionality: A Practical Foundation for Dimensionally-aware Data Analysis |

**Tuesday, 10 December 2019**

| Time | Speaker | Title |
|---|---|---|
| 9:00-9:50 | Deanna Needell | Simple Approaches to Complicated Data Analysis |
| 10:20-11:10 | Matt Wand | Streamlined Variational Inference for Random Effects Models |
| 11:10-12:00 | Peter Taylor | Some Thoughts About a Distributed Solution of the PageRank Equation |
| 13:00-23:00 | | Excursion and Dinner |

**Wednesday, 11 December 2019**

| Time | Speaker | Title |
|---|---|---|
| 9:00-9:50 | Kate Smith-Miles | Party Tricks with Numerical Linear Algebra and the Quest for Trust |
| 9:50-10:40 | Kerrie Mengersen | Bayesian Statistical Analysis of Large Images |
| 11:10-11:40 | Vivak Patel | On the Practice of Solving Randomly Sketched Linear Systems |
| 11:40-12:10 | Daniel Ahfock | On Randomised Sketching Algorithms and the Tracy-Widom Law |
| 12:10-12:40 | Lindon Roberts | Improving the Scalability of Model-based Derivative-free Optimization |
| 14:00-14:30 | Peter Eades | Large Graph Visualisation Using Spectral Sparsification |
| 14:30-15:00 | Mikhail Kamalov | Semi-supervised VAE with PageRank |
| 15:00-15:30 | Zdravko Botev | How Bayes Can Help Frequentist Model Selection |
| 16:00-16:30 | Scott Lindstrom | Splitting Methods for Signal Recovery |
| 16:30-17:00 | Sevvandi Kandanaarachchi | Dimension Reduction for Outlier Detection |

**Thursday, 12 December 2019**

| Time | Speaker | Title |
|---|---|---|
| 9:00-9:30 | Fred Roosta | Reproducing Stein Kernel Approach for Correcting Approximate Sampling Algorithms |
| 9:30-10:00 | Ali Eshragh | LSAR: Efficient Leverage Score Sampling Algorithm for the Analysis of Big Time Series Data |
| 10:00-10:30 | Glen Livingston Jr | ARMA Models and Big Data |
| 11:00-11:30 | Samudra Herath | Name-like Numbers for Simulating Names in Entity Resolution |
| 11:30-12:00 | Yang Liu | Stability Analysis of Newton-MR Under Hessian Perturbations |
| 12:00-12:30 | Russell Tsuchida | Richer Parameter Priors for Infinitely Wide MLPs |
| 14:00-14:30 | James Juniper | ‘Unreasonable Effectiveness’ of Machine Learning in Both the Natural Sciences and the Social Sciences? |
| 14:30-15:00 | Vektor Dewanto | A Review on Average-reward Reinforcement Learning |
| 15:00-15:30 | Robert King | Sampling Behaviour of L-moment Estimators of the GPD Type Generalised Lambda Distribution |
- Kenneth Clarkson, IBM Research, USA
- Michael Houle, National Institute of Informatics, Japan
- Michael Mahoney, University of California, Berkeley, USA
- Kerrie Mengersen, Queensland University of Technology, Australia
- Deanna Needell, UCLA, USA (teleconference talk)
- Joshua Ross, University of Adelaide, Australia
- Kate Smith-Miles, University of Melbourne, Australia
- Peter Taylor, University of Melbourne, Australia
- Matt Wand, University of Technology Sydney, Australia
- David Woodruff, Carnegie Mellon University, USA
Before 31st October (early bird):
After 31st October:
Registration fees include attendance at all sessions, morning/afternoon teas, the Hunter Valley tour and the conference dinner. Prices are GST inclusive.
Higher-degree research students in mathematical sciences and Early Career Researchers (ECRs) at AMSI Member Institutions attending the workshop may apply to receive full or partial travel support through the AMSI Travel Fund. Applicants must, at the time of the event, satisfy one of the following:
For more information on the process and to apply, please visit the AMSI Travel Funding form.
Additional travel support may be available to workshop attendees through one of these schemes:
For those arriving at Newcastle Airport, we recommend taking a taxi to Newcastle. The taxi rank is adjacent to the arrivals area of the terminal. Newcastle Taxis can be contacted directly, free of charge, on the dedicated taxi phone located at the arrivals end of the terminal. Alternatively, you can catch the 130 or 131 bus from Newcastle Airport to Newcastle Station. From Newcastle Station it is an easy walk to the recommended hotels. For more information and/or to plan your exact trip times, see the Sydney Trains or the Port Stephens Coaches timetables.
For those arriving at Sydney Airport, we recommend taking the train. For information and trip planning, please visit www.transportnsw.info. The trip typically takes three hours, but the route is scenic. Take the T2 Airport Line from the Domestic or International Airport Station to Central Station, and then the Central Coast & Newcastle Line.
We have recently been advised that a company contacted several invited speakers regarding booking their hotel accommodation on behalf of the Data Science Down Under workshop. We would like to confirm that this company has no relation to our event or our university. Please do not book with them, as we do not know whether they are legitimate. If you have any inquiries about your booking, please contact Ms Juliane Turner directly.
There are two main areas with hotels and backpacker accommodation: Newcastle East and Honeysuckle, both within easy walking distance of each other and of the conference venue. Honeysuckle is a popular area with many restaurants and great views across the harbour. Additionally, the website wotif.com sometimes has good deals.
- Ali Eshragh (Chair), University of Newcastle, Australia
- Fred Roosta (Co-chair), University of Queensland, Australia
- Ricardo Campello, University of Newcastle, Australia
- Elizabeth Stojanovski, University of Newcastle, Australia
- Natalie Thamwattana, University of Newcastle, Australia
If you would like to present a talk or a poster, please submit your abstract through the following link:
Note that abstract submission is a separate process from registering to attend the conference.
The deadline for abstract submission is 31st October 2019 (extended from the end of August).