My research focuses on applying advanced computational and statistical techniques to optimally extract knowledge from astronomical data, in an effort to better understand the Universe in which we live. Specific application areas currently include:
1) Development and application of new statistical and machine learning techniques to large astrophysical data sets. Specific algorithms currently include random forests, random atlas, hierarchical Bayesian estimation, and deep neural networks, while specific application areas include source classification, image classification, and distance estimation.
2) Development and application of new cosmological measurement codes, in particular n-point clustering measurements, and quantification of the constraints these new measurements place on cosmological parameters and our understanding of the growth of large-scale structure.
3) Acceleration of learning and cosmological measurement codes by using new hardware technologies. Currently these efforts focus on the use of multi- and many-core systems, GPUs, and cloud-based systems.
4) Identification and characterization of transient and variable phenomena in large photometric and spectroscopic data sets.
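To make the distance-estimation task in area 1 concrete, the sketch below frames photometric redshift regression as supervised learning: predict a galaxy's redshift from its colours using a labelled training set. For self-containment it uses a toy k-nearest-neighbour average in place of the random-forest regressors mentioned above, and the single "colour" feature and synthetic redshifts are illustrative, not drawn from any real survey.

```python
import random
import statistics

def knn_photoz(train_colours, train_z, query_colour, k=3):
    """Estimate a redshift by averaging the redshifts of the k
    training galaxies whose colours are closest (squared Euclidean
    distance) to the query colour."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(c, query_colour)), z)
        for c, z in zip(train_colours, train_z)
    )
    return statistics.mean(z for _, z in dists[:k])

# Synthetic training set: redshift grows linearly with one "colour",
# plus a little scatter (purely illustrative data).
rng = random.Random(42)
colours = [(0.1 * i + rng.uniform(-0.02, 0.02),) for i in range(50)]
redshifts = [0.2 * c[0] for c in colours]

# Query a galaxy with colour 2.5; the true relation gives z = 0.5.
estimate = knn_photoz(colours, redshifts, (2.5,), k=5)
```

In practice one would swap the toy estimator for an ensemble method such as a random forest, which handles many correlated colour features and yields per-object uncertainty estimates.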
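The core of the n-point clustering measurements in area 2 is pair counting. As a minimal two-point sketch, the code below counts pairs closer than a separation r in a data catalogue and in a same-size random catalogue, then forms the natural estimator DD/RR - 1. A real analysis would bin in separation and use a variance-reducing estimator such as Landy-Szalay; the clumpy toy catalogue here is invented purely to show the mechanics.

```python
import math
import random

def pair_count(points, r):
    """Number of unordered point pairs separated by less than r."""
    n = len(points)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if math.dist(points[i], points[j]) < r
    )

def xi_natural(data, randoms, r):
    """Natural clustering estimator: data pairs over random pairs, minus one."""
    return pair_count(data, r) / pair_count(randoms, r) - 1.0

rng = random.Random(0)
# Toy "galaxy" catalogue: two tight clumps inside the unit square.
clump_a = [(0.2 + rng.gauss(0, 0.02), 0.2 + rng.gauss(0, 0.02)) for _ in range(25)]
clump_b = [(0.8 + rng.gauss(0, 0.02), 0.8 + rng.gauss(0, 0.02)) for _ in range(25)]
data = clump_a + clump_b
# Unclustered comparison catalogue: uniform random points.
randoms = [(rng.random(), rng.random()) for _ in range(50)]

xi_clustered = xi_natural(data, randoms, r=0.1)  # strongly positive
```

A positive value signals an excess of close pairs over a uniform distribution, which is exactly the signature of clustering that these measurement codes quantify at much larger scale.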
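The acceleration work in area 3 rests on the observation that pair counting decomposes naturally: split the outer loop over points into chunks, count partial pairs independently, and reduce the results. The sketch below uses Python threads only to stay self-contained and portable; the same decomposition maps onto multi-core processes, GPU thread blocks, or cloud workers.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def partial_pairs(points, r, start, stop):
    """Pairs (i, j) with start <= i < stop, i < j, and separation < r."""
    n = len(points)
    return sum(
        1
        for i in range(start, stop)
        for j in range(i + 1, n)
        if math.dist(points[i], points[j]) < r
    )

def parallel_pair_count(points, r, workers=4):
    """Split the outer loop into chunks, count in parallel, and reduce."""
    n = len(points)
    bounds = [(k * n // workers, (k + 1) * n // workers) for k in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(partial_pairs, points, r, a, b) for a, b in bounds]
        return sum(f.result() for f in futures)

rng = random.Random(1)
pts = [(rng.random(), rng.random()) for _ in range(120)]

serial = partial_pairs(pts, 0.1, 0, len(pts))
parallel = parallel_pair_count(pts, 0.1)  # identical count, computed in chunks
```

Because each chunk touches a disjoint slice of the outer loop, the partial counts sum exactly to the serial answer, which makes correctness easy to verify when porting the kernel to new hardware.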
These projects primarily use photometric and spectroscopic data from the Sloan Digital Sky Survey, the Baryon Oscillation Spectroscopic Survey, and the Dark Energy Survey, as well as several space-based surveys. Many of the techniques developed are also applicable to the design and operation of the forthcoming Large Synoptic Survey Telescope.