Science Overview
Within JSP, we define, build, and deploy tools that combine space-based (~0.1-0.2") and ground-based (>0.7") data at the pixel level and across wavelengths (near-infrared and optical). The result is multi-wavelength images and photometric catalogs of the highest precision, whose accuracy across projects will be comparable to that within a single project. Such precision, at the sensitivities the projects achieve, enables a range of new science in Cosmology, Reionization and Galaxy Evolution, Microlensing and the Search for Planets and Stellar-Mass Black Holes, and Stellar Populations and their Proper Motions.
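As an illustration of what pixel-level combination involves, the sketch below PSF-matches a sharper space-based image to ground-based seeing and resamples it onto the ground-based pixel grid. The file names, FWHM values, Gaussian PSF approximation, and the use of the `astropy` and `reproject` packages are assumptions for illustration, not a description of the JSP pipeline.

```python
# Minimal sketch: PSF-match a space-based image to ground-based seeing and
# resample it onto the ground-based pixel grid so the two images can be
# compared or combined pixel by pixel.  All file names and numbers are
# illustrative assumptions.
import numpy as np
from astropy.io import fits
from astropy.wcs import WCS
from astropy.convolution import Gaussian2DKernel, convolve_fft
from reproject import reproject_interp

space_hdu = fits.open("space_nir.fits")[0]    # ~0.1-0.2" resolution (assumed file)
ground_hdu = fits.open("ground_opt.fits")[0]  # >0.7" resolution (assumed file)

# Broaden the space-based PSF to match the ground-based seeing,
# approximating both PSFs as Gaussians.
pix_scale = 0.1                      # arcsec/pixel of the space-based image (assumed)
fwhm_space, fwhm_ground = 0.15, 0.8  # arcsec (assumed)
sigma_match = np.sqrt(fwhm_ground**2 - fwhm_space**2) / 2.355 / pix_scale
matched = convolve_fft(space_hdu.data, Gaussian2DKernel(sigma_match))

# Resample the PSF-matched space-based image onto the ground-based grid.
resampled, footprint = reproject_interp(
    (matched, WCS(space_hdu.header)),
    WCS(ground_hdu.header),
    shape_out=ground_hdu.data.shape,
)
```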
JSP focuses on the following science cases:
- Precision dark energy studies by minimizing weak-lensing systematics and substantially improving photometric redshifts
- Precision Hubble constant measurements through strong lensing time delays
- The search for the most distant galaxies and a characterization of their physical properties through high-fidelity multi-wavelength catalogs
- Stellar streams and the dynamics of nearby galaxies through milli- to micro-arcsecond stellar motions, reaching up to 6 magnitudes fainter than Gaia
- Scene modeling and deconfusion to enable the detection of exoplanets through microlensing and of faint, high-z supernovae (see the sketch following this list)
- Orbit derivation and composition of Solar System objects
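The sketch below illustrates the basic idea behind scene modeling and deconfusion: source positions from a high-resolution catalog fix where the sources are, and a linear least-squares fit to the blended, lower-resolution image solves for their fluxes. The Gaussian PSF, the toy image, and all numerical values are illustrative assumptions rather than the JSP algorithm.

```python
# Minimal sketch of deconfusion by scene modeling: positions from a
# high-resolution (space-based) catalog define the source templates, and a
# linear least-squares fit to the low-resolution (ground-based) image
# recovers the blended fluxes.  Values and the Gaussian PSF are assumptions.
import numpy as np

def psf_model(shape, x0, y0, fwhm_pix):
    """Unit-flux circular Gaussian PSF centred at (x0, y0)."""
    sigma = fwhm_pix / 2.355
    y, x = np.mgrid[: shape[0], : shape[1]]
    g = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma**2))
    return g / g.sum()

# Source positions (pixels in the low-res frame) taken from the
# high-resolution catalog; two sources blended in ground-based seeing.
positions = [(24.0, 25.0), (28.0, 25.0)]
shape, fwhm_pix = (50, 50), 4.0

# Simulate a blended low-resolution image with known fluxes plus noise.
rng = np.random.default_rng(0)
true_fluxes = np.array([100.0, 40.0])
templates = np.array([psf_model(shape, x, y, fwhm_pix).ravel() for x, y in positions])
image = templates.T @ true_fluxes + rng.normal(0, 0.05, shape[0] * shape[1])

# Solve image ≈ templates.T @ fluxes for the per-source fluxes.
fluxes, *_ = np.linalg.lstsq(templates.T, image, rcond=None)
print(fluxes)  # recovers ~[100, 40] despite the blending
```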
In particular, given the timelines of the projects, Euclid-Rubin joint processing provides an excellent opportunity to refine techniques and identify systematics in preparation for Euclid-Rubin-Roman joint processing. Some of the early prototyping work has been on Subaru/HSC data and Hubble/ACS data (e.g. Faisst et al. 2021), since they are well matched to the resolution and depth of the forthcoming datasets.
On the technical side, with 100 petabytes of reduced data to handle from three different projects, and with a billion CPU hours of computing needed to obtain high-accuracy solutions, JSP is developing scalable software within science platforms and leveraging high-performance networking to enable distributed co-processing of petabyte-scale datasets.
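As a rough illustration of the kind of chunked, distributed computation this implies, the sketch below uses Dask to treat an image mosaic far larger than memory as a lazily evaluated, chunked array. Dask itself, the array sizes, and the operations shown are assumptions for illustration and do not describe the JSP software stack.

```python
# Minimal sketch of chunked, distributed processing of a large mosaic.
# Dask, the sizes, and the toy operations are illustrative assumptions.
import dask.array as da
from dask.distributed import Client

client = Client()  # connect to a local (or remote) Dask cluster

# A mosaic larger than memory, held as a lazily evaluated chunked array;
# each 4k x 4k chunk can be processed on a different worker.
mosaic = da.random.normal(size=(40_000, 40_000), chunks=(4_096, 4_096))

# A per-pixel operation (e.g. background subtraction) composed with a
# reduction; nothing runs until .compute() is called.
background = mosaic.mean()
cleaned_rms = ((mosaic - background) ** 2).mean() ** 0.5
print(cleaned_rms.compute())
```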