We are happy to share with you the latest developments at Astro Data Lab in this June 2024 newsletter!
In this newsletter
- New and retired datasets at Data Lab
- Improved downloads for async queries
- New Jupyter notebook server
- New Jupyter notebooks
- Updated disclaimers and acknowledgments
New and retired datasets at Data Lab
SkyMapper DR4
Data Lab has recently incorporated a new dataset for use by the community: SkyMapper DR4. The SkyMapper Southern Survey is a 6-band optical survey conducted with the Australian National University’s 1.3m SkyMapper Telescope at Siding Spring Observatory in Australia. This fourth SkyMapper Data Release covers over 22,000 square degrees from the South Celestial Pole to declinations of +16°, with some fields observed up to +28°. It contains about 724 million unique sources, built from 15 billion photometric data points measured on over 400,000 images acquired between March 2014 and September 2021. For more information about SkyMapper and DR4, please visit the SkyMapper landing page or consult the SkyMapper DR4 paper (Onken et al. 2024).
As always, the Data Lab team has already crossmatched the catalog above with our reference datasets: Gaia DR3 (for astrometry), AllWISE, NSC DR2, unWISE DR1 (for photometry), and SDSS DR17 (for spectroscopy), and vice versa. We have also added a few other useful columns such as nest4096, ring256, and htm9 for sky tessellation use cases. The pre-crossmatched tables are accessible in the schema browser, and through standard TAP/SQL/ADQL queries, like all other catalogs at Data Lab.
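For a quick start, a SkyMapper DR4 query through the astro-datalab query client could look like the sketch below; the schema, table, and column names are our assumptions based on earlier SkyMapper releases, so please verify them in the schema browser before running.

```python
# A minimal sketch, assuming the astro-datalab package is installed.
# Table and column names are illustrative -- confirm them in the schema browser.
from dl import queryClient as qc

sql = """
SELECT object_id, raj2000, dej2000, g_psf, r_psf
FROM skymapper_dr4.master
LIMIT 10
"""
df = qc.query(sql=sql, fmt='pandas')   # returns the result as a pandas DataFrame
print(df)
```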
The Data Lab team periodically evaluates which external survey datasets we should source, ingest, and serve, and we appreciate requests and suggestions from our users. Please contact us at datalab@noirlab.edu with your request and, if possible, an example scientific use case.
Retired datasets
SkyMapper DR1 and DR2 have been retired from Astro Data Lab. SkyMapper DR4 supersedes both of these releases.
Improved downloads for async queries
The Data Lab command-line package has recently been updated with a pycurl-based download mechanism for async queries. After updating their installed version of the ‘astro-datalab’ package, users should see much improved download stability and speed in their workflows (multiple Mbyte/s of sustained speed, if the network allows it). The same benefits are now also deployed on the Data Lab Jupyter notebook server. Users can update their local version of Data Lab by following the installation instructions.
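As an illustration, an async query with the updated client might look like the sketch below; the table name is only a placeholder, and the job-status strings follow common query-client usage rather than an exhaustive list.

```python
# A sketch of an asynchronous query that benefits from the new pycurl-based
# download path; the table name is a placeholder for any catalog at Data Lab.
import time
from dl import queryClient as qc

sql = "SELECT ra, dec FROM smash_dr2.object LIMIT 100000"  # placeholder table
jobid = qc.query(sql=sql, async_=True)        # submit the job, get back a job ID

# Poll until the job finishes.
while qc.status(jobid) not in ('COMPLETED', 'ERROR'):
    time.sleep(5)

if qc.status(jobid) == 'COMPLETED':
    csv_text = qc.results(jobid)              # download uses the improved mechanism
```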
New Jupyter notebook server
Data Lab has upgraded to new notebook server hardware, and users should see faster performance. The notebook server address is unchanged: https://datalab.noirlab.edu/devbooks. The operating system and the Python stack have also been significantly updated: the default Python 3 kernel now runs v3.10, and the other custom kernels were updated accordingly. Please note that some previously available packages may not be installed on the new server; if you are missing something important, please contact us at the Helpdesk. Also note that we are aware that a few notebooks in the Data Lab notebook collection currently don’t execute correctly on the new server, and we are working on fixing them.
New Jupyter notebooks
Several new notebooks were recently added to Data Lab’s extensive collection of notebooks for our user community:
Accessing gravitational wave events using ANTARES
Authors: ANTARES Team
This notebook is an example of how to access gravitational wave notices from GCN using the ANTARES devkit and client. ANTARES receives alerts from surveys in real-time and sends them through a processing pipeline. The pipeline contains the following stages:
1. Associate the alert with the nearest point of known past measurements within a 1” radius. We call this a Locus.
2. Discard alerts with a high probability of being false detections.
3. Discard alerts with poor image quality.
4. Associate gravitational wave events from GCN with the Locus.
5. Look up associated objects in our catalogs.
6. Update watch lists.
7. Execute filters on the Locus.
8. Send the Locus to user Kafka topics.
Filters are Python functions that take a Locus object as a single parameter. Functions on the Locus provide access to the alert's properties, the data from past alerts on the Locus, gravitational wave events on the Locus, and the associated catalog objects. This information can be used to characterize or classify the Locus. The Locus also provides functions to set new properties on the Alert and Locus objects, and to send the Locus to a specific Kafka stream.
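As a rough illustration of this pattern (not the notebook's actual code), a filter could look like the sketch below; the Locus attribute and method names are assumptions for illustration only, so please consult the notebook and the ANTARES devkit documentation for the real interface.

```python
# A hedged sketch of the filter pattern: one function, one Locus parameter.
# The names used here (locus.alert.properties, locus.catalog_objects,
# locus.tag) are assumptions for illustration, not the definitive devkit API.
def bright_crossmatched_filter(locus):
    # Read a property of the newest alert on this Locus (property name is illustrative).
    mag = locus.alert.properties.get('ztf_magpsf')

    # Keep bright alerts that were associated with at least one catalog object.
    if mag is not None and mag < 17.0 and locus.catalog_objects:
        locus.tag('bright_crossmatched')  # assumed tagging hook for downstream streams
```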
SPARCL + Jdaviz
Authors: Camilla Pacifici, Brett Morris, Benjamin Weaver, Alice Jacques
This notebook demonstrates how to find and retrieve spectroscopic data for certain objects from the DESI EDR dataset using SPARCL (SPectra Analysis and Retrievable Catalog Lab) and display an interactive plot of an object's spectrum using Jdaviz.
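In rough outline (not the notebook's exact code), the workflow looks like the sketch below, assuming the sparclclient, specutils, astropy, and jdaviz packages; the constraint keys, retrieved fields, and flux unit are illustrative and may differ from the notebook.

```python
# A minimal sketch: find a DESI EDR spectrum with SPARCL, then plot it in Jdaviz.
# Constraint keys, field names, and the flux unit below are assumptions.
import astropy.units as u
from specutils import Spectrum1D
from sparcl.client import SparclClient
from jdaviz import Specviz

client = SparclClient()

# Find one DESI EDR record (constraint names are illustrative).
found = client.find(outfields=['sparcl_id'],
                    constraints={'data_release': ['DESI-EDR']},
                    limit=1)

# Retrieve its flux and wavelength arrays.
results = client.retrieve(uuid_list=found.ids,
                          include=['sparcl_id', 'flux', 'wavelength'])
rec = results.records[0]

# Wrap the arrays in a Spectrum1D and display interactively in Jdaviz.
spec = Spectrum1D(spectral_axis=rec.wavelength * u.AA,
                  flux=rec.flux * u.Unit('1e-17 erg / (Angstrom cm2 s)'))
viz = Specviz()
viz.load_data(spec, data_label='DESI EDR spectrum')
viz.show()
```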
Data reduction with DRAGONS
Authors: Brian Merino, Vinicius Placco
A set of seven Jupyter Notebook examples of data reduction for Gemini Observatory instruments has been added. Normally you would need DRAGONS installed on your own computer to run these notebooks, but Astro Data Lab provides a custom kernel called DRAGONS (Py3.7) that lets you run these data reduction notebooks directly on our server. These notebooks were written using the DRAGONS Application Programming Interface (API) for Python, based on the examples provided in the DRAGONS Documentation.
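For orientation, the core DRAGONS API pattern these notebooks build on looks roughly like the sketch below; the FITS file list is a placeholder, and the calibration-database setup that the notebooks also walk through is omitted here.

```python
# A minimal sketch of the DRAGONS Python API pattern used in these notebooks;
# the raw-frame list is a placeholder, and calibration setup is left out.
import glob
from recipe_system.reduction.coreReduce import Reduce

reduce = Reduce()
reduce.files.extend(sorted(glob.glob('raw/*.fits')))  # placeholder raw frames
reduce.runr()                                         # run the default recipe
```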
Updated disclaimers and acknowledgments
We would like to draw your attention to the updated Disclaimers and required Acknowledgments for Data Lab users. They can be found on our website, and all notebooks in the Data Lab notebook collection are being updated accordingly.
Contact Us
You can visit our website, use the Helpdesk, reach us via email, and follow us on Twitter/X.
Currently registered and active users have been subscribed to this Newsletter mailing list. To unsubscribe, send an empty email to datalab-newsletter-unsubscribe@mailman.tuc.noirlab.edu and follow the instructions in the verification email to confirm.