
I'm trying to perform a cross-match between an already existing survey and this one. For the time being, I just want to be able to download FITS images of certain objects. Working through the Jupyter notebook tutorials, I was able to write a short script using a dataframe from the cross-match service on ls_dr7.tractor; the matches are accurate to 0.01 degree for most objects.

Issues start to arise when I try to use svc.search:

from pyvo.dal import sia

DEF_ACCESS_URL = "http://datalab.noao.edu/sia/ls_dr7"
svc = sia.SIAService(DEF_ACCESS_URL)

imgTable = svc.search((ra, dec), verbosity=2).votable.to_table()

This results in a table where the RA/Dec values are almost 0.1 degree off from the position I gave, which confuses me because I originally cross-matched objects in that same survey. Following that, I copied the function download_deepest_image from the notebook examples, and I get an array in which the counts are very low, and the exposure time in the tables for stacked images is listed as 0. For example, one row of my data looks like this:

[-3.5462887e-03 -1.8389812e-02 -1.1744213e-02 ... -1.9076301e-02
  -6.1717429e-03 -2.6141177e-03]

This is the command I used to download the images:

image = download_deepest_image(ra,dec,fov=0.1,band='z DECam SDSS c0004 9260.0 1520.0')

This causes my imshow plot to come out with very low values (~1e-4). So my questions are:

1) Does my svc access URL query the right schema? I tried ls_dr7.tractor, since that is what I originally cross-matched on, but it claims the URL doesn't exist.

2) Is there another way I could extract FITS data for the cross-matched objects? For reference, I ultimately want to download grz-band FITS files for over 2000 objects from this service, so the image cutout service isn't viable: I would like to both name these files AND not have to click "Download FITS" for every object.

by derrcarr (220 points)

2 Answers

Hi,

Could you please send a bit more detailed information to datalab@noao.edu? Let's go step by step: from the cross-matching, to calling the SIA service for LS DR7, to using the download_deepest_image() function (the value you pass to the 'band' argument looks very suspicious...).

Thank you,
Robert
by mrniceguy (180 points)

Hi Derrick,

Thanks for the extra info. I don't think there is actually a bug here, but there are some confusing points.

First, on the subject of the RA and Dec in the returned table: these are the coordinates of the center of the tile that contains the postage stamp you want. By default, the SIA service queries the image metadata table for any image that overlaps the position and field of view you gave, and returns the corresponding rows. Each row reports the parent image coordinates. It also contains an entry in the access_url column with the URL needed to generate the cutout. So this notebook cell:

import numpy as np

fov = 0.1
ra = 7.139569
dec = -1.180077
imgTable = svc.search((ra, dec), (fov/np.cos(dec*np.pi/180), fov), verbosity=2).votable.to_table()

generates a table that contains e.g. this access_url:

http://datalab.noao.edu/svc/cutout?col=ls_dr7&siaRef=legacysurvey-0071m012-image-z.fits.fz&extn=1&POS=7.139569,-1.180077&SIZE=0.1000212139822063,0.1

where your RA, Dec, and FOV parameters are in the arguments of the cutout URL.
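To see how those arguments fit together, here is a minimal sketch that assembles a cutout URL of the same shape from its pieces. The parameter names (col, siaRef, extn, POS, SIZE) are taken from the example URL above; cutout_url itself is a hypothetical helper, not part of the Data Lab client.

```python
from urllib.parse import urlencode

def cutout_url(ra, dec, fov, sia_ref, extn=1, col='ls_dr7'):
    """Assemble a Data Lab cutout URL from its query parameters."""
    base = 'http://datalab.noao.edu/svc/cutout'
    params = urlencode({'col': col,
                        'siaRef': sia_ref,
                        'extn': extn,
                        'POS': f'{ra},{dec}',
                        'SIZE': f'{fov},{fov}'},
                       safe=',')  # keep the literal commas in POS/SIZE
    return f'{base}?{params}'

print(cutout_url(7.139569, -1.180077, 0.1,
                 'legacysurvey-0071m012-image-z.fits.fz'))
```

In practice the access_url column already gives you this string, so the helper is only useful if you want to vary the position or size without re-running the SIA query.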

To download the z-band image from this table, I did:

sel = (imgTable['prodtype'] == 'image'.encode()) & (np.char.startswith(imgTable['obs_bandpass'].data.data.astype('str'),'z'))

to select rows with 'image' as the prodtype and 'z' at the start of the filter name, and then:

Table = imgTable[sel] 
row = Table[np.argmax(Table['exptime'].data.data.astype('float'))] 
url = row['access_url'] 
img = io.fits.getdata(utils.data.download_file(url.decode(),cache=True,show_progress=False,timeout=120)) 
hdr = io.fits.getheader(utils.data.download_file(url.decode(),cache=True,show_progress=False,timeout=120)) 

to get the image and header for that image.

If you look at the header, you'll notice that there is no EXPTIME keyword, because for these image stacks that quantity is not well-defined (so the table schema records it as zero). Instead, the units of the image are recorded in the BUNIT keyword as nanomaggies, hence what appear to be low counts.
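If you want the pixel values on a more familiar scale: by the standard nanomaggy convention (used by the Legacy Surveys and SDSS), a source with flux 1 nanomaggy has AB magnitude 22.5, so:

```python
import numpy as np

def nmgy_to_abmag(flux_nmgy):
    """Convert flux in nanomaggies to AB magnitude (1 nmgy = AB mag 22.5)."""
    flux = np.asarray(flux_nmgy, dtype=float)
    return 22.5 - 2.5 * np.log10(flux)

print(nmgy_to_abmag(1.0))    # -> 22.5
print(nmgy_to_abmag(100.0))  # -> 17.5
```

This is why sky pixels in the cutouts sit around ~1e-3 to 1e-2 rather than the thousands of counts you might expect from a raw exposure.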

Anyway, to your specific questions:

1.  Yes, you have the correct service URL.

2.  As you can tell from the table, the cutout is made available as a URL with parameters for the RA, Dec, and FOV that you provide. You could, for example, fetch a list of these URLs with wget or curl, or download them through your notebook and store them in your VOSpace.

Hope this helps...

Knut Olsen

by kolsen (2.7k points)
