
I am trying to upload a table into MyDB using the Data Explorer tab. I get the following error:

"Unable to process upload. Verify that the csv headers are present and that they match the table schema (for existing tables)"

I followed this answer thread to troubleshoot the problem. The table is the LOFAR Value Added Catalog (link to the data release and README file). I made sure the file name, table name, and column headers are all lower case and use only allowed characters, as listed on the CSV file formatting requirements help page here. The total file size is ~4 GB, which is under the size limit described here. Another interesting thing to note: while the file is ~4 GB on my disk, the total size column during the upload displays it as 3.7 GB. To check whether data formatting was the problem, I uploaded a truncated subset of the file containing only the first two rows and the same header (2.4 kB); this gives the same error.

The table was converted from the original FITS file to CSV (format='ascii.csv') using astropy.table.
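One quick sanity check before uploading is to scan the exported CSV's header row for repeated column names, since these can make an upload fail even when each individual name is valid. A minimal sketch in plain Python (the file path is a placeholder):

```python
import csv
from collections import Counter

def duplicate_headers(csv_path):
    """Return column names that appear more than once in the CSV header."""
    with open(csv_path, newline="") as f:
        # Read only the first row (the header); no need to load 4 GB of data.
        header = next(csv.reader(f))
    return [name for name, count in Counter(header).items() if count > 1]

# Example (hypothetical path):
# duplicate_headers("lofar_vac.csv")
```

Running this against the header shown above would report the repeated names.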

EDIT: The header and first row are:

```

source_name,ra,dec,e_ra,e_dec,total_flux,e_total_flux,peak_flux,e_peak_flux,s_code,mosaic_id,maj,min,pa,e_maj,e_min,e_pa,dc_maj,dc_min,dc_pa,isl_rms,flag_workflow,id_flag,prefilter,postfilter,lr_fin,optra,optdec,composite_size,composite_width,composite_pa,assoc,id_qual,assoc_qual,blend_prob,other_prob,created,position_from,renamed_from,id_ra,id_dec,uid_l,unwise_objid,id_name,separation,legacy_id,hpx,release,brickid,objid,maskbits,fracflux_g,fracflux_r,fracflux_z,type,ra,dec,pstar,star,anymask_opt,gmmcomp,zphot,zphot_err,var.density,var.tr.noise,var.in.noise,flag_qual,zspec_sdss,zwarning_sdss,plate_sdss,mjd_sdss,fiberid_sdss,z_hetdex,z_hetdex_conf,hetdex_sourceid,z_desi,z_desi_err,desi_sourceid,2rxs_id,xmmsl2_id,resolved,las,z_best,z_source,size,l_144,field,legacy_coverage,lm_flux,lm_size,bad_lm_flux,bad_lm_image,las_from,mag_g,magerr_g,mag_r,magerr_r,mag_z,magerr_z,mag_w1,magerr_w1,mag_w2,magerr_w2,mag_w3,magerr_w3,mag_w4,magerr_w4,wise_src,mass_median,mass_l68,mass_u68,g_rest,r_rest,z_rest,u_rest,v_rest,j_rest,k_rest,w1_rest,w2_rest,r_50,r_50_err,flag_mass
ILTJ000000.03+195152.1,0.0001397891057877132,19.864494342561894,1.3216566180151654,0.916539605380532,1.2329283979607426,0.43208893097044526,0.4658497899333264,0.12294834612392941,S,P359+21,11.100479930452728,8.584910238521983,101.29414881778887,3.1449812567973736,2.110634486533086,49.723586831447754,9.339157273177563,6.139031440183402,101.29414881778908,0.1155058853328228,8,1,-99,0,0.17094795081591677,,,,,,0,,,,,Create initial sources,LR,--,,,--,--,--,,-1,208,,,,,,,,--,,,,--,--,--,,,,,,,,,,,,,,,,,,,,False,18.678314546355125,,,,,Fall,True,,,False,False,Gaussian,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,False

```

by ssabhlok (120 points)
edited by ssabhlok

1 Answer

0 votes
Hi, it looks like you have duplicate column names, "ra" and "dec". Once those are removed or renamed, the upload to MyDB should work. I tried your example with the duplicate columns renamed and it uploaded successfully. Let me know if it works for you or if you run into further issues!
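If the table is still in astropy, columns can be renamed on the Table before writing (astropy's Table.rename_column does this). As a library-agnostic sketch, duplicate names can also be deduplicated directly on the header list; the numeric-suffix scheme below is just one possible choice:

```python
def dedupe_headers(header):
    """Append a numeric suffix to repeated column names, e.g. ra -> ra, ra_2."""
    seen = {}
    deduped = []
    for name in header:
        seen[name] = seen.get(name, 0) + 1
        # First occurrence keeps its name; later ones get _2, _3, ...
        deduped.append(name if seen[name] == 1 else f"{name}_{seen[name]}")
    return deduped
```

This keeps the first occurrence of each name unchanged, so only the second "ra"/"dec" pair (the optical-counterpart coordinates) needs updating in any downstream queries.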
by ajacques (1.6k points)
