Automatically make files for comprehensive offline gracedb uploads #4438
Conversation
Force-pushed from 802e738 to 9e67d82.
Force-pushed from 1702923 to 679252d.
I'm struggling to get the xmlall file transferred into the minifollowup at the moment, but I have added skymap creation and plotting. For using BAYESTAR in a workflow, I've suggested allowing a set output filename: https://git.ligo.org/lscsoft/ligo.skymap/-/merge_requests/331
Force-pushed from 679252d to 2912a20.
Now trying a wrapper-script workaround, as BAYESTAR doesn't look like it will be able to output individual filenames any time soon. The search will fail because we don't include ligo-make-skymap or bayestar-localize-coincs in PyCBC; I will remove those changes from the config files before merging.
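A minimal sketch of what such a wrapper could look like, assuming bayestar-localize-coincs names its output FITS file itself inside the directory passed via `-o`; the option names and error handling below are illustrative, not necessarily what this PR adds:

```python
#!/usr/bin/env python
"""Hypothetical wrapper: run bayestar-localize-coincs in a scratch directory,
then copy the single FITS file it produces to a caller-chosen filename."""
import argparse
import glob
import shutil
import subprocess
import tempfile

parser = argparse.ArgumentParser()
parser.add_argument('--event-xml', required=True,
                    help='coinc XML file to localize')
parser.add_argument('--output-file', required=True,
                    help='fixed filename for the resulting FITS skymap')
args, extra_bayestar_args = parser.parse_known_args()

with tempfile.TemporaryDirectory() as tmpdir:
    # BAYESTAR chooses its own output name inside the -o directory, so run it
    # in a temporary directory and copy whatever it writes there.
    subprocess.check_call(['bayestar-localize-coincs', args.event_xml,
                           '-o', tmpdir] + extra_bayestar_args)
    fits_files = glob.glob(tmpdir + '/*.fits')
    if len(fits_files) != 1:
        raise RuntimeError('expected exactly one FITS file from BAYESTAR, '
                           'got {}'.format(len(fits_files)))
    shutil.copy(fits_files[0], args.output_file)
```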
There is an error with … There are also a couple of minor changes I plan to make to the single-event-upload script, but I think this is ready to review otherwise.
Force-pushed from c39b0be to 860ff10.
Added a few comments from myself for the description and to fix a couple of things.
```diff
-        return merge_psds(workflow, psd_files, ifo, out_dir, tags=tags)
-    else:
-        return psd_files[0]
+    return merge_psds(workflow, psd_files, ifo, out_dir, tags=tags)
```
Always use merged psds, even if there is only one calculate_psd job
Example upload:
poke @titodalcanton
Force-pushed from 2c3c0e0 to 71f02ed ("…eseries correspond to event").
```python
for sngl in sngl_inspiral_table:
    if sngl.event_id not in sngl_ids:
        continue
    sngl_inspiral_table_curr.append(sngl)
```
Are we including "subthreshold" detectors here? E.g. what would the single inspiral table look like for an L1-only trigger with observing-mode H1?
We are not including subthreshold detectors here; this is inherited from the pycbc_page_foreground output, so it does not have that information added. I'm not sure whether we'd want that or not, though.
I found one from the GitHub example run, and have uploaded it to https://gracedb-playground.ligo.org/events/G1268054/view/
Ok, I suspect we do need the single inspiral entries to be present in order for BAYESTAR to use the subthreshold detectors, but let's keep things as they are for now. We can fix that with a later PR if needed.
```diff
@@ -0,0 +1,185 @@
+#!/usr/bin/env python
```
I should probably say that this code would be run after the workflow has finished, and the box is opened.
We would use something like:
```sh
pycbc_upload_single_event_to_gracedb \
    --xml-file-for-upload \
        $upload_data_dir/H1L1V1-GENERATE_XML_FULL_DATA_UPLOAD_3-1186740100-3400.xml \
    --snr-timeseries-plot \
        $upload_data_dir/H1L1V1-GENERATE_XML_FULL_DATA_UPLOAD_3_SNR-1186740100-3400.png \
    --asd-plot \
        $upload_data_dir/H1L1V1-GENERATE_XML_FULL_DATA_UPLOAD_3_PSD-1186740100-3400.png \
    --skymap-plot \
        $upload_data_dir/H1L1V1-SKYMAP_PLOT_FULL_DATA_UPLOAD_3-1186740100-3400.png \
    --skymap-fits-file \
        $upload_data_dir/H1L1V1-BAYESTAR_FULL_DATA_UPLOAD_3-1186740100-3400.fits \
    --labels \
        GRACEDB_LABEL
```
with the upload number (3 here) being looped over and the timestamp updated as appropriate.
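For concreteness, a sketch of that loop driven from Python; the output directory, the number of uploads, and the glob patterns are assumptions based on the filenames in the example above:

```python
import glob
import subprocess

upload_data_dir = 'upload_files'  # hypothetical location of the workflow outputs
n_uploads = 3                     # hypothetical number of events to upload


def find_one(stem, ext):
    """Locate a single workflow output, absorbing the GPS time/duration part."""
    matches = glob.glob('{}/H1L1V1-{}-*.{}'.format(upload_data_dir, stem, ext))
    assert len(matches) == 1, 'expected exactly one match for ' + stem
    return matches[0]


for n in range(1, n_uploads + 1):
    subprocess.check_call([
        'pycbc_upload_single_event_to_gracedb',
        '--xml-file-for-upload',
        find_one('GENERATE_XML_FULL_DATA_UPLOAD_{}'.format(n), 'xml'),
        '--snr-timeseries-plot',
        find_one('GENERATE_XML_FULL_DATA_UPLOAD_{}_SNR'.format(n), 'png'),
        '--asd-plot',
        find_one('GENERATE_XML_FULL_DATA_UPLOAD_{}_PSD'.format(n), 'png'),
        '--skymap-plot',
        find_one('SKYMAP_PLOT_FULL_DATA_UPLOAD_{}'.format(n), 'png'),
        '--skymap-fits-file',
        find_one('BAYESTAR_FULL_DATA_UPLOAD_{}'.format(n), 'fits'),
        '--labels', 'GRACEDB_LABEL',
    ])
```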
```python
logging.basicConfig(format='%(asctime)s:%(levelname)s : %(message)s',
                    level=logging.INFO)
```
Use PyCBC's `logging_init()` function here.
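For reference, a minimal sketch assuming the helper meant here is pycbc.init_logging (the logging initializer in the pycbc package):

```python
# Sketch: replace the explicit logging.basicConfig call with PyCBC's helper.
import pycbc

pycbc.init_logging(True)  # True requests informational output
```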
```python
labels = [l.upper() for l in args.labels]
allowed_labels = gracedb.allowed_labels

if not all([l in allowed_labels for l in labels]):
```
Suggested change:
```diff
-if not all([l in allowed_labels for l in labels]):
+if set(labels) - set(allowed_labels):
```
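As a usage note, the set-difference form also makes it easy to report exactly which labels were rejected; a sketch with placeholder label lists (the real script gets these from the command line and from gracedb.allowed_labels):

```python
# Placeholder inputs standing in for the script's `labels` / `allowed_labels`.
labels = ['PASTRO_READY', 'NOT_A_REAL_LABEL']
allowed_labels = ['PASTRO_READY', 'EMBRIGHT_READY', 'SKYMAP_READY']

bad_labels = set(labels) - set(allowed_labels)
if bad_labels:
    raise RuntimeError(
        'Labels not allowed by GraceDB: {}'.format(', '.join(sorted(bad_labels)))
    )
```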
comment = "PyCBC ASD estimate from the time of event" | ||
tag = "psd" | ||
displayname='ASDs' | ||
upload_file(args.asd_plot, displayname, comment, tag) |
The use of variables here is a bit verbose, I suggest this:
comment = "PyCBC ASD estimate from the time of event" | |
tag = "psd" | |
displayname='ASDs' | |
upload_file(args.asd_plot, displayname, comment, tag) | |
upload_file( | |
args.asd_plot, | |
displayname='ASDs', | |
comment='PyCBC ASD estimate from the time of event', | |
tag='psd' | |
) |
Same comment for the cases below of course.
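For instance, applied to the skymap plot it might read as follows; the display name, comment, and tag strings are illustrative rather than taken from the PR:

```python
# Same keyword-argument style for one of the other uploads in the script.
upload_file(
    args.skymap_plot,
    displayname='Skymap',
    comment='PyCBC/BAYESTAR sky localization plot',
    tag='sky_loc'
)
```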
```python
ifo_list = f.attrs['ifos'].split(' ')
for ifo in ifo_list:
    times[ifo] = f['{}/{}/time'.format(file_val,ifo)][:][event_idx]
    tids[ifo] = f['{}/{}/trigger_id'.format(file_val, ifo)][:][event_idx]
bank_ids = f['{}/template_id'.format(file_val)][:][event_idx]
```
Suggested change:
```diff
-bank_ids = f['{}/template_id'.format(file_val)][:][event_idx]
+bank_ids = f[file_val]['template_id'][:][event_idx]
```
though you can remove quite a bit of verbosity with an auxiliary variable `f_cat = f[file_val]` around line 102.
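Putting the two suggestions together, that section might reduce to something like this sketch (names as in the diff; the surrounding code is assumed):

```python
# Index the event-category group once and reuse it.
f_cat = f[file_val]
times = {}
tids = {}
for ifo in f.attrs['ifos'].split(' '):
    times[ifo] = f_cat[ifo]['time'][:][event_idx]
    tids[ifo] = f_cat[ifo]['trigger_id'][:][event_idx]
bank_ids = f_cat['template_id'][:][event_idx]
```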
```python
    tids
)

sngl_tmplt_files, sngl_tmplt_plots = mini.make_single_template_plots(
```
Is the variable `sngl_tmplt_files` actually used below? If not, turn it into an underscore.
Updated the PR name as the scope has widened somewhat.
In order to avoid manually making skymaps for the data release (as in O3), we want to include files in the offline uploads as standard. This includes:
- an HDF file being uploaded, alongside the PSD (as is done for live)

NB: edits to this description are in italic text.