Merge branch 'gwastro:master' into temp_bin_infra
ArthurTolley authored Mar 19, 2024
2 parents c67921d + 6040a0c commit 6a06a4b
Showing 165 changed files with 2,220 additions and 1,155 deletions.
52 changes: 52 additions & 0 deletions .github/PULL_REQUEST_TEMPLATE/standard_template.md
@@ -0,0 +1,52 @@
<!---
Please add a title which is a concise description of what you are doing,
e.g. 'Fix bug with numpy import in pycbc_coinc_findtrigs' or 'add high frequency sky location dependent response for long detectors'
--->

<!---
This is a brief template for making pull requests for PyCBC.
This is _not_ a prescriptive template - you can use a different style if you want.
Please do think about the questions posed here and whether the details will be useful to include in your PR.
Please add sufficient details so that people looking back at the request with no context around the work
can understand the changes.
To choose reviewers, please look at the git blame for the code you are changing (if applicable),
or discuss in the gwastro slack.
Please add labels as appropriate
-->

- [ ] The author of this pull request confirms they will adhere to the [code of conduct](https://github.com/gwastro/pycbc/blob/master/CODE_OF_CONDUCT.md)

## Standard information about the request

<!--- Some basic info about the change (delete as appropriate) --->
This is a: bug fix, new feature, efficiency update, other (please describe)

<!--- What codes will this affect? (delete as appropriate)
If you do not know which areas will be affected, please ask in the gwastro #pycbc-code slack
--->
This change affects: the offline search, the live search, inference, PyGRB

<!--- What code areas will this affect? (delete as appropriate) --->
This change modifies: documentation, result presentation / plotting, scientific output

<!--- Some things which help with code management (delete as appropriate) --->
This change: has appropriate unit tests, follows style guidelines (See e.g. [PEP8](https://peps.python.org/pep-0008/)), has been proposed using the [contribution guidelines](https://github.com/gwastro/pycbc/blob/master/CONTRIBUTING.md)

<!--- Notes about the effect of this change --->
This change will: break current functionality, require additional dependencies, require a new release, other (please describe)

## Motivation
<!--- Describe why your changes are being made -->

## Contents
<!--- Describe your changes, this doesn't need to be a line-by-line code change discussion,
but rather a general discussion of the methods chosen -->

## Links to any issues or associated PRs
<!--- If this is fixing / working around an already-reported issue, please link to it here --->

## Testing performed
<!--- Describe tests for the code changes, either already performed or to be performed -->

## Additional notes
<!--- Anything which does not fit in the above sections -->
7 changes: 3 additions & 4 deletions .github/workflows/basic-tests.yml
@@ -51,7 +51,7 @@ jobs:
tox -e py-inference
- name: store documentation page
if: matrix.test-type == 'docs' && matrix.python-version == '3.8'
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: documentation-page
path: _gh-pages
@@ -61,10 +61,9 @@ jobs:
if: github.ref == 'refs/heads/master' && github.event_name == 'push'
steps:
- name: retrieve built documentation
uses: actions/download-artifact@v4
uses: actions/download-artifact@v2
with:
pattern: documentation-page-*
merge-multiple: true
name: documentation-page
- name: debug
run: |
mkdir _gh-pages
10 changes: 4 additions & 6 deletions .github/workflows/distribution.yml
@@ -29,9 +29,8 @@ jobs:
CIBW_BUILD: cp38-* cp39-* cp310-* cp311-*
CIBW_SKIP: "*musllinux*"
CIBW_ARCHS_MACOS: x86_64 arm64
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v2
with:
name: wheels-${{ matrix.os }}
path: ./wheelhouse/*.whl
deploy_pypi:
name: Build and publish Python 🐍 distributions 📦 to PyPI
@@ -45,14 +44,13 @@
uses: actions/setup-python@v4
with:
python-version: 3.8
- uses: actions/download-artifact@v4
- uses: actions/download-artifact@v2
with:
pattern: wheels-*
merge-multiple: true
path: ./
- name: build pycbc for pypi
run: |
python setup.py sdist
mv *.whl dist/
mv artifact/* dist/
- name: Publish distribution 📦 to PyPI
if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
uses: pypa/gh-action-pypi-publish@master
6 changes: 3 additions & 3 deletions .github/workflows/inference-workflow.yml
@@ -25,7 +25,7 @@ jobs:
wget -qO - https://download.pegasus.isi.edu/pegasus/gpg.txt | sudo apt-key add -
echo "deb https://download.pegasus.isi.edu/pegasus/ubuntu bionic main" | sudo tee -a /etc/apt/sources.list
sudo apt-get -o Acquire::Retries=3 update
sudo apt-get -o Acquire::Retries=3 install pegasus
sudo apt-get -o Acquire::Retries=3 install pegasus=5.0.6-1+ubuntu18
- run: sudo apt-get -o Acquire::Retries=3 install *fftw3* intel-mkl*
- name: Install pycbc
run: |
@@ -48,12 +48,12 @@ jobs:
find submitdir/work/ -type f -name '*.tar.gz' -delete
- name: store log files
if: always()
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: logs
path: gw_output/submitdir/work
- name: store result page
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: results
path: html
6 changes: 3 additions & 3 deletions .github/workflows/search-workflow.yml
@@ -30,7 +30,7 @@ jobs:
wget -qO - https://download.pegasus.isi.edu/pegasus/gpg.txt | sudo apt-key add -
echo "deb https://download.pegasus.isi.edu/pegasus/ubuntu bionic main" | sudo tee -a /etc/apt/sources.list
sudo apt-get -o Acquire::Retries=3 update
sudo apt-get -o Acquire::Retries=3 install pegasus
sudo apt-get -o Acquire::Retries=3 install pegasus=5.0.6-1+ubuntu18
- run: sudo apt-get -o Acquire::Retries=3 install *fftw3* intel-mkl*
- name: Install pycbc
run: |
@@ -55,12 +55,12 @@ jobs:
find submitdir/work/ -type f -name '*.tar.gz' -delete
- name: store log files
if: always()
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: logs
path: output/submitdir/work
- name: store result page
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: results
path: html
4 changes: 2 additions & 2 deletions .github/workflows/tmpltbank-workflow.yml
@@ -29,7 +29,7 @@ jobs:
wget -qO - https://download.pegasus.isi.edu/pegasus/gpg.txt | sudo apt-key add -
echo "deb https://download.pegasus.isi.edu/pegasus/ubuntu bionic main" | sudo tee -a /etc/apt/sources.list
sudo apt-get -o Acquire::Retries=3 update
sudo apt-get -o Acquire::Retries=3 install pegasus
sudo apt-get -o Acquire::Retries=3 install pegasus=5.0.6-1+ubuntu18
- run: sudo apt-get -o Acquire::Retries=3 install *fftw3* intel-mkl*
- name: Install pycbc
run: |
@@ -51,7 +51,7 @@ jobs:
find submitdir/work/ -type f -name '*.tar.gz' -delete
- name: store log files
if: always()
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: logs
path: output/submitdir/work
4 changes: 2 additions & 2 deletions .github/workflows/workflow-tests.yml
@@ -34,7 +34,7 @@ jobs:
wget -qO - https://download.pegasus.isi.edu/pegasus/gpg.txt | sudo apt-key add -
echo "deb https://download.pegasus.isi.edu/pegasus/ubuntu bionic main" | sudo tee -a /etc/apt/sources.list
sudo apt-get -o Acquire::Retries=3 update
sudo apt-get -o Acquire::Retries=3 install pegasus
sudo apt-get -o Acquire::Retries=3 install pegasus=5.0.6-1+ubuntu18
- run: sudo apt-get -o Acquire::Retries=3 install *fftw3* intel-mkl*
- name: Install pycbc
run: |
@@ -50,7 +50,7 @@ jobs:
find submitdir/work/ -type f -name '*.tar.gz' -delete
- name: store log files
if: always()
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: logs-${{matrix.test-type}}
path: examples/workflow/generic/${{matrix.test-type}}/submitdir/work
16 changes: 16 additions & 0 deletions bin/all_sky_search/pycbc_bin_trigger_rates_dq
@@ -46,6 +46,18 @@ logging.info('Start')

ifo, flag_name = args.flag_name.split(':')

if args.gating_windows:
gate_times = []
with h5.File(args.trig_file, 'r') as trig_file:
logging.info('Getting gated times')
try:
gating_types = trig_file[f'{ifo}/gating'].keys()
for gt in gating_types:
gate_times += list(trig_file[f'{ifo}/gating/{gt}/time'][:])
gate_times = np.unique(gate_times)
except KeyError:
logging.warning('No gating found in trigger file')

trigs = SingleDetTriggers(
args.trig_file,
ifo,
@@ -134,6 +146,7 @@ with h5.File(args.output_file, 'w') as f:
frac_dt = abs(segs) / livetime
dq_rates[state] = frac_eff / frac_dt
bin_grp['dq_rates'] = dq_rates
bin_grp['num_triggers'] = len(trig_times_bin)

# save dq state segments
for dq_state, segs in dq_state_segs_dict.items():
@@ -142,7 +155,10 @@
starts, ends = segments_to_start_end(segs)
dq_grp['segment_starts'] = starts
dq_grp['segment_ends'] = ends
dq_grp['livetime'] = abs(segs)

f.attrs['stat'] = f'{ifo}-dq_stat_info'
f.attrs['sngl_ranking'] = args.sngl_ranking
f.attrs['sngl_ranking_threshold'] = args.stat_threshold

logging.info('Done!')
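The new block guarded by args.gating_windows collects gate times from the trigger file's {ifo}/gating/{type}/time datasets, deduplicates them with np.unique, and logs a warning when the file contains no gating group. The sketch below exercises the same access pattern on a throwaway HDF5 file; the file name, detector and gating types are invented for illustration, and only the group layout is taken from the diff.

```python
# Minimal sketch of the gating-time lookup used above (toy file and
# values; the {ifo}/gating/{type}/time layout matches the diff).
import h5py
import numpy as np

ifo = 'H1'
with h5py.File('toy_triggers.hdf', 'w') as f:
    f[f'{ifo}/gating/auto/time'] = np.array([100.0, 101.5, 101.5])
    f[f'{ifo}/gating/file/time'] = np.array([101.5, 250.0])

gate_times = []
with h5py.File('toy_triggers.hdf', 'r') as trig_file:
    try:
        gating_types = trig_file[f'{ifo}/gating'].keys()
        for gt in gating_types:
            gate_times += list(trig_file[f'{ifo}/gating/{gt}/time'][:])
        gate_times = np.unique(gate_times)  # drop duplicated gate times
    except KeyError:
        print('No gating found in trigger file')

print(gate_times)  # e.g. [100.  101.5 250. ]
```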
4 changes: 2 additions & 2 deletions bin/all_sky_search/pycbc_coinc_hdfinjfind
@@ -161,8 +161,8 @@ for trigger_file, injection_file in zip(args.trigger_files,
% (len(found), len(missed), len(ambiguous)))

if len(ambiguous) > 0:
logging.warn('More than one coinc trigger found associated '
'with injection')
logging.warning('More than one coinc trigger found associated '
'with injection')
am = numpy.arange(0, len(inj_time), 1)[left[ambiguous]]
bm = numpy.arange(0, len(inj_time), 1)[right[ambiguous]]

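This hunk, together with the matching ones in pycbc_coinc_statmap and pycbc_fit_sngls_split_binned below, replaces logging.warn with logging.warning. logging.warn is only a deprecated alias (deprecated since Python 3.3 and removed in Python 3.13), so the spelled-out call is the form that keeps working. A minimal, self-contained illustration with a made-up count:

```python
# logging.warn is a deprecated alias of logging.warning; use the full name.
import logging

logging.basicConfig(level=logging.INFO)

ambiguous = [1, 5, 9]  # made-up indices standing in for ambiguous matches
if len(ambiguous) > 0:
    logging.warning('More than one coinc trigger found associated '
                    'with injection')
```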
4 changes: 2 additions & 2 deletions bin/all_sky_search/pycbc_coinc_statmap
@@ -202,8 +202,8 @@ else:
back_locs = all_trigs.timeslide_id != 0

if (back_locs.sum()) == 0:
logging.warn("There were no background events, so we could not assign "
"any statistic values")
logging.warning("There were no background events, so we could not "
"assign any statistic values")
sys.exit()

logging.info("Dumping background triggers (inclusive of zerolag)")
34 changes: 14 additions & 20 deletions bin/all_sky_search/pycbc_fit_sngls_over_multiparam
@@ -174,18 +174,14 @@ parser = argparse.ArgumentParser(usage="",

pycbc.add_common_pycbc_options(parser)
parser.add_argument("--version", action=pycbc.version.Version)
parser.add_argument("--template-fit-file",
parser.add_argument("--template-fit-file", required=True,
help="hdf5 file containing fit coefficients for each"
" individual template. Required")
parser.add_argument("--bank-file", default=None,
help="hdf file containing template parameters. Required "
"unless reading param from template fit file")
parser.add_argument("--bank-file", required=True,
help="hdf file containing template parameters. Required")
parser.add_argument("--output", required=True,
help="Location for output file containing smoothed fit "
"coefficients. Required")
parser.add_argument("--use-template-fit-param", action="store_true",
help="Use parameter values stored in the template fit "
"file as template_param for smoothing.")
"coefficients. Required")
parser.add_argument("--fit-param", nargs='+',
help="Parameter(s) over which to regress the background "
"fit coefficients. Required. Either read from "
@@ -196,20 +192,19 @@ parser.add_argument("--fit-param", nargs='+',
"multiple parameters, provide them as a list.")
parser.add_argument("--approximant", default="SEOBNRv4",
help="Approximant for template duration. Default SEOBNRv4")
parser.add_argument("--f-lower", type=float, default=0.,
help="Starting frequency for calculating template "
"duration, if not reading from the template fit file")
parser.add_argument("--f-lower", type=float,
help="Start frequency for calculating template duration.")
parser.add_argument("--min-duration", type=float, default=0.,
help="Fudge factor for templates with tiny or negative "
"values of template_duration: add to duration values"
" before fitting. Units seconds.")
parser.add_argument("--log-param", nargs='+',
help="Take the log of the fit param before smoothing.")
help="Take the log of the fit param before smoothing. "
"Must be a list corresponding to fit params.")
parser.add_argument("--smoothing-width", type=float, nargs='+', required=True,
help="Distance in the space of fit param values (or the "
"logs of them) to smooth over. Required. "
"This must be a list corresponding to the smoothing "
"parameters.")
help="Distance in the space of fit param values (or their"
" logs) to smooth over. Required. Must be a list "
"corresponding to fit params.")
parser.add_argument("--smoothing-method", default="smooth_tophat",
choices = _smooth_dist_func.keys(),
help="Method used to smooth the fit parameters; "
@@ -220,15 +215,14 @@ parser.add_argument("--smoothing-method", default="smooth_tophat",
"the smoothing until 500 triggers are reached. "
"'distance_weighted' weights the closest templates "
"with a normal distribution of width smoothing-width "
"trucated at three smoothing-widths.")
"truncated at three smoothing-widths.")
parser.add_argument("--smoothing-keywords", nargs='*',
help="Keywords for the smoothing function, supplied "
"as key:value pairs, e.g. total_trigs:500 to define "
"the number of templates in the n_closest smoothing "
"method")
"the number of templates for n_closest smoothing.")
parser.add_argument("--output-fits-by-template", action='store_true',
help="If given, will output the input file fits to "
"fit_by_template group")
"fit_by_template group.")
args = parser.parse_args()

if args.smoothing_keywords:
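The reworded help text above describes how per-template fit coefficients are regressed over one or more fit parameters, optionally after taking logs, with 'smooth_tophat' averaging everything within --smoothing-width. The sketch below is a toy one-dimensional tophat smoother that illustrates the concept only; it is not PyCBC's implementation, and the bank parameters and coefficients are randomly generated.

```python
# Toy 1-D tophat smoothing of per-template fit coefficients; illustrates
# the idea behind --fit-param / --log-param / --smoothing-width, not the
# actual pycbc_fit_sngls_over_multiparam algorithm.
import numpy as np

def smooth_tophat_1d(param, coeff, smoothing_width, log_param=True):
    x = np.log(param) if log_param else param
    smoothed = np.empty_like(coeff)
    for i, xi in enumerate(x):
        within = np.abs(x - xi) < smoothing_width  # templates inside the tophat
        smoothed[i] = coeff[within].mean()
    return smoothed

rng = np.random.default_rng(0)
template_duration = rng.uniform(0.2, 100.0, size=1000)        # toy bank
fit_coeff = 5.0 + 0.5 * np.log(template_duration) + rng.normal(0.0, 0.3, 1000)
smoothed = smooth_tophat_1d(template_duration, fit_coeff, smoothing_width=0.4)
```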
2 changes: 1 addition & 1 deletion bin/all_sky_search/pycbc_fit_sngls_split_binned
@@ -154,7 +154,7 @@ else:
if args.bin_param == 'template_duration' and params[args.bin_param].min() == 0:
# Accessing the 0th entry of an empty region reference will return
# zero due to a quirk of h5py.
logging.warn('WARNING: Some templates do not contain triggers')
logging.warning('WARNING: Some templates do not contain triggers')
# Use the lowest nonzero template duration as lower limit for bins
pbin_lower_lim = params[args.bin_param][params[args.bin_param] > 0].min()
else:
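For context, the surrounding code falls back to the smallest nonzero template duration when empty region references make some durations read back as zero. A minimal numpy sketch of that fallback with invented durations:

```python
# Lower bin limit fallback, as in the hunk above, with made-up values.
import numpy as np

template_duration = np.array([0.0, 0.0, 0.8, 3.2, 41.0])  # zeros: empty templates
if template_duration.min() == 0:
    # Use the lowest nonzero duration as the lower limit for the bins
    pbin_lower_lim = template_duration[template_duration > 0].min()
else:
    pbin_lower_lim = template_duration.min()
print(pbin_lower_lim)  # 0.8
```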
2 changes: 1 addition & 1 deletion bin/all_sky_search/pycbc_prepare_xml_for_gracedb
@@ -96,7 +96,7 @@ def read_psds(psd_files):
psd = [group["psds"][str(i)] for i in range(len(group["psds"].keys()))]
psds[ifo] = segmentlist(psd_segment(*segargs) for segargs in zip(
psd, group["start_time"], group["end_time"]))
return psds
return psds

psds = read_psds(args.psd_files)

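The indentation fix above moves return psds out of the per-file loop, so read_psds now returns only after every PSD file has been processed instead of after the first one. A generic sketch of this bug class, using hypothetical data rather than the real PSD-file structures:

```python
# Returning inside a loop exits after the first iteration and silently
# drops the remaining items; dedenting the return fixes it.
def collect_buggy(items):
    out = {}
    for key, value in items:
        out[key] = value
        return out          # bug: returns during the first iteration

def collect_fixed(items):
    out = {}
    for key, value in items:
        out[key] = value
    return out              # returns once the whole loop has finished

pairs = [('H1', 1), ('L1', 2), ('V1', 3)]
assert collect_buggy(pairs) == {'H1': 1}
assert collect_fixed(pairs) == {'H1': 1, 'L1': 2, 'V1': 3}
```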
9 changes: 2 additions & 7 deletions bin/bank/pycbc_aligned_bank_cat
@@ -48,9 +48,8 @@ __program__ = "pycbc_aligned_bank_cat"
parser = argparse.ArgumentParser(description=__doc__,
formatter_class=tmpltbank.IndentedHelpFormatterWithNL)

pycbc.add_common_pycbc_options(parser)
parser.add_argument("--version", action="version", version=__version__)
parser.add_argument("--verbose", action="store_true", default=False,
help="verbose output")
parser.add_argument("-i", "--input-glob",
help="file glob the list of paramters")
parser.add_argument("-I", "--input-files", nargs='+',
@@ -76,11 +75,7 @@ tmpltbank.insert_base_bank_options(parser, match_req=False)

options = parser.parse_args()

if options.verbose:
log_level = logging.DEBUG
else:
log_level = logging.WARN
logging.basicConfig(format='%(asctime)s %(message)s', level=log_level)
pycbc.init_logging(options.verbose)

# Sanity check options
if not options.output_file:
11 changes: 3 additions & 8 deletions bin/bank/pycbc_aligned_stoch_bank
@@ -23,6 +23,7 @@ Stochastic aligned spin bank generator.
import argparse
import numpy
import logging

import pycbc
import pycbc.version
from pycbc import tmpltbank
@@ -45,8 +46,7 @@ parser = argparse.ArgumentParser(description=_desc,

# Begin with code specific options
parser.add_argument("--version", action="version", version=__version__)
parser.add_argument("--verbose", action="store_true", default=False,
help="verbose output")
pycbc.add_common_pycbc_options(parser)
parser.add_argument("-V", "--vary-fupper", action="store_true", default=False,
help="Use a variable upper frequency cutoff in laying "
"out the bank. OPTIONAL.")
@@ -96,12 +96,7 @@ tmpltbank.insert_ethinca_metric_options(parser)

opts = parser.parse_args()

if opts.verbose:
log_level = logging.DEBUG
else:
log_level = logging.WARN
log_format='%(asctime)s %(message)s'
logging.basicConfig(format=log_format, level=log_level)
pycbc.init_logging(opts.verbose)

# delete defaults for redundant options if not varying fupper
if not opts.vary_fupper:
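Both bank scripts above (pycbc_aligned_bank_cat and pycbc_aligned_stoch_bank) drop their hand-rolled --verbose flag and logging.basicConfig block in favour of pycbc.add_common_pycbc_options(parser) plus pycbc.init_logging(...). The sketch below reproduces the removed standard-library pattern; the replacement calls are shown only as comments because their exact behaviour is defined by the pycbc package.

```python
# The verbosity setup removed from both bank scripts, standard library only.
import argparse
import logging

parser = argparse.ArgumentParser()
parser.add_argument("--verbose", action="store_true", default=False,
                    help="verbose output")
opts = parser.parse_args([])  # empty argv for illustration

# Old style: choose a level by hand and configure the root logger.
log_level = logging.DEBUG if opts.verbose else logging.WARN
logging.basicConfig(format='%(asctime)s %(message)s', level=log_level)

# New style, as in the diff (requires the pycbc package):
#   pycbc.add_common_pycbc_options(parser)  # adds common options such as --verbose
#   pycbc.init_logging(opts.verbose)
```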