glamod-wfs-benchmarker

Benchmarking for a Web Feature Service (WFS), using Locust.

Installation

See the file INSTALL.md

Running the benchmarker

Run as follows:

  1. Make sure you have installed Python 3.7 and Locust; see INSTALL.md.

  2. Activate the environment:

     export PATH=$PWD/miniconda3/bin:$PATH
     source activate

  3. Set up the base name of the output files:

     SERVICE=glamod1
     TESTER=$(hostname -s)
     NOW=$(date +%Y%m%dT%H%M%S)
     OUTPUT=results/${SERVICE}-${TESTER}-${NOW}

  4. Run the benchmarker for 30 minutes with 10 concurrent locusts:

     locust --no-web -c 10 --run-time 30m --csv=$OUTPUT

Where do the output files live?

The outputs are written to the terminal and also to two CSV files in the results/ sub-directory.
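With the pre-1.0 Locust releases that these command-line flags correspond to, --csv=$OUTPUT typically produces ${OUTPUT}_requests.csv and ${OUTPUT}_distribution.csv; the exact suffixes depend on the Locust version installed.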

Basic usage

The locust command will find locustfile.py in the current directory and use it to configure the tests. Run with:

locust --no-web -c 5 -r 1 --run-time 10s

Here -c is the number of locusts (simulated users) to spawn and -r is the hatch rate (locusts spawned per second).
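For reference, a minimal locustfile.py for the pre-1.0 Locust API used by these commands might look like the sketch below; the host and the WFS query path are placeholders, not the actual GLAMOD endpoints.

    from locust import HttpLocust, TaskSet, task

    class WFSTasks(TaskSet):
        @task
        def get_capabilities(self):
            # Placeholder WFS request; substitute the real query parameters.
            self.client.get("/wfs?service=WFS&request=GetCapabilities")

    class WFSUser(HttpLocust):
        task_set = WFSTasks
        host = "http://example.org"  # placeholder host
        min_wait = 1000              # wait between tasks, in milliseconds
        max_wait = 5000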

Modifying behaviour of locust

Setting the wait_function

If we set the TaskSet.wait_function to return 20000 (milliseconds), the initial number of concurrent locusts will spawn at the start, but no further requests will be issued until 20 seconds have elapsed, regardless of whether the earlier ones have completed.
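A minimal sketch of that change, reusing the illustrative TaskSet from the Basic usage section (the class and endpoint names are assumptions):

    from locust import HttpLocust, TaskSet, task

    class WFSTasks(TaskSet):
        # Fixed 20-second wait between task executions (value in milliseconds).
        wait_function = lambda self: 20000

        @task
        def get_capabilities(self):
            # Placeholder WFS request.
            self.client.get("/wfs?service=WFS&request=GetCapabilities")

    class WFSUser(HttpLocust):
        task_set = WFSTasks
        host = "http://example.org"  # placeholder host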

You can test this with:

locust --no-web -c 4 --run-time 30s

Adding a logger

I have added a global logger called log that we can use to log to the screen when testing out different modifications.
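A minimal sketch of such a module-level logger in locustfile.py (the logger name, format, and level here are assumptions, not necessarily what the repository uses):

    import logging

    # Module-level logger that writes to the terminal.
    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s [%(levelname)s] %(message)s")
    log = logging.getLogger("wfs-benchmarker")

    # Example use inside a task:
    # log.info("GetCapabilities request dispatched")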
