# Benchmarking for Web Feature Service

See the file `INSTALL.md`.

Run as follows:

- Make sure you have installed `python3.7` and `locust` (see `INSTALL.md`).
- Activate the environment:

  ```
  export PATH=$PWD/miniconda3/bin:$PATH
  source activate
  ```

- Set up the base name of the output files:

  ```
  SERVICE=glamod1
  TESTER=$(hostname -s)
  NOW=$(date +%Y%m%dT%H%M%S)
  OUTPUT=results/${SERVICE}-${TESTER}-${NOW}
  ```

- And run the benchmarker for 30 minutes with 10 concurrent locusts:

  ```
  locust --no-web -c 10 --run-time 30m --csv=$OUTPUT
  ```

The outputs get written to the terminal and also to two CSV files in the
`results/` sub-directory (with this version of locust, typically
`${OUTPUT}_requests.csv` and `${OUTPUT}_distribution.csv`).

The `locust` command-line tool will find `locustfile.py` in the current
directory and use it to configure the tests. Run with:

```
locust --no-web -c 5 -r 1 --run-time 10s
```

where `-c` is the number of locusts to spawn and `-r` is the hatch rate
(locusts spawned per second).
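
For reference, a minimal `locustfile.py` for a WFS endpoint might look like
the sketch below. It assumes the pre-1.0 locust API (`HttpLocust`/`TaskSet`)
implied by the `--no-web`/`-c`/`-r` flags above; the host, request path, and
class names are illustrative, not taken from this repository:

```python
# A minimal sketch, not this repository's actual locustfile.
# Assumes locust < 1.0, whose HttpLocust/TaskSet API matches the
# command-line flags shown above.
from locust import HttpLocust, TaskSet, task


class WFSBehaviour(TaskSet):

    @task
    def get_capabilities(self):
        # Hypothetical WFS request; the real tests define their own tasks.
        self.client.get("/wfs?service=WFS&request=GetCapabilities")


class WFSUser(HttpLocust):
    task_set = WFSBehaviour
    host = "http://localhost:8080"  # placeholder service URL
    min_wait = 1000                 # wait 1-5 seconds between tasks
    max_wait = 5000
```
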
If we set the `TaskSet.wait_function` to return 20000 (milliseconds), then
the initial number of concurrent locusts will spawn at the start, but no
further requests will be issued until 20 seconds have elapsed, regardless of
whether the first tasks complete.
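
Under the same pre-1.0 API assumption, that change might look like this
(class names again illustrative; the lambda form follows the pattern used in
the old locust documentation):

```python
from locust import HttpLocust, TaskSet, task


class WFSBehaviour(TaskSet):
    # Fixed 20-second pause between tasks for every locust, replacing the
    # usual random interval between min_wait and max_wait.
    wait_function = lambda self: 20000  # milliseconds

    @task
    def get_capabilities(self):
        # Hypothetical WFS request, as in the sketch above.
        self.client.get("/wfs?service=WFS&request=GetCapabilities")


class WFSUser(HttpLocust):
    task_set = WFSBehaviour
    host = "http://localhost:8080"  # placeholder service URL
```
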
You can test this with:

```
locust --no-web -c 4 --run-time 30s
```

I have added a global logger called `log` that we can use to log to the
screen to test out different modifications.
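
The exact setup of `log` lives in the repository's code; if it is built on
Python's standard `logging` module (an assumption), using it from a task
would look something like this:

```python
import logging

# Assumed configuration; the repository may set the logger up differently.
# This sends log records to the screen (stderr) with timestamps.
logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("wfs-benchmark")  # hypothetical logger name

# Example use inside a task, e.g. to trace what the benchmarker is doing
# while experimenting with modifications.
log.info("Sending GetCapabilities request")
```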