Update README_reference.md
arjunsuresh authored Feb 26, 2024
1 parent 32f7852 commit 7b67e02
Showing 1 changed file with 2 additions and 8 deletions.
docs/mlperf/inference/resnet50/README_reference.md
@@ -14,6 +14,8 @@ cm run script --tags=generate-run-cmds,inference,_find-performance,_all-scenario
* Use `--division=closed` to run all scenarios for the closed division (compliance tests are skipped for `_find-performance` mode)
* Use `--category=datacenter` to run datacenter scenarios
* Use `--backend=tf`, `--backend=ncnn`, or `--backend=tvm-onnx` to use the TensorFlow, NCNN, or TVM-ONNX backend respectively
+* Remove `_all-scenarios` and use `--scenario=Offline` to run the `Offline` scenario, and similarly for the `Server`, `SingleStream` and `MultiStream` scenarios.
+

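The flags listed above can be combined into a single-scenario run. A minimal sketch, assuming the MLCommons `cm` CLI is installed; the scenario and backend values are illustrative, and the command is only echoed here rather than executed:

```shell
# Compose a single-scenario find-performance command from the flags
# described above (values are illustrative; command is echoed, not run).
SCENARIO=Offline        # or Server, SingleStream, MultiStream
BACKEND=tf              # or ncnn, tvm-onnx
CMD="cm run script --tags=generate-run-cmds,inference,_find-performance \
--model=resnet50 --scenario=${SCENARIO} --backend=${BACKEND} \
--division=closed --category=datacenter --quiet"
echo "${CMD}"
```

Dropping `_all-scenarios` in favour of an explicit `--scenario=` flag keeps the tuning run scoped to the one scenario you intend to submit.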
### Do full accuracy and performance runs for all the scenarios

@@ -28,14 +30,6 @@ cm run script --tags=generate-run-cmds,inference,_submission,_all-scenarios --mo
* Use `--division=closed` to run all scenarios for the closed division including the compliance tests
* `--offline_target_qps`, `--server_target_qps`, `--singlestream_target_latency` and `--multistream_target_latency` can be used to override the automatically determined performance targets

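The override flags in the list above can be sketched as follows; the target values are purely hypothetical placeholders, and the command is echoed rather than executed:

```shell
# Full submission run with explicit performance targets
# (target values below are hypothetical placeholders).
CMD="cm run script --tags=generate-run-cmds,inference,_submission,_all-scenarios \
--model=resnet50 --device=cpu --implementation=reference --backend=onnxruntime \
--offline_target_qps=1000 --server_target_qps=900 \
--singlestream_target_latency=2 --multistream_target_latency=16 --quiet"
echo "${CMD}"
```

When the override flags are omitted, the targets found during the earlier `_find-performance` run are used instead.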
-### Populate the README files describing your submission
-
-```
-cm run script --tags=generate-run-cmds,inference,_populate-readme,_all-scenarios \
---model=resnet50 --device=cpu --implementation=reference --backend=onnxruntime \
---execution-mode=valid --results_dir=$HOME/results_dir \
---category=edge --division=open --quiet
-```

### Generate and upload MLPerf submission

