Commit

fix readme
Giuseppe5 committed Oct 28, 2024
1 parent 5f789fb commit 6ad16bf
Showing 1 changed file with 0 additions and 10 deletions.
src/brevitas_examples/llm/README.md (0 additions, 10 deletions)
@@ -35,23 +35,13 @@ usage: main.py [-h] [--model MODEL] [--seed SEED] [--nsamples NSAMPLES]
                [--input-quant-granularity {per_tensor,per_row,per_group}]
                [--input-group-size INPUT_GROUP_SIZE]
                [--quantize-input-zero-point] [--quantize-last-layer] [--gptq]
-<<<<<<< HEAD
-               [--gpfq] [--gpxq-act-order] [--gpxq-use-quant-activations] [--gpxq-create-weight-orig]
-               [--gpxq-max-accumulator-bit-width GPXQ_MAX_ACCUMULATOR_BIT_WIDTH]
-               [--gpxq-max-accumulator-tile-size GPXQ_MAX_ACCUMULATOR_TILE_SIZE]
-               [--act-calibration] [--bias-corr] [--ln-affine-merge]
-               [--no-quantize] [--no-float16] [--replace-mha]
-               [--weight-equalization]
-               [--act-equalization {None,layerwise,fx}] [--load-awq LOAD_AWQ]
-=======
                [--gpfq] [--gpxq-act-order] [--gpxq-use-quant-activations]
                [--gpxq-create-weight-orig] [--act-calibration] [--bias-corr]
                [--ln-affine-merge] [--replace-rmsnorm] [--no-quantize]
                [--no-float16] [--replace-mha] [--weight-equalization]
                [--graph-rotation] [--graph-rotation-mode {had,ort}]
                [--layerwise-rotation] [--act-equalization {None,layerwise,fx}]
                [--load-awq LOAD_AWQ]
->>>>>>> Fix and README
                [--export-target {None,onnx_qcdq,torch_qcdq,sharded_torchmlir_group_weight,sharded_packed_torchmlir_group_weight}]
                [--export-prefix EXPORT_PREFIX]
                [--checkpoint-name CHECKPOINT_NAME] [--fuse-sequences]
