Hi. Some of the models in results/kodak include "encoding_time" and "decoding_time" fields. For example: https://github.com/InterDigitalInc/CompressAI/blob/master/results/kodak/compressai-cheng2020-attn.json Can you provide some details on how to interpret these numbers? My main questions are: (1) which CPU and GPU were used, and (2) are the numbers average seconds per image (or total for all 24 Kodak images, or something else)? Thanks!
Replies: 1 comment
Hi David. Thanks for your comment and sorry for the late reply.
Times correspond to the average number of seconds per image, as computed by eval_model.
I actually re-ran some of the encodes/decodes and added a README that mentions the main CPU/GPU specs. As you know, those timings are provided for information only: they are highly dependent on platform, implementation, and hardware, and should be considered with care. In particular, comparison with hybrid codecs is almost irrelevant.
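For anyone who wants to inspect or reproduce these numbers, here is a minimal sketch of how per-image timings can be averaged and how the published JSON can be read back. It is not the eval_model script itself: the helper names (`time_codec_on_images`, `encode`, `decode`) are hypothetical, and the layout of the `results` dictionary (parallel lists keyed by `bpp`, `encoding_time`, `decoding_time`) is assumed from the linked file.

```python
import json
import time


def time_codec_on_images(images, encode, decode):
    """Average encode/decode wall-clock time over a list of images.

    `encode` / `decode` are placeholders for whatever the model exposes
    (e.g. net.compress / net.decompress); this is only a sketch.
    """
    enc_times, dec_times = [], []
    for img in images:
        t0 = time.time()
        bitstream = encode(img)
        enc_times.append(time.time() - t0)

        t1 = time.time()
        decode(bitstream)
        dec_times.append(time.time() - t1)

    # "encoding_time" / "decoding_time" are the mean seconds per image
    return sum(enc_times) / len(enc_times), sum(dec_times) / len(dec_times)


# Reading the published numbers back from the results file
# (assumed layout: parallel lists, one entry per quality point):
with open("results/kodak/compressai-cheng2020-attn.json") as f:
    data = json.load(f)

results = data["results"]
for bpp, enc_t, dec_t in zip(
    results["bpp"], results["encoding_time"], results["decoding_time"]
):
    print(f"bpp={bpp:.3f}  enc={enc_t:.2f} s/img  dec={dec_t:.2f} s/img")
```

The important point is the last line of `time_codec_on_images`: the published numbers are means over the 24 Kodak images at each quality point, not totals.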