In the paper, you set the scale factor to 2.5, so I set the scale value to (0.4, 1.0) as below. But the ratio is only described as random, so how should I set the RCR params?

```python
Compose([RandomCropandResize((args.image_height, args.image_width),
                             scale=(0.4, 1.0), ratio=(0.5, 2.0)),
         DownscaleFlow(),
         ToTensor()])
```
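For context, this is how I understand `scale` and `ratio`, assuming `RandomCropandResize` follows the same semantics as torchvision's `RandomResizedCrop` (that is my assumption, not something stated in the paper): `scale` bounds the crop area as a fraction of the image area, and `ratio` bounds the width/height aspect ratio of the crop. A minimal sketch of that sampling:

```python
import math
import random

def sample_crop(img_h, img_w, scale=(0.4, 1.0), ratio=(0.5, 2.0)):
    """Sample a crop size (h, w) the way torchvision's RandomResizedCrop
    does: target area is a uniform fraction of the image area drawn from
    `scale`; aspect ratio (w/h) is drawn log-uniformly from `ratio`."""
    area = img_h * img_w
    for _ in range(10):  # retry if the sampled box does not fit
        target_area = random.uniform(*scale) * area
        aspect = math.exp(random.uniform(math.log(ratio[0]),
                                         math.log(ratio[1])))
        w = int(round(math.sqrt(target_area * aspect)))
        h = int(round(math.sqrt(target_area / aspect)))
        if 0 < w <= img_w and 0 < h <= img_h:
            return h, w
    return img_h, img_w  # fallback: use the whole image
```

The crop is then resized to (image_height, image_width), so `scale` effectively controls how much the content gets magnified.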
Optimizer setting
Without any information, I assumed the Adam optimizer as shown below. Did you use L2 regularization? If so, what value did you use for weight_decay?

```python
optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()),
                       lr=args.lr, weight_decay=args.weight_decay)
```
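For reference, in PyTorch's (non-decoupled) Adam the `weight_decay` argument is plain L2 regularization: `weight_decay * w` is added to each parameter's gradient before the adaptive update, equivalent to an `0.5 * weight_decay * ||w||^2` penalty in the loss. A minimal sketch of that term (the decay value is just a placeholder):

```python
def add_l2_term(grad, weight, weight_decay=1e-4):
    """Return the gradient with the L2 (weight-decay) term folded in,
    as torch.optim.Adam does when weight_decay > 0."""
    return grad + weight_decay * weight
```

(AdamW instead applies the decay directly to the weights, decoupled from the gradient, which is why the chosen value matters for which optimizer variant was used.)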
iteration vs. epoch
The paper says that 100,000 iterations with batch_size = 100 were used in the first stage to train the pose network. With batch_size = 100 and 400,000 training images, 4,000 iterations make 1 epoch, so is it correct that the first stage trains for 25 epochs?
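Spelling out the arithmetic behind that question (numbers taken from the paper's stated setup):

```python
dataset_size = 400_000      # training images
batch_size = 100
total_iterations = 100_000  # first-stage iterations from the paper

iterations_per_epoch = dataset_size / batch_size  # 4,000 iterations = 1 epoch
epochs = total_iterations / iterations_per_epoch
print(epochs)  # 25.0
```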