When ensembling a kfold trained set, does it lose the ability to set other models? #20256
Unanswered
VDFaller asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Almost definitely me doing something wrong, but I'm trying to ensemble 3 copies of the same backbone model by doing a simple torch.stack followed by a mean. For whatever reason, the models I'm for-looping through seem to lose the ability to be set to the right device and to eval mode. Here's the simplest example I could come up with. Am I doing something obviously wrong? It works if each model is an individual attribute, so I assume that's by design.
Edit: as soon as I pressed the post button, I think I found my answer. I can just use add_module?
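For anyone hitting the same thing: the likely cause is that modules stored in a plain Python list are not registered as submodules, so calls like `.to(device)` and `.eval()` don't recurse into them. Registering them (via `nn.ModuleList` or `add_module`) fixes it. A minimal sketch (the trivial `nn.Linear` backbones and class names here are illustrative, not the original poster's code):

```python
import torch
import torch.nn as nn

class NaiveEnsemble(nn.Module):
    def __init__(self, models):
        super().__init__()
        self.models = models  # plain list: NOT registered as submodules

    def forward(self, x):
        # simple stack-and-mean ensemble
        return torch.stack([m(x) for m in self.models]).mean(dim=0)

class RegisteredEnsemble(nn.Module):
    def __init__(self, models):
        super().__init__()
        # nn.ModuleList registers each model, so .to(), .eval(), and
        # .parameters() recurse into them; looping with
        # self.add_module(str(i), m) would achieve the same registration.
        self.models = nn.ModuleList(models)

    def forward(self, x):
        return torch.stack([m(x) for m in self.models]).mean(dim=0)

naive = NaiveEnsemble([nn.Linear(4, 2) for _ in range(3)]).eval()
print(all(m.training for m in naive.models))  # True: eval() did not propagate

reg = RegisteredEnsemble([nn.Linear(4, 2) for _ in range(3)]).eval()
print(any(m.training for m in reg.models))    # False: eval() recursed
```

The same registration issue explains why the device never changes and why `naive.parameters()` comes back empty, which would also break optimizer setup and checkpointing.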