I'm working on a data analysis and I'm fitting my data with 2 components using NMF. Is it possible to constrain the weights of only one of the components? Also, we know that the weights of the second component should decay exponentially. Is it possible, for instance, to fix only a parameter of the exponential function?
The first thing should be possible, but the second is not currently possible. It does sound like an interesting feature worth developing, though; I'll have a think on that. One approach for now would be to initialize the second set of weights to an exponentially decaying guess and try to nudge the optimization in that direction:
X = load_my_dataset()  # torch tensor, shape (m_examples, n_features)
weights_1 = load_expected_weights()  # torch tensor, shape (m_examples, k_components)
weights_2 = load_best_guess_exponential_weights()

model = cmf.nmf.models.NMF(X.shape,
                           2,
                           initial_weights=[weights_1, weights_2],
                           fix_weights=[True, False])

loss = model.fit(X)  # learning curve of loss over timesteps
learned_weights, learned_components = model.W.detach().numpy(), model.H.detach().numpy()
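As a minimal sketch of building the exponentially decaying initial guess for the second component, something like the following could serve as `load_best_guess_exponential_weights` (the decay rate here is a hypothetical placeholder you would choose from your data; convert the result with `torch.as_tensor(...)` before passing it as an initial weight):

```python
import math

def exponential_decay_weights(m_examples, rate=0.05):
    # w_t = exp(-rate * t): a simple exponentially decaying weight
    # profile over the m_examples timesteps. The rate is a guess,
    # not a fitted parameter.
    return [math.exp(-rate * t) for t in range(m_examples)]

w2 = exponential_decay_weights(100)
```

Since `fix_weights=[True, False]` leaves the second component free, this only biases the starting point; the optimizer is still free to move away from a pure exponential.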