
Personalized loss functions for tensor decompositions #486

Open
earmingol opened this issue Feb 10, 2023 · 7 comments

@earmingol
Contributor

earmingol commented Feb 10, 2023

Is there any option to implement an easy way to "plug and play" different loss functions in the tensor decompositions?
It would be great to have something like that instead of editing each of the different functions.

Tamara Kolda suggested more pertinent loss functions in a talk she gave in 2018, depending on the type of data. For example, she suggested using the Rayleigh loss for non-negative CP, or the Boolean-odds loss for tensors with binary (0/1) values.

[Image: Objective-Functions-Generalized-CP]

However, it looks like most of the decompositions implemented here only use the standard least-squares loss, and changing that would require directly editing each of the methods.
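For illustration, the elementwise losses in Kolda's GCP framework are simple functions of a data entry `x` and a model entry `m`. The sketch below (plain NumPy, with function names of my own choosing; `EPS` is an assumed numerical safeguard) writes out the Gaussian (least-squares), Rayleigh, and Bernoulli-odds losses from that framework:

```python
import numpy as np

EPS = 1e-10  # small offset to keep logs and divisions finite


def gaussian_loss(x, m):
    # Standard least-squares loss: f(x, m) = (x - m)^2
    return (x - m) ** 2


def rayleigh_loss(x, m):
    # For non-negative continuous data: f(x, m) = 2 log(m) + (pi/4) (x/m)^2
    return 2 * np.log(m + EPS) + (np.pi / 4) * (x / (m + EPS)) ** 2


def bernoulli_odds_loss(x, m):
    # For binary (0/1) data with m >= 0: f(x, m) = log(m + 1) - x log(m)
    return np.log(m + 1) - x * np.log(m + EPS)


# The GCP objective sums the elementwise loss over all tensor entries
data = (np.random.rand(3, 4, 2) > 0.5).astype(float)  # hypothetical binary tensor
model = np.full((3, 4, 2), 0.5)                       # hypothetical low-rank reconstruction
objective = bernoulli_odds_loss(data, model).sum()
```

Swapping one `f(x, m)` for another is exactly the "plug and play" behaviour being asked for, which is why gradient-based fitting handles it so naturally.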

@JeanKossaifi
Member

JeanKossaifi commented Feb 16, 2023

If you want to use gradient-based optimization, the easiest might be to just use TensorLy-Torch and plug in your loss directly.

Would be great to have more losses supported in the main library too if you're interested in opening a PR!

@cohenjer
Contributor

So, it turns out we already have an implementation of GCP @JeanKossaifi @earmingol, but it is not yet merged into TensorLy, mainly because we don't support all the backends.

The implementation was done by @caglayantuna and the repo is here:
https://github.com/caglayantuna/tensorly-gcp
It can be installed with pip (see the PyPI page).
It has not been fully tested yet; this is something I want to finish properly but have not found the time for.

There are a few examples in the doc, hope this helps!

@earmingol
Contributor Author

earmingol commented Feb 17, 2023

Wow, this is cool! Thanks @cohenjer, once I have a chance I'll start playing with it and see how I could contribute :)

@JeanKossaifi, I'm interested in trying tensorly-torch, but I'm not very familiar with using it beyond creating layers for bigger models. Do you have a quick example of running a decomposition, showing how to go from passing a tensor as input to obtaining the resulting factors?

Thanks!

@JeanKossaifi
Member

Yes, I need to add some documentation but it should be fairly straightforward -- I tried to make things as easy as possible :)

Create a factorized tensor just like a PyTorch tensor:

```python
from tltorch import FactorizedTensor

tucker = FactorizedTensor.new(shape=(3, 4, 2), rank=0.5, factorization='tucker')
```

You can use your favourite optimizer to update the factors of the decomposition:

```python
import torch

optimizer = torch.optim.Adam(
    tucker.parameters(),
    lr=1e-3,
    weight_decay=1e-4,
)
```

Then you can just create a custom loss and apply it to the reconstruction:

```python
reconstruction = tucker.to_tensor()
loss = custom_loss(reconstruction - target_tensor)
loss.backward()
```

You'd of course want to add in a scheduler to decrease learning rate every couple of iterations, etc.

@JeanKossaifi
Copy link
Member

@earmingol did you get a chance to give it a try? Did it work for your use case?

@earmingol
Copy link
Contributor Author

@JeanKossaifi unfortunately I got swamped by other things! I had a chance to try @cohenjer's approach, which works well but outputs somewhat unexpected results. I also tried implementing yours, but wasn't able to get to the point of outputting results. That's something I need to do soon though!

I'll let you know if I have any progress or questions on that :)

@cohenjer
Contributor

@earmingol Feel free to contact me directly if tensorly-gcp has issues. I plan to debug it at some point properly, so any feedback (even saying some functionality worked as expected) would be welcome!
