
Doc and examples #471

Open
JeanKossaifi opened this issue Jan 1, 2023 · 4 comments

Comments

@JeanKossaifi
Member

We probably need to add more examples and documentation for the new features in TensorLy.

It would be great, for instance, to have an example from @earmingol and @hmbaghdassarian as a user guide on how to use the CorrIndex in practice!

@earmingol
Contributor

We've been a little busy these days, but once I'm free enough I will write an example :)

@earmingol
Contributor

I just generated an example; what do you think of this, @JeanKossaifi?

"""
Correlation Index in Tensorly >=0.8
===============================================
Example of using the Correlation Index.
"""

##############################################################################
# Introduction
# ------------
# Since version 0.8, TensorLy provides a new metric to compare two different
# tensor decompositions of the same tensor (e.g., obtained using different
# initializations) or to compare the decompositions of two different tensors
# with the same shape (representing the same elements).
#
# This metric is explained in detail in
# [1] Sobhani et al. 2022 (https://doi.org/10.1016/j.sigpro.2022.108457).
#
# Briefly, this metric measures a distance between two tensor decompositions,
# where 0 represents exactly the same decomposition (regardless of factor
# permutations) and 1 represents completely different decompositions.
##############################################################################
# Here we will use the non-negative PARAFAC algorithm to perform the tensor
# decompositions and exemplify two comparisons:
#
# 1) Two different decompositions of the same tensor, using different values
#    of ``random_state`` to control the initialization.
# 2) Two different tensors with the same shape, where the second tensor is
#    the first one plus some random noise.

import tensorly as tl
from tensorly.decomposition import non_negative_parafac
import numpy as np

shape = (20, 1000, 10, 10)

##############################################################################
# Create synthetic tensor
# -----------------------
# There are several ways to create a tensor with non-negative entries in
# TensorLy. Here we generate a random tensor of integers drawn uniformly
# from 0 to 999.

# tensor-1 generation
np.random.seed(0)
array = np.random.randint(1000, size=shape)
tensor = tl.tensor(array, dtype='float')

# tensor-2 generation (tensor-1 plus random noise, clipped so all entries
# stay non-negative)
noise = np.random.normal(loc=0.0, scale=10, size=shape)
tensor2 = tl.tensor(np.clip(array + noise, 0, None), dtype='float')
##############################################################################
# Non-negative PARAFAC
# --------------------
# Scenario 1: Same tensor, different initializations
(w1, f1) = non_negative_parafac(tensor, rank=10, random_state=0)
(w2, f2) = non_negative_parafac(tensor, rank=10, random_state=888)

# Scenario 2: Second tensor with random noise.
(w3, f3) = non_negative_parafac(tensor2, rank=10, random_state=0)
##############################################################################
# Correlation Index
# -----------------
# Scenario 1: Same tensor, different initializations
print(tl.metrics.correlation_index(f1, f2))

# Scenario 2: Second tensor with random noise
print(tl.metrics.correlation_index(f1, f3))
##############################################################################
# Here we can see that changing the random initialization barely affects the
# decomposition of our synthetic tensor; for real data, the impact could be
# stronger. When noise is introduced, the decomposition changes more
# noticeably and the Correlation Index increases.
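As an aside, for readers wondering what the metric actually computes: below is a minimal NumPy sketch of the CorrIndex formula from [1] for a single pair of factor matrices. The `corr_index` name is my own, and TensorLy's implementation (I believe) first stacks the factor matrices of all modes before comparing, so treat this as an illustration rather than the library code.

```python
import numpy as np

def corr_index(X, Y):
    # Normalize every column to unit norm so the comparison ignores scaling.
    Xn = X / np.linalg.norm(X, axis=0)
    Yn = Y / np.linalg.norm(Y, axis=0)
    C = np.abs(Xn.T @ Yn)  # |correlation| between each pair of columns
    n = C.shape[0]
    # Average how far each column's best match falls short of perfect
    # correlation, in both directions, giving a value in [0, 1].
    return (np.sum(np.abs(1 - C.max(axis=1)))
            + np.sum(np.abs(1 - C.max(axis=0)))) / (2 * n)

rng = np.random.default_rng(0)
X = rng.random((30, 5))
# A column permutation of X is the "same" decomposition: CorrIndex ~ 0.
print(corr_index(X, X[:, ::-1]))
# Unrelated random factors yield a strictly larger value.
print(corr_index(X, rng.random((30, 5))))
```

This makes the permutation invariance concrete: each column is matched to its best counterpart, so reordering factors does not change the score.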

@earmingol
Contributor

earmingol commented May 26, 2023

@JeanKossaifi just following up on this... does the example above work? also I'm not sure where I should include it.

@JeanKossaifi
Member Author

Thanks @earmingol - I had lost track of this.

This looks good. You can add it to the gallery of examples (they are executed at each new PR), either under decomposition or at the root. We can give it an informative title, e.g. "Comparing decompositions with the Correlation Index metric".
