
feature: new tests added for tsne to expand test coverage #2229

Merged

Conversation

@yuejiaointel (Contributor) commented Dec 17, 2024

Description

Added additional tests in sklearnex/manifold/tests/test_tsne.py to expand the test coverage for the t-SNE algorithm.

PR completeness and readability

  • I have reviewed my changes thoroughly before submitting this pull request.
  • I have commented my code, particularly in hard-to-understand areas.
  • Git commit message contains an appropriate signed-off-by string (see CONTRIBUTING.md for details).

Testing

  • I have run it locally and tested the changes extensively.
  • All CI jobs are green or I have provided justification why they aren't.
  • I have extended the testing suite if new functionality was introduced in this PR.


codecov bot commented Dec 17, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

| Flag   | Coverage Δ      |
|--------|-----------------|
| azure  | 83.90% <ø> (?)  |
| github | 83.18% <ø> (ø)  |

Flags with carried forward coverage won't be shown.

see 53 files with indirect coverage changes

@ethanglaser ethanglaser marked this pull request as draft December 17, 2024 19:02
@yuejiaointel (Contributor Author)

/intelci: run

@yuejiaointel yuejiaointel marked this pull request as ready for review December 19, 2024 00:00
@ethanglaser (Contributor)

/intelci: run

yue.jiao added 2 commits December 19, 2024 08:37
@yuejiaointel (Contributor Author)

/intelci: run

@david-cortes-intel (Contributor)

It looks like we don't have any test here or in daal4py that checks that the results from TSNE make sense beyond having the right shape and no missing values.

Since there's a very particular dataset here for the last test, it'd be helpful to add other assertions there, along the lines of checking that the embedding ends up placing some points closer together than others, as would be expected given the input data.

…or parametrization names, removed extra tests
@yuejiaointel (Contributor Author)

Hi David,
About the last comment, I think that is a good test to add! I spent some time thinking it through and have added a logic check to the final test that evaluates the overlap of close neighbors. Here's a summary of the steps I implemented (sketched in code below):

  1. Build a distance matrix where entry [i, j] is the Euclidean distance between points i and j in the original space, and do the same for the t-SNE embedding space.
  2. For each point, rank the other points by distance to it, both in the original space and in the embedding space.
  3. Take the top 5 neighbors of each point in the original and embedding spaces and compute the fraction that are the same.
  4. Take the mean of these fractions; it should represent how similar the original and embedding spaces are for the 5 closest points.
  5. Check that this mean is > 0.6.

Let me know your thoughts on this approach or if you believe it could be improved further.
Thx a lot :D
Yue
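
A minimal sketch of the check described in the steps above (the helper name `neighbor_overlap` and the exact implementation are illustrative, not taken from the PR):

```python
import numpy as np
from scipy.spatial.distance import cdist

def neighbor_overlap(X, X_embedded, k=5):
    # Pairwise Euclidean distances in the original and the embedding space.
    d_orig = cdist(X, X)
    d_emb = cdist(X_embedded, X_embedded)
    # Indices of the k nearest neighbors of each point; column 0 after
    # argsort is the point itself, so it is skipped.
    nn_orig = np.argsort(d_orig, axis=1)[:, 1 : k + 1]
    nn_emb = np.argsort(d_emb, axis=1)[:, 1 : k + 1]
    # Fraction of shared neighbors per point, averaged over all points.
    fractions = [len(set(a) & set(b)) / k for a, b in zip(nn_orig, nn_emb)]
    return float(np.mean(fractions))

# The test would then assert, for example:
# assert neighbor_overlap(X, tsne.fit_transform(X)) > 0.6
```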

@david-cortes-intel (Contributor)

I think, given the characteristics of the data that you are passing, it could be done by selecting some hard-coded set of points by index from "Complex Dataset1" that should end up being similar, plus a selected set of points that should end up being dissimilar to the earlier ones, with the test then checking that the Euclidean distances in the embedding space among the points in the first set are smaller than the distances between each point in the first set and each point in the second set.

Also maybe "Complex Dataset2" is not needed.
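
A minimal sketch of this suggestion (the index sets `group_a` and `group_b` are hypothetical placeholders, not the indices used in the PR):

```python
import numpy as np
from scipy.spatial.distance import cdist

def check_group_separation(embedding, group_a, group_b):
    # Pairwise distances within group A in the embedding space;
    # the upper triangle skips self-distances and duplicate pairs.
    d_aa = cdist(embedding[group_a], embedding[group_a])
    within = d_aa[np.triu_indices_from(d_aa, k=1)]
    # Distances from each point in group A to each point in group B.
    between = cdist(embedding[group_a], embedding[group_b]).ravel()
    # Every within-A distance should be smaller than every A-to-B distance.
    assert within.max() < between.min()

# Usage with hypothetical indices into "Complex Dataset1":
# check_group_separation(tsne.fit_transform(X), group_a=[0, 1, 2], group_b=[7, 8, 9])
```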

@yuejiaointel (Contributor Author) commented Jan 8, 2025

Hi David!
I fixed the logic based on your suggestion, and here is my understanding: first get a group A of similar points and a group B of points different from those in group A, then check that in the embedding space the distance between any two points in group A is less than the distance from those points to any point in group B. I ran the CI many times, and one problem with this approach is that it sometimes fails on GPU devices: in these cases the embedding did not keep close points close. It only occurs on pipeline runs, with no problem on a local machine. Not sure if I should create another ticket to investigate that. I also removed complex test 2.
Best,
Yue

@yuejiaointel (Contributor Author)

/intelci: run

@david-cortes-intel (Contributor)

Odd that the CI fails for sklearn 1.0 and 1.1. I see that there are many places throughout the code with conditions for sklearn<1.2 though:
https://github.com/uxlfoundation/scikit-learn-intelex/blob/main/daal4py/sklearn/manifold/_t_sne.py
Perhaps something is going wrong there.

@yuejiaointel (Contributor Author)

Hi David!
The tests are failing when checking for a zero embedding when the data is constant. But after reading the implementation, I actually think it makes sense. About the version difference: the default initialization in TSNE changed from random to PCA in 1.2, and that is why we see different behavior for different versions. Here is my understanding of why we get a non-zero embedding when using self._init = "random":

  1. First, a random embedding is generated:
     X_embedded = 1e-4 * random_state.standard_normal(size=(n_samples, n_components))
  2. Next, the joint probabilities are meaningless since the pairwise distances are 0, but they can still take non-zero values, so gradient descent will still push the embedding around and in the end we see some degenerate embedding:
     P = _joint_probabilities(distances, self.perplexity, self.verbose)

I actually tested stock sklearn and see the same behavior. Here is my simple script; I also see a non-zero embedding in the end:

```python
from sklearn.manifold import TSNE
import numpy as np

# Ten identical rows: all pairwise distances are zero.
constant_data = np.full((10, 10), fill_value=1)
tsne = TSNE(n_components=2, init="random", random_state=42, perplexity=5)
embedding = tsne.fit_transform(constant_data)
print(embedding)
```

I removed the check for a zero embedding because we see zero and non-zero embeddings for different versions. Hope this helps!
Yue

@david-cortes-intel (Contributor)

Then please move it into a separate test, using init="pca" in the constructor.

Also, I don't think the test should look for exact zeros everywhere. For constant data, it should test for a constant first dimension and a zero second dimension.
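
A sketch of the separate constant-data test being suggested here, with the stated expectations encoded as assertions (the shapes, parameters, and tolerances are assumptions, not the PR's actual test):

```python
import numpy as np
from sklearnex.manifold import TSNE

def test_tsne_constant_data_pca_init():
    # Ten identical rows: all pairwise distances are zero.
    constant_data = np.full((10, 10), fill_value=1.0)
    tsne = TSNE(n_components=2, init="pca", random_state=42, perplexity=5)
    embedding = tsne.fit_transform(constant_data)
    # Per the comment above: the first dimension should be constant
    # across samples, and the second dimension should be zero.
    assert np.allclose(embedding[:, 0], embedding[0, 0])
    assert np.allclose(embedding[:, 1], 0.0)
```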

@yuejiaointel (Contributor Author)

Hi David,
Thx for the advice! I created a test for constant data that checks both random and PCA initialization.
Best,
Yue

@david-cortes-intel (Contributor) left a comment

LGTM, pending CI results.

@ethanglaser (Contributor) left a comment

Great work on this so far! A few small comments from my side, but it looks like it's almost ready for merge.

@ethanglaser (Contributor)

/intelci: run

@ethanglaser (Contributor)

/intelci: run

@ethanglaser (Contributor)

/intelci: run

@yuejiaointel yuejiaointel merged commit 8f20757 into uxlfoundation:main Jan 14, 2025
27 of 28 checks passed
@yuejiaointel yuejiaointel deleted the expand_tsne_test_coverage branch January 14, 2025 18:13