
Add smoothness metric #72

Draft · wants to merge 8 commits into master
Conversation

@hugofluhr (Collaborator) commented Aug 18, 2023

Closes #71

This PR adds smoothness computation for signals defined on the nodes of a graph. It's my first contribution to such a project, so I'm happy to get feedback :)

Proposed Changes

  • Add a smoothness function in operations.metrics.
  • Add the corresponding test function.
  • Add the corresponding class method in objects.
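For context, the core of the proposed metric is the graph quadratic form s.T @ L @ s. A minimal NumPy sketch (function and variable names are illustrative, not necessarily the ones in this PR):

```python
import numpy as np

def smoothness(laplacian, signal):
    """Graph smoothness of a node signal: the quadratic form s.T @ L @ s.

    Small values mean the signal varies little across connected nodes.
    """
    return np.matmul(signal.T, np.matmul(laplacian, signal))

# Example: the Laplacian of a 3-node path graph.
laplacian = np.array([[1., -1., 0.],
                      [-1., 2., -1.],
                      [0., -1., 1.]])
# A constant signal is maximally smooth: L @ ones == 0, so smoothness is 0.
print(smoothness(laplacian, np.ones(3)))
```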

Change Type

  • bugfix (+0.0.1)
  • minor (+0.1.0)
  • major (+1.0.0)
  • refactoring (no version update)
  • test (no version update)
  • infrastructure (no version update)
  • documentation (no version update)
  • other

Checklist before review

  • I added everything I wanted to add to this PR.
  • [Code or tests only] I wrote/updated the necessary docstrings.
  • [Code or tests only] I ran and passed tests locally.
  • [Documentation only] I built the docs locally.
  • My contribution is harmonious with the rest of the code: I'm not introducing repetitions.
  • My code respects the adopted style, especially linting conventions.
  • The title of this PR is explanatory on its own, enough to be understood as part of a changelog.
  • I added or indicated the right labels.
  • I added information regarding the timeline of completion for this PR.
  • Please, comment on my PR while it's a draft and give me feedback on the development!

@codecov bot commented Aug 18, 2023

Codecov Report

Merging #72 (854c047) into master (4ffa7cb) will increase coverage by 0.13%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master      #72      +/-   ##
==========================================
+ Coverage   90.93%   91.06%   +0.13%     
==========================================
  Files          14       14              
  Lines        1235     1254      +19     
==========================================
+ Hits         1123     1142      +19     
  Misses        112      112              
Files Changed                 Coverage            Δ
nigsp/cli/run.py               97.14% <100.00%>   (+0.08%) ⬆️
nigsp/objects.py              100.00% <100.00%>   (ø)
nigsp/operations/metrics.py   100.00% <100.00%>   (ø)
nigsp/references.py           100.00% <100.00%>   (ø)

@venkateshness (Collaborator)

Looks good to me!

Some nice-to-have things:
a) It would be worth citing a reference paper in the docstring, e.g. https://www.nature.com/articles/srep42013.pdf.
b) Compare the number of nodes in laplacian and signal and raise an error if there's a mismatch, before the np.dot().

@smoia (Collaborator) left a comment

Hey @hugofluhr, this is a very solid start, thank you so much!

A couple of additional features would be great (as @venkateshness was proposing):

  • Add if statements to check that:
    1. the signal is 1D, in which case a warning should be returned (e.g. here) and an axis added (in last position, I think, but not sure) to fake a 2D matrix
    2. the first dimension of laplacian matches the first dimension of signal, and if not it raises a ValueError (like here)
  • Consequently, add breaking tests to the test suite
  • Add a citation (the one you believe is the most relevant)
  • Don't forget to add smoothness in the CLI (e.g. here) and in the supported metrics list

Optionally, if you have time:

  • when you add the if statements, you could test whether, if check 2 fails, transposing signal fixes things (raising a warning, of course)
  • in an SDI fashion (if it makes sense?), you could replicate functional_connectivity's structure, turning what you have now into a hidden sub-function, and then add features to compute smoothness on the whole timeseries and on the split timeseries!
  • Also, if signal has a third dimension (e.g. multisubject), check that the operations don't break!
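Taken together, the requested checks might be sketched like this, with LGR as the module logger (names, messages, and the transpose fallback are illustrative, not the PR's actual code):

```python
import logging

import numpy as np

LGR = logging.getLogger(__name__)

def smoothness(laplacian, signal):
    # 1. If the signal is 1D, warn and add a trailing axis to fake a 2D matrix.
    if signal.ndim == 1:
        LGR.warning("Signal is 1D; adding a trailing axis to treat it as 2D.")
        signal = signal[..., np.newaxis]
    # 2. Check that the first dimensions of laplacian and signal match;
    #    optionally try transposing the signal first (with a warning).
    if laplacian.shape[0] != signal.shape[0]:
        if laplacian.shape[0] == signal.shape[1]:
            LGR.warning("Signal seems to be transposed; transposing it back.")
            signal = signal.T
        else:
            raise ValueError("The dimensions of the signal and Laplacian don't match.")
    return np.matmul(signal.T, np.matmul(laplacian, signal))
```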

@hugofluhr (Collaborator, Author) commented Aug 21, 2023

Thanks for the inputs!

Progress tracker:

  • add ref
  • check input sizes
  • add smoothness to CLI
  • add smoothness to list of supported metrics
  • try transposing signal if dimension mismatch
  • add corresponding tests
  • allow for 3rd dimension of signal, e.g. subjects
  • change structure to match functional_connectivity, hidden sub-function

@smoia smoia changed the title Adding smoothness metric Add smoothness metric Aug 21, 2023
@smoia smoia added the Minormod This PR generally closes an `Enhancement` issue. It increments the minor version (0.+1.0) label Aug 21, 2023
@hugofluhr (Collaborator, Author) commented Aug 22, 2023

@smoia the coverage is decreasing partly because the LGR.warning calls are not covered in tests; any guidance on how to cover those?

@venkateshness (Collaborator)

Unless I'm missing something (I had a quick look on my mobile), LGR.warning is triggered by if statements in two cases. Could you add inputs that hit those if statements in the break tests?

@smoia (Collaborator) commented Aug 22, 2023

If the coverage decreases a little, that's ok.
If you do want to test particular cases, I would add a couple of asserts in the main test of the function that do specific checks on those cases!

Break tests are only for raising errors!

computed_smoothness = metrics.smoothness(laplacian, signal)

assert (expected_smoothness == computed_smoothness).all()
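A break test for the error case could be sketched like this, using a minimal stand-in for the PR's function since the real one lives in nigsp.operations.metrics (in the actual test suite the try/except below would be `with pytest.raises(ValueError):`):

```python
import numpy as np

def smoothness(laplacian, signal):
    # Minimal stand-in for metrics.smoothness: only the error path matters here.
    if laplacian.shape[0] != signal.shape[0]:
        raise ValueError("The dimensions of the signal and Laplacian don't match.")
    return np.matmul(signal.T, np.matmul(laplacian, signal))

# Break test: incompatible shapes must raise, not silently compute.
try:
    smoothness(np.zeros((10, 10)), np.zeros((7, 4)))
except ValueError as err:
    print(f"Raised as expected: {err}")
```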

@smoia (Collaborator) commented Aug 22, 2023

You can add an assert here with signal = rand(10) and expected_smoothness computed on signal[..., np.newaxis]. And another one with signal = rand(2,10) and expected smoothness on signal.T.

Or use chatGPT 🤣

s3 = rand(2, 10)
laplacian = rand(10, 10)

expected_smoothness1 = np.dot(s1.T, np.dot(laplacian, s1))
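Spelled out, the two suggested asserts would compute the expected values explicitly (a sketch; `metrics.smoothness` in the commented lines stands for the PR's function, assumed to pad a 1D signal and transpose a mismatched one):

```python
import numpy as np
from numpy.random import rand

s1 = rand(10)        # 1D signal: expected value computed on s1[..., np.newaxis]
s3 = rand(2, 10)     # "transposed" signal: expected value computed on s3.T
laplacian = rand(10, 10)

s1_2d = s1[..., np.newaxis]
expected_smoothness1 = np.dot(s1_2d.T, np.dot(laplacian, s1_2d))  # shape (1, 1)
# Smoothness of s3.T, i.e. (s3.T).T @ L @ s3.T:
expected_smoothness3 = np.dot(s3, np.dot(laplacian, s3.T))        # shape (2, 2)

# The actual asserts would then compare these to the function's output, e.g.:
# assert (expected_smoothness1 == metrics.smoothness(laplacian, s1)).all()
# assert (expected_smoothness3 == metrics.smoothness(laplacian, s3)).all()
```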
Collaborator

Is this correct? If so you don't need a warning and a special treatment, since it's the same as the main case!

@hugofluhr (Collaborator, Author) commented Aug 24, 2023

For s3 I need to transpose it by hand to compute expected_smoothness, otherwise the dot product won't work! See the difference between lines 81 and 84.

Collaborator

I'm talking about s1-s2

Collaborator (Author)

Ah yes, that's correct, sorry. I'm trying to make it work with a 3rd dimension for different subjects, so I'll see how this evolves; I will probably move from np.matmul to np.tensordot.

@venkateshness (Collaborator) commented Aug 24, 2023

Not sure it'd fit in memory, so keep an eye on memory usage! You might have to run over the subjects dimension with a loop.

Collaborator

Tensordot might indeed not work out. Maybe you can constrain the axes in the multiplication, or indeed loop through subjects. I have some code in another function (the graph Fourier transform, if I remember correctly) that splits cases based on dimensions (and runs a loop).
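One way to support a third (subjects) dimension without tensordot is a plain loop over the last axis, as suggested. A sketch under the assumption that signal is shaped (nodes, time, subjects):

```python
import numpy as np

def smoothness_multi(laplacian, signal):
    """Compute smoothness subject by subject for a (nodes, time, subjects) signal.

    Looping over subjects keeps memory usage bounded compared to one big
    tensor product across all subjects at once.
    """
    n_time, n_subs = signal.shape[1], signal.shape[2]
    out = np.empty((n_time, n_time, n_subs))
    for sub in range(n_subs):
        s = signal[..., sub]  # (nodes, time) slice for one subject
        out[..., sub] = np.matmul(s.T, np.matmul(laplacian, s))
    return out
```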

    else:
        raise ValueError("The dimensions of the signal and Laplacian don't match.")

    return np.matmul(signal.T, np.matmul(laplacian, signal))
Collaborator

Yeaaaah... Do you remember when your function was one line? Now you understand the pain of the Sdev...

Collaborator (Author)

Yes definitely, thanks for the guidance and regular feedback!

@smoia (Collaborator) commented Nov 2, 2023

@hugofluhr can we merge this in?
