[WIP] Add improved goodness of fit implementation #190

Open · wants to merge 4 commits into main from stes/better-goodness-of-fit
Conversation

@stes (Member) commented Oct 27, 2024

This adds an improved goodness-of-fit measure. Unlike the old variant, which simply reported the InfoNCE loss and therefore depends on the batch size, the proposed measure

  • is 0 at chance level (rather than at the log of the batch size)
  • needs no adjustment between single-session and multi-session solvers
  • increases as the model gets better, which might be more intuitive

The conversion is simply

GoF(model) = log(batch_size_per_session * num_sessions) - InfoNCE(model)

This measure is also used in DeWolf et al., 2024, Eq. (43).
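A minimal sketch of this conversion (the function name and signature are illustrative, not necessarily the API added by this PR):

```python
import math

def infonce_to_goodness_of_fit(infonce_loss: float,
                               batch_size: int,
                               num_sessions: int = 1) -> float:
    """Convert an InfoNCE loss into the proposed goodness-of-fit measure.

    Chance-level InfoNCE equals log(batch_size * num_sessions), so
    subtracting the observed loss yields a value that is 0 at chance
    level and grows as the model improves.
    """
    return math.log(batch_size * num_sessions) - infonce_loss
```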


Application example (GoF improves from 0 to a larger value during training):

[figure: training curves of the goodness-of-fit measure]


Closes https://github.com/AdaptiveMotorControlLab/CEBRA-dev/pull/669

cla-bot added the CLA signed label on Oct 27, 2024
stes self-assigned this on Oct 27, 2024
stes mentioned this pull request on Oct 27, 2024
@stes (Member, Author) commented Oct 27, 2024

TODO: Fix case where batch_size is None
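The original traceback (a screenshot) is not preserved here, but a hypothetical minimal reproduction of the failure, assuming the conversion multiplies the batch size directly:

```python
>>> import math
>>> batch_size = None  # default CEBRA batch size
>>> math.log(batch_size * 1)
TypeError: unsupported operand type(s) for *: 'NoneType' and 'int'
```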

stes force-pushed the stes/better-goodness-of-fit branch from 5e21cdc to c826b68 on October 27, 2024 at 21:31
stes force-pushed the stes/better-goodness-of-fit branch from c826b68 to f43971f on November 29, 2024 at 13:31
@CeliaBenquet (Member) commented:

@stes, regarding what I implemented in #202 that I do see here:

I think it would be good to have a really basic function where you provide the loss and the batch size, so that it is easily usable in the PyTorch implementation as well (see the sketch below).

Also, it would be nice to test for the default CEBRA.batch_size = None; I am not sure it is handled here.
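A hedged sketch of such a basic helper (name, signature, and the None handling are assumptions for illustration, not what this PR implements):

```python
import math
from typing import Optional

def goodness_of_fit(loss: float,
                    batch_size: Optional[int],
                    num_sessions: int = 1) -> float:
    """Convert a raw InfoNCE loss and batch size into the GoF measure."""
    if batch_size is None:
        # Default CEBRA configuration; an explicit batch size is required
        # to compute the chance-level offset log(batch_size * num_sessions).
        raise ValueError(
            "batch_size=None is not supported; pass the batch size used "
            "during training.")
    return math.log(batch_size * num_sessions) - loss
```

Usage with a loss value from a plain PyTorch training loop (values illustrative):

```python
gof = goodness_of_fit(loss=5.2, batch_size=512, num_sessions=1)
```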
