
Issue 304 streamline tune #348

Merged · 75 commits · Nov 28, 2024
Conversation

@evaham1 (Collaborator) commented Nov 26, 2024

In this PR I have edited the existing tune functions and added new ones, so that tune() can now be used to optimise both the number of variables to keep and the number of components to keep. The tune() functions now broadly provide the following functionality:

  1. Optimise the number of components to keep
  • Tells you how many components are informative for a given model
  • Available for all tune() functions apart from tune.spca() and tune.rcc()
  • Calculated by internally building the relevant model and then calling perf() on it (the only exception is tune.pca(), which simply returns the variance explained per component)
  2. Optimise the number of variables to keep
  • Tells you how many variables are informative per component for a given model
  • Only available for sparse methods, i.e. tune.spca(), tune.rcc(), tune.spls(), tune.splsda(), tune.block.splsda() and tune.mint.splsda()
  • Calculated using the code from the original tune() functions
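As a rough illustration of functionality (1), the component-tuning behaviour described above is conceptually equivalent to fitting the model yourself and calling perf() on it. A hedged sketch (using the srbct example data shipped with mixOmics; exact outputs may differ from the new tune() internals):

```r
## Sketch only: mimics what tune() does internally for functionality (1).
library(mixOmics)
data(srbct)           # example dataset bundled with mixOmics
X <- srbct$gene       # gene expression matrix
Y <- srbct$class      # tumour class labels

# Fit a PLS-DA model with a generous number of components,
# then assess per-component performance via cross-validation.
model <- plsda(X, Y, ncomp = 5)
model_perf <- perf(model, validation = "Mfold", folds = 5, nrepeat = 10)

# model_perf$choice.ncomp suggests how many components are informative.
```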

List of current tune() functions and which functionalities they have:

  • tune.pca() - (1)
  • tune.spca() - (2)
  • tune.rcc() - (2)
  • tune.plsda() - (1)
  • tune.splsda() - (1) and (2)
  • tune.pls() - (1)
  • tune.spls() - (1) and (2)
  • tune.block.plsda() - (1)
  • tune.block.splsda() - (1) and (2)
  • tune.mint.plsda() - (1)
  • tune.mint.splsda() - (1) and (2)

For the functions that have dual functionality, component optimisation (1) is used when test.keepX = NULL (and also test.keepY = NULL for tune.spls()); if test.keepX is not NULL, variable optimisation (2) is used.
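A hedged example of the dual-mode switch described above, using tune.splsda() (argument names as described in this PR; the exact signature may differ):

```r
library(mixOmics)
data(srbct)           # example dataset bundled with mixOmics
X <- srbct$gene
Y <- srbct$class

# (1) test.keepX = NULL -> optimise the number of components
tune_ncomp <- tune.splsda(X, Y, ncomp = 5, test.keepX = NULL,
                          validation = "Mfold", folds = 5)

# (2) test.keepX supplied -> optimise the number of variables per component
tune_keepX <- tune.splsda(X, Y, ncomp = 3,
                          test.keepX = c(5, 10, 20, 50),
                          validation = "Mfold", folds = 5)
# tune_keepX$choice.keepX gives the selected keepX per component.
```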

@evaham1 evaham1 merged commit ea9bc05 into master Nov 28, 2024
10 of 11 checks passed
@evaham1 evaham1 deleted the issue-304-streamline-tune branch November 28, 2024 04:09
Successfully merging this pull request may close these issues.

Refactoring: streamline perf() and tune() functions to do performance assessment on just input model