
AmiciObjective: check that sensitivities wrt all relevant parameters can be computed #1416

Open

dweindl wants to merge 11 commits into develop
Conversation

@dweindl (Member) commented Jun 17, 2024

Adds a check whether the AMICI model in an AmiciObjective is able to compute sensitivities w.r.t. the objective parameters.

Closes #1414

This raised the question of whether an objective can have implicitly fixed parameters, i.e., whether it should be considered a supported use case that some objective parameters affect the objective value but have zero entries in the gradient. This is what happens if one uses PetabImporter.create_objective() without passing the result through PetabImporter.create_problem(). So far, pypesto is rather unclear on this. Whether it affects optimization depends on the Problem into which the objective is put.

My assumption here is that this is undesirable and that changes in the objective value should be reflected in the gradient.
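To make the intent of the check concrete, here is a minimal, hypothetical sketch of the kind of comparison involved. The function name and signature are illustrative, not the actual pypesto implementation; only amici.Model.getParameterIds() is assumed from the AMICI API.

import amici


def parameters_with_missing_sensitivities(
    objective_parameter_ids: list[str], amici_model: amici.Model
) -> list[str]:
    """Return objective parameters the AMICI model cannot differentiate w.r.t.

    Illustrative sketch only; the real check lives inside AmiciObjective.
    """
    # Parameters the AMICI model exposes for sensitivity computation.
    differentiable = set(amici_model.getParameterIds())
    # Any objective parameter not in this set would silently end up with a
    # zero gradient entry, which is exactly what the new check should flag.
    return [p for p in objective_parameter_ids if p not in differentiable]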

@codecov-commenter commented Jun 17, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 31.05%. Comparing base (0bf7f36) to head (f107c48).


Additional details and impacted files
@@             Coverage Diff              @@
##           develop    #1416       +/-   ##
============================================
- Coverage    83.88%   31.05%   -52.83%     
============================================
  Files          160      160               
  Lines        13096    13105        +9     
============================================
- Hits         10985     4070     -6915     
- Misses        2111     9035     +6924     


@dweindl dweindl marked this pull request as ready for review June 17, 2024 14:39
@dweindl dweindl requested a review from FFroehlich as a code owner June 17, 2024 14:39
@PaulJonasJost (Collaborator) left a comment:

Thanks, the check looks sound to me, but probably good if @FFroehlich has another look at it.

Conceptual question: do I get it right that if I currently (and henceforth) create the objective only with PetabImporter.create_objective(), I get back an objective which will artificially compute gradients/sensitivities for all parameters, regardless of whether they are fixed or not (artificially in the sense that it returns 0 for fixed parameters)? So the only difference between passing it / not passing it to .create_problem would be the dimensionality?
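For illustration, the two construction paths contrasted in this question might look as follows. This is a sketch only: petab_problem stands for an already-loaded PEtab problem, and the behavioral comments restate the question's reading rather than confirmed behavior.

import pypesto.petab

importer = pypesto.petab.PetabImporter(petab_problem)

# Objective only: PEtab-fixed parameters remain part of the parameter vector,
# and (per the question) their gradient entries come back as zero.
obj_only = importer.create_objective(max_sensi_order=1)

# Objective wrapped into a Problem: fixed parameters are removed from the set
# of free parameters, so the optimization-facing dimensionality differs.
problem = importer.create_problem(
    objective=importer.create_objective(max_sensi_order=1)
)
obj_via_problem = problem.objective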

Comment on lines +450 to +452
This object is expected to be passed to
:class:`PetabImporter.create_problem` to correctly handle fixed
parameters.
Collaborator commented:

Should we have a test to check the expected behavior if one does not pass it to PetabImporter.create_problem?

@dweindl (Member, Author) replied:

The problem is that we don't know the expected behaviour if one does not pass it to PetabImporter.create_problem.

Comment on lines +196 to +198
objective = importer.create_problem(
objective=importer.create_objective(max_sensi_order=1)
).objective
Collaborator commented:

This might be a minor thing, but I find this a really unintuitive way of creating the objective function correctly.

@dweindl (Member, Author) replied:

The problem is that PEtab fixed parameters are only handled in create_problem, not in create_objective. Maybe this could/should be changed; I'm not completely sure what might rely on the current behaviour.

Contributor commented:

Well, both the importer and the problem should be aware of fixed parameters, so I agree that one should naturally expect problem.create_objective, importer.create_objective, and importer.create_problem().objective to be equivalent, and that passing importer.create_objective(max_sensi_order=1) to importer.create_problem shouldn't be necessary.
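In other words, the comment expects the following ways of obtaining an objective to behave identically with respect to PEtab-fixed parameters. This is a sketch using only the calls quoted in the thread; whether they currently are equivalent is exactly the open point.

# All three are expected to handle PEtab-fixed parameters the same way.
obj_a = importer.create_objective(max_sensi_order=1)
obj_b = importer.create_problem().objective
obj_c = importer.create_problem(
    objective=importer.create_objective(max_sensi_order=1)
).objective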

"""Get the model parameters for which sensitivities are missing.

Get the model parameters w.r.t. which sensitivities are required,
but aren't available missing.
Contributor commented:

Suggested change
but aren't available missing.
but aren't available.


@dweindl dweindl self-assigned this Sep 25, 2024
Successfully merging this pull request may close these issues:

AmiciObjective: For sensi_orders>0, check that sensitivities wrt all relevant parameters are computed