
michaelis_menten transformation as pt.TensorVariable #1054

Merged
merged 8 commits into from
Oct 17, 2024

Conversation

PabloRoque
Contributor

@PabloRoque PabloRoque commented Sep 20, 2024

Make michaelis_menten consistent with the rest of the SaturationTransformation classes by wrapping its output as a pt.TensorVariable

Description

  • Rename the old michaelis_menten as michaelis_menten_function, to avoid major changes to mmm.utils.estimate_menten_parameters (its L70).
  • Wrap the previous michaelis_menten_function as a pt.TensorVariable inside the new michaelis_menten.
  • Minor change in test_michaelis_menten: the TensorVariable now needs to be evaluated.
  • Minor changes to type hints.
  • Wrap MichaelisMentenSaturation.function with pt.as_tensor_variable.
  • Expand the test_plotting.mock_mmm fixture with MichaelisMentenSaturation.
  • Add .eval() calls in the example notebook.
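The saturation curve being wrapped can be illustrated with a minimal numpy sketch (the formula matches the `return alpha * x / (lam + x)` line in transformers.py; the actual PR wraps the result with pt.as_tensor_variable rather than using numpy):

```python
import numpy as np

def michaelis_menten_function(x, alpha: float, lam: float):
    """Michaelis-Menten saturation: rises toward the plateau `alpha`,
    reaching half of it at x = lam."""
    return alpha * x / (lam + x)

# At x = lam the curve sits at exactly half its maximum.
half_max = michaelis_menten_function(np.float64(2.0), alpha=10.0, lam=2.0)
```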

Related Issue

Checklist

Modules affected

  • MMM
  • CLV

Type of change

  • New feature / enhancement
  • Bug fix
  • Documentation
  • Maintenance
  • Other (please specify):

📚 Documentation preview 📚: https://pymc-marketing--1054.org.readthedocs.build/en/1054/

@PabloRoque PabloRoque changed the title michaelis_menten tranformation as pt.TensorVariable michaelis_menten transformation as pt.TensorVariable Sep 20, 2024

@@ -914,6 +914,31 @@ def michaelis_menten(
return alpha * x / (lam + x)


def michaelis_menten(
Contributor

Do we need two functions?

Contributor
@wd60622 wd60622 Sep 20, 2024

Can all of the SaturationTransformations be wrapped in as_tensor_variable instead of making these changes?

i.e. here:

return self.function(x, **kwargs)
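The single-location fix suggested here can be sketched with stand-ins (Tensor and as_tensor below are hypothetical placeholders for pt.TensorVariable and pt.as_tensor_variable, so the pattern runs without pytensor):

```python
class Tensor:
    """Hypothetical stand-in for pt.TensorVariable."""
    def __init__(self, value):
        self.value = value

def as_tensor(x):
    """Stand-in for pt.as_tensor_variable: wrap unless already wrapped."""
    return x if isinstance(x, Tensor) else Tensor(x)

class SaturationTransformation:
    def apply(self, x, **kwargs):
        # Wrapping once here covers every subclass, including user-defined
        # ones whose `function` returns a plain float/array, not a tensor.
        return as_tensor(self.function(x, **kwargs))

class MichaelisMenten(SaturationTransformation):
    def function(self, x, alpha=10.0, lam=2.0):
        return alpha * x / (lam + x)  # returns a plain float

result = MichaelisMenten().apply(2.0)
```

One wrap in the shared apply method is enough; subclasses do not need to repeat it.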

Contributor Author

Do we need two functions?

We don't need both, but it made my life easier to avoid the pesky L70

Can all of the SaturationTransformations be wrapped in as_tensor_variable instead of making these changes?
Do you have anything in mind? I can think of 2 ways:

  • Adding boilerplate to all the classes. Perhaps defining a decorator to reduce boilerplate.
  • Rely on Transformation._checks(), but there we would need to trust the type hints, using something like signature(class_function).return_annotation, and wrap with pt.as_tensor_variable if needed.

Perhaps you have something less convoluted in mind?
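A rough sketch of the decorator idea from the first bullet (returns_tensor and as_tensor are hypothetical names, with a plain-Python stand-in where real code would call pt.as_tensor_variable):

```python
from functools import wraps

class Tensor:
    """Hypothetical stand-in for pt.TensorVariable."""
    def __init__(self, value):
        self.value = value

def as_tensor(x):
    """Stand-in for pt.as_tensor_variable."""
    return x if isinstance(x, Tensor) else Tensor(x)

def returns_tensor(func):
    """Decorator: coerce a transformation's return value into a tensor."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        return as_tensor(func(*args, **kwargs))
    return wrapper

@returns_tensor
def michaelis_menten(x, alpha, lam):
    return alpha * x / (lam + x)

out = michaelis_menten(2.0, alpha=10.0, lam=2.0)
```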

Contributor
@wd60622 wd60622 Sep 20, 2024

Can you not add it in the one location of the apply method (where I linked)? That is a common method that is not overridden by the subclasses

Contributor

I think having it in apply would make sure that other custom saturation transformations will not encounter this bug either

Contributor

It shouldn't hurt to have pt.as_tensor_variable wrap an already-existing TensorVariable; pytensor will figure out the optimization
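A toy illustration of why double-wrapping is harmless (wrap_tensor is a hypothetical name mimicking the pass-through behavior of pt.as_tensor_variable on an existing variable; it is not the pytensor implementation):

```python
class TensorVariable:
    """Minimal stand-in for a pytensor variable."""
    def __init__(self, value):
        self.value = value

def wrap_tensor(x):
    # Mimics pt.as_tensor_variable: an existing variable passes through.
    if isinstance(x, TensorVariable):
        return x
    return TensorVariable(x)

once = wrap_tensor(3.0)
twice = wrap_tensor(once)  # no extra wrapping, same object back
```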

Contributor Author

Can you not add it in the one location of the apply method? (where I linked) That is a common method that is not overwritten by the subclasses

The linked apply method needs to be called within a model context. Not the case in the plotting function.

Contributor Author
@PabloRoque PabloRoque Sep 20, 2024

The proposed sample_curve might work though, since it opens a model context. Having a look

Contributor Author
@PabloRoque PabloRoque Sep 20, 2024

Do we want to add another pm.Deterministic to the model?
That would be the case if we use _sample_curve.

Contributor

We wouldn't want to use the private methods of the saturation transformation. Does sample_curve with fit_result not get us close to the data that is plotted? (Just not scaled on x and y.)

Comment on lines 831 to 832
alpha: float,
lam: float,
Contributor

I still don't know the best type hints here. But TensorVariable is also accepted here

Contributor Author

I still don't know the best type hints here. But TensorVariable is also accepted here

Reverted changes to type hints

@@ -22,7 +22,7 @@
import xarray as xr
from scipy.optimize import curve_fit, minimize_scalar

from pymc_marketing.mmm.transformers import michaelis_menten
from pymc_marketing.mmm.transformers import michaelis_menten_function


def estimate_menten_parameters(
Contributor

Where is this still used? I thought there was a generalization of this? @cetagostini

Contributor

Created an issue here: #1055

Contributor
@wd60622 wd60622 left a comment

Can you write a test for the case that discovered this bug?

@wd60622
Contributor

wd60622 commented Sep 20, 2024

I am looking over the _plot_response_curve_fit internals. I think it would be better to rely on the SaturationTransformation.sample_curve method so that the solution doesn't directly call the function method.

There needs to be a solution that works for other custom saturation transformations.

I will create an issue for this. We can address it in the future: #1056

@PabloRoque
Contributor Author

Can you write a test for the case that discovered this bug?

Added a saturation fixture; test_mmm_plots now includes MichaelisMentenSaturation.
It would fail without the TensorVariable wrapping in the plot_direct_contribution_curves fixture, but passes with the fix in this PR (pending suggested changes).

@PabloRoque
Contributor Author

SaturationTransformation.sample_curve

This will add 3 extra pm.Deterministic to the model.
Are we happy with that behavior?

@wd60622
Contributor

wd60622 commented Sep 20, 2024

This will add 3 extra pm.Deterministic to the model.

Are we happy with that behavior?

Let's skip it if that is required. I don't think that would be good.

@@ -826,10 +826,10 @@ def tanh_saturation_baselined(
return gain * x0 * pt.tanh(x * pt.arctanh(r) / x0) / r


def michaelis_menten(
def michaelis_menten_function(
Contributor

This is only used in the functions we will deprecate, right?

Contributor Author

This is only used in the functions we will deprecate, right?

Yes, they are only used in the methods linked in #1055.
If we remove those, there is no need to define the extra function.

Contributor

Mind closing that now?

Contributor Author

Hi @wd60622,

I'm having doubts about completely removing the functions in #1055.

As an example, take estimate_menten_parameters. It was introduced in #329, and never used outside of the current test. I am guessing the original purpose was to double check that the introduced function gives correct results. If we remove the tests, we would never be checking that again.

As an alternative, I've simplified the PR to wrap only MichaelisMentenSaturation.function with pt.as_tensor_variable.

Note that the newly introduced fixture now passes the tests.

Contributor

If we need to remove tests, that is fine.

This is the only saturation transformation which needs to be defined with two functions (which I find silly). Just consolidate.

@wd60622 wd60622 added MMM bug Something isn't working labels Sep 28, 2024
@wd60622
Contributor

wd60622 commented Oct 8, 2024

Thanks for all the edits @PabloRoque

I am out atm but @juanitorduz will take a look

@juanitorduz
Collaborator

I'll review on the weekend 🙏💪 (busy week ahead 😄)

@PabloRoque
Contributor Author

No rush! I'll be taking some time off myself until the 28th.

@juanitorduz juanitorduz added this to the 0.10.0 milestone Oct 16, 2024
Collaborator
@juanitorduz juanitorduz left a comment

This is looking good! Shall we merge this one and continue with other smaller PRs?

Contributor
@wd60622 wd60622 left a comment

Looks good. Let's merge and continue elsewhere if needed

@juanitorduz juanitorduz merged commit 429b955 into pymc-labs:main Oct 17, 2024
11 checks passed
Labels
bug Something isn't working MMM
Projects
None yet
Development

Successfully merging this pull request may close these issues.

MMM.plot_direct_contribution_curves errors when using MichaelisMentenSaturation
3 participants