
Add HSGP prior #415

Closed
wants to merge 80 commits into from
Conversation

@nialloulton (Contributor) commented Oct 30, 2023

Essentially 3 things:

  1. Flexibility in prior selection to anything in the PyMC API (this is a duplicate, as I believe there's already a PR for this)
  2. Flexibility in likelihood
  3. HSGP prior for model coefficients, intercept, alpha, etc.

Some inefficient functions can be deleted, e.g. GP_wrapper can be edited out of the code.


📚 Documentation preview 📚: https://pymc-marketing--415.org.readthedocs.build/en/415/

@MarkusSagen (Contributor)

Is setting time-varying coefficients something one wants per default though?
Training time would increase exponentially, so it would be nice to have an opt-in for this, or an opt-out?

@nialloulton (Contributor, Author)

> Is setting time-varying coefficients something one wants per default though? Training time would increase exponentially, so would be nice to have an opt in for this or opt out?

It won't be the default; it'll be an option within the config, in combination with the upcoming PR on prior distribution flexibility. Model fit times using the HSGP are very efficient too: in testing I found minimal difference when using it in a single-KPI marketing-mix model on all channels.

@ricardoV94 (Contributor) commented Nov 1, 2023

I suggest we first merge #397 and then keep building this PR on top of it. This PR adds much more functionality and will also require more careful testing / design tweaking.

@ricardoV94 ricardoV94 added enhancement New feature or request MMM labels Nov 1, 2023
@ricardoV94 ricardoV94 marked this pull request as draft November 1, 2023 12:17
@eirikbaekkelund commented Nov 7, 2023

I think flexibility in the ordering adstock → saturation vs. saturation → adstock would also be useful. Additionally, it'd be nice to enable assignment of different adstock functions to different channels. Then it would maybe make sense, instead of having a model_config, to have an adstock_config and a saturation_config.
A bit like active_dims for kernels in GPyTorch for whoever's familiar.
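The per-channel adstock_config idea might look like the following sketch; the function names, config keys, and `apply_adstock` helper are all hypothetical illustrations, not pymc-marketing API:

```python
# Hypothetical per-channel adstock_config, as suggested above.
# Function and key names are illustrative, not pymc-marketing API.
import numpy as np

def geometric_adstock(x, alpha):
    """Geometric carryover: out[t] = x[t] + alpha * out[t-1]."""
    out = np.zeros_like(x, dtype=float)
    carry = 0.0
    for i, xi in enumerate(x):
        carry = xi + alpha * carry
        out[i] = carry
    return out

def delayed_adstock(x, alpha, theta):
    """Lag weights peaking at lag theta, normalized to sum to 1."""
    w = alpha ** ((np.arange(len(x)) - theta) ** 2)
    w = w / w.sum()
    return np.convolve(x, w)[: len(x)]

# Each channel picks its own transform and parameters
adstock_config = {
    "tv":     {"fn": geometric_adstock, "kwargs": {"alpha": 0.7}},
    "search": {"fn": delayed_adstock,   "kwargs": {"alpha": 0.6, "theta": 2}},
}

def apply_adstock(channels, config):
    return {
        name: config[name]["fn"](x, **config[name]["kwargs"])
        for name, x in channels.items()
    }
```

This mirrors the active_dims pattern mentioned above: the config routes each channel to its own transform rather than applying one global setting.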

@ricardoV94 ricardoV94 changed the title Prior & likelihood selection via config + HSGP prior Add HSGP prior Nov 15, 2023
```python
priors[param] = self.gp_wrapper(
    name=param,
    X=np.arange(len(self.X[self.date_column]))[:, None],
    config=config,
    positive=is_positive,
)
continue

length = dimensions.get(config.get("dims", [None, None])[1], 1)
```


Needlessly terse. Can be simplified to:

```python
dimensions.get(config.get("dims", None), 1)
```


My apologies, my suggestion won't work and this is actually decent code.

Comment on lines +297 to +302
```python
# Initial value based on parameter name
is_positive = param in positive_params

# Override if the config explicitly sets the 'positive' key
if 'positive' in config:
    is_positive = config.get('positive')
```


This can be simplified to:

```python
is_positive = config.get('positive', param in positive_params)
```
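A quick check that the one-line form matches the original branching logic (names mirror the snippet above; the parameter names and config values are made up for the test):

```python
# Verify config.get('positive', default) behaves like the if/else form.
# positive_params contents are made-up examples.
positive_params = {"alpha", "lam"}

def is_positive_verbose(param, config):
    is_positive = param in positive_params
    if "positive" in config:
        is_positive = config.get("positive")
    return is_positive

def is_positive_oneline(param, config):
    return config.get("positive", param in positive_params)

# Exhaustive check over the three relevant config cases
for param in ("alpha", "beta"):
    for config in ({}, {"positive": True}, {"positive": False}):
        assert is_positive_verbose(param, config) == is_positive_oneline(param, config)
```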

```python
eta = pm.Exponential(f"_eta_{name}", lam=1)
# cov = eta ** 2 * pm.gp.cov.ExpQuad(1, ls=ell)

cov = eta ** 2 * pm.gp.cov.Matern32(1, ls=ell)
```
@ulfaslakprecis commented Jan 16, 2024


I don't get this. `eta ** 2` is going to blow up for negative x. Plotting `eta ** 2 * pm.gp.cov.Matern32(1, ls=ell)` for x' = 0 and lam = 1:

[Screenshot: plot of the covariance function, Jan 16, 2024]

Is the point to only let coef value at t = 0 be influenced by coef values at t < 0?
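For reference, the Matern 3/2 kernel is stationary: it depends only on the distance r = |x − x'|, peaks at eta² at r = 0, and decays monotonically, so negative x values pose no problem. A numpy sketch of the standard closed form (not pymc-marketing code):

```python
# Stationary Matern 3/2 covariance:
#   k(r) = eta^2 * (1 + sqrt(3) r / ell) * exp(-sqrt(3) r / ell),  r = |x - x'|
# It equals eta^2 at r = 0 and decays as r grows; only the
# distance to x' matters, not the sign of x.
import numpy as np

def matern32(x, x_prime, eta=1.0, ell=1.0):
    r = np.abs(x - x_prime)
    s = np.sqrt(3.0) * r / ell
    return eta**2 * (1.0 + s) * np.exp(-s)
```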

@nialloulton (Contributor, Author)


Hey, apologies, I've been a bit slow on this one. There have been quite a few changes to main with the custom priors/likelihood PR, and shortly more with the out-of-sample prediction.

@bwengals is going to be helping out implementing this one too


No worries. For context, I'm here because I'm adding TVC to my company's MMM, and thought it would make sense to drop review comments for clarification's sake (and to give back, thanks for the beautiful lib). But if any of these comments are out of place, please just ignore them.

@nialloulton (Contributor, Author) commented Jan 16, 2024 via email

@nialloulton (Contributor, Author)

@ulfaslak do we have a new pr for TVPs or are we doing it within this one?

@wd60622 (Contributor) commented Mar 14, 2024

> @ulfaslak do we have a new pr for TVPs or are we doing it within this one?

Might suggest new one unless ulf is rebase genie 😁

@ulfaslak (Contributor)

@nialloulton @wd60622 I'm working on a new one :).

@ulfaslak ulfaslak mentioned this pull request Mar 19, 2024
@wd60622 (Contributor) commented May 9, 2024

Good to close this one @ulfaslak @nialloulton ?

@wd60622 (Contributor) commented Jun 3, 2024

Closing following #628

@wd60622 wd60622 closed this Jun 3, 2024