WIP: functional API #521
Conversation
Hi @jonas-eschle, thanks a lot for diving through the codebase and thinking along! First some background on the function interfaces: I think what you want to change has more to do with the argument order of the functions that are generated with SymPy; essentially #488, but swapped (see here). But this is more about the implementation, not the API; for the API we would have to think about …

Note: CI currently fails because of ComPWA/actions#61. The current constraint files were generated with …
Thanks for the explanations @redeboer, those are useful insights! AFAIU, #488 introduces some ordering? I may not quite understand the details of that, so maybe I'm missing the point (or is this anyway heavy WIP?).

Btw, fully agree, and I would suggest doing that anyway. But I think my point is different: the order of the arguments doesn't matter (as it's a mapping), but the code has a different meaning.
> gives precedence to p2

In the code, this means that it gives precedence to the default args. So think of the following, assuming the function takes `x` (data) and `mu`, `sigma` (parameters), and the model has a default for `mu` and `sigma`. We can call it with just the data, but we cannot explicitly pass the parameters here.

Right now it basically says "if the user gives a parameter value, override and ignore it, use the default". On the other hand, if we swap as suggested, it becomes "use the default values if not given, but if the user provides a parameter explicitly, use that instead". So basically nothing changes anywhere, except that a parameter can be given explicitly. Actually, the current behavior of deliberately (and silently!) overwriting parameters that are given explicitly to the function is probably not intentional? It looks a bit to me as if this always should have been the other way around?
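As a minimal sketch of the two merge orders with plain Python dicts (not the actual tensorwaves internals; the names here are made up):

```python
defaults = {"mu": 0.0, "sigma": 0.5}       # parameter defaults stored in the model
user_args = {"x": [1.0, 2.0], "mu": -1.5}  # what the user passes when calling

# Current order: the defaults are unpacked last, so they silently win
merged_current = {**user_args, **defaults}
print(merged_current["mu"])  # 0.0 -> the explicitly given value is ignored

# Swapped order: the defaults only fill in what the user did not provide
merged_swapped = {**defaults, **user_args}
print(merged_swapped["mu"])  # -1.5 -> the explicit value overrides the default
```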
Indeed, it's an internal thing related to how sympy lambdifies symbols to argument names. Because non-python symbol names are dummified, you can't really use the (internal) lambdified function with keyword arguments, so #448 separates the parameter arguments from the data arguments. It's indeed WIP (see ComPWA/compwa.github.io#195), but doesn't affect the interfaces.
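For reference, this is roughly what that dummification looks like with plain SymPy (a small sketch; the symbol names are invented and the generated dummy names vary between runs):

```python
import inspect

import sympy as sp

# Symbol names that are not valid Python identifiers are "dummified" by lambdify
m, gamma = sp.Symbol("m_{12}"), sp.Symbol("Gamma^2")
expr = m**2 + gamma

func = sp.lambdify([m, gamma], expr)
print(inspect.getsource(func))
# def _lambdifygenerated(_Dummy_22, _Dummy_23):
#     return _Dummy_22**2 + _Dummy_23
# -> the lambdified function can only be called positionally, not with keyword arguments
```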
Ahh sorry, I misunderstood the intention of the PR then :) I thought it had to do with the internal lambdified functions, so #488. You want to provide the option to use the parametrized function with different parameter values without using `update_parameters()`. If we go this way, the interface descriptions have to be updated (always good to do that anyway :)). It means that a …

Btw, is there a specific use case you have in mind?
Yes, exactly, that's the idea, and I think, except for adding a note in the docs that you can also use it directly, there is nothing to change. It should be 100% backwards compatible, and rather forward compatible (at least if I think about the best API idea).
The main consequence: it becomes jittable (right? Or am I missing something here?)! Also, the parametrized function automatically turns from a very specific object with a very specific interface (such as the parameter update) into a general function that takes the variables as inputs. But even without the jitting, for example for minimization, it's now a completely functional API as well, basically for free. So JIT and minimization are the main motivation. (I thought grads would also work, which would have been even better, but yep.) So there is a difference between calling the function with specific parameters and changing the default values of the function. While autodiff doesn't quite work with it (not sure why?), this is also the way to go for that.
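To make the motivation concrete, a small sketch of what a purely functional call signature enables (using jax directly with a hand-written Gaussian as a stand-in; this is not the tensorwaves API):

```python
import jax
import jax.numpy as jnp


def gaussian(data: dict) -> jnp.ndarray:
    # both the data sample and the parameter values are explicit inputs
    x, mu, sigma = data["x"], data["mu"], data["sigma"]
    return jnp.exp(-(((x - mu) / sigma) ** 2) / 2) / (sigma * jnp.sqrt(2 * jnp.pi))


jitted_gaussian = jax.jit(gaussian)  # jittable, because there is no hidden state

x = jnp.linspace(-2.0, 2.0, num=100)
y = jitted_gaussian({"x": x, "mu": -1.5, "sigma": 0.5})


# A minimizer can treat the parameters as just another input:
@jax.jit
def negative_log_likelihood(params: dict) -> jnp.ndarray:
    return -jnp.sum(jnp.log(gaussian({"x": x, **params})))


print(negative_log_likelihood({"mu": 0.0, "sigma": 0.5}))
```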
Is it? To me, it looks perfectly elegant, especially as internally the ParametrizedFunction does the exact same thing but overrides the parameters if given: it currently doesn't allow overriding, but forces its own state. I think it would solve, in a very elegant way, one of the largest problems with the API, namely that it isn't functional (but here I may miss details). Since you're also working on jitting in other PRs I've seen, maybe I am really just missing details. But then, the functional API that it offers for free seems easy to enable, without a downside I think?
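For concreteness, the change under discussion would roughly amount to the following sketch (illustrative names only, not the actual ParametrizedFunction implementation):

```python
class ParametrizedFunctionSketch:
    """Illustrative stand-in for a function with default parameter values."""

    def __init__(self, function, parameter_defaults: dict):
        self._function = function
        self._parameter_defaults = parameter_defaults

    def __call__(self, data: dict):
        # current behavior (defaults win):    {**data, **self._parameter_defaults}
        # proposed behavior (explicit wins):  {**self._parameter_defaults, **data}
        arguments = {**self._parameter_defaults, **data}
        return self._function(arguments)
```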
Yes, that's true for the implementation that this PR affects: it extends the behaviour of a …
I'm not too familiar with the internals of jitting, but could be. I wonder whether/why a …
Good point, actually the perfect summary of the problems that the …
We can also discuss this tomorrow, it's an interesting issue ;)
It passes through a "symbolic parameter", much like sympy, but care must be taken that it's not converted somewhere. Also, it would not reflect any updated state, leading to different behavior if run with JIT vs without (i.e. the Python object won't be updated if jitted, but it will if not jitted when the parameter is set). I think it's jittable because it's simply a function now? Hm, …
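A small illustration of that state-capture problem with a toy class (plain jax, not tensorwaves; all names here are made up):

```python
import jax
import jax.numpy as jnp


class StatefulModel:
    """Toy model whose parameter lives as Python state on the object."""

    def __init__(self, mu: float):
        self.mu = mu

    def __call__(self, x):
        return (x - self.mu) ** 2


model = StatefulModel(mu=0.0)
jitted_model = jax.jit(model.__call__)

x = jnp.array([1.0, 2.0])
print(jitted_model(x))  # traced with mu=0.0

model.mu = 3.0          # update the Python object...
print(model(x))         # ...the eager call sees mu=3.0
print(jitted_model(x))  # ...but the cached jitted trace still uses mu=0.0
```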
Indeed!
I don't know how jit works exactly (probably also package-specific), but if I were jit, I would have a hard time figuring out what the effect of those internal parameters is, even if the unpacking order is swapped 😅 If that is true, the fact that a …
Hah, actually this touches on a larger issue :D The use of tensorwaves has shifted towards streamlining the creation and use of large expressions/functions and less towards fitting, even if the interfaces had the latter in mind. Perhaps that is also what this PR is about, that the interfaces need to be rethought towards that. Also interesting!
actually, here, only Python is happening: yes, these are "symbolic parameters", but only the ones that survive the dict merge survive the jit, so to say
Sounds good, that is well aligned, and I am really happy to see the extensive work on the conversion and careful backend handling; it's quite a tricky issue (so happy to make use of it!)

P.S.: I'll just keep it open for now in case somebody has an urgent comment, but will probably close it if things are anyway meant to be handled differently and we can follow up in other PRs/chat.
@redeboer feel free to close, whatever you think fits best. As discussed, it's not crucial and I'll use other ways to access the function.
Thanks again for thinking along! :)

```python
import numpy as np
import sympy as sp

from tensorwaves.function.sympy import create_parametrized_function

x, μ, σ = sp.symbols("x mu sigma")
gaussian_expr = sp.exp(-(((x - μ) / σ) ** 2) / 2) / (σ * sp.sqrt(2 * sp.pi))
gaussian_func = create_parametrized_function(
    gaussian_expr,
    parameters={μ: 0, σ: 0.5},
    backend="numpy",
)
gaussian_func.update_parameters({"mu": 3.5})  # means updating "default" parameters?
y_values = gaussian_func(
    data={  # type hints and keyword suggest a data sample as input
        "x": np.linspace(-2.0, +2.0, num=100),
        "mu": -1.5,  # now this parameter is used
    }
)
```

Instead, to achieve the desired behavior, it is probably better to use `create_function`:

```python
from tensorwaves.function.sympy import create_function

gaussian_func = create_function(gaussian_expr, backend="numpy")
y_values = gaussian_func({
    "x": np.linspace(-2.0, +2.0, num=100),
    "mu": -1.2,
    "sigma": 0.5,
})
```

Still, the PR touched on many points that are probably better discussed in #525.
If we swap the order of data and params, the params associated with the model are treated as defaults and can be overridden by giving them explicitly.
This (to my understanding) makes the function callable with parameters as args instead of updating the parameters in the model.
There should not be any legacy issue, as I highly doubt anyone gives wrong parameter values and relies on the function to overwrite them with the defaults.
If that's fine, I'll gladly add a quick test (roughly along the lines of the sketch below), a changelog entry, etc.
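A rough idea of what such a test could look like under the proposed behavior (pytest-style, hypothetical expected outcome):

```python
import numpy as np
import sympy as sp

from tensorwaves.function.sympy import create_parametrized_function


def test_explicit_parameter_overrides_default():
    x, mu, sigma = sp.symbols("x mu sigma")
    expr = sp.exp(-(((x - mu) / sigma) ** 2) / 2) / (sigma * sp.sqrt(2 * sp.pi))
    func = create_parametrized_function(
        expr, parameters={mu: 0, sigma: 0.5}, backend="numpy"
    )

    sample = {"x": np.array([0.0])}
    with_default = func(sample)                  # uses the default mu=0
    with_explicit = func({**sample, "mu": 1.0})  # proposed: explicit mu wins

    # Under the proposed swap, an explicitly passed parameter must change the result
    assert not np.allclose(with_default, with_explicit)
```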
What do you think @redeboer ?