Hello. I am working on a project where we are building PyTorch Gaussian process models (https://gpytorch.ai/) and converting them into ONNX format. At runtime, some of these models need to perform linear algebra operations such as a Cholesky decomposition or a triangular matrix solve, so we would like to add operators to support these methods. I see that there has been periodic interest in this idea here before. Would the group be open to adding a new domain to support linear algebra operations?
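For concreteness, here is a minimal sketch (not our actual model; the class and tensor names are made up) of the kind of computation a GP posterior mean involves and where the export currently breaks down:

```python
import torch


class GPPosteriorMean(torch.nn.Module):
    # Toy stand-in for the solve pattern used in GP prediction:
    # mean = K_star @ K^{-1} @ y, computed via a Cholesky factor.
    def forward(self, K, K_star, y):
        L = torch.linalg.cholesky(K)        # K = L @ L.T, needs a Cholesky op
        alpha = torch.cholesky_solve(y, L)  # solves K @ alpha = y via triangular solves
        return K_star @ alpha


# Exporting a module like this is where we hit the gap: there are no ONNX
# operators corresponding to the Cholesky factorization or triangular solve.
# torch.onnx.export(GPPosteriorMean(), (K, K_star, y), "gp_mean.onnx")
```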
I'm thinking of creating an operator domain like ai.onnx.linalg and starting to add key operations modeled after the numpy.linalg library (https://numpy.org/doc/stable/reference/routines.linalg.html).
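To sketch what this might look like on the ONNX side (the Cholesky operator name, its signature, and the opset version below are hypothetical placeholders, not an existing spec):

```python
from onnx import TensorProto, helper

# Hypothetical node in the proposed domain.
chol = helper.make_node(
    "Cholesky",
    inputs=["A"],
    outputs=["L"],
    domain="ai.onnx.linalg",
)

graph = helper.make_graph(
    [chol],
    "linalg_example",
    [helper.make_tensor_value_info("A", TensorProto.FLOAT, [3, 3])],
    [helper.make_tensor_value_info("L", TensorProto.FLOAT, [3, 3])],
)

# The model would import the new domain alongside the default opset.
model = helper.make_model(
    graph,
    opset_imports=[
        helper.make_opsetid("", 18),
        helper.make_opsetid("ai.onnx.linalg", 1),
    ],
)
```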
I realize that these operations are not core to neural networks, but many other kinds of models do make use of linear algebra routines.
Is there interest in this idea?
Has this already been done somewhere?
FYI (for others) there has been some discussion of this already in Slack, in the operators channel.
A quick summary: there was a previous PR proposing to introduce SVD. One concern raised in that PR is that there are different algorithms for computing an SVD, which produce different results, so it was somewhat ambiguous what an implementation was supposed to do. It would help if the spec were more precise (e.g., by using an attribute to specify what is required). It was also felt that the choice of implementation/algorithm should be driven by concrete use cases and models.
So, in short, the two points that arise are:
Ensuring the spec is clear (e.g., iterative methods in linear algebra may require specifying a convergence threshold); see the sketch after this list.
Use-case/justification, both in terms of models that use the operator and backends/implementations interested in supporting it.
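On the first point, a hedged illustration of how attributes could pin down behavior that would otherwise be left to the implementation (the SVD operator and both attribute names here are hypothetical):

```python
from onnx import helper

# Hypothetical SVD node: attributes make explicit what a conforming
# implementation must produce, instead of leaving it algorithm-dependent.
svd = helper.make_node(
    "SVD",
    inputs=["A"],
    outputs=["U", "S", "Vh"],
    domain="ai.onnx.linalg",
    full_matrices=0,  # hypothetical: thin vs. full factorization
    tol=1e-10,        # hypothetical: convergence threshold for iterative algorithms
)
```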