Is your feature request related to a problem? If so, please describe.
MLflow models are not currently supported as a modelFormat in ModelMesh, despite KServe offering an InferenceService for MLflow models. Is there a particular reason for this limitation?
Describe your proposed solution
A custom serving runtime can be created using the MLFlowRuntime from MLServer (a rough sketch is included below). However, given that KServe already supports MLflow, I was wondering if there could be a more streamlined or built-in way to support MLflow models directly in ModelMesh.
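For reference, here is a minimal sketch of what such a custom ServingRuntime might look like. It is modeled on the MLServer-based runtimes that ship with modelmesh-serving; the runtime name, image tag, ports, and resource values are assumptions and would need to be adjusted for a real deployment.

```yaml
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: mlserver-mlflow   # hypothetical name for this custom runtime
spec:
  supportedModelFormats:
    # Advertise MLflow as a supported format so InferenceServices with
    # modelFormat "mlflow" can be scheduled onto this runtime.
    - name: mlflow
      version: "1"
      autoSelect: true
  multiModel: true
  grpcDataEndpoint: port:8001
  grpcEndpoint: port:8085
  containers:
    - name: mlserver
      # Assumed image tag; pin to whichever MLServer release you are using.
      image: seldonio/mlserver:1.3.2
      env:
        - name: MLSERVER_MODELS_DIR
          value: /models/_mlserver_models/
        - name: MLSERVER_GRPC_PORT
          value: "8001"
        - name: MLSERVER_HTTP_PORT
          value: "8002"
        # ModelMesh loads/unloads models dynamically, so don't load at startup.
        - name: MLSERVER_LOAD_MODELS_AT_STARTUP
          value: "false"
      resources:
        requests:
          cpu: 500m
          memory: 1Gi
        limits:
          cpu: "1"
          memory: 2Gi
  builtInAdapter:
    serverType: mlserver
    runtimeManagementPort: 8001
    memBufferBytes: 134217728
    modelLoadingTimeoutMillis: 90000
```

An InferenceService could then reference the MLflow format (and, optionally, this runtime explicitly). The storage key and path below are placeholders:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: mlflow-example
  annotations:
    serving.kserve.io/deploymentMode: ModelMesh
spec:
  predictor:
    model:
      modelFormat:
        name: mlflow
      runtime: mlserver-mlflow   # optional; autoSelect can also match the format
      storage:
        key: localMinIO          # placeholder storage-config key
        path: mlflow/my-model    # placeholder path to the MLflow model directory
```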