Automatic Inference Batching Support #295

Open
SinanAkkoyun opened this issue Jun 12, 2023 · 3 comments

Comments

@SinanAkkoyun

Hey!
The HuggingFace text-generation-inference server can automatically batch generations across concurrent HTTP requests: if one request is in progress and another arrives, it adapts the in-progress batch on the fly.

I want to build an inference solution based on faster-whisper.
Is manual batching supported? I am not experienced enough to safely implement it on my own, but I would like to build on top of it, if possible.

@arnavmehta7

Faster-whisper performs batching internally for various vad_segments. I would be curious to know whether we can get further speedups by batching multiple audios.

Afaik, the HuggingFace server uses a DELTA window: if multiple requests arrive within that window, they are batched and run together; otherwise each request runs on its own.
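A minimal sketch of that delta-window idea, assuming a hypothetical `transcribe_batch()` function that can process several audio inputs in one model call (faster-whisper does not expose such an API today, see #59). The `BATCH_DELTA` value is an arbitrary assumption:

```python
# Sketch of "delta window" request batching: requests arriving within
# BATCH_DELTA seconds of the first one are grouped into a single
# batched model call.
import queue
import threading
import time

BATCH_DELTA = 0.05  # seconds to wait for more requests (assumed value)
requests = queue.Queue()  # items: (audio, holder)

def transcribe_batch(audios):
    # Placeholder for a batched model call; not a real
    # faster-whisper API (see issue #59).
    return [f"transcript of {a}" for a in audios]

def batch_worker():
    while True:
        # Block until at least one request arrives.
        first = requests.get()
        batch = [first]
        deadline = time.monotonic() + BATCH_DELTA
        # Collect anything else that shows up within the delta window.
        while time.monotonic() < deadline:
            try:
                remaining = max(0.0, deadline - time.monotonic())
                batch.append(requests.get(timeout=remaining))
            except queue.Empty:
                break
        audios = [audio for audio, _ in batch]
        # Run the whole batch at once, then hand each result back.
        for (_, holder), text in zip(batch, transcribe_batch(audios)):
            holder["result"] = text
            holder["done"].set()

def submit(audio):
    # Called from each HTTP handler; blocks until the batch finishes.
    holder = {"done": threading.Event()}
    requests.put((audio, holder))
    holder["done"].wait()
    return holder["result"]
```

Each incoming HTTP request would call `submit()`, while a single `batch_worker` thread drains the queue and sizes batches dynamically.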

@guillaumekln
Contributor

> Faster-whisper performs batching internally for various vad_segments.

No, there is no batching at this time. See #59 which is the main issue for batching support.

@arnavmehta7

@guillaumekln Oh, very sorry for the oversight. I feel this could be done fairly easily by taking this as a reference: https://github.com/m-bain/whisperX/blob/main/whisperx/asr.py#L210-L230

@guillaumekln guillaumekln changed the title Inference Batching Support Automatic Inference Batching Support Aug 28, 2023