Have you tried cloning the response body?

```ts
// Clone the response before handing its body to a new Response
const clonedResponse = response.clone();

return new Response(clonedResponse.body, {
  headers
});
```
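For context, a minimal sketch of how that suggestion might slot into a `+server.ts` handler (the upstream URL and header handling are illustrative, not from the original report):

```ts
import type { RequestHandler } from '@sveltejs/kit';

export const GET: RequestHandler = async ({ fetch }) => {
  // Illustrative upstream request; the real endpoint was not shown.
  const response = await fetch('https://example.com/stream');

  // Clone, then return the clone's body to the client.
  const clonedResponse = response.clone();

  return new Response(clonedResponse.body, {
    headers: response.headers
  });
};
```

One caveat: `clone()` tees the underlying stream, so the original response's branch still exists and must itself be consumed or cancelled.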
-
I have a new attempt:

```ts
const stream = new ReadableStream({
  start(controller) {
    const reader = response.body!.getReader();

    // Pump the upstream body chunk by chunk into the new stream.
    function push() {
      reader
        .read()
        .then(({ done, value }) => {
          if (done) {
            // Upstream is exhausted; close the outgoing stream.
            controller.close();
            return;
          }
          controller.enqueue(value);
          push();
        })
        .catch((err) => {
          // Surface upstream read errors to the consumer.
          controller.error(err);
        });
    }

    push();
  }
});

return new Response(stream, {
  headers
});
```

I have yet to verify it.
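One thing the manual pump doesn't handle is cancellation: if the client disconnects, nothing tells the upstream reader to stop, which can leave the stream pinned. A sketch of the same idea with a `cancel` hook added (a hypothetical `restream` helper, not from the thread):

```ts
// Hypothetical helper: re-stream an upstream Response while propagating
// cancellation, so a dropped client also releases the upstream reader.
function restream(upstream: Response, headers?: HeadersInit): Response {
  const reader = upstream.body!.getReader();

  const stream = new ReadableStream<Uint8Array>({
    async pull(controller) {
      // pull() is only called when the consumer wants data,
      // which gives backpressure for free.
      const { done, value } = await reader.read();
      if (done) {
        controller.close();
        return;
      }
      controller.enqueue(value);
    },
    cancel(reason) {
      // Runs when the consumer stops reading (e.g. client disconnect).
      return reader.cancel(reason);
    }
  });

  return new Response(stream, { headers });
}
```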
-
Describe the bug

In a `+server.ts`, returning a `Response` with a body of type `ReadableStream` works... but the `ReadableStream` is never closed. This causes the stream to stay open and never be de-allocated by the GC.

Reproduction

Logs

System Info

Severity

serious, but I can work around it

Additional Information

I discovered this after seeing my app's memory usage trending up. After taking a heap dump, I saw thousands of unreleased `Client` objects (i.e. the `fetch`). To test my theory, I changed the code to consume the body up front (see the sketch below). This works, which I expected, because `arrayBuffer()` (and `json()`/`text()`) consumes the stream completely. However, if the stream is huge, this will pose a problem.
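A minimal sketch of that buffering workaround, assuming a handler that proxies an upstream fetch (the URL and header handling are illustrative):

```ts
import type { RequestHandler } from '@sveltejs/kit';

export const GET: RequestHandler = async ({ fetch }) => {
  // Illustrative upstream request.
  const response = await fetch('https://example.com/large-payload');

  // Consuming the body up front closes the underlying stream,
  // so the Client is released, at the cost of holding the
  // whole payload in memory.
  const body = await response.arrayBuffer();

  return new Response(body, {
    headers: {
      'content-type':
        response.headers.get('content-type') ?? 'application/octet-stream'
    }
  });
};
```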
Question: What's the proper way to pass a stream to `Response`?