
Make the concatener a Readable and upload to google chunk by chunk #90

Conversation

julienw (Contributor) commented Jun 18, 2020

This change halves memory usage during an upload by avoiding a copy: the data is handed to the Google stream chunk by chunk instead of all at once (checked using top). It also seems to recover memory better when aborting big requests (but I'll have to double-check this).

The first commit is #89, please don't look at it.

This might incidentally fix #83, but I haven't tried yet; I'll check before requesting review.
I already tested end-to-end with a real profile.

There still seems to be a small leak, but I'm not sure yet. That's why I'm not requesting a review yet.
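For context, here's a minimal sketch of the idea, not the actual code in this PR (the `Concatener` shape and the wiring are assumptions on my part):

```js
const { Transform } = require('stream');

// Sketch: a Transform stream forwards each incoming chunk downstream as soon
// as it arrives, instead of buffering the whole payload and copying it in one
// go at the end. The real concatener may keep more state, but the memory win
// comes from this chunk-by-chunk forwarding.
class Concatener extends Transform {
  _transform(chunk, encoding, callback) {
    // ...any per-chunk bookkeeping would happen here...
    callback(null, chunk); // push the chunk downstream immediately
  }
}

// Assumed wiring with @google-cloud/storage: the incoming request is piped
// through the concatener straight into the GCS write stream.
// const googleStorageStream = bucket.file(fileName).createWriteStream();
// ctx.req.pipe(new Concatener()).pipe(googleStorageStream);
```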

Comment on lines +132 to +138
```js
ctx.req.on('aborted', () => {
  log.debug('request-aborted', 'The request has been aborted!');
  // Detach the destination first so no further chunks flow into it.
  concatener.unpipe(googleStorageStream);
  const error = new BadRequestError('The request has been aborted.');
  // Destroy both streams with an error so their resources are released
  // and any consumer sees the failure.
  googleStorageStream.destroy(error);
  concatener.destroy(error);
});
```
julienw (Contributor, Author) commented:

From my tests this doesn't seem to make a big difference, but it does no harm 🤷

```js
  this.chunks.length = 0;
  callback();
}

transferContents(): Buffer {
```
julienw (Contributor, Author) commented:

I kept this one for tests, as it's still useful in that context. But we shouldn't use it in production code.

I wonder if I should add a check for NODE_ENV === 'test'... Obviously that check wouldn't trigger anything when running tests, but we'd see any misuse when running integration tests. Thoughts?
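For illustration, a minimal sketch of such a guard (the error message and the Buffer.concat body are assumptions, not this PR's actual code):

```js
transferContents(): Buffer {
  // Hypothetical guard: make production code fail loudly if it ever calls
  // this test-only helper. It can't trigger during unit tests (NODE_ENV is
  // 'test' there), but a misuse would surface in integration tests.
  if (process.env.NODE_ENV !== 'test') {
    throw new Error('transferContents should only be called from tests.');
  }
  return Buffer.concat(this.chunks);
}
```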

julienw commented Jun 20, 2020

I've thought about this a bit more: it may just be easier to do the work I had planned, that is, generate a token before starting the upload, which effectively removes the need to concatenate the content. I'll try this next week and see if I can get it working with a small amount of effort.
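A minimal sketch of that idea (hypothetical; the random-token scheme is an assumption on my part, not necessarily what the follow-up does):

```js
const crypto = require('crypto');

// Sketch: if the token is generated up front (here, from random bytes)
// instead of being derived from the full uploaded content, the request body
// can be piped straight to storage and never needs to be accumulated.
const token = crypto.randomBytes(32).toString('hex');
const googleStorageStream = bucket.file(token).createWriteStream();
ctx.req.pipe(googleStorageStream);
```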

julienw commented Jun 22, 2020

Superseded by #91.

Successfully merging this pull request may close these issues.

Change implementation of the Concatenator to make it look more like a stream