Avoid full copy before compressing #220

Closed
erwanlpfr opened this issue Jun 6, 2023 · 1 comment
Labels
duplicate This issue or pull request already exists

Comments

@erwanlpfr
Contributor

erwanlpfr commented Jun 6, 2023

Hello,

This week or next, I am about to start work on adding a compression backend, for example Zstd, which is more efficient in some cases.

But first I would like to ask a question. I have files ranging from 200 GB up to 2 TB that I want to back up, and it currently takes extremely long because the process:

  • first copies the files (while using gpg),
  • then compresses them,
  • and then sends them.

Am I right about that?

Is there any way to pass files directly to the compressor?

I may find the answer by reading the code in depth, but perhaps there is some logic I should know about beforehand.

Thanks!
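
A minimal sketch of what passing files directly to the compressor could look like, assuming Go (the project's language) and the `github.com/klauspost/compress/zstd` package, which is only an illustration and not necessarily what the backend would use. Each file is read once and streamed through the tar and zstd writers, so no intermediate copy is ever written to disk:

```go
package main

import (
	"archive/tar"
	"io"
	"os"
	"path/filepath"

	"github.com/klauspost/compress/zstd"
)

// streamBackup walks srcDir and writes a zstd-compressed tar archive to out,
// streaming file contents in a single pass without a temporary copy.
func streamBackup(srcDir string, out io.Writer) error {
	zw, err := zstd.NewWriter(out)
	if err != nil {
		return err
	}
	defer zw.Close() // runs after tw.Close, flushing the final zstd frame

	tw := tar.NewWriter(zw)
	defer tw.Close()

	return filepath.Walk(srcDir, func(path string, info os.FileInfo, err error) error {
		if err != nil || !info.Mode().IsRegular() {
			return err
		}
		hdr, err := tar.FileInfoHeader(info, "")
		if err != nil {
			return err
		}
		if hdr.Name, err = filepath.Rel(srcDir, path); err != nil {
			return err
		}
		if err := tw.WriteHeader(hdr); err != nil {
			return err
		}
		f, err := os.Open(path)
		if err != nil {
			return err
		}
		defer f.Close()
		// Bytes flow file -> tar -> zstd -> out with no on-disk staging.
		_, err = io.Copy(tw, f)
		return err
	})
}
```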

@m90
Member

m90 commented Jun 6, 2023

As far as I understand, this is a duplicate of #95.

Implementing the entire processing pipeline in a streaming fashion would be very nice, but it requires quite some refactoring, as almost all of the current code expects files to be passed around. If you were to implement this I'd be happy to help out, but I could imagine it's quite a complex undertaking.
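
For reference, a hedged sketch of what "streaming fashion" could mean end to end: feeding the archive into an uploader through an io.Pipe so nothing touches disk. Here `upload` is a hypothetical stand-in for a storage backend that consumes an io.Reader; the project's actual backends would need the refactoring described above before they could accept one.

```go
// runStreamingBackup is a hypothetical sketch, not project code: it connects
// the archiving step to an uploader through an in-memory pipe.
func runStreamingBackup(srcDir string, upload func(io.Reader) error) error {
	pr, pw := io.Pipe()
	go func() {
		// Close the write end with the archiving result so the reader
		// observes either EOF or the error raised while archiving.
		pw.CloseWithError(streamBackup(srcDir, pw))
	}()
	return upload(pr) // e.g. a multipart upload reading directly from pr
}
```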

@m90 closed this as not planned (duplicate) on Jun 6, 2023
@m90 added the duplicate label on Jun 6, 2023