When uploading files larger than 200MB (zipped) the browser cannot allocate the required memory for inflating using pako. It returns the following error:
RangeError: Array buffer allocation failed
at new ArrayBuffer (<anonymous>)
at new Uint8Array (<anonymous>)
at Object.flattenChunks (pako.js:892)
at Inflate.onEnd (pako.js:738)
at Inflate.push (pako.js:690)
at Object.inflate (pako.js:789)
at FileReader.reader.onload (index.js:201)
Currently, we inflate the whole file (pako), decode it (TextDecoder), and then chunk it for processing. This needs to be refactored; some ideas:
Inflate only the required chunk with pako, then decode and process it
Try a different library to inflate the gzip chunks, such as jszip
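A third option along the same lines is the browser-native DecompressionStream API, which inflates gzip data chunk by chunk so no single array ever has to hold the full decompressed file. A minimal sketch (hypothetical helper, not the current index.js code):

```javascript
// Stream-decompress a gzip Blob and yield decoded text chunk by chunk,
// so no single Uint8Array ever has to hold the full inflated file.
async function* gunzipChunks(blob) {
  const stream = blob.stream().pipeThrough(new DecompressionStream('gzip'));
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  for (let r = await reader.read(); !r.done; r = await reader.read()) {
    // { stream: true } keeps multi-byte sequences split across chunks intact
    yield decoder.decode(r.value, { stream: true });
  }
  const tail = decoder.decode(); // flush any buffered partial character
  if (tail) yield tail;
}
```

The main caveat is browser support: DecompressionStream is available in all current evergreen browsers but not in older ones, so a library fallback would still be needed.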
According to the developers of fflate, the error occurs because browsers impose a 2^32 size limit on the unsigned integer arrays that all decompression libraries rely on, and the 2.1GB inflated file reaches that limit.
Currently evaluating fflate; it is in the final stages of testing. It looks faster and lighter on memory, since it decompresses in a streaming fashion instead of inflating the whole file at once.
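For reference, fflate's streaming Gunzip class can consume the upload's ReadableStream directly, so only one compressed and one inflated chunk are in memory at a time. A minimal sketch, assuming a `file` from the upload input and a hypothetical `processChunk` handler (not the actual code under test):

```javascript
import { Gunzip } from 'fflate';

// Stream-inflate the uploaded gzip file; fflate invokes the callback
// once per inflated chunk instead of building one giant buffer.
async function inflateStreaming(file, processChunk) {
  const decoder = new TextDecoder();
  const gz = new Gunzip((chunk, final) => {
    // { stream: !final } keeps multi-byte characters split across chunks intact
    processChunk(decoder.decode(chunk, { stream: !final }));
  });
  const reader = file.stream().getReader();
  for (let r = await reader.read(); !r.done; r = await reader.read()) {
    gz.push(r.value, false);
  }
  gz.push(new Uint8Array(0), true); // empty final push signals end of stream
}
```

This mirrors the current pipeline (inflate, decode, chunk) but performs all three steps incrementally, which is what avoids the single giant ArrayBuffer allocation.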