
Memory leak #8

Open
pekim opened this issue Sep 28, 2012 · 6 comments

@pekim

pekim commented Sep 28, 2012

I have some code that uses jsonparse (via JSONStream) to parse a file that is about 170MB. The heap keeps growing, and eventually almost continual gc grinds the process almost to a halt.

I thought at first the leak was caused by dominictarr/JSONStream, but I think that I've narrowed the leak down to jsonparse.

This code causes a leak that I don't think should happen.

var Parser = require('jsonparse');

var string = (new Array(10 * 1024 + 1)).join("x");

var parser = new Parser();
// parser.onValue = function(value) {
//   //console.log('received:', value);
// };

parser.write('[')
while (true) {
  parser.write('"' + string + '",')
}

It streams a never-ending array of strings to jsonparse. It's silly, but it seemed a simple way to simulate parsing a large file and provoke the leak.

Running with the --trace_gc flag shows that the heap grows rapidly, gc is unable to reclaim much from the heap, and the heap is quickly exhausted.

I don't see why this code shouldn't be able to run indefinitely. Until it can, I'm probably not going to be able to process large files with jsonparse (which is a shame).
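For anyone reproducing this: the behaviour above is what you would see if the parser keeps every completed element referenced from its root value, so nothing becomes collectable (later comments in this thread point the same way). A self-contained sketch of that retention pattern, runnable in plain Node without jsonparse — `makeParser` is a hypothetical stand-in, not the jsonparse API:

```javascript
// Illustration of the suspected retention pattern: a streaming parser that
// keeps every completed element on its root value retains them all, so the
// heap grows with input size even though the caller only needs the current
// element. makeParser is a hypothetical stand-in, not jsonparse itself.
function makeParser(onValue) {
  var root = []; // analogous to the parser's top-level array value
  return {
    write: function (element) {
      root.push(element); // every element stays referenced -> heap growth
      onValue(element);
    },
    retained: function () { return root.length; }
  };
}

var parser = makeParser(function (value) { /* consume value */ });
for (var i = 0; i < 1000; i++) {
  parser.write(new Array(10 * 1024 + 1).join('x'));
}
console.log(parser.retained()); // 1000 strings still reachable from root
```

If the real parser behaves like this, the fix is to drop the reference to each element once the consumer has seen it, rather than anything in the GC.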

@lancecarlson

I have the same issue, posted it on JSONStream, but I think jsonparse is the culprit. Here is the message from that issue (dominictarr/JSONStream#32):

var request = require('request'),
    Stream = require('stream').Stream,
    Parser = require('jsonparse');

var dbName = 'test'
var p = new Parser();
p.onValue = function(value) {
  console.log(value)
}

var down = new Stream()
down.writable = true
down.write = function(data) {
  p.write(data)
  return true
}
down.end = function() {
  console.log('end')
}

var host = process.env.DB_HOST
var path = '_all_docs?include_docs=true'
var url = host + '/' + dbName + '/' + path
request(url).pipe(down)

Memory usage steadily increases from 50 to 260 MB, then midway through it jumps to 500-600 MB. I feel like the JSON parser queues up data for something, then does something else with the queued data afterwards.

@ralphtheninja

@lancecarlson Is this still an issue with jsonparse?

@hayes
Contributor

hayes commented Jan 10, 2017

It's a bit of a hack, but you can add:

if (parser.value) {
  parser.value = {}
}

to parser.onToken, and it will then store the accrued string on that empty value instead, which should get GC'd at the end of the onToken method.

Not sure if that helps
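Applying that suggestion means wrapping the parser's original onToken and resetting `value` afterwards. A runnable sketch of the pattern, using a stand-in object instead of a real jsonparse Parser so it works without the package installed — `FakeParser` is hypothetical, and the override is the hack from the comment above, not an official API:

```javascript
// Stand-in for a jsonparse Parser, which accrues parsed values on this.value.
function FakeParser() {
  this.value = null;
  this.tokens = 0;
}
FakeParser.prototype.onToken = function (token, value) {
  this.value = value; // a real Parser keeps parsed values referenced here
  this.tokens++;
};

var parser = new FakeParser();

// The hack: after the original onToken runs, replace parser.value with a
// fresh empty object so the accrued string loses its last reference and
// can be garbage collected once onToken returns.
var onToken = parser.onToken;
parser.onToken = function (token, value) {
  onToken.call(this, token, value);
  if (this.value) {
    this.value = {};
  }
};

parser.onToken('string', new Array(10 * 1024 + 1).join('x'));
console.log(parser.tokens);             // 1
console.log(Object.keys(parser.value)); // [] - large string no longer retained
```

Note that this discards the assembled root value, so it only makes sense when you consume each value as it arrives (e.g. in onValue) and don't need the final parsed document.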

@lojzatran

Hi, is this error still valid? Did the hack from @hayes help? Thanks.

@pekim
Author

pekim commented May 31, 2017

I no longer have any code that uses this library, so I'm happy for this issue to be closed.

@yocontra

Still an issue AFAICT, so keep it open.

<--- Last few GCs --->

[19589:0x102801600]    91378 ms: Scavenge 1394.1 (1423.3) -> 1393.4 (1423.8) MB, 2.6 / 0.0 ms  (average mu = 0.101, current mu = 0.050) allocation failure
[19589:0x102801600]    91384 ms: Scavenge 1394.3 (1423.8) -> 1393.6 (1424.8) MB, 2.6 / 0.0 ms  (average mu = 0.101, current mu = 0.050) allocation failure
[19589:0x102801600]    91389 ms: Scavenge 1394.5 (1424.8) -> 1393.8 (1425.3) MB, 2.7 / 0.0 ms  (average mu = 0.101, current mu = 0.050) allocation failure


<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 0x1ad60cd841bd]
    1: StubFrame [pc: 0x1ad60cd87de1]
Security context: 0x20a2b359e6c9 <JSObject>
    2: /* anonymous */ [0x20a226c3da91] [/Users/contra/Projects/boundaries/node_modules/jsonparse/jsonparse.js:~127] [pc=0x1ad60cdf5a27](this=0x20a2f84882a1 <Parser map = 0x20a2d61ead19>,buffer=0x20a2d1384229 <Uint8Array map = 0x20a27f8e6541>)
    3: /* anonymous */ [0x20a2f8488371] [/Users/contra/Projects/boundaries/nod...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x100033d65 node::Abort() [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 2: 0x100035500 node::FatalTryCatch::~FatalTryCatch() [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 3: 0x10019f10a v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 4: 0x10056d6b2 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 5: 0x10056c669 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 6: 0x10056a2f8 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 7: 0x10057683c v8::internal::Heap::AllocateRawWithRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 8: 0x1005451d4 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
 9: 0x1007cf974 v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/contra/.nvm/versions/node/v10.6.0/bin/node]
10: 0x1ad60cd841bd
Abort trap: 6
