npm install out of memory error - help!

I’ve been chasing my tail with an error that I believe is caused by the recursive nature of some Node module dependencies, in a situation where I’m working on forks of those modules.

In summary I have forks of rdflib and solid-auth-client and am using my own safenetworkjs module in solid-auth-client. There’s a cyclic dependency because:

solid-auth-client → safenetworkjs
safenetworkjs → rdflib
rdflib → solid-auth-client

Where ‘→’ means ‘requires’. I know this is related to the problem because I can build things if I break the loop by commenting out the require('rdflib') in safenetworkjs.
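To make ‘break the loop’ concrete, here is a minimal sketch (illustrative only, not the real safenetworkjs code; the helper name is made up) of deferring the require so rdflib isn’t pulled in at load or bundle time. Note this only changes what gets loaded/bundled, it doesn’t change what npm tries to install:

// safenetworkjs (sketch): instead of requiring rdflib at the top of the file...
// const rdflib = require('rdflib')   // <- the line I comment out to break the loop

let rdflib = null

// ...require it lazily, the first time something actually needs it
function getRdflib () {
  if (!rdflib) rdflib = require('rdflib')
  return rdflib
}

module.exports = { getRdflib }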

I was using yarn link and getting the following error (or one very similar) when trying to build solid-auth-client:

[   ...............] | loadDep:string: sill resolveWithNewModule async@0.9.2 ch
<--- Last few GCs --->

[16429:0x2e15d80]   223683 ms: Mark-sweep 1297.8 (1483.8) -> 1297.7 (1483.8) MB, 446.6 / 0.1 ms  allocation failure GC in old space requested
[16429:0x2e15d80]   224288 ms: Mark-sweep 1297.7 (1483.8) -> 1297.7 (1445.3) MB, 604.3 / 0.2 ms  last resort GC in old space requested
[16429:0x2e15d80]   224821 ms: Mark-sweep 1297.7 (1445.3) -> 1297.7 (1439.8) MB, 532.5 / 0.0 ms  last resort GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x39e0c025879 <JSObject>
    2: _stat [/home/mrh/.nvm/versions/node/v8.11.3/lib/node_modules/npm/node_modules/glob/glob.js:735] [bytecode=0xb83f19a0b51 offset=206](this=0x14fa62037861 <Glob map = 0x19c1f2c9d429>,f=0xb83f19a0159 <String[9]: server.js>,cb=0x14fa620360b9 <JSFunction (sfi = 0xb83f19a06e1)>)
    3: _processSimple [/home/mrh/.nvm/versions/node/v8.11.3/lib/node_modules/npm/node_modules/glob/glob.js:675] [byteco...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [npm]
 2: 0x8c21ec [npm]
 3: v8::Utils::ReportOOMFailure(char const*, bool) [npm]
 4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [npm]
 5: v8::internal::Factory::NewFixedArray(int, v8::internal::PretenureFlag) [npm]
 6: v8::internal::HashTable<v8::internal::StringTable, v8::internal::StringTableShape>::NewInternal(v8::internal::Isolate*, int, v8::internal::PretenureFlag) [npm]
 7: v8::internal::HashTable<v8::internal::StringTable, v8::internal::StringTableShape>::New(v8::internal::Isolate*, int, v8::internal::PretenureFlag, v8::internal::MinimumCapacity) [npm]
 8: v8::internal::HashTable<v8::internal::StringTable, v8::internal::StringTableShape>::EnsureCapacity(v8::internal::Handle<v8::internal::StringTable>, int, v8::internal::PretenureFlag) [npm]
 9: v8::internal::StringTable::LookupString(v8::internal::Isolate*, v8::internal::Handle<v8::internal::String>) [npm]
10: 0xffeb9e [npm]
11: v8::internal::Runtime_KeyedGetProperty(int, v8::internal::Object**, v8::internal::Isolate*) [npm]
12: 0x1464cd0042fd
Aborted (core dumped)

Several people suggested that npm should take care of cyclic dependencies, so I wondered if using ‘yarn link’ might be subverting that. I have now switched to using local paths in the package.json of each module that depends on one of the above modules.
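For clarity, by ‘local paths’ I mean npm’s file: dependency specifier, roughly like this in the safenetworkjs package.json (the relative path is illustrative, not my actual directory layout), with the analogous entries in solid-auth-client (pointing at safenetworkjs) and in my rdflib fork (pointing at solid-auth-client):

{
  "dependencies": {
    "rdflib": "file:../rdflib"
  }
}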

Now I get a very similar error during npm install, and this is starting to look like it is related to the SAFE API module @maidsafe/safe-node-app.

Using npm install --verbose in safenetworkjs I get the following. Note that it repeatedly does the same GET. This carries on for a minute or two, and then I get the memory allocation error as before. I think this may be related to the safe_app_nodejs module because of the reference to electron in the GET.

npm http fetch GET 200 https://registry.npmjs.org/indent-string 3ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/repeating 1ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/is-finite 2ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/speedometer 2ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/fstream 5ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/pullstream 5ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/match-stream 5ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/binary 6ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/chainsaw 2ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/buffers 2ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/traverse 3ms (from cache)
npm WARN deprecated graceful-fs@3.0.11: please upgrade to graceful-fs 4 for compatibility with current and future versions of Node.js
npm http fetch GET 200 https://registry.npmjs.org/natives 1ms (from cache)
npm WARN deprecated natives@1.1.6: This module relies on Node.js's internals and will break at some point. Do not use it, and update to graceful-fs@4.x.
npm http fetch GET 200 https://registry.npmjs.org/over 3ms (from cache)
npm http fetch GET 200 https://registry.npmjs.org/slice-stream 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 9ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 4ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 7ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 5ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 4ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 5ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 5ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 6ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 6ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 5ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 11ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 4ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 6ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 8ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 3ms (from cache)
[   ...............] \ fetchMetadata: sill resolveWithNewModule webidl-conversions@4.0.2 checking installable

The above npm install ends with:

...
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/deps-downloader/deps_downloader-0.3.0.tgz 26ms (from cache)
npm http fetch GET 200 https://s3.eu-west-2.amazonaws.com/electron-download-fork/electron-download-fork-4.1.0.tgz 10ms (from cache)
[   ...............] \ fetchMetadata: sill resolveWithNewModule regenerator-runtime@0.13.2 checking installabl
<--- Last few GCs --->

[22923:0x39e4520]  1142365 ms: Mark-sweep 1974.1 (2067.6) -> 1959.2 (2067.6) MB, 493.3 / 0.0 ms  (average mu = 0.209, current mu = 0.232) allocation failure scavenge might not succeed
[22923:0x39e4520]  1143047 ms: Mark-sweep 1974.6 (2067.6) -> 1959.8 (2068.1) MB, 511.8 / 0.0 ms  (average mu = 0.227, current mu = 0.250) allocation failure scavenge might not succeed


<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 0x1a86142]
Security context: 0x3f387c61a299 <JSObject>
    1: validate(aka validate) [0x104a28e82a31] [/home/mrh/.nvm/versions/node/v12.2.0/lib/node_modules/npm/node_modules/aproba/index.js:~25] [pc=0x12824708ec62](this=0x2272fe2004d1 <undefined>,0x1e7935bf68e1 <String[#2]: SO>,0x1c83eeb170f1 <Arguments map = 0x4a221684239>)
    2: flatNameFromTree(aka flatNameFromTree) [0xe954e488409] [/home/mrh/.nvm/versions/node/v...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x98c680 node::Abort() [npm]
 2: 0x98d5e6 node::OnFatalError(char const*, char const*) [npm]
 3: 0xb077ce v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [npm]
 4: 0xb07b49 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [npm]
 5: 0xf12c05  [npm]
 6: 0xf1d56b v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [npm]
 7: 0xf1e287 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [npm]
 8: 0xf20d25 v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [npm]
 9: 0xeeba3c v8::internal::Factory::AllocateRawArray(int, v8::internal::PretenureFlag) [npm]
10: 0xeec9b6 v8::internal::Factory::NewUninitializedFixedArray(int, v8::internal::PretenureFlag) [npm]
11: 0xe9b841  [npm]
12: 0xe9b9a6  [npm]
13: 0xea0143  [npm]
14: 0x109a203 v8::internal::JSObject::AddDataElement(v8::internal::Handle<v8::internal::JSObject>, unsigned int, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes) [npm]
15: 0x104a4bf v8::internal::Object::AddDataProperty(v8::internal::LookupIterator*, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes, v8::Maybe<v8::internal::ShouldThrow>, v8::internal::StoreOrigin) [npm]
16: 0x104b2f7 v8::internal::Object::SetProperty(v8::internal::LookupIterator*, v8::internal::Handle<v8::internal::Object>, v8::internal::StoreOrigin, v8::Maybe<v8::internal::ShouldThrow>) [npm]
17: 0x11d85e7 v8::internal::Runtime::SetObjectProperty(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, v8::internal::StoreOrigin, v8::Maybe<v8::internal::ShouldThrow>) [npm]
18: 0x11d971a v8::internal::Runtime_SetKeyedProperty(int, unsigned long*, v8::internal::Isolate*) [npm]
19: 0x1a86142  [npm]
Aborted (core dumped)

I’ve tried different Node versions. The output above is from v12, but I was getting the same with v8. rdflib and solid-auth-client are using v10.
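(For reference, I’m switching versions with nvm, which is where the ~/.nvm paths in the traces above come from; roughly:)

# install and switch to a given Node major version, then check what’s active
nvm install 10
nvm use 10
node -v && npm -v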

I’d like to crack this so that a Solid app can just drop in a new solid-auth-client in order to work with SAFE. Any ideas? Oh, and there’s some earlier discussion of the issue on the Solid gitter here.

UPDATE: removing the dependency on @maidsafe/safe-node-app changes the error output, but it remains stuck repeating the same resolveWithNewModule status (though without the GET line that references ‘electron’ appearing each time). So it isn’t related to safe-node-app, although it is odd that it kept repeating that particular GET at that point.
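In case it helps anyone diagnose this, npm ls lists every path by which a package ends up in the dependency tree, so it should show where the loop closes (assuming an install has at least partially completed):

# show every path by which each module in the loop is pulled in
npm ls rdflib
npm ls solid-auth-client
npm ls safenetworkjs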


I’d be curious to watch system resource usage while running the install.
Run top -b | grep node (batch mode, so top’s output can be piped) in one terminal while installing dependencies in another.
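Or, for per-process numbers, something like this (a sketch assuming Linux procps):

# refresh once a second; RSS is resident memory in KB
watch -n 1 'ps -C node -o pid,rss,etime,args'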

I’m not sure what the default memory allocation limit is for a Node.js process; at one point it was 1GB for 64-bit systems.

That can supposedly be modified by passing a V8 option called max_old_space_size, where the number passed to it is in MB, e.g. yarn --node-args="--max_old_space_size=2000". It used to be the case that the maximum you could raise the limit to on 64-bit systems was about 1.7 GB, but that may have changed; I’m still trying to hunt it down in the V8 documentation.
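If yarn’s --node-args doesn’t reach the process that is actually running out of memory (from your traces it looks like npm itself), two alternatives worth trying; the 4096 MB figure is just a guess, tune it to your machine. I believe NODE_OPTIONS has accepted --max-old-space-size for a while, and the second form avoids relying on that:

# set the V8 flag via the environment so node processes started from this shell pick it up
NODE_OPTIONS="--max-old-space-size=4096" npm install --verbose

# or run npm's CLI entry point under node with the flag passed explicitly
node --max-old-space-size=4096 "$(which npm)" install --verbose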

I’m still not certain this is the best solution. I’m going to read up on whether it’s possible to split installs among several worker processes, although I was under the impression that yarn already does this.


Thanks Hunter. I haven’t looked into this aspect (increasing resources) because I’m fairly sure the issue is the cyclic dependency, and I think increasing resources would just delay the inevitable! I’ll keep it in mind though.

In the meantime I’ve thought of a workaround that I hope will achieve the aim but avoid the cyclic dependency.

So don’t anyone spend much time on the above, but if anyone sees this and can say ‘Aha! What you could do is…’ then great.

For once a Web search didn’t do the trick so I’m having to work for a solution!

