Let's take this sample:

export let index = (req: Request, res: Response) => {

    for (let i = 0; i < 10; i++) {
        let bulk = Faker.collection.initializeUnorderedBulkOp();
        for (let y = 0; y < 200000; y++) {
            bulk.insert({
                name: randomName(),
                nights: Math.random(),
                price: Math.random(),
                type1: Math.random(),
                type2: Math.random(),
                type3: Math.random(),
                type4: Math.random(),
                departure: mongoose.Types.ObjectId(randomAreaID()),
                destination: mongoose.Types.ObjectId(randomAreaID()),
                refundable: randomBool(),
                active: randomBool(),
                date_start: randomDate(),
                date_end: randomDate(),
            });
        }
        bulk.execute();
    }

    return res.json({data: true});

};

With this code I try to insert "some" documents into my collection.

I use initializeUnorderedBulkOp, but if I try to save more than 1 million docs I run into a memory issue:

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory

I know that I can increase the memory, but I want to find a better solution: right now I need 2m records, but in the future I will need 100m.

Any suggestion to avoid memory issues?
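For context on why the heap fills up: each bulk.execute() above is fired without waiting for it to finish, so every batch is queued in memory at the same time. A minimal sketch of the difference between fire-and-forget and awaited batches, using a mocked async insert instead of MongoDB (makeMockInserter and its in-flight counter are illustrative, not a driver API):

```typescript
// Mock async "bulk insert" that records how many batches are alive
// at once. Purely illustrative — not a MongoDB driver API.
function makeMockInserter() {
    let inFlight = 0;
    const stats = { maxInFlight: 0 };
    const insert = (batch: unknown[]) =>
        new Promise<number>((resolve) => {
            inFlight++;
            stats.maxInFlight = Math.max(stats.maxInFlight, inFlight);
            setTimeout(() => {
                inFlight--;
                resolve(batch.length);
            }, 0);
        });
    return { insert, stats };
}

// Fire-and-forget: every batch is queued before any of them finishes,
// so all of them occupy memory simultaneously.
async function fireAndForget(batches: unknown[][]) {
    const { insert, stats } = makeMockInserter();
    await Promise.all(batches.map((b) => insert(b)));
    return stats.maxInFlight;
}

// Awaited: the next batch is not started until the previous one
// completes, so memory is bounded by a single batch.
async function awaited(batches: unknown[][]) {
    const { insert, stats } = makeMockInserter();
    for (const b of batches) {
        await insert(b);
    }
    return stats.maxInFlight;
}

const batches = Array.from({ length: 10 }, () => new Array(1000).fill(0));
fireAndForget(batches).then((n) => console.log('fire-and-forget peak:', n)); // 10
awaited(batches).then((n) => console.log('awaited peak:', n));               // 1
```

The same effect scales with batch size: ten batches of 200,000 documents held at once is 2 million live objects, versus 200,000 when each batch is awaited before the next is built.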

  • Is there any particular reason why you have to be able to insert two hundred thousand records at once? On this scale, wouldn't it be a better idea to push that to a background processor that can crank through it in manageable chunks without leaving the user hanging? Commented Oct 10, 2017 at 17:50
  • In my application, there is a need to create 2 million "packages"... so yes, there is a particular reason. Sometimes there are more than 2m combinations. If I send it to the background I will have the same memory issue, so first of all I want to solve this problem. In PHP there is no problem if I split it into chunks, because it is a sync operation. Commented Oct 10, 2017 at 17:53
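The chunked processing the comments describe can be sketched generically — chunk and processInChunks below are hypothetical helpers, not part of Mongoose or the MongoDB driver:

```typescript
// Split an array into fixed-size chunks. Hypothetical helper,
// not part of any library discussed here.
function chunk<T>(items: T[], size: number): T[][] {
    const out: T[][] = [];
    for (let i = 0; i < items.length; i += size) {
        out.push(items.slice(i, i + size));
    }
    return out;
}

// Process one chunk at a time; the next chunk is not touched until
// the handler for the previous one has resolved.
async function processInChunks<T>(
    items: T[],
    size: number,
    handler: (c: T[]) => Promise<void>,
): Promise<void> {
    for (const c of chunk(items, size)) {
        await handler(c);
    }
}

processInChunks([1, 2, 3, 4, 5, 6, 7], 3, async (c) => {
    console.log(c.length); // prints 3, 3, 1
});
```

Awaiting each handler gives the same back-pressure that PHP gets for free from being synchronous.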

1 Answer

Async/await is the solution: 200,000,000 documents inserted without running out of heap memory. Unlimited is now the limit...

let bulkPromise = (data) => {
    return new Promise((resolve, reject) => {
        if (data.length > 0) {
            let bulk = Faker.collection.initializeUnorderedBulkOp();
            data.forEach((d) => {
                bulk.insert(d);
            });
            bulk.execute((err) => {
                if (err) {
                    return reject(err);
                }
                resolve(true);
            });
        } else {
            resolve(false);
        }
    });
};

export let index = async (req: Request, res: Response) => {

    for (let i = 0; i < 100; i++) {
        let data = [];
        for (let y = 0; y < 200000; y++) {
            data.push({
                name: randomName(),
                nights: Math.random(),
                price: Math.random(),
                type1: Math.random(),
                type2: Math.random(),
                type3: Math.random(),
                type4: Math.random(),
                departure: mongoose.Types.ObjectId(randomAreaID()),
                destination: mongoose.Types.ObjectId(randomAreaID()),
                refundable: randomBool(),
                active: randomBool(),
                date_start: randomDate(),
                date_end: randomDate(),
            });
        }
        await bulkPromise(data);
        console.log(i);
    }

    return res.json({data: true});

};
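Scaling this toward the 100m the question mentions, the batch-building itself can be wrapped in a generator so only one batch's documents ever exist at a time. A sketch under the same pattern — randomDocument is a stand-in for the question's field generators (randomName(), randomAreaID(), etc.), and insertAll is a hypothetical wrapper around an awaited write such as bulkPromise:

```typescript
// Stand-in for the question's document generator.
const randomDocument = () => ({ price: Math.random(), nights: Math.random() });

// Lazily yield batches; each batch is materialised only when requested.
function* batches(total: number, batchSize: number) {
    for (let made = 0; made < total; made += batchSize) {
        const size = Math.min(batchSize, total - made);
        yield Array.from({ length: size }, () => randomDocument());
    }
}

// Build, insert, and drop one batch at a time. `insert` would be
// bulkPromise (or any other awaited write) in the answer above.
async function insertAll(
    total: number,
    batchSize: number,
    insert: (docs: object[]) => Promise<unknown>,
): Promise<void> {
    for (const batch of batches(total, batchSize)) {
        await insert(batch);
    }
}
```

With a batchSize around 200,000, memory use stays roughly constant no matter how large total grows.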