Description
Hi,
I suspect there is a memory leak when using the Firestore Node.js library. I have an application running on GCP Cloud Run, and its memory never stops growing until it can no longer allocate within the 4 GB I have reserved.
I have reduced my app to the example below (where I have redacted enterprise info). I simply fetch documents, without even using them, and the memory is never properly reclaimed by the GC.
You can run the script below, passing the number of iterations as the first argument to produce more calls. Here is the data I gathered by watching the process memory usage with ps:
- 0 calls → 95 MB
- 100 calls → 151 MB
- 1000 calls → 181 MB
- 10000 calls → 259 MB
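For reference, memory can also be sampled from inside the process instead of with ps; this is a minimal sketch I am adding for illustration (the logMemory helper is hypothetical, and rss should roughly match the ps figures above):

// Hypothetical helper: log resident set size and V8 heap usage in MB.
function logMemory(label: string): void {
  const { rss, heapUsed } = process.memoryUsage();
  const mb = (n: number) => (n / 1024 / 1024).toFixed(1);
  console.log(`${label}: rss=${mb(rss)} MB heapUsed=${mb(heapUsed)} MB`);
}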
To be thorough, I ran a similar sample against a MongoDB database, and there the memory is properly GC'd and settles back to around 150 MB regardless of the number of iterations.
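For comparison, here is a minimal sketch of what that MongoDB sample looked like, assuming the official mongodb driver; the connection string, database, and collection names are placeholders, not the ones I actually used:

import { MongoClient } from "mongodb";

const iteration = Number(process.argv[2]);
for (let i = 0; i < iteration; i++) {
  // New client per iteration, mirroring the Firestore sample below.
  const client = new MongoClient("mongodb://localhost:27017");
  await client.connect();
  const collection = client.db("myDb").collection("myCollection");
  // 100 parallel reads whose results are discarded, as in the Firestore case.
  await Promise.all(
    Array.from(Array(100).keys()).map(() =>
      collection.findOne({ name: "my name" }),
    ),
  );
  await client.close();
}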
My code might have issues, in which case I would be happy to know how to fix it 👍
Environment details
- OS: Ubuntu 24.04.3 LTS
- Node.js version: v22.17.1
- npm version: 10.9.2
- @google-cloud/firestore version: 7.11.6
Steps to reproduce
Execute the following code and watch the memory usage of the process:
node --experimental-strip-types filename.ts 1
node --experimental-strip-types filename.ts 10
node --experimental-strip-types filename.ts 100
node --experimental-strip-types filename.ts 1000

filename.ts:

import { Firestore } from "@google-cloud/firestore";
const iteration = Number(process.argv[2]);
console.log(`iterating ${iteration}`);

for (let i = 0; i < iteration; i++) {
  // Create a fresh client for each iteration so nothing should outlive it.
  const firestore = new Firestore({
    projectId: "my_project_id",
  });

  // Fire 100 queries in parallel; the results are discarded immediately.
  await Promise.all(
    Array.from(Array(100).keys()).map(async () => {
      await firestore
        .collection("myCollection")
        .where("name", "==", "my name")
        .limit(1)
        .get();
    }),
  );

  // Shut the client down so its gRPC channels can be released.
  await firestore.terminate();
  console.log(`done ${(i + 1) * 100}`);
}

console.log("waiting...");
// Keep the process alive so memory can be inspected after the GC has had time to run.
setTimeout(() => console.log("done"), 1000000);
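As an aside, to rule out the GC simply being lazy, the same script can be run with the heap controls exposed:

node --experimental-strip-types --expose-gc filename.ts 100

and a forced collection triggered before reading the numbers. This is my own suggestion for narrowing things down, not part of the original sample:

// With --expose-gc, global.gc is defined; force a full collection
// before sampling so a merely lazy GC can be ruled out.
if (global.gc) {
  global.gc();
}
console.log(`rss after forced GC: ${process.memoryUsage().rss}`);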