
I have a Spring Boot REST service that moves files from a folder into a ZIP archive every minute and uploads the archive to another service. My service keeps a map from zipped file names to the zip archive name, using cache2k and HSQLDB. Another application uses the service to register the created files and later asks for the name of the zip in which each file ended up.
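
For context, a minimal sketch of how such a file-name to zip-name mapping could be set up with cache2k. The cache name, key/value types, and the registry class are assumptions for illustration, not the actual service code; the 10-minute expiry figure comes from the comments below.

    import org.cache2k.Cache;
    import org.cache2k.Cache2kBuilder;
    import java.util.concurrent.TimeUnit;

    public class ZipRegistry {

        // Maps a registered file name to the name of the zip archive it was packed into.
        // Entries expire after 10 minutes, matching the figure given in the comments.
        private final Cache<String, String> fileToZip = new Cache2kBuilder<String, String>() {}
            .name("fileToZip")
            .expireAfterWrite(10, TimeUnit.MINUTES)
            .build();

        public void register(String fileName, String zipName) {
            fileToZip.put(fileName, zipName);
        }

        public String zipFor(String fileName) {
            return fileToZip.get(fileName);  // null if expired or never registered
        }
    }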

The service is running on Windows Server 2019 Standard with 32 GB RAM, with these startup parameters:

java -Xms512M -Xmx512M -XX:MaxMetaspaceSize=128m -XX:NativeMemoryTracking=detail

Result of jcmd 1234 VM.native_memory:

After 1 minute of running: total=1122461KB, committed=728301KB; Task Manager shows 534MB used.

After a week of running: total=1137792KB, committed=749376KB, but Task Manager shows 8GB used.

Why is used/reserved memory so high?

Flight Recorder does not show any problems and GC runs well. I tried updating from Oracle Java 8 to OpenJDK 11 and updating Spring Boot to a newer version; nothing helps. The memory consumption growth appears to be linear.

5 Comments
  • What expiration time do you set for your cache2k Cache entries? If your entries never expire, the Cache will hold a reference to them forever and the GC will never free that memory. Commented Nov 29, 2021 at 9:57
  • Cache item expiration is 10 minutes, the incoming rate of new files is 20/minute, and every file (cache item) is queried for 3-5 minutes. Commented Nov 29, 2021 at 12:23
  • I'm not convinced there's a problem. Anyway, the most common thing would be an unclosed file or I/O stream. Do make double-sure you're using try-with-resources everywhere you should. One surprising place is calls like Files.newDirectoryStream() or Files.find(). Use learn.microsoft.com/en-us/sysinternals/downloads/… to see which files your process holds. If it's all the zip files or anything else unexpected, there's your leak. Also, -XX:NativeMemoryTracking=datail should be detail; you have a typo in yours. Commented Nov 29, 2021 at 15:00
  • Sounds like you have a memory leak. Commented Nov 30, 2021 at 5:37
  • Thanks, "datail" was just a typo there. Process Explorer shows a growing handle count, one handle per second. Indeed, I found Files.newDirectoryStream() and Files.walk() without try-with-resources in my code. Code fixed, and the handle count is now oscillating around 730-750. Thanks very much. Commented Nov 30, 2021 at 6:47

2 Answers


Problem solved.

Files.newDirectoryStream() and Files.walk() are now enclosed in try-with-resources.
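
For reference, a minimal sketch of the pattern; the folder path and the per-file handling are placeholders, not the actual service code. Both Files.newDirectoryStream() and Files.walk() keep an OS handle to the directory open until they are closed, so wrapping them in try-with-resources is what stops the handle leak.

    import java.io.IOException;
    import java.nio.file.DirectoryStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.stream.Stream;

    public class FolderScan {

        static void scan(Path folder) throws IOException {
            // Without try-with-resources this call leaks one OS handle per invocation.
            try (DirectoryStream<Path> entries = Files.newDirectoryStream(folder)) {
                for (Path entry : entries) {
                    System.out.println(entry.getFileName());
                }
            }

            // Same for Files.walk(): the returned Stream wraps open directory handles.
            try (Stream<Path> walk = Files.walk(folder)) {
                walk.filter(Files::isRegularFile).forEach(System.out::println);
            }
        }

        public static void main(String[] args) throws IOException {
            scan(Paths.get(args.length > 0 ? args[0] : "."));
        }
    }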

Thanks, Petr Janeček.

P.S. After a week of the service running, memory consumption is stable.


1 Comment

Nice. That actually is a common problem. Feel free to add a tick to your answer as it is the correct one. And let me know in a week if the mem usage / opened files count is stable now!

Try the -XX:MaxDirectMemorySize parameter if off-heap memory is being used.
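
As an illustration of what that flag bounds: direct (off-heap) NIO buffers are allocated outside the Java heap and are capped by -XX:MaxDirectMemorySize. The 256m value and the class name below are just example assumptions, not a recommendation for this particular service.

    import java.nio.ByteBuffer;

    public class DirectBufferDemo {

        public static void main(String[] args) {
            // Allocated outside the Java heap; counted against -XX:MaxDirectMemorySize.
            // Run with, for example:  java -XX:MaxDirectMemorySize=256m DirectBufferDemo
            ByteBuffer buffer = ByteBuffer.allocateDirect(16 * 1024 * 1024); // 16 MB off-heap
            System.out.println("Direct buffer capacity: " + buffer.capacity() + " bytes");
        }
    }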

3 Comments

OK, I'll try it, but MaxDirectMemorySize should be equal to Xmx by default.
Why do you say that? The manual entry says: "By default, the size is set to 0, meaning that the JVM chooses the size for NIO direct-buffer allocations automatically." There is no mention of -Xmx in that.
Because I saw him say he did file manipulation, and it was a gut feeling.
