How to avoid exceeding memory limits when processing really large lists?

While implementing and testing a Java-based text-document index, my main problem was an OutOfMemoryError (in Java, running out of heap throws an Error, not an Exception). It happened quite often because the collection being indexed was very large. I tried increasing the heap size, but that did not really solve the problem. How do you process large batches of data without exceeding the memory limits?
2 answers

You should try using Streams, which are lazily evaluated and generally perform well: elements are pulled through the pipeline one at a time, so the whole collection never has to sit in memory at once.


This is the most common solution to this problem!
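A minimal sketch of the lazy-Streams idea, assuming the documents live in a plain text file (the file name and the blank-line filter are just placeholders for illustration). Files.lines reads the file lazily, so each line can be processed and discarded without ever materializing the full contents as a list:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

public class LazyIndexing {
    public static void main(String[] args) throws IOException {
        // Hypothetical corpus file; in a real index this would be the document source.
        Path corpus = Files.createTempFile("corpus", ".txt");
        Files.write(corpus, List.of("alpha", "", "beta", "gamma"));

        // Files.lines streams lazily: each line is read on demand,
        // so the whole file never has to fit in the heap at once.
        try (Stream<String> lines = Files.lines(corpus)) {
            long indexed = lines.filter(s -> !s.isBlank()).count();
            System.out.println(indexed); // prints 3
        }
        Files.deleteIfExists(corpus);
    }
}
```

The try-with-resources block matters here: a Stream returned by Files.lines holds the underlying file handle, so it must be closed after the terminal operation.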

Marian Berendes - Mon, 12/10/2018 - 10:08

The solution lies in changing your approach. In my case, I had populated a list of all the documents to be indexed and then sent the whole list to the index at once.
A better approach is to send the documents to the indexer one by one and build the index gradually, instead of holding a really large list of objects in memory.
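The one-document-at-a-time approach might look like the sketch below. The IncrementalIndexer class and its term-frequency map are hypothetical stand-ins for the real index; the point is only that each document is handed over individually and becomes garbage-collectable right after it is processed:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Stream;

public class IncrementalIndexer {
    // Simplified index sketch: term -> occurrence count across all documents.
    private final Map<String, Integer> termCounts = new HashMap<>();

    // Index one document at a time; no list of all documents is ever held.
    public void add(String document) {
        for (String term : document.toLowerCase().split("\\s+")) {
            termCounts.merge(term, 1, Integer::sum);
        }
    }

    public int frequency(String term) {
        return termCounts.getOrDefault(term, 0);
    }

    public static void main(String[] args) {
        IncrementalIndexer indexer = new IncrementalIndexer();
        // In a real system each document would come from disk or a crawler;
        // a Stream supplies them one by one instead of as one big List.
        Stream.of("the quick fox", "the lazy dog", "quick quick")
              .forEach(indexer::add);
        System.out.println(indexer.frequency("quick")); // prints 3
    }
}
```

The memory high-water mark is then the size of the index itself plus one document, rather than the index plus every document at once.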