My nmerge program runs in a JVM whose maximum memory is set to 1GB. That seemed like plenty, yet it still ran out of something whenever I gave it more than about 50K of text to digest. I always assumed this was because Java is such a memory hog. And it is. But that wasn't why it had been failing for years. The simple answer is that increasing heap space in Java via the -Xmx option doesn't increase stack space in proportion. Stack space is controlled separately by the -Xss option, and the default stack size is just 512K.

Now, that's plenty for most applications, but nmerge is no ordinary program. At its heart are two mutually recursive procedures. I proved mathematically that they would terminate and produce a valid result, but what I couldn't estimate was the maximum stack depth they would require. It turns out that the stack space needed is proportional to the length of the file, or, more precisely, to the number of 'nodes' in the graph. A short file with a complex graph structure (and lots of variation) might need more stack space than a longer file with relatively little variation. Whatever the cause, increasing the stack size to 4MB let the program handle much larger files. And I had a Homer Simpson D'oh! moment.
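To see the effect for yourself, here is a minimal sketch (my own illustration, nothing to do with nmerge's internals): it recurses until the stack overflows and reports the depth reached on threads with different requested stack sizes. The four-argument `Thread` constructor takes a stack-size hint, which plays the same role per-thread that -Xss plays for the whole JVM; note the JVM is free to round or ignore the hint on some platforms.

```java
public class StackDepth {
    // Recurse until the stack overflows, returning the depth reached.
    static int depth(int d) {
        try {
            return depth(d + 1);
        } catch (StackOverflowError e) {
            return d;
        }
    }

    // Run the probe on a thread with an explicitly requested stack size.
    static int probe(long stackSize) {
        final int[] result = new int[1];
        Thread t = new Thread(null, () -> result[0] = depth(0), "probe", stackSize);
        t.start();
        try {
            t.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return result[0];
    }

    public static void main(String[] args) {
        // A bigger stack permits proportionally deeper recursion, which is
        // why raising -Xss (or the per-thread size here) lets mutually
        // recursive code cope with larger inputs.
        System.out.println("512 KB stack: depth " + probe(512 * 1024));
        System.out.println("  4 MB stack: depth " + probe(4 * 1024 * 1024));
    }
}
```

The same idea at the command line is just `java -Xss4m MyMainClass`, which raises the default stack size for every thread the JVM starts.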