user1964161

Reputation: 585

Java Garbage Collection on Stack-Based Arrays

Suppose I have the following code:

public void process() {
    byte[] data = new byte[size]; 
    ... // code that uses the above data
    longProcess(); // a very long running process that does not use the data.
}

Assuming that the data is not referenced anywhere else in the program, is the JVM smart enough to allow the data to be garbage collected while the long process is still running?

If not, will adding

data = null;

before the long process allow this to happen?

Upvotes: 11

Views: 251

Answers (4)

NPE

Reputation: 500297

This depends on the JVM. The versions of Oracle's JVM that I've tried (1.6.0_41 and 1.7.0_09) don't perform this optimization by default. However, 1.7.0_09 does perform it when aggressive optimizations are turned on.

Here is the test I conducted:

public class Main {
    public static int g() {
        int n = 100000;
        int[][] arr = new int[n][];
        for (int i = 0; i < n; ++i) {
            try {
                arr[i] = new int[100000];
            } catch (OutOfMemoryError ex) {
                return i;
            }
        }
        return -1;
    }
    public static void f1() {
        int[] arr = new int[1000000];
        System.out.println(g());
    }
    public static void f2() {
        int[] arr = new int[1000000];
        arr = null;
        System.out.println(g());
    }
    public static void main(String[] argv) {
        for (int j = 0; j < 2; ++j) {
            for (int i = 0; i < 10; ++i) {
                f1();
            }
            System.out.println("-----");
            for (int i = 0; i < 10; ++i) {
                f2();
            }
            System.out.println("-----");
        }
    }
}

Using JVM 1.7 with default settings, f1() consistently runs out of memory after 3195 iterations, whereas f2() consistently manages 3205 iterations.

The picture changes if the code is run using Java 1.7.0_09 with -XX:+AggressiveOpts -XX:CompileThreshold=1: both versions can do 3205 iterations, indicating that HotSpot does perform this optimization in this case. Java 1.6.0_41 doesn't appear to do this.

In my testing, restricting the scope of the array has the same effect as setting the reference to null, and should probably be the preferred choice if you feel you ought to help the JVM collect the array asap.
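A minimal sketch of the scope-restriction variant (the class and method names here are mine, not from the test above; whether the out-of-scope array is actually collected early remains JVM-dependent, as the measurements show):

```java
import java.util.Arrays;

public class ScopeDemo {
    static long sum;

    static void work() {
        // Restrict the array's lifetime to an inner block instead of
        // nulling the reference.
        {
            int[] data = new int[1_000_000];
            Arrays.fill(data, 1);
            long s = 0;
            for (int v : data) s += v;
            sum = s;
        }
        // From here on the array is out of scope; whether the JVM treats
        // it as unreachable before the method returns is
        // implementation-dependent.
    }

    public static void main(String[] args) {
        work();
        System.out.println(sum); // prints 1000000
    }
}
```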

Upvotes: 6

Peter Bratton

Reputation: 6408

Given the code as written, the array will generally not be garbage collected while longProcess() is executing, since there is still a live reference to the array on the stack. Once the array has been allocated, it will not be eligible for garbage collection until all references to it have been removed. Your line

data = null;

will remove one reference to it, although depending on your processing code it may not be the only reference. If all references have been removed, then the garbage collector may well reclaim the array's memory before longProcess() returns, although this is not guaranteed.
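Applied to the method from the question, the pattern looks like this (a sketch; the longProcess() stub and the fixed size are mine):

```java
public class NullRefDemo {
    // Stand-in for the question's longProcess(); does not touch data.
    static void longProcess() {
        // ... long-running work ...
    }

    static void process(int size) {
        byte[] data = new byte[size];
        // ... code that uses data ...
        data = null;   // drop the only reference we control
        longProcess(); // the array may now be reclaimed, JVM permitting
    }

    public static void main(String[] args) {
        process(1 << 20);
        System.out.println("done");
    }
}
```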

Upvotes: 1

user_CC

Reputation: 4776

The data array will only become eligible for collection after the process() method finishes. So if you want the garbage collector to be able to reclaim the memory earlier, you will have to explicitly add data = null; in the code.

The garbage collector only frees memory that has no reachable reference, i.e. memory the program can no longer point to again.

Upvotes: 0

kofemann

Reputation: 4413

If there are no references to data, then the GC will do the job.

Upvotes: 0
