Indrajeet Gour

Reputation: 4510

GC overhead limit exceeded while reading from a CSV file

I am using the piece of code below to read a 150 MB CSV file and am getting a GC error.

Here is the code that is causing the problem:

    public List<String[]> readCsvFile(String ipFilePath) {
        logger.info("Start executing readCsvFile method !!! on file " + ipFilePath);

        CSVReader csvReader = null;
        List<String[]> allRecords = null;
        Reader reader = null;
        try {
            reader = Files.newBufferedReader(Paths.get(ipFilePath));
            csvReader = new CSVReader(reader);
            // readAll() materializes every record of the file in memory at once
            allRecords = csvReader.readAll();
        } catch (Exception e) {
            logger.error("Error in CsvFileReader !!!");
            logger.error("Exception : ", e);
        } finally {
            try {
                if (csvReader != null) {
                    csvReader.close();
                }
                if (reader != null) {
                    reader.close();
                }
            } catch (IOException e) {
                logger.error("Error while closing fileReader/csvFileParser !!!");
                logger.error("IOException : ", e);
            }
        }
        return allRecords;
    }

I am getting the error on the call to csvReader.readAll(), as mentioned above. I am not sure what the problem with the code is or how to solve it, as the same code works fine with 20-30 MB files.

Upvotes: 0

Views: 1140

Answers (2)

StefanE

Reputation: 7630

You should not read all the lines at once; instead, process the file line by line. Then the file size won't matter, and it is far more memory efficient.

Example from here: http://www.baeldung.com/java-read-lines-large-file

    FileInputStream inputStream = null;
    Scanner sc = null;
    try {
        inputStream = new FileInputStream(path);
        sc = new Scanner(inputStream, "UTF-8");
        while (sc.hasNextLine()) {
            // each line is read and handled individually, so the whole
            // file is never held in memory at the same time
            String line = sc.nextLine();
            // System.out.println(line);
        }
        // note that Scanner suppresses exceptions
        if (sc.ioException() != null) {
            throw sc.ioException();
        }
    } finally {
        if (inputStream != null) {
            inputStream.close();
        }
        if (sc != null) {
            sc.close();
        }
    }
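
The same idea also works with opencsv itself: CSVReader.readNext() returns one record at a time instead of materializing the whole file the way readAll() does. Below is a minimal sketch applied to the question's code (the method name processCsvFile is hypothetical, and readNext()'s declared exceptions vary across opencsv versions, so the sketch simply declares throws Exception):

    import com.opencsv.CSVReader;

    import java.io.Reader;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public void processCsvFile(String ipFilePath) throws Exception {
        // try-with-resources closes both the CSVReader and the underlying Reader
        try (Reader reader = Files.newBufferedReader(Paths.get(ipFilePath));
             CSVReader csvReader = new CSVReader(reader)) {
            String[] record;
            // readNext() streams one record at a time, so memory use stays flat
            while ((record = csvReader.readNext()) != null) {
                // process the record here instead of collecting it into a List
            }
        }
    }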

Edit: Also, if you are looking for a framework, I can recommend Spring Batch: https://projects.spring.io/spring-batch/
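
For illustration, here is a minimal sketch of streaming the file through Spring Batch's FlatFileItemReader. The method name readWithSpringBatch is hypothetical, and the lambda line mapper with its naive comma split is a placeholder assumption; a real job would use a DefaultLineMapper with a DelimitedLineTokenizer to handle quoting properly:

    import org.springframework.batch.item.ExecutionContext;
    import org.springframework.batch.item.file.FlatFileItemReader;
    import org.springframework.core.io.FileSystemResource;

    public void readWithSpringBatch(String ipFilePath) throws Exception {
        FlatFileItemReader<String[]> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(ipFilePath));
        // placeholder mapper: splits on commas without handling quoted fields
        reader.setLineMapper((line, lineNumber) -> line.split(","));
        reader.open(new ExecutionContext());
        try {
            String[] record;
            // read() returns one record at a time, so the whole file is never in memory
            while ((record = reader.read()) != null) {
                // process the record
            }
        } finally {
            reader.close();
        }
    }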

Upvotes: 0

Guts

Reputation: 768

The simplest solution is to increase the heap size with the "-Xmx" flag, for example: "-Xmx1024m" (passed on the java command line). First, though, you should use a heap size monitoring tool to see whether the usage is expected.
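
As a quick sanity check before reaching for a full profiler, you can log the JVM's heap figures from within the application; a minimal sketch using the standard Runtime API:

    // prints how much heap the JVM may use (-Xmx), has currently reserved, and has free
    long maxHeap = Runtime.getRuntime().maxMemory();
    long totalHeap = Runtime.getRuntime().totalMemory();
    long freeHeap = Runtime.getRuntime().freeMemory();
    System.out.printf("max=%dMB total=%dMB free=%dMB%n",
            maxHeap / (1024 * 1024), totalHeap / (1024 * 1024), freeHeap / (1024 * 1024));

If maxHeap is already close to the size of the data you try to hold (a 150 MB CSV expands considerably once stored as a List of String arrays), raising -Xmx or switching to streaming is the way out.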

Upvotes: 1
