kamrul Islam Tushar

Reputation: 41

What is the best way to fetch millions of rows at a time in Spring Boot?

I have a Spring Boot application, and for a particular feature I have to prepare a CSV every day for another service to use. The job runs every day at 6 AM and dumps the CSV on the server. The issue is that the data set is big: around 7.8 million rows. I am using Spring Data JPA to fetch all the records. Is there a better way to make it more efficient? Here's my code:

@Scheduled(cron = "0 1 6 * * ?")
public void saveMasterChildList() {

    log.debug("running write job");
    DateFormat dateFormatter = new SimpleDateFormat("dd_MM_yy");
    String currentDateTime = dateFormatter.format(new Date());

    String fileName = currentDateTime + "_Master_Child.csv";
    ICsvBeanWriter beanWriter = null;
    List<MasterChild> masterChildren = masterChildRepository.findByMsisdnIsNotNull();
    try {
        beanWriter = new CsvBeanWriter(new FileWriter(new File("/u01/edw_bill/", fileName)),
            CsvPreference.STANDARD_PREFERENCE);
        String[] header = {"msisdn"};
        String[] nameMapping = {"msisdn"};
        beanWriter.writeHeader(header);
        for (MasterChild masterChild : masterChildren) {
            beanWriter.write(masterChild, nameMapping);
        }
    } catch ( IOException e) {
        log.debug("Error writing the CSV file {}", e.toString());
    } finally {
        if (beanWriter != null) {
            try {
                beanWriter.close();
            } catch (IOException e) {
                log.debug("Error closing the writer {}", e.toString());
            }
        }
    }

}

Upvotes: 1

Views: 6663

Answers (1)

Joja

Reputation: 36

You could use pagination to split the data and load it chunk by chunk, writing each chunk to the CSV before fetching the next, so the whole result set never sits in memory at once. A sketch of that follows.
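A minimal sketch with Spring Data's Pageable, assuming the repository can extend JpaRepository and that a paged overload of findByMsisdnIsNotNull is added; the 50,000 page size and the Long ID type are placeholders you would adjust for your entity and heap:

import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;

// Repository: add a paged variant of the existing query method.
public interface MasterChildRepository extends JpaRepository<MasterChild, Long> {
    Page<MasterChild> findByMsisdnIsNotNull(Pageable pageable);
}

// In the job, replace the single findByMsisdnIsNotNull() call with a loop
// that fetches one page at a time and writes it out before loading the next,
// so only one chunk is held in memory at a time.
int pageNumber = 0;
Page<MasterChild> chunk;
do {
    chunk = masterChildRepository.findByMsisdnIsNotNull(PageRequest.of(pageNumber, 50_000));
    for (MasterChild masterChild : chunk.getContent()) {
        beanWriter.write(masterChild, nameMapping);
    }
    pageNumber++;
} while (chunk.hasNext());

Note that returning Page triggers an extra count query for every page; if you don't need the total count, returning Slice<MasterChild> instead avoids it.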

Upvotes: 1
