rum-and-laughs

Reputation: 43

Read large csv file into array c++

I am trying to read a CSV file with over 170000 rows, each with 10 columns. I wrote this code in C++ (Visual Studio 2017) to read it, but it only reads about 3600 entries before failing.

// Trial1.cpp : Defines the entry point for the console application.
//

#include "stdafx.h"
#include <iostream>
#include <cstdlib>
#include <fstream>
#include <sstream>

using namespace std;
int main()
{

    ifstream file("Brightest Day.csv");

    if (!file.is_open())
    {
        cout << "ERROR: File Open" << "\n";
    }

    string data[3000][10];

    for (long i = 0; i < 3000; i++)
    {

            for (int j = 0; j < 10; j++)
            {
                getline(file, data[i][j], ',');

            }

    }

    for (long i = 0; i < 3000; i++)
    {

        for (int j = 0; j < 10; j++)
        {
            cout<<data[i][j]<<" | ";
            if (j == 9)
            {
                cout << "\n";
            }
        }
    }
    return 0;
}

Even if it could only read around 10000 entries, I'd call it a success.

Upvotes: 1

Views: 1272

Answers (1)

Drew Dormann

Reputation: 63775

You are overflowing your stack. Welcome to this website.

Your call stack is a small, fixed-size region (commonly 1 MB by default on Windows) designed for small objects whose sizes are known at compile time. string data[3000][10] places 30,000 std::string objects on it at once, which by itself approaches that limit. That's also why your rubber ducky is wondering where 3000 came from. It's a guess: any file longer than the guess gets silently truncated, and any guess large enough for your 170,000-row file blows the stack outright. If you think 10000 lines is a success, sizing the array for 10000 lines only makes the overflow worse.

Use std::vector. It's an array-like structure. It manages its own size. And it doesn't store your data on the limited stack.

Upvotes: 5

Related Questions