Reputation: 528
I'm trying to build a feature in an admin panel (with React and the Ant Design framework).
I currently have around 15,000 to 20,000 user records in a CSV file, and this file is updated regularly. Since the list is fairly large, I was thinking of creating an upload feature where an admin can upload the updated CSV file; it would be transformed into JSON and only selected fields would be stored in Firestore. This means that every time I upload a new file, all 15,000 to 20,000 records (about 26 MB) are overwritten. Is this kind of workflow inefficient? The data may grow even larger, so I can't realistically update it manually.
Can someone give me some advice on how to handle this situation?
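This is roughly the import step I have in mind (just a sketch: the CSV parsing, the users collection, and the field names are placeholders, not my real schema):

import firebase from 'firebase/app';
import 'firebase/firestore';

firebase.initializeApp({ /* my config */ });
const db = firebase.firestore();

// Overwrite the "users" collection with only the fields I need.
// Assumes the CSV has already been parsed into row objects (e.g. with PapaParse).
async function overwriteUsers(rows) {
  // Firestore allows at most 500 writes per batch, so commit in chunks.
  for (let i = 0; i < rows.length; i += 500) {
    const batch = db.batch();
    rows.slice(i, i + 500).forEach(row => {
      const ref = db.collection('users').doc(String(row.id));
      batch.set(ref, { name: row.name, email: row.email }); // selected fields only
    });
    await batch.commit();
  }
}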
Upvotes: 2
Views: 699
Reputation: 5390
Browsers allow files of 2 to 4 GB (some even more) to be uploaded, so your file is not large.
I have worked on a task like this before and can say that the antd
Table handles it easily. Only the filters are a little slow.
import React, { useState } from 'react';
import ReactDOM from 'react-dom';
import { Table } from 'antd';

const columns = [
  /** Here was my columns */
];

const App = () => {
  const [data, setData] = useState([]);

  // Read the selected file in the browser and feed the parsed rows to the table.
  const handleUpload = (e) => {
    const file = e.target.files[0];
    const reader = new FileReader();
    reader.onload = event => {
      try {
        const result = event.target.result;
        setData(JSON.parse(result));
      } catch (err) {
        console.log(err);
      }
    };
    reader.readAsText(file);
  };

  return (
    <>
      <input type="file" onChange={handleUpload} />
      <Table
        columns={columns}
        dataSource={data}
      />
    </>
  );
};

ReactDOM.render(
  <App />,
  document.getElementById('root')
);
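Since the file in your case is a CSV rather than JSON, the onload handler could build the rows itself instead of calling JSON.parse. A minimal variation, assuming the first line is a header row and the fields contain no quoted commas:

const handleUpload = (e) => {
  const file = e.target.files[0];
  const reader = new FileReader();
  reader.onload = event => {
    // Naive CSV-to-objects conversion: the first line holds the column names.
    const [header, ...lines] = event.target.result.trim().split('\n');
    const keys = header.split(',').map(k => k.trim());
    setData(lines.map((line, index) => {
      const values = line.split(',');
      // antd's Table expects a unique key per row (or use the rowKey prop).
      return { key: index, ...Object.fromEntries(keys.map((k, i) => [k, values[i]])) };
    }));
  };
  reader.readAsText(file);
};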
Glad if my answer was helpful to you :) Regards!
Upvotes: 3