Reputation: 121
TLDR: How do I open multiple (streamed txt) files, preferably asynchronously since they can be large, and display their contents in order?
I have a page which allows users to open multiple files. Each file's lines are read and pushed onto a display array, which is then shown with:
document.getElementById('content').innerHTML = '<table id="mainView">' + display.join('') + '</table>';
This works fine, but since the files are loaded asynchronously, the lines are pushed onto the array out of order. Is there a way to keep the asynchronous file loading but preserve the file order? Lastly, is there a way to do this in plain JavaScript without a library/framework?
The file order is determined by the ascending numerical naming convention:
log12 <- lines in this log should appear first in the table
log16
log48
log103
Current Output
log48 content
log12 content
log16 content
log103 content
Expected Output
log12 content
log16 content
log48 content
log103 content
Here is the gist of how I load multiple files, read them line by line, append the lines to an array, and display them on the page.
The file upload form calls this function:
function handleFileSelect(evt) {
    var files = evt.target.files;
    var fileArray = [].slice.call(files);
    for (var i = 0; i < fileArray.length; i++) {
        console.log(fileArray[i].name);
        getContents(fileArray[i]);
    }
}
Loading and displaying the lines for each file. I should note that display is a global array.
function getContents(f) {
    if (f) {
        var reader = new FileReader();
        var fileIndexname = f.name;
        reader.onload = function (evt) {
            var lines = evt.target.result;
            // do some file splitting with regex here (removed for simplicity)
            for (var i = 0; i < lines.length; i++) {
                display.push('<tr><td>' + lines[i] + '</td></tr>');
            }
            document.getElementById('content').innerHTML = '<table id="mainView">' + display.join('') + '</table>';
        };
        reader.readAsText(f, "UTF-8");
    }
}
Upvotes: 0
Views: 58
Reputation: 2530
I would rely on Promise.all to maintain the order of uploaded files.
On the change event of the file input, I map the uploaded files to Promises that read each file's content. You can see that I use a different API to read the file content, since Blob.text() is the newer, Promise-based API. Promise.all() then gathers the results, keeping the order in which the files were uploaded, if that is what you want.
If you instead need to sort all lines from all files, independently of the upload order, use the displayAllLinesSorted function below.
document.getElementById('upload').addEventListener('change', (event) => {
    const files = [].slice.call(event.target.files);
    const filePromises = files.map(file => {
        return file.text().then(result => {
            return result.split(/[\r\n]+/g);
        });
    });
    Promise.all(filePromises).then(display);
});

function linesReducer(accumulator, currentValue) {
    return accumulator + `<tr><td>${currentValue}</td></tr>`;
}

function display(results) {
    const table = document.getElementById('table');
    table.innerHTML = results.flat().reduce(linesReducer, '');
}

function displayAllLinesSorted(results) {
    const table = document.getElementById('table');
    table.innerHTML = results.flat().sort().reduce(linesReducer, '');
}
<input type="file" id="upload" multiple />
<table id="table"></table>
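Note that Promise.all() preserves the selection order, not the numerical order of the file names, and a plain sort() on names would put log103 before log12. Here is a minimal sketch, replacing the change listener above, that sorts the selected files by the number in their name before reading them; parseLogNumber is a hypothetical helper of mine, not part of any browser API.
// Sort files by the numeric part of their name (log12, log16, log48, ...)
// so the table order follows the naming convention, not the selection order.
function parseLogNumber(fileName) {
    const match = fileName.match(/\d+/);          // first run of digits in the name
    return match ? Number(match[0]) : Infinity;   // unnumbered files go last
}

document.getElementById('upload').addEventListener('change', (event) => {
    const files = [].slice.call(event.target.files)
        .sort((a, b) => parseLogNumber(a.name) - parseLogNumber(b.name));
    const filePromises = files.map(file => file.text().then(text => text.split(/[\r\n]+/g)));
    Promise.all(filePromises).then(display);      // display() as defined above
});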
Just a note, because you said:
async preferably as files can be large
Async does not mean that your performance will be OK on large files. In particular, the FileReader.readAsText() method you are using loads the whole file content at once. If you really want to handle large files you should use Blob.stream() or Blob.slice(), and your DOM <table> should only contain a part of the file at a time. Your UI should allow the user to move back and forth in the files.
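As a rough illustration of the Blob.slice() approach (not a full paging UI), here is a sketch that reads a file one fixed-size chunk at a time; readInChunks, the chunk size, and the onChunk callback are my own assumptions for illustration, not existing APIs.
// Read a file chunk by chunk with Blob.slice() so memory use stays bounded.
// Note: a slice boundary can fall in the middle of a line (or a multi-byte
// character), so a real implementation would carry the trailing partial line
// over to the next chunk.
async function readInChunks(file, onChunk, chunkSize = 1024 * 1024) {
    let offset = 0;
    while (offset < file.size) {
        const chunk = file.slice(offset, offset + chunkSize); // Blob.slice()
        const text = await chunk.text();                      // read only this slice
        onChunk(text, offset);
        offset += chunkSize;
    }
}

// Example usage: log each chunk's first line; a real UI would only render
// the chunk(s) currently visible in the table.
// readInChunks(someFile, (text, offset) => console.log(offset, text.split('\n')[0]));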
Upvotes: 2