Reputation: 1
I have a folder with 20,000+ data entries, and I'm trying to graph the time between each entry to see where our process slows down. I've been trying to use the dir() function in MATLAB, combining a couple of different pieces of code I've found, but I feel like I'm way out of my depth and can't even get the basic structure right.
for i = 1:25910
    n = num2str(i);
    d = dir(['P' n '_Bump.datx']);   % concatenate the pieces of the file name
    moddate = d.datenum;             % numeric modification time (d.date is a char)
    plot(i, moddate, '.')
    hold on
end
I'm more familiar with Python, if there's a similar function there that could pull the timestamp off a file.
Data is formatted like:
P1_Bump.datx
P2_Bump.datx
...
P25910_Bump.datx
Upvotes: 0
Views: 47
Reputation: 30046
You can pull all of your files into a table (which is slightly easier to deal with than a struct in this case) like this:
files = dir( '*_Bump.datx' );
files = struct2table( files, 'AsArray', true );
Then get the dates from the datenum field of the table (since the date field is a char and not a MATLAB date):
dates = datetime( files.datenum, 'convertfrom', 'datenum' )
To calculate the time between files, you can use diff:
timesBetween = diff(dates);
Then plot
figure;
plot( timesBetween );
Note that timesBetween will be one element shorter than the number of files, since it contains the differences between consecutive files.
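One thing to watch out for: dir typically returns the names in alphabetical order, so for example P10_Bump.datx sorts before P2_Bump.datx. If you want the gaps between consecutively numbered files, and the x-axis to show when each gap occurred, something like this should work (a sketch, assuming every name matches the P<number>_Bump.datx pattern):
idx = cellfun( @(s) sscanf( s, 'P%d_Bump.datx' ), files.name );  % numeric index in each name
[ ~, order ] = sort( idx );                                      % numeric order, not alphabetical
dates = datetime( files.datenum(order), 'convertfrom', 'datenum' );
timesBetween = diff( dates );
figure;
plot( dates(2:end), timesBetween );  % each gap plotted at the time of the later file
xlabel( 'file timestamp' );
ylabel( 'time since previous file' );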
Upvotes: 1