Reputation: 1331
I have written data like this in a file (roughly):
{:a 25 :b 28}
{:a 2 :b 50}
...
I want to have a lazy sequence of these maps.
There are around 40 million lines. I could also write chunks of 10000 lines, but I do not think it would change how the functions are written (mapcat instead of map).
To read it, I wrote
(with-open [affectations (io/reader "dev-resources/affectations.edn")]
(map read-string affectations))
The problem is that Clojure tells me:
Don't know how to create ISeq from : java.io.BufferedReader
To be honest, I understand nothing about the java.io namespace. I would like a lazy sequence of the data in the file, but I do not know how to turn the stream into strings and then into collections.
Any idea? Is read-line the way to do it?
Thanks
Upvotes: 3
Views: 2488
Reputation: 13175
You are passing a java.io.BufferedReader to map, whereas map expects a seq.
You need to use line-seq
to produce a (lazy) seq of lines from your file:
(with-open [affectations (io/reader "dev-resources/affectations.edn")]
  (map read-string (line-seq affectations)))
Remember that you need to force all side effects on the data read from a resource opened in with-open
within its scope, otherwise you will get errors.
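To illustrate why (a sketch, using a tiny sample file I create here as a stand-in for the real 40-million-line file; `sample.edn` and `affectation-maps` are names I made up):

```clojure
(require '[clojure.java.io :as io])

;; Create a tiny sample file (stand-in for the real data).
(spit "sample.edn" "{:a 25 :b 28}\n{:a 2 :b 50}\n")

;; The lazy seq escapes with-open: nothing is realized inside its scope.
(def affectation-maps
  (with-open [rdr (io/reader "sample.edn")]
    (map read-string (line-seq rdr))))

;; Realizing it afterwards fails, because the reader is already closed:
;; (first affectation-maps) ;=> java.io.IOException: Stream closed
```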
One option is to force the whole seq of parsed lines and return it using doall
. However, that reads all your data into memory, which doesn't seem practical here.
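For completeness, the eager doall variant could look like this (a sketch; `read-all-maps` is a name I made up, not from the answer):

```clojure
(require '[clojure.java.io :as io])

;; Eager variant: doall realizes the whole seq while the reader is
;; still open, so it is safe -- but the entire file ends up in memory.
(defn read-all-maps [filename]
  (with-open [rdr (io/reader filename)]
    (doall (map read-string (line-seq rdr)))))
```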
I guess you need to execute some logic for each line of the file and you don't need to keep all the parsed collections in memory. In that case you could pass a function representing that logic into a function that handles reading the file:
(defn process-file [filename process-fn]
  (with-open [reader (io/reader filename)]
    (doseq [line (line-seq reader)]
      (-> line
          (read-string)
          (process-fn)))))
This function will read your file line by line, converting each line individually with read-string
and calling your process-fn
on the result. process-file
returns nil
.
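As a hypothetical usage example (the accumulator `sum-a` is mine, not from the answer), summing the :a values while holding only one parsed map in memory at a time:

```clojure
;; Hypothetical usage of process-file above: sum the :a values.
(def sum-a (atom 0))

(process-file "dev-resources/affectations.edn"
              (fn [m] (swap! sum-a + (:a m))))

;; @sum-a now holds the total of all :a values in the file.
```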
Upvotes: 2