Reputation: 999
I want to stream large binary files (exe, jpg, ..., all file types). It seems that aleph can do it. I looked at the official sample and understood that if we pass a lazy sequence (or a Manifold stream) as the body, the response can be streamed:
(defn streaming-numbers-handler
  "Returns a streamed HTTP response, consisting of newline-delimited numbers every 100
  milliseconds. While this would typically be represented by a lazy sequence, instead we use
  a Manifold stream. Similar to the use of the deferred above, this means we don't need
  to allocate a thread per-request.

  In this handler we're assuming the string value for `count` is a valid number. If not,
  `Integer.parseInt()` will throw an exception, and we'll return a `500` status response
  with the stack trace. If we wanted to be more precise in our status, we'd wrap the parsing
  code with a try/catch that returns a `400` status when the `count` is malformed.

  `manifold.stream/periodically` is similar to Clojure's `repeatedly`, except that it emits
  the value returned by the function at a fixed interval."
  [{:keys [params]}]
  (let [cnt (Integer/parseInt (get params "count" "0"))]
    {:status 200
     :headers {"content-type" "text/plain"}
     :body (let [sent (atom 0)]
             (->> (s/periodically 100 #(str (swap! sent inc) "\n"))
                  (s/transform (take cnt))))}))
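Applied to file streaming, the same idea looks roughly like this. This is a minimal sketch, not from the sample; the `file-chunk-stream` helper and the 8 KB chunk size are my own, and it assumes the same `manifold.stream` alias as above:

```clojure
(require '[manifold.stream :as s]
         '[clojure.java.io :as io])

;; Hypothetical sketch: expose a file as a source of small byte-array
;; chunks. Only one chunk is in memory at a time; the consumer pulls
;; chunks instead of the whole file being realized up front.
(defn file-chunk-stream
  [^java.io.File f]
  (let [in (io/input-stream f)]
    (s/->source
      ((fn step []
         (lazy-seq
           (let [buf (byte-array 8192)
                 n   (.read in buf)]
             (if (pos? n)
               ;; copy only the n bytes actually read
               (cons (java.util.Arrays/copyOf buf n) (step))
               (do (.close in) nil)))))))))

(defn file-handler [req]
  {:status 200
   :headers {"content-type" "application/octet-stream"}
   :body (file-chunk-stream (io/file "big.bin"))})
```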
I have the following code:
(ns core
  (:require [aleph.http :as http]
            [byte-streams :as bs]
            [cheshire.core :refer [parse-string generate-string]]
            [clojure.core.async :as a]
            [clojure.java.io :as io]
            [clojure.string :as str]
            [clojure.tools.logging :as log]
            [clojure.walk :as w]
            [compojure.core :as compojure :refer [ANY GET defroutes]]
            [compojure.route :as route]
            [me.raynes.fs :as fs]
            [ring.util.response :refer [response redirect content-type]]
            [ring.middleware.params :as params]
            [ring.util.codec :as cod])
  (:import (java.io File FileInputStream InputStream))
  (:gen-class))
(def share-dir "\\\\my-pc-network-path")
(def content-types
  ;; keys must be lower-case, because the lookup lower-cases the extension
  {".png"  "image/png"
   ".gif"  "image/gif"
   ".jpeg" "image/jpeg"
   ".svg"  "image/svg+xml"
   ".tiff" "image/tiff"
   ".ico"  "image/vnd.microsoft.icon"
   ".bmp"  "image/bmp"
   ".css"  "text/css"
   ".csv"  "text/csv"
   ".html" "text/html"
   ".txt"  "text/plain"
   ".xml"  "text/xml"})
(defn byte-seq [^InputStream is size]
  (let [ib (byte-array size)]
    ((fn step []
       (lazy-seq
         (let [n (.read is ib)]
           (when (not= -1 n)
             (let [cb (chunk-buffer n)]
               ;; copy only the n bytes actually read, not the whole buffer
               (dotimes [i n] (chunk-append cb (aget ib i)))
               (chunk-cons (chunk cb) (step))))))))))
(defn get-file [req]
  (let [uri (:uri req)
        file (str/replace (w/keywordize-keys (cod/form-decode uri)) #"/file/" "")
        full-dir (io/file share-dir file)
        _ (log/debug "FULL DIR: " full-dir)
        filename (.getName (File. (.getParent full-dir)))
        extension (fs/extension filename)
        _ (log/debug "EXTENSION: " extension)
        resp {:status 200
              :headers {"Content-Type" (get content-types (.toLowerCase extension) "application/octet-stream")
                        "Content-Disposition" (str "inline; filename=\"" filename "\"")}
              :body (with-open [is (FileInputStream. full-dir)]
                      (let [bs (byte-seq is 4096)]
                        (byte-array bs)))}]
    resp))
(def handler
  (params/wrap-params
    (compojure/routes
      (GET "/file/*" [] get-file)
      (route/not-found "No such file."))))
(defn -main [& args]
  (println "Starting...")
  (http/start-server handler {:host "0.0.0.0" :port 5555}))
I take the URI and try to read the file in chunks, because the files can be around 3 GB each. My expectation was that the application would never use more memory than the size of one chunk. But when I set the heap to 1 GB (the -Xmx option), the application consumed all of it. Why is the full 1 GB taken? Does the JVM work this way? With 100 simultaneous connections (each downloading a ~3 GB file), I get an OutOfMemoryError.
The task is to stream each file in chunks so that an OutOfMemoryError is avoided.
Upvotes: 3
Views: 1344
Reputation: 651
In the get-file function you are calling byte-array on the result of byte-seq. byte-array will realize the LazySeq returned by byte-seq, meaning all of it will be in memory.
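For illustration (a hypothetical REPL snippet, not from the question): byte-array must walk the entire collection and copy it into one contiguous array before anything is returned, so a 3 GB lazy sequence of bytes becomes a 3 GB heap allocation:

```clojure
;; byte-array realizes every element before it can size and fill the
;; result, so the whole (possibly lazy) sequence is in memory at once.
(def lazy-bytes (map byte (range 5)))   ; stands in for byte-seq's result
(alength (byte-array lazy-bytes))       ; 5 elements, but only after full realization
```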
As far as I know (at least for the TCP server), Aleph accepts any lazy sequence of ByteBuffers (or byte arrays) as a body and will handle it for you, so you could just return the result of calling byte-seq directly:
(defn get-file [req]
  (let [uri (:uri req)
        file (str/replace (w/keywordize-keys (cod/form-decode uri)) #"/file/" "")
        full-dir (io/file share-dir file)
        _ (log/debug "FULL DIR: " full-dir)
        filename (.getName (File. (.getParent full-dir)))
        extension (fs/extension filename)
        _ (log/debug "EXTENSION: " extension)
        ;; Note: no with-open here -- the lazy sequence is consumed after
        ;; this function returns, so the stream must stay open until the
        ;; whole body has been written.
        resp {:status 200
              :headers {"Content-Type" (get content-types (.toLowerCase extension) "application/octet-stream")
                        "Content-Disposition" (str "inline; filename=\"" filename "\"")}
              :body (byte-seq (FileInputStream. full-dir) 4096)}]
    resp))
Aleph complies with the Ring specification, and :body also accepts a File or an InputStream, so there is no need to return the bytes yourself:
(defn get-file [req]
  (let [uri (:uri req)
        file (str/replace (w/keywordize-keys (cod/form-decode uri)) #"/file/" "")
        full-dir (io/file share-dir file)
        _ (log/debug "FULL DIR: " full-dir)
        filename (.getName (File. (.getParent full-dir)))
        extension (fs/extension filename)
        _ (log/debug "EXTENSION: " extension)
        resp {:status 200
              :headers {"Content-Type" (get content-types (.toLowerCase extension) "application/octet-stream")
                        "Content-Disposition" (str "inline; filename=\"" filename "\"")}
              :body full-dir}]
    resp))
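If handing back a File is not convenient (for example, you want to wrap or decorate the data first), an InputStream body works the same way under the Ring spec. A minimal sketch, assuming the same `share-dir` layout as above (the `get-file-stream` name is mine, and I'm assuming the server closes the stream once the body has been written):

```clojure
(defn get-file-stream [req]
  (let [full-dir (io/file share-dir (subs (:uri req) (count "/file/")))]
    {:status 200
     :headers {"Content-Type" "application/octet-stream"}
     ;; No with-open: the body is read asynchronously after this
     ;; function returns, so closing here would truncate the response.
     :body (io/input-stream full-dir)}))
```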
Upvotes: 0