Reputation: 1203
I have a service that takes an HttpRequest from a client, gets a file from another server via REST, and then forwards the file to the client as an HttpResponse.
Don't ask me why the client doesn't fetch the file itself; that is a long story.
I put together an approach that downloads the file to the file system and then sends it to the client, using extracts from other Stack Overflow answers by @RamonJRomeroyVigil.
def downloadFile(request: HttpRequest, fileName: String): Future[IOResult] = {
  Http().singleRequest(request).flatMap { response =>
    val source = response.entity.dataBytes
    source.runWith(FileIO.toPath(Paths.get(fileName)))
  }
}
def buildResponse(fileName: String): HttpResponse = {
  val bufferedSrc = scala.io.Source.fromFile(fileName)
  val source = Source
    .fromIterator(() => bufferedSrc.getLines())
    .map(ChunkStreamPart.apply)
  HttpResponse(entity = HttpEntity.Chunked(ContentTypes.`application/octet-stream`, source))
}
However, I would like to do this in one step, without saving to the file system, and take advantage of the streaming abilities.
I would also like to limit the number of requests the service can serve at the same time to 5.
Thanks
Upvotes: 3
Views: 959
Reputation: 646
As you are already getting the file as a stream from the second server, you can forward it directly to the client. You only need to build your HttpResponse on the fly:
def downloadFile(request: HttpRequest): Future[HttpResponse] = {
  Http().singleRequest(request).map {
    case okResponse @ HttpResponse(StatusCodes.OK, _, _, _) =>
      // Re-chunk the upstream entity bytes directly into the outgoing response
      HttpResponse(
        entity = HttpEntity.Chunked(
          ContentTypes.`application/octet-stream`,
          okResponse
            .entity
            .dataBytes
            .map(ChunkStreamPart.apply)
        ))
    case nokResponse @ HttpResponse(_, _, _, _) =>
      // Pass any non-OK upstream response through unchanged
      nokResponse
  }
}
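For completeness, here is a minimal sketch of how this could be wired into a server route, assuming Akka 2.6+ / Akka HTTP 10.2+; the upstream URI, route path, host and port below are placeholders for your own setup:

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.{HttpRequest, Uri}
import akka.http.scaladsl.server.Directives._

implicit val system: ActorSystem = ActorSystem("file-proxy")
import system.dispatcher // execution context for the Future combinators above

// Hypothetical upstream location; replace with the real file server URI.
val upstreamUri = Uri("http://fileserver.example/files/report.pdf")

val route =
  path("download") {
    get {
      // complete accepts a Future[HttpResponse], so the body is streamed
      // straight from the upstream server to the client.
      complete(downloadFile(HttpRequest(uri = upstreamUri)))
    }
  }

Http().newServerAt("localhost", 8080).bind(route)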
To change the maximum number of concurrent requests allowed for the client, you would need to set akka.http.host-connection-pool.max-connections and akka.http.host-connection-pool.max-open-requests. More details can be found in the Akka HTTP configuration documentation.
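Instead of the configuration file, the same limits can also be applied programmatically per request via ConnectionPoolSettings; the sketch below uses placeholder names and an example URI, and note that max-open-requests has historically had to be a power of two in some Akka HTTP versions:

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.HttpRequest
import akka.http.scaladsl.settings.ConnectionPoolSettings

implicit val system: ActorSystem = ActorSystem("file-proxy") // or reuse your existing system

// Same limits as the configuration keys above, applied per request.
val limitedPoolSettings: ConnectionPoolSettings =
  ConnectionPoolSettings(system)
    .withMaxConnections(5)  // akka.http.host-connection-pool.max-connections
    .withMaxOpenRequests(8) // akka.http.host-connection-pool.max-open-requests

val responseFuture =
  Http().singleRequest(
    HttpRequest(uri = "http://fileserver.example/files/report.pdf"),
    settings = limitedPoolSettings
  )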
Upvotes: 2