Reputation: 13
I'm developing an API to upload multiple files to S3 with Akka HTTP. I'm currently using the fileUploadAll directive, which buffers all the files to disk; that buffering limits the size of the files the API can handle. Are there alternative approaches? How else can I handle multipart/form-data requests?
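For reference, here is a stripped-down sketch of what I have now (the field name "files" and the route path are placeholders):
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route
import akka.stream.scaladsl.Source
import akka.util.ByteString
// by the time the inner route runs, fileUploadAll has already written every
// part to a temporary file on disk, which is what limits the file sizes I can handle
val route: Route =
  path("upload") {
    post {
      fileUploadAll("files") { parts =>
        val files: Seq[(String, Source[ByteString, Any])] =
          parts.map { case (metadata, bytes) => metadata.fileName -> bytes }
        complete(s"received ${files.size} files")
      }
    }
  }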
Upvotes: 1
Views: 1031
Reputation: 19527
Here's a simple example that takes a list of file paths, converts the list to a single Source[ByteString, _], and runs that Source with an Alpakka S3 connector Sink that uploads the data to S3:
import java.nio.file.Paths
import akka.stream.alpakka.s3.MultipartUploadResult // package location in recent Alpakka releases
import akka.stream.scaladsl.{FileIO, Sink, Source}
import akka.util.ByteString
import scala.concurrent.Future
val paths = List(Paths.get("/path/to/file1"), Paths.get("/path/to/file2"))
val source: Source[ByteString, _] = Source(paths).flatMapConcat(FileIO.fromPath(_))
// read the Alpakka documentation about setting up an S3 client and sink
val s3Sink: Sink[ByteString, Future[MultipartUploadResult]] = ???
val fut: Future[MultipartUploadResult] = source.runWith(s3Sink)
You could use fut with one of the future directives in your Akka HTTP route.
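For example, with the onSuccess directive (the route path and response text here are placeholders):
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route
// fut is the Future[MultipartUploadResult] from the snippet above; in a real
// service you would usually build and run the stream inside the route, per request
val route: Route =
  path("upload") {
    post {
      onSuccess(fut) { result =>
        complete(s"upload finished at ${result.location}")
      }
    }
  }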
As mentioned, the above approach creates a single Source. If you need distinct buckets and keys for each file, then you could launch separate streams for each file:
val source1: Source[ByteString, _] = FileIO.fromPath(Paths.get("/path/to/file1"))
val source2: Source[ByteString, _] = FileIO.fromPath(Paths.get("/path/to/file2"))
// one Alpakka S3 sink per target bucket/key
val s3Sink1: Sink[ByteString, Future[MultipartUploadResult]] = ???
val s3Sink2: Sink[ByteString, Future[MultipartUploadResult]] = ???
val fut1: Future[MultipartUploadResult] = source1.runWith(s3Sink1)
val fut2: Future[MultipartUploadResult] = source2.runWith(s3Sink2)
// Future.sequence requires an implicit ExecutionContext in scope
val fut: Future[List[MultipartUploadResult]] = Future.sequence(List(fut1, fut2))
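To tie this back to the multipart/form-data question: rather than buffering with fileUploadAll, you could stream each incoming part straight into its own S3 upload. A rough sketch, assuming a recent Alpakka release where S3.multipartUpload(bucket, key) materializes a Sink[ByteString, Future[MultipartUploadResult]]; the bucket name and key naming scheme are placeholders:
import akka.http.scaladsl.model.Multipart
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route
import akka.stream.Materializer
import akka.stream.alpakka.s3.MultipartUploadResult
import akka.stream.alpakka.s3.scaladsl.S3
import akka.stream.scaladsl.Sink
import scala.concurrent.Future
// stream every part of the request directly to S3, so nothing is written to disk;
// mapAsync(1) processes parts one at a time, since they arrive sequentially on the wire
def uploadRoute(implicit mat: Materializer): Route =
  path("upload") {
    post {
      entity(as[Multipart.FormData]) { formData =>
        val uploads: Future[Seq[MultipartUploadResult]] =
          formData.parts
            .mapAsync(parallelism = 1) { part =>
              val key = part.filename.getOrElse(part.name) // placeholder key scheme
              part.entity.dataBytes.runWith(S3.multipartUpload("my-bucket", key))
            }
            .runWith(Sink.seq)
        onSuccess(uploads)(results => complete(s"uploaded ${results.size} files"))
      }
    }
  }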
Upvotes: 5