Reputation: 1
I'm curious about how to upload large files in chunks using the MinIO Python SDK in a FastAPI backend.
I've implemented chunked receiving on the server with request.stream(), following this post: How to Upload a large File (≥3GB) to FastAPI backend?
In this scenario, I need to forward each chunk to MinIO immediately as it is received, rather than first writing the whole file to the server and uploading it afterwards. Is this possible with the MinIO Python SDK?
Currently, it is implemented as follows.
parser = StreamingFormDataParser(headers=request.headers)
file_target = FileTarget(temp_file_path)
parser.register("file", file_target)

async for chunk in request.stream():
    parser.data_received(chunk)

await upload_stream_data_to_minio(filename, object_path)
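One way to avoid the temporary file is to expose the incoming chunks as a file-like object and hand that to a single put_object call. With length=-1 and a part_size, the MinIO SDK performs a streaming multipart upload, reading (and sending) data as it arrives instead of requiring the full file up front. Below is a minimal sketch of such an adapter; ChunkStream and the bucket/object names are illustrative, and since the MinIO SDK is synchronous, the async request.stream() generator would still need to be bridged to it (e.g. via a queue and a thread pool):

```python
import io

class ChunkStream(io.RawIOBase):
    """Wrap an iterator of bytes chunks as a readable file-like object,
    so chunks can be fed to MinIO's put_object() without a temp file."""

    def __init__(self, chunks):
        self._chunks = iter(chunks)
        self._buf = b""

    def readable(self):
        return True

    def readinto(self, b):
        # Refill the internal buffer from the chunk iterator as needed.
        while not self._buf:
            try:
                self._buf = next(self._chunks)
            except StopIteration:
                return 0  # EOF
        n = min(len(b), len(self._buf))
        b[:n] = self._buf[:n]
        self._buf = self._buf[n:]
        return n

# With length=-1 and part_size set, put_object streams the data to MinIO
# as a multipart upload while it is being read (client/bucket assumed):
# client.put_object("my-bucket", "large-file.bin",
#                   ChunkStream(chunks), length=-1,
#                   part_size=10 * 1024 * 1024)

# Quick local check of the adapter itself:
print(ChunkStream([b"hello ", b"world"]).read())  # b'hello world'
```

This keeps a single object name for the whole upload, so nothing gets overwritten chunk by chunk.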
I referred to this post and video.
I confirmed that when I upload the chunks to MinIO with put_object, each chunk overwrites the previous one, because every call uses the same object name.
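That overwriting is expected: a put in object storage replaces the whole object, so calling put_object once per chunk under one name keeps only the last chunk. A tiny simulation of that behavior (the dict stands in for the bucket, and the object name is made up):

```python
# Simulate per-chunk put_object calls against the same object name.
store = {}  # stands in for the bucket
for chunk in [b"chunk-1", b"chunk-2", b"chunk-3"]:
    store["video.mp4"] = chunk  # each "put" replaces the whole object

print(store["video.mp4"])  # b'chunk-3' -- only the last chunk survives
```

To keep one object, the upload has to be a single streaming put (or an explicit multipart upload), not repeated puts.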
Upvotes: 0
Views: 1546