Reputation: 11
I am trying to multipart-upload files (which can get very large) to AWS S3 using the Upload class from the @aws-sdk/lib-storage library. Unfortunately, I am receiving the error below. The S3 bucket is accessed through the credentials of an IAM user.
TypeError: Cannot read properties of undefined (reading 'done')
at getChunkStream.js:5:5
at Generator.next (<anonymous>)
at resume (chunk-Y34XHC4O.js?v=a9de758f:93:27)
at chunk-Y34XHC4O.js?v=a9de758f:99:63
at new ZoneAwarePromise (zone.js:1422:21)
at it.<computed> [as next] (chunk-Y34XHC4O.js?v=a9de758f:99:38)
at Upload.<anonymous> (Upload.js:114:9)
at Generator.next (<anonymous>)
at chunk-Y34XHC4O.js?v=a9de758f:83:61
at new ZoneAwarePromise (zone.js:1422:21)
Here is my code, which is more or less copied from here
// imports at the top of the component file
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

async multipartUpload(event: any) {
  var file = event.target[0].files[0];
  var creds = { accessKeyId: this.ACCESS_KEY, secretAccessKey: this.SECRET_ACCESS_KEY };
  var target = { Bucket: this.BUCKET_NAME, Key: file.name, Body: file };
  var s3Client = new S3Client({ region: "ap-south-1", credentials: creds });
  try {
    const parallelUploads3 = new Upload({
      client: s3Client,
      params: target,
    });
    parallelUploads3.on("httpUploadProgress", (progress) => {
      console.log(progress);
    });
    await parallelUploads3.done();
  } catch (e) {
    console.log(e);
  }
}
I verified that the s3Client object was working by uploading a file with PutObjectCommand, which shows that this is not an issue with credentials like access keys, the bucket name, etc.
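For reference, the single-request upload that does succeed looks roughly like this (a sketch; it assumes the same s3Client, file, and bucket fields as in the method above):

import { PutObjectCommand } from "@aws-sdk/client-s3";

// Simple non-multipart upload used only to confirm the client and credentials are fine
const result = await s3Client.send(
  new PutObjectCommand({
    Bucket: this.BUCKET_NAME,
    Key: file.name,
    Body: file,
  })
);
console.log(result.$metadata.httpStatusCode); // 200 when the upload succeeds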
Upvotes: 1
Views: 227
Reputation: 324
Answer (the solution that worked in my case)
I was seeing the same error in production (TypeError: Cannot read properties of undefined (reading 'done')), even though everything worked fine locally. It turned out to be related to how my Vite config was bundling the AWS SDK code.
Here’s what fixed it for me:
1. Reorder the plugins in vite.config.js. Placing the @vitejs/plugin-react-swc plugin before any other plugins (like vite-plugin-checker) made sure the transformations for async/generator functions used by the AWS SDK weren't stripped out or mangled in production.
2. Resolve browser vs. Node references. In my resolve.alias, I mapped ./runtimeConfig to ./runtimeConfig.browser and set global: 'globalThis' in optimizeDeps.esbuildOptions to avoid polyfill issues (that esbuildOptions part isn't in the example below; see the short snippet after it).
Below is an example of my working vite.config.js (the critical change is the plugin order and the alias/config tweaks). This solution may not apply to every bundler setup, but it resolved the done() being undefined in production for me:
import path from 'path';
import checker from 'vite-plugin-checker';
import react from '@vitejs/plugin-react-swc';
import { loadEnv, defineConfig } from 'vite';

export default ({ mode }) => {
  // Merge variables from the shell environment and the .env file
  const env = {
    ...process.env,
    ...loadEnv(mode, path.resolve(process.cwd(), '.env'), ''),
  };

  return defineConfig({
    plugins: [
      // 1) React SWC plugin first
      react({ tsDecorators: true }),
      // 2) Then the checker
      checker({
        typescript: true,
        eslint: {
          lintCommand: 'eslint "./src/**/*.{js,jsx,ts,tsx}"',
        },
      }),
    ],
    resolve: {
      alias: [
        // ...other aliases
        {
          find: './runtimeConfig',
          replacement: './runtimeConfig.browser',
        },
      ],
    },
    build: {
      sourcemap: true,
      emptyOutDir: true,
    },
  });
};
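The optimizeDeps.esbuildOptions tweak from point 2 isn't shown in the config above; in my setup it was an extra top-level entry alongside resolve and build, roughly like this (the define key is how esbuild substitutes identifiers during dependency pre-bundling; the exact shape is from my config, so adjust it to yours):

// added inside the same defineConfig call, next to resolve and build
optimizeDeps: {
  esbuildOptions: {
    define: {
      global: 'globalThis', // avoids "global is not defined" polyfill issues in the browser
    },
  },
},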
Note: This exact fix worked for my specific setup. If you’re still running into issues, verify that your environment is purely browser-based (not SSR), that you’re calling the correct AWS SDK modules, and that minification or tree-shaking isn’t removing the AWS SDK’s generator functions.
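One quick way to rule out the minification angle (not part of my original fix, just a diagnostic) is to build once with minification disabled and see whether the error disappears:

// vite.config.js - temporary diagnostic, not a permanent setting
build: {
  sourcemap: true,
  emptyOutDir: true,
  minify: false, // if the 'done' error goes away, a minify/tree-shake step is mangling the SDK
},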
Upvotes: 0