Reputation: 2144
I'm trying to write a simple function that resizes a newly uploaded image in Storage. I'm using the following imports to help me do that:
import { tmpdir } from 'os';
import { join, dirname } from 'path';
import * as sharp from 'sharp';
import * as fs from 'fs-extra';
When this code executes:
await bucket.file(filePath).download({
  destination: tmpFilePath
});
I get the following error in the Google Cloud Function logs:
Error: ENOENT: no such file or directory, open '/tmp/images/1542144115815_Emperor_penguins.jpg' at Error (native)
Here is the full code segment:
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import { tmpdir } from 'os';
import { join, dirname } from 'path';
import * as sharp from 'sharp';
import * as fs from 'fs-extra';

// The Admin SDK is initialized elsewhere in the project
const gcs = admin.storage();
const db = admin.firestore();
export const imageResize = functions.storage
  .object()
  .onFinalize(async object => {
    console.log('> > > > > > > 1.3 < < < < < < <');
    const bucket = gcs.bucket(object.bucket);
    console.log(object.name);
    const filePath = object.name;
    const fileName = filePath.split('/').pop();
    const tmpFilePath = join(tmpdir(), object.name);
    const thumbFileName = 'thumb_' + fileName;
    const tmpThumbPath = join(tmpdir(), thumbFileName);
    console.log('step 1');

    // Exit if the file is already a thumbnail
    if (fileName.includes('thumb_')) {
      console.log('exiting function');
      return false;
    }
    console.log('step 2');
    console.log(`filePath: ${filePath}`);
    console.log(`tmpFilePath: ${tmpFilePath}`);

    // Download the original image to the local /tmp filesystem
    await bucket.file(filePath).download({
      destination: tmpFilePath
    });
    console.log('step 3');

    // Resizing image
    await sharp(tmpFilePath)
      .resize(200, 200)
      .toFile(tmpThumbPath);

    // Upload the thumbnail next to the original
    await bucket.upload(tmpThumbPath, {
      destination: join(dirname(filePath), thumbFileName)
    });
  });
UPDATE 1: I added await fs.ensureDir(tmpFilePath); before the download to ensure the file path exists. Now I'm getting a new error:
Error: EINVAL: invalid argument, open '/tmp/images/1542146603970_mouse.png' at Error (native)
UPDATE 2 SOLVED: Added a solution as an Answer below.
Upvotes: 4
Views: 8474
Reputation: 2144
I changed the following code
From
const bucket = gcs.bucket(object.bucket);
const filePath = object.name;
const fileName = filePath.split('/').pop();
const tmpFilePath = join(tmpdir(), object.name);
const thumbFileName = 'thumb_' + fileName;
const tmpThumbPath = join(tmpdir(), thumbFileName);
To
const bucket = gcs.bucket(object.bucket);
const filePath = object.name;
const fileName = filePath.split('/').pop();
const thumbFileName = 'thumb_' + fileName;
const workingDir = join(tmpdir(), `${object.name.split('/')[0]}/`); // new
const tmpFilePath = join(workingDir, fileName);
const tmpThumbPath = join(workingDir, thumbFileName);
await fs.ensureDir(workingDir);
As you can see, I created a workingDir that is shared by both temp paths, and then ran await fs.ensureDir(workingDir); to create that directory before downloading into it. That solved my problem. (The EINVAL in UPDATE 1 most likely happened because fs.ensureDir(tmpFilePath) created a directory at the file's own path, so the download could no longer open it as a file.)
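For reference, here is a minimal sketch of the whole handler with that change applied (same imports and setup as in the question; the 200x200 resize and the upload destination are unchanged):

export const imageResize = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket);
    const filePath = object.name;
    const fileName = filePath.split('/').pop();
    const thumbFileName = 'thumb_' + fileName;

    // Skip files that are already thumbnails
    if (fileName.includes('thumb_')) {
      return false;
    }

    // Build a working directory under /tmp and make sure it exists
    const workingDir = join(tmpdir(), `${object.name.split('/')[0]}/`);
    const tmpFilePath = join(workingDir, fileName);
    const tmpThumbPath = join(workingDir, thumbFileName);
    await fs.ensureDir(workingDir);

    // Download the original, resize it, and upload the thumbnail next to it
    await bucket.file(filePath).download({ destination: tmpFilePath });
    await sharp(tmpFilePath).resize(200, 200).toFile(tmpThumbPath);
    await bucket.upload(tmpThumbPath, {
      destination: join(dirname(filePath), thumbFileName)
    });
  });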
Upvotes: 4
Reputation: 317362
I suspect you'd see that message because you tried to write to this path:
/tmp/images/1542144115815_Emperor_penguins.jpg
Without first creating the parent directory:
/tmp/images
You can't write a file to a local filesystem folder that doesn't exist, and it seems that the Cloud Storage SDK will not create it for you.
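In other words (a minimal sketch, assuming the same fs-extra and path imports as in the question), create the parent directory yourself before downloading:

// Make sure /tmp/images exists before downloading into it
const tmpFilePath = join(tmpdir(), object.name); // e.g. /tmp/images/1542144115815_Emperor_penguins.jpg
await fs.ensureDir(dirname(tmpFilePath));        // creates /tmp/images if it is missing
await bucket.file(filePath).download({ destination: tmpFilePath });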
Upvotes: 2