Reputation: 397
I am using the react-native-camera plugin in my app. I am trying to take an image and then immediately upload it to an AWS S3 bucket via AWS Amplify.
My code looks like this:
import React, { Component } from 'react';
import { View, Text, TouchableOpacity } from 'react-native';
import { RNCamera } from 'react-native-camera';
import { Storage } from 'aws-amplify';
import RNFetchBlob from 'rn-fetch-blob'; // or 'react-native-fetch-blob', depending on the installed package
import { Buffer } from 'buffer'; // Buffer polyfill for React Native

export default class Camera extends Component {
render() {
return (
<View style={styles.container}>
<RNCamera
ref={ref => {
this.camera = ref;
}}
style={styles.preview}
flashMode={RNCamera.Constants.FlashMode.auto}
permissionDialogTitle={'Permission to use camera'}
permissionDialogMessage={'We need your permission to use your phone\'s camera'}
/>
<TouchableOpacity
onPress={this.takePicture.bind(this)}
style={styles.capture}
>
<Text style={{fontSize: 15}}> SNAP </Text>
</TouchableOpacity>
</View>
);
}
takePicture = async function () {
if (this.camera) {
const options = {base64: true};
const data = await this.camera.takePictureAsync(options);
console.log(data.uri);
uploadPictureToS3(data.uri, "image.jpg");
}
};
}
function readFile(fileUri) {
return RNFetchBlob.fs.readFile(fileUri, 'base64').then(data => Buffer.from(data, 'base64'));
}
function uploadPictureToS3(uri, key) {
readFile(uri).then(buffer => {
return Storage.put(key, buffer, { // return the promise so the .then/.catch below see the upload result
contentType: "image/jpeg"
})
})
.then(r => {
console.log(r);
})
.catch(e => {
console.error(e);
});
}
When trying to access the file using the readFile method, I get the following error: Error: file not exists.
What is happening here? Why am I not able to read the image from the cache folder directly after it was taken?
Upvotes: 3
Views: 4475
Reputation: 21
I am using the RNCamera plugin in this app. This code shows the image taken by the camera in the app:
import React, { Component } from 'react';
import { StyleSheet, Text, TouchableOpacity, View, Image } from 'react-native';
import { RNCamera } from 'react-native-camera';
export default class CameraPage extends Component {
constructor(props) {
super(props);
this.state = {
img : '',
}
}
takePicture = async function(camera) {
const options = { quality: 0.5, base64: true };
const data = await camera.takePictureAsync(options);
this.setState({
img: data.uri
})
};
render() {
const{img}= this.state;
return (
<View style={styles.container}>
<RNCamera
style={styles.preview}
type={RNCamera.Constants.Type.back}
flashMode={RNCamera.Constants.FlashMode.off}
permissionDialogTitle={'Permission to use camera'}
permissionDialogMessage={'We need your permission to use your phone\'s camera'}
>
{({ camera}) => {
return (
<View style={{ flex: 0, flexDirection: 'row', justifyContent: 'center' }}>
<TouchableOpacity onPress={() => this.takePicture(camera)} style={styles.capture}>
<Text style={{ fontSize: 14 }}> SNAP </Text>
</TouchableOpacity>
</View>
);
}}
</RNCamera>
<View style={{flex:1,justifyContent: 'center', alignItems: 'center'}}>
<Image source={{uri:img}} style={{ width: 200, height: 200}}/>
</View>
</View>
);
}
}
const styles = StyleSheet.create({
container: {
flex: 1,
flexDirection: 'column',
backgroundColor: 'black',
},
preview: {
flex: 2,
justifyContent: 'flex-end',
alignItems: 'center',
},
capture: {
flex: 0,
backgroundColor: '#fff',
borderRadius: 5,
padding: 15,
paddingHorizontal: 20,
alignSelf: 'center',
margin: 20,
},
});
Upvotes: 2
Reputation: 158
Your code looks alright, although there are a few "weird" things.
You pass the base64 option to takePictureAsync. This makes the camera return the base64 of the picture automatically, but you never use it for anything. You can call uploadPictureToS3(data.base64, "image.jpg") and skip the readFile step entirely, as in the sketch below.
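For illustration, a minimal sketch of that approach (the uploadBase64ToS3 helper name is made up here; the Storage.put call and Buffer usage are taken from your question):
// Upload the base64 string returned by takePictureAsync directly, skipping the file read.
async function uploadBase64ToS3(base64, key) {
  const buffer = Buffer.from(base64, 'base64'); // assumes the Buffer polyfill you already use
  return Storage.put(key, buffer, { contentType: 'image/jpeg' });
}

// inside takePicture:
// const data = await this.camera.takePictureAsync({ base64: true });
// await uploadBase64ToS3(data.base64, 'image.jpg');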
Another thing (I don't know why it is like this): to get the image base64 from the uri, we usually strip the file:// scheme first:
const filepath = data.uri.split('//')[1];
const imageUriBase64 = await RNFS.readFile(filepath, 'base64'); // RNFS here is react-native-fs
So try the const filepath = data.uri.split('//')[1]; trick instead of passing data.uri directly.
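As a rough sketch only, here is the same trick applied to the readFile helper from your question (assuming RNFetchBlob, as you already use):
function readFile(fileUri) {
  // Drop the file:// scheme before handing the path to the fs module.
  const filepath = fileUri.split('//')[1];
  return RNFetchBlob.fs.readFile(filepath, 'base64').then(data => Buffer.from(data, 'base64'));
}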
Upvotes: 2