I am trying to get images from my test device's camera roll to render as thumbnails. I have successfully fetched the images from the camera roll and displayed them within a series of Image elements in a ListView, but they take a really long time to load. I also read in the React Native docs that the Image element will pick the correct image size for the space it will render into.
This is from the docs.
iOS saves multiple sizes for the same image in your Camera Roll, it is very important to pick the one that's as close as possible for performance reasons. You wouldn't want to use the full quality 3264x2448 image as source when displaying a 200x200 thumbnail. If there's an exact match, React Native will pick it, otherwise it's going to use the first one that's at least 50% bigger in order to avoid blur when resizing from a close size. All of this is done by default so you don't have to worry about writing the tedious (and error prone) code to do it yourself. https://facebook.github.io/react-native/docs/image.html#best-camera-roll-image
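As I understand the rule quoted above, the selection works roughly like the sketch below. (pickBestSize is just my own name for illustrating the heuristic; it is not a React Native API.)

```javascript
// Sketch of the size-selection rule described in the docs quote.
// pickBestSize is a hypothetical helper, not part of React Native.
function pickBestSize(availableWidths, targetWidth) {
  // An exact match wins outright.
  var exact = availableWidths.find(function (w) { return w === targetWidth; });
  if (exact !== undefined) {
    return exact;
  }
  // Otherwise, the first size at least 50% bigger than the target,
  // to avoid blur when downscaling from a close size.
  return availableWidths.find(function (w) { return w >= targetWidth * 1.5; });
}

console.log(pickBestSize([100, 200, 3264], 200)); // exact match -> 200
console.log(pickBestSize([250, 320, 3264], 200)); // first >= 300 -> 320
```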
The code I'm using to read the images is super simple.
CameraRoll.getPhotos({
  first: 21,
  assetType: 'Photos'
}, (data) => {
  console.log(data);
  var images = data.edges.map((asset) => {
    return {
      uri: asset.node.image.uri
    };
  });
  this.setState({
    images: this.state.images.cloneWithRows(images)
  });
}, () => {
  this.setState({
    retrievePhotoError: messages.errors.retrievePhotos
  });
});
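To rule out the mapping step itself, I exercised the edges-to-rows transform in isolation with a mock payload shaped like what the success callback receives (the URI values below are made up, not real asset identifiers):

```javascript
// Mock of the data object passed to the getPhotos success callback.
// The URIs are placeholders, not real camera roll asset identifiers.
var data = {
  edges: [
    { node: { image: { uri: 'assets-library://asset-1' } } },
    { node: { image: { uri: 'assets-library://asset-2' } } }
  ]
};

// Same transform as in the callback above.
var images = data.edges.map(function (asset) {
  return { uri: asset.node.image.uri };
});

console.log(images.length); // 2
console.log(images[0].uri); // 'assets-library://asset-1'
```

The transform produces the expected `{uri: ...}` rows, so the slowness seems to come from the rendering side rather than the fetch.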
And then to render it I have these functions.
renderImage(image) {
  return (
    <Image
      resizeMode="cover"
      source={{uri: image.uri}}
      style={[{
        height: imageDimensions, // imageDimensions == 93.5
        width: imageDimensions
      }, componentStyles.thumbnails]}
    />
  );
},
render() {
  return (
    <ListView
      automaticallyAdjustContentInsets={false}
      contentContainerStyle={componentStyles.row}
      dataSource={this.state.images}
      renderRow={this.renderImage}
    />
  );
}
What am I missing here? I'm going crazy!!!