Reputation: 20182
I am using the following test to read a file, Base64 encode it, and then decode the Base64 back to a new image. I noticed that the new image's file size (after the conversion) is significantly smaller than the original image's, leading me to think that part of the image data is somehow being lost in the conversion. I can see the image, but I am worried about image quality. Any insight into what I might be doing wrong would be greatly appreciated.
Test class:
package test;
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;
public class ImageTest {
    public static void main(String[] args) {
        try {
            BufferedImage img = null;
            BufferedImage finalImg = null;
            try {
                // Read the source image from disk
                // img = ImageIO.read(new File("/home/user/Desktop/test1.jpg"));
                img = ImageIO.read(Files.newInputStream(Paths.get("/home/user/Desktop/test1.jpg")));
                // Encode to Base64 and print
                final String base64encoded = ImageConverter.encodeToString(img, "jpeg");
                System.out.println("read file " + base64encoded);
                // Convert the Base64 string back to an image and write it out
                finalImg = ImageConverter.decodeToImage(base64encoded);
                ImageIO.write(finalImg, "jpeg", new File("/home/user/Desktop/test2.jpg"));
            } catch (IOException e) {
                System.out.println("exception " + e);
            }
        } catch (Exception e) {
            System.out.println("exception " + e);
        }
    }
}
ImageConverter class:
package test;
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.awt.image.RenderedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import sun.misc.BASE64Encoder;
import sun.misc.BASE64Decoder;
public class ImageConverter {

    // Encodes an image to a Base64 string using java.util.Base64 (JDK 8)
    public static String imgToBase64String(final RenderedImage img, final String formatName) {
        final ByteArrayOutputStream os = new ByteArrayOutputStream();
        try {
            ImageIO.write(img, formatName, Base64.getEncoder().wrap(os));
            return os.toString(StandardCharsets.ISO_8859_1.name());
        } catch (final IOException ioe) {
            throw new UncheckedIOException(ioe);
        }
    }

    // Decodes a Base64 string back to an image using java.util.Base64 (JDK 8)
    public static BufferedImage base64StringToImg(final String base64String) {
        try {
            return ImageIO.read(new ByteArrayInputStream(Base64.getDecoder().decode(base64String)));
        } catch (final IOException ioe) {
            throw new UncheckedIOException(ioe);
        }
    }

    // Encodes an image to a Base64 string using sun.misc.BASE64Encoder
    public static String encodeToString(BufferedImage image, String type) {
        String imageString = null;
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try {
            ImageIO.write(image, type, bos);
            byte[] imageBytes = bos.toByteArray();
            BASE64Encoder encoder = new BASE64Encoder();
            imageString = encoder.encode(imageBytes);
            bos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return imageString;
    }

    // Decodes a Base64 string back to an image using sun.misc.BASE64Decoder
    public static BufferedImage decodeToImage(String imageString) {
        BufferedImage image = null;
        byte[] imageByte;
        try {
            BASE64Decoder decoder = new BASE64Decoder();
            imageByte = decoder.decodeBuffer(imageString);
            ByteArrayInputStream bis = new ByteArrayInputStream(imageByte);
            image = ImageIO.read(bis);
            bis.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return image;
    }
}
I can test both the Base64 encoder/decoder available in JDK 8 and the one in sun.misc (which I realize I do not need to use). Any thoughts on what might be causing the image size to shrink? (If the image needs to be compressed, I would prefer to do that myself using ImageMagick or GraphicsMagick, etc.)
The original image was 1.2 MB (1,249,934 bytes), but the new image is 354.5 kB (354,541 bytes); the width and height are the same for both images.
Upvotes: 0
Views: 3533
Reputation: 27054
As @JBNizet points out in his comment, the reason for the change in size (the size may grow as well, depending on the input image and compression settings) is that you are not just encoding/decoding binary data to/from Base64; you are also re-encoding the image data (twice) using JPEG encoding with default encoding settings. Unless the original image was encoded with the exact same settings, you will lose some precision, and the file size will change.
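If re-encoding cannot be avoided, the JPEG compression quality can at least be set explicitly instead of relying on the writer's defaults. Here is a minimal sketch using the standard javax.imageio writer API; the method name writeJpeg and the quality value are illustrative, not part of the original code:

import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

public class JpegQualityExample {
    // Writes the image as JPEG with an explicit compression quality (0.0f - 1.0f)
    // instead of the writer's default; higher quality means a larger file and less loss.
    static void writeJpeg(BufferedImage img, File out, float quality) throws IOException {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);
        try (ImageOutputStream ios = ImageIO.createImageOutputStream(out)) {
            writer.setOutput(ios);
            writer.write(null, new IIOImage(img, null, null), param);
        } finally {
            writer.dispose();
        }
    }
}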
Another likely reason for the decrease in file size is that a BufferedImage does not carry any of the metadata contained in the original JPEG file. So your process of re-encoding the JPEG will also lose any Exif or XMP metadata, thumbnails, the color profile, etc. Depending on the source of the image, this may account for a significant part of the file size.
Again, as @JBNizet says, the best thing is to not involve ImageIO at all in this case: just use normal file I/O, encode the original bytes using Base64, and decode again to recover the original file contents exactly.
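A minimal sketch of that byte-level round trip, reusing the file paths from the question purely for illustration:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Base64;

public class Base64FileRoundTrip {
    public static void main(String[] args) throws IOException {
        Path in = Paths.get("/home/user/Desktop/test1.jpg");
        Path out = Paths.get("/home/user/Desktop/test2.jpg");

        // Base64-encode the raw file bytes; no image decoding or re-encoding is involved
        byte[] original = Files.readAllBytes(in);
        String base64 = Base64.getEncoder().encodeToString(original);

        // Decode back to exactly the same bytes and write them out;
        // the output file is byte-for-byte identical to the input
        byte[] restored = Base64.getDecoder().decode(base64);
        Files.write(out, restored);
    }
}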
PS: If you intend to do image processing on the image in between the Base64 encoding/decoding, you will of course need to decode the image data (using ImageIO or similar), but you should try to do it only once (for better performance), and perhaps look into preserving the metadata. Also, I think image encoding/decoding and Base64 encoding/decoding are separate issues and should not be interleaved as they are now. Split them up for better separation of concerns.
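For example, one possible split (the class and method names here are hypothetical) keeps the Base64 helpers free of ImageIO and the image helpers free of Base64, so each conversion happens exactly once and in one place:

import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Base64;

public final class Codecs {
    // Base64 concerns only: bytes <-> String
    static String toBase64(byte[] data) {
        return Base64.getEncoder().encodeToString(data);
    }

    static byte[] fromBase64(String base64) {
        return Base64.getDecoder().decode(base64);
    }

    // Image concerns only: BufferedImage <-> encoded JPEG bytes
    static byte[] toJpegBytes(BufferedImage img) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ImageIO.write(img, "jpeg", bos);
        return bos.toByteArray();
    }

    static BufferedImage fromImageBytes(byte[] data) throws IOException {
        return ImageIO.read(new ByteArrayInputStream(data));
    }
}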
Upvotes: 2