A Trip Down Memory Lane
You wait all afternoon for that clock on the wall to strike 19:00. Einstein was right: time is relative to the observer. It seems like with each second that passes, a little extra time is added to the next one. The anticipation is excruciating… palms sweating… your right leg jumping… you close your eyes, hoping that by the time you open them again, time will have magically travelled forward, only to discover that… [spoiler alert]… it doesn’t. “What’s so special about seven o’clock?”, I hear you ask. Well, that’s when the Internet on your 28k dial-up modem gets cheaper, of course! They say that time is money, so when you’re on the brink of less expensive post-seven-o’clock pure Internet bliss, where every second is precious, you should be fully prepared and ready to open only the websites you absolutely must. So you’ve got your list of websites that you’re about to open as soon as the clock…
SORRY, CAN’T TALK NOW! IT’S JUST GONE SEVEN O’CLOCK! YOU KNOW WHAT THAT MEANS!
Alright first website… enter the website address… Hit ENTER… Here we go!
OK… Here we go NOW!
MAN this website has a lot of images. It’s taking forever to load! Might as well go and make myself a sandwich. Not quite the pure Internet bliss I was waiting in anticipation for. If only there was a way to make images load quicker. Sure, I know you can compress them and that would make them load quicker, but then they will look UGLY. Again: not quite the pure Internet bliss I was hoping for. Maybe someday… some far away day in the future, where people live on the moon with their robot butlers and everyone is driving around in flying cars or jetpacks… maybe then they would have figured out how to get these website images to load quicker while still having pretty darn good quality.
Time Travel To The Not-So-Far-Away Future, i.e. NOW!
Luckily the Internet has become significantly more accessible since those early days. You can easily access high-speed Internet on your fancy Über-portable laptop, your gaming console, your smart TV, your tablet, your smartphone and even smart watches. The list goes on and on. At home, you have high-speed ADSL Internet for streaming HD videos and downloading a bunch of stuff you probably don’t need or will end up never looking at. You have instant access to your friends’ Facebook profiles and can post Twitter updates in the blink of an eye. It’s great. It’s only once you leave the comfort of your high-speed ADSL Internet connection that you need to start relying on mobile networks to access the Internet on your tablet or smartphone. Mobile data can sometimes be quite costly, especially when your best friend has just posted a brand new photo album packed full of the holiday photos of their adventures travelling across Europe.
Even though the above image seems rather pointless, I’m actually using it as an example. With images on websites, there has always been a trade-off between file size, image dimensions, transfer times, and image quality. For the above image, I had to resize it from its original dimensions of 3264 x 2448 pixels down to 900 x 618 pixels. Then I had to export it as a progressive JPEG so that it starts showing immediately in low quality and systematically looks better as more of the image downloads. Finally, I had to reduce its quality to 60% just so that its file size could be brought down from 3.7 MB (ouch for mobile data) to 168 KB. Of course, the original image at full size and quality would look much better, but it would take about 22.6x longer to load and also take a big bite out of your data plan. Not ideal.
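The “22.6x” figure is just the ratio of the two file sizes; a quick back-of-the-envelope check (assuming 1 MB = 1024 KB):

```shell
# Ratio of the original file size (3.7 MB) to the compressed one (168 KB),
# assuming 1 MB = 1024 KB.
awk 'BEGIN { printf "%.1f\n", (3.7 * 1024) / 168 }'
# prints 22.6
```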
Moz-what? Also known as the Mozilla JPEG Encoder Project, it’s a project started near the beginning of 2014 by Mozilla (who also develop the Firefox web browser). The goal of the project is to create a library that can reduce the file size of JPEG images without reducing their quality or breaking compatibility with the software that decodes and displays them. A JPEG with a smaller file size loads and displays faster than one with a larger file size. Two images with the same pixel dimensions but different image quality will have different file sizes, and will therefore take different amounts of time to download and display.
By reducing a JPEG’s file size without reducing the quality, the image would load and display faster and still look the same. That is where mozjpeg comes in.
First you will need to download the source code. You can get the latest version from the official releases page on github. After downloading the zip archive to a folder on your computer, unzip it. This post deals with installing mozjpeg on a Mac OS X computer. For instructions on how to install it on other platforms, follow the official building instructions on github. In order to install mozjpeg, you will need to install the following libraries:
- autoconf 2.56 or later
- automake 1.7 or later
- libtool 1.4 or later
- NASM 0.98 or later
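Before building, it can help to check which of these tools are already on your PATH; a minimal sketch (install any missing ones with MacPorts, e.g. sudo port install nasm):

```shell
# Report which of the required build tools are missing from PATH.
# Version checks are omitted; see the minimum versions listed above.
missing=""
for tool in autoconf automake libtool nasm; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
echo "missing tools:${missing:- none}"
```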
You can make use of MacPorts to install any libraries you need. You will likely receive an error message during installation if any library is missing or if there is a problem with any of the installed libraries. Open the Terminal app and navigate to the folder you just unzipped and run the following commands:
autoreconf -fiv
mkdir build && cd build
sh ../configure
sudo make install
Once the installation has completed without any errors, you can test whether it was successful by running jpegtran with an unrecognised switch:
jpegtran -h
If the installation was successful, the above command will display the various options that you can use with mozjpeg. If you get a message saying the command cannot be found, try closing the Terminal app, reopening it, and running the command again.
Using mozjpeg To Reduce JPEG File Size
Using the Terminal app, navigate to a folder where you have a JPEG image that you would like to reduce in file size. The shorthand command to compress a JPEG image using mozjpeg is as follows:
jpegtran original.jpg > compressed.jpg
The input image is “original.jpg” and the output filename of the compressed JPEG image is “compressed.jpg”. There are a number of switches that can be added to the command to change how the output image is compressed or manipulated. These switches are as follows:
Switches:
  -copy none            Copy no extra markers from source file
  -copy comments        Copy only comment markers (default)
  -copy all             Copy all extra markers
  -optimize             Optimise Huffman table (smaller file, but slow compression)
  -progressive          Create progressive JPEG file
Switches for modification:
  -crop WxH+X+Y         Crop to a rectangular subarea
  -flip [horizontal|vertical]  Mirror image (left-right or top-bottom)
  -grayscale            Reduce to grayscale (omit color data)
  -perfect              Fail if there is non-transformable edge blocks
  -rotate [90|180|270]  Rotate image (degrees clockwise)
  -scale M/N            Scale output image by fraction M/N, eg, 1/8
  -transpose            Transpose image
  -transverse           Transverse transpose image
  -trim                 Drop non-transformable edge blocks
  -wipe WxH+X+Y         Wipe (gray out) a rectangular subarea
Switches for advanced users:
  -arithmetic           Use arithmetic coding
  -restart N            Set restart interval in rows, or in blocks with B
  -maxmemory N          Maximum memory to use (in kbytes)
  -outfile name         Specify name for output file
  -verbose or -debug    Emit debug output
Switches for wizards:
  -scans file           Create multi-scan JPEG per script file
After trying various combinations of switches, the command that seems to reduce the image file size optimally is the following:
jpegtran -copy none -optimize -progressive original.jpg > compressed.jpg
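If you have a whole folder of JPEGs to process, the same command can be wrapped in a small loop. A hypothetical sketch, assuming mozjpeg’s jpegtran is on your PATH and writing the results to a “compressed” subfolder:

```shell
# Compress every .jpg in the current folder into ./compressed,
# using the switches recommended above.
mkdir -p compressed
for f in *.jpg; do
  [ -e "$f" ] || continue          # no .jpg files: the glob didn't match
  jpegtran -copy none -optimize -progressive "$f" > "compressed/$f"
done
```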
Putting mozjpeg To The Test
We compressed various versions of the above city scene image with the following results:
Original Dimensions of 1680 x 1050
- 100% quality reduced from 932 KB to 860 KB (7.7% less)
- 80% quality reduced from 801 KB to 789 KB (1.5% less)
- 60% quality reduced from 431 KB to 425 KB (1.4% less)
- 30% quality reduced from 212 KB to 207 KB (2.4% less)
- 0% quality increased from 87 KB to 89 KB (2.2% more)
Resized Dimensions of 900 x 563
- 100% quality reduced from 447 KB to 424 KB (5.1% less)
- 80% quality reduced from 220 KB to 216 KB (1.8% less)
- 60% quality reduced from 132 KB to 130 KB (1.5% less)
- 30% quality reduced from 70 KB to 69 KB (1.4% less)
- 0% quality increased from 30 KB to 31 KB (3.2% more)
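The percentage savings above are simple before/after ratios; for example, the 7.7% figure for the 100% quality original can be reproduced as:

```shell
# Saving when 932 KB is reduced to 860 KB, as a percentage.
awk 'BEGIN { printf "%.1f\n", (932 - 860) / 932 * 100 }'
# prints 7.7
```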
It should be noted that these results are in no way indicative of the results you could expect to get with every image you compress. Different images have varying amounts of detail, which would affect the original image file sizes and resulting compressed image file sizes.
From our non-definitive test results, it seems that the best results are obtained by compressing the original image at 100% quality. The effectiveness decreases as the image’s initial quality is reduced before compression. Interestingly, setting the image quality to 0% actually increases the file size after compression. As you can see, the compressed file size of the 100% quality image is still larger than the uncompressed file size of the 80% quality image, and the visible difference between 100% and 80% quality is negligible. It might seem as though the reduction mozjpeg achieves doesn’t warrant using it, but consider the scenario where all of these images appear on one page of a website: the total image file size would be reduced from 3.28 MB to 3.16 MB (3.7%, or 123 KB less). That is almost enough to add the above lake image to the page as well, which is 173 KB with dimensions of 1920 x 1080. This test applies only to the 10 images we used; with an image gallery of many more images, the effect of compression on the overall file size would be more noticeable.
As mentioned earlier in the post, there has always been a trade-off between file size, image dimensions, transfer times, and image quality. The best way to reduce file size still seems to be by reducing image dimensions and quality. You can reduce the quality of JPEG images to a certain extent without it having much of a visible impact on the original. Using mozjpeg can reduce the image file size even further. The difference in image size might seem negligible, but when you take into consideration that the image quality remains the same, you might as well use it. The Mozilla JPEG Encoder Project is still ongoing and improvements are constantly being made. Maybe one day in the not-so-far-away future, everyone will be using mozjpeg and websites will load Über-fast and use much less mobile data. Now THAT’s the pure Internet bliss I was talking about!