Google Guetzli Case Study: New Encoder Tested on 1000 JPEGs

Last week, Google announced Guetzli: A New Open Source JPEG Encoder.

In the days since, there has been a lot of buzz around Guetzli's reported ability to create “high quality JPEG images with file sizes 35% smaller than currently available methods.”

That is indeed buzz-worthy, and hopefully the release of the source code will result in even better, faster versions of this encoder.

For now, since Guetzli is available to download on GitHub, I decided to try it out.

Poor optimization

The Problem

I recently published results from speed and performance testing over 1000 WordPress themes.

While running speed tests, I also took screenshots of each theme.

The screenshots were all output as high quality PNG images, many of which contained combinations of simple typography (2 color) with complex photographs.

This kind of image has always been a challenge to optimize. Formats like GIF and PNG end up too big, and JPEG compression levels that work fine for complex images can produce unsightly artifacts around areas of contrast.

If there weren't so many images to process, I would have just picked the best format based on the image. But since I didn't want to spend hours optimizing images, I just automated the jobs and made them all small low quality JPEGs. The results weren't pretty.

Since I still had all the original files handy, this big batch of 1000 low quality JPEG images seemed like a decent set of candidates for testing Guetzli.

The Setup: Installing Guetzli

Currently, Guetzli is only available as a command line utility. Some people might be intimidated by this and wait for it to be implemented in a GUI app, but it's actually really easy to install and use.

I'm on a Mac, so I used Homebrew.

As I said, the installation was really easy.

Just open the Terminal (Command + Space, then type “terminal”) and at the prompt, type:
brew install guetzli

And the usage is also straightforward.
guetzli [--quality Q] [--verbose] original.png output.jpg
guetzli [--quality Q] [--verbose] original.jpg output.jpg

Here's a modified example:
guetzli /path/to/original.jpg /path/to/optimized.jpg

I didn't specify a quality for the above example, but by default, the quality is set to 95.

The official announcement from Google states that Guetzli takes “significantly longer to create compressed images than currently available methods,” and this was definitely true.

The first image I tried the encoder on was a 24-bit PNG that was about 500KB. I started the job and waited. And waited. And waited some more.

At first, I thought something was wrong. But after canceling a few times, I eventually realized that it was just slow (I should have read the announcement more carefully).

This wasn't going to be as straightforward as I thought.

I found that using the encoder on smaller files wasn't nearly as slow, so I decided to resize all of the images first and then try again. For this, I used ImageMagick's convert to resize all 1000 of the images (from 2800×1606 to 500×287).

After resizing, I tested Guetzli again. On the smaller image, it only took about 7 seconds. The results seemed really impressive (from 36KB to 20KB) until I actually looked at the output. It was pretty terrible (low quality with a lot of noise).

It wasn't until I compared it to the “original” that I realized the program I used to resize the images (ImageMagick's convert) applies JPEG compression by default, so it compressed as it resized. I was starting from low quality images! Not good!
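For reference, here's the kind of resize command that would have sidestepped the surprise. This is just a sketch: it assumes the originals live in a folder called originals/ (a name I made up) and that ImageMagick is installed; the explicit -quality 100 overrides convert's default JPEG compression.

```shell
mkdir -p resized

# Resize each screenshot and write a near-lossless JPEG.
# -quality 100 overrides ImageMagick's default JPEG compression,
# so the resized files stay a fair starting point for re-encoding.
for f in originals/*.png; do
  convert "$f" -resize 500x287 -quality 100 "resized/$(basename "${f%.png}").jpg"
done
```

The `${f%.png}` expansion strips the PNG extension so the output gets a matching .jpg name.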

After some back and forth, I decided on using Photoshop to resize all the images. Even as a batch automation, it took a lot longer, but it was the closest I was going to get to a good starting point for this test.

This was just one of several problems (on my end) I ran into trying to come up with the best way to test Guetzli.

Here is one of the early tests where I tried to compare the original (full quality – resized to 500×287) images processed with Guetzli to the same images processed with Photoshop.

Guetzli was at its lowest possible setting of 84, and Photoshop was set to 80.

Bad Guetzli test

As you can see in the result, even the file from Photoshop's “high” setting in “Save for Web” was smaller than the Guetzli result.

Much, Much, Later (joke)

After running the encoder a third time on the 1000+ images (at up to 10 seconds each) and realizing the comparison still wasn't right, I decided to test only the top 250 images to speed things up.

(insert 2nd “much, much, later” joke. But seriously, I love this stuff!)

After a few more runs of 250 and several one-off runs, I finally decided on a fair enough way to do this. At least as fair as I could think of.

A Fair Test

1. For the first step, I used Photoshop again (because it is the main app I use when optimizing images). I opened an image and used the “Save for Web” feature to reduce the quality as much as possible without noticeable loss. After doing this with about 6 sample images, I found that a quality of 80 worked pretty well for all of them. So I ran a batch job on all 250 images, saving each 500×287 uncompressed image at a quality of 80.

2. Next, I processed the same sample images (the quality-80 output of Photoshop's “Save for Web”) with Guetzli. I found that Guetzli's default --quality 95 worked best (no noticeable artifacts, minimal loss of quality). So I processed all 250 of the Photoshop-output images with Guetzli.
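The Guetzli half of that pipeline can be batched with a simple loop. A sketch, assuming the Photoshop exports sit in a folder called photoshop-q80/ (a name I made up for illustration) and guetzli is on your PATH:

```shell
mkdir -p guetzli-out

# Re-encode each Photoshop-exported JPEG with Guetzli
# at its default quality of 95.
for f in photoshop-q80/*.jpg; do
  guetzli --quality 95 "$f" "guetzli-out/$(basename "$f")"
done
```

At up to 10 seconds per image, expect a run over 250 files to take a while; this is exactly the slowness Google's announcement warns about.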

The Results

Guetzli Case Study Results – takes an additional 22% off of the optimized Photoshop JPEG

Guetzli Case Study example 1

Photoshop optimized – 24KB

Guetzli optimized – 20KB – saved 17%

Guetzli Case Study example 2

Photoshop optimized – 32KB

Guetzli optimized – 24KB – saved 25%

Guetzli Case Study example 3

Photoshop optimized – 40KB

Guetzli optimized – 36KB – saved 10%

Guetzli Case Study example 4

Photoshop optimized – 44KB

Guetzli optimized – 32KB – saved 27%

Guetzli Case Study example 5

Photoshop optimized – 44KB

Guetzli optimized – 36KB – saved 18%
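The “saved” percentages above are just the relative difference in file size: (before − after) / before × 100. For instance, example 1's 24KB → 20KB works out like this:

```shell
# Percent saved = (before - after) / before * 100,
# using example 1's file sizes (24KB -> 20KB).
awk 'BEGIN { printf "saved %.0f%%\n", (24 - 20) / 24 * 100 }'
# prints: saved 17%
```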

Here's a Video Version of the Examples

The Verdict

I realize this test doesn't do a great job of showing whether Guetzli is better or worse than any other JPEG encoder, and I don't doubt that this is because of my limited knowledge of image compression.

However, I feel that this study does show real strengths and gives good reason to use Guetzli if you are able to.

With every image I encoded, Guetzli reduced the file size further than I would have been able to otherwise. And with as much importance as is put on site speed these days, every little bit counts.

As a matter of fact, I found myself re-optimizing all of the JPEG images in this post as I created them. So I'm sure I'll continue to use it. After all, how can I leave those extra bytes on the table?

Thanks for reading, and please share this if you found it useful. Have you tried Guetzli? What did you think of it? Comment at @badijones on Twitter or at https://facebook.com/allthingsblogging. I'd love to hear any feedback or experiences you've had with Guetzli.
