I ran a few (incredibly unscientific) tests on Google’s new Guetzli JPEG encoder last night at quality settings of 100, 90, and 84. Why 84? Well, that’s the lowest the Guetzli binary will let you go without editing the source and recompiling.
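If you want to try this yourself, a run at each of those settings looks something like the sketch below. It assumes the guetzli binary is on your PATH, and the filenames are placeholders:

```python
import subprocess

# Run Guetzli once per quality setting. Input/output filenames are
# placeholders; guetzli is assumed to be on PATH.
for quality in (100, 90, 84):
    subprocess.run(
        ["guetzli", "--quality", str(quality), "input.jpg", f"output-q{quality}.jpg"],
        check=True,  # raise if guetzli exits non-zero (e.g. quality below 84)
    )
```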
Each run (compressing a single image) took about 20 minutes on a medium-sized cloud instance with 8GB of RAM. During those runs, the server routinely went into swap. If you’re interested in seeing how things panned out, here’s the output: https://img.boogah.org/g/
Included in the link above are 2 versions (lossless and lossy) of the same image run through ImageOptim on macOS. Doing both of those took me less than 2 minutes, combined. And while the output of ImageOptim’s lossy compression isn’t nearly as sharp, it’ll still be “good enough” for most folks.
At the end of the day, Guetzli’s output is really nice, and it does a great job compressing things: I saw anywhere from a 74.63% to an 89.28% decrease in size from my original image, with very few visual artifacts. In its current form, however, it’s simply too slow to work as a batch processor for small, independent publishers.
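For what it’s worth, a naive batch run is just a loop like the one below. The originals/ and compressed/ directories are hypothetical, but at roughly 20 minutes per image, even a modest gallery turns into an overnight job:

```python
import subprocess
import time
from pathlib import Path

# Hypothetical layout: originals/ holds the source JPEGs and
# compressed/ receives Guetzli's output. The ~20-minutes-per-image
# cost is what makes this loop impractical for galleries today.
src = Path("originals")
dst = Path("compressed")
dst.mkdir(exist_ok=True)

start = time.monotonic()
for image in sorted(src.glob("*.jpg")):
    subprocess.run(
        ["guetzli", "--quality", "84", str(image), str(dst / image.name)],
        check=True,
    )
print(f"Finished in {time.monotonic() - start:.0f}s")
```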
So don’t go throwing away Kraken, Imagify, or Smush just yet… especially if you post a lot of galleries. 😀