How much image-processing magic does the iPhone's Camera app do by default (left)? I tested Halide's new Process Zero mode (right) to find out. I took these photos from my SUP out on the lake early yesterday morning. The newly risen sun was behind me, illuminating the hillside in front of me. The unprocessed Process Zero image looks dull and drab by comparison; the default Apple-processed image better matches what I saw with my naked eye.
(This was also a great opportunity for me to learn how to create an Image Comparison Slider using only HTML and CSS. Most implementations involve JavaScript. It doesn't work on Safari for iOS because Mobile Safari does not support the resize CSS property, but at least I can set it to 50/50.)
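For anyone curious, here's a minimal sketch of that CSS-only approach: the "after" image sits inside a box with overflow: hidden and resize: horizontal, so dragging the box's resize handle reveals more or less of it over the "before" image underneath. The class names, file names, and 720px width are placeholders, not the exact code used on this page.

```html
<!-- Two images stacked; the top one is clipped by a resizable box. -->
<div class="compare">
  <img class="compare-bottom" src="halide-process-zero.webp" alt="Halide Process Zero">
  <div class="compare-top">
    <img src="apple-default.webp" alt="Apple default processing">
  </div>
</div>

<style>
  .compare {
    position: relative;
    width: 720px;          /* both images render at this width */
  }
  .compare-bottom {
    display: block;
    width: 100%;
  }
  .compare-top {
    position: absolute;
    top: 0;
    left: 0;
    height: 100%;
    width: 50%;            /* 50/50 starting point; also the fallback where resize is unsupported */
    max-width: 100%;
    overflow: hidden;      /* clip the top image to the box */
    resize: horizontal;    /* dragging the handle does the sliding; no JavaScript */
  }
  .compare-top img {
    display: block;
    width: 720px;          /* keep the top image full-width so the two photos stay aligned */
    max-width: none;
  }
</style>
```

The 50% starting width is what gives the 50/50 split on Mobile Safari, where the resize handle never appears.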
Update: Both photos were shot with an iPhone 14 Pro Max and reduced to 2880 x 2160 pixels, stripped of metadata, and rewritten as WebP at 75% quality by a Retrobatch script.