Fixing iPhone Video Colour
Published at 09:15 on 31 March 2025
Executive Summary
Export your video from the Photos app in original, unmodified format. Then, in the Finder (yes, the Finder), right-click the exported video file, choose Services → Encode Selected Video Files, and pick your encoding (1080p in my case). The result will be an HD video that can be shared on YouTube and will not come out all desaturated and overexposed.
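For the command-line inclined, the same conversion can be scripted. Here's a minimal Python sketch, assuming the Finder service is backed by macOS's avconvert tool (that's my understanding of how the service works, not something Apple documents in the menu itself); the file names are placeholders:

```python
# Run the same AVFoundation export that the Finder service appears to use.
# File names are placeholders; see `man avconvert` for the full preset list.
import subprocess

def encode_1080p(src: str, dst: str) -> None:
    subprocess.run(
        ["avconvert", "--preset", "Preset1920x1080",
         "--source", src, "--output", dst],
        check=True,
    )

encode_1080p("IMG_1234.MOV", "IMG_1234-1080p.mov")
```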
The Details
The first time I tried importing a video shot on my iPhone into DaVinci Resolve it happened: the video was all washed-out and overexposed. It brought back bad memories of uploading still photos to the Web and viewing them on my Mac in the late aughts, as the overall effect was quite similar.
Back then, the fault was Apple software botching colour management. Specifically, Safari assumed that any image file without embedded colour management data should be displayed using the native Apple colour space. The latter has a wider gamut than the de-facto standard sRGB colour space, and using it to view unconverted sRGB data makes photos look overexposed and desaturated, i.e. “washed out.” The same web page’s photos would look just fine on Linux and Windows systems.
Mac fanboys at this stage would get all pompous about how “Apple does colour management right” when in fact Apple was getting it massively wrong. Yes, Apple did use a colour space that provided a wider gamut than Windows or Linux. Yes, Apple system tools and libraries had support for reading colour space data before Linux and Windows did, but their handling of data with no colour space information was flawed; what should have been interpreted as sRGB was instead being interpreted as being in the native system colour space.
So it was Apple’s fault. The workaround was to always embed colour space data in every image saved for Web use, and to always save that data in sRGB form. Windows and Linux would ignore the colour space information but the image data would be sRGB so it would display correctly there. Apple software would see the sRGB colour space metadata and do the necessary conversion before passing it on for display.
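That workaround is easy to automate for still images. Here's a minimal Python sketch using Pillow (my tool choice for illustration; the workaround itself doesn't depend on any particular tool) that converts an image to sRGB and embeds the profile:

```python
# Convert an image to sRGB and embed the sRGB profile, so colour-managed
# viewers convert correctly and naive viewers still receive sRGB data.
# Requires Pillow; file names are placeholders.
import io
from PIL import Image, ImageCms

srgb = ImageCms.createProfile("sRGB")

img = Image.open("photo.jpg")
embedded = img.info.get("icc_profile")
if embedded:
    # Convert from whatever profile is embedded into sRGB.
    src_profile = ImageCms.ImageCmsProfile(io.BytesIO(embedded))
    img = ImageCms.profileToProfile(img, src_profile, srgb)

# Save with the sRGB profile attached so no viewer has to guess.
img.save("photo-web.jpg",
         icc_profile=ImageCms.ImageCmsProfile(srgb).tobytes())
```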
Eventually, Apple fixed their broken colour management, but old habits die hard and I still save still images in the above way for Web use.
Back to the video problem: I don’t know exactly who is at fault here, but:
- The iPhone Camera app is being weird. The Rec.709 gamma and colour space are the industry defaults for 1080p (i.e. “HD”) video (in fact, they were developed for use in HD video), yet if you tell your iPhone to shoot video in “HD” mode, it uses the Rec.2020 colour space with the Rec.2100 HLG gamma. You get an oddball video file instead of a standard HD one (you can verify the tags yourself; see the sketch after this list).
- The Photos app on both the iPhone and the desktop Macs will display the resulting video just fine, as will QuickTime Player and iMovie.
- When you import the video into DaVinci Resolve, the result looks all washed-out.
- When you export the video from the Photos app and tell it to use 1080p format, it does convert the colour space, but it does a poor job of it. The result looks somewhat washed-out and it has weird colour shifts.
- When you add a colour space transform to DaVinci Resolve’s colour processing, it also does a poor job of conversion.
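To check the oddball tags for yourself, here's a quick Python sketch around ffprobe (part of FFmpeg, which you'd need installed; the file name and sample output are illustrative):

```python
# Inspect a clip's colour metadata with ffprobe. An iPhone "HD" HLG clip
# typically reports bt2020nc / arib-std-b67 / bt2020, where a standard HD
# clip reports bt709 across the board.
import json
import subprocess

def colour_tags(path: str) -> dict:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries",
         "stream=color_space,color_transfer,color_primaries",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

print(colour_tags("IMG_1234.MOV"))
# e.g. {'color_space': 'bt2020nc', 'color_transfer': 'arib-std-b67',
#       'color_primaries': 'bt2020'}
```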
So at this stage my money is on it mostly being DaVinci Resolve’s fault. It seems to be ignoring colour space information and assuming everything is Rec.709. It also seems deficient in reasonable defaults for colour space conversion (if the Finder can do it and get acceptable results without a lot of tedious tweaking by hand, DaVinci Resolve should offer a way to do this as well).
But Apple doesn’t completely escape blame here. If video colour space conversion is so tricky to get right (and I think this is part of the problem), then why use the troublesome Rec.2020 colour space when the user is telling the Camera app to shoot HD videos?
Apple fanboys should at this stage have a nice hot steaming cup of STFU. Yes, I know that Rec.2020 is “better” in the sense that it has a wider gamut and finer resolution than the industry standard, and thus preserves the ability to do more recovery of correct information in postprocessing. But the user has told the Camera app to shoot an HD video. That is critical. When the rest of the world talks about an “HD” video, they are talking about a video in the Rec.709 colour space, not some oddball Franken-video with the HD resolution but a non-HD colour space that will massively fail when shown on most video players on most platforms. Preserve the ability to shoot and save with greater colour resolution, yes, but don’t call it “HD” video if it’s not recording standard HD video.
There is, thankfully, a way to do a colour space conversion that produces acceptable results. It is hidden in, of all places, the Finder. See the executive summary above.
This, too, is Apple’s fault. The conversion should not be hidden in the Finder. It should not be in the Finder at all. It should be an option in the Photos app. (Well, it is, but that option does a poor job. Apple needs to fix the colour space conversion in Photos and rid the Finder of this feature creep.)
To reiterate, it all brings back bad memories of what life was like fifteen or so years ago with still images. Implementing colour management in ways that could be theoretically superior to industry standards, but botching the implementation and making life needlessly difficult for your users, just seems to be in Apple’s genes. And mostly ignoring the desirability of embedding colour space info in media files seems to be in everyone else’s genes.