From zero to three HDR displays without buying a new one, and Apple hopefully pushing the topic at this year's WWDC event.
Sometimes the head gets full, and a good way to make space again is to write everything down. That is what has been happening over the past weeks, and here is my personal experience. Let's dive into the topic of HDR.
getting started without an HDR screen
The unwanted C… break gave me the time to turn to a subject I was always keen on getting to know: seeing and creating HDR content.
At my job we have already been working with ACES for years, so an HDR output is "basically" just a switch to a different outlet (ODT) for your film/media instead of the regular HD Rec.709 destination. But up to now no project has requested an HDR output.
The best tool I have available for dealing with HDR content, and by best I mean the most accessible and straightforward, is Final Cut Pro X on macOS Catalina.
FCPX has been HDR capable for a while, but not ACES (yet?)
So I took some ARRI Alexa and RED demo footage that is available as a free download. If the Library and Project formats are set properly (UHD-HDR Dolby PQ), the waveform in FCPX is already scaled in nits. So at least technically I could "see" on the waveform how bright this footage would look in HDR, because my old 27″ iMac still has an sRGB display. So far so good.
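The PQ curve that this nits scale is based on maps absolute luminance to a code value between 0 and 1. A minimal sketch of the encode side, using the constants from the SMPTE ST 2084 specification (this is just the math behind the scale, not FCPX's code):

```python
# PQ (SMPTE ST 2084) encode: absolute luminance in nits -> code value 0..1
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    # normalize to the 10,000 nit ceiling that PQ is defined for
    y = min(max(nits, 0.0), 10000.0) / 10000.0
    yp = y ** m1
    return ((c1 + c2 * yp) / (1.0 + c3 * yp)) ** m2

# 100 nits (classic SDR reference white) lands at roughly 0.51, which is
# why SDR content sits surprisingly low on a PQ-scaled waveform
```

A 1000 nit highlight encodes to only about 0.75, so even bright HDR masters rarely touch the top of a PQ scope.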
first review on a real HDR screen while looking at one
The next step was to take the project and the footage to the office, now empty due to the pandemic, set up my 2018 MBP and hook it up to an LG C8 HDR-capable client display in one of the suites.
MBP –> USB-C dock –> HDMI cable, fire up FCPX with the HDR project and … nothing happened. It took me some time to figure out that I needed an HDR-capable USB-C-HDMI dock (Apple actually sells one directly for only 79€) and a UHD-HDR-capable HDMI cable, and voilà! I was somehow able to see a much brighter output from FCPX on the LG TV, but I was not sure what I was really seeing. The setup I used somehow didn't allow the HDR metadata signal to pass through. In theory this metadata stream should tell the LG TV that it is receiving an HDR signal.
During the setup attempts I noticed something different on my MBP display, which I hadn't used to set up the FCPX project in the first place. The viewer looked different; it looked bright. So it seems I can see HDR directly on the MBP screen! There I had my first HDR display without even buying an extra one. As a side note, I need to mention that without macOS Catalina there is no full HDR support on the Mac.
three HDR screens at home without buying a new one
Back at home, by creating more HDR projects with FCPX & Compressor, I could test two more HDR-capable screens that I had never tested before: an iPhone 11 Pro Max and an iPad Pro from 2019. Both devices are HDR capable with the latest iOS version installed.
Watching HDR content is rather easy: render a HEVC 10-bit compressed file from the FCPX project, put it in an iCloud folder and open the files on the iOS devices. The files look the most impressive on the iPhone, as the screen is smaller. The HDR effect on the iPad Pro is less noticeable; it lacks the distinct bright highlights. The MBP display looks second best of my now three HDR-capable displays.
Usage of FCPX and HDR content
Let's have a word about FCPX: HDR wide-gamut projects use the working color space Rec.2020 (linearized, with a preview tone mapping method unknown to me). It is easy to color grade Alexa and RED footage with the built-in Color Wheels, but I felt the need to desaturate a lot of the footage, especially the highlights, and I tended to darken the shadows overall to enhance the contrast. The need to desaturate could be the result of a "missing" gamut mapping transform from Rec.2020 down to Apple's Display P3 screens. ACES is also facing a lot of issues concerning out-of-gamut "colors" (you can read about that topic on ACESCentral.com).
The next step is to use the HDR Tools effect in FCPX and limit the output peak brightness to 1000 nits, which it does nicely with a soft clip. Another task is to fill out some HDR metadata fields, which I am not yet sure how to fill out properly. I can now produce HDR content and review it on three screens. FCPX also offers an easy way to create a cloned project that is a down-converted Rec.709 version for comparison.
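Apple doesn't document the exact curve that HDR Tools uses, but the idea of a soft clip toward a 1000 nit peak can be sketched like this (the knee value and the exponential roll-off are my own assumptions, not Apple's implementation):

```python
import math

def soft_clip_nits(nits: float, knee: float = 800.0, peak: float = 1000.0) -> float:
    """Leave everything below the knee untouched, then roll off
    asymptotically toward the peak instead of clipping hard."""
    if nits <= knee:
        return nits
    span = peak - knee
    return knee + span * (1.0 - math.exp(-(nits - knee) / span))

# a 4000 nit highlight ends up just below 1000 nits instead of flat-lining,
# so gradients in the highlights keep some detail
```

The benefit over a hard limit is that two different super-bright values still map to two different output values, so highlight texture survives.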
easy review on the LG C8 without additional hassle
Back at the office with the LG C8, I found that the easiest way to play back the HEVC files (up to UHD @25fps) is directly from a USB stick, without hooking up the MBP to the screen at all. Now the TV also recognizes the formats and plays back SDR as SDR and HDR as HDR as intended.
Resolve Studio takes HDR a step further
I have also finally invested in a Resolve Studio license to get all the HDR features. With the preferences set up properly, I am able to grade HDR directly on the MBP screen without an external display. The ACES support especially makes switching between SDR and HDR deliveries a lot easier in Resolve.
please use professional HDR displays
All these tests are helping me learn about HDR. But if you really want to produce HDR content for clients, please use a proper HDR mastering setup with professional video cards from AJA, Blackmagic etc. and a professional grade HDR monitor. The simple setup that I use has several flaws, one of them being that only one HDR application should run at a time. Not to mention that HDR content should be viewed at 1000 nits peak brightness at least.
apple seems to push for HDR, or am I just looking for it?
Apple seems to push for HDR a lot. With macOS Catalina, HDR support arrived directly inside the operating system. The revamped FCPX with Metal support is fully HDR capable and as fast as ever, together with Motion, Compressor and the native QuickTime Player applications.
even the Photos app is now HDR! (since May 2020)
The next surprise was the iOS 13.5 update that paved the way for Corona warning apps worldwide. As I was just getting used to judging HDR content on my iPhone, I suddenly realized that after the iOS update, a lot of the photos on my phone are also shown in HDR now! I cannot find any information about this new feature, either from Apple or anywhere else. High contrast photos, or a direct view into the sun, show the effect especially well. In my opinion HDR makes the photos look even better than before.
pump up the screen intensity
Overall, I noticed on all three displays that the moment I feed an HDR clip (or some HDR photos) to the screen, the brightness ramps up in small steps to the peak brightness. I am not sure if this is controlled by the HDR metadata or if it is a "feature" of the Apple displays. I see it as a limitation of the current hardware, and I guess the next generation of Apple devices won't need this pumping effect anymore. I hope for the release of a new big iMac with an XDR-like display soon; I would like to finally update my old iMac 🙂
Important note on the following embedded Vimeo links: I am only able to see the "real" HDR clips on the iPhone, not on an iPad nor on the MBP screen in Safari. For other devices, here is a download link – Strato HiDrive (on an iPad the files need to be downloaded and viewed through the Files app, not with Safari!)
samsung does HDR too
During my tests, I found out that some Samsung Android smartphones (I could only test an S10) play back the HDR content directly from a Google Drive share perfectly.
compressed HDR photos or real unpacked HDR photos
HDR photography has existed for a long time. Creating an HDR photo takes basically three steps: first, take three or more exposures of the same subject; second, merge the data into a true high dynamic range source; and third, create the final "HDR" photo, which ironically has no high dynamic range anymore at all. What remains in an HDR photo is the attempt to show all the acquired dynamic range compressed, or tone mapped, into a normal display range. But it looks like this might change now.
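The three steps can be sketched in a few lines. This is a toy per-pixel version with a hat-shaped weighting and a simple Reinhard tone map; real tools (and Apple's pipeline) are far more sophisticated:

```python
def weight(v: float) -> float:
    """Hat function: trust mid-tones, distrust near-black and near-clipped values."""
    return 1.0 - abs(2.0 * v - 1.0)

def merge_exposures(values, shutter_times):
    """Merge one pixel across bracketed exposures into scene-linear radiance.
    values: encoded pixel values in 0..1, shutter_times: seconds per exposure."""
    num = den = 0.0
    for v, t in zip(values, shutter_times):
        w = weight(v)
        num += w * (v / t)   # divide by exposure time -> radiance estimate
        den += w
    return num / den if den > 0.0 else values[-1] / shutter_times[-1]

def tone_map(x: float) -> float:
    """Classic Reinhard curve: squeezes any radiance into 0..1 for an SDR display."""
    return x / (1.0 + x)

# a pixel seen at 0.1, 0.4 and clipped 1.0 across a 1/100, 1/25, ~1/6 s bracket:
radiance = merge_exposures([0.1, 0.4, 1.0], [0.01, 0.04, 0.16])  # scene-linear value
sdr = tone_map(radiance)  # back into display range for the "HDR look" photo
```

Note how the clipped exposure gets zero weight, so only the trustworthy mid-tone samples contribute to the merged radiance; the tone map in the last step is exactly the "compression" that throws the real dynamic range away again.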
Apple has offered HDR photos too, with the processing done on the fly since the iPhone X, and it improved with the iPhone 11 Pro. With the iOS 13.5 update, I think there is a new option in the Photos preferences called "show-full-HDR". When this option is enabled, I suddenly see photos taken a while back in a whole new way: the bright captured areas are actually displayed a lot brighter on the screen, with up to 500 nits of peak brightness. I hope this new feature will be addressed in the next Apple keynote too, and that it will become a much broader feature in the next iOS release this fall. This would make it possible to use HDR content in social media images as well as in videos. I am curious what this feature will be called, as the term HDR (tone mapped) is already taken.
At the moment, only YouTube and Vimeo support HDR content in their iOS apps. I have also found out that I can show HDR content directly on this website, if it is viewed on an iPhone X or 11 Pro with Safari.
At the moment, I am not able to do a direct HDR export from FCPX / Resolve to YouTube / Vimeo. I am not sure whether I am doing something wrong or whether this is not yet supported.
finding footage for HDR
For my HDR tests I used a range of freely available demo footage clips from ARRI, RED and Blackmagic Design. Sadly, I am not allowed to show that footage in public. Professional digital cinema cameras have been recording the dynamic range needed for an HDR output for many years; you only need to treat the footage "right", which basically means using a proper color management system. Luckily I have some work-in-progress playouts of old projects I worked on that I could re-grade in HDR for my personal tests. Furthermore, at the companies where I normally work, I was able to re-use some existing projects; the only thing left to do was a simple new primaries re-grade of the film in HDR. For projects already graded in ACES with Resolve, it is as simple as choosing a different ODT for HDR and exporting the project again. Sometimes I did a small trim pass to push the HDR effect even a little bit more.
Both approaches are interesting to compare: one, take a standard Rec.709 grade (maybe even based on a DOP/production show LUT) and do a simple re-grade in HDR; two, take a grade already finished in ACES for Rec.709 and change the ODT to get an HDR result. I prefer the ACES approach, but I am aware of some gotchas that colorists have to fight when grading in ACES.
creating HDR content in 3D
At the same time as investigating HDR with recent and old projects and demo camera footage, I am also digging deeper into Blender 3D and ACES.
For decades, 3D applications have rendered scene-linear images in floating point precision, which supports a very high dynamic range, so all EXR renders are HDR capable. Until now, you just couldn't view the results properly. Before ACES and other color management approaches like OCIO (from Sony Pictures Imageworks), the standard SDR view transform was sRGB (also called sRGB gamma, although technically it is not just a power function). This "gamma" curve clips values hard at 1.0.
So in the past, the approach was to render in a range between 0 and 1. If a hot specular highlight reached values far above 1.0, a SoftClip was available (for example in Nuke) to make the result look a bit more pleasing by compressing the highlights.
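The hard clip at 1.0 and the soft clip workaround are easy to show in code. The sRGB encode below follows the standard IEC 61966-2-1 formula; the soft clip is a generic exponential roll-off in the spirit of Nuke's SoftClip, not its exact curve:

```python
import math

def srgb_encode(linear: float) -> float:
    """sRGB OETF: note the hard clip - any scene value above 1.0 is simply lost."""
    x = min(max(linear, 0.0), 1.0)
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

def softclip(linear: float, start: float = 0.8) -> float:
    """Compress values above `start` asymptotically toward 1.0 before encoding,
    so a hot specular at 5.0 keeps some detail instead of flat-lining."""
    if linear <= start:
        return linear
    span = 1.0 - start
    return start + span * (1.0 - math.exp(-(linear - start) / span))

# srgb_encode(5.0) == srgb_encode(1.0): the highlight detail is gone
# srgb_encode(softclip(5.0)) stays just below 1.0: the highlight rolls off
```

With an HDR view transform neither trick is needed; the extra range is simply displayed instead of being thrown away.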
Blender switched from the standard sRGB to the FILMIC view transform some time back, and other tools switched to OCIO and ACES configs. The need to work in a limited, display-referred dynamic range between 0 and 1 is lifted; finally you can light a scene without restraints and with real light values, and easily get more realistic render results.
Blender FILMIC is an OCIO config that supports the gamut of the sRGB/Rec.709 primaries and an ACES-RRT-like tone mapping with a soft roll-off of specular highlights. The FILMIC view transform also uses a gamut compression method that desaturates high color values, similar to the way film stocks behave; this is a unique feature of Blender FILMIC.
To sum up: with HDR, bright light sources can actually be displayed as bright as the display allows. I remember similar problems with color saturation in the old days: I wanted to make an element in an image more colorful, but all I could do was clip values, which looked even worse; the gamut was too limited. Nowadays, with HDR and wider color gamuts, the available color and dynamic range palettes are broadening. Furthermore, thanks to Apple, this technology has become very easy to use and accessible to everyone.
My eyes are fooling me
The same way you get used to color-tinted sunglasses and stop questioning the "look" after a short while, I find that when watching HDR and SDR versions of the same content back to back, my brain sometimes tells me that the SDR version looks very similar to the HDR version, or that the HDR version is not as bright as I had expected. I have realized that showing the films to another person helps to "recalibrate" my brain.
I found out that I can link an HDR Vimeo video here on the website and watch the HDR content on an iPhone (X or 11 Pro) directly on the page when viewed with Safari. I don't even have to be in the Vimeo app for that. I did not know that this was possible.
Sadly it doesn't work on an iPad or the MBP, even though they support HDR as well. If the device doesn't support HDR, you will be presented with a down-converted SDR version. Please note also that this page needs to be opened directly in Safari on iOS, not from within LinkedIn or Instagram.
Converting a simple scene with some textures and a HDRI
Here is another project that I wanted to check out and test: can I take a very simple scene that I created in Blender out of the box and convert it to render in ACES? In this case, it worked out smoothly.
I learned a lot along the way and I am happy to share the results. They can also be found in the menu Learning ACES: "1.9. FILMIC to ACES".
Limitations with version 2.83 and up to 2.90.3 alpha
Today (12th of May 2020) I filed a bug report about a limitation when working with 8-bit textures in Blender that need a colorspace transform. Here are some notes on other limitations I ran into over the last weeks and months while exploring Blender & ACES:
All color images should be EXR files in the colorspace ACEScg – that way they don't need any colorspace transform inside Blender.
Colorspace transforms in textures (Input – Generic – sRGB Texture) will produce imprecise or sometimes wrong results when the image texture is an 8-bit file! (The transforms should happen in floating point precision.)
Non-color image texture files, e.g. bump or normal maps, need to have the color space set to "raw" (no gamut or gamma mapping) – then there is no problem when using JPG/PNG files.
Blender misses an option to select the OCIO file in the preferences – at the moment you must set the OCIO environment variable before starting Blender.
The list of colorspaces is not sorted into categories like it is in Nuke, for example. The list is so long that you need a giant screen to see all entries and select one that sits at the end of the list – as a workaround you can temporarily set the Resolution Scale smaller than 1.0 in the Interface/Display Preferences.
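The environment variable workaround mentioned above could look like this on macOS or Linux (the config path is a hypothetical example; point it at wherever your ACES OCIO config actually lives):

```shell
# tell Blender (and any other OCIO-aware app) which config to use,
# then launch Blender from the same shell so it inherits the variable
export OCIO="$HOME/OpenColorIO-Configs/aces_1.2/config.ocio"  # hypothetical path
blender
```

Because OCIO is a shared convention, the same variable also switches Nuke and other OCIO-aware tools to the ACES config.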