

Blender 5.0 Introducing HDR Support On Linux With Vulkan + Wayland (phoronix.com)
Michael Larabel writes via Phoronix: The upcoming Blender 5.0 3D modeling application is introducing High Dynamic Range (HDR) display support on Linux when making use of Wayland (there is no X11 support for HDR) and Vulkan graphics acceleration. HDR support for Blender 5.0 on Linux is currently considered experimental. Enabling HDR support on Linux for the Blender creator software requires having a High Dynamic Range display (of course), running a Wayland desktop, enabling Vulkan API acceleration rather than OpenGL, and turning on the feature currently deemed experimental. Additional details can be found via this Blender DevTalk thread.
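For the curious, here is a minimal sketch of how you might try this out, assuming Blender's documented `--gpu-backend` command-line flag; the exact name of the experimental HDR preference isn't given in the summary, so check the DevTalk thread for that part.

```python
import os
import shutil
import subprocess

# Sanity checks: HDR in Blender 5.0 on Linux needs a Wayland session and Vulkan.
if not os.environ.get("WAYLAND_DISPLAY"):
    raise SystemExit("Not running under Wayland; HDR output is not available (no X11 support).")

blender = shutil.which("blender")
if blender is None:
    raise SystemExit("Blender not found on PATH.")

# Force the Vulkan backend instead of OpenGL; this backend is itself still
# marked experimental.  The HDR option must then be enabled from Blender's
# experimental preferences (exact toggle name may differ from build to build).
subprocess.run([blender, "--gpu-backend", "vulkan"], check=True)
```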
so for AMD then (Score:3)
I hear more than 8bpp works kind of ok with AMD on X, but it definitely doesn't work well with Nvidia... which also doesn't work well with Wayland. Looks like users who want HDR can choose between AMD and Intel, and Intel is going away.
Re: (Score:2)
>8bpp is not HDR.
Re: (Score:2)
I believe we've been over this before. It appears I failed at making you understand.
You cannot explain to me what you do not understand.
>8bpp is not HDR.
If they weren't talking about higher bit depth, it wouldn't matter which display system was involved. The most popular format for HDR displays to support is HDR10 [wikipedia.org], which involves 10bpp. You can view HDR content on 8bpp displays, but you won't get full fidelity. Remember Half-Life 2: Lost Coast? (Apparently Riven also used HDR techniques, can't say I noticed.)
I can make HDR images on my camera using Magic Lantern, and have done. It uses the technique where
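To put a rough number on the fidelity point in the comment above, here is a toy quantization comparison (plain arithmetic, nothing HDR10-specific): a 10-bit channel has 1024 code values against 256 for 8-bit, so a smooth ramp bands far less at the higher depth.

```python
import numpy as np

# A smooth luminance ramp, quantized to 8-bit and 10-bit code values.
# Illustrates only the bit-depth part of the argument: 10 bits gives
# 1024 steps per channel vs 256, so gradients band less when a signal
# is shown at its native depth.
ramp = np.linspace(0.0, 1.0, 4096)

q8  = np.round(ramp * 255)  / 255
q10 = np.round(ramp * 1023) / 1023

print("distinct 8-bit levels: ", len(np.unique(q8)))    # 256
print("distinct 10-bit levels:", len(np.unique(q10)))   # 1024
print("max quantization error, 8-bit: ", np.max(np.abs(ramp - q8)))
print("max quantization error, 10-bit:", np.max(np.abs(ramp - q10)))
```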
Re: (Score:2)
I hear more than 8bpp works kind of ok with AMD on X, but it definitely doesn't work well with Nvidia...
X does not support HDR, period, end of discussion.
I can make HDR images on my camera using Magic Lantern, and have done. It uses the technique where they're generated from bracketed exposures. They are striking even on a normal display with a typical color gamut.
Same name, different thing. I can see how that would be confusing for some people.
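For reference, the bracketed-exposure technique mentioned above usually amounts to merging several differently exposed shots into one radiance estimate. A toy sketch follows, with a hypothetical triangle weighting; this is not Magic Lantern's actual pipeline.

```python
import numpy as np

# Toy sketch of merging bracketed exposures into a single radiance estimate.
# Each shot is assumed linear and tagged with its exposure time; clipped or
# near-black pixels are down-weighted, then the estimates are averaged.
def merge_brackets(images, exposure_times):
    images = [np.asarray(img, dtype=np.float64) for img in images]
    radiance = np.zeros_like(images[0])
    weight_sum = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        # Triangle weighting: trust mid-tones, distrust clipped highlights/shadows.
        w = 1.0 - np.abs(img - 0.5) * 2.0
        radiance += w * (img / t)
        weight_sum += w
    return radiance / np.maximum(weight_sum, 1e-6)

# Example: three synthetic exposures of the same scene, one stop apart.
scene = np.linspace(0.0, 4.0, 8)                       # "true" radiance
shots = [np.clip(scene * t, 0.0, 1.0) for t in (0.25, 0.5, 1.0)]
print(merge_brackets(shots, [0.25, 0.5, 1.0]))
```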
Even displays with 8bpp panels can get more out of having an HDR signal (including >8bpp), so even I would like to have it with my cheapass LG43UT80[00]. Unfortunately, I have an Nvidia card.
Repeat after me. HDR is not >8bpp.
They are entirely separate things.
You can have an HDR transfer function and color space with 1bpp if you like (though less than 10 is painful).
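To make the transfer-function point concrete, here is a sketch of the SMPTE ST 2084 (PQ) EOTF, which most display-side HDR signalling builds on: it maps a normalized code value to absolute luminance up to 10,000 nits, regardless of whether that code value is later quantized to 8, 10, or even 1 bit (with 1 bit you would only get 0 and 10,000 nits, hence "painful").

```python
# SMPTE ST 2084 ("PQ") EOTF constants, as defined in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value (0..1) to luminance in cd/m^2 (nits)."""
    v = code ** (1.0 / M2)
    return 10000.0 * (max(v - C1, 0.0) / (C2 - C3 * v)) ** (1.0 / M1)

# The curve is what makes the signal "HDR"; bit depth only sets how finely
# the code values are sampled along it.
for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"code {v:.2f} -> {pq_eotf(v):8.1f} nits")
```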
Re: (Score:2)
You can have an HDR transfer function
Please put your autism aside for a moment. Precisely zero people in the world talk about HDR support meaning "support a transfer function to display HDR content on SDR displays." Universally, when we talk about supporting HDR for displays, we are talking about >8bpp.
You being technically correct does not help your comment look any less stupid. You're arguing for the sake of arguing about a point no one was making and a feature no one is discussing.
Re: (Score:2)
Please put your autism aside for a moment. Precisely zero people in the world talk about HDR support meaning "support a transfer function to display HDR content on SDR displays." Universally, when we talk about supporting HDR for displays, we are talking about >8bpp.
Wrong.
People are talking about "HDR" when they're talking about "HDR". They don't know exactly what that means, but I'm here to tell you: it's the transfer function.
Not one single display that advertises "HDR" support is referring to it being a 10bpc panel (and some even aren't).
The only people who refer to >8bpc as HDR are misinformed nerds too proud to admit they're wrong.
Your interjection here just made you look like a moron.
Re: (Score:2)
10bpc has worked on X for NV for literally decades, now.
However, that is not HDR.
Re: (Score:2)
Nowhere did I say that X did HDR.
What I said about this being the domain of AMD is that Nvidia drivers for Wayland don't work well. And this is becoming a problem for Nvidia more and more in general, which is ironic because I've always used them specifically because AMD was bad at drivers. Last I looked they still were on Windows, maybe that's better now, but I don't care about that at all.
The other thing I said was that 10 bpp is a requirement for output in the most commonly used HDR Display output format,
Re: (Score:2)
This is similar to what is called "HDR" in images (which are also not HDR, and very much not 10bpc, though they can be that, or even more!).
HDR support in Windows did not exist until Windows 10.
The other thing I said was that 10 bpp is a requirement for output in the most commonly used HDR Display output format, which is still true.
No, that is confusing TVs with computer displays. The "HDR10" standard does require 10bpc, but that is not "HDR". That is "HDR10, The Standard" (A collection of technologies someone may want in order to sl
Re: (Score:2)
HL Lost Coast does not use HDR
It renders in HDR, then it does cute tricks to represent the HDR content on a normal display. This still improves the visuals in both bright and dark areas, accomplishing a huge percentage of what you expect from HDR.
This is similar to what is called "HDR" in images
The scare quotes ship has sailed, that's always going to be called HDR.
HDR support in Wayland is accomplished by allowing the clients to set color space and giving them floating point pixel buffers. NV does all of this just fine.
It does, what it doesn't do is work reliably.
The NV drivers are still fully featured.
IME they've been problematic when I've tried Wayland. I've been sticking with X11 as a result. It's then disappointing that 30 bpp does not work well with Nvidia with X, but sadly
Re: (Score:2)
It renders in HDR, then it does cute tricks to represent the HDR content on a normal display. This still improves the visuals in both bright and dark areas, accomplishing a huge percentage of what you expect from HDR.
No, it does not.
It simulates what an HDR image would look like for you with things like blooming, slowly adjusting the contrast to simulate your eyes adjusting, etc.
There is nothing HDR about its rendering.
It's a very cool effect, but it results in a loss of detail, not an increase.
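For illustration of the kind of thing being described here: render with linear values that can exceed 1.0, pick an exposure (the slow eye-adaptation effect), then tone-map into an 8-bit SDR framebuffer. This is a generic Reinhard-style sketch, not the Source engine's actual implementation.

```python
import numpy as np

# Sketch of the idea only: the scene is rendered with linear values that can
# exceed 1.0, an exposure is chosen (the slow adaptation effect), and a
# tone-mapping curve squeezes the result into the 0..1 range of an SDR
# framebuffer before 8-bit quantization.
def tonemap_reinhard(hdr_linear, exposure=1.0):
    exposed = hdr_linear * exposure
    mapped = exposed / (1.0 + exposed)            # simple Reinhard operator
    return np.round(np.clip(mapped, 0.0, 1.0) * 255).astype(np.uint8)

scene = np.array([0.01, 0.5, 1.0, 4.0, 16.0])     # linear radiance, unbounded
print(tonemap_reinhard(scene, exposure=0.5))      # dark-adapted
print(tonemap_reinhard(scene, exposure=2.0))      # bright-adapted
```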
The scare quotes ship has sailed, that's always going to be called HDR.
Sure, I agree with that. However, it's entirely distinct from what is called HDR on displays.
It does, what it doesn't do is work reliably.
Depends how we define reliably.
As mentioned, AMD recently had a color space application fuckup for HDR as well. Do we consider the A
Re: (Score:3)
If it weren't for CUDA, I would have gone with ATI in this machine.
Yep. Well that and AMD being amazingly shit with their ecosystem. PyTorch does support AMD apparently, but the difference is stark. Firstly, they don't want their gaming cards to cannibalize the nonexistent compute card sales, where people will magically find the money for pro cards, which is not a thing. Figuring out which card works today is (or was, haven't checked in the last 6 months) an exercise in trawling through poorly documented and w
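As an aside, checking whether a ROCm build of PyTorch is actually in use is at least straightforward, since the ROCm builds reuse the torch.cuda API; something like this, assuming a ROCm build of torch is installed:

```python
import torch

# ROCm builds of PyTorch reuse the torch.cuda API, so is_available()
# returning True on an AMD GPU is expected; torch.version.hip is set
# only in ROCm builds (it is None on CUDA builds).
print("ROCm/HIP build:", torch.version.hip)
print("GPU available: ", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```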