Replies: 11 comments 16 replies
-
Hi Alan, thanks for the SDL pixel format pointer. For the rest: have you managed to enable the 10-bit pipeline? For which setup, Intel, AMD or NVIDIA, X11 I guess? The thing is that I have done some research, and it still looks a bit tricky to get running. Only AMD (and perhaps also Intel) seems to work, at least sometimes, and I am not sure about the details. It is also currently not clear to me what the actual output format for OpenGL is. It may be possible at least to test, but I currently don't have a working 10-bit setup.
-
I've gotten 10-bit working, at least at the X11 level, by creating this file.
So it does seem to be set at the X11/OS level. I've added …
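The file itself isn't shown above; for reference, the usual way to request a 30-bit (10 bits per channel) X screen is a small xorg.conf drop-in like the one below. The filename and Identifier are examples, not taken from this thread; the general approach matches the amdgpu guide linked later in the discussion.

```
# /etc/X11/xorg.conf.d/10-bit-depth.conf  (example filename)
Section "Screen"
    Identifier "Screen0"
    DefaultDepth 30
EndSection
```

After restarting X, `xdpyinfo | grep "depth of root"` should report a 30-plane root window if the depth actually took effect.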
-
I've tried a bunch of stuff, and adding …
-
Thanks, I got it working with AMD in the end. Anyway, SDL2 R10k still doesn't display anything for me, and unfortunately I don't have many resources for 10-bit rendering yet. BTW, do you have some content that would present the 8/10-bit difference in a visually observable way? I am now trying with the UVG dataset...
-
Hi Martin, Thinking some more, I need to do better tests to validate. Attached is a ZIP with 3 TIFF files that have gradients in them. In 8-bit you will see definite banding; in 10-bit it will be smooth.
-
In the past I've tested with BMD HDMI out, and it does properly display in 10-bit. So that is my baseline reference for proper color precision. In my recent tests I don't have that baseline (different TV), so I need to redo that.
-
It is also important to set the display to full range and to set the max bpc:
https://www.soi.pw/posts/10-bit-color-on-ubuntu-20.04-with-amdgpu-driver/
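On amdgpu the "max bpc" connector property can usually be set through xrandr, roughly as below. The output name is just an example; check `xrandr --props` for your actual connector and the properties it exposes:

```shell
# List connector properties (look for "max bpc")
xrandr --props
# Request 10 bits per channel on an HDMI output (name is an example)
xrandr --output HDMI-A-0 --set "max bpc" 10
```

Full-vs-limited range is typically a separate setting (on the display itself and/or a driver property); the linked article walks through the amdgpu side.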
-
I've tried to implement 10-bit OpenGL rendering according to this document for AMD + X11 (settings), and I have to say that it looks like it works. You can try it out with the current UG code:
(the gradient2 pattern now produces "full-range" R10k). Of course, it currently works only if the selected codec (either implicit or explicit) is R10k. If you have some preferred workflow (mainly compression) that you would like to share here, I could look into whether a conversion exists or can be created.
-
I confirm …
-
Hi Martin, Sorry for the late follow-up. I've tested the latest builds. Using -d gl:fs results in less-than-24-fps playback (around 20). If I use -d sdl:fs then I get real-time playback. So the choice is higher color fidelity but non-real-time playback, or lower color fidelity but real-time playback.
-
Intel NUC8 and NUC10... I believe they both have UHD 630 integrated graphics.
-
Hi there,
I think we emailed about this a while back, but I'm wondering if there have been any changes regarding getting 10-bit output from an SDL/GL display surface in Linux? That would be great if so.
SDL does have the following pixel format:
SDL_PIXELFORMAT_ARGB2101010
Thanks for everything.
Alan