30-bit Color Depth on the Mac

This morning I was trying to find information about 30-bit (10 bits per component) color depth support on Macs. As you might guess, I found nothing usable. Although high-end graphics cards from both AMD and NVIDIA support the OpenGL extension for 10 bpc color, Apple does not advertise their machines as such.

So I ended up creating a little tool for assessing 10 bpc color capability. The following code tries to create a window with 10 bpc and queries the result (the code is closely modeled on the samples in AMD’s and NVIDIA’s 10 bpc programming guides).

#include <stdio.h>
#include <GLUT/glut.h>

int main (int argc, char *argv[])
{
    glutInit (&argc, argv);

    // Request a 10-bit-per-component framebuffer and create
    // the window.

    glutInitDisplayString ("red=10 green=10 blue=10 alpha=2");
    glutInitDisplayMode (GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    int window = glutCreateWindow ("bitdepth");

    // Check how many bits we actually got.

    GLint red, green, blue;
    glGetIntegerv (GL_RED_BITS, &red);
    glGetIntegerv (GL_GREEN_BITS, &green);
    glGetIntegerv (GL_BLUE_BITS, &blue);

    // Clean up and show the results.

    glutDestroyWindow (window);

    printf ("Bit depth is %d/%d/%d\n", red, green, blue);

    return 0;
}

You should link with the OpenGL and GLUT frameworks, e.g. with something like `clang bitdepth.c -framework GLUT -framework OpenGL -o bitdepth`.

Or you can grab the compiled executable here. It runs on 10.5 and up. Just run it and it will display the assessment result:

Bit depth is 8/8/8

I currently have no access to newer Mac Pros, which I suspect may have 30-bit support. So if you have one and get 10/10/10, please let us know in the comments!

This site is made possible by the people downloading my apps - thank you all! I neither beg you to purchase goods through my site nor display advertisements, but if you find my writings useful or entertaining, I would encourage you to check out my tools that may make your life as a photographer or cinematographer easier and more productive.


  1. Hi there
I’m very keen on getting 10bpc working on the Mac. I’ve got an nvidia gtx-570 which works natively in Mountain Lion.
    Running your app gives me 8/8/8, using DVI.

    If you ever figure out how to enable 30-bit colour depth, please let me know.


  2. Hmm, Paul, you won’t be able to display more than 8bpc over DVI. I’m not sure whether that would’ve affected your test results though, as I’m no expert when it comes to drivers and code like this… I have just spent a lot of money on a 10bpc capable monitor, so I join the party waiting to realise its potential with Mac…

  3. I don’t think any Mac will do it – I have a Quadro 4000 with EIZO CG245W on DisplayPort, in a Mac Pro 3,1 – but it’s not an issue of the age of the Mac Pro, of course. There’s just no implementation for it in the OS, and if there were, I would expect Photoshop to be able to use it as it does on Windows.

    I can’t use your utility – it crashes Archive Utility – but given that you’re using the same framework that CS6 has access to, I don’t expect it to report anything other than 8bpc regardless of card and display.

    It’s a shocking state of affairs for Apple to have allowed this to continue, after adopting a 30-bit capable format in DisplayPort.

  4. I am using a NEC PA301W which definitely is capable of up to 14 bits per channel. I am connected over DisplayPort but still only get a bit depth of 8/8/8.

  5. This may be of some help. It’s from a couple of years ago, and he was having some problems, but seemed to have it working. https://discussions.apple.com/thread/3614614


  6. Ernesto Revilla says:

    I’ve just tested this on my MacPro Mid 2012 with OS X Mavericks and I also get:


  7. Hi guys! Mavericks 10.9.4 here, same 8/8/8…..
