Macs... oh dear...

This will probably start a few questions; it certainly has for me.

I have only recently come to learn that Macs apparently cannot drive 10 bit per colour displays.
i.e. the operating system is incapable of achieving this!

I know there is a ton of hardware that can do this & some Mac users have these monitors & graphics cards, Quadro or FirePro.
But what is the point when the Mac OS can't use them & thus display the proper colours?

Have some Mac users found a 'get round', maybe using Linux or Windows when they need to showcase the correct colours?

How & why do the big animation companies, Pixar for instance, use Macs when they can't display 30 bit?

I have been scouring the net for info on this, & each time the line 'Mac users can stop reading now' comes up; it has become synonymous with reviews of such 30 bit hardware.

Anyone got a link to show if there is a workaround for this? I am a Windows 7 Pro user, but at work we use a combination of Macs & PCs.

Or are the reviews I have read all wrong?

Cheers....
 
What camera are you using to capture the image?
 
Hello shreds, bit thick of me, so please don't take this as rude, but a genuine question: can you explain what the camera has to do with Macs not displaying 10-30 bit images?
 
Sounds like a load of twaddle to me. I just checked my system info and it clearly says '32 bit colour' on my ancient 2011 iMac :)
 
Me thinks some more digging is needed on my part then.
I could link some of the site reviews that carry this apparent information, but they are easy enough to find.
Off I go....
 
Hello shreds, bit thick of me, so please don't take this as rude, but a genuine question: can you explain what the camera has to do with Macs not displaying 10-30 bit images?

Think it was a reference to cameras being 14 bit colour depth
 
Yes, I thought it might be the 12-14 bit output of a camera, but I can only surmise that if a camera's bit level were lower than 10 then it would matter.
But are there any cameras that capture less than ten bits per colour?

Probably! But then they definitely wouldn't be 'the best camera in the world 'boom boom' :naughty: as Basil Brush would retort...
 
OS X does not support 10 bit colour, you're right, but there are few true ten bit monitors out there; most of them are 8 bit and dithered. As to why professionals use Macs considering this limitation, I guess it's because they really don't need to display 10 bit colour.

I believe there are workarounds.
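
To make the "8 bit and dithered" point above a little more concrete, here is a rough Python sketch of my own (nothing from any monitor vendor, and the numbers are purely for illustration) of FRC-style temporal dithering: the panel alternates between the two nearest 8 bit levels over successive frames so the time-averaged output lands close to the intended 10 bit value.

# Rough sketch of FRC-style temporal dithering: an 8 bit panel approximates a
# 10 bit value by alternating between the two nearest 8 bit levels so that the
# time-averaged output lands close to the target. Illustration only.

def frc_frames(target_10bit, n_frames=8):
    """Return n_frames of 8 bit levels whose average approximates target_10bit / 4."""
    target_8bit = target_10bit / 4.0      # exact (fractional) 8 bit level
    low = int(target_8bit)                # nearest 8 bit level below
    high = min(low + 1, 255)              # nearest 8 bit level above
    frac = target_8bit - low              # how often the higher level is needed
    frames = []
    error = 0.0
    for _ in range(n_frames):             # simple error diffusion over time
        error += frac
        if error >= 0.5:
            frames.append(high)
            error -= 1.0
        else:
            frames.append(low)
    return frames

frames = frc_frames(514)                  # 10 bit code 514 sits between 8 bit 128 and 129
print(frames)                             # [129, 128, 129, 128, 129, 128, 129, 128]
print(sum(frames) / len(frames))          # 128.5, i.e. the 10 bit value on average

The averaging only works because the frames go by too quickly for the eye to resolve, which is also why FRC panels can show faint flicker or crawl in large smooth gradients.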
 
I wonder what srichards' Mac is doing then?
Possibly pretending... whoops sorry srichards, bit mean, get it... ;)
 
I wonder what srichards' Mac is doing then?
Possibly pretending... whoops sorry srichards, bit mean, get it... ;)
Mac's 32 bit color is RGBA... 24 bit RGB + 8 bit transparency. But even 24bit (8bit/color) is ~7M more colors than the human eye can perceive. I think the real concern is the process of converting the recorded bit depth into a "viewable" bit depth, more than it is "total bit depth."
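
To put that in concrete terms, here is a tiny Python sketch (purely illustrative, nothing Mac-specific) of how a 32 bit pixel splits into 24 bits of RGB plus an 8 bit alpha byte, and why the "32 bit colour" readout in system info doesn't mean more displayable colours than 8 bits per channel.

# Illustration of "32 bit colour" = 24 bit RGB + 8 bit alpha (transparency).
# The alpha byte doesn't add any displayable colours; it only stores opacity.

def pack_rgba(r, g, b, a=255):
    """Pack four 8 bit channels into one 32 bit pixel value."""
    return (a << 24) | (r << 16) | (g << 8) | b

pixel = pack_rgba(200, 120, 40)
print(hex(pixel))            # 0xffc87828 -> A=ff, R=c8, G=78, B=28

print(2 ** 24)               # 16,777,216 displayable colours at 8 bits/channel
print(2 ** 30)               # 1,073,741,824 at 10 bits/channel ("30 bit" output)

So "32 bit" in the system info and "30 bit" in monitor reviews are measuring different things: the first includes the transparency byte, the second means 10 bits per colour channel.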
 
Ahh! someone with some knowledge thus saving me from once again trudging the net, thanks sk66...
It always amazes me how people put up blogs about such & such & don't research properly.
I read the sites I went to pretty much end to end, loads of different sites, but none mentioned the transparency layer.

Which by the way catches us out where I work more times than we'd like to acknowledge!
A rogue square under a layer we can't see, & it's too late when we print and the outline of the square shows as a faint line.
Reprint & money lost from the job.
 
I read the sites I went to pretty much end to end, loads of different sites, but none mentioned the transparency layer.
Because it is irrelevant... Windows graphics is 32bit too - the reason is that it is more convenient for a processor to work on four 8bit elements at a time than three (and why 64 bits, i.e. 8 bytes, the next power of two up from 4, is the next larger compute size).

Macs and PCs use essentially the same hardware so what one can do, the other can... if the software supports it.
 
10bit capability is completely pointless and a hassle to use. Even on PC. You have to make many tradeoffs to be able to use it, with hardly any benefit.

For photography a decent monitor with 16bit 3D LUT is far more important for colour accuracy than 10bit channels. 10bit won't make the situation any better (unless considering highly specialised apps like grayscale X-ray evaluation).
 
Too many people get excited about having a "10 bit display", usually while having no idea what it actually is, what it does, or why it's important. It's a number, and people just like to have a bigger number than their peers. Mainly a man thing, this.

I use a machine with a 10bit workflow at work, and 8bit at home. There's no practical difference unless you work with completely linear gradients in graphic design or CAD. Photos just do not produce such mathematically linear gradients in real life.

However... even though most computer video cards output 8 bits per channel (24bit colour... 8 + 8 + 8 = 24), this isn't necessarily a detriment to quality if handled correctly.

8 bit has some limitations however, and these mainly come to the fore when software calibrating. When you calibrate a normal monitor, you're not actually calibrating the monitor at all, but the graphics card's output. The problem here is that the 8bit output is probably "full" already, as the gamut and range of the camera's output far exceeds that of your video card. This means that when you shift colour values around within that 8bit space to calibrate your screen, you are breaking up the linearity of your RGB histogram.

If you are concerned with quality in photography, you shouldn't be striving for a full 10bit workflow, as it's pretty pointless for the majority of photographers. Instead, you should strive for a monitor that can be hardware profiled instead. In this scenario, you are NOT calibrating the video card's LUT like you are with software calibration, but are instead programming the monitor's LUT. As the monitor's LUT is usually 12 or 14bit, there is "space" to move RGB values around without leaving "gaps" in your RGB histogram.
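
For anyone who wants to see the "gaps in the histogram" argument in numbers, here is a simplified Python sketch (my own illustration; the "calibration" is just a small gamma tweak I have assumed, and real corrections are more complex). It runs the same correction once through an 8 bit video card LUT and once through a 14 bit monitor LUT, then counts how many distinct output levels survive out of the original 256.

# Simplified comparison of software calibration (8 bit GPU LUT) versus hardware
# calibration (high bit depth monitor LUT). The "calibration" here is just a
# small gamma tweak, assumed purely for illustration.

def calibrate(levels_in, lut_levels, gamma_correction=1.1):
    """Map every 8 bit input code through a LUT of the given precision."""
    out = set()
    for code in range(levels_in):
        x = code / (levels_in - 1)                  # normalise 0..1
        y = x ** gamma_correction                   # the correction curve
        out.add(round(y * (lut_levels - 1)))        # quantise to LUT precision
    return out

gpu_8bit = calibrate(256, 2 ** 8)        # correction applied in an 8 bit LUT
monitor_14bit = calibrate(256, 2 ** 14)  # same correction in a 14 bit LUT

print(len(gpu_8bit))       # fewer than 256 -> some tones now share a value (banding)
print(len(monitor_14bit))  # all 256 input codes stay distinct

The exact counts depend on the correction curve, but the pattern is the point: quantising the correction into 8 bits merges neighbouring tones (banding and histogram gaps), while a 12 or 14 bit LUT has room to apply it without losing levels.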

Fact: A hardware calibrated 8bit system will far surpass the quality of a 10bit software calibrated system, which is why I like to laugh at those who use Mac monitors, as they can't be hardware profiled.

Never mind Quadro or Fire cards... put your money into a monitor that can be hardware profiled, and you'll be able to enjoy the benefits of an unmolested 8bit output, as all calibration is done in the screen's LUT at a much higher bit depth.

If you do graphic design or CAD, 3D etc.... then sure.... fill your boots... but there's no need for a full 10bit workflow with photography. There IS a need for a decent hardware calibrated screen though.


10bit capability is completely pointless and a hassle to use. Even on PC. You have to make many tradeoffs to be able to use it, with hardly any benefit.

For photography a decent monitor with 16bit 3D LUT is far more important for colour accuracy than 10bit channels. 10bit won't make the situation any better (unless considering highly specialised apps like grayscale X-ray evaluation).

Only if you are hardware calibrating that screen. Many monitors have a high bit depth LUT but do not allow you to program it. If you cannot hardware calibrate it, then you are actually still calibrating the GPU's 8bit output, which isn't that great.
 
Although OS X can't do 10bit output and Apple displays (those in iMacs, the MacBook Pro, the Thunderbolt Display) cannot be hardware calibrated, they are still decent when compared to many mainstream PC displays. If you buy any Mac with a display, you have a good chance of getting a decent factory calibrated display that isn't way off. This is consistently confirmed by reviews. Add a simple software 'faux' calibration and profiling, and although it can't do miracles and the results are nowhere close to 16bit 3D LUT hardware calibrated displays, you have a good tool for almost anything apart from professional pre-press and other high-end specialised use. Certainly good enough for advanced amateur photo editing in Photoshop.

Obviously there is no competition between these and hardware calibrated professional wide-gamut displays like Eizo Color Edge, HP DreamColor, high end NEC displays, etc. However, these are easily more expensive than iMac or Macbook.
 
Obviously there is no competition between these and hardware calibrated professional wide-gamut displays like Eizo Color Edge, HP DreamColor, high end NEC displays, etc. However, these are easily more expensive than iMac or Macbook.
But you can get reasonably priced mid-range, hardware calibratable monitors. I have 2 x 27" 2713H's here that are fully hardware calibrated for example and they were less than £1k for the pair....
 
10bit capability is completely pointless and a hassle to use. Even on PC. You have to make many tradeoffs to be able to use it, with hardly any benefit.

For photography a decent monitor with 16bit 3D LUT is far more important for colour accuracy than 10bit channels. 10bit won't make the situation any better (unless considering highly specialised apps like grayscale X-ray evaluation).

On monitors up to a certain brightness, yes. But on some of the newer, brighter monitors you start to get banding at 8 bits, and with the move towards high dynamic range monitors you need at least 10.
 
Although OS X can't do 10bit output and Apple displays (those in iMacs, the MacBook Pro, the Thunderbolt Display) cannot be hardware calibrated, they are still decent when compared to many mainstream PC displays. If you buy any Mac with a display, you have a good chance of getting a decent factory calibrated display that isn't way off.

It depends how you define "mainstream PC displays". If you mean a cheap monitor that comes with most off the shelf PCs, then yes, the iMac display may well be better. It's not terrible.... but seeing as you can't change it, I'd want it to be damned good. It also depends if you can cope with a shiny screen. I personally hate them... others don't seem to mind at all. It's a personal thing.

First of all, it may be close enough visually straight out of the box.... but how many reviews actually test this? A display that's quite a long way out can still "look" right if you've nothing to compare it to, depending on the ambient lighting conditions. Furthermore, monitors do not stay calibrated. They drift over time. People assume this was only true of CRT displays, but they'd be wrong - they do drift as the backlight ages, and even the LCD matrix itself changes. I calibrate once every 200 hours of use, and while no great visual differences occur (although it is slightly detectable), this is cumulative, and if I never calibrated again after an initial calibration, it would be visually different within a few months of use.



Add a simple software 'faux' calibration and profiling, and although it can't do miracles and the results are nowhere close to 16bit 3D LUT hardware calibrated displays, you have a good tool for almost anything apart from professional pre-press and other high-end specialised use. Certainly good enough for advanced amateur photo editing in Photoshop.


There's nothing wrong with software calibration in terms of gamut and colour accuracy... white point etc... it's just that calibrating the 8bit GPU output can cause banding and "holes" in your histogram. In normal operation for photography, you'll probably never notice this though, unless you start obsessing over smooth, artificially created gradients in Photoshop.

Obviously there is no competition between these and hardware calibrated professional wide-gamut displays like Eizo Color Edge, HP DreamColor, high end NEC displays, etc. However, these are easily more expensive than iMac or Macbook.

Obviously :) You don't have to go to those lengths to beat an iMac, and especially a MacBook display, though.

On monitors up to a certain brightness, yes. But on some of the newer, brighter monitors you start to get banding at 8 bits, and with the move towards high dynamic range monitors you need at least 10.


I'm sorry.... I'm confused... what has brightness got to do with it? If you're calibrating your screen, you'll be calibrating it to somewhere between 90 and 120 cd/m2 anyway... so I'm not sure why you think brightness has anything to do with banding. Banding is a bit depth issue... not a brightness issue.
 
I'm sorry.... I'm confused... what has brightness got to do with it? If you're calibrating your screen, you'll be calibrating it to somewhere between 90 and 120 cd/m2 anyway... so I'm not sure why you think brightness has anything to do with banding. Banding is a bit depth issue... not a brightness issue.

No it's not. The Weber fraction (i.e. banding visibility) is dependent on brightness. Hence all the technical papers appearing from the likes of Dolby, BBC, Technicolor, Philips on the required bit-depth for high dynamic range displays.
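
Rough numbers, if it helps. This is my own back-of-envelope Python sketch with assumed values: a plain gamma 2.2 encode and a roughly 1% just-noticeable Weber fraction, whereas the Dolby/BBC papers use a proper contrast sensitivity model that varies with absolute luminance.

# Back-of-envelope look at banding: relative luminance step (Weber fraction)
# between adjacent codes under a plain gamma 2.2 encode, for 8 bit vs 10 bit.
# The ~1% visibility threshold is an assumption for illustration; in reality
# the detectable fraction depends on the absolute luminance of the patch,
# which is why brighter / HDR displays push the requirement towards 10+ bits.

GAMMA = 2.2
THRESHOLD = 0.01   # assumed just-noticeable Weber fraction (~1%)

def weber_step(code, levels):
    """Relative luminance change between code and code + 1."""
    l1 = (code / (levels - 1)) ** GAMMA
    l2 = ((code + 1) / (levels - 1)) ** GAMMA
    return (l2 - l1) / l1

for levels in (2 ** 8, 2 ** 10):
    # mid-grey (18%) encodes to roughly 46% of the code range under gamma 2.2
    mid = int(0.46 * (levels - 1))
    step = weber_step(mid, levels)
    visible = "visible" if step > THRESHOLD else "below threshold"
    print(f"{levels} levels: step at mid-grey = {step:.3%} ({visible})")

The threshold itself is where brightness comes in: on a dim display the eye tolerates the larger 8 bit steps, but as peak luminance climbs the detectable fraction shrinks and the same steps start to show as banding, which is the case those HDR papers are making for 10 bits and up.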
 
No it's not. The Weber fraction (i.e. banding visibility) is dependent on brightness. Hence all the technical papers appearing from the likes of Dolby, BBC, Technicolor, Philips on the required bit-depth for high dynamic range displays.


But two calibrated monitors will be of equal brightness... that's the point of calibration :)

The Weber fraction is not a specific display measurement metric, but simply a mathematical way of representing difference ratios. It states that the greater the difference between two values, the more noticeable that difference is... pretty simple really. So if two screens are calibrated to the same luminance value, white point and gamma, any differences in banding are a product of bit depth, either the GPU's LUT or the screen's.

You are correct in as much as the brighter the screen is, the more you will be able to detect banding differences (which is why it's always harder to detect banding in shadow areas), but anyone concerned about this would have calibrated their screen.
 