Photo Tip of the Week
The Gamma Question: 1.8 or 2.2?
Gamma: if you've tried profiling your monitor or done more than just dabble in the digital darkroom you've undoubtedly encountered the term. You may have even read advice telling you that Macs should be set to gamma 1.8 and PC monitors to 2.2. While there are a few specialized monitors out there that serve as exceptions, the fact is that most of us are better off using gamma 2.2 on whichever platform we use.
Computer monitors are non-linear devices. A traditional CRT works by translating input voltages for each channel into specific levels of red, green and blue light output, but a given increase in voltage won't produce a proportional increase in brightness. For compatibility reasons, newer digital LCD monitors behave similarly even though their input signal is a stream of numbers for each channel rather than a varying voltage. For both, the actual relationship between input and output can be approximated by a formula known as a power curve wherein the input raised to some power or exponent yields the output. While monitors do vary somewhat, the exponent value needed to quantify this power curve for a typical one is somewhere around 2.5.
It is this power function that we are talking about when we refer to gamma. A gamma of 1.0 means that the output value is the same as the input value; a graph of this is a straight diagonal line. Gamma values less than 1.0 yield curves that bow somewhat upward, while gamma values greater than 1.0 bow downward. The further from 1.0 the gamma is, the more pronounced the curve.
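As a quick illustration, the power curve can be tabulated for a few gamma values. This is a sketch of the idea only, with normalized 0.0 to 1.0 values and function names of my own choosing:

```python
def transfer(value, gamma):
    """Map a normalized input value (0.0-1.0) through a power-curve
    gamma: output = input ** gamma."""
    return value ** gamma

# Tabulate the curve at five input levels for several gamma values.
# Gamma 1.0 is the straight diagonal; larger gammas darken the midtones.
for gamma in (1.0, 1.8, 2.2, 2.5):
    curve = [round(transfer(step / 4, gamma), 3) for step in range(5)]
    print(f"gamma {gamma}: {curve}")
```

Note that the endpoints 0.0 and 1.0 are unchanged by any gamma; only the midtones shift, which is why gamma changes read as a brightness difference rather than a change in black or white points.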
Together with the traditional brightness and contrast controls found on most monitors, the full equation relating input values and gamma to monitor luminance would be:
Monitor luminance = Contrast × Input^Gamma + Brightness
That is, while the input value to the power of the gamma value is the heart of what is going on, the contrast control will scale the overall intensity of the output luminance and the brightness control increases all results equally, effectively setting the black level for the monitor. For this reason, both of these should be set to maximize the available range of useful luminance values that can be displayed. Once set, they should not be changed.
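The equation above can be sketched as a small function. This follows the article's simplified model only; the parameter names and normalized ranges are my own assumptions:

```python
def monitor_luminance(value, gamma=2.2, contrast=1.0, brightness=0.0):
    """Simplified monitor model: luminance = contrast * value**gamma + brightness,
    with all quantities normalized to the 0.0-1.0 range."""
    return contrast * value ** gamma + brightness

# Raising brightness lifts every output equally, setting the black level:
print(monitor_luminance(0.0, brightness=0.05))  # black is no longer 0.0
# Lowering contrast scales the entire curve down:
print(monitor_luminance(1.0, contrast=0.9))     # peak white is reduced
```

This makes the roles visible: contrast multiplies the whole curve, brightness offsets it, and the gamma exponent shapes everything in between.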
A somewhat similar phenomenon to gamma happens when ink is applied to paper. A printer's job is to lay down ink droplets proportional in size to the intensity of the color being printed. But due to the mechanical nature of printing and the absorbency of the paper itself, the ink tends to spread in a way known as 'dot gain.' The darker the color being printed, the more the resulting ink is affected by dot gain in a non-linear way.
Back when the first Macintosh graphical user interface was released by Apple in 1984, those of us in the PC world were still stuck using DOS. Windows had yet to be invented, and so had the wonders of color management. The only way to get reasonably close to 'what you see is what you get' was to force it to come out that way. There wasn't much Apple could do to alter the way images printed on the early LaserWriter printer, since the dot gain was determined mainly by the interaction of ink and paper, not software. And the standard CRT monitor built into the Mac wasn't anything special either, still having a native gamma somewhere near the 2.5 mark. So to make the two come closer to matching, Apple specified how their QuickDraw graphics libraries recorded pixel values so as to pull the effective gamma of the monitor down to 1.8. This meant that a user adjusting an image on the Mac monitor created pixel values that printed as a reasonable match to the monitor image. This worked so successfully, in fact, that the 1.8 gamma became regarded as the gamma of the Mac monitor itself, even though it was actually more a product of QuickDraw than anything else. Even as Quartz all but replaced QuickDraw in OS X, the resulting 1.8 gamma continued as a standard to ensure compatibility with older Macs.
All of this is largely a moot point today, though, since monitor and printer appearance are each now controlled by color profiles via ColorSync. Indeed, in a color managed workflow, Mac monitor gamma can be set to any reasonable value without altering what the displayed image looks like, since ColorSync automatically compensates. Soft proofing in a color managed application like Photoshop can produce accurate onscreen previews of how an image will print without your monitor needing to replicate the gamma of the printer's dot gain. Today, it makes more sense to choose a gamma value based on optimizing display performance, and the 2.2 standard used on PC systems is a closer match to the 2.5 value native to most monitors than is the legacy 1.8 value.
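Conceptually, the compensation a color management system performs amounts to decoding pixel values with the source gamma and re-encoding them with the display's gamma. The sketch below models that idea with pure power curves only; real ColorSync works through ICC profiles, which describe actual devices far more precisely:

```python
def reencode(value, source_gamma, display_gamma):
    """Convert a gamma-encoded pixel value so that a display with a
    different gamma produces the same light output: first decode to
    linear light, then encode for the target display."""
    linear = value ** source_gamma           # decode to linear luminance
    return linear ** (1.0 / display_gamma)   # re-encode for the display

# A midtone authored under gamma 1.8 must be stored slightly lighter
# to produce the same light on a gamma 2.2 display:
v = reencode(0.5, 1.8, 2.2)
print(round(v, 3))
# Check: both paths yield the same displayed luminance.
assert abs(v ** 2.2 - 0.5 ** 1.8) < 1e-12
```

Because the compensation is automatic, the choice of monitor gamma stops being about image appearance and becomes purely a question of display performance, which is the article's point.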
So the obvious question at this point would be: why then don't we all use a gamma of 2.5? The 2.2 value comes from how PC display standards originated. Before computer monitors could display images, television had been using CRT systems with similar characteristics for a long time. Indeed, one of the first monitors I used way back when was actually a TV set connected via an RF modulator. In the United States, the National Television System Committee, or NTSC, determined that they had to slightly under-compensate for the native 2.5 gamma for TV images to look correct. Back when the NTSC standard was set in 1953, people generally watched their sets in dimly lit rooms, making the displays look less contrasty against the dark background than they actually were. Engineers found that by encoding for a gamma of 2.2 instead of 2.5, images looked more natural in the assumed dim viewing environment. The PC standard for monitor gamma retained this 2.2 value since it worked so well for TV.
But I can hear at least a few Mac users insisting that when they set their monitor to gamma 2.2 everything looks too dark. This is quite true for non-color managed applications, simply because the gamma changed but the image did not and ColorSync isn't converting things for us. Similarly, if you edit an image so it looks correct in a non-color managed application at gamma 2.2 (say, on a PC perhaps) it will look too bright when viewed at 1.8. But all that both these observations tell us is that changing gamma will affect how existing images get displayed unless our color management system is aware of the change and compensates for it. Neither says anything about the inherent benefits of either choice of gamma value.
Indeed, non-color managed images will only look correct if they are viewed under the same conditions as they were originally edited under. This problem often plagues web graphics, as they generally do not contain embedded color profiles. The obvious way to solve this would be for PC users and Mac users to standardize on the same choice of gamma, and the logical choice would be for both to use gamma 2.2 for the reasons outlined here. While some Mac users may still secretly wish that Windows users would switch to 1.8 instead, they're out of luck. The realities of how monitors work are against them, and besides, we PC users outnumber them.
Date posted: September 3, 2006
Copyright © 2006 Bob Johnson, Earthbound Light - all rights reserved.
A new photo tip is posted each Sunday, so please check back regularly.