> I don't know how long they were set weird. There is an argument for having them different, though: arcade monitors were usually 9300K while our modern HDTV standard is 6500K. But it's probably not a great idea to try to adjust the color temp in the shader. You'll lose a lot of brightness, and everything will look wrong when you start up a game at 9300K. Also, since arcade monitors have pots to adjust levels, white balance was probably all over the place all the time.
I've read it's the other way around: old tube designs were lower color temp, and PC monitors were higher. All the PC CRTs I've had defaulted to 9300K. Sometimes I've set one to 6500K, but not with current MAME. The LED-backlit LCD TV on my main PC rig, only a few months old, doesn't have a color temp setting but rather color schemes: normal (which looks like 9300K), warm (which looks like 6500K), and cool (which is blue-heavy, or red-deficient). My theatre rig's plasma is similar.
Anyway, the general color palette has changed some over time, and sometimes weirdly, but never as strongly as it has in .174. I understand the enthusiasm for making an 'authentic'-looking image the default, but the settings should change to reflect this, rather than being hard-coded as they seem to be, leaving users to adjust around them.
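To put a rough number on the brightness loss mentioned above, here's a sketch of the usual way a shader would shift an sRGB (D65, ~6500K) display's white point toward 9300K: compute per-channel gains that map white to the target chromaticity, normalized so no channel clips. The 9300K chromaticity used below is an assumed, commonly quoted value (9300K+8MPCD); the function names are mine, and real arcade monitors varied widely, as noted.

```python
def xy_to_xyz(x, y):
    """Chromaticity (x, y) -> XYZ with Y normalized to 1."""
    return (x / y, 1.0, (1.0 - x - y) / y)

def xyz_to_linear_srgb(X, Y, Z):
    """Standard XYZ -> linear sRGB matrix (D65 reference white)."""
    r =  3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b =  0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return (r, g, b)

def white_shift_gains(x, y):
    """Per-channel gains mapping sRGB white to the target chromaticity,
    scaled so no channel exceeds 1.0 (i.e., no clipping)."""
    rgb = xyz_to_linear_srgb(*xy_to_xyz(x, y))
    peak = max(rgb)
    return tuple(c / peak for c in rgb)

gains = white_shift_gains(0.2831, 0.2971)  # assumed 9300K chromaticity
# Rec.709 relative luminance of the shifted "white" vs. full white:
lum = 0.2126 * gains[0] + 0.7152 * gains[1] + 0.0722 * gains[2]
print(gains, lum)
```

Since blue is already the strongest channel at 9300K, red and green get pulled down and roughly a quarter of peak luminance is sacrificed, which is why doing this in the shader dims the picture so noticeably.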
Scifi frauds. SF illuminates.
_________________
Culture General Contact Unit (Eccentric)