nesdev.com
http://forums.nesdev.com/

PAL / SECAM gamut palette test
http://forums.nesdev.com/viewtopic.php?f=21&t=15614
Page 2 of 2

Author:  lidnariq [ Thu Mar 09, 2017 7:23 pm ]
Post subject:  Re: PAL / SECAM gamut palette test

It is pedantically true that $30 is whiter than white. At 110 IRE, it's 110% of full scale.

However, it's also true that $20 is the exact same brightness, also 110 IRE.

(Via the RF modulator and the antenna input on a US TV set, they'll turn out even brighter: the NES's sync depth is -37 IRE instead of the nominal -40, and US black level is 7.5 IRE, so both palette entries will be normalized up to ≈121% of full scale.)
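The normalization described above can be sketched as a back-of-envelope calculation. This is a simplified model, not an exact description of any TV's AGC: it assumes the set scales the whole signal so the measured sync depth matches the nominal 40 IRE, then reads brightness relative to the US black level of 7.5 IRE. Under those assumptions it lands near the ≈121% figure quoted.

```python
# Simplified model of how a US TV would normalize the NES's 110 IRE peak.
# Assumptions (not measured here): the TV's gain control scales the signal
# by nominal_sync / actual_sync, and "full scale" means black (7.5 IRE)
# to white (100 IRE) after NTSC setup.
NES_PEAK_IRE = 110.0   # $20 / $30 level, relative to NES blanking
NES_SYNC_DEPTH = 37.0  # NES sync tip sits at -37 IRE, not -40
NOMINAL_SYNC = 40.0
US_BLACK_IRE = 7.5     # NTSC setup: black at 7.5 IRE
US_WHITE_IRE = 100.0

# Scale the whole signal up so sync depth looks nominal.
scaled_peak = NES_PEAK_IRE * (NOMINAL_SYNC / NES_SYNC_DEPTH)

# Express the result as a fraction of the black-to-white range.
fraction = (scaled_peak - US_BLACK_IRE) / (US_WHITE_IRE - US_BLACK_IRE)
print(f"peak is {fraction:.0%} of full scale")  # ~120% under this model
```

The exact percentage depends on how a given set's AGC and clamping actually behave, so treat this as illustrating the direction and rough magnitude of the effect, not a precise figure.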

Author:  hawken [ Thu Mar 09, 2017 8:16 pm ]
Post subject:  Re: PAL / SECAM gamut palette test

rainwarrior wrote:
hawken wrote:
I was reading a bit about what "official" games were allowed to use, and Jaws is a pretty rare case. Apparently developers were asked not to use $30 because it caused problems with some CRT sets, as it was "whiter than white" - indeed, games would be rejected by Nintendo if they used $30. The same went for $0D, as it was "blacker than black" (not sure what damage that could do, though). The $E and $F columns were not allowed either.

Having trouble converting this for FCEUX as it requires 8-bit colour depth, which this is not. The quest ends here ;)

I don't know what your sources are for this but they're completely bogus. (If you'd care to share the source, I'm sure there's people here who wouldn't mind correcting them.)

Again, $30 is the same as $20 by design, so this "whiter than white" idea makes no sense.

Super Mario Bros. uses $0F for black, and $30 for white. (So do a lot of games.) I'm certain that $0E is acceptable too, but I don't have a common example offhand.

I mentioned Jaws only as a weird case that I happened to notice uses both $20 and $30 on the same screen.

Nintendo didn't really have a way to test for and reject $0D either. For example, the common TMNT uses $0D, and it made it through all licensing tests and was one of the highest-selling games for the system. It doesn't use large amounts of it on the screen, just for sprite outlines, so I don't think it tends to cause the common desync problems, but my point is I don't believe they did any sort of categorical test for the use of $0D.



May have got muddled a bit in my own limited memory but I read it here:

https://www.gamedev.net/topic/671394-re ... mitations/

Quote:
Now as it so happens, the "high" level for rows 2 AND 3 is exactly the same, and it's white. Actually, it's BRIGHTER than white, which is why patches of white often cause minor distortions on CRTs. Similarly, the "low" level for row 0 is "blacker than black" and thus can confuse the sync detection circuits on some CRTs and so should never be used. In fact, officially licensed games NEVER used column D, and you probably shouldn't use them either.
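For reference, the "rows" and "columns" that quote refers to come straight from how an NES palette index is laid out: the high nibble (0-3) selects the luma row and the low nibble (0-F) selects the hue column, so "column D" means $0D, $1D, $2D, $3D. A minimal sketch of that decoding (the function name is my own, not from any NES SDK):

```python
# Decode an NES palette index $RC into (luma row, hue column).
# Row = high nibble (only 0-3 are meaningful), column = low nibble.
def palette_row_col(index):
    return (index >> 4) & 0x3, index & 0xF

# $30 is row 3, column 0 (the "whiter than white" entry discussed above);
# $0D is row 0, column $D (the "blacker than black" entry).
print(palette_row_col(0x30))  # (3, 0)
print(palette_row_col(0x0D))  # (0, 13)
```

This is just index arithmetic; whether any given row/column combination is actually problematic on real hardware is exactly what the thread is debating.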


It would be great to hear from devs from back in the day whether Nintendo had any official restrictions.

Author:  lidnariq [ Thu Mar 09, 2017 8:49 pm ]
Post subject:  Re: PAL / SECAM gamut palette test

It's moderately well attested that Nintendo's "SDK" for the NES was a small pile of badly-translated Japanese documents, just barely enough information to get started.

I really doubt there was any admonition at the time against using the column $D colors ... or really any prohibitions at all.

We already have a page on the wiki of games that use color $0D, many of which are licensed.
