Revisiting the NES Palette Emulation

Discuss emulation of the Nintendo Entertainment System and Famicom.

anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Revisiting the NES Palette Emulation

Post by anikom15 »

I was playing Mother the other day when it dawned on me that the full range of the NES's video signals could be taken advantage of on modern TVs and monitors that support wide gamuts. I spent some time searching for the raw YIQ values of the NES video system but was unable to find them. Long ago, some measurements off the PPU were used to come up with the video emulation for Nestopia, and that default palette would end up in most of the other emulators (maybe slightly modified).

I wasn't able to find YIQ or YUV values but I did find the voltage levels off of the nesdev wiki, so I created a mock-up of a raw PPU signal and drove it into my CRT tools. This is what I ended up with after a great deal of tinkering with the filters:

Image

Ignore the fact that the colors are on there twice. It's just that I had to generate a complete frame, i.e. two fields.

It seems to match up with the puNES NTSC palette (that's what I'm using these days, sue me), so I guess it's okay. But I had to offset the colorburst generator by 3.5 clock cycles and I'm not sure why. Also, the output of the PPU is quite sharp and needed some aggressive filtering; I imagine an actual NES has some filtering that no one's bothered to quantify, in addition to the TV's video filters. If you look closely you can see the subcarrier isn't fully suppressed, but I was tired of making filters. I normalized the output against the colorburst level, otherwise everything's just too damn bright. I also color-corrected it for sRGB screens (which slightly reduces the magenta).
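In case anyone wants to reproduce the mock-up, the raw signal generation itself is the easy part; roughly this (the levels are the terminated-output figures I remember from the nesdev wiki, no-emphasis case, so treat them as approximate and double-check against the wiki):

Code: Select all

import numpy as np

# Approximate composite levels in volts, indexed by [luma] -> (low, high).
# Figures recalled from the nesdev wiki (no emphasis); verify before relying on them.
LEVELS = [
    (0.228, 0.616),   # luma 0
    (0.312, 0.840),   # luma 1
    (0.552, 1.100),   # luma 2
    (0.880, 1.100),   # luma 3
]
BLACK = 0.312         # blanking/black level

def in_color_phase(hue, phase):
    # The 2C02's chroma square wave is high for 6 of the 12 phases of a color clock.
    return (hue + phase) % 12 < 6

def ppu_sample(index, phase):
    # One sample (1/12 of a color clock) of the raw signal for a palette entry.
    luma = (index >> 4) & 3
    hue = index & 0x0F
    low, high = LEVELS[luma]
    if hue == 0x00:    # hue 0 is always at the high level (the greys)
        return high
    if hue == 0x0D:    # hue 13 is always at the low level
        return low
    if hue >= 0x0E:    # hues 14-15 are forced black
        return BLACK
    return high if in_color_phase(hue, phase) else low

# Eight color clocks' worth of the square wave for entry $16:
wave = np.array([ppu_sample(0x16, p) for p in range(12 * 8)])

Feed that into whatever comb/notch filtering you like and decode as usual.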

Well anyway the purpose of this was to propose a way to potentially get better color on displays that are capable of it. I haven't actually measured, but it seems that the output falls outside the TV gamut. What that means is some of the color components are negative or greater than 1. The TVs of yore also used very slightly different color primaries. The primaries correspond to Rec. 601 almost exactly. HDTV and sRGB use Rec. 709 primaries (these primaries were developed as a compromise between phosphors used in NTSC TVs and phosphors used in PAL and SECAM TVs [those have their own primaries]). You can convert, but you'll lose some precision. It hardly makes a difference however. One thing I want to clear up is that TVs in the 80s didn't use NTSC primaries (i.e. the primaries listed in the FCC standard). In fact, hardly any TVs used those primaries. The 'NTSC' gamut is a complete waste of time and you should never think about it.

Anyway, with this generated palette I can sample the colors and create some matrices/palettes/whatever for BT.2020, DCI, Adobe RGB, etc. I'm not exactly sure how it will turn out. The correct method would be to clamp to the Rec. 601 gamut, so we should get an almost identical picture to what we see here, but it will be possible to see how the image looks unclamped as well.
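For the matrices, the plan is just the usual chromaticity math: build RGB→XYZ for each space and chain them through XYZ. Something like this sketch (the chromaticities are the commonly quoted ones, so verify them before trusting the numbers; the same helper works for BT.2020, Adobe RGB, etc. by swapping in their primaries):

Code: Select all

import numpy as np

def xy_to_XYZ(x, y):
    # Chromaticity (x, y) to XYZ with Y = 1.
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(primaries, white):
    # Linear RGB -> XYZ matrix for the given R/G/B chromaticities and white point.
    M = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries])
    S = np.linalg.solve(M, xy_to_XYZ(*white))   # scale primaries so they sum to white
    return M * S

SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # also the sRGB primaries
D65     = (0.3127, 0.3290)

# Linear-light conversion from SMPTE-C-ish RGB to sRGB/Rec.709 RGB (same white point):
M = np.linalg.inv(rgb_to_xyz_matrix(REC709, D65)) @ rgb_to_xyz_matrix(SMPTE_C, D65)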
NewRisingSun
Posts: 1510
Joined: Thu May 19, 2005 11:30 am

Re: Revisiting the NES Palette Emulation

Post by NewRisingSun »

anikom15 wrote:The TVs of yore also used very slightly different color primaries. The primaries correspond to Rec. 601 almost exactly.
Actual receiver phosphors of the 1980s were very different from what you call "Rec. 601" primaries [1], also see Note 1.
anikom15 wrote:One thing I want to clear up is that TVs in the 80s didn't use NTSC primaries (i.e. the primaries listed in the FCC standard). In fact, hardly any TVs used those primaries. The 'NTSC' gamut is a complete waste of time and you should never think about it.
1980s television receivers did not "use" NTSC primaries in the sense that their own phosphors' primaries resembled the 1953 primaries [1]. But 1960-1990s television receivers did "use" the 1953 primaries in the sense that they assumed that the NTSC picture signal was encoded for them [2], and applied a conversion matrix [3] to make the colors look on the 1960s-1990s receiver phosphors as they would have looked on a 1953 NTSC receiver, to the extent that this was possible. It would not be before 1992's publication of SMPTE-170M that receivers would assume that the signal was encoded for SMPTE "C" primaries, and that change was never followed by Japanese receiver manufacturers for SDTV.

So yes, you definitely need to think about the 'NTSC' gamut.

The recent availability of wide-gamut displays is irrelevant for the NES palette, as wide-gamut displays were not available in the NES' commercial era, and so no NES graphics artist would have expected his graphics to be viewed with "the full range of the NES's video signals". You can do such a thing as an experiment, but it will not be useful for actual NES games.
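Going back to the correction matrix [3]: its colorimetric core is just a change of basis through XYZ, roughly as below. This sketch ignores the decoder's actual I/Q gain-and-phase implementation and receiver white-point differences, and the chromaticities are the commonly quoted ones:

Code: Select all

import numpy as np

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz(primaries, white):
    M = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries])
    return M * np.linalg.solve(M, xy_to_XYZ(*white))

NTSC_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]        # FCC/1953 primaries
SMPTE_C   = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]  # 1980s-ish phosphors
ILLUM_C   = (0.310, 0.316)   # both sides referenced to Illuminant C for simplicity

# Linear-light matrix a receiver with SMPTE-C-like phosphors applies so that a
# signal encoded for the 1953 primaries looks (gamut permitting) as intended:
correction = np.linalg.inv(rgb_to_xyz(SMPTE_C, ILLUM_C)) @ rgb_to_xyz(NTSC_1953, ILLUM_C)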

Notes:
  1. Strictly speaking, there is no such thing as "Rec. 601 primaries". What people sometimes call "Rec. 601 primaries" either refers to the coefficients used to sum up the luminance component E'Y from E'R/E'G/E'B (i.e. 0.299/0.587/0.114) which is not "colorimetry" but just an encoding formula, or to the SMPTE "C" primaries that were only added in Revision -6 or -7 of the Rec. 601 standard and are just informatively reciting the "Chromaticity coordinates specified [as] those currently used by 625-line and 525-line conventional systems" [4].
[1] DeMarsh, Leroy (1993): TV Display Phosphors/Primaries -- Some History. SMPTE Journal, December 1993, pg. 1095-1098.
[2] Rec. ITU-R BT.470-6 (1998), pg. 35.
[3] Parker, N.W. (1966): An analysis of the necessary receiver decoder corrections for color receiver operation with nonstandard primaries. IEEE Transactions on Broadcast and Television Receivers 12 (1), 23-32.
[4] Rec. ITU-R BT.601-7 (2011), pg 6.
anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Re: Revisiting the NES Palette Emulation

Post by anikom15 »

Thanks for your feedback, NewRisingSun. It's a pleasure knowing you're still alive. Indeed, I thought the color correction matrices had been abandoned far earlier, but it appears they were in use on the broadcast end at least until 1987.

So here's the result after calculating the colors using the gain and phase angle specified at the end of Parker. The colorspace is defined by primaries very similar to SMPTE C, but slightly different, and with a white point of C. The image has been corrected for sRGB.

Image

I'll have to try it with SMPTE C and D65 to see how it looks (quite a lot of math involved), but it's rather different. I didn't normalize on the colorburst this time. I'm sure some of these colors are out of gamut for sRGB, so could be taken advantage of with a wide gamut display. While SMPTE-170M didn't come out until the 90's, SMPTE did recommend that the color correction matrices no longer be used in 1987. This likely represented a practice that was already taking place. I know in the mid-80s everything rapidly went from analog to digital, and I imagine this spilled over into the video world as well. It would be interesting to know if the change is related to this.

Apparently Japan never used correction circuits. However, the monitors would have used the new primaries anyway, so without the correction circuits things would appear as SMPTE C, but with a 9300K white point. One question, however: does the Famicom's output differ from the NES's, aside from the RF path? The Famicom didn't have composite out, so it may be hard to quantify, I suppose.
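For the white-point part of that comparison, a Bradford-style adaptation through XYZ is the standard way I'd move between Illuminant C (or a 9300K-ish white) and D65. A rough sketch using the usual textbook Bradford matrix (swap in the 9300K chromaticity if that's the target):

Code: Select all

import numpy as np

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def adaptation_matrix(white_src, white_dst):
    # XYZ -> XYZ matrix adapting colors from white_src to white_dst (Bradford).
    lms_src = BRADFORD @ xy_to_XYZ(*white_src)
    lms_dst = BRADFORD @ xy_to_XYZ(*white_dst)
    return np.linalg.inv(BRADFORD) @ np.diag(lms_dst / lms_src) @ BRADFORD

ILLUM_C = (0.31006, 0.31616)
D65     = (0.31271, 0.32902)
M_C_to_D65 = adaptation_matrix(ILLUM_C, D65)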
lidnariq
Posts: 11429
Joined: Sun Apr 13, 2008 11:12 am

Re: Revisiting the NES Palette Emulation

Post by lidnariq »

RF path necessarily requires a normalization stage, possibly on sync depth or colorburst magnitude or both. Composite input doesn't necessarily, although NewRisingSun has reported that he's worked with at least one TV that normalizes both via composite.

RF path tentatively seems to have a bigger effect on audio on the NES/FC than anything else.

The other huge problem we've discovered is that the output of the 2C02 is not ideal, and there's a dramatic change in hue angle as things get brighter.
anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Re: Revisiting the NES Palette Emulation

Post by anikom15 »

The RF signal is normalized by comparing the sync with the maximum modulation depth (it's defined to be somewhere on the order of 80%–90%). This is a benefit of negative modulation. Since the composite signal isn't modulated, there's no defined way to normalize it. That said, I've seen actual AGC ICs that normalize the signal using the sync, while others use the colorburst. The engineer in me would pick the colorburst since it's an AC signal: its amplitude won't be affected so much by losses.

Has anyone considered the age of the device to be a culprit of the phase shifts on the PPU? I haven't looked into the matter closely, but it's something that immediately came to mind.
lidnariq
Posts: 11429
Joined: Sun Apr 13, 2008 11:12 am

Re: Revisiting the NES Palette Emulation

Post by lidnariq »

anikom15 wrote: Tue Dec 01, 2020 3:17 am Has anyone considered the age of the device to be a culprit of the phase shifts on the PPU? I haven't looked into the matter closely, but it's something that immediately came to mind.
The output phase delay appears to be an artifact of how the 2C02's video DAC works. Namely, it's a long, winding on-die resistor with a set of analog multiplexers to pick one tap. (Ignoring emphasis; emphasis seems to change the voltage supply to the resistor, and it's much slower.)

So the output impedance will vary depending on the current tap, and a parasitic capacitance will turn the whole output stage into a digitally controlled delay (apparently on the order of 12 ns).

I don't think these on-die resistors will change much as they age? I really don't know, it's been far too long since I learned things about IC fabrication.
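For scale, the hue impact of that delay is easy to estimate against the subcarrier (treating 12 ns as a ballpark, not a measurement):

Code: Select all

F_SC = 315e6 / 88            # NTSC color subcarrier, ~3.579545 MHz
DELAY = 12e-9                # ballpark tap-dependent delay
print(360 * F_SC * DELAY)    # roughly 15.5 degrees of subcarrier phase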
anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Re: Revisiting the NES Palette Emulation

Post by anikom15 »

Okay, based on that description I believe it is intrinsic to the physical design and not a result of age. Age could still influence the specific phase change slightly, but the bulk of it is due to the differing impedance dependent on the multiplexer selection.

I'm guessing design wise that the shift was not strong enough to create enough of a hue difference to matter. Also, at the brightest levels, there is very little saturation in final RGB space, regardless of the IQ values. On the simulation side I suppose all we can do (aside from modeling the analog path directly) is a best effort approximation.

I think over the past decade we've seen an increasing interest in the analog side of hardware that was largely ignored in the 90s and 2000s.
lidnariq
Posts: 11429
Joined: Sun Apr 13, 2008 11:12 am

Re: Revisiting the NES Palette Emulation

Post by lidnariq »

anikom15 wrote: Thu Dec 03, 2020 5:53 pm I'm guessing design wise that the shift was not strong enough to create enough of a hue difference to matter.
In practice, there's about 15 degrees of difference between the $3x colors and the $0x colors. This is less than the 30 degrees quanta in the digital abstraction, and maybe that's enough.
Also, at the brightest levels, there is very little saturation in final RGB space, regardless of the IQ values.
I assume you're referring to colors being out of gamut?
anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Re: Revisiting the NES Palette Emulation

Post by anikom15 »

lidnariq wrote: Thu Dec 03, 2020 6:12 pm
anikom15 wrote: Thu Dec 03, 2020 5:53 pm I'm guessing design wise that the shift was not strong enough to create enough of a hue difference to matter.
In practice, there's about 15 degrees of difference between the $3x colors and the $0x colors. This is less than the 30 degrees quanta in the digital abstraction, and maybe that's enough.
Also, at the brightest levels, there is very little saturation in final RGB space, regardless of the IQ values.
I assume you're referring to colors being out of gamut?
There are two aspects. I'd have to look at the raw numbers, but the RGB space defines a sort of double-pyramid-ish shape in Y'IQ: at the luminance extrema, the IQ range that falls within the RGB gamut is narrower, i.e. the maximum chroma amplitude is lower. Due to the unequal weights of each primary's contribution to luminance (green is much brighter than blue), a phase shift can push a color from in-gamut to out-of-gamut. If a color goes from in-gamut to out-of-gamut, it may not change the perceptible color. Likewise, if a color goes from out-of-gamut to out-of-gamut, then it is even less likely to have a perceptible change, esp. if the luminance doesn't change. (Whether it is perceptible is a function of how far from the gamut the point is and how much precision the colorspace has, in most cases 8 bits.) The other aspect is human perception of color. Our eyes are best able to distinguish hue at high saturation. At low saturation, which necessarily occurs when the luminance is low or high, our eyes are less able to distinguish hue. So some phase change may have an effect at moderate luminance but not high luminance.
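To make the double-pyramid point concrete, here's a rough sketch that finds the largest chroma amplitude that still decodes to in-range R'G'B' at a given Y' (the Y'IQ coefficients are the commonly quoted textbook ones, not anything measured off hardware):

Code: Select all

import numpy as np

# Commonly quoted Y'IQ -> R'G'B' matrix; exact coefficients vary slightly by source.
YIQ_TO_RGB = np.array([[1.0,  0.956,  0.621],
                       [1.0, -0.272, -0.647],
                       [1.0, -1.106,  1.703]])

def max_in_gamut_chroma(y, hue_deg, steps=1000):
    # Largest chroma amplitude at this Y' and hue for which R'G'B' stays in [0, 1].
    i, q = np.cos(np.radians(hue_deg)), np.sin(np.radians(hue_deg))
    best = 0.0
    for a in np.linspace(0.0, 1.0, steps):
        rgb = YIQ_TO_RGB @ np.array([y, a * i, a * q])
        if rgb.min() < 0.0 or rgb.max() > 1.0:
            break
        best = a
    return best

# The usable chroma range shrinks towards both ends of the Y' axis:
for y in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(y, round(max_in_gamut_chroma(y, 123.0), 3))   # 123 degrees is an arbitrary hue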

There is one more thing, I suppose, which is that our perception of hue also changes with perceived brightness. But this also has to do with the viewing environment.

It would be fairly trivial to add phase noise to the IQ data for the last two rows to see how it affects the palette visually in real-time. This could be done by modifying one of the NTSC shaders in an emulator.

P.S. I should add that by 'enough of a hue difference to matter', I mean that to the developers, appearance-wise, the hue spacing looked linear enough. For PC systems in the early 80s, perfectly linear hue response was absolutely not a design goal.
lidnariq
Posts: 11429
Joined: Sun Apr 13, 2008 11:12 am

Re: Revisiting the NES Palette Emulation

Post by lidnariq »

anikom15 wrote: Thu Dec 03, 2020 9:00 pm If a color goes from in-gamut to out-of-gamut, it may not change the perceptible color. Likewise, if a color goes from out-of-gamut to out-of-gamut, then it is even less likely to have a perceptible change, esp. if the luminance doesn't change.
My understanding is that behavior on out-of-gamut colors doesn't necessarily just clamp the output to the guns, but may instead trigger some more complicated behavior. Try comparing the output of Drag's generator with some of the hand-tuned perceptual palettes (e.g. FirebrandX's).
anikom15 wrote: Thu Dec 03, 2020 9:00 pm Our eyes are best able to distinguish hue at high saturation. At low saturation, which necessarily occurs when the luminance is low or high, our eyes are less able to distinguish hue.
You're using the HSL definition of saturation to say that dark colors are desaturated, but I don't think our eyes work that way; talking about these colors in La*b* shows that we're still very sensitive to blue-vs-not even then, although our ability to distinguish red-vs-green is indeed diminished.

Also, the NES's $0x dark colors have a high chrominance magnitude and are quite saturated despite their low luminosity, and many of them are extremely out of gamut.
It would be fairly trivial to add phase noise to the IQ data for the last two rows to see how it affects the palette visually in real-time. This could be done by modifying one of the NTSC shaders in an emulator.
I mean, just assuming that each row should have its raw UV angle rotated by 5 degrees more than the first seems like a reasonable approximation. I personally find this perceptible, primarily because $38 is actually fairly yellow-ish, and $08 is too green.
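Something like this per-row skew on whatever Y'IQ (or Y'UV) palette data the shader already has is all I mean (the 5 degrees per row and its sign are a guess to be tuned against measurements):

Code: Select all

import numpy as np

def rotate_chroma(i, q, degrees):
    # Rotate the chroma vector (I, Q) by the given angle.
    a = np.radians(degrees)
    return i * np.cos(a) - q * np.sin(a), i * np.sin(a) + q * np.cos(a)

def apply_row_skew(palette_yiq, degrees_per_row=5.0):
    # palette_yiq: dict of palette index -> (Y, I, Q). Returns a skewed copy
    # where each brighter row ($1x, $2x, $3x) is rotated a bit further.
    out = {}
    for index, (y, i, q) in palette_yiq.items():
        row = (index >> 4) & 3
        out[index] = (y, *rotate_chroma(i, q, degrees_per_row * row))
    return out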
anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Re: Revisiting the NES Palette Emulation

Post by anikom15 »

lidnariq wrote: Fri Dec 04, 2020 12:13 am
anikom15 wrote: Thu Dec 03, 2020 9:00 pm If a color goes from in-gamut to out-of-gamut, it may not change the perceptible color. Likewise, if a color goes from out-of-gamut to out-of-gamut, then it is even less likely to have a perceptible change, esp. if the luminance doesn't change.
My understanding is that behavior on out-of-gamut colors doesn't necessarily just clamp the output to the guns, but may instead trigger some more complicated behavior. Try comparing the output of Drag's generator with some of the hand-tuned perceptual palettes (e.g. FirebrandX's).
There are a number of techniques to limit the range of the television. Once the Y'IQ components are separated from the composite signal, it's recommended to limit in that domain before decoding to RGB, because chroma compression won't shift hue, whereas compressing the RGB signals unevenly will. The Y' signal is essentially clipped, though some TVs allow whiter-than-white. The I and Q signals are reduced depending on the Y' value: the more extreme the Y' (close to 0 or 100 IRE), the less allowable range I and Q have. After decoding, amplifiers drive the R, G, and B guns, and these compress if driven too hard. There is a bit of a curve to that compression; it wouldn't be a hard clip. I would guess that the Y'IQ limiter would be a soft compressor as well.

The best way to implement the clamping would be to limit Y'IQ such that I and Q have minimum phase error through Y'IQ -> RGB -> Y'IQ, Y' is preserved, and no values of R, G, and B fall outside the allowable range. If we assume that a well-behaved television works in this manner, we can reasonably expect that the clamping won't result in a hue shift, but rather a desaturation (so desaturation should be the 'correct' method). I haven't actually solved the optimization yet, so I can't prove desaturation is the solution; it's on my to-do list. It's tricky because R, G, and B contribute to Y' differently, and I and Q (or Pb, Pr) do not have equal axes. I'm pretty sure it's solvable, though, as it's all still linear.
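The hard-scale version of that desaturation is easy to sketch, since R'G'B' is affine in I and Q at a fixed Y'. A real set would presumably use a soft knee instead of this hard scale, and the coefficients here are again the textbook ones:

Code: Select all

import numpy as np

YIQ_TO_RGB = np.array([[1.0,  0.956,  0.621],
                       [1.0, -0.272, -0.647],
                       [1.0, -1.106,  1.703]])

def clamp_by_desaturation(y, i, q):
    # Scale chroma towards grey, preserving Y' and hue, until R'G'B' fits in [0, 1].
    y = min(max(y, 0.0), 1.0)
    k = 1.0
    for row in YIQ_TO_RGB:
        d = row[1] * i + row[2] * q        # this channel's chroma contribution
        if d > 0.0 and y + d > 1.0:
            k = min(k, (1.0 - y) / d)
        elif d < 0.0 and y + d < 0.0:
            k = min(k, y / -d)
    return YIQ_TO_RGB @ np.array([y, k * i, k * q])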
anikom15 wrote: Thu Dec 03, 2020 9:00 pm Our eyes are best able to distinguish hue at high saturation. At low saturation, which necessarily occurs when the luminance is low or high, our eyes are less able to distinguish hue.
You're using the HSL definition of saturation to say that dark colors are desaturated, but I don't think our eyes work that way; talking about these colors in La*b* shows that we're still very sensitive to blue-vs-not even then, although our ability to distinguish red-vs-green is indeed diminished.

Also, the NES's $0x dark colors have a high chrominance magnitude and are quite saturated despite their low luminosity, and many of them are extremely out of gamut.
When I say luminance, I'm talking about Y'. Perceived brightness is the term I have used for visual brightness. I am not saying dark colors are necessarily desaturated. I am saying that it is more difficult to distinguish desaturated hues from other desaturated hues than it is to distinguish saturated hues from other saturated hues. This is independent of perceived brightness (as long as illumination is reasonable). However, since extreme values of Y' necessarily result in desaturation, extreme values of Y' will be less affected by shifts in phase at the perception level.

I hope that clarifies things.
lidnariq
Posts: 11429
Joined: Sun Apr 13, 2008 11:12 am

Re: Revisiting the NES Palette Emulation

Post by lidnariq »

anikom15 wrote: Fri Dec 04, 2020 3:34 am I hope that clarifies things.
I ... maybe? I'm not certain I follow. Or at least, I think I understand the words you're saying, but I don't think what evidence we have is consistent with it?

If you take FirebrandX's palettes and plot them in any space where you can see the PPU-induced hue rotation, there's generally a rotation in the green-towards-red direction as colors get brighter, but as things clip against the RGB or YUV boundaries, those clipping behaviors seem to be what causes the hue angle to trend the other way or be non-monotonic:
[Attached plot: PVM Style D93 (FBX).hsl.png — X = hue (0=255=red, 85=green, 170=blue), Y = luminosity, color = saturation]
... although I guess your point is that those would clip to that same hue angle regardless of the underlying hue rotation. I have no idea if this graph gives any data either way...
anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Re: Revisiting the NES Palette Emulation

Post by anikom15 »

What HSL translation is used for that graph? The best transform would be to have the Y coefficients be correct for the given colorspace. That should correct for the yellow and blue brightness offsets. I have a tool that can calculate the coefficients for any RGB space.
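For anyone curious, the luma weights for a given RGB space are just the middle (Y) row of its RGB→XYZ matrix, which is roughly all my tool does (chromaticities here are the commonly quoted ones):

Code: Select all

import numpy as np

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def luma_coefficients(primaries, white):
    # Y row of the linear RGB -> XYZ matrix: the luma weights appropriate
    # to a display with these primaries and this white point.
    M = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries])
    S = np.linalg.solve(M, xy_to_XYZ(*white))
    return (M * S)[1]

SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
D65     = (0.3127, 0.3290)
print(luma_coefficients(SMPTE_C, D65))   # roughly 0.212, 0.701, 0.087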
lidnariq
Posts: 11429
Joined: Sun Apr 13, 2008 11:12 am

Re: Revisiting the NES Palette Emulation

Post by lidnariq »

Just gimp's. I'm pretty certain that the luminosity variation you're seeing was deliberately encoded by FirebrandX to match what he saw, and not a flaw in gimp. I see the same non-uniformities in other destination colorspaces too (e.g. YCrCb and La*b*, although for these I have to plot them in 3d to visualize trends)

I'm specifically mostly playing with his "PVM Style D93" one, which I've converted into a 1x64 png here:
[Attachment: PVM Style D93 (FBX).pal.png]
anikom15
Posts: 22
Joined: Mon Nov 30, 2020 2:41 am

Re: Revisiting the NES Palette Emulation

Post by anikom15 »

Well yes, the luminosity coefficients for YCbCr and Lab are completely different from the Y in YIQ and YUV, and those are different still for D93. It’s not so much ‘deliberately encoded’ as it is simply a consequence of differences in how we convert to luminance. If we use coefficients for SMPTE C and D93, I expect the result to be flatter.

I can probably generate some similar graphs with various scenarios. We should be able to directly compare an idealized YPbPr model with what we’ve created with all our estimated models and palettes. That could potentially shed some light on what’s going on.