Emulator palettes vs real NTSC TVs

A place for your artistic side. Discuss techniques and tools for pixel art on the NES, GBC, or similar platforms.


tepples
Posts: 22708
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)

Re: Emulator palettes vs real NTSC TVs

Post by tepples »

Is the master clock input exactly 50% duty? If not, that might mess with the phase generator, causing the odd phases ($9, $B, $1, $3, $5, $7) to be offset.
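For what it's worth, here's a rough sketch of how much a non-50% duty cycle could skew the odd phases, under the assumption (mine, not verified against the silicon) that the odd phases are clocked on the falling edge of the master clock. The 12 hue phases are spaced 30° apart, i.e. half a master-clock period, and one chroma cycle spans six master clocks, so a falling edge at d·T instead of 0.5·T shifts the odd phases by (d − 0.5)·60° of chroma phase:

Code: Select all

# Sketch: how much the odd phases would be skewed if they're clocked on the
# falling edge of the master clock (an assumption, not verified hardware
# behavior).  Hue phases are 30 deg apart = half a master-clock period,
# and one chroma cycle spans 6 master clocks.

def odd_phase_offset_deg(duty_cycle):
    """Offset of the odd phases ($1,$3,$5,$7,$9,$B), in degrees of chroma
    phase, for a given master-clock duty cycle."""
    # Falling edge lands at duty_cycle*T instead of the ideal 0.5*T;
    # one chroma cycle = 6 master-clock periods = 360 degrees.
    return (duty_cycle - 0.5) * 360.0 / 6.0

for duty in (0.45, 0.50, 0.55):
    print(f"duty {duty:.2f}: odd phases shifted {odd_phase_offset_deg(duty):+.1f} deg")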
Drag
Posts: 1615
Joined: Mon Sep 27, 2004 2:57 pm

Re: Emulator palettes vs real NTSC TVs

Post by Drag »

lidnariq wrote:
NewRisingSun wrote:Has it been confirmed that color burst is always at the same phase as color $8?
In the digital domain, it's definitely always phase $8, and there's no ability to specify anything other than some multiple of 30°.
I'll keep saying this until I understand it: if colorburst is phase 8, why doesn't the on-screen color have the greenish-yellow colorburst hue? It really looks like it should be color 9, not color 8. Every NTSC color generator matrix/formula I've tried, with NES parameters plugged in (including forcing color 8 to be the literal colorburst hue), just doesn't produce results that look like anything I've ever seen the NES look like. Even if I set color 9 to be the colorburst hue, it still doesn't look right unless I shift the hues slightly, which screws up other colors.

There are also things like gamma curves to worry about (R, G, and B each have their own independent gamma curves, but you'd never know if you asked anyone because everyone wants you to apply gamma only to the luminance, which doesn't produce the right colors, especially in the darker $0x-$1x range of the palette), and really, it doesn't seem like anyone's legitimately interested in a palette that resembles a physical CRT's color output. Moreover, what's the correct way to deal with out-of-gamut colors (which the NES is fond of producing, especially in the blues)?

So this is basically why I completely gave up on this. :P
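For reference, this is roughly the kind of textbook decode I mean, as a minimal sketch: demodulate into U/V against some reference angle, run a standard YUV-to-R'G'B' matrix, and only then apply a transfer function to each of R', G', B' separately. The matrix coefficients and the 2.2 exponent are generic NTSC/CRT values, not NES measurements, and the hue reference is left as a free parameter, which is exactly the part I can't pin down:

Code: Select all

import math

def decode_pixel(y, chroma_amp, chroma_phase_deg, crt_gamma=2.2):
    """Toy decode of one flat-colored pixel.
    y                -- luma, 0..1 relative to the white level
    chroma_amp       -- chroma amplitude on the same scale as y
    chroma_phase_deg -- hue angle in the U-V plane (a real decoder measures
                        this against the colorburst; that reference is the
                        part in question here)
    Returns R, G, B after a per-channel transfer function."""
    u = chroma_amp * math.cos(math.radians(chroma_phase_deg))
    v = chroma_amp * math.sin(math.radians(chroma_phase_deg))
    # Generic YUV -> R'G'B' matrix (BT.470-style coefficients).
    rp = y + 1.140 * v
    gp = y - 0.395 * u - 0.581 * v
    bp = y + 2.032 * u
    # Transfer function applied to each of R', G', B' individually,
    # not to Y alone.
    eotf = lambda c: max(c, 0.0) ** crt_gamma
    return eotf(rp), eotf(gp), eotf(bp)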
NewRisingSun
Posts: 1510
Joined: Thu May 19, 2005 11:30 am

Re: Emulator palettes vs real NTSC TVs

Post by NewRisingSun »

Drag wrote:If colorburst is phase 8, why does the on-screen color not have the greenish-yellow colorburst hue?
It has on the Sharp AN-500B Twin Famicom. The Sharp AN-505BK Twin Famicom looks even more greenish (positive hue shift, about +10°). I've never had a US NES, but all video captures seem to indicate a negative hue shift (about -10°), making on-screen color #8 look less greenish. All this on the same TV. On a different TV, the AN-500B looks just as greenish as the AN-505BK. The reason seems to be that the NES does not properly output a sine wave for the color burst but some sort of weird filtered triangle wave, which causes further variation across models and television sets.

The PAL NES on the other hand has rock-solid (i.e. constantly washed-out) colors across all sets I've seen.
Drag wrote:R, G, and B each have their own independent gamma curves, but you'd never know if you asked anyone because everyone wants you to apply gamma only to the luminance
There is not a single television standards document that ever calls for R/G/B having different electro-optical transfer characteristics.
Drag wrote:especially in the darker $0x-$1x range of the palette
Those colors are very much affected by how you assume your emulated TV handles non-standard video levels, in particular the NES' too-small sync amplitude. Also try both 0% and 7.5% black-level setup. These are more likely sources of variation than putative gamma curves.

Consider the following ways of interpreting the NES' video signal, all of them valid methods found in actual TV capture cards and television sets; see the attached pictures. I could post another set of three pictures for NTSC-J with 0% black-level setup. (All pictures use a -13° hue shift from color 8 and no NTSC-to-sRGB color correction.)
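In rough pseudocode terms, the three variants amount to something like this (a sketch with hypothetical helper names, not my actual code; the 1/140 V-per-IRE constant is just the nominal 1 V p-p sync-to-white scale):

Code: Select all

IRE_VOLT = 1.0 / 140.0      # 1 IRE = 1/140 V in a nominal 1 Vpp composite signal
SETUP_IRE = 7.5             # U.S. black-level setup

def to_ire(v, v_blank, v_sync, method):
    """Convert a measured voltage to IRE under three interpretations."""
    if method == "blank_ref_scaled":
        # DC reference = blanking level; derive the IRE scale from the
        # console's own sync amplitude vs. the theoretical 40 IRE.
        return (v - v_blank) / ((v_blank - v_sync) / 40.0)
    if method == "blank_ref_asis":
        # DC reference = blanking level; trust the absolute voltages.
        return (v - v_blank) / IRE_VOLT
    if method == "sync_ref_asis":
        # DC reference = sync tip pinned at -40 IRE; trust absolute voltages.
        return (v - v_sync) / IRE_VOLT - 40.0
    raise ValueError(method)

def normalize(ire):
    """Map IRE to a 0..1 picture level with 7.5 IRE black setup."""
    return (ire - SETUP_IRE) / (100.0 - SETUP_IRE)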
Drag wrote:it doesn't seem like anyone's legitimately interested in a palette that resembles a physical CRT's color output.
That's what I've been doing for the past 8 years or so...
Drag wrote:Moreover, what's the correct way to deal with out-of-gamut colors (which the NES is fond of producing, especially in the blues)
No television standards document mentions a correct way of doing so; they all assume that with proper receiver adjustment, you just get back the same R/G/B values that originated in the hypothetical television camera. You can either just clip at 0 and 255, or reduce that color's saturation until no channel clips.
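As a sketch, those two options could look like this (hypothetical helpers; the desaturation variant pulls the color toward its own luma until every channel fits):

Code: Select all

def clip_rgb(r, g, b):
    """Option 1: clip each channel to the displayable range."""
    return tuple(min(max(c, 0.0), 1.0) for c in (r, g, b))

def desaturate_into_gamut(y, r, g, b):
    """Option 2: pull the color toward its own luma y until no channel clips.
    y is the luma the color was decoded with; r, g, b may lie outside 0..1."""
    k = 1.0                                   # fraction of the chroma to keep
    for c in (r, g, b):
        if c > 1.0:
            k = min(k, (1.0 - y) / (c - y))   # shrink until this channel hits 1
        elif c < 0.0:
            k = min(k, (0.0 - y) / (c - y))   # shrink until this channel hits 0
    return tuple(y + k * (c - y) for c in (r, g, b))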
Attachments
U.S. NTSC (7.5% black-level setup), use blanking level for DC removal, scale amplitudes by comparing sync amplitude to theoretical value
U.S. NTSC (7.5% black-level setup), use blanking level for DC removal, take amplitudes as they are
U.S. NTSC (7.5% black-level setup), use sync tip for DC removal, take amplitudes as they are
Last edited by NewRisingSun on Sat Aug 24, 2013 1:00 pm, edited 1 time in total.
tepples
Posts: 22708
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)

Re: Emulator palettes vs real NTSC TVs

Post by tepples »

NewRisingSun wrote:
Drag wrote:R, G, and B each have their own independent gamma curves, but you'd never know if you asked anyone because everyone wants you to apply gamma only to the luminance
There is not a single television standards document that ever calls for R/G/B having different electro-optical transfer characteristics.
Gamma correction applies only to an all-positive signal. The common practice in digital video editing in the YCbCr (YUV) domain is to gamma-correct only the Y channel. But in RGB, yes, you're supposed to gamma-correct all three.
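A quick way to see that the two aren't equivalent, as a sketch with a generic 2.2 exponent and generic YUV coefficients (nothing NES-specific here):

Code: Select all

GAMMA = 2.2                     # generic display exponent

def yuv_to_rgbp(y, u, v):
    """Generic YUV -> R'G'B' matrix (BT.470-style coefficients)."""
    return (y + 1.140 * v,
            y - 0.395 * u - 0.581 * v,
            y + 2.032 * u)

y, u, v = 0.4, 0.15, -0.10      # an arbitrary moderately saturated color
per_channel = [max(c, 0.0) ** GAMMA for c in yuv_to_rgbp(y, u, v)]
y_only      = yuv_to_rgbp(y ** GAMMA, u, v)
print("gamma on R'G'B':", [round(c, 3) for c in per_channel])
print("gamma on Y only:", [round(c, 3) for c in y_only])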

My guess for the phase error is slew rate. There's definitely a low-pass characteristic in the net output from the PPU. Some amplifier designs have a "slope overload" characteristic, which produces more phase delay at high amplitudes than at low amplitudes because the rate of change of the output voltage is limited. This is different from a voltage-dependent impedance: a voltage-dependent impedance would also affect the lower-frequency luma signal, while slew rate mostly affects chroma. There's a test for this: the chroma component of $08 or $38 has a larger amplitude than colorburst, and the chroma component of $18 or $28 has an even larger amplitude. If there's more phase delay on $18 and $28 than on $08 and $38, you've found your culprit.
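If someone wants to run that test on captured waveforms, a sketch of the measurement would be a single-bin DFT at the subcarrier frequency over a flat stretch of each color (hypothetical helper; it works best when the samples cover a whole number of subcarrier cycles):

Code: Select all

import cmath, math

def chroma_phase_and_amp(samples, sample_rate_hz, fsc_hz=3_579_545.0):
    """Single-bin DFT at the subcarrier frequency.  Returns (phase_deg,
    amplitude) of the chroma in the given samples.  Compare the phase of
    flat $08/$38 and $18/$28 areas against the burst: extra delay on the
    high-amplitude colors would point at amplitude-dependent phase shift."""
    acc = 0j
    for i, s in enumerate(samples):
        t = i / sample_rate_hz
        acc += s * cmath.exp(-2j * math.pi * fsc_hz * t)
    acc *= 2.0 / len(samples)
    return math.degrees(cmath.phase(acc)), abs(acc)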

And since 2006, there's been a standard for how to handle out-of-gamut colors: xvYCC.
NewRisingSun
Posts: 1510
Joined: Thu May 19, 2005 11:30 am

Re: Emulator palettes vs real NTSC TVs

Post by NewRisingSun »

tepples wrote:If there's more phase delay on $18 and $28 than on $08 and $38, you've found your culprit.
That's not what I've seen with the NES. And it doesn't explain why two consoles look different on one TV, but the same on another.

I agree, however, that NTSC is definitely susceptible to the slew rate phenomenon: contrary to what is typically claimed, the famous NTSC phase shift problem is not caused by strange things happening to terrestrial signals in the air, but by receiver equipment, and especially transmitters, having an amplitude-dependent phase shift. The first NTSC transmitters shifted the highest amplitudes 30 degrees further than the lowest amplitudes; adjusting the (amplitude-independent) hue control on the receiver could therefore only make either the bright or the dim colors look correct. Hence the popularity of PAL, whose patent specifically mentions its ability to perfectly correct this "differential phase error", as it's called.
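A toy demonstration of that cancellation with complex numbers: rotate both lines by the same differential phase error, un-switch V on the alternate line, and average. The hue comes back exactly; all that's left is a cos(error) loss of saturation:

Code: Select all

import cmath, math

def pal_average(u, v, phase_error_deg):
    """Toy model of PAL's line alternation cancelling a differential
    phase error."""
    rot = cmath.exp(1j * math.radians(phase_error_deg))
    line1 = complex(u,  v) * rot          # normal line, rotated by the error
    line2 = complex(u, -v) * rot          # V-switched line, same error
    line2 = line2.conjugate()             # receiver un-switches V
    avg = (line1 + line2) / 2
    return avg.real, avg.imag             # recovered U, V

print(pal_average(0.3, 0.4, 20.0))        # hue preserved, slight desaturation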

Another question, which tepples might be able to answer: SMPTE-170M states that the peak-to-peak color burst amplitude shall be 40 IRE. The NES outputs about 50 IRE (with a 75 Ω load), suggesting that saturation ought to be attenuated by 40/50. However, SMPTE-170M describes the color burst as a sine wave, whereas the NES outputs a square (or triangle?) wave. If the square-wave 3.58 MHz signal is filtered into a sine wave, its peak-to-peak amplitude would be larger, but by how much?
tepples
Posts: 22708
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)

Re: Emulator palettes vs real NTSC TVs

Post by tepples »

NewRisingSun wrote:If the square-wave 3.58 MHz signal is filtered into a sine-wave, its peak-to-peak amplitude would be larger, but by how much?
True, an ideal brick wall at 4.2 MHz will cause the sine wave to have a bigger peak-to-peak amplitude. But the real amplitude depends on how the filter's transfer function looks around 3.58 MHz (the fundamental) and 10.74 MHz (the third harmonic, which is the first overtone of a square wave). A -6 dB-per-octave rolloff, for example, will turn a square wave into a triangle wave, and if the corner isn't well above 4 MHz, it'll reduce the amplitude of the fundamental.
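As a rough sketch of that last point, here's the gain of a single-pole (-6 dB per octave) low-pass at the fundamental and at the third harmonic, for a few arbitrary example corner frequencies:

Code: Select all

import math

FSC = 3.579545e6            # chroma fundamental
THIRD = 3 * FSC             # first overtone of a square wave (~10.74 MHz)

def first_order_gain(f_hz, corner_hz):
    """Magnitude of a single-pole (-6 dB/octave) low-pass at frequency f."""
    return 1.0 / math.sqrt(1.0 + (f_hz / corner_hz) ** 2)

for corner in (2e6, 4.2e6, 10e6):
    print(f"corner {corner / 1e6:4.1f} MHz: "
          f"fundamental gain {first_order_gain(FSC, corner):.2f}, "
          f"3rd-harmonic gain {first_order_gain(THIRD, corner):.2f}")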

Disclaimer: I have more of a digital signal processing background than an analog EE background.

Two consoles looking different on one TV but the same on another might be an impedance mismatch. The TV where they look the same might have a lower input load.
NewRisingSun
Posts: 1510
Joined: Thu May 19, 2005 11:30 am

Re: Emulator palettes vs real NTSC TVs

Post by NewRisingSun »

tepples wrote:True, an ideal brick wall at 4.2 MHz will cause the sine wave to have a bigger peak-to-peak amplitude.
By how much? :)
tepples wrote: But the real amplitude depends on how the filter's transfer function looks around 3.58 MHz (the fundamental) and 10.74 MHz (the third harmonic, which is the first overtone of a square wave). A -6 dB-per-octave rolloff, for example, will turn a square wave into a triangle wave, and if the corner isn't well above 4 MHz, it'll reduce the amplitude of the fundamental.
The way I read the drawing in the NTSC document from 1954, it's 0% attenuation at 4.2 MHz and 100% attenuation at 4.5 MHz. (The 1941 document on the other hand has a drawing indicating 0% attenuation at 4.0 MHz and 100% attenuation at 4.5 MHz.)
tepples
Posts: 22708
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)

Re: Emulator palettes vs real NTSC TVs

Post by tepples »

NewRisingSun wrote:
tepples wrote:True, an ideal brick wall at 4.2 MHz will cause the sine wave to have a bigger peak-to-peak amplitude.
By how much? :)
Googling "fourier transform of square wave" finds "Square wave" on Wikipedia, which shows the amplitude of the fundamental frequency as 4/π ≈ 1.27, or 20 log(4/π) ≈ 2.1 dB above the flat part of the ideal square wave.
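Putting that together with the 40 IRE vs. ~50 IRE figures quoted earlier in the thread, a quick back-of-the-envelope sketch of the implied saturation scale (whether this is the right normalization is of course the open question):

Code: Select all

import math

SQUARE_PP_IRE = 50.0        # NES burst, square-ish, as measured earlier in the thread
STANDARD_PP_IRE = 40.0      # SMPTE-170M burst, sine

fundamental_pp = SQUARE_PP_IRE * 4.0 / math.pi    # ~63.7 IRE after an ideal filter
saturation_scale = STANDARD_PP_IRE / fundamental_pp
print(f"fundamental p-p: {fundamental_pp:.1f} IRE, "
      f"implied saturation scale: {saturation_scale:.2f}")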
NewRisingSun
Posts: 1510
Joined: Thu May 19, 2005 11:30 am

Re: Emulator palettes vs real NTSC TVs

Post by NewRisingSun »

Empirically I had found that the value must be about 1.3, so that seems correct. Thanks.
tepples wrote:Two consoles looking different on one TV but the same on another might be an impedance mismatch.
And slew rate would cause an impedance mismatch to affect phase?
tepples
Posts: 22708
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: Emulator palettes vs real NTSC TVs

Post by tepples »

As I said, I'm no EE, but perhaps an amplifier tracks the input voltage better when it's feeding a smaller load. I only took time to learn about slew rate in the first place when I realized it led to the same distortion phenomenon as DPCM slope overload.
blargg
Posts: 3715
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA

Re: Emulator palettes vs real NTSC TVs

Post by blargg »

I believe that slew-rate limiting is due to the limited drive current of an amplifier, combined with the inevitable capacitance in parallel with the load. Here's a great picture of how a limited slew rate can cause a phase shift (red = in, green = out):

[attached image: slew-rate phase-shift illustration, red = input, green = output]
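A crude numeric sketch of the same mechanism: let an output track a subcarrier-frequency sine but cap its rate of change, then measure the phase of the output's fundamental. The slew limit (4 V/µs here) and the amplitudes are arbitrary example values; the point is only that the larger amplitude picks up more phase lag:

Code: Select all

import cmath, math

def slew_limited_phase_shift(amplitude, slew_v_per_s, f_hz=3.579545e6, n=20000):
    """Crudely simulate an output that tracks a sine input but whose rate of
    change is capped (slew limiting), then return the phase of the output's
    fundamental relative to the input, in degrees (negative = output lags)."""
    fs = f_hz * 200.0               # heavy oversampling; n covers 100 cycles
    dt = 1.0 / fs
    out = 0.0
    acc_in = acc_out = 0j
    for i in range(n):
        t = i * dt
        x = amplitude * math.sin(2 * math.pi * f_hz * t)
        max_step = slew_v_per_s * dt
        out += max(-max_step, min(max_step, x - out))
        w = cmath.exp(-2j * math.pi * f_hz * t)
        acc_in += x * w
        acc_out += out * w
    return math.degrees(cmath.phase(acc_out) - cmath.phase(acc_in))

for amp in (0.2, 0.5):              # volts, arbitrary example amplitudes
    print(f"amplitude {amp} V: fundamental shifted "
          f"{slew_limited_phase_shift(amp, 4e6):+.1f} deg")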