All times are UTC - 7 hours





PostPosted: Sat Jul 05, 2014 8:04 pm 

Joined: Mon Sep 27, 2004 2:57 pm
Posts: 1248
How do the default settings look? The darker palette just has a lower contrast setting (and a slightly higher saturation setting to compensate).


PostPosted: Sun Jul 06, 2014 9:04 am 

Joined: Fri May 21, 2010 4:10 pm
Posts: 279
Drag wrote:
How do the default settings look? The darker palette just has a lower contrast setting (and a slightly higher saturation setting to compensate).



I think they look pretty good. It's close, closer than any other palette I have used. If you are using this palette on a TV instead of a monitor, are there any options you would recommend?

Thanks again for your efforts toward the perfect palette! :beer:


PostPosted: Sun Jul 06, 2014 11:07 am 

Joined: Mon Sep 27, 2004 2:57 pm
Posts: 1248
No problem, I'm glad someone else is finally getting use out of it. :P

It really depends on the TV; if it's an LCD TV, then my palette generator with the default settings ought to suffice, though I've discovered that a white point of D55 (0.3324, 0.3474) looks even closer to what my CRT TV displays. To input that into the palette generator, click on "get current preset" and put that coordinate in for Wx and Wy. For you, it might look better or might look worse, this is just what worked best for me.

If you're using a CRT TV with your emulator, I'm not sure what to recommend, because I haven't tested that. I guess it comes down to how warm or cool the grays are.


PostPosted: Sun Jul 13, 2014 6:00 pm 

Joined: Fri May 21, 2010 4:10 pm
Posts: 279
Sorry for the long delay in my reply. I am mainly using an HD LCD TV. I found that bumping up the brightness setting in the emulator by about 10 seems to get it pretty good.

Great work on everything, sir! :beer: The NES palette has been a long-standing discussion that often seems to have no true and definitive answer, lol.


PostPosted: Sun Aug 03, 2014 12:47 pm 

Joined: Thu May 19, 2005 11:30 am
Posts: 313
A few questions, if I may, from someone who has spent way too much time on the NES palette himself:
  1. How are you normalizing video levels? Are you just treating color 0x0E as black and colors 0x20 and 0x30 as white?
  2. Are you using the color burst amplitude as an amplitude reference for the chroma signals?
  3. What does a hue of -0.25 actually mean? How many degrees is that? I notice that 360 degrees (meaning no change) is at about 31.4, so it seems to be related to radians. Better to specify it in degrees, because that is what the television literature uses.
  4. Are you taking US NTSC's 7.5% setup into account? It seems you do, given that the default brightness and contrast settings are different from zero, though to a larger extent than 7.5% setup would do.

Suggestions:
  1. I assume that "colorimetry" refers to the colorimetry of the emulated ("source") display, and that the target display is always sRGB. Since wide-gamut displays are quite widespread these days, allowing the user to select the target display colorimetry might be desirable as well.
  2. There are at least two more white points that were (and to some extent are) in widespread use: 9300K+27MPCD (xWhite 0.281, yWhite 0.311) and CIE D93 (xWhite 0.285, yWhite 0.293). Should you decide to add these, you might want to separate the RGB primaries from the white point. Note that when trying to simulate D93 white on sRGB, the blue value will become greater than 1, so everything must be scaled accordingly. Right now that is not done when entering these as custom values.
  3. I understand you are using the default YIQ/YUV to RGB matrix. Using this matrix together with FCC primaries will reflect what an idealized NTSC monitor would show, but not necessarily the actual behavior of a 1980s television set. They would instead use a modified YIQ/YUV-to-RGB matrix with custom phosphors. These can be simulated perfectly if the values are known, which they are to some extent.
    I don't know how deeply you want to dig into this, but given the work you have put into this already, you might want to consider the following two documents:
    Neal, C.B., "Television Colorimetry for Receiver Engineers," IEEE Transactions on Broadcast and Television Receivers, vol. BTR-19, no. 3, pp. 149-162, Aug. 1973. This document contains both typical primaries and the matching YUV-to-RGB conversion matrices for 1970s televisions, good enough for NES purposes. The method of creating modified YUV-to-RGB conversion matrices for any set of primaries is described in: Parker, N.W., "An Analysis of the Necessary Decoder Corrections for Color Receiver Operation with Non-Standard Receiver Primaries," IEEE Transactions on Broadcast and Television Receivers, vol. 12, no. 1, pp. 23-32, April 1966.
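For reference, the idealized decode mentioned in suggestion 3 is the textbook YUV-to-RGB conversion. Here is a Python sketch using the commonly cited coefficients; a receiver-specific matrix from the Neal paper would replace these three rows with phosphor-corrected values:

```python
# Idealized (textbook) YUV -> RGB decode. A 1970s receiver would instead use
# modified coefficients matched to its actual phosphors.
def yuv_to_rgb(y, u, v):
    r = y + 1.13983 * v
    g = y - 0.39465 * u - 0.58060 * v
    b = y + 2.03211 * u
    return (r, g, b)

print(yuv_to_rgb(0.5, 0.0, 0.0))  # (0.5, 0.5, 0.5) -- zero chroma decodes to gray
```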


PostPosted: Sun Aug 03, 2014 8:30 pm 

Joined: Mon Sep 27, 2004 2:57 pm
Posts: 1248
NewRisingSun wrote:
How are you normalizing video levels? Are you just treating color 0x0E as black and colors 0x20 and 0x30 as white?

I'm using the normalized values from here, which indeed normalize as you say.
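For anyone following along, that normalization is just a linear rescale of the raw levels. A quick sketch (the voltage numbers here are placeholders for illustration, not measurements):

```python
def normalize(v, black, white):
    """Linearly rescale a raw signal level so `black` maps to 0.0 and `white` to 1.0."""
    return (v - black) / (white - black)

# Hypothetical voltage levels, purely illustrative -- real NES measurements vary:
BLACK = 0.312  # stand-in for color $0E
WHITE = 1.100  # stand-in for colors $20/$30

print(normalize(BLACK, BLACK, WHITE))  # 0.0
print(normalize(WHITE, BLACK, WHITE))  # 1.0
```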
Quote:
Are you using the color burst amplitude as an amplitude reference for the chroma signals?

I'm not, because I've read documentation somewhere that says the color burst amplitude has absolutely no effect on the resulting picture. Whether or not that's true seems to depend on who you ask, so I don't have a clear answer as to what to do, nor if any kind of DC bias on the colorburst signal makes any difference.
Quote:
What does a hue of -0.25 actually mean? How many degrees is that? I notice that 360 degrees (meaning no change) is at about 31.4, so it seems to be related to radians. Better to specify it in degrees, because that is what the television literature uses.

Yeah, the hue tweak is in radians, and that's just because that's how most programming languages handle trig functions. I probably should change it to degrees, now that you mention it, if only because radians kinda suck. :P
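In the meantime, converting is straightforward (degrees = radians × 180/π), so the -0.25 hue works out to roughly -14.3 degrees:

```python
import math

def rad_to_deg(radians):
    """Convert a hue tweak expressed in radians to degrees."""
    return radians * 180.0 / math.pi

print(round(rad_to_deg(-0.25), 1))  # -14.3
```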
Quote:
Are you taking US NTSC's 7.5% setup into account? It seems you do, given that the default brightness and contrast settings are different from zero, though to a larger extent than 7.5% setup would do.

I'm not actually sure what that is, to be honest. The reason the brightness is lower is that I noticed the darkest row of colors didn't look right until I lowered it, and when raising the brightness on my TV, it takes a while for color $1D to actually start lightening, so I figured the brightness setting is actually supposed to be lowered. The contrast setting is to compensate for the reduced brightness.


Quote:
I assume that "colorimetry" refers to the colorimetry of the emulated ("source") display, and that the target display is always sRGB. Since wide-gamut displays are quite widespread these days, allowing the user to select the target display colorimetry might be desirable as well.

Your assumption is correct. I can add the target colorimetry at some point, but I'd be doing it without understanding wide-gamut displays or why it's necessary.
Quote:
There are at least two more white points that were (and to some extent are) in widespread use: 9300K+27MPCD (xWhite 0.281, yWhite 0.311) and CIE D93 (xWhite 0.285, yWhite 0.293). Should you decide to add these, you might want to separate the RGB primaries from the white point. Note that when trying to simulate D93 white on sRGB, the blue value will become greater than 1, so everything must be scaled accordingly. Right now that is not done when entering these as custom values.

I started thinking this was the case, given that neither C nor D65 looked correct for me. The custom colorimetry was a quick afterthought, so it doesn't do any scaling or anything like that. I'm not even sure how I'd need to properly "scale" anything.
Quote:
I understand you are using the default YIQ/YUV to RGB matrix. Using this matrix together with FCC primaries will reflect what an idealized NTSC monitor would show, but not necessarily the actual behavior of a 1980s television set. They would instead use a modified YIQ/YUV-to-RGB matrix with custom phosphors. These can be simulated perfectly if the values are known, which they are to some extent.
I don't know how deeply you want to dig into this, but given the work you have put into this already, you might want to consider the following two documents:
Neal, C.B., "Television Colorimetry for Receiver Engineers," IEEE Transactions on Broadcast and Television Receivers, vol. BTR-19, no. 3, pp. 149-162, Aug. 1973. This document contains both typical primaries and the matching YUV-to-RGB conversion matrices for 1970s televisions, good enough for NES purposes. The method of creating modified YUV-to-RGB conversion matrices for any set of primaries is described in: Parker, N.W., "An Analysis of the Necessary Decoder Corrections for Color Receiver Operation with Non-Standard Receiver Primaries," IEEE Transactions on Broadcast and Television Receivers, vol. 12, no. 1, pp. 23-32, April 1966.

Thanks for the literature! I knew the YIQ->RGB matrices had to be different from what the FCC specified, if only because I couldn't get anything to look exactly right; I could only get it "close". :P (Trying out D55 like I mentioned before helped me, which opened me up to the possibility that TVs may not all be using D65.)


PostPosted: Sun Aug 03, 2014 8:58 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 6450
Location: UK (temporarily)
Drag wrote:
Quote:
Are you using the color burst amplitude as an amplitude reference for the chroma signals?
I'm not, because I've read documentation somewhere that says the color burst amplitude has absolutely no effect on the resulting picture. Whether or not that's true seems to depend on who you ask, so I don't have a clear answer as to what to do, nor if any kind of DC bias on the colorburst signal makes any difference.
DC during colorburst definitely doesn't matter, but is "supposed" to be 0 IRE.
This is how Macrovision works: VCRs have AGCs, and they sample for "darkest value on a scanline" during colorburst. Macrovision adds a large positive offset during colorburst.

Atari's 2600, as initially released, was specifically designed to take the nominal chroma scaling into account by attenuating the colorburst to the nominal ~40 IRE. Later revisions removed the connection for cost savings, because it was found that basically no televisions cared. Most don't even scale luminance through composite, simply assuming that the sync depth is "good enough"; the only AGC is on OTA, because it's amplitude-modulated and so has to be adjusted for distance from the broadcaster.

Quote:
Quote:
Are you taking US NTSC's 7.5% setup into account? It seems you do, given that the default brightness and contrast settings are different from zero, though to a larger extent than 7.5% setup would do.

I'm not actually sure what that is, to be honest.
Nominally, US NTSC TV (but not Japanese NTSC TV) defines "black" as 7.5 IRE, and values below that as blacker-than-black. Most early video game consoles don't support this, but the end result is that a console that doesn't compensate for this in the US will have a smidge more contrast and look darker than in Japan.
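A sketch of what that setup does to levels, in Python, using the nominal figures above:

```python
def ire_to_luma(ire, setup=7.5):
    """Map an IRE level to normalized luma: `setup` IRE is black,
    100 IRE is reference white (7.5 is the US NTSC convention)."""
    return max(0.0, (ire - setup) / (100.0 - setup))

print(ire_to_luma(7.5))    # 0.0 -- US black level
print(ire_to_luma(100.0))  # 1.0 -- reference white
print(ire_to_luma(0.0))    # 0.0 -- blacker-than-black clamps to black
print(ire_to_luma(50.0, setup=0.0))  # 0.5 -- Japanese NTSC, no setup
```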


PostPosted: Sun Aug 03, 2014 10:06 pm 

Joined: Mon Sep 27, 2004 2:57 pm
Posts: 1248
Oh I see. So this means that the blanking and black levels on the NES are the same?

Reducing the brightness to -0.075 instead of -0.2 doesn't look like my TV, though, on which color $08 is almost black. I guess my TV likes to make the picture even darker?


PostPosted: Sun Aug 03, 2014 10:17 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 6450
Location: UK (temporarily)
Yeah, the NES only emits one voltage—color $1D—for blanking, black, &c.

On the bright side, it's not the hack that the ZX80's video was (which by default assumed that the TV didn't do any normalizing at all, and so just emitted full white during the back porch).


PostPosted: Mon Aug 04, 2014 12:05 am 

Joined: Thu May 19, 2005 11:30 am
Posts: 313
Drag wrote:
I'm using the normalized values from here, which indeed normalize as you say.
The problem is that no television set will do it this way, because it has no way of knowing at what video level the console wants its black and white to be. Consider this post of mine, which might explain why you need brightness values lower than -0.075 to replicate a particular television set.
lidnariq wrote:
Atari's 2600, as initially released, specifically was designed to take the nominal chroma scaling into account by attenuating the colorburst to the nominal ~40 IRE. Later revisions removed the connection for cost savings, because it was found that basically no televisions cared.
With a baseband composite connection, my multi-standard Sony CRT uses the color burst amplitude as an amplitude reference for chroma with PAL signals and as an amplitude reference for the entire signal with NTSC signals.
lidnariq wrote:
the only AGC is on OTA because it's amplitude modulated and so has to be adjusted for distance from the broadcaster.
This is very important, and implies that the same television with the same console would produce pictures of different brightness between a baseband composite connection and an RF-modulated connection.
Drag wrote:
I can add the target colorimetry at some point, but I'd be doing it without understanding wide gamut displays nor why it's necessary.
Wide gamut displays provide a more saturated picture. Normal sRGB images will appear oversaturated on them. Most web browsers nowadays can be made to color manage these images, that is, convert their values from sRGB to the monitor's native primaries. I have not seen any NES emulator provide that functionality. Therefore, it would be useful to specify the target monitor's primaries to generate the correct colors directly. Another advantage of this is that on these monitors, saturated reds and greens can be seen without them needing to be clipped (as much).
Drag wrote:
The custom colorimetry was a quick afterthought, so it doesn't do any scaling or anything like that. I'm not even sure how I'd need to properly "scale" anything.
Convert 100% white (R=G=B=1.0) to the target colorspace, take the largest value, and divide everything else by that, i.e. if you get R=1.0, G=1.1, B=1.2, then divide everything by 1.2. All this with linear values (the ones you use for color space conversion), not gamma-corrected values.
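A minimal Python sketch of that scaling step, using a made-up sRGB-to-target matrix rather than real colorimetry data:

```python
# Made-up example matrix (rows sum to 1.0, 1.1, 1.2 so white maps to the
# R=1.0, G=1.1, B=1.2 case described above). Not real colorimetry data.
EXAMPLE_MATRIX = [
    [0.95, 0.03, 0.02],
    [0.02, 1.05, 0.03],
    [0.01, 0.04, 1.15],
]

def apply(matrix, rgb):
    """Multiply a 3x3 matrix by a linear RGB triple."""
    return [sum(row[i] * rgb[i] for i in range(3)) for row in matrix]

def normalized_convert(matrix, rgb):
    """Convert linear RGB to the target space, scaled so white never exceeds 1.0."""
    white = apply(matrix, [1.0, 1.0, 1.0])
    scale = max(white)  # e.g. white -> (1.0, 1.1, 1.2) means divide everything by 1.2
    return [c / scale for c in apply(matrix, rgb)]

print(normalized_convert(EXAMPLE_MATRIX, [1.0, 1.0, 1.0]))  # largest channel is exactly 1.0
```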

By the way: if you wanted to get really crazy, you could emulate differential phase distortion as well. Basically, your current hue setting is amplitude-independent, so it influences the hue shift at zero amplitude. You could add a second hue setting that is multiplied with Y, resulting in a total hue shift for any pixel of baseHue+Y*diffHue. While it does replicate what is definitely going on in some devices, it might be a bit far out there for emulation purposes.
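The model is simple enough to state in a couple of lines; a sketch with hypothetical settings (the degree values are arbitrary, chosen only for illustration):

```python
def pixel_hue_shift(y, base_hue, diff_hue):
    """Total hue shift for a pixel under differential phase distortion:
    an amplitude-independent base term plus a luma-proportional term."""
    return base_hue + y * diff_hue

# Hypothetical settings: -5 degree base shift, +4 degrees of differential phase.
print(pixel_hue_shift(0.0, -5.0, 4.0))  # -5.0 -- black gets only the base shift
print(pixel_hue_shift(1.0, -5.0, 4.0))  # -1.0 -- white drifts by the full diffHue
```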


PostPosted: Mon Aug 04, 2014 1:14 am 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 6450
Location: UK (temporarily)
NewRisingSun wrote:
This is very important, and implies that the same television with the same console would produce pictures of different brightness between a baseband composite connection and an RF-modulated connection.
Yes, but the usual AGC is on sync depth, normalizing it to -40 IRE. There's no clearly correct thing to do if given contradictory gains needed to normalize both colorburst and sync depth.
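To make the contradiction concrete, here are two toy gain computations for the same attenuated signal (the measured values are made up):

```python
def sync_agc_gain(measured_sync_ire, nominal=-40.0):
    """Gain a sync-depth AGC would apply to restore nominal sync depth."""
    return nominal / measured_sync_ire

def burst_agc_gain(measured_burst_ire, nominal=40.0):
    """Gain a burst-referenced AGC would apply instead."""
    return nominal / measured_burst_ire

# The same signal can demand two different, contradictory gains:
print(sync_agc_gain(-32.0))            # 1.25 -- sync arrived shallow
print(round(burst_agc_gain(35.0), 2))  # 1.14 -- burst suggests a smaller boost
```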


PostPosted: Mon Aug 04, 2014 5:36 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 19254
Location: NE Indiana, USA (NTSC)
On my Magnavox CRT SDTV, I do get a noticeably brighter picture with composite out than with RF out. I noticed this when I decided to use RF out while recording composite with my DVD recorder, to prevent lag during gameplay.


PostPosted: Sun Oct 19, 2014 7:34 pm 

Joined: Fri May 21, 2010 4:10 pm
Posts: 279
Hey Drag,

I've been using the settings:

{-0.25, 0.8, -0.2, 1.0, 1.0}
This is a slightly darker palette, but represents the colors a lot more faithfully, for instance, I can actually tell what hues the lightest colors are, just like I can with my TV. If you're looking to design some graphics and you really need an accurate representation of the colors and how they contrast against each other, this is what I'd recommend. Even though I said this palette is darker, you don't even notice unless you have something BRIGHT WHITE open next to your emulator.

This palette is pretty awesome. One question: is there something I can do to make the palette retain the green color of the border in the second stage of Contra for the NES? It is supposed to look like this:

[image: Contra stage 2 screenshot]

but instead it is just one single gray color, with hardly any green. When this palette is used in conjunction with the NTSC filter, it really shines; just a few little tweaks and it will be almost perfect. I just figured I would ask, since you know more about the settings on the palette generator page than I do. I always remember the NES colors being darker and grittier than most emulators show. Do you know what I can do to bring this green back and possibly lighten things up just a tiny hair?

Anyway, this palette generator is awesome; thanks for all you have put into it. I'm definitely getting use out of it.


PostPosted: Sun Oct 19, 2014 11:22 pm 

Joined: Mon Sep 27, 2004 2:57 pm
Posts: 1248
Bumping the brightness up a little might help. The green does look a little dark in the palette, but I haven't looked at this scene on my TV so I don't know if it's dark there too.


Top
 Profile  
 
PostPosted: Thu Jan 08, 2015 8:57 pm 

Joined: Tue Jul 10, 2012 1:37 pm
Posts: 54
I apologize for bumping this topic. I am sure many are aware of my open-source project, Retro Graphics Toolkit: http://forums.nesdev.com/viewtopic.php?f=21&t=9894. If you don't mind, would it be okay if I used your code for palette generation? It was easy to port to C++. I have not committed the code yet because I have not implemented the de-emphasis bits, which I believe are a necessary feature. If you have any pointers on how to implement them, I would greatly appreciate it.
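(Not an answer from the generator's author, but as a starting point: the hardware attenuates the composite signal during particular color phases when an emphasis bit is set, so the most faithful approach works at the signal level. A rough first-pass approximation in RGB space that some emulators use is to dim the two *other* color channels; the 0.75 factor below is a placeholder to tune by eye, not a measured value.)

```python
# Rough RGB-space approximation of the NES emphasis bits: each bit dims the
# two channels it does NOT emphasize. The factor is a placeholder assumption.
ATTENUATION = 0.75

def apply_emphasis(r, g, b, emp_r=False, emp_g=False, emp_b=False):
    if emp_r:
        g *= ATTENUATION; b *= ATTENUATION
    if emp_g:
        r *= ATTENUATION; b *= ATTENUATION
    if emp_b:
        r *= ATTENUATION; g *= ATTENUATION
    return (r, g, b)

print(apply_emphasis(1.0, 1.0, 1.0, emp_r=True))  # (1.0, 0.75, 0.75)
```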


Powered by phpBB® Forum Software © phpBB Group