Overscan/HBlank

Discuss technical or other issues relating to programming the Nintendo Entertainment System, Famicom, or compatible systems. See the NESdev wiki for more information.

Moderator: Moderators

NewRisingSun
Posts: 1510
Joined: Thu May 19, 2005 11:30 am

Post by NewRisingSun »

After this, what is the most lacking aspect of best-of-breed NES emulation?
Output voltage levels, required for accurate colors (and to a lesser extent, dot crawl).
augnober
Posts: 23
Joined: Sun Jan 08, 2006 12:22 pm

Post by augnober »

Jagasian wrote:Will this new NTSC emulation have any benefit for PCs hooked up to CRT-based NTSC televisions, or is it only useful for emulating on a PC's CRT? It looks like NES emulation has taken another leap in accuracy. Is anything like this necessary for audio?
This seems to be heading off-topic and should move to another thread rather than continuing here. I'll reply for now, though.

The audio possibility is something I thought of too while reading this thread, but I was going to wait until the video stuff was released before mentioning it. Yes, this kind of thing could be done for audio too, and it would probably be cool to eventually see the video and audio together for super-authentic emulation. People have done that kind of audio processing before, though perhaps not in emulation, and I don't remember exactly where (I remember seeing the proper processing as part of a set that also included an option to encode to telephone quality and back again). Finding someone's implementation that takes a WAV file may be as easy as searching for both the particular spec the NES uses to send audio to the TV (one of those gibberish standards-body numbers) and "source".

As for connecting to a TV: though I'm not an expert, I suspect that if you hook up high-quality outputs to a high-quality TV, you could get a result worth using. There's no benefit over the PC, though (edit: a non-composite interlacing mode would be best to use on TVs; I think TVs would have an advantage there). It's worth pointing out that you'd want to avoid a composite connection, because otherwise you'd be applying an extra layer of artifacting on top of the emulation's output. Unfortunately, I'm not aware of any way to bypass a PC's (or any console's) video output hardware and output your own raw values (perhaps this is what you were thinking of?). If you're very hardcore, you could in theory create a hardware/software solution that outputs your own software-processed NTSC signal from the PC :).. but I think it'd be difficult and not usable by many people.

I'll add that the video processing screenshots look insanely cool :)
LocalH

Post by LocalH »

With S-Video or better interconnects between the PC and the video monitor/TV, the image should be 99% accurate to a real NES; the only visual difference will be that the PC outputs an interlaced signal while the real NES is non-interlaced. I don't know of any way to force a TV-out to skip the extra half-scanline that triggers sets to interlace, so that sounds like the best one could get.
User avatar
blargg
Posts: 3715
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA
Contact:

Post by blargg »

I used the composite NTSC video output from my Mac and a standard NES emulator looked pretty good on a TV. I was expecting horizontal scrolling to result in the "comb" effect due to interlace, but didn't see any, though it might be due to the flicker-reduction filter that the video output has. I don't have a TV with S-Video inputs so I can't try that with my modified NES emulator that does NTSC emulation. I'm hoping to post a Win32 build of it and source code sometime soon.
LocalH

Post by LocalH »

Interlace combing is rarely visible on a true interlaced display, since each field has already started to fade by the time the other field is drawn. I haven't done it in a while, but I used to play NES games using TV-out all the time, and I never saw any combing artifacts. You see those mostly when capturing interlaced signals.
User avatar
blargg
Posts: 3715
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA
Contact:

Post by blargg »

I haven't done it in a while, but I used to play NES games using TV-out all the time, and I never saw any combing artifacts.
But that's the point, the TV outputs on PCs usually include a "flicker reducer" filter since computer graphics with their crisp images would otherwise flicker badly on an interlaced display.
drk421
Posts: 329
Joined: Sun Nov 14, 2004 11:24 am
Contact:

Post by drk421 »

AdvanceMAME is able to mess with your video card registers to output a 15.75 kHz signal. I run the DOS version on a MAME cabinet I have, and I hacked a TV for RGB input.

Originally I used an AD723 to get S-Video/composite NTSC for use on a regular TV.

It works quite well, and you can get non-interlaced output out of a PC. I think Windows/DOS/Linux ports exist.
LocalH

Post by LocalH »

blargg wrote:
I haven't done it in a while, but I used to play NES games using TV-out all the time, and I never saw any combing artifacts.
But that's the point, the TV outputs on PCs usually include a "flicker reducer" filter since computer graphics with their crisp images would otherwise flicker badly on an interlaced display.
Right, but I always turned that crap off. I never noticed any excessive flickering with NES emulator output.
tepples
Posts: 22708
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Post by tepples »

blargg wrote:the TV outputs on PCs usually include a "flicker reducer" filter since computer graphics with their crisp images would otherwise flicker badly on an interlaced display.
So does the GameCube. In the GameCube's case, the game can turn on and off a [1 2 1]/4 line convolution kernel that the RAMDAC applies to each component before encoding it to video.
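To make that kernel concrete, here's a rough Python sketch of a [1 2 1]/4 vertical blur applied to one component plane. The clamped edge handling and round-to-nearest integer math are my assumptions, not documented RAMDAC behavior:

```python
def flicker_filter(plane):
    """Apply a [1 2 1]/4 vertical convolution to a component plane.

    plane: list of rows, each row a list of 0-255 component values.
    Edge rows are clamped (the top/bottom neighbor is repeated) --
    an assumption about how the hardware handles the borders.
    """
    h = len(plane)
    out = []
    for y in range(h):
        above = plane[max(y - 1, 0)]
        row = plane[y]
        below = plane[min(y + 1, h - 1)]
        # weighted sum (1*above + 2*center + 1*below) / 4, rounded
        out.append([(a + 2 * c + b + 2) // 4
                    for a, c, b in zip(above, row, below)])
    return out
```

Mixing each line with its neighbors like this is exactly why single-pixel-high detail (such as emulated scanlines) stops flickering: the two fields end up nearly identical.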
tepples
Posts: 22708
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Post by tepples »

LocalH wrote:Interlace combing is rarely visible on a true interlaced display, since each field has already started to fade by the time the other field is drawn. I haven't done it in a while, but I used to play NES games using TV-out all the time, and I never saw any combing artifacts.
All NES games and almost all Super NES games ran in 240p (NTSC) or 288p (PAL), not 480i or 576i.
User avatar
blargg
Posts: 3715
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA
Contact:

Post by blargg »

All NES games and almost all Super NES games ran in 240p (NTSC) or 288p (PAL), not 480i or 576i.
This mini-discussion was about displaying emulator output on a TV using PC video cards that only output an interlaced signal, and how negatively it affected the result as compared to the non-interlaced output of a NES.
Jagasian
Posts: 421
Joined: Wed Feb 09, 2005 9:31 am

Post by Jagasian »

Wouldn't the ideal accurate NES emulator try to make its video output to a TV look the same as the NES's video output to the same TV? Does anybody have the ability to capture the NES's video output and compare it against the video output coming from a computer's TV-out? Would a TV tuner card be an accurate way to capture clips of video, or do TV tuner cards introduce too many artifacts of their own to be a tool for objectively measuring the accuracy of video emulation?

What I am getting at is: let's get straight to the point and start posting screen captures of emulation video sent out over composite, S-Video, and component, side by side with the NES's composite video. That way everybody can see how close emulation can get the video to look when displayed on a TV.
User avatar
blargg
Posts: 3715
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA
Contact:

Post by blargg »

You have identified two main targets for emulation: casual emulation using the PC monitor and speakers, and hard-core emulation using a TV. The main point of this thread has been the former, making the image on a computer monitor look close to what you see on a TV.

Hard-core emulation on a TV can be taken as far as generating the raw NTSC composite signal using a high-speed DAC. On the old boards someone was talking about taking this route. This seems the best way to go if you want to use a TV, since otherwise it might be very difficult to ensure that your emulator->RGB->PC video card->NTSC->TV matches NES->TV, due to the PC video card unknowns.
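As a toy illustration of what "generating the raw composite signal" means numerically: each DAC sample is roughly luma plus a chroma subcarrier at ~3.58 MHz. The sketch below uses made-up level units and omits sync tips, blanking, and colorburst, which a real signal would need:

```python
import math

# NTSC color subcarrier frequency in Hz (exactly 315/88 MHz)
FSC = 315e6 / 88  # ~3.579545 MHz

def composite_samples(luma, chroma_amp, chroma_phase, sample_rate, n):
    """Sketch of composite video sample generation: luma + chroma carrier.

    luma is the baseline brightness, chroma_amp/chroma_phase encode the
    color (saturation and hue), sample_rate is the hypothetical DAC rate.
    Units are arbitrary; real NTSC levels are specified in IRE.
    """
    return [luma + chroma_amp * math.sin(2 * math.pi * FSC * i / sample_rate
                                         + chroma_phase)
            for i in range(n)]
```

At a DAC rate of 4x the subcarrier (as some capture/synthesis setups use), each color cycle is exactly four samples, which keeps the phase arithmetic trivial.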
LocalH

Post by LocalH »

I think you can reach sufficient results for 90% of people by having the emulator use either 640x480 or 720x480 mode for NTSC fullscreen, disabling scanlines (otherwise it will flicker like hell), weaving emulated fields together into a frame, and offering an option for PC or TV color so the user can choose what looks best on their TV. Higher resolutions can be offered in fullscreen, but they're pretty much useless for TV-out. Sure, it won't be 100% accurate, but with S-Video or better, you won't have any additional artifacts on top of the emulated NES artifacts.
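The "weaving emulated fields together" step is just interleaving lines. A minimal sketch, assuming both fields have the same number of lines and the even field comes first:

```python
def weave(even_field, odd_field):
    """Interleave two fields into one frame.

    The even field supplies lines 0, 2, 4, ... and the odd field
    lines 1, 3, 5, ... Each "line" can be any row representation
    (e.g. a list of pixels). Assumes equal-length fields.
    """
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame
```

Since the emulator renders both fields from the same 240p frame, weaving them produces a stable image with none of the combing you'd get weaving two fields captured at different moments.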
Post Reply