All times are UTC - 7 hours





PostPosted: Fri Aug 26, 2016 8:43 am 

Joined: Fri Aug 26, 2016 8:32 am
Posts: 2
For a long time I've had this notion that input lag in emulation is unavoidable, simply because it always seems to be there. But is this accurate? Is input lag inherent to emulation, or is it due to other things piling up, such as our hardware not being powerful enough and our monitors not being fast enough? I know that some of this is subjective, but I'd like to have a better idea of why it happens.

Sorry to invade your forum with a basic question, but I wanted to ask people who know.


PostPosted: Fri Aug 26, 2016 9:14 am 
Formerly 43110

Joined: Wed Feb 05, 2014 7:01 am
Posts: 320
Location: us-east
Lag usually happens when there's a buffer that must be filled up before it can begin to be emptied. Often there's a fixed data rate flowing to or from these buffers by their nature (as in a frame buffer), so throwing more powerful processing at the problem won't help. An emulation environment usually has more and larger buffers in the entire path from initial audible/visual output, to button input, to the resulting audible/visual output.
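The buffer argument above can be sketched numerically. A minimal sketch, with illustrative buffer depths I made up (not measurements from any particular system):

```python
# Each full-frame buffer in the chain adds one frame period of delay at a
# fixed 60 Hz refresh, no matter how fast the CPU is: the buffer has to
# fill before it drains, and it fills at the display's fixed data rate.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 Hz

def chain_latency_ms(buffer_depths):
    """Total added delay, in ms, for a list of per-stage buffer depths
    measured in whole frames."""
    return sum(buffer_depths) * FRAME_MS

# Hypothetical chain: emulator double buffer + compositor + LCD scaler.
print(round(chain_latency_ms([2, 1, 1]), 1))  # -> 66.7 (four frames)
```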


PostPosted: Fri Aug 26, 2016 9:20 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 19961
Location: NE Indiana, USA (NTSC)
In particular:
  • Modern gamepads are USB, and USB typically has bigger buffers than the SPI-like protocol of the NES controller. In addition, operating systems will typically sample USB gamepads around once or twice per frame, which is slower than the NES controller interface allows.
  • Modern PC operating systems use an audio mixer, which introduces a buffer.
  • Modern PC operating systems use a video compositor for windowed operation and tearing avoidance, which introduces a buffer.
  • LCD monitors also include a buffer.
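As a rough sketch of how the items in this list could stack up on the video/input path, here are assumed ballpark figures (my own guesses, not measurements; the audio mixer buffer runs in parallel and delays sound, not the picture):

```python
# Assumed ballpark per-stage delays (ms) for the video/input path above.
stages_ms = {
    "USB gamepad poll (average of 0-1 ms)": 0.5,
    "compositor buffer (1 frame @ 60 Hz)": 16.7,
    "LCD internal buffer (1 frame @ 60 Hz)": 16.7,
}
total_ms = sum(stages_ms.values())
print(round(total_ms, 1))  # -> 33.9
```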


PostPosted: Fri Aug 26, 2016 9:57 am 

Joined: Fri Aug 26, 2016 8:32 am
Posts: 2
So unless I'm misunderstanding, playing on a CRT with a NES or Analogue NT will always be the best in terms of input lag, because emulation will always be going against the current of modern hardware/operating systems...?


PostPosted: Fri Aug 26, 2016 12:01 pm 

Joined: Sun Jan 22, 2012 12:03 pm
Posts: 6226
Location: Canada
UserError021 wrote:
So unless I'm misunderstanding, playing on a CRT with a NES or Analogue NT will always be the best in terms of input lag, because emulation will always be going against the current of modern hardware/operating systems...?

No, I wouldn't say so. Lots of PCs are completely capable of very low-latency emulation and operation.

The most common problem is HDTVs, which tend to be designed for playing video (which, not being interactive, doesn't care about lag), so they add various delays while they process the picture (colour correction, motion interpolation, etc.; they do all sorts of junk). A lot of TVs have a "game mode" specifically to turn off unnecessary processing and lower the delay, but sometimes the TV just isn't designed with games in mind (they're an afterthought). Also, if you're using a composite input signal, the TV will typically interlace two 240p frames into a single 480i frame, halving the framerate and introducing delay while it buffers the signal for interlacing.

A CRT always displays immediately, but under the right conditions a PC can be effectively as good. PC monitors are designed to be used with computers (i.e. interactive devices) and often have very fast response. (There are good and bad monitors in this respect, though. Also: fullscreen or bust; windowed modes always introduce extra buffering.)

So I guess, yes, if you want to be sure, the CRT with a real device is hard to beat, but if you get a good collection of parts on a PC it's just as good, I think.


I think "output lag" would be a more accurately descriptive term than "input lag" for gaming setups, because the most common/prominent delay is on the output video, not on the inputs coming in. (Audio is almost always delayed significantly more than video on modern systems, but for most games it isn't considered a problem until it gets relatively large.) Though you could say that a monitor's output lags behind its input (the video signal it receives, not your gamepad input); that's probably the more accurate application of the term.


PostPosted: Fri Aug 26, 2016 12:40 pm 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 19961
Location: NE Indiana, USA (NTSC)
rainwarrior wrote:
tepples wrote:
Modern gamepads are USB, and USB typically has bigger buffers than the SPI-like protocol of the NES controller. In addition, operating systems will typically sample USB gamepads around once or twice per frame, which is slower than the NES controller interface allows.

This does not sound correct to me at all. You think the OS polls the gamepad on its own schedule, rather than at the request of the program?

Even if the program does poll controllers rapidly, I doubt that a low-speed (1.5 Mbps) device can return valid data 7,200 times a second. I can think of a couple of NES use cases for polling 120 times in one frame: seeding a PRNG or reading a light gun. I agree that for in-game operation with a standard joystick, a well-written application should be able to manage 60 or 120 Hz polling, but there are still possibly a few ms of lag between when the emulator requests the state from the controller and when the game requests it from the emulator.
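The 7,200-per-second figure can be sanity-checked against USB's 1 ms frame clock. A sketch, assuming the best case of one interrupt transfer per USB frame (low-speed interrupt endpoints may be serviced even less often, depending on the endpoint's polling interval):

```python
# USB schedules transfers on a 1 ms frame clock, so an attached gamepad
# can be polled at most ~1,000 times per second, however often the
# program asks for fresh data.
USB_FRAMES_PER_SECOND = 1000
polls_per_video_frame = USB_FRAMES_PER_SECOND / 60  # ~16.7 at best
print(7200 <= USB_FRAMES_PER_SECOND)  # -> False: 7,200 polls/s is out
```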

Quote:
What do you mean "bigger buffers"?

USB packets are far larger than NES controller packets.


PostPosted: Fri Aug 26, 2016 12:56 pm 

Joined: Sun Jan 22, 2012 12:03 pm
Posts: 6226
Location: Canada
Heh, I tried to delete that before you saw it so I wouldn't have to get into it, but I'll respond to this:

tepples wrote:
USB packets are far larger than NES controller packets.

We're talking about a problem of time though. The size of the packet is just a variable.

How long does an NES controller packet take to read? ~300 cycles at 1.8 MHz = 0.2 ms, maybe 3x as much on DPCM games?

How long does a USB packet for a controller take to read on a modern PC? I'd bet it's less than 0.2ms.
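The NES-side arithmetic above works out like this (using the NTSC CPU clock; the 300-cycle figure is the poster's estimate, not a measured value):

```python
NES_CPU_HZ = 1_789_773          # NTSC NES CPU clock
READ_CYCLES = 300               # estimated cost of one controller read
read_ms = READ_CYCLES / NES_CPU_HZ * 1000
print(round(read_ms, 2))      # -> 0.17, i.e. roughly the 0.2 ms quoted
print(round(read_ms * 3, 2))  # -> 0.5 with the ~3x DPCM re-read penalty
```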


PostPosted: Fri Aug 26, 2016 1:35 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 7036
Location: Seattle
rainwarrior wrote:
How long does a USB packet for a controller take to read on a modern PC? I'd bet it's less than 0.2ms.
1.5Mbit and 12Mbit USB devices operate on a 1kHz timebase, so the latency is a uniformly distributed random variable from 0 to 1ms.

This paper: http://doc.utwente.nl/56344/1/Korver03adequacy.pdf

says it's much, much worse: typical real-world USB input devices have at least 11 ms latency (see the histogram on p. 27 of the PDF, p. 23 as printed).

Maybe that's the latency from poll to response? In that case it doesn't get better than half that (5.5 ms, or about the same as the output latency).
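If the delay really is uniform over 0 to 1 ms as described, its average is 0.5 ms, which a quick simulation confirms (illustrative only):

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible
samples = [random.uniform(0.0, 1.0) for _ in range(100_000)]
mean_ms = sum(samples) / len(samples)
print(0.45 < mean_ms < 0.55)  # -> True: mean is close to 0.5 ms
```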


PostPosted: Fri Aug 26, 2016 3:24 pm 

Joined: Sun Jan 22, 2012 12:03 pm
Posts: 6226
Location: Canada
Ah, interesting. Yeah the latency itself is what's important here.

So according to that, it's not insignificant, but it also sounds like a smaller factor than typical display lag problems?


PostPosted: Fri Aug 26, 2016 4:18 pm 

Joined: Sat Sep 07, 2013 2:59 pm
Posts: 1588
UserError021 wrote:
For a long time I've had this notion that input lag in emulation is unavoidable, simply because it always seems to be there. But is this accurate?

Does the emulator you use happen to render its graphics with Direct3D in fullscreen, and did you turn on vsync?
If yes, then this is a very specific problem of DirectX: for some reason, vsync in Direct3D has a huge input lag of three frames. The old rendering method, DirectDraw, doesn't have this problem.

You can test this phenomenon with MAME:

Take a camera that can record videos with 60 FPS.
Set your monitor to 60 Hz.
Use a game that plays with 60 FPS.
Load the game in MAME and map one of the action buttons to the Shift Lock or Num Lock key.
Position the camera so it can film your screen and the keyboard at once.

Press the Shift Lock or Num Lock key, so that the character on the screen does something.

Watch the video and count how many frames pass between the key's LED on the keyboard going on and the character doing its action on the screen.

vsync with Direct3D in fullscreen will always lag longer than vsync with DirectDraw or vsync with Direct3D in windowed mode.
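To turn a frame count from this camera test into milliseconds, divide by the 60 Hz rate (assuming camera, monitor, and game all run at 60 FPS as described):

```python
def frames_to_ms(frames, fps=60):
    """Convert a counted lag of `frames` video frames into milliseconds."""
    return frames * 1000 / fps

print(frames_to_ms(3))  # -> 50.0: three frames of lag is 50 ms
```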

_________________
Available now: My game "City Trouble".
Website: https://megacatstudios.com/products/city-trouble
Trailer: https://youtu.be/IYXpP59qSxA
Gameplay: https://youtu.be/Eee0yurkIW4
German Retro Gamer article: http://i67.tinypic.com/345o108.jpg


Last edited by DRW on Fri Aug 26, 2016 4:53 pm, edited 1 time in total.

PostPosted: Fri Aug 26, 2016 4:30 pm 

Joined: Sun Jan 22, 2012 12:03 pm
Posts: 6226
Location: Canada
DRW wrote:
this is a very specific problem of DirectX: For some reason, vsync in Direct3D has a huge input lag of three seconds. The old rendering method DirectDraw doesn't have this problem.

Three seconds is absurd. Did you mean three frames?

The newer Direct3D API is just as good for response as the old ones when used correctly. I don't know if this is a problem specific to your system, or a bad implementation in MAME, or what, but it's simply not true that Direct3D is inherently worse than DirectDraw. The API was carefully designed with games in mind, and it's capable of working properly. (Of course, it depends on correctly implemented drivers for the hardware you're using, correct use of the API by the software, etc.; on a PC there's always a way to screw it up somewhere in the chain. This is a big advantage for original hardware with a CRT: far fewer setup failure modes.)

Like, possibly you are getting increased lag on your particular hardware + driver setup, or MAME's two implementations diverge in a significant way that adds lag in the Direct3D version?


But yes I agree with the suggestion to measure. If you care about it, measure it. Even a lot of phone cameras these days have decent "slow motion" modes that you can use to capture a good test.


Last edited by rainwarrior on Fri Aug 26, 2016 4:50 pm, edited 1 time in total.

PostPosted: Fri Aug 26, 2016 4:49 pm 

Joined: Sat Sep 07, 2013 2:59 pm
Posts: 1588
rainwarrior wrote:
Three seconds is absurd. Did you mean three frames?

Oops. Yes, frames.

rainwarrior wrote:
The newer Direct3D API is just as good for response as old ones when used correctly. I don't know if this is a problem specific to your system, or a bad implementation in MAME, or what, but it's simply not true that Direct3D is inherently worse than DirectDraw.

I did a good bunch of tests and discussed this in detail on the MAME forums.
Also, it's not specific to MAME. Nestopia has exactly the same effect. In fact, I noticed it by chance when I tried Nestopia for the first time, enabled vsync and used it in fullscreen: I immediately noticed that something was off, because Mario didn't react as fast as I was used to. That was my first contact with input lag at all.

Then I did the tests.

So, unless you can confirm that you actually did the tests I described, measuring the lag in the same game with a camera in a straight comparison between Direct3D and DirectDraw, windowed vs. fullscreen, and vsync vs. no vsync, please be aware that a simple "the newer API is just as good as the old one" doesn't convince me. The effect was too consistent across various PC systems to be a mere coincidence. And since it happens in both MAME and Nestopia, it's probably not a MAME issue either.



PostPosted: Fri Aug 26, 2016 5:14 pm 

Joined: Sun Jan 22, 2012 12:03 pm
Posts: 6226
Location: Canada
DRW wrote:
discussed this in detail on the MAME forums.

Well, searching for this led me to a lot of discussions about ATI's "flip queue" setting, which apparently by default buffers 3 extra frames for some reason (but the setting can thankfully be changed). I don't have an ATI card, so I can't test this.

If that's your specific problem, then this is at the driver level on some ATI cards, not an inherent problem in Direct3D itself.


(But yeah, I fully understand people who can't stand to deal with this crap and will give up trying to get low latency on their PC. Finding the correct setup is definitely a big drawback.)

