Latency challenge of conventional emulation compared to FPGA


Latency challenge of conventional emulation compared to FPGA

Post by tepples » Tue Nov 03, 2020 2:58 pm

The Super Game Boy (SGB) accessory interfaces the LR35902 system on chip (SoC) of the Game Boy with the Super NES, feeding pixels to VRAM to be read by the S-PPU. A user suggested in another topic doing the same thing by running emulation cores from the MiSTer project on an FPGA. Another user suggested running the emulator on a more conventional computer with a program counter, such as an ARM Cortex microcontroller. There is precedent for this in the AI coprocessor of a Morita Shogi game for the Super Famicom.

As lidnariq pointed out in that topic, a conventional emulator can be just as accurate as an emulator on an FPGA. It can indeed prove convincing for single-player games that use a standard mapper, use a standard controller for input, and use standard audio and video output. In practice, however, on a conventional computer, latency is likely to interfere with interfacing to anything else unless the same emulator also emulates the anything else. I'll take replicating SGB functionality as an example.

Novel mappers

Game Boy ROM address space is limited to 32 KiB. To work around this, cartridges use mappers to switch pages of address space. An FPGA-based emulator can communicate with mappers by interacting with the cartridge on the 240 ns time scale that the cartridge expects. A conventional emulator must read the entire cartridge into RAM at the start of play and thus must be aware of all common mappers. The vast majority of Game Boy games use the MBC1, MBC3, or MBC5 mapper to switch banks. If a new release uses a specialized mapper, such as MMM01, EMS, or TPP1, somebody needs to update all the emulators to support it.
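To make the bank-switching part concrete, here is a minimal sketch of how a software emulator might handle MBC1-style ROM banking. The structure and names are my own, and the real MBC1 has more corner cases (RAM enable, the banking-mode register, bank 0x20/0x40/0x60 aliasing) than shown here:

// Minimal sketch of MBC1-style ROM banking in a software Game Boy emulator.
// Illustrative only; several MBC1 corner cases are left out.
#include <cstdint>
#include <vector>

struct Mbc1 {
    std::vector<uint8_t> rom;  // whole cartridge image, read in at the start of play
    uint8_t bankLo = 1;        // lower 5 bits of the ROM bank (0 is treated as 1)
    uint8_t bankHi = 0;        // upper 2 bits of the ROM bank

    void write(uint16_t addr, uint8_t value) {
        if (addr >= 0x2000 && addr <= 0x3FFF) {        // ROM bank low register
            bankLo = value & 0x1F;
            if (bankLo == 0) bankLo = 1;
        } else if (addr >= 0x4000 && addr <= 0x5FFF) { // ROM bank high / RAM bank register
            bankHi = value & 0x03;
        }
    }

    uint8_t read(uint16_t addr) const {
        if (addr < 0x4000)
            return rom[addr];                          // fixed bank 0
        uint32_t bank = (uint32_t(bankHi) << 5) | bankLo;
        return rom[(bank * 0x4000u + (addr - 0x4000u)) % rom.size()];
    }
};

The point is that every one of these write-triggered behaviors has to be known to the software emulator ahead of time, whereas an FPGA wired to the cartridge slot can let the real mapper chip answer.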

Serial communication

All Game Boy systems other than the original SGB support Game Link, a synchronous serial communication port clocked at 8192 bits per second. It sends an interrupt to both CPUs after every 8 bits finish (up to 17 times a frame). An emulator without a sub-millisecond latency link to a similar emulator would need to emulate two Game Boy systems (or four for DMG-07 games). Two hosts each emulating both systems would exchange controller presses to remain in sync, or one host could display both systems in split screen.
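The timing window follows directly from the clock numbers above. A quick back-of-the-envelope sketch (the LinkPort type and its methods are hypothetical, just to show where the tight window comes from):

// 8192 bits per second against a 4.194304 MHz CPU clock works out to
// 512 cycles per bit, 4096 cycles (roughly 976 microseconds) per byte.
#include <cstdint>

constexpr uint32_t CPU_HZ  = 4194304;                    // DMG CPU clock
constexpr uint32_t LINK_HZ = 8192;                       // internal serial clock, bits per second
constexpr uint32_t CYCLES_PER_BIT  = CPU_HZ / LINK_HZ;   // 512
constexpr uint32_t CYCLES_PER_BYTE = CYCLES_PER_BIT * 8; // 4096, ~976 us

struct LinkPort {
    uint32_t cyclesLeft = 0;     // cycles until the current byte finishes
    bool transferActive = false;

    void startTransfer() { transferActive = true; cyclesLeft = CYCLES_PER_BYTE; }

    // Called from the main emulation loop with the number of CPU cycles just run.
    // Returns true when the serial interrupt should fire on both emulated systems.
    bool tick(uint32_t cycles) {
        if (!transferActive) return false;
        if (cycles >= cyclesLeft) { transferActive = false; return true; }
        cyclesLeft -= cycles;
        return false;
    }
};

Both ends have to agree on that roughly 1 ms window every time, which is why two loosely coupled emulators talking over an ordinary network link can't simply swap bytes as they arrive.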

It gets even less practical with Robopon and several Game Boy Color games that support infrared communication. As I understand it, the IR protocol operates a bit at a time, using on-off keying on a carrier in the 30 to 50 kHz range, and each system encodes and decodes the modulation in software using loops timed down to the microsecond. An FPGA can time things down to the microsecond. A conventional emulator? Good luck.
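To put numbers on "down to the microsecond", here is the rough timing budget, assuming a 38 kHz carrier (a common value in the 30 to 50 kHz range mentioned above; the exact carrier varies by game):

// Rough timing budget for software-modulated IR on the Game Boy Color.
constexpr double CARRIER_HZ   = 38000.0;                // assumed carrier frequency
constexpr double PERIOD_US    = 1e6 / CARRIER_HZ;       // ~26.3 microseconds per carrier cycle
constexpr double GB_CPU_HZ    = 4194304.0;              // double-speed mode doubles this
constexpr double CYCLES_AVAIL = GB_CPU_HZ / CARRIER_HZ; // ~110 CPU cycles per carrier cycle
// The sending and receiving loops have to hit a window of a few dozen cycles
// on every carrier period, so any host-side jitter shows up as corrupted bits.

The emulated software itself is the modem, and there is no slack anywhere for the host operating system to schedule around.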

Input and output devices

Several Game Boy cartridges include sensors connected to the Game Pak edge connector. Examples include the accelerometer of Kirby Tilt 'n' Tumble and the image sensor of the Game Boy Camera. Others can be connected to specialized input and output devices through serial, such as a Game Boy Printer that prints grayscale images 160 pixels wide on 38 mm thermal paper, or even a PC keyboard or MIDI sync option for use with the LSDJ tracker. It has been suggested that the emulator emulate these as well. Though this would come at no extra cost to the FPGA, a conventional emulator would need to find something on the host to which to map each input or output.


Re: Latency challenge of conventional emulation compared to FPGA

Post by 93143 » Thu Nov 05, 2020 9:56 pm

tepples wrote:
Tue Nov 03, 2020 2:58 pm
It can indeed prove convincing for single-player games that use a standard mapper, use a standard controller for input, and use standard audio and video output.
I'm not sold on that. I've ranted about this before, but...

I suppose "can" is a pretty subjective word, but I find that the gamefeel changes substantially, and a lot of games are sufficiently random and fast-paced that you can't just learn what moves to do in what sequence - you actually inevitably have a harder time with the game. It's not just Punch-Out!!; there's a wide range of games between "I can play this game with my eyes closed and the sound off" and "the final boss is literally impossible on an LCD unless you're Nico Rosberg".

I find Super Mario World almost unplayable in emulation. Super Mario Kart is bad too. F-Zero, oddly, isn't quite as bad, but it definitely destroyed my finely-honed sense of timing. IMO it's only convincing for people who don't have a finely-honed sense of timing and people who play on flat panels.

Except for SNES Doom. I'm pretty sure the latency in that game is already so bad that the difference is immaterial. Super FX emulation inaccuracy is probably more important...

...

Speaking of SNES Doom, I keep hearing that XBand is impossible to use today. What's the technical reason for this? Does it have anything to do with the thread topic?


Re: Latency challenge of conventional emulation compared to FPGA

Post by calima » Fri Nov 06, 2020 1:07 am

XBand doesn't work because SIP and other voice-over-IP phone emulation has higher latency than analog phone modems can tolerate. And plain old phone lines no longer exist in many places.


Re: Latency challenge of conventional emulation compared to FPGA

Post by Pokun » Fri Nov 06, 2020 7:54 am

Well, it's widely known that even one frame of input lag kills games of high skill, and in serious speedruns only real hardware is allowed. Correct latency is definitely part of what makes emulation accurate.

The point of contention seems to me to be whether software emulators can in theory be as accurate as programmed-logic emulators. The main argument in favor of programmed-logic emulators is that they are said to be able, in theory, to emulate the target hardware with 100% accuracy.

Software emulators can supposedly use advanced time-travel techniques, such as guessing a frame in advance and going back and correcting it before it's shown, but I'm not sure whether any conventional emulators do that in practice?
Likewise, programmed-logic emulators generally don't emulate the hardware at the gate level, but use a higher abstraction level which approximates the devices, and may therefore not be as accurate as you'd hope. Also there may be analogue parts of the hardware (especially in older arcade systems) which can't be emulated with FPGA technology anyway, and must be approximated even more. Not to mention mechanical parts like R.O.B. (or the whole living room in Mario Kart Live: Home Circuit).


Re: Latency challenge of conventional emulation compared to FPGA

Post by Dwedit » Fri Nov 06, 2020 2:01 pm

I didn't really see any point in the first post where input-to-output latency was mentioned at all, but that's what the topic ended up getting changed to...

With RunAhead 1 and exclusive fullscreen mode, you've matched console latency for a local keyboard, and RunAhead 2 (if the game actually has two frames of internal lag) will match console latency for a Bluetooth controller. It's absolutely miraculous.

The problem is that speedrunners have decided that RunAhead is banned because it internally uses savestates, or because it can beat an actual console's latency if you have a real CRT connected to the PC. Never mind that CRT monitors connected to PCs are pretty rare.
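For anyone who hasn't seen how it works, the idea looks roughly like this. This is just the shape of the algorithm, not RetroArch's actual code, and the Core, Input, and Savestate types here are made up for the sketch:

// Runahead hides a game's internal input lag by running N frames ahead on the
// current input, showing the last of those frames, then restoring a savestate
// so the persistent emulation state doesn't actually advance early.
void run_frame_with_runahead(Core& core, const Input& input, int runahead_frames)
{
    core.set_input(input);
    core.run_frame(/*render=*/false, /*audio=*/false);    // the "real" frame, output suppressed

    Savestate state = core.save_state();                   // remember where we really are
    for (int i = 0; i < runahead_frames - 1; ++i)
        core.run_frame(/*render=*/false, /*audio=*/false);
    core.run_frame(/*render=*/true, /*audio=*/true);       // this future frame goes to the screen

    core.load_state(state);                                // rewind for the next real input
}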

Exclusive fullscreen mode is important; without it, Windows adds 3 frames of extra lag.


Re: Latency challenge of conventional emulation compared to FPGA

Post by Pokun » Sun Nov 08, 2020 8:51 am

The first post is all about latency and communication with input and output devices.

The MiSTer doesn't seem to support much real hardware outside normal controllers though. I'm not sure it even has enough pins for all the interfaces it emulates, like cartridge connectors and expansion ports.


Re: Latency challenge of conventional emulation compared to FPGA

Post by tepples » Sun Nov 08, 2020 9:26 am

Analog directional controllers

Runahead on a conventional emulator works great for something like a standard NES or Super NES controller, where predicting no change from the last frame is such a good model of input that run-length encoding (RLE) is standard practice in games' attract modes and the Hori Game Repeater. It's less effective for an analog controller, such as the Super NES Mouse, the Control Stick of the Nintendo 64 and later consoles, the Circle Pad of the Nintendo 3DS and the analog nub of the Sony PSP, or the accelerometer of Kirby Tilt 'n' Tumble or the Wii Remote. Runahead can be made compatible with an analog controller by giving controller response a bit of hysteresis, so that random noise from an imperfectly steady thumb doesn't cause quite so many rewinds. This might end up simulating the 16-direction response of an Intellivision controller, the 7x7 grid response of the Sinistar "49-way" joystick, the 8-direction by roughly 4-distance interpretation in the input layer of Bomberman 64, etc. It also tends to be quite game-specific, possibly running into legal encumbrances on use of game-specific hacks.
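Concretely, the hysteresis could look something like this. It's a hypothetical sketch, not code from any particular emulator: quantize each axis into a few coarse zones and only report a change once the raw value is clearly inside a new zone, so thumb jitter doesn't invalidate runahead's prediction every frame.

// Hypothetical analog-axis filter for runahead: coarse quantization plus a
// dead band so small wobbles don't change the value fed to the emulated game.
#include <cmath>

struct AxisQuantizer {
    int   zones    = 7;      // e.g. emulate a 7-step response per axis
    float deadband = 0.06f;  // extra margin the raw value must cross before switching zones
    int   current  = 0;      // last reported zone, in [-zones, +zones]

    int filter(float raw)    // raw axis value in [-1, 1]
    {
        float scaled = raw * zones;
        int   target = static_cast<int>(std::lround(scaled));
        // Switch zones only when the raw value is well past the boundary.
        if (target != current &&
            std::fabs(scaled - current) > 0.5f + deadband * zones)
            current = target;
        return current;      // stable value fed to the emulated game each frame
    }
};

Whether 7 zones or a dead band of that size is right would itself be game-specific, which is exactly the problem.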

Even then, I don't see runahead as doing anything useful for novel mappers, front-facing image sensors (EyeToy, PlayStation Move, Kinect), or serial communication.
