The lowest level possible for modern game development?


Drew Sebastino
Formerly Espozo
Posts: 3496
Joined: Mon Sep 15, 2014 4:35 pm
Location: Richmond, Virginia

The lowest level possible for modern game development?

Post by Drew Sebastino »

Another typical Espozo question... :lol: I've heard all this business about how the Nintendo Switch is easy to develop for, but I realized I didn't really know what that means in a modern context. It's not like you have to deal with crap like sprites only getting 16KB of VRAM (obviously), and dealing with how different areas of memory are placed or formatted (the HiOAM table; again, obviously not an issue, but you all know how much experience outside the SNES I have...) would, I imagine, be completely eliminated by the level of abstraction in programming nowadays. I'm sure many of you have heard the news about Unreal Engine running on the Switch, which makes me wonder: does that mean it didn't run on the Wii U? Does "easy to develop for" mean that all the popular game engines run on it, or that it's easy to make game engines for it? If you were to develop a game for the PS4 and Xbone using a game engine that supports both platforms, would you have to change anything at all outside of pictures for button icons or whatever (assuming you're not taking advantage of the PS4's faster GPU)?

Because I don't play modern PC games, how does a PC game not differ for each GPU being used? Correct me if I'm wrong, but I don't think there's a GPU equivalent of a standard like x86, where code written for one processor runs on (nearly) every later one. I saw something called an API, which I guess is a program that runs at all times as a middleman, massaging data from a program to work with the GPU, but I couldn't find much about it. Even though a console game is only designed to run on one system, I've heard that register lists aren't even supplied to developers anymore, which would mean games still have to interact with the API (which runs in the background here too?). I couldn't find this with a quick Google search, and I know rainwarrior knows about this sort of stuff (and 93143 said this place has gotten more boring since I stopped posting whenever the simplest problem came my way... :lol:)
tokumaru
Posts: 12427
Joined: Sat Feb 12, 2005 9:43 pm
Location: Rio de Janeiro - Brazil

Re: The lowest level possible for modern game development?

Post by tokumaru »

Espozo wrote:how does a PC game not differ for each GPU being used?
This is what APIs like DirectX and OpenGL are for. The games just have to use the functions defined by these standards, and the GPU manufacturers are the ones responsible for making sure that their hardware will behave accordingly.
I saw something called an API, which I guess is a program that runs at all times as a middleman, massaging data from a program to work with the GPU, but I couldn't find much about it.
An API is the interface you use to interact with something. It's a set of functions with documented parameters and return values that you use as described without worrying about their implementation. The people writing libraries and drivers and manufacturing hardware will worry about that.
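For example, clearing the screen through OpenGL looks like this (a tiny sketch of mine, not from the thread; the same two calls work no matter which GPU is underneath, because the driver supplies the implementation):

Code: Select all

#include <GL/gl.h>  /* the API: declarations only, no implementation */

void clear_to_black(void)
{
    /* Documented calls with documented parameters; which GPU
       actually executes them is the driver's problem. */
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
}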
tepples
Posts: 22705
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)

Re: The lowest level possible for modern game development?

Post by tepples »

DirectX, OpenGL, OpenGL ES, Mantle, Vulkan, or Metal? The PlayStation 4 and Xbox One actually use their own proprietary low-level APIs (GNM on the PS4, a Direct3D variant on the Xbox One); AMD's Mantle, a proprietary PC API in the same low-overhead spirit, formed the basis for Vulkan. Apple has the competing Metal, because it needed something before Vulkan was ready, and has proposed a Metal-based WebGPU spec to the W3C.
rainwarrior
Posts: 8731
Joined: Sun Jan 22, 2012 12:03 pm
Location: Canada

Re: The lowest level possible for modern game development?

Post by rainwarrior »

Espozo wrote:Because I don't play modern PC games, how does a PC game not differ for each GPU being used?
1. To accommodate differing levels of GPU power, lots of games have settings. Settings to disable shadows, change level of texture detail, change resolution, etc. are quite common.

2. The video driver acts as an intermediate layer between the GPU itself and the API being used. The program makes a call to the API, the API passes the request on to the driver, and the driver translates that request into code that drives the particular GPU. Unless you work for nVidia you probably don't write any low-level GPU control code.

There are sometimes surprises when the API seems to function differently on different GPUs, but it's generally a bit of a "black box" situation. You can try to use mostly older features and techniques with the hope that there is less variation because they've had time to become robustly supported, but otherwise you just hope to catch any problems like this in testing.
Drew Sebastino
Formerly Espozo
Posts: 3496
Joined: Mon Sep 15, 2014 4:35 pm
Location: Richmond, Virginia

Re: The lowest level possible for modern game development?

Post by Drew Sebastino »

rainwarrior wrote:2. The video driver acts as an intermediate layer between the GPU itself and the API being used. The program makes a call to the API, the API passes the request on to the driver, and the driver translates that request into code that drives the particular GPU.
How slow is this normally? (if you can find a good measure for it.) Also, does this run on the CPU or the GPU side of things? I thought modern GPUs were programmable to a certain extent.

I do find it interesting, though, that you could write a modern game in assembly if you really wanted to go through the headache of understanding the modern x86 ISA. (I haven't looked at ARM, but because it's supposed to be RISC, it can't be as bad.) I don't even know what the hell half of the x86 instructions are supposed to do, even after reading the descriptions I've seen for them. Because the CPU is still the least-pushed part in a game and seems to be the only thing you can interact with at a really low level, the case for writing any assembly today seems very weak. I had only heard it would be a pain in the ass; that's a given, but if you're not getting any extra GPU performance (you can correct me if I'm wrong), then it's not worth it.
Dwedit
Posts: 4921
Joined: Fri Nov 19, 2004 7:35 pm

Re: The lowest level possible for modern game development?

Post by Dwedit »

You write shaders in C-like HLSL or GLSL, send them to the shader compiler (part of Direct3D or OpenGL), then the driver does the rest.
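For a concrete picture, the OpenGL side of that hand-off looks roughly like this (a sketch; error handling trimmed, the shader is a trivial placeholder, and it assumes the GL 2.0+ function pointers are already loaded, e.g. via GLEW):

Code: Select all

#include <GL/gl.h>

/* A trivial GLSL fragment shader, carried around as plain text. */
static const char *src =
    "void main() { gl_FragColor = vec4(1.0); }";

GLuint compile_fragment_shader(void)
{
    GLuint sh = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(sh, 1, &src, NULL); /* hand the text over */
    glCompileShader(sh);               /* the driver compiles it at runtime */

    GLint ok = GL_FALSE;
    glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);
    return ok ? sh : 0; /* real code would read the info log on failure */
}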
Here come the fortune cookies! Here come the fortune cookies! They're wearing paper hats!
Drew Sebastino
Formerly Espozo
Posts: 3496
Joined: Mon Sep 15, 2014 4:35 pm
Location: Richmond, Virginia

Re: The lowest level possible for modern game development?

Post by Drew Sebastino »

Dwedit wrote:send them to the shader compiler (part of Direct3D or OpenGL), then the driver does the rest.
Wait, maybe I'm just not understanding you correctly, but are you saying the GPU code has to be compiled at run time upon starting the game? If I'm not making sense, it's because I'm under the impression that Direct3D is a program the computer (or whatever device) runs in the background, not a developer's tool. Although you would never want to bypass the driver even if you could, I'm under the impression that something like Direct3D doesn't change from computer to computer and could be bypassed if you really wanted to. I don't know if there's some sort of memory protection or some weird thing, though; I really need to look into how modern computers (and probably video game consoles, at this point) handle stuff like running multiple applications at once, not letting the whole computer crash when one program does, allocating memory, interacting with background and open software, etc. I'll say right now that a lot of it is probably hardware interrupts, although I really don't have a clue. :lol:
calima
Posts: 1745
Joined: Tue Oct 06, 2015 10:16 am

Re: The lowest level possible for modern game development?

Post by calima »

"easy to develop for" nowadays means you can make a game just by clicking, be it in Unity or Unreal. You buy a FPS kit and some models from the store, click to combine them, then click to output for Switch. Zero lines of code, zero programming, zero modeling. Often zero design too.

Yeah, I don't hold those in very high regard; could you tell?
Espozo wrote:When's GPU code compiled?
It depends, and there are multiple passes on desktops. For a console, shaders are generally precompiled fully, since the hardware is constant.

GLSL is compiled whenever the game passes it in. This may be at game start, at level load time, or dynamically, which can cause jitter.
HLSL may be used like that, or precompiled to bytecode, which then gets compiled to machine code later on. This essentially does half of the work in advance.
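As an illustration, the offline half of that HLSL workflow is a single compiler invocation (a sketch using Microsoft's fxc tool; the entry point and target profile here are made-up examples):

Code: Select all

rem Compile entry point "main" as a Shader Model 5 pixel shader to bytecode
fxc /T ps_5_0 /E main /Fo shader.cso shader.hlsl

The .cso bytecode ships with the game, and the driver only does the final bytecode-to-machine-code translation at runtime.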
rainwarrior
Posts: 8731
Joined: Sun Jan 22, 2012 12:03 pm
Location: Canada

Re: The lowest level possible for modern game development?

Post by rainwarrior »

Espozo wrote:How slow is this normally? (if you can find a good measure for it.) Also, does this run on the CPU or the GPU side of things? I thought modern GPUs were programmable to a certain extent.
This takes place on the CPU side. The translation layers between the API, the driver, and the GPU itself are generally fairly efficient, but interacting with a GPU is "very slow". The layers in between aren't responsible for this; it's mostly because every interaction with the GPU involves a lot of data transfer (mesh data, texture data, shaders, rendering states, rendering commands, etc.) that completely dwarfs anything the CPU is doing. Being able to write driver-level code directly wouldn't help, because that's not where the bottleneck is.

Good renderers do a lot of CPU work trying to minimize the data that needs to be sent to the GPU. The most important step is usually culling, i.e. deciding which objects are onscreen/visible and which don't need to be drawn at all. Every draw call you can get rid of is a big deal.
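A minimal sketch of the idea (the structs and the sphere-vs-frustum test here are invented for the example, not any engine's actual code):

Code: Select all

#include <stdbool.h>
#include <stddef.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float nx, ny, nz, d; } Plane;        /* n.p + d >= 0 is "inside" */
typedef struct { Vec3 center; float radius; } Sphere;
typedef struct { Sphere bounds; /* mesh, textures... */ } Object;

void draw_object(const Object *o);  /* hypothetical draw-call wrapper */

/* Visible unless the bounding sphere is fully behind some frustum plane. */
static bool sphere_in_frustum(const Plane f[6], Sphere s)
{
    for (int i = 0; i < 6; i++) {
        float dist = f[i].nx * s.center.x + f[i].ny * s.center.y
                   + f[i].nz * s.center.z + f[i].d;
        if (dist < -s.radius)
            return false;   /* cull: the GPU never sees this object */
    }
    return true;
}

void render_scene(const Plane frustum[6], const Object *objs, size_t n)
{
    for (size_t i = 0; i < n; i++)
        if (sphere_in_frustum(frustum, objs[i].bounds))
            draw_object(&objs[i]);   /* only issue the draw calls that matter */
}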


Code for the GPU is called a shader, but it runs only in particular stages of the drawing pipeline, like determining where onscreen a point of a triangle ends up, or choosing a colour for the final output pixel. The CPU gives the GPU a 3D mesh, all the textures, shaders, and other data it needs, then issues a draw call, at which point the GPU performs several steps in series. Shader programs can replace the default operation for some of these steps.
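To make the stages concrete, here are minimal GLSL sources for the two most common ones, held as C string literals the way a host program would carry them (a sketch; the names and the constant colour are arbitrary):

Code: Select all

/* Vertex stage: decides where each vertex lands onscreen. */
static const char *vertex_src =
    "#version 330 core\n"
    "layout(location = 0) in vec3 position;\n"
    "uniform mat4 mvp;\n"           /* model-view-projection matrix */
    "void main() {\n"
    "    gl_Position = mvp * vec4(position, 1.0);\n"
    "}\n";

/* Fragment stage: chooses the colour of each output pixel. */
static const char *fragment_src =
    "#version 330 core\n"
    "out vec4 color;\n"
    "void main() {\n"
    "    color = vec4(1.0, 0.5, 0.2, 1.0);\n"   /* constant orange */
    "}\n";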
Espozo wrote:Wait, maybe I'm just not understanding you correctly, but are you saying the GPU code has to be compiled at run time upon starting the game?
Yes, to some extent. The driver is ultimately responsible for creating the final machine code shader that gets uploaded to the GPU, so that has to be done at runtime.

There's layers to this, though. The high level shader code can sometimes be pre-compiled into some intermediate format, which does most of the important compiling work (optimization, etc.) and then the driver step is more like a final translation. Even though GPU machine code is abstracted away here, it's similar enough from GPU to GPU that it's still pretty valid to do the bulk of the work offline without knowing which particular GPU it's going to be.

Finally, you can cache the shader data after it gets compiled, and save it to disk, if you want to avoid compiling it every time. (In some cases the driver does this automatically, too.)
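Under OpenGL, for instance, that caching can be done explicitly with the program-binary mechanism (a sketch assuming a linked program object and a GL 4.1-class context; the file I/O is left as a comment):

Code: Select all

#include <stdlib.h>
#include <GL/gl.h>

/* 'program' is a linked GLSL program object. */
void cache_program_binary(GLuint program)
{
    GLint len = 0;
    glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &len);

    void *blob = malloc((size_t)len);
    GLenum format = 0;   /* driver-specific format tag; save it with the blob */
    glGetProgramBinary(program, len, NULL, &format, blob);
    /* ...write format and blob to disk; on a later run, read them back and
       call glProgramBinary(program, format, blob, len) to skip the compile
       entirely... */
    free(blob);
}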
Drew Sebastino
Formerly Espozo
Posts: 3496
Joined: Mon Sep 15, 2014 4:35 pm
Location: Richmond, Virginia

Re: The lowest level possible for modern game development?

Post by Drew Sebastino »

rainwarrior wrote:Good renderers do a lot of CPU work trying to minimize the data that needs to be sent to the GPU.
Man, so you'd almost always want to have the CPU at 100% so you can get as much graphics performance as possible, unless you're just absolutely content with how the game looks. PCs actually seem like a pain in the ass to do any non-super-high-level stuff on, because the parts change from computer to computer. When you were talking about trying to do as much as possible on the CPU, what happens for people who have a very good graphics card but a bad CPU? (If you'd ever run into this problem.) Because you'd want your game to run cleanly on as many systems as possible, you'd then have to develop a system that gauges how good your CPU is compared to your GPU and adjusts the load accordingly. At least with a console, you don't have to worry about that. That's one advantage, though it's only a programming one (as in, easier for the developer), not really a performance one.

Being honest though, I imagine this stuff isn't worried about 90% of the time. You'd probably only need to worry about it if you're developing a game engine.
rainwarrior wrote:The driver is ultimately responsible for creating the final machine code shader that gets uploaded to the GPU, so that has to be done at runtime.
Okay, I think I'm starting to get it... Direct3D is like a traditional compiler that is used only for development and isn't run when you open a game, and the driver is a program that is part of the operating system and runs at runtime? I would have been under the impression that a driver is strictly for compatibility (as in, you don't need a different version of the program for every computer), but I thought I heard game consoles have them now. For consoles, it would make sense to me to just have the shader compiler output machine code directly; if a piece of software is designed to run on only one machine, a driver should be an unnecessary step.

Lastly (no promises! :lol:), assuming it's standard (I mean, the whole point is that it's supposed to work across multiple systems), shouldn't every driver be the same from the game's perspective, in that they all accept the same input? (Not including forwards compatibility, obviously.) I can't seem to find how it wants data formatted, although I really don't know what I'm looking for.
calima wrote:"easy to develop for" nowadays means you can make a game just by clicking, be it in Unity or Unreal. You buy an FPS kit and some models from the store, click to combine them, then click to output for Switch. Zero lines of code, zero programming, zero modeling. Often zero design too. Yeah, I don't hold those in very high regard; could you tell?
It can't be that bad, can it? :lol:
tepples
Posts: 22705
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)

Re: The lowest level possible for modern game development?

Post by tepples »

Espozo wrote:
rainwarrior wrote:Good renderers do a lot of CPU work trying to minimize the data that needs to be sent to the GPU.
Man, so you'd almost always want to have the CPU at 100% so you can get as much graphics performance as possible, unless you're just absolutely content with how the game looks.
That or you want to keep it low to save battery on a laptop or tablet PC, or you want to optimize the code so that it doesn't exceed 100% (and therefore slow down) in more complex scenes.
what happens for people who have a very good graphics card but a bad CPU?
Some games need more CPU than others. Say you bought an off-lease Core 2 Duo and put a $150 GPU in it. Games that do a good job of offloading stuff to the GPU will play well; others may exhibit low frame rate.
Okay, I think I'm starting to get it... Direct3D is like a traditional compiler that is used only for development and isn't run when you open a game, and the driver is a program that is part of the operating system and runs at runtime?
Direct3D is a library, not a "compiler", but your understanding is otherwise correct.
For consoles, it would make sense to me to just have the shader compiler output machine code directly; if a piece of software is designed to run on only one machine, a driver should be an unnecessary step.
On any console more sophisticated than the original Wii, you still need some sort of system-wide driver so that your game can share the GPU with operating system functions. These include notifications that a friend has signed on and wants to play, notifications that an update for a different game has finished downloading in the background, notifications that you have earned an achievement, the status of recording gameplay or streaming to Twitch, the status of voice chat, etc.
Lastly (no promises! :lol:), assuming it's standard (I mean, the whole point is that it's supposed to work across multiple systems), shouldn't every driver be the same from the game's perspective, in that they all accept the same input?
Different cards support different versions of the shader language, and the driver is supposed to tell the game what versions are supported.
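In OpenGL, for example, the game can simply ask (a minimal sketch; assumes a context is already current and GL 2.0 headers for GL_SHADING_LANGUAGE_VERSION):

Code: Select all

#include <stdio.h>
#include <GL/gl.h>

void print_driver_info(void)
{
    /* The driver reports what this hardware/driver combination supports. */
    printf("Renderer:     %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL version:   %s\n", (const char *)glGetString(GL_VERSION));
    printf("GLSL version: %s\n", (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
}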
calima wrote:"easy to develop for" nowadays means you can make a game just by clicking, be it in Unity or Unreal. You buy an FPS kit and some models from the store, click to combine them, then click to output for Switch. Zero lines of code, zero programming, zero modeling. Often zero design too. Yeah, I don't hold those in very high regard; could you tell?
It can't be that bad, can it? :lol:
Search the web for "asset flip". Feel disgust.
rainwarrior
Posts: 8731
Joined: Sun Jan 22, 2012 12:03 pm
Location: Canada

Re: The lowest level possible for modern game development?

Post by rainwarrior »

Espozo wrote:Man, so you'd almost always want to have the CPU at 100% so you can get as much graphics performance as possible, unless you're just absolutely content with how the game looks. PCs actually seem like a pain in the ass to do any non-super-high-level stuff on, because the parts change from computer to computer. When you were talking about trying to do as much as possible on the CPU, what happens for people who have a very good graphics card but a bad CPU? (If you'd ever run into this problem.) Because you'd want your game to run cleanly on as many systems as possible, you'd then have to develop a system that gauges how good your CPU is compared to your GPU and adjusts the load accordingly. At least with a console, you don't have to worry about that. That's one advantage, though it's only a programming one (as in, easier for the developer), not really a performance one.
No, you wouldn't normally try to adjust the CPU vs GPU load automatically at run-time. That's what the graphics settings are for. (These days nVidia cards even come with a utility that's like a library of "good" settings for games based on the hardware you have installed.)
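As a sketch of what those settings boil down to (hypothetical knobs, not any engine's real ones):

Code: Select all

/* Quality knobs the renderer consults; presets trade GPU load for looks. */
typedef struct {
    int   width, height;    /* render resolution */
    int   shadows;          /* 0 = off, 1 = on */
    int   texture_detail;   /* 0 = low .. 2 = high */
    float draw_distance;    /* beyond this, objects are culled */
} GraphicsSettings;

static const GraphicsSettings PRESET_LOW  = { 1280,  720, 0, 0, 150.0f };
static const GraphicsSettings PRESET_HIGH = { 1920, 1080, 1, 2, 500.0f };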

The nature of things is that you will either be using 100% of the available CPU, or 100% of the GPU, and the one that isn't fully utilized will wait on the other. One of these two things will be determining your maximum framerate in any given situation. Even if you have just one set of hardware it's quite normal for the changing game situation to shift from one to the other. It depends on how much stuff changes in your game.

I mentioned culling as the most important thing to do on the CPU to unload the GPU, but it's not like a transfer of work. You can't really just take stuff from the GPU and move it to the CPU, and you can't just spend more CPU on the problem and cull more stuff; you just remove everything you can so the GPU doesn't end up doing redundant work, and doing this isn't usually a big burden on the CPU.

There are some tasks that could potentially be done on a CPU or GPU, but that's a bit of an aside... the usual purpose of having a GPU is that it's really efficient at rendering things and you want to let it do what it's good at. The CPU acts like its "manager", trying to give it tasks in the appropriate order and form that will keep things running smoothly, and getting rid of tasks that would have gone to waste.
Espozo wrote:Being honest though, I imagine this stuff isn't worried about 90% of the time. You'd probably only need to worry about it if you're developing a game engine.
Incorrect. The quality of the engine might adjust your overall efficiency, but the load you're putting on the GPU is driven entirely by the art assets you're feeding into it. It doesn't matter how good the engine is, if you cram too much work into either end of it, your framerate will drop.

The only time it doesn't matter is when you're under-budget and already running at your target framerate. (At that point you're not bound by either CPU or GPU.)

If you don't have capable engineers on your team to diagnose specific performance problems, there are lo-fi approaches to trying to solve them (e.g. "delete the trees in that room, see if it helps?").

It's easier to do it on a console setting where there's only one target hardware, but it's still a problem there too.

For PC you set a "minimum specification" target and test to make sure your game runs on that with all the lowest settings. You might also set a "medium" and "maximum" target and try to balance the assets for those as well. You develop on whatever is reasonable and hope that by testing at least against the minimum spec, your game will be viable across a range of hardware setups. If you're lucky you have a QA department and a bunch of random hardware to get some additional testing done on. Then, of course, if you eventually release the game, you suddenly get live testing on all sorts of hardware you couldn't possibly have considered, and you'll probably find new bugs and try to patch issues as they're reported (or not, if you don't have the budget for it).
Espozo wrote:Okay, I think I'm starting to get it... Direct3D is like a traditional compiler that is used only for development and isn't run when you open a game, and the driver is a program that is part of the operating system and runs at runtime? I would have been under the impression that a driver is strictly for compatibility (as in, you don't need a different version of the program for every computer), but I thought I heard game consoles have them now. For consoles, it would make sense to me to just have the shader compiler output machine code directly; if a piece of software is designed to run on only one machine, a driver should be an unnecessary step.
Consoles are pretty much like the PC architecture, and drivers still exist there. Think about what it means to have both a PS4 and a PS4 Pro, for example. There's less variations on the hardware, but that doesn't mean the driver architecture idea isn't worthwhile. Again, you seem to be under the impression that the drivers are a significant source of inefficiency, and they really aren't, for the most part.

Direct3D is a runtime system that sits between the game program and the driver. If you're talking about shader compilation specifically, you can do some of the work before runtime, because it has an intermediate bytecode format for that purpose. Again, the driver has to have the final word when compiling the shader, though, so that gets done at runtime. (Results can typically be cached so it only has to happen the first time it's run.)
Espozo wrote:Lastly (no promises! :lol:), assuming it's standard (I mean, the whole point is that it's supposed to work across multiple systems), shouldn't every driver be the same from the game's perspective, in that they all accept the same input? (Not including forwards compatibility, obviously.) I can't seem to find how it wants data formatted, although I really don't know what I'm looking for.
Yes, that's entirely the purpose of having a driver. Standard input, with output customized to the hardware.
zzo38
Posts: 1096
Joined: Mon Feb 07, 2011 12:46 pm

Re: The lowest level possible for modern game development?

Post by zzo38 »

You mention GLSL and HLSL, but you forgot about ARB assembly language.

I use ARB assembly language to code shaders, although I would have preferred Checkout. But I don't do 3D graphics anyways.
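For the curious, a minimal ARB fragment program and the legacy call that hands it to the driver look something like this (a sketch; it requires the old ARB_fragment_program extension, with its function pointers loaded):

Code: Select all

#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Pass-through fragment program: output colour = interpolated input colour. */
static const char *fp =
    "!!ARBfp1.0\n"
    "MOV result.color, fragment.color;\n"
    "END\n";

void load_arb_fragment_program(void)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fp), fp);
}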

I still think the modern systems are too complicated though.
(Free Hero Mesh - FOSS puzzle game engine)
rainwarrior
Posts: 8731
Joined: Sun Jan 22, 2012 12:03 pm
Location: Canada

Re: The lowest level possible for modern game development?

Post by rainwarrior »

To be honest, I think of the shader assembly languages as obsolete at this point. They are useful to understand for debugging a GPU or analyzing the compiler output, but I haven't ever come across a situation where it seemed better to write in shader assembly than one of the high level languages.

Unlike typical C++ programs, shaders are very small, self-contained, and well-behaved programs. The compilers tend to do a very good job of optimizing HLSL / GLSL / Cg. At my last job I used to spend quite a lot of time looking at the generated assembly code, and there was never a moment where I thought it had missed an optimization opportunity (and it frequently found things to simplify that I hadn't thought of). I find it doesn't really compare to the assembly vs. compiler experience in CPU programming; GPU shader compilers are almost perfect.

Plus, as mentioned before, the "assembly" doesn't even translate directly to machine language because the driver does a translation pass on it. The shader assembly languages were always an intermediate thing.

Microsoft even deprecated their assembly shader language in D3D10; it remains only for debugging there.
calima
Posts: 1745
Joined: Tue Oct 06, 2015 10:16 am

Re: The lowest level possible for modern game development?

Post by calima »

ARB assembly is either limited to old functionality, or to running only on Nvidia cards (only Nvidia extended it). The only case where you'd want to use it is if you need to run on R300-R400 hardware or the equivalent Nvidia cards that don't support GLSL, and even then you should write GLSL and have the Cg compiler compile it to ARB for you.
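That Cg-compiler route is a one-liner offline (a sketch from memory of NVIDIA's old cgc tool, so treat the exact flags as assumptions):

Code: Select all

cgc -oglsl -profile arbfp1 -o shader.fp shader.glsl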