Some games need to be restarted when you change graphical settings, while others don't. Do you know why? Here you'll get the answer to this question, which may have crossed your mind before. Every game handles things differently, because every game's graphics pipeline is different.
One such 3D pipeline might work like this:
- The game calculates (on CPU) where objects in the world are positioned
- Game passes this information to the GPU
- GPU calculates what part of the world can be seen from the camera, using its frustum (imagine a pyramid with the point chopped off)
- GPU draws the world, using triangles (a simple square, for example, is drawn as two triangles, but GPUs are really, really good at this)
- GPU then re-draws to apply shaders (this can happen a lot of times if you have many shaders and/or multi-pass shaders)
- GPU sends a frame to the operating system to display on the monitor
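The steps above can be sketched as a toy pipeline. This is a minimal sketch in Python; all the function names are invented, and a real pipeline runs most of this work on the GPU in parallel:

```python
# Toy model of the per-frame pipeline described above. Everything
# here is a simplification: real engines do these steps on the GPU,
# with vastly more state, in three dimensions.

def simulate(objects, dt):
    """CPU step: move objects through the world (1D positions here)."""
    return [(x + vx * dt, vx) for (x, vx) in objects]

def cull(objects, frustum_min, frustum_max):
    """GPU step (simplified): keep only objects the camera can see."""
    return [(x, vx) for (x, vx) in objects if frustum_min <= x <= frustum_max]

def draw(objects):
    """GPU step: rasterize with triangles; a square is two of them."""
    TRIANGLES_PER_QUAD = 2
    return len(objects) * TRIANGLES_PER_QUAD

def apply_shaders(frame_triangles, shader_passes):
    """GPU step: each shader pass re-touches the frame."""
    return frame_triangles * shader_passes  # stand-in for real work

objects = [(0.0, 1.0), (5.0, -1.0), (100.0, 0.0)]  # (position, velocity)
objects = simulate(objects, dt=1 / 60)
visible = cull(objects, frustum_min=-10.0, frustum_max=10.0)
triangles = draw(visible)
work = apply_shaders(triangles, shader_passes=3)
print(len(visible), triangles, work)
```

The object at position 100 is outside the frustum, so it never gets drawn; the two visible quads cost four triangles, re-touched once per shader pass.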
Keep in mind that this process has to happen very quickly: if you're playing at 60 frames per second, each frame has to be drawn in 1/60 ≈ 0.0167 seconds. Dropped frames occur when part of the process takes too long to complete. Here's a perfect example of why it's hard to give one answer: some games will hold everything up until that frame is on the screen, while others will keep calculating what happens but stop rendering until the frame is finished. You could code it to calculate only some things, or even to throw up what you have and move on.
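Here's that frame budget as a quick calculation. The per-stage timings below are made up purely for illustration:

```python
# At 60 fps, each frame must finish within 1/60 of a second. If the
# stages together run long, the frame is "dropped" and shown late.

FRAME_BUDGET = 1 / 60  # ≈ 0.0167 seconds

# Hypothetical time spent per stage on one bad frame, in seconds
stages = {"simulate": 0.004, "cull": 0.002, "draw": 0.009, "shaders": 0.005}

total = sum(stages.values())
dropped = total > FRAME_BUDGET
print(f"frame took {total * 1000:.1f} ms, "
      f"budget {FRAME_BUDGET * 1000:.1f} ms, dropped={dropped}")
```

Twenty milliseconds of work against a 16.7 ms budget: that frame misses its slot, and the game has to decide what to do about it.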
This whole process relies on several things being in place already. For example, models (3D shapes) and textures (images “painted” onto the models) need to exist. Shaders need to be written, enabled, and given objects or cameras to apply to. These are, at the most basic level, just files. Textures can be png, shaders can be opened with notepad, and models can sit on your computer even if you don’t have software to read them.
Another major hardware component is RAM, often called memory. The difference between RAM and a hard drive is that you trade storage space for speed, a lot of speed. If the operating system had to query the hard drive (even a solid state) for texture information several times a frame, we'd be playing slide shows. So we put all this data into hardware designed to move information quickly between the CPU, GPU, and other programs loaded into RAM (such as the operating system), at the cost of not holding much data at once.
Unfortunately, moving data from HDD/SSD to RAM happens at the speed of your HDD/SSD, because it's the slower of the two. This is why loading screens happen. While all that extra work is going on, you're likely to take longer than 0.016666… seconds to draw your frame, so you get to choose between loading slowly with good performance or loading quickly with poor performance.
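A toy comparison of those two loading strategies. The timings are invented; real asset sizes and disk speeds vary enormously:

```python
# Strategy 1: a loading screen (stall, then full speed).
# Strategy 2: streaming (stay responsive, accept missing detail).

ASSET_LOAD_TIME = 0.005  # hypothetical seconds to pull one asset off disk

def load_all_at_once(n_assets):
    """Loading screen: freeze the game for one long stall up front."""
    return n_assets * ASSET_LOAD_TIME  # total seconds frozen

def stream(n_assets, assets_per_frame=2):
    """Streaming: load a few assets each frame; the game keeps running,
    but with missing detail until loading finishes."""
    return -(-n_assets // assets_per_frame)  # ceiling division: frames needed

stall_seconds = load_all_at_once(120)
frames_of_pop_in = stream(120)
print(stall_seconds, frames_of_pop_in)
```

With these made-up numbers: a 0.6-second freeze up front, versus 60 frames (a full second at 60 fps) of textures and models popping in while you play.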
I know this is talking a lot about framerate instead of why the pipeline is hard to change on the fly, but we’re getting there, I promise. At this point, problems start to come down to the choices of various coders.
Is it acceptable to have the game stutter and choke when the player adjusts the settings? If so, the game programmer can allow those settings to change during gameplay. Simpler changes are more likely to pass this test.
Is it acceptable for the engine to allow certain elements to change on the fly? If the camera is altered, does that play nice with how the engine reads the shaders and such? Questions like these are answered by a different set of programmers doing a different job. The point of an engine is to create a workspace for game programmers that flows smoothly from task to task without causing too many issues; a choice here affects how we do things there, an optimization in this area comes at the cost of tasks in that area, etc. A game programmer can change this if and only if they have the time and ability to change the engine itself.
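One common way to encode those choices, sketched here with invented setting names: tag each setting as either safe to apply live or as requiring the pipeline to be rebuilt at the next launch.

```python
# Hypothetical sketch: simple settings apply immediately; ones that
# touch the pipeline are written out and read at the next boot.

LIVE = {"bloom_intensity", "vsync", "fov"}               # safe mid-game
RESTART_REQUIRED = {"renderer_backend", "window_mode"}   # pipeline rebuild

def apply_setting(name, value, current, pending):
    if name in LIVE:
        current[name] = value   # takes effect this frame
        return "applied"
    if name in RESTART_REQUIRED:
        pending[name] = value   # saved for the next launch
        return "restart required"
    raise KeyError(name)

current, pending = {}, {}
print(apply_setting("fov", 90, current, pending))
print(apply_setting("renderer_backend", "vulkan", current, pending))
```

Which bucket each setting lands in is exactly the judgment call described above, and it differs from game to game.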
Engine programmers have to answer to graphics API programmers. The engine (and therefore the game) does not talk directly to the GPU or the operating system; these have to go through a graphics API. The engine says “SDL, I would like a window to display my program in” and SDL says “give me your window size, the border details, whether you want it to be resized or not, and I’ll talk to Windows for you”. The people who engineer the API have to go through a similar process as the people making the game engine. Thus, anything the graphics API can’t do isn’t available to the game engine or the game programmer.
This goes down another level to the operating system (perhaps Windows) and the GPU’s language (perhaps OpenGL). Anything those can’t do trickles back to the game programmer.
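That layering can be modeled as a capability check: a feature reaches the game programmer only if every layer beneath them supports it. The capability sets below are invented for illustration; the layer names follow the article:

```python
# Toy model of the stack described above. Anything a lower layer
# can't do is unavailable to every layer above it.

LAYERS = [
    ("operating system", {"create_window", "resize_window", "fullscreen"}),
    ("graphics API",     {"create_window", "resize_window"}),  # no fullscreen here
    ("engine",           {"create_window", "resize_window", "live_shader_reload"}),
]

def available_to_game(layers):
    """A feature survives only if every layer in the stack supports it."""
    caps = layers[0][1]
    for _name, layer_caps in layers[1:]:
        caps = caps & layer_caps
    return caps

print(sorted(available_to_game(LAYERS)))
```

In this made-up stack, the OS could do fullscreen and the engine could hot-reload shaders, but neither feature survives the intersection, so the game programmer never sees them.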
So, want to change how much bloom there is? Might be the shader developer made it so you can pass in a number and intensify/diminish the effect. On the other hand, might be the engine doesn’t like passing in variables to shaders because that slows the pipeline, so you need to unload that shader and load in a different one. (I’d hope not, but it could happen.) This might necessitate rebooting the client.
Another example is screen size. Remember when we were talking about SDL getting all the information about your game window before it made the screen? Maybe another API will let you change the window size on the fly, or maybe it won't. If it can't, the game takes a note somewhere (config.ini is a good friend) so it knows what to tell the API when it boots up next.
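A minimal sketch of that "take a note somewhere" approach, using Python's standard configparser module. The section and key names are invented; a real game would write to a config.ini file on disk:

```python
# Write the desired window settings now, read them back "at next boot"
# before asking the graphics API for a window.
import configparser
import io

config = configparser.ConfigParser()
config["window"] = {"width": "1920", "height": "1080", "borderless": "yes"}

buf = io.StringIO()  # stands in for config.ini on disk
config.write(buf)

# Next launch: read the note back before creating the window
reloaded = configparser.ConfigParser()
reloaded.read_string(buf.getvalue())
width = reloaded.getint("window", "width")
borderless = reloaded.getboolean("window", "borderless")
print(width, borderless)
```

The settings screen can promise any resolution it likes; the note is what actually gets honored the next time the game negotiates a window with the API.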
These examples aren’t perfect, and anyone with experience will point out the mistakes I’ve made (graphics programming is far from my specialty). But I hope they give you the gist of what’s going on behind the scenes: there are a number of layers where programmers had to look at their options and decide what the next user would be allowed to do. Sometimes they agree that it’s OK to make the end user wait a moment, or to sacrifice some performance in places, in exchange for more options. Other times they favor a smoother experience, or simply don’t have the tools to do what they want because of trade-offs made elsewhere. Put together, that means different options menus get handled differently.