Serious Sam 4

Fixing the "Could not detect GPU" error
By RiptoR
I have noticed quite a few discussion posts about Serious Sam 4 not detecting recent GPUs. I'm in the same boat: the game simply won't recognize my 3080 Ti, even with the latest drivers installed.

While you can simply skip the warning and boot the game anyway, the game will then reset your graphics settings on every launch. Having to redo these settings manually every time you want to play quickly becomes annoying.

So I did a bit of digging, and found out what the problem actually is. And I also have a solution for it!
What is actually going wrong?
If you have the same problem, you have probably visited the game's discussion forum here on Steam as well, and you might have seen the devs asking people to post their log files.

First, browse to the install directory of the game, since this is where the log files can be found (the script files we'll take a closer look at are located here too). The easiest way is to right-click on the game in the Steam Library page, and then choose "Manage >> Browse local files".

Next, locate the log file called "Logs/Sam4.log", and open it in the text editor of your choice.

If you look in this log file, you'll most likely see your GPU is indeed recognized. The lines we are looking for are similar to the example below:

...
10:57:26 LOG: [D3D11] Using DXGIFactory1 interface.
10:57:26 LOG: GPU #1: NVIDIA GeForce RTX 3080 Ti from nVidia
10:57:26 INF: [D3D11] Detected devices:
10:57:26 INF: #1: (10DE; 2208:38971462.00A1) NVIDIA GeForce RTX 3080 Ti (12108, 0/16296 MB)
...

So this means the game does indeed recognize the GPU, but for some reason will still claim it has no idea what GPU is installed.

Let's keep looking. A bit further down in the log file it will also say something like:

...
10:57:26 LOG: Processing file Content/SeriousSam4/Config/CheckDriver.lua
10:57:26 INF: Driver version: 47196 (required: 45600)
10:57:26 LOG: Processing file Content/SeriousSam4/Config/SystemCompatibility.lua
10:57:26 LOG: Compatibility check failed: GPUDetect
...

So even though the GPU is recognized, the game also runs a script file called "SystemCompatibility.lua", and it is this script that fails to detect the GPU.

Since this is a Lua script, we can easily check what it is doing.

Open the file "Content/SeriousSam4/Config/SystemCompatibility.lua" with any text editor (Notepad will do just fine). Near the end of the script, we'll find the code that checks the GPU:

...
-- video RAM check
if gfx_ulVideoMemoryMB < 2500 then
  CollectCompatibilityFailMessage("Video memory is below minimum requirements (" .. gfx_ulVideoMemoryMB .. " MB)", "VidRAM")
end

-- get the GPU specs
dofile("Content/Shared/Config/PerfIndexGPU.lua")
local gpuSpecs = globals.gpuSpecs
globals.gpuSpecs = nil

-- if we have obtained the GPU specs
...

So it seems the script is able to detect the amount of VRAM on the GPU, but not the GPU specs.

The line dofile("Content/Shared/Config/PerfIndexGPU.lua") is of interest here: it pulls in another script file, which is apparently used to determine the actual GPU specs.

Since it's another Lua script, we can simply open it in a text editor to see what it does.

In this file, we'll see that a couple of variables are declared first, containing information on hundreds of different GPUs, each with a performance index. A quick search through these variables showed that my card (the 3080 Ti) wasn't in the list, so it's safe to say this is the actual problem: the game uses a hardcoded list of GPU identifiers, and if your card isn't in it, you're out of luck.

If we look near the end of the file, this does indeed seem to be the case:

...
local function GetGpuSpecs()
  local retVal = nil

  -- get GPU vendor from the id
  local gpuVendor = gpuVendorIDs[sys_iGPUVendorID]
  if gpuVendor == nil then
    return retVal
  end

  -- in any case, we can provide the vendor information since we found it
  retVal = {vendor = gpuVendor}

  -- get gpu from vendor
  local vendorGpuIDs = perVendorGpuIDs[gpuVendor]
  -- nothing more to do if there is no gpu info for this vendor
  if vendorGpuIDs == nil then
    return retVal
  end

  -- get gpu specs from vendor gpus and current device id
  local uwCurrentDevID = (sys_iGPUDeviceID % 0x10000);
  -- store the calculated device id as card id in return value (as it gets used sometimes)
  retVal.cardId = uwCurrentDevID
  local card = vendorGpuIDs[uwCurrentDevID]
  -- nothing more to do if we cannot determine card information
  if card == nil then
    return retVal
  end
...

If we take a closer look at the "GetGpuSpecs()" function, we see that:

  1. The function first determines the vendor based on the "sys_iGPUVendorID" variable: local gpuVendor = gpuVendorIDs[sys_iGPUVendorID]
  2. Next, it uses this info to select the relevant variable with the list of GPU's for this vendor: local vendorGpuIDs = perVendorGpuIDs[gpuVendor]
  3. Then it uses your card's device ID (derived from "sys_iGPUDeviceID") to look up the actual GPU in that list: local card = vendorGpuIDs[uwCurrentDevID]

If that last step fails, for example because the hardcoded lists don't include your GPU, the function bails out early and hands "SystemCompatibility.lua" a result without any actual card information (or even "nil" if the vendor itself wasn't recognized). "SystemCompatibility.lua" then marks the compatibility check as failed, which is the "GPUDetect" error we saw in the log.
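
We don't see the rest of "SystemCompatibility.lua" in the snippet above, but based on the "GPUDetect" log line and the "CollectCompatibilityFailMessage()" call used for the VRAM check, the failing check presumably looks roughly like this (a sketch of what the shipped script likely does, not its exact code):

-- hypothetical continuation of SystemCompatibility.lua; the real file may word
-- this differently, but something along these lines has to produce the
-- "Compatibility check failed: GPUDetect" entry in the log
if gpuSpecs == nil or gpuSpecs.performance == nil then
  CollectCompatibilityFailMessage("Could not detect GPU", "GPUDetect")
end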


Now that we know what the problem is, we can easily fix it. There are two different ways to do this.
Fix 1 - Bruteforcing the detection
The first way is the easiest: we'll simply make the script believe we have a different GPU.

Open the "Content/Shared/Config/PerfIndexGPU.lua" script.

In the variables with the different vendors and GPUs, locate a card with performance similar to your own GPU. In my case, I used the info for the 3090, since the 3080 Ti and the 3090 perform basically the same in games.

Write down the identifier for this GPU. This is the value in between the square brackets in the line with the GPU info, for example "0x2204" in my case:

[0x2204] = { performance=4820, name="GeForce RTX 3090" }

Next, scroll all the way down to the bottom of the script, and locate the following lines:

...
local function GetGpuSpecs()
  local retVal = nil
...

We simply need to add a new line between these two lines, in which we manually set the "sys_iGPUDeviceID" variable (the one the function uses to look up the card in the list) to the identifier of the GPU we chose earlier. The code should look like this:

...
local function GetGpuSpecs()
  sys_iGPUDeviceID = 0x2204
  local retVal = nil
...

Finally, simply save the changes to the file, and we're done. The game will now believe we have a different GPU.
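
A small optional refinement (my own tweak, not part of the original script, and written for nVidia cards only): instead of overriding the device ID unconditionally, you can make the override apply only while your real card is missing from the table. That way the change becomes harmless if a future patch adds your GPU to the list:

...
local function GetGpuSpecs()
  -- only pretend to be a 3090 (0x2204) while the real device ID is missing
  -- from the nVidia table; adjust the table and the ID for non-nVidia cards
  if perVendorGpuIDs.nVidia[sys_iGPUDeviceID % 0x10000] == nil then
    sys_iGPUDeviceID = 0x2204
  end
  local retVal = nil
...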
Fix 2 - Expanding the list of known GPUs
The second fix is slightly more complicated, but also a bit more elegant: we will add our own GPU to the existing list of known GPUs.

First, we need to find out what the device identifier is for our own GPU. We can find this in the log files we examined earlier.

The ID we are looking for can be found in the following lines in the logfile:

...
10:57:26 INF: [D3D11] Detected devices:
10:57:26 INF: #1: (10DE; 2208:38971462.00A1) NVIDIA GeForce RTX 3080 Ti (12108, 0/16296 MB)
...

The identifier we need is the value between the semicolon (";") and the colon (":") in the detected device line:

#1: (10DE; 2208:38971462.00A1) NVIDIA GeForce RTX 3080 Ti (12108, 0/16296 MB)

So in my case, for a 3080 Ti, this would be "2208". Note that this value is hexadecimal, which is why the matching table keys in the script are written with a "0x" prefix (so "0x2208" for my card).
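
If you want to double-check that the value from the log and the table keys really use the same numbering, you can do so in any standalone Lua interpreter (this is purely optional and not needed for the fix; the interpreter is not part of the game):

-- "2208" in the log is hexadecimal, so it corresponds to the table key 0x2208
print(0x2208)                        -- prints 8712, the same value in decimal
print(string.format("%04X", 0x2208)) -- prints 2208, as it appears in the log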


Next, open the "PerfIndexGPU.lua" script and locate the variable containing the GPU list for your own card's vendor (in my case, nVidia). In this list, find the GPU that is closest in performance to your own:

...
-- device ID tables per vendor
local perVendorGpuIDs = {
  -- nVidia
  nVidia = {
    -- desktop (driver 461.72)
    [0x06C0] = { performance=440, name="GeForce GTX 480" },
    [0x06C4] = { performance=345, name="GeForce GTX 465" },
    ...a whole lot of other GPU's here...
    [0x21C4] = { performance=1600, name="GeForce RTX 1660 SUPER" },
    [0x21D1] = { performance=835, name="GeForce GTX 1650 Ti" },
    [0x2204] = { performance=4820, name="GeForce RTX 3090" },
    [0x2206] = { performance=4050, name="GeForce RTX 3080" },
    [0x2207] = { performance=4050, name="GeForce RTX 3080" },
    ...even more GPU's here...
  },
  -- ATi
  ATi = {
...

Now simply copy the line of the GPU whose performance is closest to your own (in my case, the 3090), change the identifier to the one we found above, written with the "0x" prefix (so "0x2208" in my case), and (optionally) change the description to the name of your card.

And if you really want to, you could also update the performance value, although this is completely optional and won't make much of a difference.

The code should look like this now:

...
    ...a whole lot of other GPU's here...
    [0x2204] = { performance=4820, name="GeForce RTX 3090" },
    [0x2208] = { performance=4800, name="GeForce RTX 3080 TI" },
    ...even more GPU's here...
  },
  -- ATi
  ATi = {
...


Finally, simply save the changes to the file, and we're done. The game will now recognize our GPU.
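
As an optional sanity check (this again assumes you have a standalone Lua interpreter installed, which is not part of the game), you can verify that the edited script still parses before launching the game. Run the following from the game's install directory:

-- loadfile() only compiles the script without executing it, so it will report
-- a syntax error (for example a missing comma after the newly added entry)
assert(loadfile("Content/Shared/Config/PerfIndexGPU.lua"))
print("PerfIndexGPU.lua parses fine")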
Final words
There you have it: we fixed the GPU detection!

The game will now start without the annoying message and without resetting your graphics settings each time.

And as an extra bonus, you'll also be able to use "Autodetect" to automatically determine the best settings for your system.


It's finally time to kick some serious butt!
18 Comments
RiptoR  [author] 10 hours ago 
yw :D:
swaggerfox 22 hours ago 
tysm
RiptoR  [author] 23 May @ 7:13am 
Glad I could help :D:
Roastd 22 May @ 2:09pm 
Thanks for your guide man, it didn't recognize my RX 7900 XTX... I got the game on GOG and now I can finally play it! :2019love:
null🌀 19 Mar @ 2:45am 
many thanks to the author, I couldn't find a solution anywhere else.
λ 𝑭𝑹𝑬𝑬𝑴𝑨𝑵 4 Dec, 2024 @ 3:27pm 
Great work :smartsam:
ElMono420 22 Oct, 2024 @ 9:55pm 
Thanks for the coding curse hahaha
RiptoR  [author] 31 Jan, 2024 @ 1:58pm 
@Gray Cat: I'll reply in the discussion forums, as comments here are limited to 1000 characters.
Gray Cat 31 Jan, 2024 @ 12:48pm 
Hey, it's not really relevant to the guide's topic, but maybe you'll know what to do (since no one responds to my discussion threads and I pretty much have no one else to ask, and you seem to understand all this stuff better than me).

My friend has an NVIDIA GeForce RTX 3080. He doesn't get any error messages from the game and it should work fine, but mid-game he experiences fps drops to 0. The game freezes more frequently and for longer periods of time if there are a lot of enemies and/or the game session lasts long. We tried the fixes presented in this guide, since the data in his .lua files was similar, but it didn't really change anything.

These freezes only happen in SS4 and Siberian Mayhem; any other game works fine. Any ideas, please? :lunar2019deadpanpig:

https://drive.google.com/file/d/1KYX6bngmDdES4HG_h1D3w8zhSchnH4aL/view?usp=drive_link

Log
RiptoR  [author] 28 Jan, 2024 @ 2:12pm 
Yeah, it sucks the devs haven't updated the detection scripts, or at least made it so the game doesn't reset graphics settings each time on boot when it fails to detect your actual gpu.

That's the problem with games using methods like this to detect optimal performance settings for a system: the scripts get outdated pretty fast and need regular updating to prevent problems like this.