Stop buying NVIDIA
dang they got the monopoly
Dom 4 Sep @ 7:49am 
Yet NVIDIA's competitors would love to be where NVIDIA is, and they would engage in the exact same business practices.

Companies are not your friends. Buy whatever works best for you.
Gaming and graphics cards have become a fawking marathon.
Producers are watching gamers compete, but something is wrong... the RTX 30XX series is still in a great position, while the "AI-boosted cards with fake FPS for unoptimized games" are on top, because people think 18 GB or 25 GB is better than 10 GB (ignoring the cards' frequencies and other factors), so they think the 40XX and 50XX series are the best. But... no, not always.
Monk 4 Sep @ 8:04am 
Originally posted by The Grin:
Gaming and graphics cards have become a fawking marathon.
Producers are watching gamers compete, but something is wrong... the RTX 30XX series is still in a great position, while the "AI-boosted cards with fake FPS for unoptimized games" are on top, because people think 18 GB or 25 GB is better than 10 GB (ignoring the cards' frequencies and other factors), so they think the 40XX and 50XX series are the best. But... no, not always.

No, the newer cards are literally better than the older 30 series.

OP, instead of asking people to stop buying the best product, what you should be doing is asking the competitors to step up and be actual competition.

No one in their right mind would buy the inferior product just to support one billion-dollar company over another.

Buy the best product for your budget and use case; to do otherwise is madness.
Originally posted by Monk:
Originally posted by The Grin:
Gaming and graphics cards have become a fawking marathon.
Producers are watching gamers compete, but something is wrong... the RTX 30XX series is still in a great position, while the "AI-boosted cards with fake FPS for unoptimized games" are on top, because people think 18 GB or 25 GB is better than 10 GB (ignoring the cards' frequencies and other factors), so they think the 40XX and 50XX series are the best. But... no, not always.

No, the newer cards are literally better than the older 30 series.

OP, instead of asking people to stop buying the best product, what you should be doing is asking the competitors to step up and be actual competition.

No one in their right mind would buy the inferior product just to support one billion-dollar company over another.

Buy the best product for your budget and use case; to do otherwise is madness.

Look, I have an MSI RTX 3080 Ventus 3X OC with 10 GB of VRAM, in a $2,500 PC I bought four years ago.
Every time I buy a newer game (of course I'm not dumb: I rarely buy day one, especially at a horrendously bloated, unfair price while the game is still unoptimized) and play it (heavily modded, with RTX effects and ReShade + LUT when the game allows it), I rarely have any problems: no performance issues, crashes (except on Cyberpunk, because it's still a tech demo), artifacts, or blue screens.

Meanwhile, when I look at the forums, I see RTX 40 and 50 owners having tons of problems (even on HD-resolution screens!). The complaints every time are some mix of driver problems, low FPS even with DLSS, crashes, artifacts, etc.

I play the same games on very high or ultra (depending on whether it's necessary to crank things up that far); these players usually have the same processor I have or even better, yet many have more problems.

How do you explain this?

I know there are numerous factors to take into account, but put side by side, it's still questionable to see some players getting worse performance with newer-gen cards, right?
Last edited by The Grin; 4 Sep @ 8:23am
Monk 4 Sep @ 8:33am 
Originally posted by The Grin:
Originally posted by Monk:

No, the newer cards are literally better than the older 30 series.

OP, instead of asking people to stop buying the best product, what you should be doing is asking the competitors to step up and be actual competition.

No one in their right mind would buy the inferior product just to support one billion-dollar company over another.

Buy the best product for your budget and use case; to do otherwise is madness.

Look, I have an MSI RTX 3080 Ventus 3X OC with 10 GB of VRAM, in a $2,500 PC I bought four years ago.
Every time I buy a newer game (of course I'm not dumb: I rarely buy day one, especially at a horrendously bloated, unfair price while the game is still unoptimized) and play it (heavily modded, with RTX effects and ReShade + LUT when the game allows it), I rarely have any problems: no performance issues, crashes (except on Cyberpunk, because it's still a tech demo), artifacts, or blue screens.

Meanwhile, when I look at the forums, I see RTX 40 and 50 owners having tons of problems (even on HD-resolution screens!). The complaints every time are some mix of driver problems, low FPS even with DLSS, crashes, artifacts, etc.

I play the same games on very high or ultra (depending on whether it's necessary to crank things up that far); these players usually have the same processor I have or even better, yet many have more problems.

How do you explain this?

I know there are numerous factors to take into account, but put side by side, it's still questionable to see some players getting worse performance with newer-gen cards, right?

Anecdotal.

I've had zero issues with any of them.

You cannot believe your 3080 beats a 4080 or 5080 based on anecdotal evidence of a very small minority having issues that get blown out of proportion by the media for clicks and views.

You also likely play at 1080p, maybe 1440p, or are happy with lower refresh rates. But my 3090 had its limits, as did my 4090, 2080 Ti, SLI 1080 Tis, SLI 980s, etc. Tech moves on, and in most cases the new is better than the old.

The only benefit the older cards have over the new ones is that they can use PhysX.
Lixire 4 Sep @ 8:41am 
Originally posted by The Grin:
Originally posted by Monk:

No, the newer cards are literally better than the older 30 series.

OP, instead of asking people to stop buying the best product, what you should be doing is asking the competitors to step up and be actual competition.

No one in their right mind would buy the inferior product just to support one billion-dollar company over another.

Buy the best product for your budget and use case; to do otherwise is madness.

Look, I have an MSI RTX 3080 Ventus 3X OC with 10 GB of VRAM, in a $2,500 PC I bought four years ago.
Every time I buy a newer game (of course I'm not dumb: I rarely buy day one, especially at a horrendously bloated, unfair price while the game is still unoptimized) and play it (heavily modded, with RTX effects and ReShade + LUT when the game allows it), I rarely have any problems: no performance issues, crashes (except on Cyberpunk, because it's still a tech demo), artifacts, or blue screens.

Meanwhile, when I look at the forums, I see RTX 40 and 50 owners having tons of problems (even on HD-resolution screens!). The complaints every time are some mix of driver problems, low FPS even with DLSS, crashes, artifacts, etc.

I play the same games on very high or ultra (depending on whether it's necessary to crank things up that far); these players usually have the same processor I have or even better, yet many have more problems.

How do you explain this?

I know there are numerous factors to take into account, but put side by side, it's still questionable to see some players getting worse performance with newer-gen cards, right?

I used a 3080 10GB from 2020 until I got the 5080 in February, and while I did have noticeable issues like black screens or DX errors with the early drivers (e.g. 572.83), with the later 576.xx drivers and onward my experience has been basically smooth sailing across everything I do so far, and I definitely have a significant performance improvement over the previous card (more than 50% on average, and much better lows in games that eat more VRAM).

Running a 7800X3D, 64 GB (2x32) DDR5-6000 @ CL30 on an X670E Aorus Master, on Win11 25H2 (and yes, I'm running the Insider Preview, build 26200.5074).
Also, fun fact: updating the motherboard BIOS reduced the number of random black screens I was getting on the earlier 572 drivers by a large margin.

Everyone's setup is very different when it comes to their configuration at the BIOS level, the OS level, etc., and their hardware as a whole.
The Grin 4 Sep @ 8:59am 
Originally posted by Monk:
Originally posted by The Grin:

Look, I have an MSI RTX 3080 Ventus 3X OC with 10 GB of VRAM, in a $2,500 PC I bought four years ago.
Every time I buy a newer game (of course I'm not dumb: I rarely buy day one, especially at a horrendously bloated, unfair price while the game is still unoptimized) and play it (heavily modded, with RTX effects and ReShade + LUT when the game allows it), I rarely have any problems: no performance issues, crashes (except on Cyberpunk, because it's still a tech demo), artifacts, or blue screens.

Meanwhile, when I look at the forums, I see RTX 40 and 50 owners having tons of problems (even on HD-resolution screens!). The complaints every time are some mix of driver problems, low FPS even with DLSS, crashes, artifacts, etc.

I play the same games on very high or ultra (depending on whether it's necessary to crank things up that far); these players usually have the same processor I have or even better, yet many have more problems.

How do you explain this?

I know there are numerous factors to take into account, but put side by side, it's still questionable to see some players getting worse performance with newer-gen cards, right?

Anecdotal.

I've had zero issues with any of them.

You cannot believe your 3080 beats a 4080 or 5080 based on anecdotal evidence of a very small minority having issues that get blown out of proportion by the media for clicks and views.

You also likely play at 1080p, maybe 1440p, or are happy with lower refresh rates. But my 3090 had its limits, as did my 4090, 2080 Ti, SLI 1080 Tis, SLI 980s, etc. Tech moves on, and in most cases the new is better than the old.

The only benefit the older cards have over the new ones is that they can use PhysX.

The game industry is just like that:

Game developers think they need to "optimise" their games with frame gen in mind, to reduce the performance hit to native FPS... INSTEAD OF OPTIMISING THE GAME to run at a high enough native FPS that frame gen isn't needed in the first place.

Now... what was that again about "fake FPS for a wow effect, covering up the actually blurry textures" and the "AI architecture of the 40 and 50 series", with people getting frantic when they play a game at 300 FPS while the game is perfectly playable at 60 or 80 with no issues at all?

This is all marketing bullshyat. The more sheep you convert, the better.

On top of that, the human eye doesn't see 300 FPS on the screen (>~>), and yet so many still brag about it.

Nvidia has hit the ceiling, but they just push through it to greater heights, because there are still customers eating their "green bean soup".
Monk 4 Sep @ 9:17am 
Originally posted by The Grin:
Originally posted by Monk:

Anecdotal.

I've had zero issues with any of them.

You cannot believe your 3080 beats a 4080 or 5080 based on anecdotal evidence of a very small minority having issues that get blown out of proportion by the media for clicks and views.

You also likely play at 1080p, maybe 1440p, or are happy with lower refresh rates. But my 3090 had its limits, as did my 4090, 2080 Ti, SLI 1080 Tis, SLI 980s, etc. Tech moves on, and in most cases the new is better than the old.

The only benefit the older cards have over the new ones is that they can use PhysX.

The game industry is just like that:

Game developers think they need to "optimise" their games with frame gen in mind, to reduce the performance hit to native FPS... INSTEAD OF OPTIMISING THE GAME to run at a high enough native FPS that frame gen isn't needed in the first place.

Now... what was that again about "fake FPS for a wow effect, covering up the actually blurry textures" and the "AI architecture of the 40 and 50 series", with people getting frantic when they play a game at 300 FPS while the game is perfectly playable at 60 or 80 with no issues at all?

This is all marketing bullshyat. The more sheep you convert, the better.

On top of that, the human eye doesn't see 300 FPS on the screen (>~>), and yet so many still brag about it.

Nvidia has hit the ceiling, but they just push through it to greater heights, because there are still customers eating their "green bean soup".

Oh, so you really have no idea.

'The human eye can't see 300 FPS'... that you would say that tells us all we need to know.

Games are still being 'optimised'; it's just that they're also more detailed and complex than ever before, and there's only so much you can optimise.

We are at a point where complexity simply needs more power.

Frame gen is a way to get it without quadrupling the cost of GPUs.

Also, gameplay at 120, 240, or 360 FPS is smoother and nicer to watch than at 60 or 80, even if the latency is the same (or very slightly higher, but at a level that isn't perceptible with Reflex running).
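(To put rough numbers on that: frame time is 1000 / FPS in milliseconds, so each step up the refresh-rate ladder buys a smaller and smaller per-frame gain. A quick back-of-the-envelope sketch in plain Python, nothing GPU-specific, just to illustrate the arithmetic:)

```python
# Frame time in milliseconds is 1000 / fps; the per-frame gain
# shrinks as the frame rate climbs, but it never reaches zero.
for fps in (60, 80, 120, 240, 360):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")
# Output:
#  60 fps ->  16.67 ms per frame
#  80 fps ->  12.50 ms per frame
# 120 fps ->   8.33 ms per frame
# 240 fps ->   4.17 ms per frame
# 360 fps ->   2.78 ms per frame
```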

Anyway, you don't think we can see high FPS, so it's pretty pointless discussing anything with you.
Nvidia's proprietary CUDA will keep Nvidia on top. Its position in the market won't change anytime soon.
Last edited by Tiberius; 4 Sep @ 9:54am
Haruspex 4 Sep @ 10:00am 
Originally posted by Tiberius:
Nvidia's proprietary CUDA will keep Nvidia on top. Its position in the market won't change anytime soon.

This proprietary baloney is exactly why AMD gets my money, but that's just me; most people don't base their purchases on how open a company's standards are. On the competitive high end, you're absolutely right: AMD never misses an opportunity to miss an opportunity.
Well, I bought a 5090 at launch, but I'll make you a promise: I won't buy another until the 6090 drops.
Originally posted by Monk:
...

The only benefit the older cards have over the new ones is that they can use PhysX.

Correction: they can run 32-bit PhysX. The 50-series still runs 64-bit PhysX just fine.

There are roughly 200 games in total that use 32-bit PhysX, released from 2006 to around 2012. 64-bit PhysX was introduced in 2008 and had largely taken over by 2012: in March 2012, with the launch of Unreal Engine 4, UE moved its default PhysX to version 3.3.3, which was both 64-bit and supported GPU-accelerated as well as CPU (software-based) processing. Pretty much everything after 2015 is 64-bit, because that is when Unity 5 moved its native PhysX version to 3.3, leaving the two largest engines with 64-bit PhysX as their default physics engine.

If people still want to run those 10-to-20-year-old games with 32-bit PhysX, they can use any old, cheap second NVIDIA GPU from before the 50-series, such as an old 970, and set it as the dedicated PhysX GPU.
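(If you go that route, it's worth confirming the driver actually sees both cards before selecting the old one under the PhysX settings in the NVIDIA Control Panel. A rough sketch using nvidia-smi, assuming it ships with your driver and is on PATH; the output line in the comment is just illustrative:)

```python
# Rough sketch: list the GPUs the NVIDIA driver can see, so you can
# confirm the old second card shows up before setting it as the
# dedicated PhysX GPU in the NVIDIA Control Panel.
# Assumes nvidia-smi is installed with the driver and on PATH.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=index,name,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    print(line)  # e.g. "1, NVIDIA GeForce GTX 970, 4096 MiB"
```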
Originally posted by A Chaos Emerald?:
dang they got the monopoly

I like my Nvidia 4060.
_I_ 4 Sep @ 1:15pm 
AMD and Intel do have mid- and lower-end GPUs.

Nvidia doesn't really have competition at the high end.

If you want a high-end GPU, there isn't really much of a choice.
Originally posted by Monk:
It's a discussion forum; you made points and I countered them with the truth.
Oh, absolutely. This is a discussion forum, so it's expected that discussion will occur. The problem is that you don't discuss things. You seem to think your position is absolute truth, infallible, that you cannot possibly be wrong under any circumstances. You seem incapable of viewing anything from any perspective other than your own; yours is the only "true" perspective, while all others are wrong and in need of correction.

Discussions move in both directions, but you're only interested in telling others what you think without hearing or considering what others think.