Hmm... checked r_stats, my graphics card does not lie. 4096 MB on board, 4096 MB reported as free. Seems the extra bucks invested in a model with more VRAM were a good decision. I have texture streaming turned off anyway.
matsoMaster of PatchesJoin Date: 2002-11-05Member: 7000Members, Forum Moderators, NS2 Developer, Constellation, NS2 Playtester, Squad Five Blue, Squad Five Silver, Squad Five Gold, Reinforced - Shadow, NS2 Community Developer
Yea, well... thing is, the driver reports available texture memory as an unsigned int (GetAvailableTextureMem), which maxes out at 4 GB of VRAM. So your card may be trying to lie its head off... but it just can't.
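To see why, note that a 32-bit unsigned return value means the true amount comes back modulo 4 GiB. A minimal sketch of the truncation (the helper name is mine, not engine code):

```cpp
#include <cstdint>

// Simulates what a DWORD-returning query does to a card's true VRAM:
// only the low 32 bits survive, so anything at or above 4 GiB wraps.
uint32_t reported_texture_mem(uint64_t true_bytes) {
    return static_cast<uint32_t>(true_bytes);  // truncate to low 32 bits
}
```

A 6 GiB card would come back as 2 GiB, and an exact 4 GiB value wraps all the way to zero unless the driver clamps just below the ceiling.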
Fan-freaking-tastic. So the virtual memory figure is something NS2 tracks, but it gets that number from the video card driver? I'm assuming that's the case; otherwise this would never have been an issue (the other parts notwithstanding, of course).
Perhaps add a check to the code that detects when the reading is likely false (somewhere around 3.5 GB; I've never seen a 3.5 GB graphics card) and prompts the user to change the setting manually for best game performance.
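That heuristic could be as simple as flagging readings in the implausible band; a sketch, with the threshold picked arbitrarily as suggested above:

```cpp
#include <cstdint>

// A reading near the 4 GiB DWORD ceiling can't distinguish a true 4 GB
// card from a bigger one that wrapped around, so treat it as unreliable
// and ask the user instead. The ~3.5 GiB threshold is a guess, not tuned.
bool vram_reading_suspicious(uint32_t reported_bytes) {
    const uint32_t threshold = 3584u * 1024 * 1024;  // 3.5 GiB
    return reported_bytes >= threshold;
}
```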
I've got a Quadro card in my laptop with 4 GB of VRAM. Any way of letting us disable texture streaming if we have an ungodly amount of VRAM? I'd prefer not to see textures load in.
AsranielJoin Date: 2002-06-03Member: 724Members, Playtest Lead, Forum Moderators, NS2 Playtester, Squad Five Blue, Reinforced - Shadow, WC 2013 - Shadow, Subnautica Playtester, Retired Community Developer
There will be a "disabled"/"none" setting (I forget its name) that is basically unlimited, which is what you'd want with such a card (that said, NS2 will never actually use that much).
Can the appropriate value really not be detected automatically some other way? I think we can safely assume only a relatively small portion of players ever venture into the options, and an even smaller portion understands what they're looking at. Running well out of the box has to be a priority. An option is a great start, but it only helps the existing, technical player base.
If the game had the proper device ID name, it could query a list and know I imagine.
DC_DarklingJoin Date: 2003-07-10Member: 18068Members, Constellation, Squad Five Blue, Squad Five Silver
Every hardware device has a unique ID, and games can indeed query it. So technically yes, it's possible.
However, with new hardware coming out at such a fast pace, on top of all the hardware already out there, building such a list is a huge amount of work, I imagine. Keeping it up to date is even more.
I have seen games from far bigger companies do exactly this. And what happens? After a few years they either don't recognise the card or recognise it wrong, so the game picks the wrong settings and you still have to set them manually.
But surely you don't have to know the model-specific ID of the video card the game is running on in advance? I'll be the first to admit I don't know how this works, but that would be implausibly terrible.
DC_DarklingJoin Date: 2003-07-10Member: 18068Members, Constellation, Squad Five Blue, Squad Five Silver
If you want to know which card the game is running on, you need to match the ID and/or model number. So yes, you are required to know them.
Of course you can update the list later, but that still requires labor.
How else would you know? To my knowledge, GPU manufacturers don't publish an "our cards" list for game devs to use (then again, I never asked them).
But seeing as folk rely on third-party sites to look up unknown IDs, I doubt such an open list from manufacturers exists.
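For what it's worth, the lookup described above would boil down to a PCI vendor/device ID table. The entries below are purely illustrative (two well-known vendor IDs, not a real database), and the "unknown card" return is exactly the failure mode mentioned:

```cpp
#include <cstdint>
#include <map>
#include <utility>

// Hypothetical device-ID table: (PCI vendor ID, device ID) -> VRAM in MB.
// Any game shipping such a table must keep adding entries forever,
// which is the maintenance burden discussed above.
static const std::map<std::pair<uint16_t, uint16_t>, uint32_t> kVramMB = {
    {{0x10DE, 0x1180}, 2048},  // 0x10DE = NVIDIA vendor ID (entry illustrative)
    {{0x1002, 0x67B1}, 4096},  // 0x1002 = AMD vendor ID (entry illustrative)
};

// Returns 0 for an unknown card -- the case where, years later, the
// game guesses wrong and you still have to set things manually.
uint32_t lookup_vram_mb(uint16_t vendor, uint16_t device) {
    auto it = kVramMB.find({vendor, device});
    return it == kVramMB.end() ? 0 : it->second;
}
```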
Yes, it should be possible. On Windows you can use DXGI (DirectX 10 and up), DirectDraw, or WMI to get mostly correct figures; DXGI and DirectDraw seem to low-ball it a bit, returning 1990 MB instead of 2048 MB on my card, for example. I don't know about Linux, but I can't imagine it'd be too difficult.
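A Windows-only sketch of the DXGI route (assumes linking dxgi.lib; error handling trimmed). The key point is that DXGI_ADAPTER_DESC1::DedicatedVideoMemory is a SIZE_T rather than a DWORD, so a 64-bit process is not capped at 4 GB:

```cpp
#include <dxgi.h>
#include <cstdint>

// Query dedicated VRAM of the primary adapter via DXGI.
// Returns 0 on failure (e.g. pre-DX10 systems without DXGI).
uint64_t dedicated_vram_bytes() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 0;
    uint64_t bytes = 0;
    IDXGIAdapter1* adapter = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK) {  // adapter 0 = primary
        DXGI_ADAPTER_DESC1 desc;
        if (SUCCEEDED(adapter->GetDesc1(&desc)))
            bytes = desc.DedicatedVideoMemory;
        adapter->Release();
    }
    factory->Release();
    return bytes;
}
```

Falling back to DirectDraw or WMI when DXGI isn't present covers the pre-DX10 case.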
If you need to use DX10, aren't you leaving out the XP users?
PS: I don't have access to the engine code or necessarily know if there are other limitations, self-imposed or otherwise.
Yup, but then you'd just fall back to DirectDraw if DXGI isn't available. With DXGI being simpler, suited to the task, and more future-proof, it'd probably be better to attempt it first.
Apropos XP users.
Microsoft stopped supporting that.
How many of you are still using XP?
I'm curious, because maybe we can stop worrying about some issues if there are only three people left using XP.
matsoMaster of PatchesJoin Date: 2002-11-05Member: 7000Members, Forum Moderators, NS2 Developer, Constellation, NS2 Playtester, Squad Five Blue, Squad Five Silver, Squad Five Gold, Reinforced - Shadow, NS2 Community Developer
Mmm... thing is, x64 is not like adding support for OS X or Linux, i.e. expanding to a new group of users. x64 users can run the current program just as 32-bit users can... so the only reasons to go 64-bit would be either that you want to do things that can't be done in a 32-bit process, or that so few 32-bit users are left that you can drop that platform and go 64-bit just because it's easier to work in.
But right now 32-bit isn't causing us that much of a problem, and there are enough 32-bit users left (I think? Anyone got numbers?) that going 64-bit only seems unnecessary.
I was saying way down the line because it is unnecessary. The only reason I would even want x64 is that I have seen mods that made infestation or shadows look better, for example, but used more memory and were not viable due to the 32-bit limit.
DC_DarklingJoin Date: 2003-07-10Member: 18068Members, Constellation, Squad Five Blue, Squad Five Silver
Actually, on 64-bit Windows a 32-bit application can use much more than 2 GB of memory if the application is built with the large-address-aware flag. (I checked; it is.)
The hard max is still 4 GB, but really... would you hit 4 GB of memory purely for NS2?
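The flag in question is the linker's large-address-aware bit (/LARGEADDRESSAWARE on MSVC). The address-space ceilings it buys are fixed numbers, sketched here (simplified: the 3 GiB case on a 32-bit OS also requires the increaseuserva boot option):

```cpp
#include <cstdint>

// User-mode address-space ceiling for a 32-bit Windows process:
//   default                         -> 2 GiB
//   large-address-aware, 32-bit OS  -> up to 3 GiB (with boot option)
//   large-address-aware, 64-bit OS  -> 4 GiB
uint64_t max_user_address_space(bool large_address_aware, bool on_64bit_os) {
    const uint64_t GiB = 1ull << 30;
    if (!large_address_aware) return 2 * GiB;
    return on_64bit_os ? 4 * GiB : 3 * GiB;
}
```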
Now to translate - New patch = good! Grug make fire now!
Thanks for this post, very enjoyable to read.
Suck on that nvidia fanbois!
It's forced on / hard-coded for all users.
Hence low-VRAM users having issues since Reinforced, when this was introduced.
"Low VRAM" in this case means anyone with <1.5 GB of VRAM on some maps.
I have an R9 290 with 4 GB of VRAM and I have "bad" FPS. Glad it can be fixed ^_^
still if it gets the job done.. ^^
/alwaysnicetolearnstuff
FPS =/= frametimes.
Don't get any false expectations.
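The FPS-vs-frame-times point can be made concrete: the mean hides spikes. A tiny illustration (numbers made up):

```cpp
#include <algorithm>
#include <vector>

// Mean FPS over a set of frame times in milliseconds.
double mean_fps(const std::vector<double>& frame_ms) {
    double total = 0;
    for (double ms : frame_ms) total += ms;
    return 1000.0 * frame_ms.size() / total;
}

// The single worst frame -- the stutter you actually feel.
double worst_frame_ms(const std::vector<double>& frame_ms) {
    return *std::max_element(frame_ms.begin(), frame_ms.end());
}
```

Nine 16 ms frames plus one 100 ms hitch still averages about 41 FPS, a number that looks fine while the hitch does not.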
http://store.steampowered.com/hwsurvey
4.9% of Steam users on XP 32-bit, 0.27% on 64-bit.
Looks like 1.9% or thereabouts below DX10?
Pretty sure I don't have 5898 MB of VRAM ....