Thoughts on a server
douchebagatron
<div class="IPBDescription">from some dude who ran a server</div>So I ran a server all day yesterday and today. mostly because I was curious about the whole tickrate issue.
To start, here are my computer specs:
Intel core i7 920 @ 2.67GHz
3 GB ram
64 bit OS
Radeon HD 5770
I ran this server all day today: started it this morning before work at about 8am, came home to see no one had touched it at 5pm, then joined the server myself. At this point the idle server was still running at 25-30 ticks per second.
I idled in the server and left for a few hours, then came back at 8pm to see the server full of people playing. The server was still running at 25-30 ticks per second.
I experienced a similar situation last night, where I was playing for about 2 hours with the server never dipping below 20 ticks per second, and averaging around 25.
While the tick rate was staying fairly high, people did complain about lag. I couldn't get any straight answers from anyone, but I'm betting I can attribute this to running it off a home connection, and over wireless at that. No one had a terrible ping, but I can't find anything else that could be wrong.
I can't figure out why my tickrate stayed around 30 the whole time, while every other server I connect to has trouble ever getting above 20. Is it my machine, or is there something about my setup that is preventing the server from degrading?
The only thing I can imagine that's different is that I'm running it off a slower internet connection. The only way I can see that affecting anything is if packets are coming in/out at a rate that Lua's garbage collection can keep up with, but could that really be the problem?
Also, something interesting: I saw incredibly regular spikes in CPU usage. Would that be from garbage collection too?
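In case anyone wants to sanity-check the GC theory, here's a rough standalone Lua sketch (plain Lua 5.1, nothing NS2-specific, and the allocation numbers are made up) that churns some garbage and times forced incremental GC steps:

-- Standalone test: generate short-lived tables (like per-tick game state)
-- and time each forced incremental GC step.
local function make_garbage(n)
  local t = {}
  for i = 1, n do
    t[i] = { x = i, y = i * 2, name = "entity" .. i }
  end
  return t
end

for frame = 1, 200 do
  make_garbage(1000)                        -- simulated per-frame allocations

  local before = collectgarbage("count")    -- KB currently in use
  local start = os.clock()
  collectgarbage("step")                    -- run one incremental GC step
  local elapsed = os.clock() - start
  local after = collectgarbage("count")

  if elapsed > 0.001 then                   -- only log steps longer than ~1 ms
    print(string.format("frame %d: GC step %.2f ms, %.0f KB -> %.0f KB",
                        frame, elapsed * 1000, before, after))
  end
end

If the long steps land at a regular interval, that's at least consistent with GC being behind the spikes I'm seeing.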
Comments
As a programmer, this stands out. How "regular" is regular? Repeatability exposes bugs. Especially time-based repeatability.
While I wouldn't ask you to bust out an advanced tool like Process Explorer, could you at least take a Task Manager screenshot of CPU usage to show HOW regular it is? Bonus points for timing the interval, too, since Task Manager doesn't record timestamps.
As to your question, modern versions of Lua (since 2006) are supposed to spread garbage collection (GC) over time, avoiding spikes that would cause latency (aka stutter). In theory, the GC shouldn't be spiking the CPU constantly unless there's some pathological memory abuse going on.
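For reference, and purely as a generic Lua illustration (I don't know whether or how the NS2 server sets these), the incremental collector's behaviour is controlled by a couple of knobs you can set from script:

-- Generic Lua 5.1 GC tuning; the values shown are the stock defaults.
-- "pause" controls how much the heap may grow before a new cycle starts:
-- 200 means wait until memory use doubles; lower values collect more eagerly.
collectgarbage("setpause", 200)

-- "stepmul" controls how much work each incremental step does relative to
-- allocation: higher values finish cycles sooner but cost more CPU per step.
collectgarbage("setstepmul", 200)

-- A server loop can also drive the collector manually, e.g. once per tick:
-- collectgarbage("step", 1)

Lowering the pause or raising the step multiplier generally trades a bit more steady-state CPU for shorter GC cycles.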
Disclaimer: from bitter experience in embedded programming, I know there are so many edge cases, exceptions, and nuances that it's impossible to know definitively unless you are actually observing the program in action. tl;dr: Optimization is the devil.
So it's what Max has said recently: how the GC interfaces with C++ is the issue, alongside server stability, and once these are fixed the apparent "lag" people are experiencing due to tick rate will disappear. (Didn't he mention something about drastically rewriting the occlusion culling as well to help?)
Since he currently has only two technical tasks listed, and they are simple ones, something tells me he is really focusing on these parts.
o_stats is more useful for seeing what makes the memory grow. If you just want to see the hitches, profile 1 is perhaps more useful. But it is true that with profile 1 you won't see what caused the hitch (well, you can pause on a hitch and look for the garbage collector in the method list).
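If you can run arbitrary Lua on the server, a crude supplement (just stock Lua, not a replacement for what o_stats reports) is to sample collectgarbage("count") periodically and print how much the heap grew since the last sample:

-- Crude memory-growth tracker using only stock Lua. Not the same data as
-- o_stats; it just shows whether (and how fast) the Lua heap is growing.
local last_kb = collectgarbage("count")

local function sample_memory(tag)
  local now_kb = collectgarbage("count")
  print(string.format("[%s] %.0f KB (%+.0f KB since last sample)",
                      tag, now_kb, now_kb - last_kb))
  last_kb = now_kb
end

-- Example: call this once per second or once per server tick.
for i = 1, 5 do
  local junk = {}
  for j = 1, 10000 do junk[j] = { j } end  -- simulate allocations between samples
  sample_memory("tick " .. i)
end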
From running my server on three different sets of hardware, I can tell you that clock speed is the most important factor. I started on a Xeon at 2.4GHz, and moving to an i7 at ~3.3GHz increased the number of players I can have without the tick rate dropping. Increasing the clock again to 4GHz also helped. Right now the tick rate can stay at a steady 30 with up to about 12 players in a medium-sized game, and 14 in a light game. Anything over this and the tick rate dips below 25, which begins to cause problems. Actually, I think anything below 30 (or an average of 30, since it is always hovering between 28 and 32) will start to cause noticeable issues, getting worse and worse as the tick rate drops.