Hi, I'm trying to reduce the network utilization of a server program for each connected client. I'm monitoring network utilization in the Networking tab of the Windows Task Manager. When I start the server and wait for connections, utilization sits at 0.00%. When one client connects and logs into the game, it rises to 0.13% while that client is idle; when the client is running around and playing the game it goes up to about 0.25%, varying by a few hundredths of a point. I'm trying to squeeze as many connections onto my server machine as possible.
When I tested the server program on my new computer, the idle network utilization per client was 0.30% and the playing utilization was 0.90%. After tinkering with the client's and server's connection threads (where all the TCP socket communication happens), I found that 25ms was the longest the client's connection thread could sleep without losing performance and "lagging"; anything shorter than that dramatically increased the network utilization on the server. Likewise, a 15ms sleep in the server's connection thread was best for performance (anything longer also made the client "lag"), and anything shorter increased the network utilization.
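For reference, here's a minimal sketch of roughly how each connection-thread loop is structured (the class and method names below are simplified placeholders, not my actual code):

```java
// Simplified sketch of a connection thread that polls the socket and then
// sleeps between passes. GameConnection and handleMessage are placeholders.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class GameConnection implements Runnable {
    private final Socket socket;
    private final long sleepMillis; // 25 on the client, 15 on the server

    public GameConnection(Socket socket, long sleepMillis) {
        this.socket = socket;
        this.sleepMillis = sleepMillis;
    }

    @Override
    public void run() {
        try (BufferedReader in = new BufferedReader(
                 new InputStreamReader(socket.getInputStream()));
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            while (!socket.isClosed()) {
                // Drain whatever messages are already buffered.
                while (in.ready()) {
                    String line = in.readLine();
                    if (line == null) {
                        return; // peer closed the connection
                    }
                    handleMessage(line, out);
                }
                Thread.sleep(sleepMillis); // the delay I'm tuning
            }
        } catch (IOException e) {
            // connection dropped; let the thread exit
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    private void handleMessage(String line, PrintWriter out) {
        // game-specific handling and replies would go here
    }
}
```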
I tried shortening some of the strings the server sends to the client by replacing the identifying first part of each string, e.g. "MonsterData", with a short code like "6", and I did that for everything the server sends. That did not lessen the network utilization either.
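As a concrete example of that change (the payload layout below is made up for illustration), a monster update went from something like the first format to the second:

```java
// Illustration only: same data, shorter identifier prefix.
public class MessageFormat {
    static String monsterUpdateOld(int id, int x, int y) {
        return "MonsterData:" + id + ":" + x + ":" + y; // original long identifier
    }

    static String monsterUpdateNew(int id, int x, int y) {
        return "6:" + id + ":" + x + ":" + y; // shortened identifier
    }

    public static void main(String[] args) {
        System.out.println(monsterUpdateOld(42, 100, 200)); // MonsterData:42:100:200
        System.out.println(monsterUpdateNew(42, 100, 200)); // 6:42:100:200
    }
}
```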
I also rewrote some of the server code to keep "lastSent" variables in the connection thread for things like hit points, strength, experience points, etc. Instead of always sending all of the variables (around 15 of them), the server now only sends a variable if it has changed since the last time it was sent. That did not decrease network utilization either.
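Roughly what that looks like (field names simplified; my real code tracks around 15 of these):

```java
// Sketch of the "only send when changed" bookkeeping kept per connection.
import java.io.PrintWriter;

public class PlayerStatSender {
    private int lastSentHitpoints  = Integer.MIN_VALUE;
    private int lastSentStrength   = Integer.MIN_VALUE;
    private int lastSentExperience = Integer.MIN_VALUE;

    public void sendChangedStats(PrintWriter out, int hitpoints, int strength, int experience) {
        if (hitpoints != lastSentHitpoints) {
            out.println("HP:" + hitpoints);
            lastSentHitpoints = hitpoints;
        }
        if (strength != lastSentStrength) {
            out.println("STR:" + strength);
            lastSentStrength = strength;
        }
        if (experience != lastSentExperience) {
            out.println("EXP:" + experience);
            lastSentExperience = experience;
        }
    }
}
```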
I also added another Thread.sleep(5) to the server connection, right after it sends the monster data: if there is no data to send, it sleeps. That decreased network utilization by 0.03% while the client is idle. Adding more Thread.sleep() calls after sending other kinds of data does not decrease network utilization any further.
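The extra pause looks roughly like this (monsterData stands in for whatever the server assembled on this pass; null means nothing changed):

```java
// Sketch of the conditional pause after the monster-data step: send if there
// is anything to send, otherwise sleep briefly.
import java.io.PrintWriter;

public class MonsterUpdateStep {
    static void sendOrPause(PrintWriter out, String monsterData) throws InterruptedException {
        if (monsterData != null && !monsterData.isEmpty()) {
            out.println(monsterData); // something changed, push it out
        } else {
            Thread.sleep(5);          // nothing to send this pass, so pause
        }
    }
}
```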
I also tinkered with setReceiveBufferSize() and setSendBufferSize() on both the client and the server; that did not help either.
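For what it's worth, this is the kind of call I mean (the 8 KB value below is arbitrary, just for illustration):

```java
// Sketch of the buffer-size tuning I tried on each socket.
import java.net.Socket;
import java.net.SocketException;

public class BufferTuning {
    static void applyBufferSizes(Socket socket) throws SocketException {
        socket.setSendBufferSize(8 * 1024);    // a hint to the OS, not a hard limit
        socket.setReceiveBufferSize(8 * 1024);
    }
}
```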
I've been researching this on Google for the past few days but haven't found any useful information.
I'm at a standstill on my project. I'll keep messing around with the code, but I'm not making much progress on this roadblock alone.
EDIT*** I just commented out setTcpNoDelay(true); to see if that would have an impact... sure enough, it took a single connection's network utilization down to 0.01% idle and up to 0.07% while playing, but the client is very laggy now. I'll lower the two connection threads' Thread.sleep() values, although that will probably just increase network utilization again. One thing: I originally enabled setTcpNoDelay(true) because the game was very laggy without it a couple of months back, so I'm not sure I'll be able to use Nagle's algorithm for that reason.
EDIT2*** With Nagle's algorithm enabled, no matter what I set those connection threads' Thread.sleep() to, even .sleep(1), performance never improves the way it does when Nagle's algorithm is disabled. NPCs move nearly a tile at a time instead of a couple of pixels, and the same goes for the fireball spell the player can cast: it jumps away from you in large steps instead of moving smoothly. If there were a way to keep Nagle's algorithm for its huge improvement in network utilization while still being able to tune those Thread.sleep() values, I'd be on the right track.
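For clarity, the toggle in question is just this one call on each socket (connection setup simplified):

```java
// setTcpNoDelay(true) disables Nagle's algorithm, so small writes go out
// immediately. Leaving it false keeps Nagle enabled: small writes get
// coalesced into fewer packets, which lowers utilization but adds latency.
import java.io.IOException;
import java.net.Socket;

public class NagleToggle {
    static Socket connect(String host, int port, boolean disableNagle) throws IOException {
        Socket socket = new Socket(host, port);
        socket.setTcpNoDelay(disableNagle);
        return socket;
    }
}
```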
If anyone can shed some light on this topic for me I'd greatly appreciate it. I'll also answer almost any questions you might have. Thanks for reading.