Hey Jonathan,
This might be old news to everyone there at Valve, and I kinda hope it is and just hasn't been fixed in the SDK yet :/
First off: let me state that I've only reproduced this bug with a local (map <mapname>) server and haven't tested it with any other setup (e.g., a client connecting to a dedicated server).
After about five minutes with the profiler in my new copy of VS2012, I was pretty sure there was something wrong with the animation system. Visual Studio was reporting that about 5% of total game time, for both the server and the client, was being spent in VerifySequenceIndex() in animation.cpp. I didn't look too far into how the system works, but I assumed that the animation system generates two CUtlVectors to provide an "optimized" comparison method for finding animations, i.e., avoiding strcmp calls everywhere and instead using integers for activity names.
It seems like the value "activitylistversion" in studiohdr_t (\public\studio.h) is networked between the client and server? The problem arises when VerifySequenceIndex() is called in animation.cpp. What should happen is that the CStudioHDR's GetActivityListVersion() returns the same value as g_nActivityListVersion (except for the very first time, when it needs to be initialized). What actually happens is that the client's g_nActivityListVersion equals 1 while the server's equals 2.
This happens because on the server, in CWorld::CWorld() (\game\server\world.cpp), the function ActivityList_Init() is called, then in CWorld::Precache() the function ActivityList_Free() is called, which increments g_nActivityListVersion (initialized to 1, so after the increment it's 2). On the client, however, (in InitGameSystems() in cdll_client_int.cpp) only ActivityList_Init() is called, so the client's g_nActivityListVersion remains at 1.
Because of this mismatch, every frame the server calls VerifySequenceIndex(), finds that the CStudioHDR's GetActivityListVersion() equals 1, rebuilds the optimized activity list, and sets the CStudioHDR's activitylistversion to 2. The client then calls VerifySequenceIndex(), finds that the CStudioHDR's GetActivityListVersion() equals 2, rebuilds the optimized activity list again, and sets the CStudioHDR's activitylistversion back to 1.
This will keep going indefinitely, resulting in a 5% overall performance loss according to Visual Studio. Whatever the reason these variables are shared between the client and server, changing
ActivityList_Init();
ActivityList_RegisterSharedActivities();
EventList_Init();
EventList_RegisterSharedEvents();
in InitGameSystems() in cdll_client_int.cpp to
ActivityList_Init();
ActivityList_Free();
ActivityList_RegisterSharedActivities();
EventList_Init();
EventList_Free();
EventList_RegisterSharedEvents();
fixed the problem. Hopefully this has already been fixed and the SDK just hasn't been updated with the fix yet; if not, there's 5% more performance to be had.
This was the response I received back from him a few days later:
Wow, good find Matt! I’ll look into it here. Thanks.
With typical Valve style I never received anything back after that, but whatever.