Last week my team and I spent some time in Bellevue working with Valve's Linux team on their port of Left 4 Dead 2.
It was the most successful work trip I've ever had. The guys at Valve were amazing to work with. They are sharp, driven, and have an aggressive plan. Looking at how far they've come and the short amount of time in which they've done it, I have every confidence that they're going to kick their plan right in the butt. It's going to be a good time to be a Linux gamer.
We had three main goals going in:
- Help them tune their game for our driver / hardware.
- Find out where our performance is lacking.
- Find out what OpenGL features they need / want.
I think we scored on every point. We helped them find some performance bugs in their vertex buffer management (which also affected other GPUs / drivers) and some places where they accidentally triggered shader recompiles. This gave some healthy performance improvements.
We also found some areas where our driver really, really needs to improve. They have a couple shaders that devolve into register spilling nightmares. There are also a few places where we eat way, way too much CPU. A lot of these problems mirror issues that we've seen with other game engines (e.g., Unigine).
These have been a lot easier to diagnose on L4D2 because we have access to their source code. Being able to take a profile that shows times in the driver and in the application makes a world of difference. Being able to tweak little things in the app (what happens if I do this...) is also helpful for diagnosing performance problems. Eric has already started landing patches for L4D2 performance, and there will be many more over the coming weeks.
The funny thing is the Valve guys say the same thing about drivers. There were a couple of times where we felt like they were trying to convince us that open source drivers are a good idea. We had to remind them that they were preaching to the choir. Their problem with closed drivers (on all platforms) is that it's such a black box that they have to play guess-and-check games. There's no way for them to know how changing a particular setting will affect the performance. If performance gets worse, they have no way to know why. If they can see where time is going in the driver, they can make much more educated guesses.
We also got some really good feedback about features. The biggest feature they want is better debug output from the driver. They really want to know when they do things that fall off performance paths, trigger shader recompiles, etc. We hacked out some initial versions of this, and it was a big help. Some patches in that area should hit the mailing list soon.
They're also interested in what they call "smart vsync." Swaps that are scheduled soon enough are synchronized to vblank, and the application will be vsync limited. Swaps that are scheduled too late happen immediately. In their words, "Tearing is bad, but dropping to 30fps is worse." On GLX, we can expose this with GLX_EXT_swap_control_tear.
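One way to picture the policy (all names here are mine, not any GLX API): compare the time a frame finishes rendering against the next vblank deadline. If it makes the deadline, hold the swap for vblank and present tear-free; if it misses, swap immediately and accept a tear rather than stalling a full refresh interval, which is what drops a 60fps game to 30fps. A toy model of that decision:

```c
#include <stdbool.h>

/* Toy model of "smart vsync" (adaptive vsync).  Times are in
 * milliseconds; on a 60 Hz display, vblanks are ~16.7 ms apart. */

/* True if a swap ready at `ready_ms` makes the vblank deadline and
 * should wait for it (no tearing); false if it should happen now. */
static bool swap_waits_for_vblank(double ready_ms, double next_vblank_ms)
{
    return ready_ms <= next_vblank_ms;
}

/* Time at which the frame actually appears on screen. */
static double presentation_time(double ready_ms, double next_vblank_ms)
{
    /* On time: held until vblank.  Late: shown right away (tears)
     * instead of waiting for the vblank after next. */
    return swap_waits_for_vblank(ready_ms, next_vblank_ms)
               ? next_vblank_ms
               : ready_ms;
}
```

With plain vsync, the late frame above would be delayed to the *next* vblank at 33.4 ms, halving the frame rate; the smart policy shows it at 20.0 ms instead, trading a tear for latency.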