So... I've been on an old games kick for some time now. As part of that, I recently purchased a Namco neGcon Playstation controller. I'm not going to dig out a copy of Wipeout... I want to support it in some of the demo programs I write for the graphics programming class I teach... because I can.
A tiny bit of background for people too lazy to click the Wikipedia link... This is an old Playstation controller that came out long before even the Dual Analog (April 1997, according to the Wikipedia article). I couldn't find a firm release date, but I remember seeing ads for it around the launch of the Playstation, and I did find some rec.games.video.sony posts from March 1996 about importing it from Japan. What makes it special are the quirky analog inputs. The left trigger and the two red buttons are analog. The real kicker is that twisting the controller in the middle is also an analog input.
I hooked it up to my laptop with a generic Playstation-to-USB converter, and hacked up a demo program in SDL to see how this thing reports itself. The first disappointing thing is the name. SDL just reports it as "USB Gamepad " (yes, with a space at the end). I'm sure that's a quirk of the adapter. Since I have several controllers that I use, I use the name to set default button mappings. My Logitech DualShock look-a-like reports as "Logitech Logitech Dual Action" (yes, Logitech twice), and my PS3 Sixaxis reports as "Sony PLAYSTATION(R)3 Controller".
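Since the demo has to cope with several different controllers, it picks its default button mapping from the name SDL reports. A minimal sketch of that lookup, using the names quoted above (the enum and function names here are mine, not SDL's):

```c
#include <string.h>

/* Controller types the demo knows about (hypothetical enum for this sketch). */
enum pad_type {
    PAD_UNKNOWN,
    PAD_NEGCON_ADAPTER,
    PAD_LOGITECH_DUAL_ACTION,
    PAD_SIXAXIS
};

/* Pick a default mapping from the name the joystick reports.  Note the
 * trailing space in "USB Gamepad " -- the adapter really reports itself
 * that way, so match it exactly. */
static enum pad_type classify_pad(const char *name)
{
    if (strcmp(name, "USB Gamepad ") == 0)
        return PAD_NEGCON_ADAPTER;
    if (strcmp(name, "Logitech Logitech Dual Action") == 0)
        return PAD_LOGITECH_DUAL_ACTION;
    if (strcmp(name, "Sony PLAYSTATION(R)3 Controller") == 0)
        return PAD_SIXAXIS;
    return PAD_UNKNOWN;
}
```

Exact string matching is brittle, but for a demo program it beats guessing from axis counts.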
It shows up as 5 axes, 12 buttons, and a hat. Let's look at the mapping to see where it gets crazy:
- Twisting: axis 0
- Buttons I and II (the red ones): axis 1. Yes, the two jolly, red, candy-like analog buttons show up, together, as axis 1. Button I gets the negative values, and button II gets the positive values.
- Button A: button 1
- Button B: button 0
- Left trigger: axis 3
- Right trigger: button 7
- Start: button 9
- D-pad: Hat 0. I hate controllers that advertise the d-pad as a hat. My Logitech controller does that, but the Sixaxis just shows them as buttons.
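To make that shared-axis weirdness concrete, here's how a program might split axis 1 back into two separate analog button values. SDL axes range from -32768 to 32767; the function name and exact rescaling are my own sketch, not anything the adapter defines:

```c
/* Split the single axis the adapter reports for buttons I and II into
 * two separate 0..32767 pressure values.  Button I owns the negative
 * half of the axis and button II the positive half. */
static void split_buttons(int axis1, int *button_i, int *button_ii)
{
    /* Map -32768..-1 onto 32767..0 for button I. */
    *button_i  = (axis1 < 0) ? -(axis1 + 1) : 0;
    *button_ii = (axis1 > 0) ? axis1 : 0;
}
```

One side effect of this encoding: the adapter can never report both red buttons pressed at once, since a single axis can only hold one value.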
All of this raises the question, "WTF?" It also raises a couple of follow-up questions. Is all of this madness caused by the adapter, or is it inherent to the controller itself? I suspect it shows up as 12 buttons because of the DualShock. The DualShock actually has 13 buttons (the "Analog" selector), but I don't think that one gets sent over the protocol; I think it just changes the behavior of the controller itself. My Logitech has a similar "mode" button, and that doesn't go over the protocol either.
Did anyone ever use a neGcon with a parallel port adapter? How did it show up? It looks like the Linux kernel has supported it for ages, so someone must have done it...
The extension is modeled after Apple's
interface. Before even embarking on this project, several ISVs told me
that they really liked that interface. There are some small changes,
but it's still pretty similar.
These functions allow applications to query a number of aspects of the driver
and hardware before creating a context:
- Driver version
- Driver vendor
- Hardware vendor (may be different from driver vendor!)
- Hardware chipset (PCI ID)
- Available memory
- UMA vs. non-UMA
- Supported and preferred (by the driver) GL profiles
The last one is really important to know before creating a context because it may influence what sort of context the app creates.
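As a sketch of why that matters: an app can look at the preferred-profile mask and choose its context attributes accordingly. The profile bits below are the standard GLX create-context values, defined locally so the fragment builds without GLX headers; the selection policy itself is just an example, not part of the extension:

```c
/* Profile bits from GLX_ARB_create_context_profile and
 * GLX_EXT_create_context_es2_profile, defined here so this sketch
 * compiles without GLX headers. */
#define GLX_CONTEXT_CORE_PROFILE_BIT_ARB          0x00000001
#define GLX_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB 0x00000002
#define GLX_CONTEXT_ES2_PROFILE_BIT_EXT           0x00000004

/* Given the mask of profiles the driver says it prefers, pick one to
 * request at context creation: core first, then compatibility, then ES. */
static int pick_profile(int preferred)
{
    if (preferred & GLX_CONTEXT_CORE_PROFILE_BIT_ARB)
        return GLX_CONTEXT_CORE_PROFILE_BIT_ARB;
    if (preferred & GLX_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB)
        return GLX_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB;
    return GLX_CONTEXT_ES2_PROFILE_BIT_EXT;
}
```

The point is that the app can make this decision before ever creating a context, instead of creating one, inspecting it, and starting over.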
The initial branch, with the extension spec, is available at:
Woo hoo! So that companies could announce products at Mobile World Congress next week, Khronos voted to approve the OpenGL ES 3.0 conformance submissions before the usual 30-day review period was up. The conformance tests took quite a bit longer than expected to get ready, so this move was made to avoid punishing companies (who all worked hard to get their products ready) for delays in the test suite.
Anyway... the vote passed, and WE'RE IN!!! We did it!!!
As an aside, thanks to Ken's recent patches, we should be able to submit results for Sandy Bridge on Mesa 9.1 soon. Unfortunately, that submission will have to wait for the 30-day review period.
The slides from my FOSDEM talk are now available. I would have gotten them out sooner, but I caught a bit of a cold on the way home. I guess averaging 3 hours of sleep per night wore my resistance down. Who knew?
In any case, the talk went really well. Attendance was low because the talk wasn't in the printed schedule (I got my information in really late) and it was scheduled opposite Luc's graphics talk in the main track. It was really good to chat with the Wine developers... I continue to be amazed that that thing works. I don't envy them at all... but I'm glad they're there.
Unbeknownst to most, last week was a pretty big week for open-source graphics drivers... and I'm not referring to all the presentations at FOSDEM. Intel and the Mesa community made good on my bold claim at SIGGRAPH 2012, and we submitted OpenGL ES 3.0 conformance results for Ivy Bridge GPUs on the Mesa 9.1 release branch last Tuesday. So far, only two other companies have made submissions, one before and one after ours.
This is a really big deal for both Intel and Mesa! When was the last time either produced a hardware driver within a reasonable amount of time after the spec was released? I'm not sure that Mesa has ever done that, and I believe it was around OpenGL 1.3 for Intel... and now we're one of the first...
This is a great day for Mesa and open-source graphics drivers. Just a tad over a month ago, I submitted OpenGL ES 2.0 conformance test results to Khronos for Intel Sandy Bridge and Ivy Bridge GPUs with Mesa 8.0.4. There were no objections during the 30 day review period, so we are now officially conformant! Finally being on that list is pretty cool.
Not only is this great news for my team at Intel, but it's terrific news for Mesa. Mesa has had a long history with OpenGL, the ARB, and Khronos. This is, however, the first time that Mesa has ever, in any way, been listed as a conformant implementation. This is a big boost to Mesa's credibility.
Hopefully we'll be able to follow this with OpenGL ES 3.0 conformance next year.
The OpenGL ES BoF at SIGGRAPH is still in progress, but let's have some
real-time news to go with our real-time rendering! In addition to announcing
the availability of OpenGL ES 3.0 and ASTC, several companies were able to
announce their plans for OpenGL ES 3.0. Intel's Open-Source Technology Center
was there. We are working on ES 3.0, and we have a public branch to prove it:
Right now things are "pre-alpha" quality. There are a few features, like ETC2, that are not yet supported. There are also a bunch of holes in support for some enums (both ones that should be supported and ones that should not). At this point, we've enabled a big pile of the functionality, and the driver says "OpenGL ES 3.0 Mesa 8.1-devel". That's why this lives on a branch, and it will continue to live on a branch until at least after the next Mesa release (probably in September).
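Since the version string is how the branch advertises itself, an app that wants to gate features on it can parse what `glGetString(GL_VERSION)` returns. A small sketch, assuming the "OpenGL ES major.minor ..." form shown above (the helper name is mine):

```c
#include <stdio.h>
#include <string.h>

/* Parse the major/minor version out of a GL ES version string such as
 * "OpenGL ES 3.0 Mesa 8.1-devel".  Returns 1 on success, 0 otherwise. */
static int parse_es_version(const char *version, int *major, int *minor)
{
    if (strncmp(version, "OpenGL ES ", 10) != 0)
        return 0;
    return sscanf(version + 10, "%d.%d", major, minor) == 2;
}
```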
Like the slide presented at the BoF says, our plan is to have ES 3.0 fully enabled for release in 2013Q1. This probably means February, but we'll see. Software releases happen when they happen.
Last week my team and I spent some time in Bellevue working with Valve's Linux team on their port of Left 4 Dead 2.
It was the most successful work trip I've ever had. The guys at Valve were amazing to work with. They are sharp, driven, and have an aggressive plan. Looking at how far they've come and the short amount of time in which they've done it, I have every confidence that they're going to kick their plan right in the butt. It's going to be a good time to be a Linux gamer.
We had three main goals going in:
- Help them tune their game for our driver / hardware.
- Find out where our performance is lacking.
- Find out what OpenGL features they need / want.
I think we scored on every point. We helped them find some performance bugs in their vertex buffer management (which also affected other GPUs / drivers) and some places where they accidentally triggered shader recompiles. This gave some healthy performance improvements.
We also found some areas where our driver really, really needs to improve. They have a couple shaders that devolve into register spilling nightmares. There are also a few places where we eat way, way too much CPU. A lot of these problems mirror issues that we've seen with other game engines (e.g., Unigine).
These have been a lot easier to diagnose on L4D2 because we have access to their source code. Being able to take a profile that shows times in the driver and in the application makes a world of difference. Being able to tweak little things in the app (what happens if I do this...) is also helpful for diagnosing performance problems. Eric has already started landing patches for L4D2 performance, and there will be many more over the coming weeks.
The funny thing is the Valve guys say the same thing about drivers. There were a couple of times where we felt like they were trying to convince us that open source drivers are a good idea. We had to remind them that they were preaching to the choir. Their problem with closed drivers (on all platforms) is that they're such a black box that they have to play guess-and-check games. There's no way for them to know how changing a particular setting will affect performance. If performance gets worse, they have no way to know why. If they can see where time is going in the driver, they can make much more educated guesses.
We also got some really good feedback about features. The biggest
feature they want is better debug output from the driver:
they really want to know when they do things that fall off performance
paths, trigger shader recompiles, etc. We hacked out some initial
versions of this, and it was a big help. Some patches in that area
should hit the mailing list soon.
They're also interested in what they call "smart vsync." Swaps that
are scheduled soon enough will have vsync, and the application will be
vsync limited. Swaps that are scheduled too late happen immediately.
In their words, "Tearing is bad, but dropping to 30fps is worse." On
GLX, we can expose this through an existing extension.
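The policy itself is easy to state. As an illustration only (the function names and millisecond bookkeeping are made up for the sketch; this is not driver code), here's the difference between strict and smart vsync:

```c
/* With strict vsync, a frame that misses the vblank deadline waits for
 * the next one, so a late frame on a ~60 Hz display is shown at 30 fps.
 * With "smart" (adaptive) vsync, the late frame is presented
 * immediately, tearing slightly instead of stalling a whole vblank. */

static double present_time_strict(double frame_ms, double vblank_ms)
{
    /* Round the frame time up to the next vblank boundary. */
    double n = 1.0;
    while (n * vblank_ms < frame_ms)
        n += 1.0;
    return n * vblank_ms;
}

static double present_time_smart(double frame_ms, double vblank_ms)
{
    /* Frames that make the deadline still sync; late frames go out now. */
    return (frame_ms <= vblank_ms) ? vblank_ms : frame_ms;
}
```

With a 16 ms vblank period, a 20 ms frame presents at 32 ms under strict vsync but at 20 ms under the smart policy, which is exactly the "don't drop to 30fps" behavior they're after.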
The other night I watched the pilot for TRON: Uprising, and was frickin' impressed. The plot was quite decent, probably even better than TRON: Legacy's. For a TV show on the Disney Channel, the production quality was amazing. The voice acting was really well done, and the rendering was beautiful. (I'm not sure if the soundtrack was original or if it was just harvested from TRON: Legacy.)
That's actually my biggest concern about it. When the series begins in June, I have a hard time believing that every episode will have as much work put into it as the pilot did.
As an additional shock, it was broadcast without commercials. Somebody must have gotten fired for that!
Watch it for yourself...
This wiki is powered by ikiwiki.