This previous weekend was the Portland Retro Gaming Expo, and it was awesome! I was there all day both days.
Among the highlights of the show were the performances by 8Bit Weapon. You'll notice in the second image that she's playing an Atari paddle controller. You can also see a C64 (donated by the local Commodore User Group) in the last image.
The GIANT arcade was... amazing! There were some games there that I haven't played in years. My uncle would have enjoyed Mappy. And I didn't win the tabletop Asteroids game. I spent a lot of time playing APB (Hey Ground Kontrol: put that in the arcade!!!). I was also pretty surprised that the only console version of APB is for the Lynx. Oof.
I love the Space Wars instructions... I think those are still the instructions for Windows... lol.
There was a cool presentation about making games for the Atari 2600. There were even a couple "celebrities" in the audience. On the right is Darrell Spice Jr. (programmer of Medieval Mayhem), and on the left is Rebecca Heineman (programmer of London Blitz for the 2600, among other things). She's also responsible for the awesome disassembly of the game Freeway.
Joe Decuir was also in the audience, and I met him later at the Atari Age booth (I had to get Medieval Mayhem and Moon Cresta ). I can't believe Joe doesn't have a Wikipedia page. He was one of the original designers of the Atari 2600, programmer of Video Olympics and Combat (with Larry Wagner). He was also one of the original Amiga designers. He was having a conversation with one of the Atari Age guys about wanting to design a new console in the style of an old Atari 8-bit or an Amiga. I told him that there's a group trying to make an Amiga 2000 from scratch (using FPGAs for custom chips). So now I'm tasked with finding that project again and getting him in contact with them. I'm sure they'd appreciate any help he could offer!
Now my Mesa: State of the Project talk is done too, and the slides are available.
I'm assuming that someone will send along a link to the video soon...
My XDC talk "Effects Framework for OpenGL Testing" just got done, and the slides are available. The talk went pretty well, and the discussion was healthy.
The three big high points of the discussion were:
- For the most part, adding another language to learn won't necessarily make it easier to add more complex tests. Just writing C tests in piglit isn't bad these days. The worst parts are dealing with cmake and all.tests. The best thing about shader_runner (and similar) is that you don't have to muck about with any of that.
- One difficulty with complex tests is validating the correctness of results. The red / green box tests are good because the pass / fail metric is obvious. Perceptual difference algorithms can work (VMware uses them), but they can be twitchy and frustrating (Cairo gave up on them).
- The shader_runner parser is a mess because everyone just added one more piece of duct tape for the next tiny feature they need. There used to be a clean, simple piece of code, but you can't see it now... all you can see is the big ball of duct tape. One advantage of nvFX is that it is a consistent, defined language... instead of a ball of duct tape. We could borrow their syntax for some of the things that shader_runner already does.
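For context, a minimal shader_runner test looks something like the sketch below (the exact sections vary from test to test, but this is the usual shape: a requirements block, the shaders, and a list of commands with an obvious pass / fail probe at the end):

```
[require]
GLSL >= 1.10

[vertex shader]
void main()
{
    gl_Position = gl_Vertex;
}

[fragment shader]
void main()
{
    gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}

[test]
draw rect -1 -1 2 2
probe all rgba 0.0 1.0 0.0 1.0
```

The "green box" probe at the end is exactly the kind of unambiguous pass / fail metric discussed above.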
There may have been other important points, but those are the three that really stuck with me. The forum is open.
I'm giving a talk tomorrow at LPC in the Graphics and Display microconf. Since the time slots are so short (25 minutes!), I wanted (okay, Laurent requested) to provide some details before the talk to prime the pump, so to speak.
One line summary: Right now debugging and profiling graphics applications and graphics systems on Linux is a disaster. It's better on Windows, but not by a lot (especially for OpenGL).
There are very few tools, and the tools that exist are either insufficient or are vendor-specific. Moreover, the tools don't provide any kind of system view. At some point in some desktop environment, every developer has to figure out why / how the compositor is wrecking performance. This often takes a lot of work because there is no system view.
The one tool for Windows that can provide a system view is GPUView.
As a result, even on Windows, many developers end up rolling their own tools.
The methods remind me a lot of the old days of sprinkling rdtsc() calls all over the code. What has changed is the level of detail provided by the tools that display the logged data. Valve has famously talked about the system they use. Other developers have told me they use similar systems.
There is a common thematic problem in all of these tools and approaches. The developer either gets a lot of detailed data and is tied to a particular vendor, or gets very coarse data. Either way, they don't get system-wide data.
There are several disparate groups that need data:
- People creating stand-alone debug / profile tools (e.g., apitrace).
- People building data collection into their application and using an external, post-hoc visualization tool.
- People building data collection and visualization into their application.
Generally, folks doing one of the last two are doing both to varying degrees.
So here's the question. Can we provide a set of interfaces, probably from the kernel, that:
- Provides finer-grained data about the execution of commands on the GPU than is available from, say, GL_ARB_timer_query.
- Provides the above data at a system level with semantic information (e.g., this block of time was your call to glDrawArrays, this block of time was the compositor doing "stuff", this block of time was your XRender request, etc.) without leaking information in a way that compromises security.
- Allows closed-source drivers to expose these interfaces.
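As a baseline, the per-query data GL_ARB_timer_query gives you today looks roughly like this sketch in C (it assumes a current GL context and a loaded GL 3.3+ function table, and omits error handling and the usual query-result availability check):

```c
/* Bracket a draw call with GPU timestamps. */
GLuint q[2];
GLuint64 t0, t1;

glGenQueries(2, q);
glQueryCounter(q[0], GL_TIMESTAMP);
glDrawArrays(GL_TRIANGLES, 0, vertex_count);
glQueryCounter(q[1], GL_TIMESTAMP);

/* Later, once the GPU has caught up: */
glGetQueryObjectui64v(q[0], GL_QUERY_RESULT, &t0);
glGetQueryObjectui64v(q[1], GL_QUERY_RESULT, &t1);
printf("draw took %llu ns on the GPU\n", (unsigned long long)(t1 - t0));
```

That answers "how long did my draw take," but nothing about what the compositor or any other process was doing in the same window of time, which is exactly the gap described above.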
I've just pushed a branch to my fd.o tree that brings back the standalone compiler. I've also made some small enhancements to it. Why? Thanks for asking.
One thing that game developers frequently complain about (and they're damn right!) is the massive amount of variability among GLSL compilers. Compiler A accepts a shader, but compiler B generates errors. Then the real fun begins... which compiler is right? Dig through the spec, report bugs to the vendors, wait for the vendors to finger point... it's pretty dire.
The Mesa compiler has a reputation for sticking to the letter of the spec. This has ruffled some feathers with game developers and with some folks in the Mesa community. In cases where our behavior disagrees with all other shipping implementations, I have submitted numerous spec bugs. If nobody does what the spec says, you change the spec.
This isn't to say our conformance is perfect or that we don't have any bugs. Reality is quite the contrary. However, we are really picky about stuff that other people aren't quite so picky about. When we find deviations from the behavior of other implementations, one way or another, we sort it out.
Sometimes that means changing our behavior (and adding piglit tests).
Sometimes that means changing our behavior (and getting the spec changed).
Sometimes that means implementing a work-around for specific apps (that is only enabled for those apps!).
Sometimes that means not changing anything (and taking a hard line that someone else needs to fix their code).
The combination of our ability to build our compiler on many common platforms and our spec pedantry puts Mesa in a fairly interesting position. It means that developers could use our compiler, without the baggage of the rest of a driver, as the third party to settle disputes. It can be the "if it compiles here, it had better compile anywhere" oracle.
Even if it fails at that, we emit a lot of warnings. Sometimes we emit too many warnings. A standalone compiler is a good start toward a "lint" for GLSL. There are already a couple places where we give portability warnings (things we accept that some compilers are known to incorrectly reject), so there's already some value there.
Three weeks ago today (that's how far behind I am!), I gave one of Intel's "Sponsored Technical Sessions" at SIGGRAPH. Last year I presented one slide in the OpenGL ES BoF. This year I presented in my company's paid room. Next year... the world.
The presentation was a brief overview of performance tuning graphics applications... games... for the open-source driver on Intel's GPUs. This is a collection of tips and suggestions that my team has gathered from tuning our driver for shipping apps and working with ISVs like Valve. I'm already working to improve the slide set, and I'm hoping to present something similar at GDC. We'll see how that turns out.
Anyway, my slides are available from Intel.
So... I've been on an old games kick for some time now. As part of that, I recently purchased a Namco neGcon Playstation controller. I'm not going to dig out a copy of Wipeout... I want to support it in some of the demo programs I write for the graphics programming class I teach... because I can.
A tiny bit of background for people too lazy to click the Wikipedia link... This is an old Playstation controller. It came out long before even the Dual Analog (April 1997, according to the Wikipedia article). I couldn't find a firm release date, but I remember seeing ads for it around the launch time of the Playstation, and I could find some rec.games.video.sony posts from March 1996 about importing it from Japan. What makes it special are the quirky analog inputs. The left trigger and the two red buttons are analog. The real kicker is that twisting it in the middle is also an analog input.
I hooked it up to my laptop with a generic Playstation-to-USB converter, and hacked up a demo program in SDL to see how this thing reports itself. The first disappointing thing is the name. SDL just reports it as "USB Gamepad " (yes, with a space at the end). I'm sure that's a quirk of the adapter. Since I have several controllers that I use, I use the name to set default button mappings. My Logitech DualShock look-alike reports as "Logitech Logitech Dual Action" (yes, Logitech twice), and my PS3 Sixaxis reports as "Sony PLAYSTATION(R)3 Controller".
It shows up as 5 axes, 12 buttons, and a hat. Let's look at the mapping to see where it gets crazy:
- Twisting: axis 0
- Buttons I and II (the red ones): axis 1. Yes, the two jolly, red, candy-like analog buttons show up, together, as axis 1. Button I gets the negative values, and button II gets the positive values.
- Button A: button 1
- Button B: button 0
- Left trigger: axis 3
- Right trigger: button 7
- Start: button 9
- D-pad: Hat 0. I hate controllers that advertise the d-pad as a hat. My Logitech controller does that, but the Sixaxis just shows them as buttons.
All of this begs the question, "WTF?" It also begs a couple follow-up questions. Is all of this madness caused by the adapter, or is it endemic to the controller itself? I suspect it shows up as 12 buttons because of the DualShock. The DualShock actually has 13 buttons (the "Analog" selector), but I don't think that gets sent over the protocol. I think that just changes the function of the controller itself. My Logitech has a similar "mode" button, and that doesn't go over the protocol.
Did anyone ever use a neGcon with a parallel port adapter? How did it show up? It looks like the Linux kernel has supported it for ages, so someone must have done it...
The extension is modeled after Apple's interface. Before even embarking on this project, several ISVs told me that they really liked that interface. There are some small changes, but it's still pretty similar. The functions allow applications to query a bunch of aspects of the driver and hardware before creating a context:
- Driver version
- Driver vendor
- Hardware vendor (may be different from driver vendor!)
- Hardware chipset (PCI ID)
- Available memory
- UMA vs. non-UMA
- Supported and preferred (by the driver) GL profiles
The last one is really important to know before creating a context because it may influence what sort of context the app creates.
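Assuming this is the interface that shipped as GLX_MESA_query_renderer (my guess, since the extension's name doesn't survive above), the pre-context query might look roughly like this sketch (an open Display is assumed; error handling omitted):

```c
/* Query a few renderer properties before creating any GL context. */
unsigned int video_memory = 0;
unsigned int unified = 0;
unsigned int profile = 0;

glXQueryRendererIntegerMESA(dpy, DefaultScreen(dpy), 0,
                            GLX_RENDERER_VIDEO_MEMORY_MESA, &video_memory);
glXQueryRendererIntegerMESA(dpy, DefaultScreen(dpy), 0,
                            GLX_RENDERER_UNIFIED_MEMORY_ARCHITECTURE_MESA,
                            &unified);
glXQueryRendererIntegerMESA(dpy, DefaultScreen(dpy), 0,
                            GLX_RENDERER_PREFERRED_PROFILE_MESA, &profile);

printf("%u MB of %s memory, preferred profile mask 0x%x\n",
       video_memory, unified ? "unified" : "dedicated", profile);
```

An app could use the preferred-profile answer to decide whether to ask for a core or compatibility context before ever calling glXCreateContextAttribsARB.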
The initial branch, with the extension spec, is available at:
Woo hoo! So that people could announce products at Mobile World Congress next week, Khronos voted to approve the OpenGL ES 3.0 conformance submissions before the usual 30-day review period was up. It took quite a bit longer than expected to get the conformance tests ready, so this move was made to not punish companies (who all worked hard to get their products ready) for delays in the test suite.
Anyway... the vote passed, and WE'RE IN!!! We did it!!!
As an aside, thanks to Ken's recent patches, we should be able to submit results for Sandy Bridge on Mesa 9.1 soon. Unfortunately, that submission will have to wait for the 30-day review period.
The slides from my FOSDEM talk are now available. I would have gotten them out sooner, but I caught a bit of a cold on the way home. I guess averaging 3 hours of sleep per night wore my resistance down. Who knew?
In any case, the talk went really well. The attendance was low due to not being in the printed schedule (I got my information in really late) and being scheduled across from Luc's graphics talk in the main track. It was really good to chat with the Wine developers... I continue to be amazed that thing works. I don't envy them at all... but I'm glad they're there.