After a couple of more introspective blogs, this week I’m going to be a little more geeky. To wit: 3D graphics programming. I’m a fan of graphics coding – it involves interesting maths and methods and can also be quite rewarding when your vision for a little application actually appears on the screen. Everyone should have a hobby, even an exceptionally nerdy one.
All this meant that when the news came down that we would need to make a demo for the upcoming CMIC open day, I jumped at the opportunity to build a cool-looking 3D realisation of my diffusion simulation. The idea is to render the tissue substrate we use to restrict the motion of the spins, the diffusing spins themselves, and a couple of plots of spin displacements. It’s written in JoGL and interfaces with the Camino project, which contains the simulation code.
I should say at this point that it’s not yet finished, so this is something of a part one, but I’ve learned some interesting things this week so I thought I’d blog it.
First up, there was the basecode. This opens a window, sets up OpenGL and the rendering environment, and initialises an arcball so that you can rotate the view with the mouse. In the spirit of not reinventing the wheel, I used some code that someone else had written. Specifically, IntelliJ’s port of the NeHe OpenGL tutorials, which I can highly recommend. Lesson 48 contains the arcball code.
The meshes from the diffusion simulation can be fairly big – hundreds of thousands or even millions of triangles – so rendering them efficiently meant doing something radical: using some OpenGL functionality that was developed after 1996. Shocking, I know, but needs must.
Back in 2001 when, as a fresh(er)-faced young PhD student, learning OpenGL seemed like a good idea, you rendered triangles one at a time with normals and colour specified in immediate mode. This, I have learned, is now rather akin to wearing a stove-pipe hat and working exclusively with steam-driven technology*. Instead there are these new-fangled things called Vertex Buffer Objects (VBOs).
A VBO is just a way of shunting as much of the rendering as possible off onto the GPU. Assemble your vertex data into the right order, configure the graphics pipeline, shove the data into video RAM and then tell the card to get on with it.
It works VERY well.
I wanted to render my meshes with lighting, so I needed to specify normals as well as vertices. It turned out that code examples for constructing and rendering VBOs with normals were a little hard to come by, so I ended up stumbling through this part on my own. I’ve got it working, though, and I’ll be posting some code snippets to show what I did. I’m not claiming this is the best code in the world, but it works and has pretty decent performance.
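To give a flavour in the meantime, here’s a minimal sketch of building and drawing a VBO pair with per-vertex normals. This assumes JOGL’s GL2 fixed-function profile, and the method and variable names are illustrative rather than taken from the actual demo code:

```java
// Sketch only: needs a live JOGL GL2 context to run.
// norms and verts are flat [x0,y0,z0, x1,y1,z1, ...] arrays,
// with one normal per vertex.
int[] vbo = new int[2];

void initVBOs(GL2 gl, float[] norms, float[] verts) {
    gl.glGenBuffers(2, vbo, 0);

    // Normals first: binding the normal buffer and setting its
    // pointer before the vertices avoids the ordering error
    // described below.
    gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vbo[0]);
    gl.glBufferData(GL.GL_ARRAY_BUFFER, norms.length * Buffers.SIZEOF_FLOAT,
                    Buffers.newDirectFloatBuffer(norms), GL.GL_STATIC_DRAW);
    gl.glNormalPointer(GL.GL_FLOAT, 0, 0);

    // Then the vertex positions.
    gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vbo[1]);
    gl.glBufferData(GL.GL_ARRAY_BUFFER, verts.length * Buffers.SIZEOF_FLOAT,
                    Buffers.newDirectFloatBuffer(verts), GL.GL_STATIC_DRAW);
    gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0);
}

void drawMesh(GL2 gl, int numVerts) {
    gl.glEnableClientState(GL2.GL_NORMAL_ARRAY);
    gl.glEnableClientState(GL2.GL_VERTEX_ARRAY);
    gl.glDrawArrays(GL.GL_TRIANGLES, 0, numVerts);
    gl.glDisableClientState(GL2.GL_VERTEX_ARRAY);
    gl.glDisableClientState(GL2.GL_NORMAL_ARRAY);
}
```

Once the buffers are uploaded, the per-frame cost is essentially just the glDrawArrays call – the data never leaves the card.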
In the process of getting things working, I learned some important things:
- VBOs can easily handle normals (and colours, and textures, for that matter) but OpenGL is a little fussy about the order in which you do things. You need to generate and bind the normals object and specify the normals pointer before the vertices or you’ll get an error. I’m sure there’s a good reason for this, but my knowledge is too superficial to know what it is.
- Specifying projection geometry can be a tricky business. The view frustum won’t accept a negative near clipping plane, but more importantly a near plane at zero can cause your z-buffering to stop working (I presume this is due to singular values in the projection matrix). Moving the near plane away from zero will fix this.
- By default OpenGL only lights one side of the triangles. This is great for a closed surface, but my meshes are unholy messes derived from stacks of microscope images – you can see inside them, so you need to render both sides of the triangles. This has nothing to do with the VBOs or even the shading model; you change it by specifying a two-sided lighting model with glLightModel.
- VBOs are almost supernaturally efficient. This morning I loaded a mesh with over a million triangles. I can render it at over 30fps, with no special sorting or optimisation at all, on my laptop inside my IDE.
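Pulling the middle two lessons together, the relevant setup calls look roughly like this (again JOGL’s GL2 profile; the field-of-view and clipping distances are just example values, not the demo’s actual settings):

```java
// Sketch only: needs a live JOGL GL2 context to run.

// Light both faces of each triangle, since the microscope-derived
// meshes aren't closed surfaces and you can see their insides.
gl.glLightModeli(GL2.GL_LIGHT_MODEL_TWO_SIDE, GL.GL_TRUE);

// Keep the near clipping plane strictly positive: a near plane at
// zero makes the projection matrix singular and breaks z-buffering.
GLU glu = new GLU();
glu.gluPerspective(45.0, aspectRatio, 0.1, 100.0); // near = 0.1, not 0
```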
So now I have some code that renders a complex mesh with arcball rotation and lighting. I’ve added some extra functionality for a little graph in the bottom-left corner that I’ll be adding to over the next week or so. In the meantime, here’s a screenshot:
… as a bonus, I can render any mesh I’ve got a ply file for, so we can now simulate diffusion inside a cow…
or a sailing ship…
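For the curious, the ASCII flavour of the ply format is simple enough that a toy reader fits in a few lines. Here’s a sketch – this is an illustration rather than the loader in the demo, and it assumes a well-formed file whose first three vertex properties are x, y and z:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal ASCII .ply reader sketch: pulls out vertex positions and
// triangular faces, which is all the renderer needs.
class PlyMesh {
    final List<float[]> vertices = new ArrayList<float[]>();
    final List<int[]> triangles = new ArrayList<int[]>();

    static PlyMesh read(String text) {
        String[] lines = text.split("\\r?\\n");
        PlyMesh mesh = new PlyMesh();
        int nVerts = 0, nFaces = 0, i = 0;

        // Header: grab the element counts, stop at end_header.
        for (; i < lines.length; i++) {
            String[] tok = lines[i].trim().split("\\s+");
            if (tok[0].equals("element") && tok[1].equals("vertex"))
                nVerts = Integer.parseInt(tok[2]);
            if (tok[0].equals("element") && tok[1].equals("face"))
                nFaces = Integer.parseInt(tok[2]);
            if (tok[0].equals("end_header")) { i++; break; }
        }

        // Vertex lines: x y z (any further properties are ignored).
        for (int v = 0; v < nVerts; v++, i++) {
            String[] tok = lines[i].trim().split("\\s+");
            mesh.vertices.add(new float[] {
                Float.parseFloat(tok[0]),
                Float.parseFloat(tok[1]),
                Float.parseFloat(tok[2]) });
        }

        // Face lines: "n i0 i1 ... i(n-1)"; keep triangles only.
        for (int f = 0; f < nFaces; f++, i++) {
            String[] tok = lines[i].trim().split("\\s+");
            if (Integer.parseInt(tok[0]) == 3) {
                mesh.triangles.add(new int[] {
                    Integer.parseInt(tok[1]),
                    Integer.parseInt(tok[2]),
                    Integer.parseInt(tok[3]) });
            }
        }
        return mesh;
    }
}
```

The vertex and triangle lists can then be flattened straight into the float arrays that the VBOs want.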
I’ll post some updates once there’s some more functionality. Next up: the diffusing particles, and proper displacement profile plots.
* i.e. kind of steampunk. Not quite what I was going for.