When working on my thesis piece at The Ohio State University, I started building a brain in Maya (I think it was version 1 back then). I had conceived a scene where what I refer to as EyeBrain would be playing chess with a fleshy blob vaguely resembling an elephant, which would move the pieces with its trunk. I later cut that scene. As I blogged in January, I recently revived the EyeBrain character, so I took the old Maya object, converted it to polygons using Softimage XSI, and edited it in LightWave. I got sidetracked over the last few months, but here is a test render:
Geek alert: skip this next paragraph if you couldn't care less about technical issues. A great feature of LightWave is that it shows image-mapped displacement in the interface, in real-time OpenGL. And it is FAST, also in rendering. The downside is that a shortcut seems to have been taken to accomplish this. The displacement is not based on the surface normal, as it should be; it simply displaces along a single axis, giving you a choice of displacing along X, Y or Z. Not very useful for an organic object like a brain. So I had to create three displacement maps, one for each axis. Luckily LightWave does enable you to "bake" a normal shader to a texture, so by using a texture map of the object's normals to determine which map should be active where, I finally got the above result. Not very practical, no. But did I mention it renders really fast?
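For the technically inclined: the workaround above can be sketched in a few lines. This is a hypothetical illustration, not LightWave's actual code; the function name and inputs are my own. Each vertex blends the three axis-aligned displacement heights, weighted by how much the baked normal points along each axis.

```python
def displace_axis_blend(position, normal, height_x, height_y, height_z, scale=1.0):
    """Blend three per-axis displacement heights using the baked normal.

    position: (x, y, z) vertex position
    normal:   unit surface normal (nx, ny, nz), e.g. sampled from a baked normal map
    height_*: height sampled from the X/Y/Z displacement map at this vertex
    """
    nx, ny, nz = normal
    # Weight each axis map by the corresponding normal component, so the
    # map for the axis the surface faces dominates the displacement.
    dx = nx * height_x * scale
    dy = ny * height_y * scale
    dz = nz * height_z * scale
    x, y, z = position
    return (x + dx, y + dy, z + dz)
```

For a vertex facing straight along X, only the X map contributes, which is exactly why three maps were needed instead of one.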
Update: So there is a way to use a displacement map based on the normals in LightWave after all: you can add a "normal displace" modifier. I also found out why it works so fast, something I should have guessed: LightWave only displaces the geometry points, not the resulting surface. Aargh!
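What "normal displace" does conceptually is simple enough to sketch. Again, this is an illustrative approximation with made-up names, not LightWave internals: each control point moves along its own normal by the sampled height. Crucially, only the points move; a surface subdivided from them afterwards gets no further displacement, which explains the speed (and the Aargh).

```python
def normal_displace(points, normals, heights, scale=1.0):
    """Move each control point along its unit normal by its height sample.

    points:  list of (x, y, z) control points
    normals: list of unit normals (nx, ny, nz), one per point
    heights: list of displacement-map samples, one per point
    """
    displaced = []
    for (x, y, z), (nx, ny, nz), h in zip(points, normals, heights):
        d = h * scale
        # Displace along the normal direction -- per point only, so any
        # detail in the map finer than the point density is lost.
        displaced.append((x + nx * d, y + ny * d, z + nz * d))
    return displaced
```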