Started August 2002, Finished October 2002
This story has a somewhat twisted/interesting plot to it – in the Spring semester of my Junior year of high school we were told to pick a topic for our “Senior Project” and find a mentor who would, a year later, sign off on our work and try to guide us on the right path for our career. Having been heavily involved in game development and 3D computer graphics for around 5 years at that point, and living only two hours away, I decided to be a little bold and contact Tim Sweeney – the lead programmer at Epic Games.
Much to my surprise, he actually responded and said that he might be interested, but wanted me to put together a demo for him within two weeks. While reading his response the entire left side of my body went numb, literally, and I could no longer move that half of my body. I very slowly, somehow, managed to make my way downstairs to the kitchen, where I found my brother, Nate, and attempted to ask him for a drink of water. After several minutes of attempting this, only to see him look back with an odd expression, I ended up getting the water myself with extraordinary difficulty. I have no memory of the next several hours except waking up around 3AM in my bed (having no clue how I got there) and vomiting profusely well into the morning. My mom woke up at one point and decided to let me stay home the next day. The vomiting continued for around three days and I didn’t return to normalcy, or school, for, coincidentally, two weeks. Coming from a relatively large, very low income family and having a 3D game engine demo to write from scratch, I chose to skip the hospital visit and instead spent every waking moment where I wasn’t vomiting or delirious on the demo. Having never actually been checked out, no one really knows for sure what happened to me, but the general consensus was that it was either a stroke or a heart attack.. at age 16.
The demo I ended up creating consisted of a fully dynamically lit and textured terrain engine coupled with a particle engine and a first-person camera. The actual scene was a grassy terrain with rain, fire and fireworks (the latter all done with particles), written in C++ and Direct3D 8 – fully complete, bug-free and on time for the two-week deadline. Impressed with the quality of the code, the rate at which I developed everything and the demo itself, Tim agreed to be my mentor for the project I’d then do the next year – I was both amazed and ecstatic.
Unfortunately, when it came time to actually do the project, he was unreachable. Epic was in the thick of Unreal Tournament 3’s development so, while a little disheartening for a youngster, I understood. Instead, my mentor ended up being David “Virdog” Payne from, at the time, BrainBox Games. BrainBox was a startup sister company of Digital Extremes, which was in turn a sister company of Epic’s. They were in the early stages of development on a game called Pariah (eventually published under the Digital Extremes name) and he was the lead animator.
The whole ‘Senior Project’ bit was largely a joke, in my eyes at least. I had been programming full time (if largely without pay) in C++ literally since I was eight years old, and had been doing 3D computer graphics, games and simulations since I was 12. I always had a new game engine in development with a multitude of tools, editors and demos, so for my school to require me to do something pseudo career-oriented as a preparation tool was fairly laughable. David was one of my IRC friends whom I talked to almost daily anyway, so he agreed to let me fax him the required school papers, signed off on them, and I continued working on all the projects I always worked on anyway.. it just also counted for school credit this time. Anyway, enough about the history – on to the actual product.
GeoBump was originally designed as a tool that’d allow game developers to generate bump maps (both normal maps and displacement maps) from actual geometry rather than trying to pen them in in Photoshop or some other image editing software. Artists would create two versions of all their 3D assets: the normal, blocky, low-resolution models consisting of a few thousand triangles everyone was used to seeing in games, and an extremely high-resolution model consisting of millions of triangles where every little freckle and skin gash was fully modeled out with actual geometry. After splitting the models into an octree spatial data structure, GeoBump would then, for each triangle on the low-resolution model, ‘walk’ across the triangle interpolating the texture coordinates and vertex normals, and shoot anywhere from 1 to 1024 rays in both the positive and negative normal directions. (GeoBump allowed completely insane levels of anti-aliasing when generating these maps – you could select anywhere from a single sample to thousands for each pixel on the resultant normal map, and you had the option of plain old box filtering or Gaussian filtering.) The two models were placed overlapping each other; originally I only had the rays going in one direction, with the high-resolution model assumed to always be on the exterior of the low-res model, but these models were usually created by humans and would often intersect and overlap, so the tool had to be designed to handle both directions. Once the point of intersection with the high-resolution model was found, the vertex normals of the high-resolution model were interpolated to that point and the sample was taken. If multiple samples were chosen, you’d repeat this process for the number of samples, each time jittering the starting position (and, subsequently, also slightly altering the ray direction by interpolating the normal).
Needless to say this earned me an ‘A’ on the project, but what was interesting was that it apparently rivaled the other offerings on the market, including what Epic was offering in their own Unreal Engine at the time: David asked if he could put it to use on Pariah, which was using the Unreal Engine and contained a similar tool that was still very early in development. (At the time, most of the tools they were using were apparently not all that functional – not all that surprising considering they didn’t ship for around three and a half years after this.) An artist I was working with at the time in Oslo, Norway also put it to use on our own projects and on some of his pet projects, one of his artist friends working at 2015 requested to use it for their products, and from there it spread. About six months to a year later, Crytek also put out a similar tool that they sold individually for around 10 grand per user – I probably should have charged for it!
In later versions of GeoBump that never left my hands, it was also extended to allow grabbing just about any data from the high-resolution mesh, including color (for building the texture maps directly) and textured color (if they went so far as to uv-map and texture the high-res mesh.. which was fairly rare). It also let you run spherical-harmonic simulations on the high-res model and embed the results into 2D textures mapped onto the low-res mesh, giving you fully globally illuminated, self-shadowing bumps at the per-pixel level. This let the low-res models light in real time at a quality nearly indistinguishable from running full-on Pixar-quality global-illumination simulations on the high-res mesh – something I have yet to see put to use in any commercial product to this day.