I’ve taken a break from Alaska over the summer in order to come back at it from a fresh perspective, and I’m going to be working hard over the next 2 months to get a build ready for the IGF submission deadline. As with previous years, if the game isn’t ready for submission I won’t submit it, but it’s good to have these external deadlines as they are the most effective thing I’ve found to motivate progress towards a polished, presentable build. As a result I won’t be working on assets or code, just the scripts that control the core gameplay experience, producing a healthy horizontal slice as opposed to the vertical slice I was working up to before the summer. A vertical slice is good for screenshots and trailers because it has final art and makes the game look a lot more polished than it is; a horizontal slice, especially with a simulation game, is much more representative of how the end result will play. So getting the majority of the core gameplay in place is the priority for the next 2 months, and if I can get enough content into the game, I’ll submit it to the IGF!
I thought I’d detail the gameplay of Alaska in a post because I haven’t really talked about gameplay, and it’ll help me get it straight in my head as well.
The basic gameplay of Alaska is based around opening up the environment with new tools, metroidvania style, to enable you to scrutinise clues left there. The main gameplay verbs are:
- SCUMM-style finding & using of objects
- dialogue: information flows between NPCs dynamically and you can extract that information from some NPCs more easily than others (famously employed by Bethesda in Oblivion and inexplicably removed from all their subsequent games)
- the three novel gameplay systems, fueled by the metroidvania-style upgrade system
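As a rough illustration of what I mean by dynamic information flow, here's a minimal sketch (in Python for brevity; the actual game systems are scripted in Lua, and all the names and numbers below are hypothetical placeholders, not game code). Facts spread between NPCs when they talk to each other, and each NPC has a different willingness to give a fact up to the player:

```python
# Hypothetical sketch of dynamic information flow between NPCs.
class NPC:
    def __init__(self, name, openness):
        self.name = name
        self.openness = openness  # 0.0 = tight-lipped, 1.0 = a gossip
        self.facts = set()

    def chat(self, other):
        """Two NPCs pool everything they know when they talk."""
        shared = self.facts | other.facts
        self.facts = set(shared)
        other.facts = set(shared)

    def interrogate(self, fact, persuasion):
        """The player only extracts a fact if persuasive enough
        to beat the NPC's reluctance (1.0 - openness)."""
        return fact in self.facts and persuasion >= 1.0 - self.openness

barkeep = NPC("barkeep", openness=0.8)
trapper = NPC("trapper", openness=0.2)
trapper.facts.add("saw lights at the mine")

# The rumour spreads when the two NPCs talk to each other...
barkeep.chat(trapper)

# ...and the chatty barkeep gives it up far more easily than the trapper.
print(barkeep.interrogate("saw lights at the mine", persuasion=0.3))  # True
print(trapper.interrogate("saw lights at the mine", persuasion=0.3))  # False
```

The point of the design is that the same fact can reach the player through many routes, so who you choose to press for information becomes a gameplay decision.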
The novel gameplay systems are:
Hacking tools use hydrogen to power the quantum processor in your phone; they are the superpowers of the game (similar to augmentations in Deus Ex).
Code Breaker:
The code breaker allows you to access anything that takes a password or PIN.
Wireless Port Scanner:
The wireless port scanner allows you to find weak ports to use the code breaker on.
Wireless Port Opener:
The port opener allows you to force open a port on a device with no open ports, as long as it doesn’t have a wireless shield.
Lockpicks:
Lockpicks enable you to unlock manual locks in the world (functionally identical to lockpicks in Deus Ex).
Hydrogen Cells (Disposable)
Hydrogen Cells are consumed to fuel the Quantum Processor in your phone, which is used by all the hacking tools (functionally identical to bioelectric cells in Deus Ex).
Screwdriver:
The screwdriver allows you to remove the wireless shielding from a device, which would otherwise be unhackable.
The forensics mechanics will work similarly to the research mechanic in System Shock 2, although they will be a little more involved.
The Spectral Analyser will tell you the elements contained in a sample, along with the 3 most probable compounds, which you then use the Compound Reactor to test against.
The Compound Reactor takes the sample and a pure chemical element, attempts to react them and measures the reaction; combined with the spectral analysis you can then assert which compound a sample is.
On the rare occasion you find 2 samples that the compound reactor has shown to contain DNA, you can use another machine to compare their Short Tandem Repeats, which is almost certainly the method you will use to prove the guilt of your suspect in the game.
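To make the analyser/reactor loop concrete, here's a minimal sketch of how the two machines could narrow a sample down to a single compound (Python for brevity; the real systems are scripted in Lua, and the compound and reactivity tables here are entirely hypothetical placeholders): the Spectral Analyser produces candidate compounds that match the sample's elements, and each reactor test eliminates candidates inconsistent with the observed reaction.

```python
# Hypothetical compound table: compound name -> constituent elements.
COMPOUNDS = {
    "water":             {"H", "O"},
    "hydrogen peroxide": {"H", "O"},
    "table salt":        {"Na", "Cl"},
}

# Hypothetical reactivity data: which pure elements each compound reacts with.
REACTS_WITH = {
    "water":             {"Na"},
    "hydrogen peroxide": {"Na", "Fe"},
}

def spectral_analysis(sample_elements):
    """Return candidate compounds whose elements match the sample."""
    return [name for name, elems in COMPOUNDS.items()
            if elems == sample_elements]

def reactor_test(candidates, element, observed_reaction):
    """Keep only candidates consistent with whether the sample reacted."""
    return [c for c in candidates
            if (element in REACTS_WITH.get(c, set())) == observed_reaction]

# The analyser can't tell water from hydrogen peroxide (same elements)...
candidates = spectral_analysis({"H", "O"})
print(candidates)  # ['water', 'hydrogen peroxide']

# ...but the sample doesn't react with iron, so it must be water.
candidates = reactor_test(candidates, "Fe", observed_reaction=False)
print(candidates)  # ['water']
```

The appeal of this structure is that the player does real deduction: the analyser alone is ambiguous, and picking which element to react against is the interesting choice.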
I’m putting together a new video of some of the features I’ve been working on since October; it should be done this week. In the meantime I wanted to write a post about the architecture of the Agency engine (Agency is the name of the engine I wrote to build Alaska), firstly to let people know how easy it is to roll your own engine using the open source, free-for-commercial-use libraries out there. The benefits of rolling your own engine are pretty comprehensive, but the most important to me are that you don’t have to worry about letting licenses lapse (and therefore losing the right to maintain, improve or update your applications) and that you can see exactly where your code is falling down with a native debugger.
So what does the Agency engine consist of?
Firstly, the engine links against the following libraries to provide their respective utilities:
Direct 3D 10
(rendering is abstracted and will soon be optionally replaced with OpenGL support for Linux)
(deprecated by Microsoft, who laughably haven’t made anything to replace it; one of the many areas Microsoft have neglected in Windows)
(the best free physics library available)
TinyXML
(all non-binary assets are in XML format and I’ve known & loved TinyXML for years; using XML in a release build is frowned upon because it’s costly, so I might switch if I get the time)
Lua & Luabind
(for object-oriented scripting)
Recast & Detour
(for dynamic navigation meshes)
Then there are my main engine classes:
(for loading my proprietary BSP format; I originally used Valve BSPs but it turns out there’s no explicit license for the Source SDK, which means I’d have to license it from Valve)
Vector Math library
(stupidly, I reimplemented something that’s been done a million times before; probably the biggest mistake of the project)
2D library for menus
(uses scripts and XML and so is completely scriptable without touching native code)
(probably my second biggest mistake in this project: I wrote my own COLLADA loader before finding out there is an open source library called Assimp that would’ve done it for me. The COLLADA format is a complete mess, as is Blender’s implementation of it, and the loader took a long time)
(everything loaded from disk inherits this class template, which implements reference counting to prevent redundant copies of assets)
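The idea behind that base class can be sketched in a few lines (shown in Python for brevity; the engine itself is C++, and these names are illustrative, not the actual engine API): a shared cache hands out the same object for repeated loads, and only frees it when the last reference is released.

```python
# Illustrative sketch of a reference-counted asset cache.
class AssetCache:
    def __init__(self):
        self._assets = {}  # path -> (asset, refcount)

    def acquire(self, path):
        """Return the cached asset, loading it only on first request."""
        if path in self._assets:
            asset, refs = self._assets[path]
            self._assets[path] = (asset, refs + 1)
        else:
            asset = "<data from %s>" % path  # stand-in for real disk loading
            self._assets[path] = (asset, 1)
        return self._assets[path][0]

    def release(self, path):
        """Drop a reference; unload the asset when nobody holds it."""
        asset, refs = self._assets[path]
        if refs == 1:
            del self._assets[path]  # last user gone: free the asset
        else:
            self._assets[path] = (asset, refs - 1)

cache = AssetCache()
a = cache.acquire("textures/wall.dds")
b = cache.acquire("textures/wall.dds")  # same object, no second disk load
print(a is b)                           # True
cache.release("textures/wall.dds")
cache.release("textures/wall.dds")      # now actually unloaded
```

The payoff is that a texture used by fifty props exists once in memory, and nothing needs to coordinate who "owns" it.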
Finally, the content creation pipeline:
Models are created in Blender.
Maps are created in QuArK with my proprietary BSP tools, although Valve BSPs will continue to be supported, as it’s the levels that are under a non-commercial license, not the code.
Almost all textures are created in the trial version of Filter Forge, although I have no qualms about buying a license for it as it’s what’s enabled me to build the game without an artist.
The renderer is quite complex and warrants more detail. It’s loosely broken down into 9 steps & is a pre-pass lighting renderer, first defined here by Wolfgang Engel:
pBsp->update uploads the new BSP indices to the GPU, ensuring only the visible parts of the map are drawn.
preprocessLights figures out the closest lights that are within 1 bounce of the player, renders each light’s static shadow map if the light was just added this frame, and renders its dynamic shadow map, as long as there are shadow maps available and it hasn’t rendered too many this frame already.
drawNormals renders the world-space normals of the geometry to a render target and the depth information to a depth buffer.
drawLab iterates through each active light and accumulates all the lighting information in a pair of render targets. This step is very fill-rate intensive, so it uses a stencil buffer to restrict shading to the parts of the depth buffer within the light’s radius.
drawHDR renders the lit geometry, using the light accumulation buffer and the normal buffer from the previous steps, to a high dynamic range buffer, i.e. colour values go from 0 to something like 8000 (depending on the GPU & other factors).
drawBloom downsamples the HDR buffer, ignoring pixels below the bloom threshold, & then draws those downsampled buffers back to the HDR buffer with a blur filter.
drawToneMap renders the HDR buffer to a low dynamic range buffer by mapping the 0–8000 values non-linearly down to between 0 and 1 (lending more detail to mid-range colour values).
drawHud & drawDebugHud are fairly standard 2D forward renderers.
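To give a feel for the drawToneMap step, here's the shape of a simple non-linear tone mapping curve (a generic Reinhard-style operator, sketched in Python for illustration; the actual shader and its exact curve aren't shown here, and the exposure constant is an arbitrary assumption): it compresses an unbounded HDR value into 0–1 while spending most of the output range on mid-range values.

```python
def tone_map(hdr, exposure=1.0 / 500.0):
    """Reinhard-style curve: x / (1 + x) maps [0, inf) into [0, 1),
    keeping mid tones distinct while compressing bright highlights."""
    x = hdr * exposure
    return x / (1.0 + x)

# Dark, mid and very bright HDR values all land inside 0-1,
# but the mid range keeps the most resolution.
for value in (50.0, 500.0, 8000.0):
    print(value, round(tone_map(value), 3))
# 50.0 -> 0.091, 500.0 -> 0.5, 8000.0 -> 0.941
```

Note how 500 and 8000 (a 16x difference in scene brightness) map to 0.5 and 0.941: the highlights are squeezed together so the displayable range isn't wasted on them.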
I’ve not posted in a while for the same reason I’ve decided not to submit to the IGF this year: things have been pretty hectic recently and I haven’t had much time to work on Alaska, so it’s not up to the level I was planning, and it’ll be a finished build that gets submitted to next year’s IGF instead. The upside is it’ll almost certainly be out by then, and people will be able to play it instead of it being one of those frustrating games you can’t play for a year after it gets the award, never mind the nomination. As far as progress is concerned, I did manage to implement the AI state system, so my next goal is to finish the map and start adding the narrative extra content. That’s really the final stretch, but also probably the longest, so there’ll probably be videos to come but almost certainly no more playable builds; they’d only spoil the plot.
Thought I’d do a quick post on what I’ve been doing this week before I go off to Ireland. I’ve been polishing the map, fixing up geometry and applying textures, and I basically blew through all my existing textures, so I decided to generate a lot more. I use Filter Forge to generate my textures, which is an excellent tool and one of the key things that enables me to make such a nice game without an artist.
It does take a long time, however, and I also need to do some manual post-processing afterwards, so I decided to speed up the process by writing a batch file to automate it. Like every time I try to write a batch file, it sucked up an entire afternoon and barely functioned. I wasn’t happy with it, so I looked at Microsoft PowerShell, which was more like what I was after because you could do simple things like a string split without having to move heaven & earth; crucially, though, there was all sorts of certificate guff I would have to go through just to enable PowerShell on my machine. I wasn’t about to do that, because it’s idiotic that you need to install a certificate to run a script you wrote yourself, when I could write a program to do the same job in C in half the time it’d take me to install the certificate.
I knew there had to be a better way. I had messed around with Python in the past, to help my sister out with her CS work and when I was looking at a scripting language for my game, so I knew it was C-like and pretty simple. I also already had it installed, because I use Sick Beard, which is Python based. So I whipped up a quick script in 5 minutes with the help of some command line tools: ImageMagick for merging the specular maps output by Filter Forge, vtfcmd (which comes with VTFLib from Nem’s Tools) for converting the PNGs to the VTFs my current BSP editing tool uses, and finally Microsoft’s texconv, which comes with the DirectX SDK, for creating block compressed DDS textures.
And so I still have to generate the textures by hand in Filter Forge, which is a shame because there’s no reason it shouldn’t be configurable to spit out all the different components of a material in one go; currently you have to generate them individually, which is a fairly involved process. But once they’re generated, all I need to do is run my script and bang: I can use the textures in the BSP editor as VTFs & load them into the game as DDS.
import os, os.path, sys

print "Ratus Apparatus Texture Processor:"

# external tools, assumed to be on the PATH
specMergeCmd = "convert"  # ImageMagick, merges the specular components
dxtCmd = "texconv"        # from the DirectX SDK, makes block compressed DDS
vtfCmd = "vtfcmd"         # from Nem's Tools VTFLib, makes VTFs
dxtFormat = "DXT5"
srcType = ".png"
vtfMaterialPath="\"D:\\Steam\\steamapps\\email@example.com\\half-life 2 episode two\\ep2\\materials\""

#placeholder image names
blankImage = "blank"  # all-black placeholder channel
fullImage = "full"    # all-white placeholder channel

percentageDone = 0
vtfCmdLine = ""
dirList = os.listdir(".")
total = len(dirList)
for fname in dirList:
    splitname = fname.split(".")
    if len(splitname) > 1:
        type = splitname[-1]  # file extension
        main = splitname[0]   # base texture name
        # component maps use Filter Forge style suffixes (assumed convention)
        normal = main + "_normal"
        spec1 = main + "_spec1"
        spec2 = main + "_spec2"
        specOut = main + "_spec"
        if os.path.exists(normal+srcType) and type=="png":
            percentageDone += 1
            print "Processing " + fname +" "+ str(percentageDone) + "/" + str(total)
            # merge the specular components into one image, substituting
            # the placeholder channels for whichever components are missing
            if os.path.exists(spec1+srcType) and os.path.exists(spec2+srcType):
                os.system(specMergeCmd +" "+ spec1+srcType +" "+ spec2+srcType +" "+ blankImage+srcType + " -combine " + specOut+srcType)
            elif os.path.exists(spec1+srcType):
                os.system(specMergeCmd +" "+ spec1+srcType +" "+ fullImage+srcType +" "+ blankImage+srcType + " -combine " + specOut+srcType)
            elif os.path.exists(spec2+srcType):
                os.system(specMergeCmd +" "+ blankImage+srcType +" "+ spec2+srcType +" "+ blankImage+srcType + " -combine " + specOut+srcType)
            vtfCmdLine += " -file " + main+srcType + " -shader " + main+".vmt"
            os.system(dxtCmd +" -f " + dxtFormat +" "+ main+srcType)
            os.system(dxtCmd +" -f " + dxtFormat +" "+ normal+srcType)
            os.system(dxtCmd +" -f " + dxtFormat +" "+ specOut+srcType)
            matFile = open(main+".mat","w")  # engine material file, contents elided
            matFile.close()
vtfCmdLine += " -output " + vtfMaterialPath
#print vtfCmd + vtfCmdLine
os.system(vtfCmd + vtfCmdLine)
Lua & Luabind are amazingly powerful but also slightly arcane, with their lack of tutorials and heavy reliance on the STL, but there are some things I’ve learned that are essential to using Luabind. The first thing you should do is set it up to report runtime errors; I didn’t know this was possible since it’s very low down on the ‘documentation’ page, but it’s very, very important since you will be making all sorts of stupid mistakes during development. The second thing is that binding static members isn’t supported out of the box; it turns out it’s actually pretty simple if you think about it for a bit, or alternatively you can steal this guy’s code like I did.
Since finishing the Lua bindings I’ve been focusing on polishing up the map for the IGF submission build. It won’t take too long, since almost all the geometry is in & it’s just a case of placing props and picking textures. From there it’s fixing up the AI, and I’ve got some more scripting to put in, which’ll be a piece of cake now that I’m using a decent scripting language. Other things I’ve got to finish before October are the intro tutorial and an announce trailer, since once it’s submitted to the IGF I’m going to announce it; hopefully it’ll generate some interest.
I’m going to start posting here when I finish development milestones on the project. Right now I’m transitioning all my proprietary scripting over to Lua & Luabind, which is excellent except it has a tendency to fall over pretty badly. When I’m done with that I can finish the last piece of content I need to add before I can polish up the first half hour of gameplay and have a vertical slice ready for submitting to the IGF, plus a trailer for promotion. For now you can download the build I put out in April here; there won’t be another one until I submit to the IGF, because I’ll be busy working on it, but there will be more videos and screenshots.