Development Update

I know it’s been a long, long time since my last development update, almost a year! The reason is simply that the things I’ve been doing have been tediously boring, at least to my mind anyway. Here’s a brief summary of what I’ve been doing on Alaska for the last 10 months. I expect this will be my last development update before I start my Greenlight campaign; I currently estimate I’ll be ready to go on Greenlight in July or August.

Anniversary

Last month was the 5 year development anniversary of Alaska. My original estimate was a 3 year project, but I’ve changed jobs twice, got married and had 2 amazing kids in that time, and those things all take priority, as they should. This is what it looked like 5 years ago:

It’s come a long way since then. I’m pretty proud of what I’ve achieved, personally.

So what have I been doing over the last 10 months?

Moving Repo, Moving IDE, C++11 (Aug-Oct)

So the first thing I did after my last blog post was a lot of long overdue clean-up: moving to Visual Studio 2015, updating build flags and dependencies and generally tidying up the build. As part of that I also moved to git, which has taken a load off, as Subversion is a pain to work with. I backed up all my source assets too. With the move to Visual Studio 2015, I can now support C++11 properly, which meant I could clean up some crusty code.

Moving from Windows 7 and Visual Studio 2012 to Windows 10 and Visual Studio 2015 forced me to migrate from the old DirectX SDK to the DirectX included in the Windows SDK. I’m still using D3D10 and the effects framework; I’m not switching mid-development, and this meant reworking some things to work better with the new SDK & IDE. It’s better in the long run, but PIX still doesn’t work properly with old effects and that’s a nuisance.

This all took a few months, but it was important because it made a lot of things that had become a slog less so, and it was time away from the mammoth job I’d unwittingly undertaken in my last blog post, the dreaded character import pipeline rework.

Bug Fixes (Nov & February)

Mainly as a consequence of changing lots of stuff related to the build, there were a few weird bugs that needed addressing; I did some of these in November and some in February.

  • BSP Loading was very slow (one small part was 75% of the load time)
  • luabind threw exceptions in the destructor and VS2015 rightly didn’t like that
  • a couple of materials and reusing some render buffers were causing visual issues
  • The Game Entities used their address in memory as their GUID! They need real GUIDs.

Fixing all these issues meant I had to do the job I’d been avoiding:

The Dreaded Character Import Pipeline Rework (Nov-Mar)

My character import pipeline is a disgrace. It is laughably fragile and I’ve paid dearly for it. If this were my full time job I would have allotted much more time to making it more stable, but I figured I would only have to do it once or twice, so I could just suffer.

My import pipeline currently consists of:

  • Build and Rig the characters in Mixamo Fuse
  • Load the model in Blender and cut away all the hidden verts (teeth, joins)
  • Reduce the poly count as much as possible with decimate
  • Import the Rig with all the animations, repose & bind it
  • Export the rig and all animations, one by one to collada
  • Make DDS versions of all the textures

Alaska is quite rare for a small indie game in that it has 13 unique characters and this process is very manual, laborious and error prone. It took me a long time doing it manually before I decided to try and automate it.

By March I had generated all the final models and used a couple in the test map to prove them out. At this point I realised I could use Python in Blender to automate the export, and it made one of the most laborious parts trivial. It really is easy to write scripts for Blender and I thoroughly recommend it; this is the script I used to export all the animations:

import bpy
import os

for action in bpy.data.actions:
    outname = None
    for object in bpy.data.objects:
        if object.animation_data != None:
            object.animation_data.action = action
        bpy.ops.pose.transforms_clear()
        if object.name != "Armature":
            outname = object.name.lower()
    root = os.path.dirname(bpy.data.filepath)
    if action.name == "Idle":
        file = root + "\\" + outname + ".dae"
    else:
        actionname = action.name
        if actionname.endswith("StrafeWalking"):
            actionname = actionname[:-7]
        if actionname == "Walking":
            actionname = "Walk"
        file = root + "\\" + outname + actionname + ".dae"
    bpy.ops.wm.collada_export(filepath=file)

A few loops around this script and the final models were in the game but something was wrong!

Human Readable Formats (March-April)

I always knew I was going to need to implement binary formats, and now was the time: the final characters were massive COLLADA files with a lot of redundant garbage in them. This caused loading times to explode to about 2 minutes, if I remember correctly, so I was forced to address the issue. I ended up making unified binary formats for the following files:

  • skeletons
  • objects
  • atlases
  • fonts
  • bsps
  • shaders

This reduced the load times to ~10 seconds and was pretty satisfying. There were quite a lot of minor bug fixes as a result of this and it took me into April.
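
To give a flavour of what I mean by a unified binary format, here’s a minimal sketch in C++; the header layout and names are illustrative, not Alaska’s actual format. The idea is a small fixed header followed by tightly packed arrays, so loading a skeleton is a couple of freads instead of parsing a huge XML document.

#include <cstdint>
#include <cstdio>
#include <vector>

struct AssetHeader
{
    char     magic[4];     // e.g. "AGCY" - identifies the file as one of ours (assumed tag)
    uint32_t version;      // bump whenever the layout changes
    uint32_t boneCount;    // number of bones that follow the header
};

struct Bone
{
    int32_t parent;        // index of the parent bone, -1 for the root
    float   bindPose[16];  // 4x4 bind pose matrix, row major
};

// Load a skeleton in two reads: the header, then the whole bone array at once.
static bool loadSkeleton(const char* path, std::vector<Bone>& bones)
{
    FILE* f = std::fopen(path, "rb");
    if (!f) return false;

    AssetHeader header;
    bool ok = std::fread(&header, sizeof(header), 1, f) == 1
        && header.magic[0] == 'A' && header.magic[1] == 'G'
        && header.magic[2] == 'C' && header.magic[3] == 'Y';

    if (ok)
    {
        bones.resize(header.boneCount);
        ok = std::fread(bones.data(), sizeof(Bone), header.boneCount, f) == header.boneCount;
    }
    std::fclose(f);
    return ok;
}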

Kinematic Character Controllers vs Nav Mesh Based Character Controllers (April – May)

I had been using a kinematic character controller for the full length of development, and while I was fairly happy with it, there were a couple of issues. Primarily I couldn’t achieve the kind of control over it I wanted: getting pushed out of the way, sliding down surfaces, climbing vertical faces and tunnelling were all issues I’d been fighting all through development.

Kinematic Character Controllers are good because they give you a strong connection to the physical geometry (walking into a door, knocks it open for example). Nav Mesh Based Character Controllers are good because they give you a strong connection to AI reasoning (If an AI decides he wants to go somewhere, he’s not going to get stuck on the way).

Weighing these two options up, I decided to take the plunge and replace the kinematic controller I wasn’t really happy with, with a nav mesh based controller. It turned out to be a lot simpler than I anticipated, and now I feel like I have a good character controller which I can tweak much more easily, without having to fire off rays and apply forces.
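
For the curious, here’s roughly what moving an agent on the nav mesh looks like with Detour, which the engine already uses for navigation. The NavAgent struct and the function around it are made up for illustration, not the actual controller code; the point is that the agent slides along the nav mesh surface instead of sweeping a capsule against collision geometry.

#include "DetourNavMeshQuery.h"

struct NavAgent
{
    dtPolyRef currentPoly;  // polygon the agent is currently standing on
    float     position[3];  // position on the nav mesh surface
};

// Move the agent towards a desired position; Detour clamps the result to the
// walkable surface, so sliding along walls and staying on the floor come for free.
void moveAgent(dtNavMeshQuery* query, const dtQueryFilter* filter,
               NavAgent& agent, const float* desiredPos)
{
    float     resultPos[3];
    dtPolyRef visited[16];
    int       visitedCount = 0;

    query->moveAlongSurface(agent.currentPoly, agent.position, desiredPos, filter,
                            resultPos, visited, &visitedCount, 16);

    // The last visited polygon is the one the agent ends up on.
    if (visitedCount > 0)
        agent.currentPoly = visited[visitedCount - 1];

    // Snap the height to the poly so the agent follows ramps and stairs.
    float h = resultPos[1];
    query->getPolyHeight(agent.currentPoly, resultPos, &h);

    agent.position[0] = resultPos[0];
    agent.position[1] = h;
    agent.position[2] = resultPos[2];
}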

Animatic (May~ongoing)

I had been looking for someone to do an animatic for me, to help with promotion and to cement some of the themes of the game more firmly at the start, which should help the demo pop better. By April I had decided, as usual, to stop waiting for someone to come to me and to try and throw something together with my extremely limited artistic capability. It’s ongoing and may not pan out, but if I can manage to achieve what I’m after, it’ll really add something I feel the game needs.

What Next?

The Greenlight submission really is coming soon (although I won’t submit it until I am completely happy with it). To get there I need to block off the areas of the map I don’t want the player going, finish the animatic, make an updated in-game trailer and then make my Greenlight page. Following me on Twitter is the best way to keep up with ongoing development; fingers crossed my next post will be about the Greenlight submission!

Back from my break

I’ve taken a break from Alaska over the summer in order to come back at it from a fresh perspective. I’m going to be working hard over the next 2 months to get a build ready for the IGF submission deadline. As with previous years, if the game isn’t ready for submission I won’t submit it, but it’s good to have these external deadlines as they are the most effective thing I’ve found to motivate progress towards a polished, presentable build. As a result I won’t be working on assets or code, just the scripts that control the core gameplay experience, producing a healthy horizontal slice as opposed to the vertical slice I was working up to before the summer. A vertical slice is good for screenshots and trailers because it has final art and makes the game look a lot more polished than it is; a horizontal slice, especially with a simulation game, is much more representative of how the end result will play. So getting the majority of the core gameplay in place is the priority for the next 2 months, and if I can get enough content in the game, I’ll submit it to the IGF!

Alaska Gameplay

I thought I’d detail the gameplay of Alaska in a post because I haven’t really talked about gameplay, and it’ll help me get it straight in my head as well.

The basic gameplay of Alaska is based around opening up the environment with new tools, in a metroidvania style, to enable you to scrutinise clues left there. The main gameplay verbs are:

  • SCUMM-style finding & using objects
  • dialog; information flows between NPCs dynamically and you can extract that information from some NPCs more easily than others (famously employed by Bethesda in Oblivion and inexplicably removed from all their subsequent games)
  • the three novel gameplay systems fueled by the metroidvania style upgrade system

The novel gameplay systems are:

  • HACK
  • TOOLS
  • CHEMISTRY

Hack

Hacking tools use hydrogen to power the quantum processor in your phone; they are the super powers of the game (similar to augmentations in Deus Ex).

Code Breaker:

The code breaker allows you to access anything that takes a password or pin.

Wireless Port Scanner:

The wireless port scanner allows you to find weak ports to use the code breaker on.

Wireless Port Opener:

The port opener allows you to force open a port on a device with no open ports, as long as it doesn’t have a wireless shield.

Tools

Lockpicks (Disposable)

Lockpicks enable you to unlock manual locks in the world (functionally identical to lockpicks in Deus Ex).

Hydrogen Cells (Disposable)

Hydrogen Cells are consumed to fuel the Quantum Processor in your phone, which is used by all the Hacking tools (functionally identical to bioelectric cells in Deus Ex).

Screwdriver

The screwdriver allows you to remove the wireless shielding from a device, which would otherwise be unhackable.

CHEMISTRY

This mechanic will work similarly to the research mechanic in System Shock 2, although it will be a little more involved.

Spectral Analyser

The Spectral Analyser will tell you the elements contained in a sample, along with the 3 most probable compounds, which you then use the Compound Reactor to test against.

Compound Reactor

The Compound Reactor takes the sample and a pure chemical element, attempts to react them and measures the reaction; combined with the spectral analysis, you can then assert which compound a sample is.

DNA Sequencer

On the rare occasion you find 2 samples that the Compound Reactor has shown to contain DNA, you can use this machine to compare Short Tandem Repeats; this is almost certainly the method you will use to prove the guilt of your suspect in the game.

The Anatomy of the Agency Engine

I’m putting together a new video of some of the features I’ve been working on since October; it should be done this week. In the meantime I wanted to write a post about the architecture of the Agency Engine (Agency is the name of the engine I wrote to build Alaska in), firstly to let people know how easy it is to roll your own engine using the open source, free for commercial use libraries out there. The benefits of rolling your own engine are pretty comprehensive, but the most important things to me are that you don’t have to worry about letting licenses lapse and therefore losing the right to maintain, improve or update your applications, and that you can see exactly where your code is falling down with a native debugger.
So what does the Agency Engine consist of?

Firstly, the engine links against the following libraries to provide their respective utilities:

  • Direct3D 10 (rendering is abstracted and will soon be optionally replaced with OpenGL support for Linux)
  • DirectInput (deprecated by Microsoft, who laughably haven’t made anything to replace it; one of the many areas of Windows Microsoft have neglected)
  • Bullet (the best free physics library available)
  • TinyXML (all non-binary assets are in XML format and I’ve known & loved TinyXML for years; using XML in a release build is frowned upon because it’s costly, so I might switch if I get the time)
  • Lua (for scripting)
  • LuaBind (for object-oriented scripting)
  • Recast & Detour (for dynamic navigation meshes)

Then there are my main engine classes:

  • BSP loader (for loading my proprietary BSP format; I originally used Valve BSPs, but it turns out there’s no explicit license for the Source SDK, which means I’d have to license it from Valve)
  • Vector math library (stupidly I reimplemented something that’s been done a million times before, probably the biggest mistake of the project)
  • 2D library for menus (uses scripts and XML and so is completely scriptable without touching native code)
  • Model loader (probably my second biggest mistake in this project: I wrote my own COLLADA loader before finding out there is an open source library called assimp that would’ve done it for me; the COLLADA format is a complete mess, as is Blender’s implementation of it, and the loader took a long time)
  • Resource template (everything loaded from disk inherits this class template, which implements reference counting to prevent redundant assets)
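
As an illustration of the resource template idea (a simplified sketch, not the actual Agency code), repeated requests for the same path return the cached instance and bump a reference count instead of hitting the disk again:

#include <map>
#include <string>

template <typename T>
class Resource
{
public:
    // Returns the already-loaded instance if the path is in the cache,
    // otherwise loads it and adds it with a reference count of one.
    static T* acquire(const std::string& path)
    {
        auto it = cache().find(path);
        if (it != cache().end())
        {
            ++it->second.refs;
            return it->second.asset;
        }
        T* asset = new T(path);          // each asset type knows how to load itself
        cache()[path] = Entry{asset, 1};
        return asset;
    }

    // Drops a reference; the asset is only destroyed when nobody holds it any more.
    static void release(const std::string& path)
    {
        auto it = cache().find(path);
        if (it == cache().end()) return;
        if (--it->second.refs == 0)
        {
            delete it->second.asset;
            cache().erase(it);
        }
    }

private:
    struct Entry { T* asset; int refs; };
    static std::map<std::string, Entry>& cache()
    {
        static std::map<std::string, Entry> s_cache;
        return s_cache;
    }
};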

Finally, the content creation pipeline:

  • Models are created in Blender.
  • Maps are created in QuArK with proprietary BSP tools, although Valve BSPs will continue to be supported, as it’s the levels that are under a non-commercial license, not the code.
  • Almost all textures are created in the trial version of Filter Forge, although I have no qualms about buying a license for it, as it’s what’s enabled me to build the game without an artist.

Renderer

The renderer is quite complex and warrants more detail. It’s loosely broken down into 9 steps and is a pre-pass lighting renderer, as first described by Wolfgang Engel:

preProcessLights();
pBsp->update(*((Vector3 *)&pCamera->getEye()));
drawNormals();
drawLab();
drawHDR();
drawBloom();
drawToneMap();
drawHud();
drawDebugHud();

pBsp->update uploads the new BSP indices to the GPU, ensuring only the visible parts of the map are drawn.
preProcessLights figures out the closest lights that are within 1 bounce of the player, renders a light’s static shadow map if it was just added this frame and renders its dynamic shadow map, as long as there are shadow maps available and it hasn’t rendered too many this frame already.
drawNormals renders the world space normals of the geometry to a render target and the depth information to a depth buffer.
drawLab iterates through each active light and accumulates all the lighting information on a pair of render targets; this step is very fill rate intensive, so it uses a stencil buffer to restrict the work to the light’s radius in the depth buffer.
drawHDR renders the lit geometry, using the light accumulation buffer and the normal buffer from the previous steps, to a high dynamic range buffer, i.e. colour values go from 0 to something like 8000 (depending on the GPU & other factors).
drawBloom downsamples the HDR buffer, ignoring pixels below the bloom threshold, and then draws those buffers back to the HDR buffer with a blur filter.
drawToneMap renders the HDR buffer to a low dynamic range buffer by mapping the 0-8000 values non-linearly down to between 0 and 1 (lending more detail to mid-range colour values).
drawHud & drawDebugHud are fairly standard 2D forward renderers.
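
To illustrate the kind of non-linear mapping the tone map step performs, here’s the simple Reinhard operator; I’m not claiming it’s the exact curve the renderer uses, and the exposure parameter is just an assumed tuning knob.

float toneMap(float hdrValue, float exposure)
{
    float scaled = hdrValue * exposure;   // bring the HDR value into a sensible range
    return scaled / (1.0f + scaled);      // maps [0, inf) into [0, 1), keeping detail in the mid range
}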

IGF 2013

I’ve not posted in a while for the same reason I’ve decided not to submit to the IGF this year: things have been pretty hectic recently and I haven’t had much time to work on Alaska, so it’s not up to the level I was planning it to be. It’ll be a finished build that gets submitted to next year’s IGF; the upside is it’ll almost certainly be out by then and people will be able to play it, instead of it being one of those frustrating games you can’t play for a year after it gets the award, never mind the nomination. As far as progress is concerned, I did manage to implement the AI state system, so my next goal is to finish the map and start adding the narrative extra content, which is really the final stretch but also probably the longest. There’ll probably be videos to come, but almost certainly no more playable builds; they’d only spoil the plot.

Content Production Pipeline

Thought I’d do a quick post on what I’ve been doing this week before I go off to Ireland. I’ve been polishing the map, fixing up geometry and applying textures, and I basically blew through all my existing textures, so I decided to generate a lot more. I use Filter Forge to generate my textures; it’s an excellent tool and one of the key things that enables me to make such a nice game without an artist.

It does take a long time, however, and I also need to do some manual post processing afterwards, so I decided to speed up the process by writing a batch file to automate it. Like every time I try to write a batch file, it sucked up an entire afternoon and barely functioned. I wasn’t happy with it, so I looked at Microsoft PowerShell; it was more like what I was after, because you could do simple things like a string split without having to move heaven & earth, but crucially there was all sorts of certificate guff I would have to go through just to enable PowerShell on my machine. I wasn’t about to do that, because it’s idiotic that you need to install a certificate to run a script you wrote, when I could write a program to do the same job in C in half the time it’d take me to install the certificate.

I knew there had to be a better way. I had messed around with Python in the past, to help my sister out with her CS stuff and when I was looking at a scripting language for my game, and I knew it was C-like and pretty simple. I also already had it installed because I use Sickbeard, which is Python based. So I whipped up a quick script in 5 minutes with the help of some command line tools: ImageMagick, for merging the specular maps output by Filter Forge; vtfcmd, which comes with VTFLib from Nem’s Tools, for converting the PNGs to VTFs that my current BSP editing tool uses; and finally Microsoft’s texconv, which comes with the DirectX SDK, for creating DDS block compressed textures.

I still have to generate the textures by hand in Filter Forge, which is a shame because there’s no reason it shouldn’t be configurable to spit out all the different components of a material in one go; currently you have to generate them individually, which is a fairly involved process. But once they’re generated, all I need to do is run my script and bang, I can use the textures in the BSP as a VMF & load them into the game as a DDS.

import os, os.path, sys
print "Ratus Apparatus Texture Processor:"

path="."
vtfMaterialPath="\"D:\\Steam\\steamapps\\the_sombrero_kid@hotmail.com\\half-life 2 episode two\\ep2\\materials\""

#file sufixes
normalSufix="normal"
spec1Sufix="spec1"
spec2Sufix="spec2"
specSufix="spec"
srcType=".png"
dstType=".dds"

#tools
specMergeCmd="convert"
vtfCmd="vtfcmd"
dxtCmd="texconv"
dxtFormat="DXT1"

#placeholder image names
blankImage="blank"
fullImage="full"

percentageDone = 0
dirList=os.listdir(path)
vtfCmdLine = ""
total = len(dirList)
for fname in dirList:
    splitname=fname.split('.')
    main=splitname[0]
    type=""
    if len(splitname) > 1:
        type = splitname[1]
    normal=main+normalSufix
    spec1=main+spec1Sufix
    spec2=main+spec2Sufix
    specOut=main+specSufix
    if os.path.exists(normal+srcType) and type=="png":
        print "Processing " + fname +" "+ str(percentageDone) + "/" + str(total)
        percentageDone+=1
        if os.path.exists(spec1+srcType) and os.path.exists(spec2+srcType):
            os.system(specMergeCmd +" "+ spec1+srcType +" "+ spec2+srcType +" "+ blankImage+srcType + " -combine " + specOut+srcType)
        elif os.path.exists(spec1+srcType):
            os.system(specMergeCmd +" "+ spec1+srcType +" "+ fullImage+srcType +" "+ blankImage+srcType + " -combine " + specOut+srcType)
        elif os.path.exists(spec2+srcType):
            os.system(specMergeCmd +" "+ blankImage+srcType +" "+ spec2+srcType +" "+ blankImage+srcType + " -combine " + specOut+srcType)
        vtfCmdLine += " -file " + main+srcType + " -shader " + main+".vmt"
        os.system(dxtCmd +" -f " + dxtFormat +" "+ main+srcType)
        os.system(dxtCmd +" -f " + dxtFormat +" "+ normal+srcType)
        if os.path.exists(specOut+srcType):
            os.system(dxtCmd +" -f " + dxtFormat +" "+ specOut+srcType)
        matFile = open(main+".mat","w")
        matFile.write("\n\n\t" +main+dstType+ "\n\t" +normal+dstType+ "\n\t" +specOut+dstType+ "\n\tlitspecmap.fx\n")
        matFile.close()

vtfCmdLine += " -output " + vtfMaterialPath
#print vtfCmd + vtfCmdLine
os.system(vtfCmd + vtfCmdLine)

Lua binding

Lua & LuaBind are amazingly powerful but also slightly arcane, with their lack of tutorials and heavy reliance on the STL, but there are some things I’ve learned that are essential to using LuaBind. The first thing you should do is set it up to report runtime errors; I didn’t know this was possible, since it’s very low down on the ‘documentation’ page, but it’s very, very important since you will be making all sorts of stupid mistakes during development. The second thing is that binding static members isn’t supported out of the box; it turns out it’s actually pretty simple if you think about it for a bit, or alternatively you can steal this guy’s code like I did.
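
Here’s a sketch of both points, assuming the luabind 0.9-era API (if memory serves, set_pcall_callback takes a standard lua_CFunction); the Game struct and its version function are made-up names purely for illustration.

#include <lua.hpp>
#include <luabind/luabind.hpp>

// Error handler installed for luabind's internal pcalls: append a traceback
// to the error message (assumes the debug library has been opened).
static int reportError(lua_State* L)
{
    lua_getglobal(L, "debug");
    lua_getfield(L, -1, "traceback");
    lua_pushvalue(L, 1);      // the original error message
    lua_pushinteger(L, 2);    // skip this handler in the trace
    lua_call(L, 2, 1);
    return 1;                 // return the augmented message
}

struct Game
{
    static int version() { return 1; }    // hypothetical static member
};

void bindGame(lua_State* L)
{
    luabind::set_pcall_callback(&reportError);

    luabind::module(L)
    [
        luabind::namespace_("Game")       // statics end up as functions in a Game table
        [
            luabind::def("version", &Game::version)
        ]
    ];
}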

Since finishing the Lua bindings I’ve been focusing on polishing up the map for the IGF submission build. It won’t take too long, since almost all the geometry is in & it’s just a case of placing props and picking textures; from there it’s fixing up the AI, and I’ve got some more scripting to put in, which’ll be a piece of cake now I’m using a decent scripting language. Other things I’ve got to finish before October are the intro tutorial and an announce trailer, since once it’s submitted to the IGF I’m going to announce it; hopefully it’ll generate some interest.

Alaska Development Updates

I’m going to start posting here when I finish development milestones on the project. Right now I’m transitioning all my proprietary scripting over to Lua & LuaBind, which is excellent except it has a tendency to fall over pretty badly. When I’m done with that, I can finish the last piece of content I need to add before I can polish up the first half hour of gameplay and have a vertical slice ready for submitting to the IGF, along with a trailer for promoting it. For now you can download the build I put out in April here; there won’t be another one until I submit to the IGF because I’ll be busy working on it, but there will be more videos and screenshots.