Phil Hassey - game dev blog

Archive for the 'C++' Category

Turkey Tomahawk Turbo

Sunday, December 1st, 2013

I’m pleased to announce Turkey Tomahawk Turbo!

I made it for the Charity Game Jam this year!

So check it out on Windows, Mac, or Linux! And now on the Web!

Have fun!
-Phil

SDL2 Tips, Tricks, and Workarounds

Tuesday, November 5th, 2013

It only took me about 3 days to get my framework working with all my targets: Windows, Mac, Linux, iOS and Android. I took an extra day to write my own simple mixer for WAV and OGG audio. And there might have been another day in there for other cleanup / fixes. So maybe 1 week total.

You can get SDL2 here. If you’ve used SDL before, you’ll find that it’s pretty similar, only it seems to work better, and now it’s got slick iOS and Android support.

You will want to follow the SDL 1.2 to 2.0 Migration Guide. And then for Android and iOS, read the README-android.txt and README-ios.txt as included in the source zip. I’m just writing up some tips here to help you along the way. I’m writing this with OpenGLES 1.1 in mind.

SDL2 for Windows, Mac, Linux

– Initializing your window must be done in the right order. Basically, I call SDL_CreateWindow first, then SDL_GL_CreateContext. If SDL_CreateWindow fails with my preferred settings, I fall back to more fail-safe settings. After setting the mode I use SDL_GetWindowSize and SDL_GetWindowFlags to see if I got what I wanted. If things aren't quite right, I use SDL_SetWindowFullscreen and SDL_SetWindowSize to request them again. (I always go out of fullscreen, set size, maybe go into fullscreen, then set size again.) There's a sketch of this after the list below.

– As of SDL 2.0.1, Mac Retina displays do not work consistently. There is a flag for this, SDL_WINDOW_ALLOW_HIGHDPI, but I've found that if you ever change the mode, or do much of anything, things seem to fall apart. If you aren't messing around much, it might work well enough for you.

– glu doesn't seem to be available anymore, so I switched to using glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE); to generate my mipmaps. I check glGetString(GL_VERSION) to make sure it's >= 1.4 before doing that, and if I'm not on a new enough GL, I just don't use mipmaps at all. See the second sketch after this list.

– Use stb_image to load your images. (I didn’t do that just for SDL2, but it’s a great tip!)
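
Here's roughly what that window-init dance looks like in code. This is just a sketch: the title, sizes, and flags are placeholders, not my actual settings.

SDL_Window *win = SDL_CreateWindow("Game",
    SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
    1024, 768, SDL_WINDOW_OPENGL | SDL_WINDOW_FULLSCREEN);
if (!win) {
    // preferred settings failed -- retry with fail-safe settings
    win = SDL_CreateWindow("Game",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
}
SDL_GLContext ctx = SDL_GL_CreateContext(win);

// check what we actually got
int w = 0, h = 0;
SDL_GetWindowSize(win, &w, &h);
Uint32 flags = SDL_GetWindowFlags(win);
if (w != 1024 || h != 768 || !(flags & SDL_WINDOW_FULLSCREEN)) {
    // out of fullscreen, set size, (maybe) back into fullscreen, set size again
    SDL_SetWindowFullscreen(win, 0);
    SDL_SetWindowSize(win, 1024, 768);
    SDL_SetWindowFullscreen(win, SDL_WINDOW_FULLSCREEN);
    SDL_SetWindowSize(win, 1024, 768);
}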
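
And the mipmap fallback, roughly. Another sketch: tex, w, h, and pixels are assumed to come from your own image loading code.

// GL_GENERATE_MIPMAP requires OpenGL 1.4+, so parse the version string first.
int major = 0, minor = 0;
sscanf((const char *)glGetString(GL_VERSION), "%d.%d", &major, &minor);
int have_mipmaps = (major > 1) || (major == 1 && minor >= 4);

glBindTexture(GL_TEXTURE_2D, tex);
if (have_mipmaps) {
    // must be set BEFORE the texture data is uploaded
    glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
} else {
    // too old -- just skip mipmaps entirely
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);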

SDL2 for Android

I can’t believe how smoothly this went. (Recall, 2 months for NDK, 1 week for Marmalade .. this port took 1 DAY!!!)

– Follow README.android for the details.

– In AndroidManifest.xml, add android:configChanges="orientation" to the "activity" tag; this keeps your app from crashing when the orientation changes. You can use other orientation settings to lock it to a single orientation, or whatever.

– If you use sockets, be sure to add the "android.permission.INTERNET" permission.

– Use SDL_RWops to read your own data files (which you have placed in the "assets" folder of your Android project); see the first sketch after this list.

– If you want to use SDL2_mixer, you may need to edit SDL2_mixer/Android.mk and disable a few things.

– If you are using single touch, your SDL mouse code might work already; otherwise add support for the SDL_FINGER* events (and filter the touch-synthesized events out of your mouse code with if (e.motion.which == SDL_TOUCH_MOUSEID) { break; }). There's a sketch of that after this list too.

– Be sure to call SDL_SetTextInputRect before SDL_StartTextInput if you are using key input. SDL_SetTextInputRect lets you specify where on the screen the text is appearing so that SDL2 can shift the screen to keep the virtual keyboard from overlapping it.

– On suspend / resume I had to pause my audio.
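
Reading a data file through SDL_RWops looks like this. The path here is just an example name; error handling is minimal.

// On Android a relative path like this is read out of the APK's "assets"
// folder; on desktop it's just a normal relative file path.
SDL_RWops *rw = SDL_RWFromFile("data/level1.map", "rb");
if (rw) {
    Sint64 size = SDL_RWsize(rw);
    char *buf = (char *)malloc((size_t)size);
    SDL_RWread(rw, buf, 1, (size_t)size);
    SDL_RWclose(rw);
    // ... parse buf ...
    free(buf);
}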
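
And the touch / mouse filtering, as a sketch:

SDL_Event e;
while (SDL_PollEvent(&e)) {
    switch (e.type) {
    case SDL_MOUSEMOTION:
        if (e.motion.which == SDL_TOUCH_MOUSEID) { break; } // synthesized from touch
        // ... existing mouse motion handling ...
        break;
    case SDL_MOUSEBUTTONDOWN:
    case SDL_MOUSEBUTTONUP:
        if (e.button.which == SDL_TOUCH_MOUSEID) { break; }
        // ... existing mouse button handling ...
        break;
    case SDL_FINGERDOWN:
    case SDL_FINGERUP:
    case SDL_FINGERMOTION:
        // e.tfinger.x and e.tfinger.y are normalized 0..1 -- scale by window size
        break;
    }
}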

SDL2 for iOS

– Follow README.ios for the details.

– SDL_SetTextInputRect doesn't work, so you'll need to capture UIKeyboardWillShowNotification yourself and shift your screen to keep the virtual keyboard from overlapping it.

– Unlike in Android, the orientation won’t change between portrait and landscape UNLESS you add SDL_WINDOW_RESIZABLE to your SDL_CreateWindow flags.

– Even though I added in SDL_iPhoneSetAnimationCallback, I got crashes when suspending my app. So I also had to use SDL_SetEventFilter to capture SDL_APP_WILLENTERBACKGROUND and SDL_APP_DIDENTERFOREGROUND and set a flag that tells my callback to start / stop doing its thing (see the sketch below).
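
The filter itself is only a few lines. In this sketch, app_in_background is a made-up flag name.

static int app_in_background = 0;

// The filter runs as soon as iOS delivers the event, instead of waiting
// for the next SDL_PollEvent -- that's why it's used here.
static int event_filter(void *userdata, SDL_Event *event) {
    if (event->type == SDL_APP_WILLENTERBACKGROUND) { app_in_background = 1; }
    if (event->type == SDL_APP_DIDENTERFOREGROUND)  { app_in_background = 0; }
    return 1; // keep the event in the queue too
}

// during init:
SDL_SetEventFilter(event_filter, NULL);
// the animation callback then checks app_in_background and skips its work
// while the flag is set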

Custom Mixer?

You might be able to use SDL2_mixer on all platforms. I did try it out for Android, and got it working pretty easily. However, I decided to write up my own mixer using SDL2 to stream the output. (This ended up only taking a few hours, and it keeps me from having to have SDL2_mixer as an additional dependency on all platforms.)

I used SDL_LoadWAV to load wav files and stb_vorbis to load and stream ogg files.
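
A heavily simplified sketch of what a mixer like that can look like. The Sound struct and the globals are made-up names for this sketch, and error checking is omitted.

typedef struct { Uint8 *data; Uint32 len; Uint32 pos; int playing; } Sound;

#define MAX_SOUNDS 16
static Sound sounds[MAX_SOUNDS]; // filled in by SDL_LoadWAV elsewhere

// Called by SDL on the audio thread: sum all playing sounds into the stream.
static void mix_callback(void *userdata, Uint8 *stream, int len) {
    memset(stream, 0, len);
    Sint16 *out = (Sint16 *)stream;
    int samples = len / 2;
    for (int s = 0; s < MAX_SOUNDS; s++) {
        Sound *snd = &sounds[s];
        if (!snd->playing) continue;
        Sint16 *in = (Sint16 *)(snd->data + snd->pos);
        int avail = (int)((snd->len - snd->pos) / 2);
        int n = (avail < samples) ? avail : samples;
        for (int i = 0; i < n; i++) {
            int v = out[i] + in[i];
            if (v > 32767) v = 32767;
            if (v < -32768) v = -32768;
            out[i] = (Sint16)v;
        }
        snd->pos += (Uint32)(n * 2);
        if (snd->pos >= snd->len) snd->playing = 0;
    }
    // music: pull more samples with stb_vorbis_get_samples_short_interleaved()
    // and mix them in the same way
}

static void open_audio(void) {
    SDL_AudioSpec want, have;
    SDL_zero(want);
    want.freq = 44100;
    want.format = AUDIO_S16;
    want.channels = 2;
    want.samples = 1024;
    want.callback = mix_callback;
    SDL_AudioDeviceID dev = SDL_OpenAudioDevice(NULL, 0, &want, &have, 0);
    SDL_PauseAudioDevice(dev, 0); // start the callback
}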

Good luck!
-Phil

Getting ENet to work with Marmalade

Thursday, August 8th, 2013

Through a bit of Googling and whatnot, I got in touch with Joe Delgado who was able to provide a patch to fix ENet to work with Marmalade. These are modifications to unix.c. When I am making a Marmalade build of my games, I always have PLATFORM_MARMALADE defined. So you can replace that with whatever you do.

After the #ifdef __APPLE__ segment, add this to set up the correct defines, etc.

#ifdef PLATFORM_MARMALADE
#define SOMAXCONN 128
#undef HAS_POLL
#undef HAS_MSGHDR_FLAGS
#undef HAS_FCNTL
#include <sys/select.h>
#include <sys/uio.h>
#define HAS_SOCKLEN_T
#endif

In enet_socket_send, it turns out the Marmalade sendmsg is broken. So I replace:

    msgHdr.msg_iov = (struct iovec *) buffers;
    msgHdr.msg_iovlen = bufferCount;
    sentLength = sendmsg (socket, & msgHdr, MSG_NOSIGNAL);

With

#ifdef PLATFORM_MARMALADE
    // concatenates buffers together

    ENetBuffer * newBuffers;
    char* d;
    int i, totalSize;

    newBuffers = (ENetBuffer*)malloc(sizeof(ENetBuffer));
    totalSize = 0;
    for (i=0;i<bufferCount;i++) {
        totalSize += (buffers+i)->dataLength;
    }

    newBuffers->data = malloc(totalSize);
    newBuffers->dataLength = totalSize;
    d = (char*)newBuffers->data;
    for (i=0;i<bufferCount;i++) {
        memcpy(d, (buffers+i)->data, (buffers+i)->dataLength);
        d += (buffers+i)->dataLength;
    }

    msgHdr.msg_iov = (struct iovec *) newBuffers;
    msgHdr.msg_iovlen = totalSize;

    sentLength = sendmsg (socket, & msgHdr, MSG_NOSIGNAL);

    free(newBuffers->data);
    free(newBuffers);

#else

    msgHdr.msg_iov = (struct iovec *) buffers;
    msgHdr.msg_iovlen = bufferCount;
    sentLength = sendmsg (socket, & msgHdr, MSG_NOSIGNAL);

#endif

And that seems to get ENet working 100% with Marmalade. I applied this patch to ENet 1.3.6 … The original patch was written for ENet 1.3.3, and I suspect it would work for the latest version of ENet. I built this with Marmalade 6.3.2. I’ve passed on a request to the Marmalade team to fix sendmsg so that the larger chunk of the code would no longer be necessary.

-Phil

How I got Dynamite Jack from 62MB down to 46MB

Thursday, August 9th, 2012

Hey,

So – I went Universal with Dynamite Jack just today! Yay! This involved a lot of "blah blah" messing with resizing all the menus for iPhone users, which wasn't very interesting, though it came out really well. The interesting bit was when I realized that "going Universal" meant that my retina iPad assets were going to be put on everyone's phones. And that meant that the 50 MB OTA limit was going to hit me. Quite a few devs advised me against going over the limit, so I heeded their warning.

I was at 62 MB. I had to take the size down AT LEAST 12 MB to hit the 50 MB limit. However, I also know that iTunesConnect pads things a bit when distributing your game, so leaving an extra 2 MB of margin is a good idea. So my target was 48 MB.

What was taking up all that room??

– Video (IVF): 10 MB
– Sound effects (WAV): 7 MB
– Music (AAC): 24 MB
– Images (PNG): 20 MB
– Maps, etc: 1 MB

Okay, so I’m at a good place, I know exactly who is eating my space. But now what to do? Let’s check it out!

Video

So I decided not to do anything about the video. I include two IVF files, one for the retina iPad and one for everything else. They are as lossy as I want them to be, so further compression would start to eat into the quality. I could probably get away with re-encoding them a few percent lighter and save 1 or 2 MB if I really had to.

Sound Effects

Stereo 16-bit WAV files sound great, but they are huge. I took some advice and converted them to IMA4 files, which are 25% the size of WAV files. This saved me about 5 MB. However, when testing the game on my decent computer speakers I found that the IMA4 format adds a lot of noise to the sound, which was not acceptable to me. I decided to re-encode them using the AAC encoder at 96k, which resulted in perfect-sounding sound effects and saved me 6MB!

iMac$ afconvert -d aac -f caff -b 98304 in.wav -o out.caf

I am using CocosDenshion for my audio engine, and it seems to handle this just fine.

Music

My music was already encoded as AAC files at 128k. I tried a variety of lower bit rates to see if I could save a few bytes. At 64k it was really obvious that I was cutting corners. At 80k I couldn't tell any difference, so I decided to go with 96k to give myself a bit of a margin and be sure the music sounded perfect. I used the same command line as for the sound effects. Going from 128k to 96k saved me 5MB!

So far I had saved 11MB, but I was 14MB over and I knew I needed to trim a bit more fat to make this work.

Images

Previously I had tried a ton of variations on 16-bit “4444” dithered style images. These, unfortunately, looked horrible in Dynamite Jack. So I wasn’t able to use that trick.

I did find out about ImageOptim, which takes forever to pack PNGs but did manage to pull me back 2MB, getting me down to 13MB total saved, which was really going to cut things close. So I decided to investigate one other option.

I had heard that Amazing Breaker had used JPGs for the RGB component of images and a PNG file for the alpha component. I really require high-quality images in my game, and I found that at 98% quality and 1×1 sampling I was able to get really great looking images.

Ubuntu$ convert -quality 98 -sampling-factor 1x1 tmp-rgb.bmp tmp-rgb.jpg

I pre-blitted them onto a black background (which gave me premultiplied alpha) to get the JPG. Then I made a grayscale PNG file of the alpha channel. I created my own mini format ".cuz" to combine these into single files and loaded them in my game. I found that the game looked perfect! This saved me 6MB!

Afterwards, I found that some of my pre-baked font images got larger using my format, so I left those as straight PNGs.

Finally …

So all said and done, I had saved 6 + 5 + 6 = 17MB! This got my IPA down to around 46MB, which is a nice distance below the 50MB limit 🙂 I’m quite pleased with the results. Some bonus tips:

– The AAC sound effect trick will only work on iOS 3.0 and higher, which shouldn't be a problem nowadays. I hear that this won't work out-of-the-box with OpenAL, so maybe check out CocosDenshion.

– Definitely check your own music to find what bit rate starts to degrade the sound. Playing on your iPhone or iPad speaker isn't enough, and playing on earphones isn't either (unless they are really nice). I recommend playing on your computer speakers so you can be sure the sound IS really good before deciding.

– The JPG+PNG image trick can get great results – but definitely keep the quality high. I found I was able to go down to 98% with no artifacts in my game. Be sure to experiment and find the sweet spot for your game images. Also be sure to test on all the device resolutions you support to check for artifacts. I tested on all 4 iOS screen resolutions to be sure things were perfect.

Anyway, I hope you find this helpful in your quest for saving bytes! And don’t compromise on quality! Nobody wants to hear or see compression artifacts.

-Phil

P.S. If you need to cut your App in half after that, here’s what you could do:

– Change all SFX+music from stereo to mono
– Set JPG quality to 95% or something even less

That would probably cut mine back another 15MB or so, and very few people would notice. I would notice a tiny bit, and in my case, I don’t need to compromise any more, since I’m already under 50MB.


Bonus: a bit more detail on my image file format

First, I used a Python script to figure out the best way to compress each image. It does a variety of conversions and keeps whichever one is smallest:

– Using JPG-RGB + PNG-A
– Using PNG premultiplied Alpha
– Using original PNG

So, for example, fonts ended up working best as “original PNGs” and most everything else ended up being JPG-RGB + PNG-A. There should be a 4th option of just JPG-RGB with no alpha, but I didn’t bother, since I don’t have any fully opaque textures.

# -*- coding: utf-8 -*-
import glob
import os
import pygame
from pygame.locals import *
from PIL import Image
import numpy

SRC = "../data-ios"
DST = "../data-ios"

def do_cmd(cmd):
    print cmd
    os.system(cmd)

def png_fix(fname):
    img = pygame.image.load(fname)
    img = img.convert_alpha()
    pygame.image.save(img,fname)
    
def premult(finput, foutput):
    im = Image.open(finput)
    
    print "premultiplying matrix..."
    a = numpy.fromstring(im.tostring(), dtype=numpy.uint8)
    alphaLayer = a[3::4] / 255.0
    a[::4]  *= alphaLayer
    a[1::4] *= alphaLayer
    a[2::4] *= alphaLayer
    res = Image.fromstring("RGBA", im.size, a.tostring())

    res.save(foutput)
    png_fix(foutput)

def main():
    s = pygame.display.set_mode((256,256),0,32)
    for fname in glob.glob(SRC+"/*.png"):
        print fname
        img = pygame.image.load(fname).convert_alpha()
        
        img2 = img.convert_alpha()
        img2.fill((0,0,0,255))
        img2.blit(img,(0,0))
        pygame.image.save(img2,"tmp-rgb.bmp")
        
        cmd = "convert -quality 98 -sampling-factor 1x1 tmp-rgb.bmp tmp-rgb.jpg"
        do_cmd(cmd)
            
        img.fill((255,255,255),None,BLEND_RGB_MAX)
        img2 = img.convert_alpha()
        img2.fill((0,0,0,255))
        img2.blit(img,(0,0))
        pygame.image.save(img2,"tmp-a.bmp") # save the blitted alpha-as-grayscale surface (img2), not img
        
        cmd = "convert tmp-a.bmp -define png:bit-depth=8 -define png:color-type=0 tmp-a.png"
        do_cmd(cmd)
        
        dst = fname 
        dst = dst.replace(".png",".cuz")
        f = open(dst,"wb")
        
        # 4 byte magic
        f.write("CZCO")
        # 4 byte version / whatever
        f.write("I\x00\x00\x01")
        
        s1 = open("tmp-rgb.jpg","rb").read()
        s2 = open("tmp-a.png","rb").read()
        t = 3 #JPG + PNG
        s3 = open(fname,"rb").read()
        
        # add a check for non-alpha images, store as JPGs.
        # wouldn't save much room since the full alpha PNG itself will only
        # be like 100 bytes.  not a high priority item!
        
        if (len(s3) < (len(s1)+len(s2))): 
            # we have failed, fall back to just wrapping a PNG
            # but first, premultiply it
            
            premult(fname,"tmp-pre.png")
            s4 = open("tmp-pre.png","rb").read()
            
            if len(s3) < len(s4):
                t = 1 # PNG - original
                s1 = s3
                s2 = ''
            else:
                t = 2 # PNG - premult
                s1 = s4
                s2 = ''
        
        s1 += "\x00"*(4-len(s1)%4)
        s2 += "\x00"*(4-len(s2)%4)
        
        # 24 byte info
        s = "%d %d %d"%(t,len(s1),len(s2))
        s += "\x00"*(24-len(s))
        f.write(s)
        
        # data
        f.write(s1)
        f.write(s2)
        
s = open("%s/data.json"%(SRC)).read()
s = s.replace(".png",".cuz")
f = open("%s/data.json"%(SRC),"wb")
f.write(s)
f.close()

main()

In my game, I use stb_image to load my images. But I used a little bit of C code to read my header and decide how to decode them.

        unsigned char cuz_head[256];
        FILE *f = fopen(fname,"rb");
        fread(cuz_head,1,256,f);
        int tp=0,s1=0,s2=0;
        sscanf((char*)cuz_head+8,"%d %d %d",&tp,&s1,&s2);
        fprintf(stderr,"is_cuz: %d %d %d\n",tp,s1,s2);
        fseek(f,32,SEEK_SET);
        
        // load our first image!
        data = stbi_load_from_file(f,&width,&height,&bpp,4);

        if (tp == 1) {
            // do nothing, it's like we loaded a normal image
        }
        
        if (tp == 2) {
            // this image is premultiplied
            is_premult = 1;
        }
        
        if (tp == 3) // separate ALPHA image
        if (data) {
            // this image is premultiplied
            is_premult = 1; 
        
            unsigned char *alpha;
            fseek(f,32+s1,SEEK_SET);
            int _width,_height,_bpp;
            alpha = stbi_load_from_file(f,&_width,&_height,&_bpp,1);
            
            if (!alpha) {
                fprintf(stderr,"(cuzi:alpha) stbi_load failed %s - %s\n",fname,stbi_failure_reason());
            }
            
            if (alpha) {
                fprintf(stderr,"(cuzi:alpha) OK\n");
            
                unsigned int *pix = (unsigned int *)data;
                unsigned char *pa = alpha;
                for (int i=0; i<width*height; i++) {
                    unsigned char *p = (unsigned char*)pix;
                    p[3] = *pa;
                    pa ++;
                    pix ++;
                }
                stbi_image_free(alpha);
            }
            
        }

Dynamite Jack: Final Prototype post-post-mortem

Monday, May 7th, 2012

So in October of 2011, Ludum Dare hosted a second October Challenge. I had so much fun the last year, despite canceling my game, that I decided to give it another go. I was really attached to the idea I had been approaching with Stealth Target, so I wanted to give it another try. Since I realized the aesthetics and UI were the biggest problems, I decided to take the game back to "Glorious 2-D" and use the aesthetic from my earlier Ludum Dare game Anathema Mines as the starting point for this game.

Here are cut-down versions of the blog posts I made during the October Challenge 2011. Additional commentary is included below the quotes.

Oct 13th – October Challenge, take 2

I’m doing brute-force ray casting here and it works great. It’s really nice to be targeting the desktop using C, so I can do stuff like that. (The older LD version was in python so I had to code it smart, and if I were targeting mobile I’d have to be more optimized.) Anyway, my goal is to have this game selling on the Mac App Store before the end of the month for a few bucks.

TECH: I’ve done a fair bit of optimization here, but really, the main gist is that I raycast from the center of the light until I hit something. I have a few optimizations and whatnot that help make this faster, but nothing super clever. A win for the component object system was that I’m able to change the size of the shadows each object has, which helps for the fine tuning of the look. If you look carefully you can see the size of the player’s shadow get larger when he dies and falls down.
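
The brute-force loop is about as simple as it sounds; something along these lines, where tile_solid() and the constants are placeholders for this sketch, not the real code:

// March each ray outward from the light until it hits a solid tile.
for (float ang = 0.0f; ang < 2.0f * M_PI; ang += RAY_STEP_ANGLE) {
    float dx = cosf(ang), dy = sinf(ang);
    float x = light_x, y = light_y;
    for (int step = 0; step < MAX_STEPS; step++) {
        x += dx * STEP_SIZE;
        y += dy * STEP_SIZE;
        if (tile_solid((int)(x / TILE_SIZE), (int)(y / TILE_SIZE))) break;
    }
    // (x, y) is where this ray of light stops -- use it to build the lit area
}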

BIZ: I changed my mind about the Mac App Store before the end of the month. I soon realized that this game was coming out really good and that it was going to be worth taking the extra time to really polish it up before releasing it for sale.

Oct 14th – More lighting stuff

I re-did my lighting systems in the game so now I can have various colored lights and I can add ambient light to corners of the caves.

TECH: Each tile on the map is given an RGBA "lighting" component. Each frame I color the tiles where light falls on the map, and then I blur that coloring. Then I draw the flooring and tiles using the lighting values, with a different averaged color at each corner so that the shading is nice and smooth. When the player walks you can see the lighting jump ahead by whole tiles; it's a technical shortcoming, but it "feels okay" because it reads like the light flickering a little.
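
The blur pass boils down to something like this. It's a one-channel sketch with made-up names and map sizes; splatting each light's color into the lit array happens before this.

#define MAP_W 64
#define MAP_H 64

static float lit[MAP_H][MAP_W];      // per-tile light level (one channel shown;
static float blurred[MAP_H][MAP_W];  // the real thing stores a color per tile)

static void blur_lighting(void) {
    for (int y = 1; y < MAP_H - 1; y++) {
        for (int x = 1; x < MAP_W - 1; x++) {
            blurred[y][x] = (lit[y][x] + lit[y-1][x] + lit[y+1][x] +
                             lit[y][x-1] + lit[y][x+1]) * 0.2f;
        }
    }
    // when drawing, each tile corner gets the average of the 4 tiles sharing
    // that corner, fed in as per-vertex colors for smooth shading
}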

Oct 15th – Technology .. explosions!!

Some new goodies today. Well, the explosions I've had for a while, but I just added in the technology that you have to destroy in order to defeat the evil overlords or whatever. The technology is RED, that's how you know it's EVIL technology.

DESIGN: If you remember back to Dynamite, the core game mechanic was exploding the load-bearing pillars in the game so that the building would collapse. I decided that collapsing the cave like that didn't make much sense, and that glowing alien technology would just look way cooler. I had to come up with a way for blowing up the tech to have a purpose, so requiring the player to explode all the tech of a single color to unlock some doors seemed like a straightforward design choice.

DESIGN: You can see the black “pit” below the explosion. In the prototype of the game, the explosions actually created holes in the floor that were impassable. I decided I wanted my game to never back the player into a corner, so I now have the explosions only break down walls and give the player more area to move in, instead of less.

Oct 18th – Level editor thing

So, here’s my level editor thing. Right now I’m trying to figure out how to set up the level entrances / exits / pathways throughout the level. Sort of some kind of cryptic code system. I’m not sure how complicated I want it to be. Depends on if I will have the level editing open to the general public or not.

DESIGN: I was thinking about some really bad ideas at that point …

That said, I think I want it to be editable by normal people. So I think I’ll probably pass on using those weird codes. But at least now I have those cool hex icons for no reason.

DESIGN: I quickly came to the conclusion that if the editor was going to be too hard for a “normal person” to use, I would also eventually get sick of it. So I made sure to only include things in the editor that I felt everyone could use, not just myself. This really helped me when creating the levels for the game. Since I’m not hugely into creating levels, having a super easy to use editor was what made it possible for me to create the 28 levels for the game.

Oct 22nd – Anathema Mines – now with animated characters

UPDATE: Using my cool-sauce edge generation script, with just a few minutes of graphics work I can get a totally different look for my game. This is going to be super helpful in giving my low-budget game the appearance that it has art in it (maybe).

TECH: This is the one place that I really used some fun Python code. I created these interesting mini drawings of the walls in GIMP; one of the ones I use in the final game looks like this:

TECH: I then use a Python script that takes sub-sections of that image and faces them in all different directions to generate the 200+ possible wall tiles for that style of wall. It took a fair bit of messing around to get this to work perfectly, and in fact the "red technology" has two separate layers to give it the look it has. I also save alpha data for each of these 200 sub-tiles, which I use for the light ray-casting collision detection. I use the same data for plain collision detection too.

Oct 27th – More shadows, levels, and editor tweaks

Not entirely sure if I’ll make the Oct.31 deadline, but I’m making quite a bit of progress. I’ll keep plodding along and see where I’m at in a few days!

BIZ: I missed the deadline, but I came pretty close … My new objective was to send Valve a pitch video of the gameplay footage to see if they would want the game.

Nov 1st – Anathema Mines: gameplay video footage

Here’s my gameplay demo video. I’m attempting to “monetize” the game as of Oct 31st, so I’ll report back on how well that goes.

BIZ: I didn’t report back, but I will now. I sent the video to Valve along with some of what I was planning. They were interested! Had they said no, I would not have spent more time working on the game. This was my way of attempting to “fail early” on this project by seeing if the game looked good enough to have mass market appeal.

DESIGN: You can see how the guards reacted to seeing your flashlight in the distance in this video. I changed this later on in development as it made the game too hard. Also the other “scientist” characters had that ability, so I decided it would give the game more variety if they behaved differently. You can also see how the guards turn around counter-clockwise in this video. This was somewhat random at one point, but now they always turn clockwise when going between two points. This makes tracking their paths much easier when playing.

It's been a great month working on this. The game is coming along super well; I imagine it'll actually be released publicly in about a month now.

BIZ: I obviously have some rather poor time estimation skills. It is now six months later and the game is finally coming out this week! The amount of work and polish that went into this game was way beyond what I imagined, but it's been totally worth it! I'm super pleased with how this game came together.

The game is coming out on Thursday, May 10th! Be sure to check it out then 🙂

-Phil

P.S. The prototype was named “Anathema Mines”. I almost named the final game “Escape from Anathema Mines” but enough people couldn’t pronounce or remember the name that I decided to change it. A TON of ideas were thrown around, but eventually Dynamite Jack stuck 🙂

Help me runtime debug this code

Thursday, April 26th, 2012

Hey there… This happens to me sometimes. I want to be able to turn on a debug mode or a tool that will crash my code at runtime. I can use Xcode, MSVC 2008, or GCC under Linux. Anything. Though Xcode is preferred. This must work with C++, not just C code.

// I don't want to change my code.  I don't want to use Array templates.
// I just want a debugging tool that will crash my code when I do 
// array out-of-bounds, even if they are "safe."  Here's my example.

struct Goods {
    int x[10];
    int y[10];
    int z[10];
};

int main(int argc, char *argv[]) {
    Goods data;
    
    int k = 11;
    for (int i=0; i<k; i++) { data.y[i] = 1; } // SHOULD ERROR ON i == 10

    int j = -1;
    for (int i=j; i<10; i++) { data.y[i] = 1; } // SHOULD ERROR ON i == -1
 
    return 0;
}

Changing my code, using STL, using “new”, using “malloc” are not options.

Help me!
-Phil

Dynamite Jack: Component Object model

Friday, April 20th, 2012

I’m gonna dig into some of the tech behind Dynamite Jack, so hold onto your seats.

So, a lot of devs are into Component Object game design. That article is by a friend of mine and gives a good overview of the model. Anyway, you could probably spend all day reading the resources he links to. The short version (as I see it): ideal Component Object game design is like a Normalized Database. Basically, it's kind of a chore to get working, but it's super flexible and powerful.

I thought that was interesting. So I went to a talk at 360iDev by Gareth Jenkins. He demonstrated how to do component architecture with real concrete examples, which really helped me get my head around the idea. But the most profound bit of the talk was the end, where I remember him saying something like:

“It doesn’t matter HOW you do this, as long as you DO IT.”

That was super freeing for me. The idea that it didn't matter how hacky my implementation was, it would still give me all the benefits, was very helpful. So here's how I do the component object model in Dynamite Jack:

struct Entity {
  bool active;
  bool has_player;
  bool has_guard;
  bool has_light;
  bool has_position;

  int player_state;
  int player_cartridges;

  int guard_state;

  float light_angle;
  float light_radius;
  float light_degrees;

  float position_x;
  float position_y;
};

#define MAX_ENTITIES 64
Entity entities[MAX_ENTITIES];

Now, you'll see the cool bit is that I have access to all the data without creating anything, so no free / delete needed. The simple bit is that activating a feature is just a matter of setting "has_light" on or off. That's how I actually turn the player's flashlight on and off in the game.

In the game code, I just loop through all the items and dispatch to various functions for each component type.

void sys_loop() {
  LOOP_ENTITIES() {
    if (e.has_player) { player_loop(e); }
    if (e.has_guard) { guard_loop(e); }
    if (e.has_light) { light_loop(e); }
    if (e.has_position) { position_loop(e); }
  }
}
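
LOOP_ENTITIES() isn't shown in this post; expanded by hand, my guess is that sys_loop is equivalent to something like this (walk the fixed array and skip inactive slots):

void sys_loop() {
  for (int i = 0; i < MAX_ENTITIES; i++) {
    Entity &e = entities[i];
    if (!e.active) continue;
    if (e.has_player)   { player_loop(e); }
    if (e.has_guard)    { guard_loop(e); }
    if (e.has_light)    { light_loop(e); }
    if (e.has_position) { position_loop(e); }
  }
}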

And I have a similar function for handling events, and painting the screen. It made it really great for prototyping the game and changing features. The “cave trolls” in the game (see at the start of the trailer) had their AI replaced quite a few times, and being able to just hack new things on and change things around without breaking my other entities was really nice!

One of the other nice things about having it explicitly in code, as opposed to “registering” things magically, is that I can see exactly in my code what order the components are being run in, so if something depends on something else, I can be sure they are in the right order.

Dynamite Jack has 20 different "components", but they all live in the same big structure. Another cool one is the "action" component, which doesn't even exist visibly in the game; it's just a timer to trigger some future event. With everything being an Entity, it just gets its own loop called and is able to do whatever.

So, yeah, my method gives me only a single Class, as many components as I like, and the flexibility to mix and match stuff. The power to toggle components on and off. And a fixed amount of memory used. So easy “save to file” style serialization of the game state.
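
Since the whole game state is just that one fixed-size array of plain structs, the save / load really can be a single read or write. A sketch only; a real save file would also want a version header and some error checking.

void save_state(const char *fname) {
  FILE *f = fopen(fname, "wb");
  fwrite(entities, sizeof(Entity), MAX_ENTITIES, f);
  fclose(f);
}

void load_state(const char *fname) {
  FILE *f = fopen(fname, "rb");
  fread(entities, sizeof(Entity), MAX_ENTITIES, f);
  fclose(f);
}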

Anyway, that's the tech for the day. I definitely recommend component systems; they make it SO EASY to try new things with your game and tweak them until they work "just right".

-Phil

P.S. If you’re looking for the code on how to do this, it’s all in this post. It really can be THAT SIMPLE 🙂

How to create and play IVF / VP8 / WebM / libvpx video in OpenGL

Thursday, February 2nd, 2012

Disclaimer: this tutorial covers how to render IVF / VP8 / libvpx video in an OpenGL libSDL / SDL window. IVF only includes video, not audio. For game developers, it's trivial to play the audio through your own audio system, so you'll have two files per movie, "movie.ivf" and "movie.ogg" or whatever. As an exercise for the reader, you could easily jam both into a single file if you really wanted to.

The Problem We’re Trying To Solve

So you’re an indie game developer and you want to show a clip of video in your commercial cross-platform (PC/Mac/Linux/other?) game! Obviously you want a patent-free open source unrestricted license to do it.

Wait, can’t I go commercial?

Better than that, you could just use the built-in codecs on a platform! I’d suggest this if you are targeting a single platform, iPhone / iOS for example.

Otherwise, you’ll be using Bink, a commercial solution at $8500 / platform. I emailed about their “indie licenses” and never heard back.

The Open Source Options I didn’t like much

Here’s what we have for patent free open source codecs .. and their various problems.

Xiph Theora – Probably the best known codec. To get it working you have to have libogg, libvorbis, and libtheora all built for your target platforms. To me, that seemed like a lot to ask. Also, the libtheora API is a MONSTER. playtheora is an SDL example (similar to this one) that covers some of that ugliness, so I'd recommend checking that out if you want to use Theora.

Dirac / Schroedinger – the BBC funded codec. I couldn’t get this one to build. It doesn’t seem to be all that popular.

Motion JPEG – This isn't so much a codec as an idea: make your own movie file with a ton of JPEGs in it. I tried this; the files get really huge really fast. I wouldn't recommend it.

Motion JPEG 2000 – This implementation was also pretty confusing. I couldn’t find where to start. And, yeah, this isn’t all that popular either.

libvpx .. why I chose it

WebM / libvpx – Backed by Google, this is a new contender on the block. The thing that sold me was the sample encoder, which was pretty simple. It also depends on nothing. Also, building it on OS X and Linux was trivial. Also, they offer a pre-built Windows binary. Also, they just had their 1.0.0 release a few days ago.

So, yeah, having a supported, up and coming, easy to build codec was key to me.

How to encode for IVF / libvpx

Since it's a new codec, not much supports it right now. I used a fresh build of ffmpeg under Linux, built with this configure command:

./configure --enable-encoders --enable-libvpx

Then I was able to use ffmpeg to encode ivf files pretty easily:

ffmpeg -i Untitled.mov -vcodec libvpx -b 1000k -s 1024x512 movie.ivf

Note: we’re not dealing with WebM files. WebM files are container files that also contain audio. Again, you’ll have to store your audio separately, or create your own container file, or figure out what WebM is on your own time.

So .. what’s the bottom line? Do we get any code?

Yes! I created a libSDL player that plays back the video at the max speed possible; it converts the YUV data to RGB and loads it as a texture. Here are the functions I provide:

void playvpx_init(Vpxdata *data, const char *_fname) ;

Just init your Vpxdata with a filename like "movie.ivf" .. it'll try to get libvpx up and running for you.

bool playvpx_loop(Vpxdata *data) ;

Call this once per frame to have it decode a frame of video. It will return false once it has run out of frames. If you want to mess with the libvpx YUV data yourself, it’s data->img. See the playvpx.cpp source or the libvpx example above to see what that structure provides. It’s pretty simple.

int playvpx_get_texture(Vpxdata *data) ;

Call this once per frame to have it convert the YUV data to RGB and upload the texture to OpenGL. It will return 0 on failure or an OpenGL texture ID on success. The conversion happens on the CPU, so it's not super fast, but it should work fine on modern computers. If anyone cares to provide a shader version of this function, or a SIMD / MMX / SSE version .. well, that would be faster!

void playvpx_deinit(Vpxdata *data) ;

Call this function when you’re done to cleanup.
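
Putting the four calls together, a playback loop looks roughly like this. draw_fullscreen_quad() stands in for however you draw a textured quad in your own renderer.

Vpxdata data;
playvpx_init(&data, "movie.ivf");

bool playing = true;
while (playing) {
    playing = playvpx_loop(&data);         // decode the next frame
    int tex = playvpx_get_texture(&data);  // YUV -> RGB, upload to OpenGL
    if (tex) {
        glBindTexture(GL_TEXTURE_2D, tex);
        draw_fullscreen_quad();
    }
    // swap buffers, pump events, keep your separate audio stream going, etc.
}

playvpx_deinit(&data);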

Conclusion and Source Code

Okay, here’s playvpx for you to check out. It’s a C-style API, but I’m sure I use some minor C++ in there. Probably wouldn’t be hard to make it C-only if you require C for your project.

Oh, and I include the libvpx binary for Windows, OS X, and Linux. So you may not have to build it for any platforms!

The code is licensed under the libvpx BSD-style license. My code here is a gutted version of their sample_decoder.c, so .. that seems to make most sense to me.

Is this a safe pattern for auto-registration?

Wednesday, October 26th, 2011

Hey,

I’m trying to have various components in my game pre-register themselves upon startup of the game. I’m using this pattern:

Here’s my registration code that is in one .cpp file. The register_init() function is available from anywhere:

std::vector<Init*> &get_init() {
    static std::vector<Init*> goods;
    return goods;
}
bool register_init(Init *v) {
    get_init().push_back(v);
    return true;
}

Here's how I auto-register some init code from some other file, by calling register_init to initialize a variable:

int _is_levedit_init = register_init(new LeveditInit());
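
For context, the surrounding pieces look roughly like this (simplified; the actual Init interface isn't important to the question):

struct Init {
    virtual ~Init() {}
    virtual void run() = 0;  // the actual setup work
};

struct LeveditInit : public Init {
    void run() { /* set up the level editor component */ }
};

// and somewhere early in main(), after the static registrations have run:
void run_all_inits() {
    for (size_t i = 0; i < get_init().size(); i++) {
        get_init()[i]->run();
    }
}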

Is there some case where this would NOT work or is somehow a really bad idea? Or is there some pattern that is better to use?

-Phil

Invasion of the Blobs & dev toolkit in the works

Friday, May 20th, 2011

Hey,

A few weeks ago I entered the Ludum Dare game development contest and whipped together a fun game about fending off invading blobs with a spray can.

I spent another week getting it polished up so it works on a ton of platforms. The game is “The Invasion of the Blobs” (iBLOBS for short). You can get it here. It’s available on iPhone/iPad, Android, PC, Linux, Mac, and pretty soon the Mac App Store.

The reason for the porting frenzy with this game is that I'm working towards releasing an open source C++ toolkit that supports all these platforms (and maybe a few more). This is my first game released with this kit. It uses code from all my recent games, but it finally puts that code into a clean, organized, re-usable structure. This is going to be super helpful for reducing bugs and improving game code across each platform.

Anyway, I hope you enjoy iBLOBS. It's totally free, so you might as well give it a whirl. If you want to help out, please post a message here if there are any crashes or support issues on any platform; I want to get those ironed out as best I can 🙂

Have fun!
-Phil

UPDATE: The Android port has been giving quite a few people trouble. If you are a dev with the Android dev kit, please do an “adb logcat” and post the results here, that would be a huge help!

UPDATE2: The Android build is so broken I took it down.