I totally have an idea for a post.

In fact, I totally have two ideas for posts. They're good ideas. They will make good posts.

Unfortunately, I just spent the first chunk of the month preparing for the Digital Game Museum booth at PAX, and now – right before the month ends – I'm heading out to Burning Man. So. We're not getting a post this month. It happens.

But we will be back, and we will have more to talk about. Just . . . maybe when I'm a little less covered in playa dust.

Nieuwe Aarde 0.1.1

2011, March 7th 1:22 PM

Windows (.zip version available)
Mac OSX
Linux (32-bit only)

"Wait, Nieuwe Aarde? What's going on? Didn't we already see this game?"

Well, yeah. You did. And you'll be seeing more of it, too!

I've decided to turn Nieuwe Aarde, along with one other game yet to be announced, into a longer-term project. I've got ideas on how to improve it considerably. This isn't that improved version yet – it's essentially a re-release of the version you've played before – but it does come with a few smaller improvements.

First off, and most noticeably, it has music! I've been collaborating with Robert Seaton on music for a few of my games (and I'll be posting those with music as well, though they won't be getting a long-term treatment). We did a really neat thing with the music in this game. I'm not going to spoil it, but you should go play it to find out. Seriously, it's pretty dang cool.

Second, a common complaint was that increasing your metal and magic in the lategame took far too long. I've added +1000/-1000 buttons as a small hack fix for that.

Third, the rendering engine is far more efficient – the original renderer was quite shockingly bad. Sometimes that's just what happens when you have 48 hours to write a game in. I regret nothing.

Overall, this is the same game . . . but keep an eye on this journal, because I'm going to be making some major changes to it.

Insanely, Ridiculously Busy

2010, December 21st 9:57 PM

I've come up with like four posts I want to make. Half of them are far too short, half of them are far too long.

You're getting "too short" here. Suffice to say that things are churning along, in many different directions, I'm unbelievably busy, and it's all going well.

We'll resume our regular programming eventually. For now, bear with me.

GDC 2010: Aftermath and New Beginnings

2010, March 14th 10:05 PM

I spent most of last week at the Game Developers Conference.

It was fantastic, because it always is – it's a solid week of jamming new game development knowledge in my head, and, y'know, there's nothing bad to be said about that. There were many good talks. Talks about game philosophy. Talks about game design. Talks about game implementation. Talks about marketing. Talks about business models. Talks about target users and monetization.

I realized, somewhere in the middle of these talks, why I was having trouble moving forward. It was because I was moving to the iPhone – not because I was excited about the iPhone, but because I was trying to sell a game. A game which – let's be honest – I wasn't really excited about either. I wasn't working on what I loved. I wasn't working on what I'd gone into this crazy industry for in the first place.

I was trying to change from an artist to a producer. And I'm not a producer. My business cards say "Director", but I'm not sure even that is accurate. I'm an artist, and games are my canvas.

When I talk about the people I respect most in the industry, I don't talk about the people making 99-cent iPhone games with three million downloads. I don't talk about thirty-million-player Facebook games, or the latest Madden game. I talk about Cactus. I talk about Jonathan Blow. I talk about Derek Yu. I talk about Jenova Chen.

I talk about the people who make the game they want to make. And, sure, they pay attention to marketing, to business, to target users. But in the end, I think these people all make games they're proud of, and they all make games that are meaningful beyond the next five minutes of our collective attention span. And that's what I want to do.

I'm still going to be doing my monthly experimental games, at least for the immediate future (and, hell, I've only got three months until I've been doing this for a year – it'd be a shame to stop now). But I think it's time to buckle down and make something that I can be proud of, and I think it's time to start making waves and trying to wrench myself into the public eye instead of running dark.

If I'm gonna be a rock star, it's time to start acting like one.

2010 is a good date for that.

SNAFU

2009, July 30th 11:34 AM

Bit of a general fuckup here, folks – it turns out that the Mandible blog has been rejecting comments for probably a few days. If you posted a comment about Fluffytown, I'm sorry to say it got entirely lost and I can't retrieve it. If you had anything to say (and I'm hoping someone did) I'd appreciate it if you'd go back and re-comment, or at least summarize what you had to say.

Now I'm going to go figure out a way to automatically notify myself when things go this wrong.

0.1.0 Released

2008, April 28th 10:36 AM

Quick on the heels of that alpha test is the official 0.1.0 release. Grab it (nowhere, sorry, removed!).

What's different? Not a lot. There's an installer, and that's it. I haven't gotten any bug reports, so I'm calling this a release.

But there's a new site feature – I've added a site forum for discussions. Take a look, start a thread, talk about D-Net. It's still under very heavy construction, so beware that the layout and look might change drastically as you browse. But it's officially open. As always, I'd love to hear any commentary you have about D-Net or the site, and if you can't find an appropriate thread to post it in, head for the forum.

This is probably not the most exciting post you've ever read, but there'll be more coming. Not to worry.

Triage and Endings

2008, February 24th 2:15 AM

For quite some time I've been trying to figure out my plans for my current project.

A bit of backstory. D-Net began about three and a half years ago. I was visiting a friend's house, playing video games, and he broke out an old gem named Destruction Zone that I'd played before. Destruction Zone – and this may sound familiar – was a top-down tank game, multiplayer, with tanks blowing each other up in an arena. We played quite a long game, occasionally cursing the interface, and eventually I said "Hey! This game is pretty cool! Someone should make a modern version of this, with support for gamepads at the very least."

And, well, here we are. I spent two and a half years coding it part-time, while working at Google and getting distracted by other projects, and now I've spent about a year on it full-time. D-Net has taken several truly unexpected turns that distinguish it from Destruction Zone (enough that I'm no longer worried about copyright infringement – the most similar thing at this point is the name) and has grown into a distinct and truly enjoyable game on its own.

The tough part turned out to be the very interface I grumbled about the first time. Supporting multiple players on a single system is hard, and you've seen me rant about this at great length. The end result is a game which, to be honest, isn't immediately impressive, since much of the effort has gone towards things that the user simply isn't supposed to notice. (Like the fact that it's not a horrible pain to play.)

But more importantly, as much as I want this game to be complete and done, there's much of it that doesn't interest me. There are people who love multiplayer games, who buy Supreme Commander and Half-Life 2 and World of Warcraft and then never touch the single-player segments. In many ways, that's what D-Net is targeted towards – and, sad to say, I'm not one of them. I've enjoyed my work on it quite a bit, and it's shaping up very well, but I am fundamentally a single-player gamer. I love story, I love plot, I love drama and emotions and the fact is that D-Net doesn't have any of those.

So what I'm doing right now? In some ways, not tenable long-term.

There are solutions. I could sit down and rework D-Net into a singleplayer game. D-Net would make a very interesting squad-based combat game, and I've been quite, quite intrigued by all the things I could do within the D-Net framework. But this is problematic. First, it would require a huge rebalancing, a huge re-tuning, and a lot of changes to the engine. D-Net is not designed for sprawling landscapes, it is not designed for running firefights, it is not designed for static guards or NPCs or indeed any of the things I would need. Second, it would mean I would no longer have a multiplayer game. D-Net is a fun game and it's played frequently at game parties with my friends. I don't want to lose that, because I do truly enjoy the thing I've built. And third, I wouldn't be building a game and a story together. I'd be desperately retrofitting, and either the story would suffer or I would effectively have to write a full new engine.

Alternatively, I could splice a singleplayer game into the existing D-Net. This eliminates one of the above problems but keeps two, and adds an even nastier third problem: when you try to make a singleplayer game and a multiplayer game at the same time, at least one of them ends up suffering. There do exist games which, without modding, have been excellent singleplayer and multiplayer games – Call of Duty 4 is the best recent example that I know of. But they have to be designed for this from the ground up, and obviously, that has not happened.

I think it's important to explain why I kept writing D-Net.

Originally I was writing it because I thought it would be a neat and fun project. Later, though, I started thinking about what I would want to write for commercial release, and I came up with a plotline and a game that I felt had a huge amount of potential. I'm currently calling that game MV. I ended up designing it for the Nintendo DS, and I have quite a few files full of design and plot that I've built up while thinking about it.

In order to develop seriously for any game console, you have to buy a development kit. In order to do that, you have to convince the publisher that you deserve one, and that you won't just walk away and sell it to the highest bidder. Development kits are rather closely guarded, and as I understand it, you never actually own one – you're merely "renting" it. So, in order to get a Nintendo DS devkit, I would need to convince Nintendo or some other publisher that I was a competent game developer.

Thus, D-Net.

From what I've gathered since then, Nintendo DS devkits are much easier to get ahold of than I'd originally thought. On top of that, it sounds like having a mostly-working game is even more of a draw than I thought – so even if I can't get a DS devkit, I can do most of the work on the PC (keeping the DS's hardware specs in mind) and almost certainly get one with that. To put it simply, D-Net is no longer needed.

And thus, I think it may be time to retire D-Net . . . for certain definitions of retire.

D-Net isn't finished, but it's close, in many ways. I have plans for networked play, for singleplayer mode, for good AI. I have plans for many game modes, for a neat ending cinematic, and for many things that would be fantastic to finish. If I scrap all of those, or most of those, I really have very little left to do.

My todo list, at this point, contains three major tasks.

  • Make a public demo release
  • Finish all the weapons
  • Improve the graphics

And so that is what I am going to do. I'll do those things (and then make another demo release), and then I'll shop it around, and try to sell it, because I do truly want this thing to be sold and to show up on XBLA or PSN or Steam or Gametap. But if nobody is interested, then I suspect I'll end up GPLing it, because it's more important to me that D-Net gets played than that it makes money. And once I'm done with that, I'll get started on MV.

D-Net has been a large part of my life for quite a while, and I've enjoyed working on it, and I've learned a lot from it. But I think it's time to move on . . . after giving it those last few touches.

The truth behind game development

2007, December 20th 11:32 AM

I've had a few people ask me why this journal spends so much time talking about the minutiae of business (like server setup and mailing lists) and so little about, you know, actually writing games.

This journal is fundamentally about starting a game studio. Sometimes that's game design. Sometimes that's coding. Lately, it's been about setting up a server and getting some people reading this journal (yes, this one that you're reading right here) so that people actually play the game when it comes out.

So that's what I've been doing lately. The server's working and (finally) requires no tweaking. The mailing list is up and works as well. And now, I even have a company logo. It's a lot of small things, but it's small things that have to get done, and nobody is going to do them besides me. The next step, from a non-coding perspective, is PR – and I'm going to have a lot to say about that as well, because PR is a horrible pit of morality issues.

The reason I haven't been posting about what I've been coding is rather complicated and entirely uninteresting. I've been on a vacation for the last week, and now we're moving into the holidays, and so I haven't been getting as much work done as usual. On top of that, I've been coding things which are about as dull as it gets – interfaces. And not even interesting interfaces. For example, now you can hit Escape and get a popup menu allowing you to quit the game, return to the main menu, or change the resolution. Necessary? Absolutely. Interesting? Not in the least.

Unfortunately I've got a lot more uninteresting before I can get back to the interesting. What I'm trying to do is release a stripped-down demo version of the game for all to enjoy (and, for those of you who don't run Windows, I'm hoping to do a Windows/OSX/Linux cross-release. We'll see how well this works.) The game right now suffers from a few major flaws, however, such as essentially requiring someone who's played before to navigate the menus. This is pretty bad. It works great when your goal is "make sure the gameplay is fun and balance things", but it needs to be fixed, and it needs to be fixed now.

I'd love for game development to consist entirely of designing giant guns and aliens and making explosions look awesome. Unfortunately, that's just not the case, and I've got a long, long slog before I can get back to that.

This is what I woke up to four days ago.

By now, anyone reading this entry – and thousands of people who aren't reading this entry – have probably seen my Double-Boiling Your Hard Drive article. I wrote that one up because I thought it was fascinating, and submitted it to Slashdot, Digg, and Reddit just for fun. I got a small pile of hits off those submissions, but not many, and I went to bed assuming those posts would simply fade into oblivion.

While I was sleeping, a friend of mine made a Reddit entry that, to say the least, did quite a bit better. I woke up and my site was so hammered that even I couldn't access it.

Since I spent a lot of time trying to figure out what the issue was and (once again) ended up finding a solution myself, I'll explain the problem and eventual solution here, along with a lot of technical garbage that most people probably don't care about. I am, however, going to try to make it understandable to non-geeks – so if you want a bit of a view into how webservers and performance work, read along.

The entire problem stems from the basic method that you use to do communications across the Internet. Here's the simplest way you can write an Internet server of any kind:

Repeat forever:
  Wait for a connection
  Send and receive data
  Close the connection

This seems reasonable at first glance, but there's one huge issue – you can only process one connection at a time. If Biff connects from a modem, and it takes me 20 seconds to send a webpage to him, that means there's 20 seconds during which nobody else can communicate with the server. Can't download, can't even connect. Worse than that, even the fastest browsers are going to take a few seconds to load a large complex webpage like this one, and that would kill performance completely.
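That loop is short enough to sketch for real. Here's a minimal version in Python (a real server would be C, and names like `make_server` are mine for illustration, not anything from Apache):

```python
import socket

# The naive design from above: one connection at a time, start to finish.

def make_server(port):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", port))
    server.listen(5)                # connections queue up here while we're busy
    return server

def handle_one_connection(server):
    conn, _addr = server.accept()   # wait for a connection
    conn.recv(4096)                 # receive data (request contents ignored)
    conn.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello\n")   # send data
    conn.close()                    # close the connection

# Repeat forever -- and while one slow client is inside
# handle_one_connection(), everybody else just waits:
#   while True:
#       handle_one_connection(server)
```

While Biff's modem is dribbling bytes through that `sendall`, the next `accept` never runs – which is exactly the 20-second stall described above.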

There are many many solutions to this. The most common web server, Apache, seems to have two main modes you can run it in – prefork and worker – which implement two of those solutions.

First off, prefork. In prefork mode, Apache forks the webserver process into a larger number of processes, which means that there are actually, say, ten copies of Apache running simultaneously. This isn't nearly as bad as it sounds. Any modern operating system is going to realize that most of the data Apache loads, like the program itself and likely all of its configuration data, doesn't change – and if it doesn't change, there's no reason to make one copy of it for each Apache. It's shared among all the processes. If it's shared, you only need a single copy. Low memory usage and everyone is happy.
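The prefork shape can be sketched in a few lines of Python on a POSIX system – this is my own illustration, not Apache's actual code. Every forked child inherits the listening socket and blocks in `accept()`, and the OS hands each incoming connection to exactly one idle child:

```python
import os
import socket

def make_listener(port):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", port))
    server.listen(50)
    return server

def prefork(server, num_children):
    # Fork the "webserver" into num_children copies. Each child inherits the
    # listening socket and serves forever; the parent gets the pids back so
    # it can supervise (or kill) its children.
    pids = []
    for _ in range(num_children):
        pid = os.fork()
        if pid == 0:                      # child: serve forever, never return
            while True:
                conn, _ = server.accept()
                conn.recv(4096)
                conn.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello\n")
                conn.close()
        pids.append(pid)                  # parent: remember the child
    return pids
```

The program text and read-only data really are shared between those children (the OS uses copy-on-write); it's the per-process working memory – the PHP interpreter's scratch space, in Apache's case – that isn't.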

Unfortunately, Apache isn't the only thing that goes into a standard webserver now. The journal software that I'm using on this site – WordPress – is written in a scripting language called PHP. Scripting languages need an interpreter to work, and so Apache runs the PHP interpreter – one copy per process. The interpreter code itself gets shared in the same way Apache does, but all the temporary structures it builds and all of its working space isn't shared.

It actually gets worse from here. As part of their normal function, most programs allocate and deallocate memory. If they need to load in a big file, they allocate a lot of data, then deallocate it when closing the file. When they allocate, they first check to see if there's any "spare space" available that the program has already received but isn't currently using. If not, they request more space from the OS. However, most programs will not return space to the OS at any point. They'll just return that unused space to its local pool, the aforementioned "spare space". This means the OS can't know exactly what the program is or isn't using at any one time. The process eventually stabilizes at whatever its worst-case is – if you have one horrible page that takes eight megs of RAM just on its own, and you have your program load pages randomly, your program is going to reach eight megs and then sit on that eight megs forever – even after it's done dealing with that nasty page.

As a result of this, if you're running ten Apache processes, you will eventually be using ten times the maximum amount of RAM that Apache+PHP could use on any one page on your site. That's painful.

In my case, this site is running on a virtual server with 256 megs of RAM. My average Apache process was eating about 12 megs, and MySQL was consuming another 50 megs, and the OS was taking another chunk. I couldn't get more than ten processes running without absolutely killing my server. (When MySQL crashes due to running out of RAM I really don't care that I can serve error pages 50% faster.)

And this is why, despite the fact that the CPU load was negligible, the site was still completely inaccessible. I had plenty of CPU to generate and send more pages with. I was swimming in spare CPU. But no matter how much CPU I had, I couldn't possibly service more than ten users simultaneously.

Now, back to those Apache modes! What I've just described up there – with one process per connection – is "prefork" mode. There's another newer mode called "worker" mode.

In worker mode, Apache spawns one thread per connection. You can think of threads as sub-processes – they run inside the same process and all have access to the same memory and data.

Remember all that stuff I wrote about programs returning memory to the OS? They don't return it to the OS – but they'll gladly return it to the process's own pool. Every copy of PHP that gets run can reclaim the exact same memory and re-use it, even while its siblings are sending and receiving data.

By default, worker mode spawns 25 threads per process, with multiple processes if it needs more connections than that. Under heavy load, each thread spends most of its time sending or receiving data (Biff's crummy modem again). In reality only one or two threads will actually be running PHP at a time – so the memory usage for this single process is, at most, twice that of the prefork processes. But we can now handle 25 times as many connections.
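The thread-per-connection shape is easy to demonstrate with Python's standard library – again a sketch of the idea, not Apache's code; `socketserver` does the thread-spawning for us:

```python
import socketserver

# "Worker"-style serving: one thread per connection, all threads sharing the
# same process and the same memory.

class Handler(socketserver.BaseRequestHandler):
    def handle(self):
        self.request.recv(4096)             # read the request
        self.request.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello\n")

def make_threaded_server(port):
    # ThreadingTCPServer spawns a fresh thread for every accepted connection,
    # so one slow client no longer stalls everyone else.
    return socketserver.ThreadingTCPServer(("127.0.0.1", port), Handler)
```

Call `serve_forever()` on the result and five slow clients get five threads, not a five-deep queue.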

I finally got this mode up and running, and suddenly my site was not just usable, but 100% responsive. No slowdown whatsoever. However, you'll notice I haven't given any kind of detailed instructions on making this work, and there's a good reason for it. This is a terrible long-term solution and I was crossing my fingers the entire time, hoping it wouldn't melt down.

Here's the issue with threading. Imagine you have a blind cook in a kitchen. (I'm avoiding the classic car analogy.) He can cook easily, because he knows where everything is, and he knows where he may have moved things – he can take a pot down, put it on the stove, chop an onion, toss it in the pot, and the pot is still there. He's blind, but it doesn't matter, because nobody is mucking about with his kitchen besides him. No problems.

Now imagine that we have a huge industrial kitchen, with fifty blind cooks, all sharing the same stovetops and equipment. The cooks would get pots mixed up, interfere in each other's recipes, and there would probably be a lot of fingers lost. Threading, unless you're careful, can be equally catastrophic – all the threads work in the same memory space and they can easily stomp all over each other's data.

PHP, in theory, is threadsafe. Some of the libraries that PHP calls are threadsafe. Not all of them. It worked, for a day – but I wouldn't want to rely on it long-term.

There is a solution to this. It's just a horrible bitch.

There's a module called FastCGI that you can use with Apache in worker mode. FastCGI is threadsafe. FastCGI can be set up to call a specially-built version of PHP, and do so in multiple processes so PHP doesn't even have to be threadsafe. To make things even better, FastCGI keeps a small "pool" of PHPs – perhaps three or five – but nowhere near one per connection. This does mean that you can only have five PHP sessions running at once, but remember that PHP processing is fast on our server! Apache is smart enough to read all the input, do all the PHP processing quickly in memory, and then sit there waiting for Biff's modem to acknowledge all the data. Five PHP instances can easily service a few dozen connections.
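The pooling idea itself is simple to sketch. Here's the shape in Python, using a thread pool where FastCGI really uses a small set of separate PHP processes – the names and sizes here are illustrative, not FastCGI's:

```python
from concurrent.futures import ThreadPoolExecutor

POOL_SIZE = 5          # "perhaps three or five" interpreter instances
pool = ThreadPoolExecutor(max_workers=POOL_SIZE)

def run_interpreter(request):
    # Stand-in for the quick CPU-bound part: build the whole page in memory.
    return "page for " + request

def handle_request(request):
    # The connection hands its work to the pool and gets the finished page
    # back almost immediately. Feeding the bytes to a slow client afterwards
    # ties up the connection, not one of the five interpreters.
    return pool.submit(run_interpreter, request).result()
```

Because each request only borrows an interpreter for a moment, dozens of simultaneous connections can share those five slots without ever noticing the limit.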

Unfortunately, Debian Linux (and likely others) doesn't have particularly good native support for this. All the modules do exist in one form or another (apache2-mpm-worker, libapache2-mod-fastcgi, php5-cgi) but just installing them doesn't do the trick – you need to hook them together. Luckily, the FastCGI FAQ does mention everything you'd need for this (look under "config"). It's annoying to set up, but it's not really difficult – just irritating.

FastCGI on its own doesn't solve all the problems. WordPress is, actually, a CPU-hungry beast. Five PHP instances might be able to service a few dozen connections, but not hundreds – WordPress pages involve a lot of database queries and a lot of work. But solving this issue can be accomplished nicely by installing WP Supercache – it will cache pages as they're displayed and it hugely decreases CPU usage, meaning that those same five PHP instances can now serve well over a hundred.

Before these changes, my server couldn't handle more than ten simultaneous connections. I've used website stress-testing software since – I've managed up to 400, and the server doesn't even break a sweat. I can't go higher because my connection starts dying horribly.

There's no real excuse for any modern server to have trouble with this sort of load, unless it's doing extremely heavy noncacheable processing or getting hundreds of simultaneous connections. Computers are fast, and getting faster all the time – at this point I'd love to see this site get Digged or even Slashdotted, because I'm truly curious what it could stand up to.

I'm hoping that someone with this same problem will find this page and be able to fix it quickly. It's not that hard – it's just kind of annoying.

First I'm going to show you a picture, just to get your attention.

I have a rather old computer case that I've been lugging around for years. It's a Hush Technologies Mini-ITX. I don't think they even make these systems anymore – I got mine many, many years back, and it was one of the first they produced.

The Hush Mini-ITX was a near-silent computer, before silent computers were anywhere close to as easy to build. It used a Mini-ITX board, a small quiet low-power motherboard that frequently had a small fan for cooling, but instead of the standard fan it used a heat pipe connected to the side of the case. The case itself acted as a large heatsink and radiator for the CPU. The hard drive was enclosed in a heat-conductive but noise-silencing frame. Overall, a clever design.

It's been a hardy case. I can't say the same for the components put inside it. It runs hotter than I really want – so far it's on its third hard drive, its third motherboard, and its second power supply. Last time I swapped the hard drive when it started getting a bit noisy – not "there are things banging around in the hard drive case" noisy, but "its hum is getting louder" noisy. I figured the same thing would work this time, so in my recent upgrade I included a spare hard drive for it. Standard replacement deal – turn the system off, plug the extra hard drive in, toss a SystemRescueCd in the drive – and it refused to detect either hard drive.

Eventually I figured out that my old hard drive was deeply, deeply unhappy. It wouldn't show up in BIOS (and neither would any other drive on that IDE chain) and it wouldn't even initialize – it would just sit there and click. Click. Click. Click. Click-whir. Click. Click. Click. It was spinning up just fine . . . although after enough clicks, it would spin down again. It just wasn't showing up as a hard drive to the computer.

I did a lot of research and tried the standard recovery tricks. Apparently there's a rather infamous hard drive Click of Death, but it's more of a general symptom than a specific cause, and the causes can be anything ranging from "your hard drive is somewhat old" to "your drive head is now bent at a ninety degree angle". So that didn't really help me diagnose it, much less solve it.

The tricks are, it must be said, odd, but I tried them anyway. Freezing the drive didn't help – if anything, it made the click noisier. Banging the drive gently didn't help. At this point I had kind of given up, so I tried banging it more emphatically, and that didn't help either. That's most of the standard tricks.

So I sat there, with a slowly thawing hard drive sitting on the desk in front of me, and thought.

One of the possible reasons for the Click of Death was that the heads had gotten misaligned, either vertically or horizontally, or in some combination of the two. Another possible reason was that the heads had actually gotten stuck on something. If I could jar the heads loose, or get it started, it might function fine after that. And it had been working just peachy-keen in the computer beforehand – I hadn't even realized it was defective, just old.

So if the heads are just stuck . . . and freezing the drive makes it louder . . . well, brief diversion. If you have a jar that you can't open, there's a trick to getting it open. You run the jar under hot water. The lid expands, and the neck expands, and that also means the gap between the lid and the neck expands. And that makes it easier to open. Now, if I heated up my hard drive, perhaps the same thing would happen. On top of that, the drive had been quite a bit warmer when it was working – it had been encased in that soundproof frame I mentioned before. What if I brought it to near that same temperature before trying?

Obviously I didn't want to melt the drive, or burn it, or get it wet. This is exactly what a double boiler is for, and you can approximate a double-boiler easily using two pots. Thus the picture at the beginning of this entry.

I heated the drive up until it was bordering on "hot to the touch". I figured that was around how hot it was before. I plugged it in, and . . .

. . . well, apparently I've now invented a new way of repairing hard drives. I copied over the most vital stuff, moved it to a different computer quickly (I've never been afraid of a component cooling down before, but I suppose there's a first time to everything) and successfully took a disk image of it. Worked 100% perfectly. I can't find any references to this technique online, so perhaps I really am the first one to try it.

I can't say I recommend this as a standard repair method, and obviously this is no substitute for professional repair services. But if you've tried freezing your hard drive, smacking your hard drive with a hammer, and all the other "normal" tricks . . . maybe it's time to try double-boiling it.

 

On another subject, I will admit that this has little to do with Mandible Games. I've just been kind of busy lately, in entirely uninteresting ways. First off, Mandible Games almost has a logo – I'm just asking for a few minor changes before I finalize it and put it up. Second, I've been doing a lot of work on the interface to D-Net. I want people to be able to change the game's resolution and aspect ratio, and that takes a lot of effort to make the menus work sanely. Third, I got a new computer and almost lost a lot of data – obviously that's a bit of a slowdown as well.

My todo list, however, is getting shorter and shorter. Right now there's only six items left before I actually release a public demo version. The first version is going to be Windows, since that's what I develop on natively, but D-Net builds perfectly fine on both Linux and OSX – all I need to do is figure out how Linux and OSX packaging and installing works.

The first version also isn't going to include online play or single-player play, just to warn you, but it should give a sense of what the game is like, and if you have some friends who want to blow you up in tanks (and, ideally, some USB game controllers), it'll work just fine for that.

I think that's the current State of Mandible. Double-boiling hard drives and writing uninteresting UI code. Yep. That's about the size of it.