The Glitz and Glamour of Good Workflows

This will likely be one of the only development logs written without a bunch of screenshots. I’m so sorry. It hurts me more than it hurts you.

In fact, given some of the tasks I’ve taken on this week: believe me, it hurt me a great deal.

Before pushing much further into development, I had to take a step back at the beginning of this week and get everything straightened up a bit from the somewhat haphazard approach I had been taking with source control and backups thus far. It’s fairly easy, especially with Unreal Engine 4, to get into a messy jam as far as source control goes. Given that the engine is, well, the digital equivalent of every elephant on the planet all on the same trampoline (I say that endearingly?), it’s unspeakably easy to end up with a completely unwieldy and very error-prone workflow for major integrations and changes. I know this. I was there around this time last year.

So, I took the time to rectify it! It was fun! So fun! So fun! So fun! So… (okay not even remotely fun). But, I want to go into some of the practices I put into place (and some of the tech and third-party APIs that got added along the way) so others can benefit from my pain. Much like the last time I talked about Perforce (someone actually emailed me to say that they stumbled on that post and it saved them so much work, which was beautiful).

GitHub

GitHub didn’t always have Git-LFS. It has a soft repository size limit of 1GB, and a much harder limit of 2GB. I have had to take many a repo down in the past because it easily and quickly eclipsed that size limit. Source repositories aren’t just about what you have active at any given time — they’re much more about what you have and have done for the entire life of the tracked file set. So, even if your at-the-moment overhead isn’t tremendous, that doesn’t mean that the 0.5GB asset you checked in accidentally early on isn’t still taking up space. And, as far as games go, Git has never really been meant for us. It’s much more for the efficient historical tracking of text files and the changes within them over time.

Git-LFS was introduced (I imagine) to address the greatly-increased needs of developers reliant on a large quantity of binary files. And GitHub offers fairly large storage plans at a small additional price in order to handle them.

Problem for me: I just don’t like Git-LFS. Regardless of the additional storage size, Git simply does not handle… Sorry, you wouldn’t know this, but I just took a half-hour break to figure out why Perforce was unhappy with one of my changelists and refused its submission. I said “but no” and it said “lol we don’t care.” Video games!

As I was saying: Git simply does not handle massive repositories particularly well, especially when those repositories contain a lot of binary files. Git-LFS may make storage and revisions more manageable from a server perspective, but they absolutely do not ease the user’s experience.
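For completeness, if you do go the Git-LFS route with UE4 anyway: the tracking setup itself is tiny. This is a sketch of the `.gitattributes` entries that `git lfs track "*.uasset"` (and friends) writes out — the exact extension list you’d want depends on your project:

```
# .gitattributes — route UE4's binary asset types through Git-LFS
*.uasset filter=lfs diff=lfs merge=lfs -text
*.umap   filter=lfs diff=lfs merge=lfs -text
*.fbx    filter=lfs diff=lfs merge=lfs -text
*.png    filter=lfs diff=lfs merge=lfs -text
```

None of which changes my opinion above — it makes storage manageable, not the day-to-day experience.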

Perforce

Now, Perforce handles large repositories regardless of their contents like a champ. It has a lot (and I do mean a lot) of quirks as it does so, but at the end of the day it’s really the only tool for the job on major game projects. It brings with it a command-line interface unlike any other, and I don’t say that as a positive, but the GUI (P4V) is generally well-equipped to handle most day-to-day operations.

That said: for all that Perforce provides in overall large-project management and revision histories, it really, really drops the ball on ease of automation (scripting, primarily). I tried for about five painful hours earlier this week to set up a simple on-submit script that would generate release notes for me over time. The result was a lively debate between man and machine (pardon the language).
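For what it’s worth, the shape of what I was chasing looks roughly like this — a hedged Python sketch of a `change-commit` trigger, not my actual script. The helper names, the `RELEASE_NOTES.md` path, and the trigger line in the docstring are all hypothetical, and the parsing assumes the default header line of `p4 describe -s`:

```python
import subprocess


def format_release_note(describe_output: str) -> str:
    """Collapse `p4 describe -s <change>` output into a one-line release note.

    The first line of that output looks like:
      Change 1234 by trent@workspace on 2016/11/25 14:25:00
    and the changelist description follows, tab-indented.
    """
    lines = describe_output.splitlines()
    header = lines[0].split()
    change, author, date = header[1], header[3], header[5]
    description = " ".join(l.strip() for l in lines[1:] if l.strip())
    return f"[{date}] CL {change} ({author}): {description}"


def append_release_note(changelist: str, notes_path: str = "RELEASE_NOTES.md") -> None:
    """Meant to be wired up as a Perforce trigger, e.g. in the `p4 triggers` table:
       release-notes change-commit //depot/... "python release_notes.py %change%"
    """
    describe = subprocess.run(
        ["p4", "describe", "-s", changelist],
        capture_output=True, text=True, check=True,
    ).stdout
    with open(notes_path, "a", encoding="utf-8") as notes:
        notes.write(format_release_note(describe) + "\n")
```

In theory, `change-commit` fires after the submit completes, so the changelist and its description already exist by the time the script asks `p4 describe` about it. In practice: see above re: lively debate.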

Point I’m slowly arriving at: all tools have their pros and cons. Also, Perforce, if you’re reading this:

  • The submission progress bar exists in an alternate space-time that I don’t even know how to properly classify. It bears no relation whatsoever to human time, though.
  • Please, please, please, please validate the contents of a submission before the entire process is complete. There is a special hell for an application that lets you submit 20GB of content only to tell you at the end that one file was missing and the whole commit was rejected.

The Resulting Structure

I now have three repositories that I manage without any painful overlap whatsoever. Other than the one time I accidentally merged the branch of the NVIDIA GameWorks repo for HairWorks into my engine build. I have a lot of uses for the various GameWorks techs, but there is literally not a single conceivable use that I would have for well-rendered and physically-plausible hair in Steel Harvest. Not one. Unless we do some crossover with Brutal Legend. And, surprisingly, of all the NVIDIA tech I’ve worked with over the last year and a half, the HairWorks UE4 integration is, by far, the most sprawling and complex. C’est la vie?

Anyway, we now have a Perforce server on AWS that will see me through the end of the project with joy and vigor. A large quantity of assets is making its way through the tubes as we speak.

And in addition to setting up a GitHub organization for Joy Machine, I also set up two separate GitHub repositories (super secret ones): one for the Steel Harvest game code and one for the custom fork of Unreal Engine 4. And the interplay between the whole family is just beautiful.

List of Other Things So I Can Get Back to Work

  • My custom build of Unreal Engine 4 now has a full integration with trueSKY, eliminating some of the awkward rough edges and improving editor performance substantially when multiple viewports are visible at once.
  • My build also has full integration with VXGI – which I’m hoping can replace the heightfield GI component of the dynamic lighting setup I have in the scene right now – as well as my beloved HBAO, replacing the good-but-not-as-good screen-space AO of UE4.
  • I started an integration with the Twitch API.
  • Google Analytics support is built in now, which is really boring to everyone but me — and even then, it’s just necessary for data on the game throughout development.
  • I have fully-integrated the Houdini Engine again, which I will be putting to very fantastic use.
  • I added support for basic REST API calls (more on this in a second), as they’ll become necessary later on.
  • Added support for standard HTTP requests.
  • And, completely unrelated to the engine, I’ve been giving the Joy Machine site a slow-and-steady overhaul to take in the direction I’d really like to see it go.
    • The first real step in this direction is the addition of recent repo commit messages (GitHub-only for now — as you can see, my progress with scripting p4 wasn’t outstanding), in addition to a very rudimentary “tasks completed” listing. I’m going to be improving these greatly over time. Or, ideally, someone else will because:
      • Until yesterday, I had no idea what jQuery, AJAX, and REST APIs were actually like to work with. Turns out: yeah, no, not my thing. Never my thing. By the time I got those very basic little widgets working, I felt like Sam Neill at the end of Event Horizon (spoilers: he wasn’t in the best mood). Anyway, one way or another, more dynamically-served content and updates will be coming. And they’ll be prettier.
      • I’ve also been adding bots and webhook responses to the Discord server for Joy Machine, which anyone is welcome to join! We’d like to have you. We talk sometimes. I talk sometimes. To myself.
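For the curious: the commit-message widget really boils down to one REST call against GitHub’s public commits endpoint. Sketched here in Python (rather than the jQuery/AJAX I actually fought with), with hypothetical function names and repo arguments:

```python
import json
from urllib.request import urlopen


def first_lines(commits):
    """Reduce a GitHub commits payload (a list of commit objects) to one
    summary line per commit — the first line of each commit message."""
    return [c["commit"]["message"].splitlines()[0] for c in commits]


def recent_commit_messages(owner, repo, count=5):
    """One GET against GitHub's REST API commits endpoint
    (https://api.github.com/repos/<owner>/<repo>/commits);
    no auth is needed for public repositories."""
    url = f"https://api.github.com/repos/{owner}/{repo}/commits?per_page={count}"
    with urlopen(url) as response:
        return first_lines(json.load(response))
```

The jQuery version is the same idea: `$.getJSON` on that URL, then render each summary line into the page. It just took me a day to believe that.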

More to Come in Game Prettiness Later

I hope.

2 Comments

  1. Ben Sizer on November 25, 2016 at 2:25 PM

    Looks like you need some more experience with Python 😉

    • Trent Polack on November 25, 2016 at 2:28 PM

      Oh, absolutely. Other than Lua, I’ve never really been much for scripting.
