The Ebonheim Chronicle

Development Blog for Chronicles IV: Ebonheim

It's been a little bit since I've sat down and blogged and, while I have a few long-form write-ups in mind, I did want to drop in and give a short status of the project and how things are going.

Some Bookkeeping

I'm on BlueSky! I was on BlueSky before and continue to be. I harbor a deep distrust and series of smug sideways glances toward the platform and will continue to primarily be present on Mastodon but if you are a skeeter or skeet-adjacent, you can find my major updates and gifs on the blue butterfly.

I have a new email address through the extremely excellent Fastmail! This affects 0.1% of you but play-testers or anyone who has reached out in the past will now receive correspondence from bri@brianna.town. If you're the clever sort, you might even add it to your email contact list so that it doesn't get spam-folder'd.

The Milestone

If you're weird enough to have been closely following updates this year, you'll know that my original plan to ship the next playtest milestone on Halloween, like last year, was severely hampered by not getting to do much of any development until late spring.

In the last 6 months I've accomplished a ton: contiguous world map, lighting, line of sight, inventory, equipment, the core game loop, and more! The only person holding onto the due date for the next milestone is me, but I definitely want to get it out to people before the end of the year. And it may still happen!

Right now I'm entering the big content grind, where all of those features need to be used to create the abilities, items, enemies, and maps that will show them off in the milestone. One of those not-yet-written write-ups is about the project management of designing encounters for an RPG 😪

While the last playtest was strictly testing the combat mechanics, I want this one to more or less feel like the real final game in terms of progression and loop, so there's a ton left to do.

Still need to create some new music and sound effects too!

Want to Help Playtest?

If you haven't already volunteered to test and wish to be contacted when the milestone is complete, please reach out through any of the various channels! PC, Mac, and Linux players are welcome! The playtest will have the ability to automatically upload replays of your playthrough (which you can opt out of) and will include an optional set of feedback prompts to ponder as you play.

Last year's playtest was such an overwhelming success and I can't wait to get back with folks again, hopefully this year!

| 🌐 | 🙋‍ | @britown@blog.brianna.town

Or: The Least Critical-Path Feature I Could Have Possibly Spent a Day On

The General Plan

Localization has been on my mind a bit recently. It isn't part of the next milestone but I have always maintained a loose plan of action and so I sometimes revisit the idea to make sure I'm building the engine in such a way that translations will be simple to implement.

Most of the work to support other languages just has to do with text-replacement. There's text in-code, in-asset, and even, although rarely for this game, in-image that would all need run-time substitutes for other languages.

The vision for supporting this revolves around an asset-driven approach, much like everything else in the game. Essentially, user-facing text will be a special object that understands the context it exists in but is still just a text box in the game's various asset UIs.

From there, we can collect all of them from all assets into a big spreadsheet with a different language in each column to define their replacements. This has a ton of benefits:

  • Single source to work in and review remaining changes
  • Live-updating game instance to see the changes in-context
  • Sort, search, and edit across all assets and know their context at a glance
  • Set of translations is a separate asset that can be applied or overridden the same as other assets

For hard-coded text in-engine, a macro is used to automatically register those strings with their own contexts to a core-generated part of the translation asset.
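
To make that concrete, here's a minimal sketch of what such a registration macro could look like. Every name here is hypothetical (this is not the engine's actual macro); the point is just that a static local registers the string and its context once per call site, so the translation asset can be generated from the collected table.

// Hypothetical sketch of in-engine string registration (not the real macro)
#include <string>
#include <unordered_map>

// global table that the core-generated part of the translation asset is built from
static std::unordered_map<std::string, std::string>& locTable() {
   static std::unordered_map<std::string, std::string> t;
   return t;
}

struct LocEntry {
   const char* text;
   LocEntry(const char* context, const char* sourceText) : text(sourceText) {
      // key by context + source text; the value starts as the untranslated string
      locTable().emplace(std::string(context) + "/" + sourceText, sourceText);
   }
};

// usage: drawText(LOC_TEXT("MainMenu", "New Game"));
// the static local means registration happens exactly once per call site
#define LOC_TEXT(context, sourceText) \
   ([]() -> const char* { static LocEntry e{ context, sourceText }; return e.text; }())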

This won't be difficult to implement and should come together pretty quickly. Whoever gets contracted to do the localization should have a painless experience working in the game directly to get everything done!

As long as they only ever need 8x8 ASCII characters, which should absolutely never be a problem!

The Problem

I don't know Japanese. But I have been around video games long enough to know that the further back in time you go, the harder rendering Japanese text gets.

My pseudo-fantasy-EGA engine is fully software-rasterized, with every pixel of my 712x292 frame buffer being drawn on the CPU. Text is drawn by taking a font bitmap, grabbing a cached recolor of it for when you want different colors, and rendering it like a sprite atlas, with the single-byte character code being an index into the grid. On the original EGA, these characters were baked into the hardware of the card and, in text modes, you could only draw them on a pre-set text grid. The limits of my renderer are much more lax than that and I have pretty generic freedom for where and how to draw text.
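
For the ASCII path, the lookup amounts to something like this (a sketch with hypothetical names; the atlas math is just the 32-characters-per-row grid described later, and the UV rect mirrors how the Unicode renderer further down builds its rect):

// Sketch: map a single-byte character code to a cell in a 32x8 glyph atlas
// and build the pixel rect for blitting its 8x8 glyph out of the font bitmap
struct CharCell { int x, y; };

static CharCell asciiCell(unsigned char c) {
   return { c % 32, c / 32 };   // 32 columns by 8 rows covers all 256 codes
}

static Recti asciiGlyphRect(unsigned char c) {
   auto cell = asciiCell(c);
   return Recti{ cell.x * 8, cell.y * 8, 8, 8 };   // x, y, width, height
}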

So when I started to think about supporting Japanese (or really any non-roman script), I thought: OK, maybe I can just have a new font bitmap that has the new characters at the same 8x8 character resolution. If you just pull from the appropriate index in the (much) larger image, it'll work the same as ASCII for all of the UI and content-responsive layouts, and translators can just write Unicode into the translation tool.

The Misaki Font

After some googling (an increasingly rare way to start a sentence) I came upon this great Language Exchange comment talking about an 8x8 Kanji font. The responder pretty clearly says to not do this and also nicely adds some helpful screenshots and even talks about how fucking Dragon Quest on the 3DS had people unable to tell the difference between three different types of swords from the tiny text.

Never one to trust an expert opinion, I did not read any of that and hastily clicked the link of what not to use, bringing me, just as I assume many before me, to the Misaki font.

Chrome auto-translated the page, which was helpful because I can't read any of it. Sorry, why was I doing this again? Oh right, localization. Skimming the page, I saw that the font was created in the '90s for use with the Sharp PC-E500 Pocket Computer, and also that the glyphs are actually just 7x7 so they can have a single-pixel gap keeping the characters from bleeding together. I got the impression from talking to others and seeing comments that most of these Kanji are just about impossible to discern on their own, requiring the context of the full sentence to infer them.

Nevertheless, they had a PNG atlas to download the whole thing in one image, which matches how I handle ASCII, so I decided to start there!

A Quick Note on TTF

The website also makes a .TTF available, and using the incredible single-include library stb_truetype I could theoretically render the characters to a virtual bitmap and transfer those to the EGA framebuffer at runtime for text rendering. Applied to my other regular font-rendering this would open the door to variable font sizes and a lot of flexibility.
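
For the curious, the experiment looked roughly like this (a minimal sketch of stb_truetype's codepoint-bitmap path; the helper name is made up and the handoff into my EGA framebuffer is hand-waved):

// Sketch: rasterize a single codepoint at a target pixel height with stb_truetype.
// The returned 8-bit coverage bitmap would then be thresholded down to EGA pixels.
#define STB_TRUETYPE_IMPLEMENTATION
#include "stb_truetype.h"
#include <cstdint>

static unsigned char* rasterizeGlyph(const unsigned char* ttfBytes, uint32_t codepoint,
                                     float pixelHeight, int* w, int* h) {
   stbtt_fontinfo font;
   if (!stbtt_InitFont(&font, ttfBytes, stbtt_GetFontOffsetForIndex(ttfBytes, 0))) {
      return nullptr;   // not a usable font file
   }
   float scale = stbtt_ScaleForPixelHeight(&font, pixelHeight);
   int xoff = 0, yoff = 0;
   // single-channel alpha bitmap; free with stbtt_FreeBitmap when done
   return stbtt_GetCodepointBitmap(&font, scale, scale, (int)codepoint, w, h, &xoff, &yoff);
}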

I played with this idea and even got a basic version working, but I was a little frustrated dialing in the precision for specific pixel sizes and getting the characters to render exactly the way I need them. I also think that the current text limitations are the creativity-producing kind rather than the annoying kind, so I gave up on the idea.

So you have 8,836 Characters...

For ASCII there was just a nice 32x8 character grid, and the unsigned char value of a character directly corresponded to an index in that 2D array. The Misaki font PNG was 94x94 with a ton of blank areas and clearly some rhyme and reason to its organization that I had zero knowledge of.

I looked for documentation, for text files saying which characters correspond to which grid positions, and even wondered if it was related to Unicode at all. Nothing. And it's not like I was going to try and figure out which characters were which to build it myself.

Feeling stumped at lunch, I vented my despair to @SP, Developer of Super Puzzled Cat, Launching January 2025 with a FREE DEMO you can play TODAY just in time for Steam Next Fest, and he half-remembered something about something called “Shift-JIS.”

This was the search term I needed, and I quickly found myself at the Wikipedia article for JIS X 0208, a two-byte Japanese Industrial Standard first written in 1978. The page is extremely helpful because it breaks down every “row” of characters in the 94x94 grid, and I was able to confirm that it lined up perfectly with the Misaki font PNG. The way the encoding works is that to reference a character, you need its kuten (区点): the two numbers, row and column, that act as cell coordinates in the 94x94 grid (for example, kuten 1-1 is the very first cell, the ideographic space). Easy!

Mapping to Unicode

In the modern era, our main way to handle non-ASCII characters is Unicode, which assigns every character a numeric “codepoint”, along with encoding schemes that represent those codepoints in memory using 1-4 bytes. I'm going to try hard not to get too into Unicode in this post because it will reveal how little I know about it.

Of course, following JIS X by over a decade, Unicode has exactly 0.000 correlation or overlap with it. I'll repeat that for ASCII we can just take the single-byte character from any string in C, index it into our little font map, and be good. For Japanese, there was no way for Unicode codepoints from an asset file to resolve to the correct kuten without a 1:1 map; we were absolutely stumped.

Luckily, Unicode has a website located somewhere deep beneath New York's Trinity Church hidden behind an elaborate series of thematic clues and puzzles. It contains a page with an ftp link to the older original JIS 0201 Mapping. I couldn't get the link to work but maybe I was FTPing wrong. (Update: Mastodon User @gamedevjeff was able to find the moved FTP link at ftp://ftp.unicode.org/Public/MAPPINGS/OBSOLETE/EASTASIA/JIS/JIS0208.TXT)

Regardless, I really wanted 0208. On a lark I googled for the theoretical filename “JIS0208.TXT” and bingo! There, on Google's Github for their Japanese IME was the exact file I was looking for!

Written in 1990, the Unicode file contains 7000 lines of this:

0x8140	0x2121	0x3000	# IDEOGRAPHIC SPACE
0x8141	0x2122	0x3001	# IDEOGRAPHIC COMMA
0x8142	0x2123	0x3002	# IDEOGRAPHIC FULL STOP
0x8143	0x2124	0xFF0C	# FULLWIDTH COMMA
0x8144	0x2125	0xFF0E	# FULLWIDTH FULL STOP
0x8145	0x2126	0x30FB	# KATAKANA MIDDLE DOT
0x8146	0x2127	0xFF1A	# FULLWIDTH COLON
0x8147	0x2128	0xFF1B	# FULLWIDTH SEMICOLON
0x8148	0x2129	0xFF1F	# FULLWIDTH QUESTION MARK
0x8149	0x212A	0xFF01	# FULLWIDTH EXCLAMATION MARK
0x814A	0x212B	0x309B	# KATAKANA-HIRAGANA VOICED SOUND MARK
0x814B	0x212C	0x309C	# KATAKANA-HIRAGANA SEMI-VOICED SOUND MARK
0x814C	0x212D	0x00B4	# ACUTE ACCENT
0x814D	0x212E	0xFF40	# FULLWIDTH GRAVE ACCENT

The first column is the Shift-JIS code (a modern extension of JIS), the second column is the JIS 0208 kuten, and the third is ✨The Unicode Codepoint✨

Writing a Parser

I should do a whole blog post about how I parse text these days, but about 5 years ago I stopped dreading file parsing forever when I started using “Accept-style” recursive-descent parsing. Whipping up a few lines to load this entire file into a hash-map at runtime only took a few minutes!

Here's the complete code; I've annotated it with a bunch of extra comments to explain what everything does.

// read 4 hex characters and shift the nibbles into a single 16-bit number
// returns true on success, populating out
static bool _acceptHexShort(StringParser& p, uint32_t* out) {
   auto snap = p.pos; // error recovery snapshot

   uint32_t workingNum = 0;

   // 4 nibbles
   for (int i = 0; i < 4; ++i) {
      char d = 0;
      if (!p.acceptAnyOf("0123456789ABCDEF", &d)) {
         p.pos = snap;
         return false;
      }

      // convert the char to a number
      if (d >= 'A') d = d - 'A' + 10;
      else d -= '0';

      // shift it into place
      workingNum |= d << ((3 - i) * 4);
   }
   *out = workingNum;
   return true;
}

// contains the 3 numbers in a line of the map file
struct JISMapLine {
   uint32_t shiftjis = 0, jis0208 = 0, unicode = 0;
};


static void _constructUnicodeToJISMap(sp::hash_map<uint32_t, Int2>& mapOut) {
   auto file = bundledFileString("JIS0208.TXT"); // in-house file-bundler, the txt is encoded inside the exe and sitting in memory at this point
   StringParser p = { file.c_str(), file.c_str() + file.size() };

   while (!p.atEnd()) {
      if (p.accept("0x")) {
         // start of new line

         JISMapLine line;
         if (_acceptHexShort(p, &line.shiftjis) &&
            p.accept("\t0x") && _acceptHexShort(p, &line.jis0208) &&
            p.accept("\t0x") && _acceptHexShort(p, &line.unicode)) {

            // we grabbed the 3 numbers, split the jis0208 into two nibbles
            // the kuten start at 0x20 (32) and are 1-based

            auto row = (int)(((line.jis0208 >> 8) & 0xFF) - 0x20);
            auto col = (int)((line.jis0208 & 0xFF) - 0x20);

            // map this 2d integer point to the unicode codepoint
            mapOut.insert(line.unicode, Int2{ col - 1, row - 1 });
         }
      }

      while (!p.atEnd() && !p.accept('\n')) p.skip(); // skip to end of line
   }
}

Int2 JISCellFromUniChar(uint32_t unicode) {
   static sp::hash_map<uint32_t, Int2> _Map; // in-house hashmap
   if (_Map.empty()) {
      // populate once per program run
      _constructUnicodeToJISMap(_Map);
   }

   // this hashtable is way faster than std::unordered_map so this is fine
   if (auto srch = _Map.find(unicode)) {
      return *srch.value;
   }
   return { -1,-1 };
}

So We're Done! Almost...

With our shiny new JISCellFromUniChar function, we can pass in any codepoint (up to 4 bytes) and get back a kuten for referencing a cell in our Misaki PNG.

But there is the tiny issue of getting those codepoints. Again, I'm not going to get into Unicode too much here, but the main thing is that a UTF-8 string is still just a null-terminated const char* in your code; you just can no longer read it one byte at a time. Instead, every time you go to read a character, you check specific bits to see if the character continues into the next byte. There are great small libraries for traversing a UTF-8 string, but I had never written one before, so here's mine...

const char* utf8ToCodepoint(const char* input, uint32_t* codepoint) {
   auto s = (unsigned char*)input;
   if (s[0] < 0x80) {
      // 1-byte sequence (plain ASCII): 0xxxxxxx
      *codepoint = s[0];
      return input + 1;
   }
   else if ((s[0] & 0xE0) == 0xC0) {
      // 2-byte sequence: 110xxxxx 10xxxxxx
      *codepoint = ((s[0] & 0x1F) << 6) | (s[1] & 0x3F);
      return input + 2;
   }
   else if ((s[0] & 0xF0) == 0xE0) {
      // 3-byte sequence: 1110xxxx 10xxxxxx 10xxxxxx (most Japanese lives here)
      *codepoint = ((s[0] & 0x0F) << 12) | ((s[1] & 0x3F) << 6) | (s[2] & 0x3F);
      return input + 3;
   }
   else if ((s[0] & 0xF8) == 0xF0) {
      // 4-byte sequence: 11110xxx 10xxxxxx 10xxxxxx 10xxxxxx
      *codepoint = ((s[0] & 0x07) << 18) | ((s[1] & 0x3F) << 12) | ((s[2] & 0x3F) << 6) | (s[3] & 0x3F);
      return input + 4;
   }
   *codepoint = 0xFFFD; // unexpected lead byte: return the replacement character
   return input + 1;
}

Finally, it's time to actually render the characters. We traverse our utf8 string, pull out the codepoints, look up the kuten, and build a UV rect for the font texture:

void egaRenderTextSingleCharUnicode(EGATexture& target, EGATexture& font, Int2 pos, uint32_t codepoint, EGARegion* clipRect) {
   codepoint = convertAsciiCodepointToFullWidth(codepoint);
   auto cell = JISCellFromUniChar(codepoint);
   if (cell.x >= 0 && cell.y >= 0) {
      Recti uv = { cell.x * EGA_TEXT_CHAR_WIDTH, cell.y * EGA_TEXT_CHAR_HEIGHT, EGA_TEXT_CHAR_WIDTH, EGA_TEXT_CHAR_HEIGHT };
      egaRenderTexturePartial(target, pos, font, uv, clipRect);
   }
   else {
      // err
      egaRenderLineRect(target, Recti::fromVecs(pos, Int2{ EGA_TEXT_CHAR_WIDTH, EGA_TEXT_CHAR_HEIGHT }).expand(-2, 0), EGAUIColor_ltred, clipRect);
   }
}
void egaRenderTextUnicode(EGATexture& target, EGATexture& font, Int2 pos, const char* text_begin, const char* text_end, EGARegion* clipRect) {
   if (!text_end) text_end = text_begin + strlen(text_begin);

   auto cur = text_begin;
   while (cur != text_end) {
      uint32_t cp;
      cur = utf8ToCodepoint(cur, &cp);
      egaRenderTextSingleCharUnicode(target, font, pos, cp, clipRect);
      pos.x += EGA_TEXT_CHAR_WIDTH;
   }
}

And then we let 'er rip and prayed! At this point I had absolutely no way of knowing whether the values inside the map were correct or garbage or what.

      auto uniFont = egaFontFactoryGetFont(gi.state.fontFactory, EGAUIColor_black, EGAUIColor_white, EGAFontEncodingType_Unicode);
      egaRenderTextUnicode(gi.ega, *uniFont, Int2{ 8,8 }, u8"あなたの国は影に屈することになるだろう");

And it worked!

The folks at Nice Gear Games were nice enough to translate my overdramatic save-deletion dialog so that I could test it out:

One great thing with this is that my rich-text rendering still works! So text color and inline icons are already perfect:

One Last Hang-up

The font also has roman characters, so I went ahead and tried to write a regular English message, but none of the characters rendered. Well, sure enough, the codes represented by JIS are the full-width roman characters, which have different codepoints than the ones ASCII maps to.

So yet another function for catching those full-width conversions:

uint32_t convertAsciiCodepointToFullWidth(uint32_t c) {
   if (c == ' ') {
      c = 0x3000;
   }
   else if (c >= '!' && c <= '~') {
      c += 0xFEE0;
   }
   return c;
}

Maybe That Person Was Right About 8x8

After the excitement of getting this all working wore off, I did start to notice/hear that the text is very hard to read. The Misaki font page does have an 8x12 font that is more readable, so I went ahead and tried tossing that into the game. Now, this one's a bit more involved because the 8x8 font size is hard-coded, and changing it messes up a lot of things; it would be a bit more work to actually update the UI to support variable-height text. But I'm happy to show that the larger font size works just fine with the content-responsive UI in the game and looks really slick:

Thank You For Reading!

I really wrote a lot here but this was such a fun little project to get sniped by! As always, if you'd like to reach out or discuss the content here, you can find me on Mastodon or reply to the Mastodon post about this post, which I'll link here. Have a great day!!

Now all I need to do is finish making the game so that somebody can translate it someday!

#gamedev #longpost #chron4

| 🌐 | 🙋‍ | @britown@blog.brianna.town

When I decided to make a turn-based JRPG for Android in 2010, my initial thought was that it would be simple. After all, being turn-based, it wouldn't have complex physics or real-time issues and the simple art-style would make it a breeze.

Obviously, having never attempted to develop a complete game to release before, I had no idea what I was talking about, and indeed that game never went further than a moderately-successful demo.

Making games being fundamentally impossible aside, the key misunderstanding I want to highlight here today is the disastrous assumption that turn-based games have simpler logic.


The Naïve Approach

AKA: How BladeQuest did it

So it's your party's turn to act in the game execution loop. You're looping, waiting for the player to decide on and input the action they wish to perform. This involves UI: menus, clicks, confirms, cancels, etc., but the game state isn't fundamentally changing. Being turn-based (ignoring Active-Time-Battle shenanigans), the enemies aren't attacking, you're not taking damage, and the player has as much time as they need to make their decision and confirm it.

In the code, at the basic level, once the confirmation of the decision is made, the game state is affected. Damage calculations are run, defensive stats are considered, numbers are created, and then they are applied to health bars through judicious addition and subtraction.

Of course, just updating these numbers and marking dead baddies as dead isn't very exciting so you need to do some animations. When an animation is playing, your game loop needs to understand that something is currently blocking further execution and do nothing, waiting for the animation to end.

So maybe you do something like this, for each character in the turn-order:

  1. First we want them to slide out of the party lineup to an acting position, so while this slide is happening, update their drawn position every frame based on a time step
  2. If they're at the acting position, change their sprite to an “acting pose” and yield every frame until an amount of delay time in that pose has passed
  3. If the posing delay is done, start the animation sequence for the selected action, creating particles, showing shapes, manipulating sprites
  4. During the animation, pegged to specific points or maybe just after it's done, apply some damage to a target
  5. Calculate the damage on the target enemy and add a “damage marker” to the draw state which will show that number bouncing in front of the target
  6. Once the bouncing is done, actually apply the number to the enemy behind the scenes and see if they died
  7. If they died, start a death animation and wait each frame until that's complete
  8. If all the animations that have been started are complete on the current frame, slide the character back into the party line, waiting and updating position by timestep
  9. Now increase the turn index so the next character in the turn order goes

Why Doesn't This Work

It does work! It even worked in BladeQuest to make a successful demo!! But good god did we have trouble.

The biggest problem with this is state management between frames. An immense amount of state is needed for every part of this to know where particles are, where to draw the actors, what part of the turn they're on, etc. so that each frame your game knows whether to do something, draw something, or yield.

One of the hardest forms of this state tracking is timers. There could be timers for screen shaking, screen flashing, moving to poses, animation delays, marker bounces, all checking against their own internal clocks for when they're done. While nice and modular in theory, individual systems having their own internal timers requires them to sync with each other and communicate their status because they are often temporally blocking game state changes from taking place.
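
To illustrate, the naive approach tends to end up with something like a per-character state machine that checks a pile of its own timers every frame (a toy sketch, not BladeQuest's actual code):

// Toy sketch of the per-frame, timer-driven approach described above.
// Every phase owns a timer; every new feature means another phase and another timer.
enum class ActPhase { SlideOut, Pose, Animating, DamageBounce, DeathAnim, SlideBack, Done };

struct ActingState {
   ActPhase phase = ActPhase::SlideOut;
   float phaseTimer = 0.0f;
};

static void updateActingCharacter(ActingState& s, float dt) {
   s.phaseTimer += dt;
   switch (s.phase) {
   case ActPhase::SlideOut:
      // lerp the drawn position toward the acting spot, then move on
      if (s.phaseTimer >= 0.5f) { s.phase = ActPhase::Pose; s.phaseTimer = 0.0f; }
      break;
   case ActPhase::Pose:
      if (s.phaseTimer >= 0.25f) { s.phase = ActPhase::Animating; s.phaseTimer = 0.0f; }
      break;
   case ActPhase::Animating:
      // ...and so on for damage markers, deaths, and sliding back; each phase
      // also has to know whether some other system's timer is still blocking it
      break;
   default: break;
   }
}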

If actually taking the damage shouldn't happen until after an animation is finished, the line between render and update blurs, violating the golden rule of never allowing your game render to modify your game state.

If an attack critical-hits, you can pop off a quick screen-flash with a line of code, but what if that attack gets cancelled or blocked? Do you remove the flash you added? What if you want to play a special animation before the crit gets applied? You'll have to calculate if the crit will happen first, play animations with special state exceptions to wait for them to finish, and then calculate your damage and apply it. In the end, execution order winds up mattering a ton here.

In terms of scaling, special exceptions for new features start costing exponential dev effort to glom onto this system. Want to add a counter/parry/interrupt ability? Enjoy digging up every waiting-for-animations-to-finish call to see if it needs to handle a cancellation. We had an item called a Safety Ring which would prevent a fatal hit, and the edge cases around a definitely-dead character not actually being dead were so numerous that we were still fixing Safety Ring bugs a day before the demo launched.

The Atomic Turn

Ok not actually technically atomic, but it sounds cooler.

I've been hinting a little at a possible solution to the largely temporal issues with state management for a turn-based game. The biggest successes I've had with Chronicles development have been in identifying proper separations of concerns:

  • Separate your data from your logic
  • Separate your UI from your data
  • Separate your update from your render
  • Separate your device platform from your semantic inputs

The problem with the system described above is that display/animation/aesthetic/presentation is interleaved with execution logic.

What if we wrote a function that just executes the entire turn in a single function call, in one frame, “atomically”? We can loop over the characters in the turn order, skip all presentation, determine the outcomes of all the actions and decisions, and apply them to the game state.

New features and exceptions can be written into this execution function much more easily, because they don't have to contend with timings and waiting. At every point in the execution of this function, the current game state is the exact correct state of all participating characters. character.health is correct at the time you check it because you're still in the same frame and same function call you started executing the turn in!

Let's make this even more useful by applying some of the functional programming concepts I talked about ages ago, and say that our turn execution function should take a const GameState and return a new, post-turn GameState. Now we're not even modifying the rendered state; we're just running the turn like we would call any other function. This means we could actually execute the complete turn as a perfect simulation and inspect the resulting state to derive what happened (or what is about to happen).
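
In sketch form, the shape of that function is something like this (hypothetical names, not the actual signature of the engine's executeTurnAction):

// Conceptual sketch: the turn resolver as a pure function over game state
GameState executeTurnPure(GameState const& before, TurnDecisions const& decisions) {
   GameState after = before;   // copy; the rendered state is untouched
   for (ActorHandle actor : turnOrder(after)) {
      applyAction(after, actor, decisions.actionFor(actor));   // mutate only the copy
   }
   return after;   // inspect this to learn what happened (or what is about to happen)
}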

But What About All My Animations??

Of course, just updating these numbers and marking dead baddies as dead isn't very exciting so you need to do some animations.

Rather than letting the presentation timings drive our state changes, we're going to use the state changes to set up our presentation timings.

After making a decision for your character in Chronicles, the turn plays out in front of you. Here is the total code in the game update that is happening for every frame of that execution:

void turnExecuteStep(GameState& g) {
   auto& turn = g.turn;
   assert(turn.stage == TurnStage_Executing); // not executing!!

   if (g.step >= turn.startStep + turn.totalTurnLength) {
      turn.stage = TurnStage_Finished;
   }
}

The reason for this is that, while the atomic execution function is executing, in addition to the game state being updated, timed render logic is being added to a set of timelines to block out the turn.

Here is an example of the function that is called whenever any damage is applied to another actor during the turn execution function. Pay close attention to the second half, where all of the functions take some form of a when parameter:

void ActorExec::applyDamage(GameState& g, ActorHandle sender, ActorHandle receiver, WorldGridCoords3D receivePos, ActorDamage const& dmg, ActionApplyMode mode, StepCount when, ExecutionTimeBlocks& blocks, bool blockAnims) {
   auto& cons = *getCurrentGameConstants();
   auto a = receiver;
   if (actorAlive(g, a) && dmg.dmg > 0) {
      auto inflicted = actorApplyDamage(g, a, dmg);
      int totalInflicted = inflicted.health + inflicted.armor + inflicted.stamina;

      auto dmgCpy = dmg;
      dmgCpy.dmg = totalInflicted;
      turnRecordDamagedEvent(g, sender, receiver, dmgCpy);

      bool killed = false;
      if (auto act = g.save.actors.find(a)) {
         if (act->health <= 0) {
            killed = true;
         }
      }

      // first we animate for hurtLen
      // then show dmg number
      // then we show death fade

      StepCount msgEndStep = 0;

      StepCount hurtLen = 0;

      if (mode == ActionApplyMode_Execute) {

         // dont show hurt animation if they took 0
         if (totalInflicted > 0) {
            hurtLen = cons.hurtPaletteLen;
            gamePlaySound(g, CoreAsset_SFX_Damage, when);
            actorAddDamagedAnimation(g, a, when, when + hurtLen);
            if (a == g.save.player_controlled) {
               gameShowDamagePaletteFlash(g, when, when + hurtLen);
            }
            actorSetDrawnStatus(g, a, actorCalcStatus(g, a), when + hurtLen);
         }
      }

The first thing we do is actually apply the damage numbers to the receiving actor and record the event in a log used for interrogating simulations, like I described earlier.

mode is a way to determine “Preview” vs “Execution” where the former can nicely skip all the presentation-related side-effects of the function.

The important presentation parts here are gamePlaySound, actorAddDamagedAnimation, gameShowDamagePaletteFlash, and actorSetDrawnStatus.

These functions all take begin and end frame counts because they don't execute immediately! All of these will happen at their requested step counts during that execution phase above because we're just waiting every frame until we hit turn.totalTurnLength!

So you see, we execute the entire turn logic in a single function call, and it sets up a perfectly-synced, interleaving keyframe-style timeline of what the render function should show every frame during the execution.

TimeBlocks

A very handy tool for organizing frame timings is this simple TimeBlocks struct:

struct StepBlock {
   StepCount begin, length;
};

struct ExecutionTimeBlocks {
   sp::list<StepBlock> blocks;

   operator bool() const { return !blocks.empty(); }

   StepCount end() const {
      StepCount out = 0;
      for (auto&& b : blocks) {
         out = std::max(out, b.begin + b.length);
      }
      return out;
   }
};

With syncing animations, you often need a complicated balance of blocking and non-blocking animations, and you want child calls and dependent timings to not have to care about the parent. Maybe I want to fire 100 random arrow particles, all starting and ending at random times. I don't care about any individual arrow, but I don't want to continue to the next step until the last arrow is done, so you can use these TimeBlocks!

What's nice there is that you can then pass around these time blocks to any number of modular functions that just push their little blocking time gaps into the set, and at the end the calling parent can easily determine the final frame count of the final item.
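
As a toy version of that arrow example (the particle and random helpers are made up, but the TimeBlocks usage is the real pattern):

// Sketch: each arrow pushes its own block into the shared ExecutionTimeBlocks;
// the caller only ever cares about the overall end()
StepCount fireArrowVolley(GameState& g, StepCount when, ExecutionTimeBlocks& blocks) {
   for (int i = 0; i < 100; ++i) {
      StepCount start = when + randomStepBetween(0, 30);   // random stagger per arrow
      StepCount length = randomStepBetween(10, 20);        // random flight time
      spawnArrowParticle(g, start, length);                // presentation only
      blocks.blocks.push_back({ start, length });
   }
   return blocks.end();   // the next action can't begin until the last arrow lands
}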

Here's the full function that gets called if the actor is performing a move-attack:

static StepCount _executeMoveAttack(GameState& g, ActorHandle user, ActionToken token, ActionTokenSet const& set, ActionTokenMemory const& mem, ActionTokenIterationCache const& cache, ActionApplyMode mode, StepCount when) {
   auto startStep = when;
   ExecutionTimeBlocks timeBlocks;

   for (auto&& result : actionMoveAttackCalculateResults(g, user, token, set, mem, cache)) {
      auto dirVec = dirVecFromCoords(result.origin, result.targetTile);
      auto& tState = g.turn.actors[user];

      switch (result.result) {
      case MoveAttackResult_ActivateDoor: {
         ActorExec::toggleDoor(g, user, result.targetTile, mode, when, timeBlocks);
         _pushFoVEventTorchEquipped(g, when);
      } break;
      case MoveAttackResult_Attack: {
         _executeAttack(g, user, result, mode, startStep, timeBlocks);
      }  break;
      case MoveAttackResult_Move: {
         sdh_each(g.turn.actors[user].turnResults) { if (it->type == ActorTurnResult::MoveAttacked) sdh_mark_erased(); }
         tState.turnResults.push_back({ ActorTurnResult::MoveAttacked, result.origin, dirVec, { result.targetTile }, false });

         gamePlaySound(g, CoreAsset_SFX_Move, startStep);

         if (!result.dodgeLocks.empty()) {
            ActorExec::activateDodgeLocks(g, user, user, mode, startStep, result.dodgeLocks, timeBlocks);
         }

         auto dist = int2ManhattanDist(result.origin.xy(), result.targetTile.xy());
         ActorExec::slide(g, user, user, dirVec, dist, true, mode, std::max(startStep, timeBlocks.end()), g.turn.turnLength, timeBlocks);

         turnRecordMoveEvent(g, user, user, result.targetTile);

      }  break;
      }
   }

   return std::max(startStep, timeBlocks.end());
}

The important takeaway here is this function returns a StepCount meant to signify the “End Step” of this action. The turn execution function is going to use that end step as the start step of the next action in the queue.

So we use a local timeBlocks here and pass it to any number of ActorExec:: functions similar to applyDamage above. These Exec functions often call other Exec functions recursively, since sliding and taking damage often cause more sliding and more taking damage. From the MoveAttack's perspective, we don't really care, because whatever happens, it's just filling up our timeBlocks, whose end() we can return as our last step!

Finally, here is an excerpt from the function executeTurnAction we've been talking about all this time:

      if (!disabled) {
         actorSpendStaminaForAbility(g, m, ab.ab);
         auto drawnStatus = actorCalcStatus(g, m);
         drawnStatus.stamina_recovery = 0;
         actorSetDrawnStatus(g, m, drawnStatus, currentStep);

         _beginAbilityCooldown(g, m, ab.ab);

         size_t idx = 0;
         ActionTokenIterationCache cache;

         while (!actionTokensAtEnd(set, idx)) {
            actionTokenLinkIterationCache(set, cache, idx);
            //turn.activeActors.push_back({ m, g.turn.turnEndStep, (int)idx });

            auto tok = set.tokens[idx];
            gameStateCalculateActionTokenDecisionCache(g, m, tok, set, mem, cache);

            if (mode == ActionApplyMode_Execute) {
               currentStep = gameStateApplyActionTokenForExecution(g, m, tok, set, mem, cache, currentStep);
            }
            else {
               gameStateApplyActionTokenForPreview(g, m, tok, set, mem, cache);
            }
            ++idx;
         }
      }

      auto actorEndStep = currentStep;

      auto actorLen = actorEndStep - actorStartStep;
      actorLen = std::max(actorLen, cons.turnMinimumLength);
      turn.executingActorTimes.push_back({ m, actorStartStep, actorStartStep + actorLen });

      auto nextActorStart = std::max(actorStartStep, actorStartStep + actorLen + turn.nextTurnDelay);
      currentStep = nextActorStart;

      actorBlocks.blocks.push_back({ actorStartStep, actorLen });
   }

   auto turnEnd = actorBlocks.end();

For a given actor, we execute their turn actions, starting with our currentStep which, by the end of the action list, contains the final step in that actor's execution. Then we can do some simple logic to apply minimum lengths and determine the start step for the next Actor.

We have another TimeBlocks to keep track of one block per actor and after we're done we just query end() to get our final turn step!

A Note on the Render Function

I require and recommend that render always take a const GameState and draw the entire game in an immediate-mode fashion.

For the given step, we can determine what damage indicators to draw, what palette to use, where to draw the characters, what animation primitives to draw, and even do simple lerping and easing between the various begins and ends. The timeline you are building during execution must be completely deterministic, such that all you need is your GameState and a current StepCount to draw everything about that frame perfectly!
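
A tiny helper in that spirit (a sketch only; the real renderer's easing utilities are more involved):

// Sketch: derive animation progress purely from const data and the current step.
// With this, the render can lerp positions, palettes, and fades for any frame
// without ever touching game state.
static float blockProgress(StepBlock const& b, StepCount step) {
   if (b.length == 0 || step <= b.begin) return 0.0f;
   if (step >= b.begin + b.length) return 1.0f;
   return float(step - b.begin) / float(b.length);
}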

In Conclusion

All of this is to say that, in Chronicles, the entire turn is executed atomically in a single frame which then sets up a complex timeline of render data to display the animated turn execution.

A great deal of thought and care went into this implementation and it required a lot of discipline to see it succeed. It is certainly a pain in the ass whenever a new system needs to create some new frame-delayed timeline instead of just being able to happen instantly, but the result is an incredibly stable and scalable framework that I can add extremely complex combat logic to ad infinitum.

I hope you enjoyed this write-up! If you have any questions or comments, you can DM me on Mastodon or just reply to the post about this!

Have a great day!

#chron4 #gamedev #longpost

| 🌐 | 🙋‍ | @britown@blog.brianna.town

The fruits of all of my labor on asset editors and combat actions have made it fairly trivial to add huge numbers of different types of abilities. It feels really good to just pop open the engine, write no code, and throw together a few systems into a new ability.

Here I made a ninja-smoke escape ability in just a minute or two. I greyed out the fiery explosion graphic I had made before, using the new in-line palette swaps to make a nice smoke explosion, and then leveraged existing teleport animations to make a quick and dirty smoke bomb.

Enemies calculate their own FoV and also can't see through smoke just like you can't, so leave some smoke tiles behind and you have a great getaway!

#gamedev #chron4

| 🌐 | 🙋‍ | @britown@blog.brianna.town

I've been having quite a time getting doors into the game over the last few weeks. After a failed attempt with the wrong methodology and several refactors, I think I have it in a good place now!

Doors are special in ways unlike other actors in the game:

  • They can be attacked and killed (thank Barrels for paving the way for non-creature actors for that)
  • They can be interacted with in a special user interface way.
  • Other actors can cohabitate the same grid-space as an open door, something previously explicitly disallowed in the engine.
  • And they also block line of sight when closed!

A ton of extra systems and refactors have gone into making these doors work, but a lot of that work is going to benefit the next goal which is other interactable actors like item chests!

#gamedev #chron4

| 🌐 | 🙋‍ | @britown@blog.brianna.town

I have a test map full of a lot of different kinds of enemies that I boot up to test new features. I also load into it a lot to look for polish and to ensure that enemy behavior and execution are all still working correctly. As time has gone on, I've found myself just playing through this area regularly and enjoying wasting time here.

I wanted to record a nice long gif to show one such run because I think it shows off a ton of the mechanics and starts to paint a realistic picture of the pace of the combat in the game.

#gamedev #chron4

| 🌐 | 🙋‍ | @britown@blog.brianna.town

I have fully embraced the modern nightmare of containerized web servers, and the page you're reading right now is served up by a laptop in my home office. One of the nice things about this is that I use File Browser, which allows me to very easily upload files to my website from anywhere, and it even has a code editor for HTML and such!

Having direct access to all of this running on my own machine has empowered me to indulge in silly ideas, the most recent being the thought of displaying a live-updating view of how much code I've written for my game.

I am in a weird spot where I think that LoC is a terrible metric for code quality or code complexity whilst simultaneously being absolutely addicted to a big number going up.

I have long had a script for running cloc, a neat full-featured utility to count LoC with a ton of options. The script runs the utility with options designed around ignoring dependencies, generated files, or any other “not lines I wrote myself” and I often enjoy checking in on it after a while to see how much larger the number has gotten.

Sometimes I even want to post that number to social media, but I worry about the optics of that and so I figured why not embrace the silliness and just make a place where anyone can easily see my big number and observe how big it is!

First we need some Python that will update a copy of the repository, run cloc, and then write the final line count to a file...

#!/usr/bin/env python3

import os
import subprocess
import json


# Define your repository path and the path to the cloc executable
REPO_PATH = '<path_to_checked_out_repo>'
CLOC_PATH = 'cloc'
OUTPUT_FILE = '<path_to_static_web_files>/cloc_out.txt'


def extract_sum_code(cloc_output):
    """Extract the SUM->code value from cloc JSON output."""
    cloc_data = json.loads(cloc_output)
    sum_code = cloc_data["SUM"]["code"]
    print(f"Found sum code: {sum_code}")
    return sum_code

def update_repo():
    """Pull the latest changes from the git repository."""
    os.chdir(REPO_PATH)
    subprocess.run(['git', 'pull', 'origin', 'master'], check=True)

def get_loc_count():
    """Run cloc.exe and get the lines of code count."""
    result = subprocess.run([CLOC_PATH, 'chron4/', '--exclude-dir=assets,x64', '--exclude-ext=inl,filters,vcxproj,recipe,ini,user,chrep,chf,temp,natvis,x', '--exclude-list-file=cloc-exclude.txt', '--json'], capture_output=True, text=True)
    cloc_output = result.stdout
    return cloc_output

def write_loc_to_file(loc_count):
    """Write the lines of code count to a file."""
    with open(OUTPUT_FILE, 'w') as file:
        file.write(str(loc_count))
        print("Wrote to file")

def main():
    try:
        update_repo()
        loc_count = get_loc_count()
        sum_code = extract_sum_code(loc_count)
        write_loc_to_file(sum_code)
    except Exception as e:
        print(f"An error occurred: {e}")


if __name__ == "__main__":
    main()

Next we add our script to cron...

0 0 * * * /usr/bin/python3 /<path-to>/runcloc.py >> /<path-to>/cron.log 2>&1

The script sends cloc_out.txt to the static files on our nginx web server, so referencing the content on our new website is as easy aaaas....

    <div style="text-align:center;">
       <img style="width:25%;" src="pikachu.gif" />
       <p><b><span id="loc"></span></b> lines of code have been written.</p>
       <p><a href="https://blog.brianna.town">Follow Development Updates</a></p>
       <p><a href="https://brianna.town">Return to Author's HomePage</a></p>
       
    </div>

    <script>
        async function fetchLoc() {
            const response = await fetch('cloc_out.txt');
            let loc = await response.text();
            loc = Number(loc).toLocaleString();
            document.getElementById('loc').textContent = loc;
        }
        fetchLoc();
    </script>

And there you go! Please enjoy the awkwardly-domain-named https://chrongame.com and look forward to a release date when the number is much much larger ♥

| 🌐 | 🙋‍ | @britown@blog.brianna.town

Today is June 28th, which means that two years ago today I decided to try to #gamedev again and made my first commit to a new repo for a new engine attempting (for the fourth time) to make a game with the title “Chronicles IV: Ebonheim.”

Past Attempts

I thought it might be fun to talk about some pre-2022 project history and show some never-before-seen development gifs!

sEGA

Back in 2015, I started a new game engine with the constraints of being

  • 100% pure C
  • low-dependency
  • emulating EGA graphics cards

In the end, I think the most powerful result of that endeavor was reshaping my brain around C. It forced me to learn so much about how code actually works under the hood, and it completely revolutionized my coding style and ability.

The EGA emulation came from the idea of imposing a cool, arbitrary limitation on the graphical capabilities, because old EGA games were some of the first games I ever played as a kid.

You can access that old engine here!

sEGA Games

The original idea for the engine was a point-and-click adventure title called Borrowed Time (BT in the repo). I have nothing to show for this except for some scattered design documents, but the general premise involved using a pocket watch to traverse a clockwork, Majora's Mask-style slice of time and solve a murder (still waiting for my check from the Obra Dinn devs).

By the time the engine was up and running and the graphics all worked, I had “shifted” my idea to a turn-based tactical RPG called Shift, which involved going on runs by diving into other planes of existence via D&D-style color pools. This project didn't get a lot further, but it was influenced a lot by me being into DotA at the time and attempting to come up with new ways to accomplish complex deterministic combat resolution, which was a constant pain point in BladeQuest.

BOMBILLAS.BAS

I used sEGA a little later to make a clone of the old QBASIC game GORILLAS.BAS for the Giant Bomb Game Jam!

You can download and play it here and see it being played by the Giant Bomb staff here!!

Chronicles IV (1)

After taking a break, moving apartments, losing a lot of weight, and getting exceptionally into a tabletop game called Burning Wheel, I had the idea of using sEGA to try and make some kind of Burning Wheel, Morrowind, Ultima, completely unrealistic game.

A big part of the pitch was that the whole game world would have (hundreds of) years worth of history mapped out in scripts. You create a starting character whose background would determine their age and starting location. You would then pick and poke and interact with the world to try and cause the course of history to change to accomplish your goals. It was this incredibly ambitious idea of having an RPG character who could literally grow too old and die of natural causes, where learning new skills took months or years of training.

Despite this idea never really coming together or having much hope of turning into anything, it's something I tinkered and played with for three years. It had no ImGui or in-engine edit UI, but I still wanted to do all asset editing in-engine. This led to creating a Lua console and building the map editor into the running instance. It had very tight Lua integration for all of the actors. It's honestly wild to me just how much stuff this tech demo did in the end.

Here's some gifs from that project!

Thank you!

To everyone who has been following this project, it has been a joy to share my game's development with you!

Here's to a great third year!!

| 🌐 | 🙋‍ | @britown@blog.brianna.town

I've been actively developing this game for two years as of this month and have enjoyed keeping a tightly-curated blog of development progress and technical write-ups. But, as that body of work has grown, I've felt less and less easy about having that horse hitched to a platform I can't control or export from.

So now, welcome to The Ebonheim Chronicle! All posts and their content from the last two years have been migrated manually here, to a laptop in my office at home running WriteFreely, a great minimal blog app that also has ActivityPub federation!

I've added a line of links to the signature of every post with how to access the RSS feed or follow the blog on mastodon or your federated feed of choice!

| 🌐 | 🙋‍ | @britown@blog.brianna.town

Spent this week building the art and UI around items and equipment. Items are the key to progression in the game! I can't wait to have them actually start affecting combat in meaningful ways 😁

#gamedev #chron4 #pixelart

| 🌐 | 🙋‍ | @britown@blog.brianna.town