-Work in Progress: Gameplay-
Technical Designer / Writer / Voice Actor || FPS Puzzle-Platformer || Unreal 4.8 || 13 developers || 5 months
Scroll down for the scripting details
The Curator, including all story, narration, and Robot Behavior scripting in the game, came to exist during the final two months of the project. The entire narration system was implemented using Unreal 4's Blueprint visual scripting system.
Both The Curator and SHI use the Robot Behavior System (RBS), a complex set of Triggers, Blueprints, and Struct Queues that helps the Narrators time and execute their behavior and narration based on the player's location and actions. The RBS logic can be broken up into three main sections across three different Blueprints:
The Robot Behavior Queue, the Robot Update Triggers, and The Curator Robot Blueprint itself.
The Narrator System
The Robot Behavior Queue
Initially built by Henry Dai, this Blueprint houses a special Behavior Queue that uses a Dynamic FIFO Array of Structs that pop off and execute over time. Each struct feeds a large web of logic checks that call functions on the Robot Blueprint when specific values are updated. These nodes control every aspect of the Robot's personality.
Every "Add Robot Behavior" node takes in a large amount of potential behavior data, as well as a duration that dictates how long the current Behaviors have to complete before the next node is executed. The functions that interact with the Behavior Queue live in a separate Global Function Library, and as such can be accessed from any Level Blueprint.
This Blueprint also communicates with a special HUD widget that displays subtitles on screen. These subtitles are tied directly to the Robot Behavior Nodes, and as such are in rhythm with each narration sound file as they play. This system allowed the subtitles to play a role in the game's comedy, as funny lines can display the moment the player hears the narration to prevent spoiling the jokes early.
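For illustration, the queue's pop-and-execute loop can be sketched in plain C++. This is a simplified stand-in for the Blueprint graph, not the actual implementation: the RobotBehavior struct here carries only a subtitle, a duration, and a callback, whereas the real struct holds far more behavior data.

```cpp
#include <deque>
#include <functional>
#include <string>
#include <utility>

// Simplified stand-in for the Blueprint behavior struct.
struct RobotBehavior {
    std::string subtitle;             // text shown while this behavior runs
    float duration = 0.0f;            // seconds before the next struct pops
    std::function<void()> onExecute;  // calls into the Robot Blueprint
};

class RobotBehaviorQueue {
public:
    void Add(RobotBehavior b) { queue.push_back(std::move(b)); }

    // Called every frame with the world delta time. Pops and executes the
    // next behavior once the current behavior's duration has elapsed.
    void Tick(float deltaTime) {
        timer += deltaTime;
        if (!queue.empty() && timer >= currentDuration) {
            RobotBehavior next = std::move(queue.front());
            queue.pop_front();
            timer = 0.0f;
            currentDuration = next.duration;
            if (next.onExecute) next.onExecute();
        }
    }

    bool Empty() const { return queue.empty(); }

private:
    std::deque<RobotBehavior> queue;   // dynamic FIFO of behavior structs
    float timer = 0.0f;
    float currentDuration = 0.0f;      // first behavior executes immediately
};
```

Ticking the queue with the world delta time each frame reproduces the node-by-node timing: each struct's duration controls how long the current behaviors run before the next struct pops.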
The Robot Update Triggers
The Behavior Queue allowed functions to be executed in sequence with controllable delays, but the Narrators needed to react to the player's actions and progression through the levels. To do this, I created a special Blueprint called the Robot Update Trigger (the red boxes shown above) that would add specific Behavior Structs to the Queue when touched by specific game entities, such as the Player or Robot. These triggers served as the primary point of contact between the RBS and the Player.
I wrote a special macro that would detect Overlap Events with the Robot Update Triggers, confirm that the Trigger both could be activated and was touched by a valid Entity, and then call large chains of logic containing Add Behavior nodes and special in-level events. These triggers allowed The Curator to follow the player through the level and react to their behavior in interesting ways.
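The validation step of that macro boils down to something like the following sketch. The names (Entity, RobotUpdateTrigger, OnOverlap) are illustrative, not actual UE4 types, and the real macro goes on to fire the Add Behavior chains and level events on success.

```cpp
// Illustrative entity classification; the real check inspects the
// overlapping Actor in the Blueprint graph.
enum class Entity { Player, Robot, Other };

struct RobotUpdateTrigger {
    bool enabled = true;     // a trigger can be disabled after it fires
    bool playerOnly = true;  // which entities may activate this trigger

    // Returns true when the overlap should fire the trigger's behavior
    // chain: the trigger must be active and touched by a valid entity.
    bool OnOverlap(Entity who) {
        if (!enabled) return false;
        bool valid = (who == Entity::Player) ||
                     (!playerOnly && who == Entity::Robot);
        if (!valid) return false;
        enabled = false;  // one-shot: prevents re-firing on repeat touches
        return true;
    }
};
```

Making the triggers one-shot by default keeps a player who paces back and forth through a trigger from re-queuing the same narration chain.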
The Curator Robot Blueprint
This is where the magic happens. The Robot Blueprint contains all logic for The Curator's Narration, movement, animation, and appearance, and is referenced by several Blueprints that appear en masse throughout the game. Let's break it down:
His Movement
The Robot traverses the halls of the GoRG using two main methods: special Splines (Robot Paths) and Manual Transition functions.
These splines are fairly standard point-composed 3D curves. Every frame, if the Curator should be moving and has no special traversal instructions, he adds the world delta time multiplied by a movement speed variable to his distance along the current spline. Sampling the spline at that distance returns a vector that he uses to set his World Position just a little further up the path. This allowed for fine control of the Curator's movement through every level.
Certain splines tell the Robot to 'Follow the Player', and use a basic algorithm to efficiently determine the closest point on the Robot Path to the player's current location. This allows the Curator to follow and observe the player in specified areas, adding believability to his character.
There are a few moments in the game where the Curator needs to break formation and reach a point in the world as quickly as possible. To accomplish this, I wrote a basic function into his existing movement logic that allows smooth transitions between points in the world and points on Robot splines. This prevents the Curator from ever teleporting unless the system knows the player is out of a specific area.
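The per-frame spline advance can be sketched in plain C++ as follows. A polyline stands in for UE4's spline sampling (the engine's USplineComponent does the curve math in practice), and all names here are illustrative.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static float Dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct RobotPath {
    std::vector<Vec3> points;  // the spline's control points, in order

    float Length() const {
        float len = 0.0f;
        for (std::size_t i = 1; i < points.size(); ++i)
            len += Dist(points[i - 1], points[i]);
        return len;
    }

    // Linear stand-in for sampling a location at a distance along the spline.
    Vec3 LocationAtDistance(float d) const {
        for (std::size_t i = 1; i < points.size(); ++i) {
            float seg = Dist(points[i - 1], points[i]);
            if (d <= seg) {
                float t = (seg > 0.0f) ? d / seg : 0.0f;
                const Vec3& a = points[i - 1];
                const Vec3& b = points[i];
                return {a.x + (b.x - a.x) * t,
                        a.y + (b.y - a.y) * t,
                        a.z + (b.z - a.z) * t};
            }
            d -= seg;
        }
        return points.back();  // clamp past the end of the path
    }
};

// Per-frame update: advance distance by speed * delta time, then set the
// robot's world position to the matching point on the spline.
Vec3 TickAlongPath(const RobotPath& path, float& distance,
                   float speed, float deltaTime) {
    distance = std::min(distance + speed * deltaTime, path.Length());
    return path.LocationAtDistance(distance);
}
```

Tracking a single "distance along spline" float, rather than a raw position, is what makes the fine control possible: speeding up, pausing, or rewinding the Curator is just arithmetic on one number.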
His Animation (Emojis)
During the last three weeks of the project, to enhance The Curator's personality, I whipped up a basic set of Timeline-based Animations (called Emojis) and integrated them into the RBS. The Timelines blend smoothly with his pre-existing actions, such as his constant hover-bob, and never play over each other, preventing unintended or unpolished behavior.
These Robo-Emojis added a lot to his character, and greatly enhanced his overall performance as the game's narrator.
His Look Direction
Because The Curator is not a physics object, he has no inherent velocity in the Engine. To have him look towards his movement direction, I simply subtracted his world location last frame from his location this frame and turned the normalized resulting vector into a rotation.
The Curator is always looking at something. Every frame, he interpolates between his current rotation and a rotation looking towards his current view target until they match. He also has a few special functions that set his view target to specific Actors, the Player, or his current movement direction.
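Both look-direction tricks reduce to a few lines each; here is a sketch in plain C++ using a single yaw angle for clarity. The names are illustrative, not UE4 API, and the interpolation is merely similar in spirit to the engine's rotation-interp helpers.

```cpp
#include <algorithm>
#include <cmath>

struct Vec2 { float x, y; };

// Yaw (radians) facing from last frame's position toward this frame's.
// This substitutes for velocity, which the Curator does not have.
float YawFromMovement(Vec2 lastPos, Vec2 currentPos) {
    return std::atan2(currentPos.y - lastPos.y, currentPos.x - lastPos.x);
}

// Frame-rate-aware easing of the current yaw toward a target yaw.
float InterpYaw(float current, float target, float speed, float deltaTime) {
    float alpha = std::min(speed * deltaTime, 1.0f);
    return current + (target - current) * alpha;
}
```

Running InterpYaw every frame against whatever the current view target's yaw is produces the "always looking at something" behavior; swapping the target is all a view-target function needs to do.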
His Face
The Curator's face is a Dynamic Material Instance, meaning it can be changed at runtime by updating special scalar, vector, and texture sample parameters. The ultimate result is that the RBS can tell him to change his expression during narration or animations. Each expression has a specific bloom value and color associated with it, creating a rainbow of emotion when The Curator speaks.
His Narration
The Curator's Narration is all played from an Audio Component on the Robot Blueprint. The functions that play the narration Sound Wavs ensure that two narrations never play at once from the same character, and also slightly raise the pitch of the .wav files to keep his voice consistent.
This is the one section of the RBS that SHI also uses. A special function takes an Ambient Sound actor reference from the Level Blueprint and plays special Sound Wavs through it with modifiers for added reverb. This allows SHI to literally speak through 'speakers' around each level, and ensures that her narrations sound like they are being played through an intercom system.
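The playback rules for narration (one line at a time per character, with a slight pitch raise) can be sketched like this. PlaySound-style details are abstracted away; the class, its members, and the specific pitch value are illustrative assumptions, not the shipped implementation.

```cpp
#include <algorithm>
#include <string>

class NarrationPlayer {
public:
    // Starts a narration line; returns false (and plays nothing) if a
    // line is still running, so two narrations never overlap.
    bool Play(const std::string& soundWav, float clipSeconds) {
        if (remaining > 0.0f) return false;
        current = soundWav;
        // A pitch multiplier > 1 raises the voice and shortens playback.
        remaining = clipSeconds / pitchMultiplier;
        return true;
    }

    void Tick(float deltaTime) {
        remaining = std::max(0.0f, remaining - deltaTime);
    }

    bool IsPlaying() const { return remaining > 0.0f; }

private:
    std::string current;               // the line currently playing
    float remaining = 0.0f;            // seconds of playback left
    float pitchMultiplier = 1.05f;     // illustrative slight pitch raise
};
```

Giving each character (The Curator, SHI) its own NarrationPlayer-style guard enforces the one-line-at-a-time rule per character while still letting the two narrators talk over each other for comedic effect.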
His Personal Space
During development, it became clear that there were several points in Gravitas where players both had and took the opportunity to try to touch the Curator. This took many forms, most notably jumping on his face while he was following the player on a spline. His lack of a reaction, while humorous, was ultimately out of tune with his character, so I created a Personal Space algorithm that allowed him to dynamically avoid/dodge the player when they tried to touch him.
While this was enabled (which was sparingly, given its performance cost), the Curator would send out a web of raycasts every 0.1 seconds that, when averaged together, created a dynamic local offset from his root Scene Component. This allowed him to quickly float away from the player when they entered his "personal space bubble" collider. I also scripted the Curator to perform a backflip whenever a player touched his face, kicking off any pupils that managed to land on him. These two new behaviors felt good, enhanced his character, and prevented a lot of wacky bugs later in development.
This comprehensive system gave The Curator a realized personality and allowed him to be a useful point of guidance and entertainment for the player. The node-by-node duration variable offered complete control over the timing of all Narrator actions, which created a good comedic rhythm.
After working on the narration for Gravitas for about 1 month, I had developed a very fast and reliable workflow to implement narration in a level. After finishing a level in the game's script, there were four main steps to get The Curator and SHI up and running.
The Narration Integration Workflow
It always began with the splines. Physically drawing the paths of The Curator's movement through the 3D spaces of each level helped me map out his progression through the level alongside the player. The splines also indicated what he would be seeing at each moment and how he might react to the player's actions, which affected the script on multiple occasions.
Now that I could see The Curator's path through the level, the next step was to place the web of Robot Update Triggers that would pull him through the level with the player. I hooked up each trigger to the Level Blueprint with a single Robot Behavior node that displayed a test word on screen to confirm it was firing properly. I also rigged each trigger to actually move The Curator through the level as the player progressed, and had him update his view targets to placeholder Actors so he wouldn't just stare forward blankly the entire time. Once he could reach the end of the level with them, it was time for step 3.
Step 3 is the largest step by far. Once The Curator could move through the level, respond to player actions, and affect the world based on the player's progress, what remained was adding the actual narration. I would start by adding the appropriate narration sound files to the placeholder nodes from step 2. After adding each Sound Wav, I would copy the narration text from the script document and paste it into the subtitle section of the first Behavior Node for each Update Trigger. I would then break the text down into on-screen chunks of no more than a sentence each and give each node an estimated duration for how long that text should remain on screen; after a few weeks of practice, my estimates were usually within 0.8 seconds. At this point, I would play through each chain of narration repeatedly, updating the duration of each subtitle node to line up with the actual sound to within a fifth of a second. Finally, I would add first-pass mood changes to update The Curator's face with the appropriate emotion for what he was saying at each node.
After completing The Curator's behavior logic, I added the Ambient Sound Actors to the level that would act as SHI's speakers. Her lines were played using the same logic chains as The Curator, so the only thing that needed adding was the references to the speakers themselves so that each of her narrations would play from the correct world location. After this long process, the level now had first pass narration.