Gravitas

-Work in Progress: Gameplay-
Technical Designer / Writer / Voice Actor || FPS Puzzle-Platformer || Unreal 4.8 || 13 developers || 5 months


SCRIPTING

The Curator, along with all story, narration, and Robot Behavior scripting in the game, came to exist during the final two months of the project. The entire narration system was implemented using Unreal 4's Blueprint visual scripting system.

Both The Curator and SHI use the Robot Behavior System (RBS), a complex set of Triggers, Blueprints, and Struct Queues that helps the Narrators time and execute their behavior and narration based on the player's location and actions. The RBS logic can be broken up into three main sections across three different Blueprints:

The Robot Behavior Queue, the Robot Update Triggers, and The Curator Robot Blueprint itself.

 

The Narrator System

The Robot Behavior Queue

 

Initially built by Henry Dai, this Blueprint houses a special Behavior Queue that uses a Dynamic FIFO Array of Structs that pop off and execute over time. Each struct has a large web of logic checks that call functions on the Robot Blueprint when specific values are updated. These nodes control literally every aspect of the Robot's personality.

Every "Add Robot Behavior" node takes in a large amount of potential behavior data as well as a duration that dictates how long the current Behaviors have to complete before the next node is executed. The functions to interact with the Behavior Queue exist in a separate Global Function Library, and as such can be accessed from any Level Blueprint.
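Since the Blueprint graph itself can't be shown in text, here is a minimal engine-agnostic sketch of how such a timed FIFO behavior queue works; the names (RobotBehavior, BehaviorQueue, OnExecute) are illustrative, not the actual node or variable names from the project:

```cpp
#include <deque>
#include <functional>
#include <string>

// One queued behavior: what to show/play, and how long it holds the queue.
struct RobotBehavior {
    std::string subtitle;   // text shown by the HUD widget
    std::string narration;  // sound cue to play ("" = none)
    float duration = 0.f;   // seconds before the next behavior may pop
};

class BehaviorQueue {
public:
    std::function<void(const RobotBehavior&)> OnExecute;  // robot callback

    // Equivalent of an "Add Robot Behavior" call from a Level Blueprint.
    void Add(RobotBehavior b) { queue_.push_back(std::move(b)); }

    // Called every frame with the world delta time. Starts the front
    // behavior, then pops it once its duration has elapsed (FIFO order).
    void Tick(float dt) {
        if (queue_.empty()) return;
        if (!running_) {
            running_ = true;
            elapsed_ = 0.f;
            if (OnExecute) OnExecute(queue_.front());
        }
        elapsed_ += dt;
        if (elapsed_ >= queue_.front().duration) {
            queue_.pop_front();
            running_ = false;
        }
    }

private:
    std::deque<RobotBehavior> queue_;
    bool running_ = false;
    float elapsed_ = 0.f;
};
```

The duration field is what gives node-by-node control over timing: nothing after the current Struct can execute until its time is up.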

This Blueprint also communicates with a special HUD widget that displays subtitles on screen. These subtitles are tied directly to the Robot Behavior Nodes, and as such stay in rhythm with each narration sound file as it plays. This system allowed the subtitles to play a role in the game's comedy, as funny lines display the moment the player hears the narration rather than spoiling the jokes early.

The Robot Update Triggers

 

The Behavior Queue allowed functions to be executed in sequence with controllable delays, but the Narrators needed to react to the player's actions and progression through the levels. To do this, I created a special Blueprint called the Robot Update Trigger (the red boxes shown above) that would add specific Behavior Structs to the Queue when touched by specific game entities, such as the Player or Robot. These triggers served as the primary point of contact between the RBS and the Player.

I wrote a special macro that would detect Overlap Events with the Robot Update Triggers, confirm that the Trigger both could be activated and was touched by a valid Entity, and then call large chains of logic containing Add Behavior nodes and special in-level events. These triggers allowed The Curator to follow the player through the level and react to their behavior in interesting ways.
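As a rough sketch of that confirmation step, reduced to plain C++ (the enum and field names here are assumptions, not the macro's actual variables):

```cpp
// Minimal stand-in for the kinds of entities that can overlap a trigger.
enum class Entity { Player, Robot, Other };

struct RobotUpdateTrigger {
    bool armed = true;                    // can this trigger still activate?
    Entity validEntity = Entity::Player;  // who is allowed to activate it
    bool fireOnce = true;                 // most triggers fire a single time

    // Returns true when the behavior chain should run, mirroring the
    // "confirm the Trigger could be activated and was touched by a valid
    // Entity" step of the macro.
    bool OnOverlap(Entity who) {
        if (!armed || who != validEntity) return false;
        if (fireOnce) armed = false;  // disarm so it cannot re-trigger
        return true;
    }
};
```

Only after this gate passes would the trigger's Add Behavior chain and in-level events actually execute.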

The Curator Robot Blueprint

This is where the magic happens. The Robot Blueprint contains all logic for The Curator's Narration, movement, animation, and appearance, and is referenced by several Blueprints that appear en masse throughout the game. Let's break it down:

His Movement (Robot Paths and Manual Transitions)

His Animation (Emojis): these Robo-Emojis added a lot to his character and greatly enhanced his overall performance as the game's narrator.

His Look Direction

His Face

His Narration

His Personal Space

The Robot traverses the halls of the GoRG using two main methods: special Splines (Robot Paths) and Manual Transition functions.

These splines are fairly standard point-composed 3D curves. Every frame, if the Curator should be moving and has no special traversal instructions, he adds the world delta time multiplied by a movement speed variable to his distance along the current spline. Sampling the spline at that new distance returns a vector that he uses to set his World Position just a little further up the path. This allowed for fine control of the Curator's movement through every level.

Certain splines tell the Robot to 'Follow the Player', and use a basic algorithm to efficiently determine the closest point on the Robot Path to the player's current location. This allows the Curator to follow and observe the player in specified areas to add believability to his character.
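Both movement modes above can be sketched engine-agnostically, with a simple polyline standing in for Unreal's spline component; all names here are illustrative, and a real spline offers finer queries than the coarse sampling used for the follow behavior:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static float Dist(const Vec3& a, const Vec3& b) {
    float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// A polyline Robot Path standing in for Unreal's spline component.
struct RobotPath {
    std::vector<Vec3> points;  // control points, in traversal order

    float Length() const {
        float len = 0.f;
        for (std::size_t i = 1; i < points.size(); ++i)
            len += Dist(points[i - 1], points[i]);
        return len;
    }

    // Rough stand-in for GetLocationAtDistanceAlongSpline.
    Vec3 LocationAtDistance(float d) const {
        for (std::size_t i = 1; i < points.size(); ++i) {
            float seg = Dist(points[i - 1], points[i]);
            if (d <= seg && seg > 0.f) {
                float t = d / seg;
                const Vec3 &a = points[i - 1], &b = points[i];
                return { a.x + (b.x - a.x) * t,
                         a.y + (b.y - a.y) * t,
                         a.z + (b.z - a.z) * t };
            }
            d -= seg;
        }
        return points.back();
    }
};

// Per-frame movement: add speed * delta time to the distance along the
// current path and sample the new world position, as described above.
Vec3 TickAlongPath(const RobotPath& path, float& distance, float speed, float dt) {
    distance = std::fmin(distance + speed * dt, path.Length());
    return path.LocationAtDistance(distance);
}

// 'Follow the Player': find the distance along the path whose location is
// closest to the player, here by coarse sampling.
float ClosestDistanceOnPath(const RobotPath& path, const Vec3& player,
                            float step = 0.25f) {
    float bestD = 0.f, bestDist = Dist(path.LocationAtDistance(0.f), player);
    for (float d = step; d <= path.Length(); d += step) {
        float cur = Dist(path.LocationAtDistance(d), player);
        if (cur < bestDist) { bestDist = cur; bestD = d; }
    }
    return bestD;
}
```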

There are a few moments in the game where the Curator needs to break formation and reach a point in the world ASAP. To accomplish this, I wrote a basic function into his existing movement logic to allow smooth transitioning between points in the world and on Robot splines. This prevents the Curator from ever teleporting unless the system knows the player is out of a specific area.

During the last three weeks of the project, to enhance The Curator's personality, I whipped up a basic set of Timeline-based Animations (called Emojis) and integrated them into the RBS. The Timelines smoothly blend with his pre-existing actions, such as his constant hover-bob, and never play over each other to prevent unintended or unpolished behavior.

The Curator is not a physics object and has no inherent velocity in the Engine, so there was nothing to read when he needed to look towards his movement direction. Instead, I simply had him subtract his world location last frame from his current world location, and turned the normalized resulting vector into a rotation.

The Curator is always looking at something. Every frame, he interpolates between his current rotation and a rotation looking towards his current view target reference until the two match. He also has a few special functions that set his view target to specific Actors, the Player, or his current movement direction.
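Reduced to a single yaw angle, the two look behaviors above might be sketched like this (all names are assumptions; in-engine, Unreal's rotation interpolation helpers play the role of InterpYaw):

```cpp
#include <cmath>

// Yaw (degrees) that faces from 'from' toward 'to' in the XY plane.
float YawTowards(float fromX, float fromY, float toX, float toY) {
    return std::atan2(toY - fromY, toX - fromX) * 180.f / 3.14159265f;
}

// Movement direction from positions only: current frame minus last frame,
// as in the no-velocity workaround described above.
float YawFromMovement(float lastX, float lastY, float curX, float curY) {
    return YawTowards(lastX, lastY, curX, curY);
}

// Per-frame interpolation of the current yaw toward the target yaw,
// at most 'speed' degrees per second.
float InterpYaw(float current, float target, float dt, float speed) {
    float delta = target - current;
    while (delta > 180.f) delta -= 360.f;   // take the short way around
    while (delta < -180.f) delta += 360.f;
    float stepMax = speed * dt;
    if (std::fabs(delta) <= stepMax) return target;
    return current + (delta > 0.f ? stepMax : -stepMax);
}
```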

The ultimate result is that the RBS can tell him to change his expression during narration or animations. Each expression has a specific bloom value and color associated with it, creating a rainbow of emotion when The Curator speaks. 

The Curator's Narration is all played from an Audio Component on the Robot Blueprint. The functions that play the narration Sound Wavs ensure that there are never two narrations playing at once from the same character, and also slightly raise the pitch of the .wav files to make his voice sound consistent.
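A minimal sketch of that playback guard, with assumed names and an illustrative pitch multiplier (the real values lived in the Blueprint):

```cpp
#include <string>

// Stand-in for the Audio Component on the Robot Blueprint.
struct AudioComponent {
    bool playing = false;
    float pitch = 1.f;
    std::string current;
};

// Plays a narration only if none is already running on this character,
// and applies a slight pitch raise for vocal consistency, as described.
// Returns false (and plays nothing) when a narration is in progress.
bool PlayNarration(AudioComponent& audio, const std::string& wav,
                   float pitchMult = 1.05f) {
    if (audio.playing) return false;
    audio.playing = true;
    audio.pitch = pitchMult;
    audio.current = wav;
    return true;
}
```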

This is the one section of the RBS that SHI also uses. There is a special function that takes an Ambient Sound actor reference from the Level Blueprint and plays special Sound Wavs through it with modifiers for added reverb. This allows SHI to literally speak through 'speakers' around each level, and ensures that her narrations sound like they are being played through an intercom system.


The Curator's face is a Dynamic Material Instance, meaning that it can be changed at runtime by updating special scalar, vector, and texture sample parameters.

His Personal Space

During development, it became clear that there were several points in Gravitas where players both had and took the opportunity to try to touch the Curator. This took many forms, the most notable of which was jumping on his face while he was following the player on a spline. His lack of a reaction, while humorous, was ultimately out of tune with his character, so I created a Personal Space algorithm that allowed him to dynamically avoid and dodge the player when they tried to touch him.

Basically, while this was enabled (which was sparingly, given its performance cost), the Curator would send out a web of raycasts every 0.1 seconds that, when averaged together, created a dynamic local offset from his root Scene Component. This allowed him to quickly float away from the player when they entered his "personal space bubble" collider. I also scripted the Curator to perform a backflip whenever a player touched his face, kicking off any pupils that managed to land on him. These two new behaviors felt good, enhanced his character, and prevented a lot of wacky bugs from happening in later development.
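A 2D sketch of that averaging step, with the ray test abstracted into a callback standing in for Unreal's line traces (the names, ray count, and callback signature are all illustrative):

```cpp
#include <cmath>
#include <functional>

struct V2 { float x, y; };

// 'blockedAt' reports, for a given direction, the hit distance (or any
// value >= bubbleRadius when the ray is clear). Hypothetical signature.
V2 PersonalSpaceOffset(const std::function<float(V2)>& blockedAt,
                       float bubbleRadius, int rayCount = 8) {
    V2 offset{0.f, 0.f};
    int hits = 0;
    for (int i = 0; i < rayCount; ++i) {
        float a = 2.f * 3.14159265f * i / rayCount;  // evenly spread rays
        V2 dir{std::cos(a), std::sin(a)};
        float d = blockedAt(dir);
        if (d < bubbleRadius) {             // something is in the bubble:
            float push = bubbleRadius - d;  // push away, harder when closer
            offset.x -= dir.x * push;
            offset.y -= dir.y * push;
            ++hits;
        }
    }
    // Average the pushes into one local offset from the root component.
    if (hits > 0) { offset.x /= hits; offset.y /= hits; }
    return offset;
}
```

Re-evaluating this every 0.1 seconds (rather than every frame) is what kept the cost of all those traces manageable.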

This comprehensive system gave The Curator a realized personality and allowed him to be a useful point of guidance and entertainment for the player. The node-by-node duration variable offered complete control over the timing of all Narrator actions, which created a good comedic rhythm.

After working on the narration for Gravitas for about 1 month, I had developed a very fast and reliable workflow to implement narration in a level. After finishing a level in the game's script, there were four main steps to get The Curator and SHI up and running.

 

The Narration Integration Workflow

1. It always began with the splines. Physically drawing the paths of The Curator's movement through the 3D spaces of each level helped me map out his progression through the level alongside the player. The splines also indicated what he would be seeing at each moment and how he might react to the player's actions, which affected the script on multiple occasions.

2. Now that I could see The Curator's path through the level, the next step was to place the web of Robot Update Triggers that would pull him through the level with the player. I hooked up each of the triggers to the Level Blueprint with a single Robot Behavior node displaying a test word on screen to confirm they were triggering properly. I also rigged each trigger to actually move The Curator through the level as the player progressed, and had him update his view targets to placeholder Actors so he wouldn't just stare forward blankly the entire time. When he could reach the end of the level with them, it was time for step 3.

3. Step 3 is the largest step by far. Once The Curator can move through the level, respond to player actions, and affect the world based on the player's progress, what remains is adding the actual narration. I would start by adding the appropriate narration sound files to the placeholder nodes from step 2. After adding each Sound Wav, I would copy the narration text from the script document and paste it into the subtitle section of the first Behavior Node for each Update Trigger. After this, I would break the text down into on-screen chunks of no more than a sentence each and give each node an estimated duration for how long that text should remain on screen. (I was usually no more than 0.8 seconds off after a few weeks.) At this point, I would play through the game repeatedly for each chain of narration, updating the duration of each subtitle node to line up with the actual sound to within a fifth of a second. Then, I would add first-pass mood changes to update The Curator's face with the appropriate emotion for what he was saying at each node.

After completing The Curator's behavior logic, I added the Ambient Sound Actors to the level that would act as SHI's speakers. Her lines were played using the same logic chains as The Curator, so the only thing that needed adding was the references to the speakers themselves so that each of her narrations would play from the correct world location. After this long process, the level now had first pass narration.

4. The final step is entirely about polish. Now that the narrators exist in the level and have triggerable behavior, all that remains is tweaking small things: updating narration to finish before the player has a chance to interrupt it, adjusting splines so they don't clip through meshes, changing The Curator's movement speed between certain areas, updating emotions to flow better with the narration, and so on. Once the Robot Emojis came into being, a large part of step 4 became adding them in to create more personality and flair in the way The Curator moved through the levels and reacted to the player's progression.

After these four steps (with various steps being repeated as levels evolved and new narration was added), each level had a responsive and believable set of narrators that helped push the narrative and player experience forward.

In order to both further sell The Curator's character and create interesting Wow moments and gameplay setpieces, I wrote several ideas for cinematic moments at underdeveloped points in the game and proposed them to the leads. For the ideas that were well received, I worked with other departments (primarily Level Design and Art) to implement them. The most notable examples are the 'light activation sequences' and the cinematic opening events of Galleries 1, 2, and 3.

 

Creating Scripted Cinematic Sequences

During our initial prototyping phase after switching to our third new game idea, the Gravity Field itself (which is essentially a tube that tells physics-enabled primitive components within it to pull towards its normal) felt very static and hacked-together. Since it had been built as a quick hack, the game was difficult to test and levels were difficult to design because the main mechanic felt so inconsistent and rough. To help out, I offered to give the Gravity Field Blueprints a polish pass so the Programmers could worry about other things. The main issues I ended up addressing were that the fields felt sluggish and viscous, the player felt too floaty in and out of fields, and objects behaved unpredictably at intersections between Gravity Fields.

 

Light Activation Sequences (Galleries 1, 2, and 3)

Every new Gallery began with a small viewing room with a large window, both allowing the player to survey the new area and creating a brief moment for The Curator to greet the player and introduce any new gameplay elements.

Special Gallery Introduction Events (SPOILERS)

Gallery 1: ASCENSION

To set the precedent that cinematic moments happen in viewing rooms throughout Gravitas, the first Gallery has one of the largest Wow moments in the game. As the player walks in, The Curator greets them and introduces the current piece. As he says the phrase "ASCENSION", the lights in the main room begin coming on in sequence as a large set of rectangular pillars rises out of the floor. The last of the spotlights to activate shines on the front of the pillar, illuminating the word 'Ascension' welded onto the central prism. This moment evolved from an original plan to have a large hand bring the Gravity Glove down from the heavens, raising the player up to a Garden of Eden once they acquired it. The actual centerpiece movement was scripted by one of the other Level Designers, Shih-Cheng "Ross" Huang, and I scripted the light activation sequence and positioned the spotlights around the level to highlight the centerpiece's movement and the Gallery title.

Gallery 2: Now You Can DIE

Up until the second Gallery of the game, death is not possible. Since we used this level to introduce the lasers, which act as the main hazard in the game, I decided to have The Curator be very excited about surprising you with a new challenge and make a whole show of it. When the player enters the viewing room in this Gallery, The Curator turns to them, comments on having a surprise, giggles, and turns to look out into the first room. The moment he tells the player the title is "Now You Can DIE", spotlights illuminate the title welded to a far wall, the room fills with lasers, and the rest of the lights slowly come on in sequence. This moment ended up being both humorous and informative, since players now understood that death was possible. Our Game Designer (Team Lead) Tyler Morgan scripted the laser activation sequence, and I scripted the light activation and title illumination sequences.

Gallery 3: SURPRISE

When you enter the viewing room of the third Gallery, The Curator is absent and the world is quiet. This is strange, and immediately makes players explore. The moment they get close to the large window, The Curator quickly flies up from below the window on the opposite side, yells "I CALL THIS ONE SURPRISE!", and disappears below the window frame; a giant spotlight comes on illuminating the word SURPRISE welded to a far wall, and a giant cube swings through the window on a cable and breaks the glass. Most players have a small heart attack. The music and Gallery lights immediately begin coming on, and The Curator tells the player to appreciate his work while they're still high on adrenaline. This sequence took a lot of people to get right, but I ultimately added the final pieces of polish that made it work really well. Originally the Gallery title was to be on the giant cube itself, but it moved so fast that no one could read it.

During the development of the overall narrative and script, I designed several cinematic introductions to Galleries and asked the team leads to assist in creating them. These are some of the coolest moments in the game and are designed as surprises, so if you haven't played it yet and want to, be cautious about reading on. Otherwise, download and play it!


Gravity Field Polish

To maximize the 'reveal' moment of the more visually impressive Galleries, I wrote a macro that would toggle the visibility of certain Actors in each level in random sequence with a random delay. This allowed spotlights and point lights to illuminate the first viewable area of several Galleries in a cinematic fashion, combining with other events to create a sense of theatricality. Pairing these sequences with The Curator's narration both added to his character as a dramatic artist and helped sell the identity of the Gallery itself.
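The macro itself was Blueprint, but its behavior can be sketched as building a randomized activation schedule (the names, delay range, and seed are illustrative):

```cpp
#include <algorithm>
#include <random>
#include <string>
#include <vector>

// One scheduled visibility toggle: which actor, and how long to wait
// before flipping it on.
struct ScheduledToggle { std::string actor; float delay; };

// Shuffle the actors into a random order and give each a random delay,
// mirroring the reveal macro described above.
std::vector<ScheduledToggle> BuildLightSequence(std::vector<std::string> actors,
                                                float minDelay, float maxDelay,
                                                unsigned seed) {
    std::mt19937 rng(seed);
    std::shuffle(actors.begin(), actors.end(), rng);  // random sequence
    std::uniform_real_distribution<float> delay(minDelay, maxDelay);
    std::vector<ScheduledToggle> seq;
    for (const auto& a : actors) seq.push_back({a, delay(rng)});
    return seq;
}
```

In-engine, each entry would fire a delayed visibility toggle on the corresponding spotlight or point light Actor.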

Originally, the Gravity Field grabbed everything it touched and applied three different forces to each physics object: the main gravity force, an angular dampening force, and a velocity dampening force. After poking around in the Blueprints, I found that the dampening forces, which ended up being very necessary for consistent mechanical behavior, were simply far too strong and static. When the player jumped into a field that, in theory, changed 'down' to the reverse of its surface normal, it felt like falling into mud and slowly drifting down a lazy river. The solution was simply decreasing the ratio of angular dampening force to an object's entrance velocity (so there was a little drift upon falling into a field, but never enough to fall out) and reducing the velocity dampening force dramatically. This made the Gravity Field feel like the perfect union between an extreme gravity shift and a slow tractor beam.
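In spirit, the polished per-tick force application looked something like this one-dimensional sketch (the constants are illustrative, not the shipped Blueprint values):

```cpp
// A physics object reduced to one linear and one angular component.
struct Body { float vel; float angVel; };

// Per-tick Gravity Field forces on one object: the main pull along the
// field normal, plus dampening terms scaled well down so entering a field
// no longer feels like falling into mud.
void TickInField(Body& b, float gravity, float dt,
                 float velDamping = 0.05f, float angDamping = 0.2f) {
    b.vel += gravity * dt;              // main gravity force
    b.vel -= b.vel * velDamping * dt;   // velocity dampening (much weaker)
    b.angVel -= b.angVel * angDamping * dt;  // angular dampening
}
```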

1. Why are the Gravity Fields filled with invisible space molasses?

2. Does the player just... always have a parachute?

This issue was actually extremely trivial. I discovered that Unreal 4's default 'terminal velocity' for all physics objects in a level is far lower than actual terminal velocity, resulting in a very slow and oddly controlled fall. I informed the team of this, and we added a physics volume that raised the maximum fall speed to every level in the game.

This issue simply stemmed from there being no edge-case handling for physics objects existing in more than one Gravity Field at once. I spent a long time in a special testing level trying to implement specific physics forces that would let Fields hand objects off between them at angles between 80 and 100 degrees, but launch objects at other intersection angles; the logic became too complex for how little gameplay the launching would create. Ultimately, the best solution for time and performance was to only allow physics objects to be affected by one Gravity Field at a time. This made transitioning objects between fields very responsive, and eliminated the possibility of unpredictable object behavior due to accumulated forces.
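The one-field-at-a-time rule can be sketched as a simple ownership claim on each physics object (names are assumptions, not the project's actual Blueprint variables):

```cpp
#include <string>

// Per-object record of which Gravity Field currently owns it.
struct GravityTarget {
    std::string owner;  // "" = unowned

    // Called when the object overlaps a field. Only an unowned object
    // (or the current owner re-claiming) succeeds; a second field's claim
    // is ignored, so intersecting fields never stack their forces.
    bool Claim(const std::string& field) {
        if (!owner.empty() && owner != field) return false;
        owner = field;
        return true;
    }

    // Called when the object fully leaves a field, freeing it for the next.
    void Release(const std::string& field) {
        if (owner == field) owner.clear();
    }
};
```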

3. Don't cross the Fields


While this polish occurred very early in Gravitas's life cycle and the Programming Department continued to work on the fields afterwards, these changes set the foundation for the feel and performance of the final Gravity Fields that shipped with the game.

Bringing the Curator to life was an extremely intense and fun process that ended up adding a lot to Gravitas as an experience. Being mainly responsible for a major element of a 13-person team project was both challenging and rewarding, and allowed me to wear multiple hats during the last half of development. I very much hope for the chance to design and implement game characters in the future, and might integrate the Curator into future individual works to further flesh out his personality.