In this post we’re going to create an AI Character that hears our player’s footsteps and follows them. If you are interested in creating a “Seeing” sense for your AI, check out this post. Before we start, here is the end result:

To achieve the above functionality, we’re going to need a couple of things:

  1. A playable character with a PawnNoiseEmitter component, which will be used to “report” into the game that a sound has been played.
  2. A Controller for our AI character
  3. A custom AI Character which will contain a reference for our Behavior Tree and a PawnSensing component
  4. A Behavior Tree for our AI Logic
  5. A Blackboard to store values for our Behavior Tree
  6. A footstep sound (Optional)

While reading this list you will notice that the footstep sound is optional. This is because in UE4 we explicitly tell the game that we’ve played a sound somewhere with a certain volume; the engine itself doesn’t care whether we have actually played a sound or not. In case you want to include the same footstep sound I’ve used in the video above, here is the download link. In order to get started, create a Third Person C++ Template project.

Before we even begin to write any code, right when your template project loads up, add a nav mesh bounds volume to cover your whole level!

navmesh


Setting up our playable Character

To set up our character, open up its header file and add the following include right before the .generated.h include:

#include "Perception/PawnSensingComponent.h"

Then, add the following declarations:
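Something along these lines will do; PawnNoiseEmitterComp is just the name I’ll use below, and the UFUNCTION specifier makes ReportNoise callable from our Animation Blueprint later on:

protected:

    /* The component which reports our noise events to the engine */
    UPROPERTY(VisibleAnywhere)
    UPawnNoiseEmitterComponent* PawnNoiseEmitterComp;

public:

    /* Plays the given sound and reports it to the engine so AI can hear it */
    UFUNCTION(BlueprintCallable, Category = "AI")
    void ReportNoise(USoundBase* SoundToPlay, float Volume);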

Then, switch to your character’s source file and inside your constructor, type in the following initialization:
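The only thing we need to initialize is the noise emitter component; a minimal sketch looks like this (replace AMyCharacter with your own class name and keep the template’s existing initialization as-is):

AMyCharacter::AMyCharacter()
{
    // ...the template's existing initialization stays here...

    // Create the component that will report our footstep sounds to the engine
    PawnNoiseEmitterComp = CreateDefaultSubobject<UPawnNoiseEmitterComponent>(TEXT("PawnNoiseEmitterComp"));
}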

Moreover, type in the following implementation of the ReportNoise function:
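Here is a sketch of that implementation. PlaySoundAtLocation lives in UGameplayStatics, so you will also need the "Kismet/GameplayStatics.h" include in your source file:

void AMyCharacter::ReportNoise(USoundBase* SoundToPlay, float Volume)
{
    if (SoundToPlay)
    {
        // Play the provided sound at our location...
        UGameplayStatics::PlaySoundAtLocation(GetWorld(), SoundToPlay, GetActorLocation(), Volume);

        // ...and report it to the engine so that pawns with a hearing sense can detect it
        MakeNoise(Volume, this, GetActorLocation());
    }
}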

Note that we’re doing two things here: we first play the given sound and then use the built-in MakeNoise function to inform our game that we have played a sound.

In case you copied and pasted the code above, don’t forget to edit my Character’s class name to match yours.

Now that we’re done with our character’s C++ logic, we need to tell our character to play a sound whenever his feet touch the floor. To do that we’re going to use Animation Notifies. Think of Animation Notifies as functions which get called at a specific frame of an animation that we decide. For more information about them, check out the official UE4 documentation.

Locate your character’s run animation and add a new Notify (I named my notify GenerateFootstepSound) when your character’s feet touch the ground. I’ve placed my notifies in frames 8 and 18 like the following image suggests:

anim_notify


Then, go to the Event Graph of your animation blueprint and add in the following logic for the Notifies we’ve created above:

footstep_generation


Notice that I imported the sound I mentioned above and hardcoded its reference in the function call. It would be more flexible to create a separate USoundBase* property inside the Character’s header file and assign a value through its Blueprint; however, in this case this is adequate.

Compile and save your Blueprint. If you play with your Character right now, you will hear the generated sounds for each footstep! Let’s move on!

Creating our AI Character

Add a new C++ class which is based on the Character class and name it MyAICharacter. Inside its header file, right before the .generated.h include, add the following includes:
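These are the headers for the sensing component and the Behavior Tree asset. Both come from the AIModule, so make sure "AIModule" is listed in your project’s Build.cs dependencies if it isn’t already:

#include "Perception/PawnSensingComponent.h"
#include "BehaviorTree/BehaviorTree.h"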

Then, type in the following declarations:
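A minimal set of declarations looks like the following. The exact names are my own choice, but the OnHearNoise signature has to match the PawnSensing component’s hearing delegate:

public:

    AMyAICharacter();

    virtual void PostInitializeComponents() override;

    /* The Behavior Tree which contains our AI logic; we'll assign it through the Blueprint */
    UPROPERTY(EditAnywhere, Category = "AI")
    UBehaviorTree* BehaviorTree;

protected:

    /* The component which senses noises reported by other pawns */
    UPROPERTY(VisibleAnywhere)
    UPawnSensingComponent* PawnSensingComp;

    /* Called by the PawnSensing component whenever a noise is heard */
    UFUNCTION()
    void OnHearNoise(APawn* PawnInstigator, const FVector& Location, float Volume);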

Then, switch to your source file, and type in the following code:
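Here is a sketch of the source file. I’m binding the hearing delegate in PostInitializeComponents, and the commented-out lines inside OnHearNoise assume that our (not yet written) Controller will expose a SetSensedTarget function which writes the heard pawn into the Blackboard; we’ll add exactly that in the next section:

#include "MyAICharacter.h"

AMyAICharacter::AMyAICharacter()
{
    // Create the sensing component; hearing is enabled by default
    PawnSensingComp = CreateDefaultSubobject<UPawnSensingComponent>(TEXT("PawnSensingComp"));
}

void AMyAICharacter::PostInitializeComponents()
{
    Super::PostInitializeComponents();

    if (PawnSensingComp)
    {
        // Register our function so it fires every time a noise is heard
        PawnSensingComp->OnHearNoise.AddDynamic(this, &AMyAICharacter::OnHearNoise);
    }
}

void AMyAICharacter::OnHearNoise(APawn* PawnInstigator, const FVector& Location, float Volume)
{
    // These lines stay commented out until MyAIController (with its SetSensedTarget function) exists:

    // AMyAIController* AICon = Cast<AMyAIController>(GetController());

    // if (AICon) AICon->SetSensedTarget(PawnInstigator);
}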

You will notice that inside the OnHearNoise function I’ve commented out the lines that reference our AI Controller. This is because we have yet to add our AI Controller class. When we’re done with our Controller, we’ll get back and uncomment that code. For now, just compile and save your code.

Creating our AI Controller

Add a new C++ class which is based on the AI Controller class and name it MyAIController. Inside the header file, before the .generated.h include, type in the following includes:
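We need the Behavior Tree component and the Blackboard component headers:

#include "BehaviorTree/BehaviorTreeComponent.h"
#include "BehaviorTree/BlackboardComponent.h"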

Then, type in the following declarations:
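Here is a minimal sketch of those declarations. I’m overriding Possess, which is where a controller typically initializes and starts its Behavior Tree (on engine versions 4.22 and newer you would override OnPossess instead), and I’m exposing a SetSensedTarget function that our AI Character calls when it hears something. The key name has to match the Blackboard key we’ll create later on (SensedPawn):

public:

    AMyAIController();

    virtual void Possess(APawn* InPawn) override;

    /* Stores the given pawn inside the Blackboard so the Behavior Tree can react to it */
    void SetSensedTarget(APawn* NewTarget);

protected:

    /* Executes the Behavior Tree assigned to the possessed AI Character */
    UPROPERTY(VisibleAnywhere)
    UBehaviorTreeComponent* BehaviorTreeComp;

    /* Holds the values our Behavior Tree reads and writes */
    UPROPERTY(VisibleAnywhere)
    UBlackboardComponent* BlackboardComp;

    /* The name of the Blackboard key we're going to create below */
    FName SensedPawnKeyName = FName("SensedPawn");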

Switch to your source file and type in the following code:
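And a sketch of the corresponding implementation: when the controller possesses our AI Character we initialize the Blackboard from the Behavior Tree asset assigned in the Blueprint and start running the tree, while SetSensedTarget simply writes the heard pawn into the Blackboard:

AMyAIController::AMyAIController()
{
    // Create the components which will run the Behavior Tree and hold its data
    BehaviorTreeComp = CreateDefaultSubobject<UBehaviorTreeComponent>(TEXT("BehaviorTreeComp"));
    BlackboardComp = CreateDefaultSubobject<UBlackboardComponent>(TEXT("BlackboardComp"));
}

void AMyAIController::Possess(APawn* InPawn)
{
    Super::Possess(InPawn);

    AMyAICharacter* AIChar = Cast<AMyAICharacter>(InPawn);
    if (AIChar && AIChar->BehaviorTree && AIChar->BehaviorTree->BlackboardAsset)
    {
        // Initialize the Blackboard with the asset referenced by our Behavior Tree...
        BlackboardComp->InitializeBlackboard(*AIChar->BehaviorTree->BlackboardAsset);

        // ...and start executing the tree itself
        BehaviorTreeComp->StartTree(*AIChar->BehaviorTree);
    }
}

void AMyAIController::SetSensedTarget(APawn* NewTarget)
{
    // Write the heard pawn into the Blackboard; the move task will read this key
    if (BlackboardComp)
    {
        BlackboardComp->SetValueAsObject(SensedPawnKeyName, NewTarget);
    }
}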

Moreover,  inside your controller’s source file, don’t forget to add the following include:

#include "MyAICharacter.h"

When you’re done with all that, switch to your AI Character’s source file, uncomment the Controller-related lines inside OnHearNoise and add the following include:

#include "MyAIController.h"

Compile and save your code.

Setting up our Blueprints

Once you have completed all the steps above, create the following Blueprints:

  1. A Blueprint which inherits our AICharacter class
  2. A Blueprint which inherits our AIController class, and name it BP_MyAICon
  3. A Behavior Tree, named MyBehaviorTree
  4. A Blackboard named MyBlackboard

Once you’ve completed all that, go to our AI Character’s Blueprint and:

  1. Assign the default mannequin skeletal mesh and make sure to rotate the mesh to face the blue arrow inside the capsule collider
  2. Inside the Anim Blueprint Generated Class select the same Anim Blueprint as your main Character
  3. Select the BP_MyAICon as the AI Controller Class
  4. Select the MyBehaviorTree for the Behavior Tree

The above steps are summarised in the following image:

ai_character_bp


Since we don’t want a “Seeing” sense for our AI, locate its PawnSensingComponent and disable the See Pawns option. This is crucial.

ai_character_bp2


Setting up our Blackboard

Our blackboard setup is simple: add a new Object key named SensedPawn.

Setting up our Behavior Tree

The logic for our AI will be the following: Once we have heard a pawn, we move to its location. When we’ve reached our target we wait. And so on… It’s not the smartest AI in the world obviously, but it will do for this tutorial. The above logic is described in the following Behavior Tree:

ai_hearing_bt2


Moreover, click on the MoveActor node and set the acceptable radius to 100 and the Blackboard Key to SensedPawn.

Save your project and place an AI Character inside your map. The AI should now follow you like the video I’ve uploaded above.
