
James Brown - Software Engineer

Engine: Unity 2019.2.2f
Language Used: C#
Primary Roles: Editor Interface, Back-end Systems, Behaviour Execution System
Visual AI is a passion project I developed in my spare time that aims to make designing game AI behaviour easier through a custom node-based editor and more efficient through Unity's versatile, data-driven ScriptableObject system. The editor interface was purposely designed to loosely mimic Unity's Animator window in an attempt to make it more appealing and less intimidating for designers and coders to learn. For instance, parameters and behaviours are edited in exactly the same way: selecting a behaviour connection curve to add or remove parameters, or selecting a behaviour node to edit its field values. Additionally, I wanted to make debugging easier and quicker by displaying in the editor which of the highlighted agent's behaviours is currently being executed (exactly like the Animator window). However, the system isn't entirely visual at the moment and does require some basic scripting knowledge, so I am planning to integrate it with Unity's visual scripting system when it is released in 2020.

Editor Interface
As mentioned earlier, I wanted to create a less intimidating and more intuitive system for designers and programmers to use when planning out how an AI's behaviour tree executes. After careful consideration and research into existing behaviour editors, such as Unreal Engine 4's behaviour tree editor, I settled on a node-based design: with so many node editors already on the market, anyone familiar with one should face a much shallower learning curve. I accomplished this by using Unity's EditorWindow class, the custom menu interface and the UnityEditor namespace to create and render the editor interface.
Additionally, the EditorWindow class isn't solely responsible for rendering the GUI; it also keeps track of references to the currently opened tree's behaviours, parameters and nodes so that the user edits the ScriptableObjects within the editor window rather than the assets directly. It also contains the events for creating, assigning and removing nodes from the behaviour tree, as well as for loading saved tree assets from prefabs or in-game instances.
Code Sample - Editor Window
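As a rough illustration of the shape such a window takes, the sketch below shows a minimal EditorWindow with a menu entry and a GUI/event loop. All class, field and method names here are my illustrative assumptions, not the project's actual code:

```csharp
using UnityEditor;
using UnityEngine;

// Hypothetical sketch of the editor window entry point.
public class VisualAIWindow : EditorWindow
{
    private ScriptableObject currentTree; // the tree asset currently being edited

    [MenuItem("Window/Visual AI Editor")]
    private static void Open()
    {
        GetWindow<VisualAIWindow>("Visual AI");
    }

    private void OnGUI()
    {
        // Draw nodes, connection curves and the parameter panel,
        // then process mouse/keyboard events for creating and removing nodes.
        DrawNodes();
        ProcessEvents(Event.current);
        if (GUI.changed) Repaint();
    }

    private void DrawNodes() { /* iterate the tree's nodes and draw each */ }
    private void ProcessEvents(Event e) { /* context menus, dragging, selection */ }
}
```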
The node itself is a pivotal part of this system, as it contains all the relevant data: which behaviour it currently carries, its position in the tree, the node type (Root, Condition, Action or Selector) and which parameters need to succeed for its behaviour to be executed. The node derives from Unity's ScriptableObject class so that each node is an individual asset that persists between scenes, unlike the standard MonoBehaviour approach I used for Project Alice. Additionally, each node is responsible for unlinking itself from attached nodes before it is removed, by simply looping through all of its connections and removing every reference those connections hold to it.
Furthermore, I thought it would be handy for the user to be able to actively debug what the AI is doing during gameplay, which is accomplished simply by changing a node's display colour based on what the behaviour's execute result returns: a fail result displays red, success displays green, and orange means the behaviour is currently being executed. While this may not be a complete substitute for debugging code, it undoubtedly helps in figuring out how and why the AI is behaving as it does.
Code Sample - Node System
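A minimal sketch of what such a node asset could look like, including the unlinking loop and the debug colouring described above. The enum and member names are assumptions for illustration only:

```csharp
using System.Collections.Generic;
using UnityEngine;

public enum NodeType { Root, Condition, Action, Selector }
public enum ExecuteResult { Fail, Success, Running }

// Illustrative sketch of the node asset; field names are assumptions.
public class Node : ScriptableObject
{
    public NodeType type;
    public ScriptableObject behaviour;    // the behaviour this node carries
    public Rect position;                 // location in the editor window
    public List<Node> connections = new List<Node>();

    // Remove every reference other nodes hold to this one before deletion.
    public void Unlink()
    {
        foreach (Node connected in connections)
            connected.connections.Remove(this);
        connections.Clear();
    }

    // Editor-side debug colour based on the last execution result.
    public Color DebugColour(ExecuteResult result)
    {
        switch (result)
        {
            case ExecuteResult.Success: return Color.green;
            case ExecuteResult.Running: return new Color(1f, 0.5f, 0f); // orange
            default: return Color.red;                                  // fail
        }
    }
}
```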

Behaviour System
This is the core of the system, as it carries the instructional data for how an AI agent behaves in the scene. Like the node system, the behaviours use Unity's ScriptableObject system so that each behaviour is treated as an individual asset rather than an individual component, a mistake I improved upon from Project Alice. There are three behaviour types, each deriving from a base abstract class called _Behaviour, which provides the base functions every behaviour requires and serves as the base data type for assigning behaviours to nodes.
As previously mentioned, the behaviours come in three types, Condition, Action and Selector, which each serve a different purpose. The Condition behaviour holds any explicit instructions the next behaviour may require in order to execute; however, this type may be rare in practice and possibly obsolete due to the implementation of parameters, an observation borne out during my second-year major production, where it was never used for any AI. The Action type contains the main behaviour (such as find closest enemy or attack), while the Selector type is a predefined behaviour that uses calculated weight scores to simulate utility-based decision making. The Selector has two predefined modes: best outcome, which always returns the lowest weight score (i.e. choose the shortest distance), and roulette, which uses the weight scores as probabilities to pick a random value on a roulette wheel, so the odds always favour the largest weight range while every option still retains a chance of being chosen. This type depends entirely upon how the programmer calculates weight scores in the connected Action behaviours, and is intended to make an AI appear more human in its decision making.
Code Sample
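The roulette mode described above can be sketched as a standard roulette-wheel (fitness-proportionate) pick over the weight scores. This is a generic illustration of the technique, not the project's actual selector code:

```csharp
using UnityEngine;

// Sketch of roulette-wheel selection over weight scores.
public static class Roulette
{
    // Higher weights occupy a larger slice of the wheel, so they win more
    // often, but every candidate keeps a non-zero chance of being picked.
    public static int Pick(float[] weights)
    {
        float total = 0f;
        foreach (float w in weights) total += w;

        float roll = Random.Range(0f, total);
        float cumulative = 0f;
        for (int i = 0; i < weights.Length; i++)
        {
            cumulative += weights[i];
            if (roll <= cumulative) return i;
        }
        return weights.Length - 1; // guard against floating-point drift
    }
}
```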

Example: Flocking Behaviour from Wondercade Arcade

AI Controller
The AI controller is the only MonoBehaviour in this system and takes in just two fields: the behaviour tree asset and the NavMeshAgent. It is designed to be attachable to any game object the user wants to act as an AI, and contains the functionality for cloning and executing the attached behaviour tree at run time. The attached tree asset is cloned on start-up by iterating down each branch and instantiating each node while reassigning any socket connections to point at the newly cloned assets.
At run time the deep-copied asset is executed every update by traversing down each branch from the root node, but only while the currently inspected node's parameters and behaviour succeed; if either of these fails, the branch is exited and the next one begins. As seen in the sample provided below, the branches are created on start-up following the cloning of the tree asset and stored in a struct containing a list of nodes. With this approach I can create a 2D list that represents the exact layout of the behaviour tree seen in the editor tool, which made it easy to write the loop that traverses each branch. I am still seeking improvements: I'm currently experimenting with the multi-threaded Entity Component System and looking into converting the controller over to it to increase performance with a large collection of agents in the scene.
Code Sample - Controller
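The clone-then-traverse flow above could be sketched roughly as follows. The Branch struct, the Clone/BuildBranches helpers and the node-checking methods are hypothetical names standing in for the project's real ones:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.AI;

// Illustrative sketch of the controller loop; all helper names are assumptions.
public class AIController : MonoBehaviour
{
    public ScriptableObject treeAsset;  // the authored behaviour tree
    public NavMeshAgent agent;

    // One branch = one root-to-leaf list of nodes, mirroring the editor layout.
    [System.Serializable]
    public struct Branch { public List<ScriptableObject> nodes; }

    private List<Branch> branches;      // the "2D list" of the cloned tree

    private void Start()
    {
        // Deep-copy the tree so the authored asset is never mutated,
        // then flatten it into branches for the update loop.
        // (CloneTree/BuildBranches are assumed helpers.)
        ScriptableObject runtimeTree = CloneTree(treeAsset);
        branches = BuildBranches(runtimeTree);
    }

    private void Update()
    {
        // Walk each branch from the root; bail out of a branch as soon as a
        // node's parameters or behaviour fail, then move on to the next branch.
        foreach (Branch branch in branches)
        {
            foreach (ScriptableObject node in branch.nodes)
            {
                if (!ParametersMet(node) || !ExecuteSucceeded(node))
                    break;
            }
        }
    }

    private ScriptableObject CloneTree(ScriptableObject t) { return Instantiate(t); }
    private List<Branch> BuildBranches(ScriptableObject t) { return new List<Branch>(); }
    private bool ParametersMet(ScriptableObject n) { return true; }
    private bool ExecuteSucceeded(ScriptableObject n) { return true; }
}
```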

Similarly to the Behaviour and Node classes, the AITree class uses Unity's ScriptableObject system for the same reasons; however, each node and behaviour asset is childed to the AI tree asset to create less clutter in the Assets folder. Essentially, the AI tree serves as a database holding references to every node and global parameter the user has created, while also containing the functionality for getting, updating and setting parameters. Global parameters are the ones the user creates in the parameter panel; they contain the conditional information an assigned node parameter has to meet in order to return true, while a local parameter is one the user has assigned to a node. Additionally, the parameter set functions change the values of the specified parameter assigned in each node, while the global ones never change (unless manipulated through the inspector).
Code Sample - AITree
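A minimal sketch of the database role described above, with name-based getters and setters over the global parameter list. Member names here are illustrative assumptions:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of the tree asset; member names are assumptions.
public class AITree : ScriptableObject
{
    [System.Serializable]
    public class TreeParameter
    {
        public string name;
        public float floatValue;  // numeric value (also covers int in this sketch)
        public bool boolValue;
    }

    public List<ScriptableObject> nodes = new List<ScriptableObject>();
    public List<TreeParameter> globalParameters = new List<TreeParameter>();

    // Global parameters are looked up by name; they hold the conditions a
    // node's locally assigned parameter must meet to return true.
    public TreeParameter GetParameter(string parameterName)
    {
        return globalParameters.Find(p => p.name == parameterName);
    }

    public void SetFloat(string parameterName, float value)
    {
        TreeParameter p = GetParameter(parameterName);
        if (p != null) p.floatValue = value;
    }

    public void SetBool(string parameterName, bool value)
    {
        TreeParameter p = GetParameter(parameterName);
        if (p != null) p.boolValue = value;
    }
}
```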


Parameters
Like Unity's Animator system, the parameters are designed to act as the required conditions for a behaviour to be executed. They come in three data types, int, float and bool, with specified conditions that need to be met in order to return true: greater than, less than or equal to for the numerical types, and true or false for the bool type. The parameters are only checked during gameplay when a node is reached by the AI controller's execution algorithm, just before it executes the node's attached behaviour. Originally, I used a generic class where I could specify the desired data type upon creation in the editor interface, but I soon switched back after learning that Unity's editor system can't serialize generic classes.
Code Sample
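One way the non-generic parameter class could look, with the comparison modes described above folded into a single serializable type (the reason for abandoning the generic version). All names are illustrative assumptions:

```csharp
using UnityEngine;

public enum Comparison { Greater, Less, Equal }

// Illustrative sketch: Unity's serializer can't handle a generic
// Parameter<T>, so all three value types live in one plain class.
[System.Serializable]
public class Parameter
{
    public string name;
    public bool isBool;        // true = bool parameter, false = numeric
    public Comparison comparison;
    public float numericValue; // covers both int and float in this sketch
    public bool boolValue;

    // Compare a node's local value against this parameter's condition.
    public bool IsMet(float localNumeric, bool localBool)
    {
        if (isBool)
            return localBool == boolValue;

        switch (comparison)
        {
            case Comparison.Greater: return localNumeric > numericValue;
            case Comparison.Less:    return localNumeric < numericValue;
            default:                 return Mathf.Approximately(localNumeric, numericValue);
        }
    }
}
```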
