
The external compiler

The message system we decided on for the engine, where every message is a separate class that inherits from an abstract base class, turned out to be a simple and practical way of handling messages. However, it did require a lot of monotonous busywork, since a new class had to be defined for every new message. To make this process faster and more user friendly, we designed a small companion program for the engine that we call the “ExternalCompiler”. This program reads the definitions of the messages from an XML document; this is an example of how that may look:
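The original screenshot of the XML is not shown here, but a hypothetical sketch of such a message definition might look like the following (the tag and attribute names are illustrative assumptions, not the engine's actual schema):

```xml
<!-- Hypothetical message definition; tag and attribute names are illustrative -->
<Message name="MoveMessage">
    <Include file="Vector2.h"/>
    <Var name="x" type="float"/>
    <Var name="y" type="float"/>
</Message>
```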


The information given here (the message name, the variables that the message will contain, and a tag, not displayed in this image, that represents an include the message depends on) is then used by the “ExternalCompiler” to write C++ code that correctly represents the desired messages in the engine.
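As a rough illustration, the generated code for a message might look something like this (the class and member names are assumptions based on the description, not the ExternalCompiler's actual output):

```cpp
#include <string>

// Hypothetical abstract base class that all messages inherit from
// (the engine's actual base class may differ).
class MessageBase {
public:
    virtual ~MessageBase() = default;
    virtual std::string getName() const = 0;
};

// Example of what the ExternalCompiler might generate for a
// "MoveMessage" defined in XML with two float variables.
class MoveMessage : public MessageBase {
public:
    MoveMessage(float x, float y) : x(x), y(y) {}
    std::string getName() const override { return "MoveMessage"; }
    float x;
    float y;
};
```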


This is how the external compiler currently looks. When using it, the user specifies a Resource Directory where the previously mentioned XML files are located, then an output directory where the generated C++ files will be placed, and lastly the file names of the XML files in the large text area as a comma-separated string. This, as you probably realize, is not the most well-thought-out design for an application like this; we are looking to redesign it to be more user friendly in the future, but for now it does the job we need it to and our focus is elsewhere.

The Factory system


Here is another walk-through/overview of an engine system. This one focuses on the Factory manager and how it interacts with the properties and entities of the engine.

The Factory Manager:

The factory manager is the singleton that the developer contacts when they want to create an entity. It is also the class that is contacted by the state when it wants to load a level. This is meant to give developers a standardized way of creating entities in the engine; the definitions of these entities are done through the blueprint system that was covered in a previous post.

After these entities are defined through reading the blueprint and level files, it is time to create actual in-game entities from them. This posed a problem for us, since we did not want the developer to have to go into the “FactoryManager” and rewrite it to handle every new property they want to add to their game. We came up with a solution heavily inspired by a game engine that we had worked closely with before, “Nebula Trifid”: we designed two macros that the developer adds to the top of their property’s .h file (“SetUpProperty(PropertyName);”) and .cpp file (“ImplementProperty(PropertyName);”). These define static functions that, when called from the setup part of the developer’s application file, send the name of the property and a function pointer to a creation function for the property to the “FactoryManager”. These are then stored in a map where the key is the name of the property and the value is the function pointer.

Later, when the “FactoryManager” is told to create an entity with the property “x”, it gets the pointer from the map where “x” is the key and uses it to create a property that it can then handle as a “PropertyBase” pointer, thanks to the polymorphic inheritance that the entity system uses. Thus the “FactoryManager” can create and deliver a fully functional game entity with all its properties correctly attached, without any modification needed to the manager when new properties are added to the project.
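A minimal sketch of how such a registration mechanism can work follows; the names, macros and signatures are assumptions based on the description above, not the engine's actual code:

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>

// Hypothetical stand-in for the engine's PropertyBase class.
class PropertyBase {
public:
    virtual ~PropertyBase() = default;
    virtual std::string getName() const = 0;
};

// Singleton mapping property names to creation functions.
class FactoryManager {
public:
    static FactoryManager& instance() {
        static FactoryManager fm;
        return fm;
    }
    using Creator = std::function<std::unique_ptr<PropertyBase>()>;

    void registerProperty(const std::string& name, Creator creator) {
        creators[name] = std::move(creator);
    }
    // Look up the creation function by name and invoke it.
    std::unique_ptr<PropertyBase> createProperty(const std::string& name) {
        auto it = creators.find(name);
        return it != creators.end() ? it->second() : nullptr;
    }
private:
    std::map<std::string, Creator> creators;
};

// Rough equivalents of the SetUpProperty/ImplementProperty macros:
// one declares a static registration function, the other defines it.
#define SetUpProperty(P) \
    static void registerWithFactory();
#define ImplementProperty(P) \
    void P::registerWithFactory() { \
        FactoryManager::instance().registerProperty(#P, \
            [] { return std::unique_ptr<PropertyBase>(new P()); }); \
    }

// Example property using the macros.
class HealthProperty : public PropertyBase {
public:
    SetUpProperty(HealthProperty)
    std::string getName() const override { return "HealthProperty"; }
};
ImplementProperty(HealthProperty)
```

During application setup the developer would call `HealthProperty::registerWithFactory();` once, after which the factory can create the property purely by name.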

Blueprint system

The blueprint system is designed to provide developers with a quick and easy way to make a “blueprint” (a standardized setup of properties and attribute values) for an entity. These blueprints can be used as base entities in the level editor, which can then further edit their values to be more customized for the particular level. They can also be used by the “FactoryManager” in the engine to create entities straight from the blueprint using the “createEntityFromBlueprint” function.


The blueprint system works in three steps of files building on each other; the format we use for this is XML. The first one is the “PropertyAndAttributeDefinitions” file:


This file, as the name suggests, contains the definitions of properties and their attributes. A property is defined by a name, which must correspond to the name of a property class in the engine, and a “derivedFrom” variable that represents the class the property is derived from. Inside the property tags are attribute definitions; each “<var/>” tag represents one attribute definition. These definitions consist of a name (the name you wish to give the attribute), a type that represents the type of variable it will be, and a value that will be the default value for the attribute.
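The screenshot of this file is not reproduced here; a hypothetical example following the description above might look like this (property and attribute names are illustrative assumptions):

```xml
<!-- Hypothetical example; names and values are illustrative -->
<PropertyAndAttributeDefinitions>
    <Property name="PhysicsProperty" derivedFrom="PropertyBase">
        <var name="Mass" type="float" value="1.0"/>
        <var name="IsStatic" type="bool" value="false"/>
    </Property>
</PropertyAndAttributeDefinitions>
```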

The next step is the blueprint file:


This file is where you define the actual blueprints. This is done by first adding the “BluePrint” tags with the name parameter, which represents the name of the blueprint. Inside these tags the developer can add the properties they want the entity created from the blueprint to have, by adding property tags with a name parameter (note that you do not need to specify the “derivedFrom” parameter here, since it is already defined in the previous step). If the developer then wants to alter one of the property’s attribute values, they can add a “<var/>” tag inside the property tags with the name and value parameters (note that the type parameter is left out, since it is already defined in the previous step).
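Again the screenshot is missing; a hypothetical blueprint following the description might look like this (the blueprint and property names are illustrative assumptions):

```xml
<!-- Hypothetical example; names and values are illustrative -->
<BluePrint name="Crate">
    <Property name="PhysicsProperty">
        <var name="Mass" value="5.0"/>
    </Property>
    <Property name="RenderProperty"/>
</BluePrint>
```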

These blueprints can then be loaded into the level editor, where the developer can redefine all the previously mentioned values of the blueprints and save them to a level file:


Here is an example of what one of these files may look like. You can see two entities defined by “<Entity></Entity>” tags; these tags contain the name of the entity (defined by the developer in the level editor) and the blueprint it was originally defined by. Inside are all the redefined attribute values, in the same way as described in step two. You may also see a couple of new tags, “<collider/>”; these define a collision box added to the entity in the level editor.
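Since the screenshot is not reproduced here, a hypothetical level file following the description might look like this (entity, blueprint and collider attribute names are illustrative assumptions):

```xml
<!-- Hypothetical example; names and values are illustrative -->
<Level>
    <Entity name="Crate_01" blueprint="Crate">
        <var name="Mass" value="10.0"/>
        <collider shape="box" width="1.0" height="1.0"/>
    </Entity>
    <Entity name="Crate_02" blueprint="Crate"/>
</Level>
```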

How To Level Editor

The level editor can be found in:


It needs to stay in this location, since it uses its own location to find engine core properties.


Launching the .exe file will show this window.

Screenshot 2015-03-12 11.11.02

The left side shows an empty box that will be filled with entities as soon as you add the application folder (the path to your project).

To do this, go to File->Application Directory.


Screenshot 2015-03-12 11.17.18


Select the correct one and press Choose.

Now the list contains entities ready to be added to the level.


Here is a little GIF showing how to add one.


Pressing an entity in the Entities List or in the widget displays its information in the PropertiesView.

Screenshot 2015-03-12 14.53.48


Next to the entity in the entity list is its name, which can be modified in the PropertiesView.

Press Update Entity to save.


Save/Load Level

File->Save Level/Load Level-><filename>.xml


There have been some issues with the development of the GUI, since the library we intended to use turned out not to work with our rendering solution. We therefore decided to drop that library and made a new GUI from scratch, with only the functions we thought were necessary.

Each GUI element is placed in a GUIContainer, to allow the user to set up multiple menus or GUIs for the same scene and easily switch between having the different containers locked on the screen or movable. For now there are two different types of elements that can be added to the container: GUIText and GUIButton.

The GUIText element renders a text from a font. The text is set when the object is created, but can be altered while the program is running by calling the GUIManager for that element and then using the updateText function.

The GUIButton element is a button with four different states: idle, down, pressed and clicked. This lets the user handle each state differently and make the GUI behave differently depending on how the buttons are clicked. To make it visible that a button is actually being pressed, each button is given two different textures to switch between: one for when the button is idle and waiting, and one for when it is being held down/clicked.

GUIText and GUIButton both inherit from a base class, GUIElement, which helps the GUIManager handle every element’s position, size and active flag. An inactive container sets all the elements in it to inactive, to prevent buttons from being invisible but still active. The GUIContainer can also be dragged around; when a container is dragged, all the elements in it are dragged with it.
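A minimal sketch of this class hierarchy, assuming names and members based only on the description above (not the engine's actual code), could look like:

```cpp
#include <string>
#include <vector>

// Hypothetical base class; the GUIManager tracks position, size
// and the active flag through this interface.
class GUIElement {
public:
    virtual ~GUIElement() = default;
    bool active = true;
    float x = 0, y = 0;   // position, moved along with the container
};

class GUIText : public GUIElement {
public:
    explicit GUIText(std::string t) : text(std::move(t)) {}
    void updateText(const std::string& t) { text = t; }
    std::string text;
};

class GUIButton : public GUIElement {
public:
    enum class State { Idle, Down, Pressed, Clicked };
    State state = State::Idle;
};

// A container groups elements so whole menus can be toggled or dragged.
class GUIContainer {
public:
    void addElement(GUIElement* e) { elements.push_back(e); }

    // Deactivating a container deactivates every element in it,
    // so buttons cannot stay invisible but clickable.
    void setActive(bool a) {
        active = a;
        for (GUIElement* e : elements) e->active = a;
    }

    // Dragging the container drags every element with it.
    void drag(float dx, float dy) {
        for (GUIElement* e : elements) { e->x += dx; e->y += dy; }
    }
private:
    std::vector<GUIElement*> elements;
    bool active = true;
};
```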

In this picture the second button is being pressed down.


Wrapping up States and other things

We have now finished the state system. This framework helps users create and manage their own states for use in the game engine.

The StateBase class is an abstract base class that users should inherit from when creating their own state classes. It contains functions for managing the active level and the name of the state. All clean-up and set-up for changing levels during runtime is handled within the class so the user doesn’t have to worry about it. Finally the StateBase class contains the virtual functions onEnter() and onExit() which are called when the user switches states. What they actually do is up to the user to implement.

The second part of this state framework is the StateManager class. Like all the other manager classes, this is a singleton that the user calls upon when they wish to get the current state, get a state by name or change states. Internally it only contains a std::vector of StateBase pointers that its functions operate on.
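The two classes can be sketched roughly as follows; the function names beyond onEnter()/onExit() are assumptions based on the description, not the engine's actual interface, and the level-management part is omitted:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of the state framework described above.
class StateBase {
public:
    explicit StateBase(std::string n) : name(std::move(n)) {}
    virtual ~StateBase() = default;
    virtual void onEnter() {}   // user-implemented behavior on switch
    virtual void onExit() {}
    const std::string& getName() const { return name; }
private:
    std::string name;
};

// Singleton holding a std::vector of StateBase pointers.
class StateManager {
public:
    static StateManager& instance() {
        static StateManager sm;
        return sm;
    }
    void addState(StateBase* s) { states.push_back(s); }
    StateBase* getState(const std::string& name) {
        for (StateBase* s : states)
            if (s->getName() == name) return s;
        return nullptr;
    }
    StateBase* getCurrentState() { return current; }
    // Switching states calls onExit() on the old state
    // and onEnter() on the new one.
    void changeState(const std::string& name) {
        StateBase* next = getState(name);
        if (!next) return;
        if (current) current->onExit();
        current = next;
        current->onEnter();
    }
private:
    std::vector<StateBase*> states;
    StateBase* current = nullptr;
};
```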

Many other parts of the engine are also reaching completion, but most will probably get their own blog posts, such as the GUI system, level editor and camera property.

The Render engine

Hello everyone!

This post is intended as a walk-through/overview of the render engine used by the Solkraft Game Engine (it will not contain a section on rendering the GUI; that will be a separate post):

The render engine is based on OpenGL and several support libraries that we would like to first mention. For creating and managing the window we use “GLFW” (we also use this for input in other parts of the engine), for handling image loading and conversion to OpenGL textures we use the “SOIL” library and for easy access to OpenGL functions we use “GLEW”.

The core structure of the render engine is composed of two classes:

The RenderManager class is a singleton that the rest of the engine interacts with regarding graphics and rendering: for example, when an entity wants to sign up to be rendered, when the user wishes to resize the render window, or when something wants to receive camera focus.

The GlWindow class is the other class in the render engine, representing the actual render window. This class is responsible for interacting with OpenGL and the shader program when it is time to render. It also communicates with the “RenderManager” class to get the list of entities that are supposed to be rendered, “the render queue”.

To better explain how the render engine’s “pipeline” functions, we will go through all the steps from when an entity registers with the render manager to when it is rendered in the window:

For an entity to be able to sign up for rendering it must possess three attributes: the “Sprite” attribute, a text string with the name of an image in the “images” folder of the user’s project; the “DepthLayer” attribute, an integer representing the depth layer that the entity’s graphics will be rendered on, with zero being “closest to the screen”; and the “GLTextureHandle” attribute, an integer representing an OpenGL texture handle, whose value will be assigned by the “RenderManager” later in the process.

When these requirements are met, the user can call the “RenderManager” and register the entity for rendering. The first thing that happens is that the entity is added to the render queue and sorted based on the value of its “DepthLayer” attribute. After that, the value of the “Sprite” attribute is compared to the names of images that have already been loaded into memory. If a match is found, the “GLTextureHandle” attribute is set to the corresponding texture handle of the image name. If it is not found, a function in “SOIL” is called, the new texture handle it returns is assigned to the “GLTextureHandle” attribute, and it is added to a map of already loaded images along with the value of the “Sprite” attribute.
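The registration step can be sketched as a depth-sorted queue plus a texture cache. This is a simplified illustration under assumed names; in the real engine the handle would come from a SOIL loading call rather than a counter:

```cpp
#include <algorithm>
#include <map>
#include <string>
#include <vector>

// One entry in the hypothetical render queue.
struct RenderEntry {
    std::string sprite;      // image name, from the Sprite attribute
    int depthLayer;          // 0 = closest to the screen
    unsigned textureHandle;  // assigned by the manager
};

class RenderQueue {
public:
    void registerEntity(const std::string& sprite, int depthLayer) {
        queue.push_back({sprite, depthLayer, lookupOrLoad(sprite)});
        // Keep the queue sorted so deeper layers are drawn first.
        std::sort(queue.begin(), queue.end(),
                  [](const RenderEntry& a, const RenderEntry& b) {
                      return a.depthLayer > b.depthLayer;
                  });
    }
    const std::vector<RenderEntry>& entries() const { return queue; }

private:
    // Reuse the handle of an already-loaded image, or "load" a new one
    // (a counter here stands in for the SOIL loading function).
    unsigned lookupOrLoad(const std::string& sprite) {
        auto it = loaded.find(sprite);
        if (it != loaded.end()) return it->second;
        unsigned handle = nextHandle++;
        loaded[sprite] = handle;
        return handle;
    }
    std::vector<RenderEntry> queue;
    std::map<std::string, unsigned> loaded;
    unsigned nextHandle = 1;
};
```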

The entity is now signed up to be rendered, and in the next “onFrame” call to the “RenderManager” the “GlWindow” class will be tasked with rendering it and all the other registered entities.

When this happens, the “GlWindow” starts by gathering the necessary information for the entity that is about to be rendered. First it sends a message to the entity requesting its transform matrix, which it will use as the model matrix in a “model * view * projection” calculation later in the shader program. After that, it checks whether something has camera focus. If something does, that entity’s camera matrix (another matrix stored in the entity) will be assigned to the view matrix in the aforementioned calculation; if nothing has camera focus, the view matrix will be an identity matrix.

Now that it has gathered the necessary information from the entity and the “RenderManager”, it is time to set the active texture and attribute arrays; this is done by calling several functions in “GLEW”. When setting the active texture, the “GLTextureHandle” attribute of the entity is used as the texture handle argument. Then there are the two attribute arrays, one with vertices and one with UV coordinates. In the vertex array there are four vertices, going from (-0.5, -0.5) in the bottom left to (0.5, 0.5) in the top right corner. We have designed it this way so that we can reuse the same four vertices for drawing all entities, using the scaling part of each entity’s model matrix to represent its actual size. When this is all done, the information is sent off from the application and the entities are rendered in the window.
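The view-matrix selection described above can be sketched in isolation; the types and names here are assumptions (a flat 16-float array stands in for whatever matrix type the engine actually uses):

```cpp
#include <array>

// A 4x4 matrix as a flat array of 16 floats (column-major,
// as OpenGL expects); the engine's actual matrix type may differ.
using Mat4 = std::array<float, 16>;

Mat4 identity() {
    return {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
}

// Hypothetical record of which entity (if any) holds camera focus.
struct CameraFocus {
    bool hasFocus = false;
    Mat4 cameraMatrix = identity();
};

// If some entity has camera focus, its camera matrix becomes the
// view matrix; otherwise an identity matrix is used.
Mat4 selectViewMatrix(const CameraFocus& focus) {
    return focus.hasFocus ? focus.cameraMatrix : identity();
}
```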

We hope that this has been helpful in better understanding how the Solkraft Game Engine functions under the hood.

Expect more of these posts over the next few days covering other parts of the engine as we approach the final presentation of our project on 13/3-2015.

There will also most likely be some posts regarding the demo game that we will be developing for the engine this week.

Level Editor just about done!


Screenshot 2015-03-06 15.25.57

This is the current (and most likely final) layout of the level editor.

The tab <Add Entity> contains entities from the Blueprint XML file. Double-clicking one of them or pressing “Add Entity” adds it to the current level.

The tab <Edit Entity> contains entities from the level, as well as a property window to modify the values of the selected entity.

Screenshot 2015-03-06 15.25.51

This is the current (and most likely final) layout of the collision editor.

Edit values for each collision box/circle.

Level Editor Update


Entities that require collision boxes/circles (entities with a Physics Property) now have an editor specific to just that.

Adding, removing or reshaping collision boxes/circles.

Animations and the state of the Statemanager

Since the rendering part of the engine was more or less done (or at least done enough for this) at the beginning of the week, we began writing the AnimatedProperty class. It is a property that the user attaches to an entity they want to be animated.

It functions by switching out the texture handle of the attached entity, making the render manager render the entity using a different image. It does this at regular intervals defined by the user in milliseconds. To load an animation into the engine, the user has to specify the directory in which the animation’s frames are located, how many frames the animation contains and what file type the images are saved in. When the images are loaded into the engine, the texture handles are saved into a large std::vector containing all frames for all animations for that entity, and the resulting start and stop indexes are saved into a custom “Animation” struct. The user can then send a “PlayAnimation” message to the entity and it will start animating whatever animation is set as the “current animation”; the property iterates through the frames std::vector between the indexes saved in the relevant Animation struct.
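The bookkeeping described above can be sketched as follows; the struct layout and function names are assumptions based on the description, and image loading is replaced by pre-supplied texture handles:

```cpp
#include <string>
#include <vector>

// Hypothetical Animation struct: start/stop indexes into the shared
// frames vector, plus the interval between frames in milliseconds.
struct Animation {
    std::string name;
    std::size_t startIndex;
    std::size_t stopIndex;   // one past the last frame
    double frameTimeMs;
};

class AnimatedProperty {
public:
    // Append one animation's texture handles to the shared frames
    // vector and remember where they landed.
    void addAnimation(const std::string& name,
                      const std::vector<unsigned>& handles,
                      double frameTimeMs) {
        std::size_t start = frames.size();
        frames.insert(frames.end(), handles.begin(), handles.end());
        animations.push_back({name, start, frames.size(), frameTimeMs});
    }

    void playAnimation(const std::string& name) {
        for (std::size_t i = 0; i < animations.size(); ++i)
            if (animations[i].name == name) { current = i; elapsed = 0; }
    }

    // Called each frame with the elapsed time; returns the texture
    // handle the entity should currently be rendered with, looping
    // over the frames of the current animation.
    unsigned update(double deltaMs) {
        const Animation& a = animations[current];
        elapsed += deltaMs;
        std::size_t frameCount = a.stopIndex - a.startIndex;
        std::size_t frame =
            static_cast<std::size_t>(elapsed / a.frameTimeMs) % frameCount;
        return frames[a.startIndex + frame];
    }

private:
    std::vector<unsigned> frames;       // handles for all animations
    std::vector<Animation> animations;
    std::size_t current = 0;
    double elapsed = 0;
};
```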

Implementing this was a smooth process with few issues. Since we wanted the animation system to be based on time instead of frame rate, we had to make use of the EngineTimer class, which we had written before. As we were testing we discovered some strange bugs in the EngineTimer class, but they were swiftly ironed out, and AnimatedProperty is now as good as done, apart from its message system interface, which is a placeholder until we have actually implemented the message system.

Two frames of the animation test environment, running on our engine. One press of the “D” key makes the entity move a step to the right and causes the animation to play. Input, rendering and animation are all done by the engine.

As for the StateManager and StateBase classes, we started implementing them a week or so ago, but had to stop about halfway through because the rest depended on other parts of the engine that were far from done at that time. This was a bit of a planning slip on our part, and we have taken note to base our planning more around dependencies in the future. We swapped the state classes with a couple of other classes in the time plan, but the rest of the plan remains unchanged.

All in all the engine is coming along nicely and things are going mostly according to plan.