Note: This is an archived version of the Blender Developer Wiki. The current and active wiki is available on wiki.blender.org.

The goal of this page is to list issues with the BGE's codebase, not general bugs. Right now it is mostly a copy of this blog post. We could also look into merging it into our todo list.

Rendering Code has Broken Encapsulation

In general, our rasterizer is quite messy. Why not simply clean it up, then? The problem we face is that rendering (e.g., OpenGL) code is scattered all around the code base. For example, here is a list of modules that contain OpenGL code:

  • Rasterizer (good)
  • BlenderRoutines (bad)
  • GamePlayer (bad)
  • Ketsji (bad)
  • VideoTexture (bad)

The physics code also contained OpenGL code until recently. Why would a physics engine need to use OpenGL? You would probably guess for the "Physics Visualization" option, but that is handled properly through the rasterizer interface. Instead, our physics code was using OpenGL in its view-frustum and occlusion culling tests. That's right: our physics engine determines what is drawn, not our rasterizer or scenegraph. This should make a few people scratch their heads. The thing is, Bullet provides data structures for managing dynamic scenes and culling them, and it can do this a lot faster (logarithmic time complexity) than what we were previously using (linear time complexity), so I can see why this optimization was done. However, culling is conceptually separate from physics. A GraphicsController was added to our physics interface to allow physics engines to do culling tests. Instead, we should have written code that lets the scenegraph use Bullet for culling, neatly encapsulated in some fashion.
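To illustrate the kind of encapsulation I mean, here is a minimal sketch of a scenegraph-side culling helper built on Bullet's dynamic bounding-volume trees. The CullingCollector and frustumCull names are hypothetical; the Bullet pieces (btDbvt::collideKDOP, the btDbvtBroadphase trees) are real, but treat the details as approximate:

 #include "btBulletCollisionCommon.h"
 #include <vector>

 // Collects the client object of every leaf whose bounding volume
 // intersects the given convex volume (e.g., the view frustum).
 struct CullingCollector : btDbvt::ICollide
 {
     std::vector<void *> visible;

     void Process(const btDbvtNode *leaf) override
     {
         // With btDbvtBroadphase, a leaf's data is the broadphase proxy,
         // whose m_clientObject points back at the collision object.
         visible.push_back(static_cast<btDbvtProxy *>(leaf->data)->m_clientObject);
     }
 };

 // normals/offsets describe the inward-facing frustum planes.
 void frustumCull(btDbvtBroadphase *broadphase,
                  const btVector3 *normals, const btScalar *offsets, int count,
                  CullingCollector &collector)
 {
     // Walk both trees (one holds recently moving proxies, the other the
     // settled ones); collideKDOP descends them in logarithmic time.
     btDbvt::collideKDOP(broadphase->m_sets[0].m_root, normals, offsets, count, collector);
     btDbvt::collideKDOP(broadphase->m_sets[1].m_root, normals, offsets, count, collector);
 }

Wrapped in a helper like this, the scenegraph gets the fast culling without the physics interface growing a GraphicsController.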

Now, back to the rasterizer itself. Let's say I want to change how we use shaders (maybe to reduce the number of state changes). The most obvious place to look for this code is the rasterizer. However, the rasterizer does not know about shaders. For that matter, it doesn't know about materials or textures either, and, until recently, did not even know about lights. A fun note on lights: the fixed-function (i.e., Multitexture Material) lighting code used to be handled by the players (e.g., the embedded player or the stand-alone blenderplayer), which meant we had different fixed-function code for the different players.
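To make the goal concrete, a properly encapsulated rasterizer would own shaders, materials, textures, and lights behind one interface, so that no other module issues OpenGL calls. A hypothetical sketch (illustrative names, not the current RAS_IRasterizer API):

 // Everything GL-specific lives behind this interface, so the players,
 // Ketsji, and VideoTexture never issue OpenGL calls themselves.
 class RAS_IRasterizerSketch
 {
 public:
     virtual ~RAS_IRasterizerSketch() {}

     // Shaders, materials, and textures become rasterizer concepts.
     virtual void BindShader(class RAS_IShader *shader) = 0;
     virtual void SetMaterial(const class RAS_IPolygonMaterial &material) = 0;
     virtual void SetTexture(int unit, class RAS_ITexture *texture) = 0;

     // Lights are submitted here; the rasterizer decides whether they feed
     // fixed-function state or a generated shader, so every player shares
     // one lighting code path.
     virtual void AddLight(const struct RAS_LightData &light) = 0;

     virtual void DrawMesh(class RAS_MeshSlot &slot) = 0;
 };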

Overall, the fact that our rendering code is spread across the engine makes it very difficult to maintain. We cannot add a new rasterizer backend for things like OpenGL 3+ or OpenGL ES (mobile) until we get everything properly encapsulated behind our current rasterizer interface, and that will take time.

Duplicate Code in VideoTexture

The VideoTexture module sits almost completely apart from the rest of the engine. At first glance this is a good thing (loose coupling, few dependencies, etc.). However, a deeper look shows that it achieves this lack of dependencies by copying large chunks of the render setup code from KX_KetsjiEngine (the core engine class), which creates a large amount of duplicate code. Now, if a developer wants to fix or change something like how camera projections are handled (e.g., fix something related to orthographic cameras), they might find the code in KX_KetsjiEngine, make the change, and call it a day. However, they have just introduced a bug into the render-to-texture functionality offered by VideoTexture! Overall, I would like to see the VideoTexture features better integrated into the BGE and the VideoTexture module itself phased out. For example, we should just add "native" support for Blender's movie texture type.
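The fix for this particular class of bug is mechanical: factor the shared setup into a single helper that both render paths call. A sketch with a hypothetical helper (the real code works through the engine's own math types, and sensor-fit details are omitted here):

 #include <cmath>

 struct Mat4 { float m[16]; };  // column-major, as OpenGL expects

 // Hypothetical shared helper: one definition of the projection math, used
 // by both the main render loop and the VideoTexture render-to-texture
 // path. Fixing, say, the orthographic branch here fixes every caller.
 Mat4 ComputeProjection(bool isOrtho, float lens, float orthoScale,
                        float clipStart, float clipEnd, float aspect)
 {
     Mat4 p = {};
     if (isOrtho) {
         // Same maths as glOrtho with a symmetric volume; orthoScale is
         // the full width of the view, as in Blender's camera settings.
         const float right = orthoScale * 0.5f;
         const float top = right / aspect;
         p.m[0]  = 1.0f / right;
         p.m[5]  = 1.0f / top;
         p.m[10] = -2.0f / (clipEnd - clipStart);
         p.m[14] = -(clipEnd + clipStart) / (clipEnd - clipStart);
         p.m[15] = 1.0f;
     }
     else {
         // Same maths as gluPerspective; the field of view is derived from
         // the lens value roughly the way Blender does (32mm sensor).
         const float fov = 2.0f * std::atan(16.0f / lens);
         const float f = 1.0f / std::tan(fov * 0.5f);
         p.m[0]  = f / aspect;
         p.m[5]  = f;
         p.m[10] = (clipEnd + clipStart) / (clipStart - clipEnd);
         p.m[11] = -1.0f;
         p.m[14] = 2.0f * clipEnd * clipStart / (clipStart - clipEnd);
     }
     return p;
 }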

A further issue with the VideoTexture module is that it is only exposed through the Python API; the rest of the engine does not know it exists. This makes it difficult to use the VideoTexture module from other parts of the engine (e.g., for the movie texture support mentioned above). Ideally, the features in the VideoTexture module would be part of the engine proper and exposed to the Python API, like all of the other engine code.

Multiple Material Classes

In the BGE, materials are handled via the following classes:

  • RAS_IPolygonMaterial - This is an interface in the rasterizer for handling materials
  • KX_BlenderMaterial - This is a concrete implementation of RAS_IPolygonMaterial that handles materials for the BGE (note that it's in Ketsji!). This includes code for Multitexture and GLSL materials mixed together in the same file (breaking the single responsibility principle).
  • BL_Material - Essentially a copy of Blender's material data made during conversion. It exists because the BGE tries to avoid using Blender data directly. However, one of its fields is the Blender material itself, which various parts of the engine access, defeating the point of having BL_Material in the first place! Really, BL_Material is not needed and just causes extra clutter and confusion.
  • BL_Texture - Handles textures in Multitexture materials and custom shaders. GLSL mode uses Blender's texture handling code (i.e., not code in the engine itself). Again, this isn't in the rasterizer.
  • BL_Shader - This is for custom shaders created via the Python API. It should be noted that, despite the BL prefix, this class does not (directly) touch Blender data or code. It does make use of BL_Material and BL_Texture, which ultimately do interact with Blender code, but BL_Shader doesn't need to know where the data came from. In other words, BL_Shader does not care what Blender does, yet it still has the BL prefix, which is confusing to new developers.
  • BL_BlenderShader - Another shader class, this one used for Blender's generated GLSL shaders. This means that custom-made user shaders have a completely different code path than the built-in ones. Nice.

We also used to have a KX_PolygonMaterial for Singletexture mode. The general idea behind many of these classes is not bad, but, as mentioned, they have a lot of oddities.
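For contrast, here is one possible consolidation, sketched with hypothetical names (this is not an agreed-upon design): everything lives in the rasterizer, there is a single texture class, and custom and generated shaders implement one interface so they share a code path:

 // Hypothetical reorganization: all of this lives in the Rasterizer module.
 class RAS_IShader  // one interface for all shaders
 {
 public:
     virtual ~RAS_IShader() {}
     virtual void Bind() = 0;
     virtual void Unbind() = 0;
 };

 class RAS_CustomShader    : public RAS_IShader { /* user GLSL from Python */ };
 class RAS_GeneratedShader : public RAS_IShader { /* Blender-generated GLSL */ };

 class RAS_Texture { /* one texture class for every material mode */ };

 // The material interface keeps the role RAS_IPolygonMaterial has today,
 // but the concrete implementation moves out of Ketsji and no longer
 // mirrors Blender data (no BL_Material copy; conversion reads the
 // Blender material directly, in one place).
 class RAS_IPolygonMaterialSketch
 {
 public:
     virtual ~RAS_IPolygonMaterialSketch() {}
     virtual RAS_IShader *GetShader() const = 0;
 };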

Logic System Dependent on Logic Bricks

The game engine cannot currently run without logic bricks; the entire logic system is dependent on logic brick code. For example, we cannot even run a Python script without getting logic bricks involved. Logic bricks are useful to have, but they should be properly encapsulated and organized. There are a few things that need to be done here:

1. Logic bricks should be put into their own folder (GameLogic) and should not be part of the core engine module.

2. Logic bricks should be implemented as a logic system that is separate from the engine itself. This would allow other forms of logic (e.g., Python and HIVE) to be integrated more smoothly, for example by having a list of logic systems that the game engine iterates over and tells to update (see the sketch after this list).

3. Logic bricks should expose functionality, not contain it. A lot of this has been cleaned up, but we have had cases where features such as animations were coded directly into a logic brick. Such features should live in the engine code, with a logic brick merely interfacing with that code. This allows other systems (e.g., Python, HIVE, other parts of the engine itself) to make use of the same features. We still have features that only logic bricks have access to, such as 2D filters. We also have cases such as the Near and Radar bricks, which create collision shapes in the physics engine in order to work. Where is this in the Python API?
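As a rough illustration of point 2, the engine's main loop could drive any number of logic systems through one small interface. A minimal sketch with hypothetical names (KX_ILogicSystem does not exist in the current code):

 #include <vector>

 class KX_ILogicSystem
 {
 public:
     virtual ~KX_ILogicSystem() {}
     // Called once per logic tick by the engine.
     virtual void Update(double curTime, double frameTime) = 0;
 };

 // Logic bricks become just one system among others.
 class LogicBrickSystem  : public KX_ILogicSystem { /* sensors, controllers, actuators */ };
 class PythonLogicSystem : public KX_ILogicSystem { /* scripts / components */ };

 // Inside the engine's main loop, the logic step becomes generic:
 void UpdateLogic(std::vector<KX_ILogicSystem *> &systems,
                  double curTime, double frameTime)
 {
     for (KX_ILogicSystem *system : systems)
         system->Update(curTime, frameTime);
 }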

Inconsistent Logic Bricks

While we are on the topic of logic bricks: they are rather inconsistent, which makes both using and developing them awkward. For example, handling pulses is left up to each actuator (not some higher-level system). This means actuators can choose how to respond to positive and negative pulses. That sounds like a good idea until you realize two different patterns have emerged:

  • Actuators that only perform tasks while receiving a positive pulse.
  • Actuators that start performing tasks on a positive pulse and only stop when they receive a negative pulse.

This is confusing to both new users (why do the actuators behave differently?) and developers (how should a new actuator behave?). Worse, this inconsistency cannot be fixed without breaking existing games.
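One way to at least make the behaviour explicit (a sketch of a possible cleanup, not an existing proposal) is to move pulse interpretation into the actuator base class and have each actuator declare which pattern it follows; all names below are hypothetical:

 // The base class interprets pulses; concrete actuators only declare
 // their pattern and implement Start/Stop/DoTask.
 class SCA_IActuatorSketch
 {
 public:
     enum PulseMode {
         PULSE_WHILE_POSITIVE,  // performs its task only while positive pulses arrive
         PULSE_START_STOP       // starts on a positive pulse, stops on a negative one
     };

     explicit SCA_IActuatorSketch(PulseMode mode) : m_mode(mode), m_active(false) {}
     virtual ~SCA_IActuatorSketch() {}

     void HandlePulse(bool positive)
     {
         switch (m_mode) {
             case PULSE_WHILE_POSITIVE:
                 if (positive)
                     DoTask();
                 break;
             case PULSE_START_STOP:
                 if (positive && !m_active)      { m_active = true;  Start(); }
                 else if (!positive && m_active) { m_active = false; Stop();  }
                 break;
         }
     }

 protected:
     virtual void DoTask() {}
     virtual void Start() {}
     virtual void Stop() {}

 private:
     PulseMode m_mode;
     bool m_active;
 };

This would not remove the inconsistency, but it would state each actuator's behaviour in one obvious place instead of burying it in every actuator's implementation.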

Using BKE_rigidbody in the Game Engine

Note: This should probably be moved to its own page

I have begun researching whether and how we can use BKE_rigidbody (i.e., Blender's Bullet code) in the BGE. This would reduce the amount of duplicate code we have and help keep Blender and the BGE on the same page in terms of physics (e.g., physics visualizations matching the simulation). The first problem that becomes apparent is that BKE_rigidbody is designed to work tightly with Blender code (as it should be). However, this makes some of the simulation functions not very useful to the BGE, since the BGE has its own scene and object types; if the BGE started using more Blender data types directly, BKE_rigidbody would become a lot more useful to us. I also saw some code for baking/caching that might not make sense in real-time, but I'm sure it can be bypassed with appropriate flags.

There is also RBI_api. We could probably use it in the BGE, but since it is mostly just a light C wrapper around Bullet, it does not give us much benefit on its own. The things we really want (e.g., determining the physics shape of an object) are in BKE_rigidbody.
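For reference, here is roughly what driving Bullet through RBI_api looks like, based on my reading of intern/rigidbody/RBI_api.h (treat the exact signatures as approximate). Every call is a thin forward to the corresponding Bullet function, which is why the wrapper alone buys us little:

 #include "RBI_api.h"

 void rbi_api_smoke_test(void)
 {
     const float gravity[3] = {0.0f, 0.0f, -9.81f};
     rbDynamicsWorld *world = RB_dworld_new(gravity);

     // A 1m cube dropped from 5m; the box shape takes half-extents.
     rbCollisionShape *shape = RB_shape_new_box(0.5f, 0.5f, 0.5f);
     const float loc[3] = {0.0f, 0.0f, 5.0f};
     const float rot[4] = {1.0f, 0.0f, 0.0f, 0.0f};  // unit quaternion (w, x, y, z)
     rbRigidBody *body = RB_body_new(shape, loc, rot);
     RB_body_set_mass(body, 1.0f);
     RB_dworld_add_body(world, body, 0xFFFF);

     // One 60Hz step; this is a direct pass-through to Bullet's
     // stepSimulation. None of the logic the BGE actually needs (deriving
     // the shape from the object, syncing transforms) lives at this level.
     RB_dworld_step_simulation(world, 1.0f / 60.0f, 5, 1.0f / 120.0f);

     RB_dworld_remove_body(world, body);
     RB_body_delete(body);
     RB_shape_delete(shape);
     RB_dworld_delete(world);
 }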

I also looked into taking just the shape conversion code, but it currently all sits behind static functions, meaning it is not accessible from outside BKE_rigidbody.