Category Archives: Blender

Unity3D component setup from a Blender scene

It is sometimes convenient to specify the component setup of a model in Blender instead of Unity3D, to hide implementation details and to make the life of artists and designers easier. This is for example useful to specify invisible collision geometry, or areas that can’t be navigated by AI. The easiest way to do that is to use custom properties in Blender.

Custom Properties

Blender allows you to add custom string, float or integer properties to an object. Custom properties that are attached to meshes, materials or textures are stored separately, and are currently not picked up by Unity3D.

The new binary FBX exporter (from Blender version 2.71+) can write these values to an FBX file, which Unity3D can pick up during import. The automatic Blender importer in Unity3D still uses the old ASCII exporter, so you need to either export files manually, or modify the Unity3D import script to select the new exporter. I think it is a good idea to get the latest FBX exporter scripts from the nightly build, which contains many fixes over the version released in 2.71.


The new binary exporter needs to be selected, and “custom properties” needs to be enabled. I modified my Blender and Unity3D scripts to always use the new exporter and have “custom properties” enabled when models are imported.

On the Unity3D side the properties are received in AssetPostprocessor.OnPostprocessGameObjectWithUserProperties. It is a good idea to start with the example script to check that the custom properties are coming across.

Example setup

I am currently using several custom properties:

  • castshadows (default on): toggle shadow casting on the renderer
  • receiveshadows (default on): toggle receiving shadows on the renderer
  • collision (default on): keep or remove the generated mesh collider
  • static (default off): mark the object static for lightmapping, occlusion and batching
  • render (default on): keep or remove the mesh renderer and filter
  • navigation (default on): include the object in navigation mesh generation

It is of course possible to add more game-specific components to a GameObject. It seems to be safe to delete components, though deleting GameObjects causes an error, so I just mark those objects as EditorOnly.

My AssetPostprocessor script is probably a bit overkill, because it supports setting file-global defaults in an Empty called “defaults” in Blender. If you hardwire the defaults you may be able to just process the properties once in OnPostprocessGameObjectWithUserProperties. That function is called once for each Blender node that has custom properties. To process all GameObject nodes you need to handle OnPostprocessModel as well.
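The lookup order (per-object settings first, then file-global defaults, then a hardwired fallback) can be sketched in a few lines of Python; the names here are illustrative, and the actual implementation is the C# script below:

```python
# Sketch of the property lookup used by the asset postprocessor:
# per-object settings override file-global defaults, which override
# the hardwired default value. Names are illustrative only.

def get_property(settings, defaults, obj, key, default_value):
    """Resolve a boolean custom property for one object."""
    obj_settings = settings.get(obj)
    if obj_settings is not None and key in obj_settings:
        return int(obj_settings[key]) != 0
    if key in defaults:
        return int(defaults[key]) != 0
    return default_value

# Example: the "defaults" empty disables collision file-wide,
# but one object opts back in.
defaults = {"collision": 0}
settings = {"Rock": {"collision": 1}, "Grass": {}}

print(get_property(settings, defaults, "Rock", "collision", True))   # True
print(get_property(settings, defaults, "Grass", "collision", True))  # False
print(get_property(settings, defaults, "Tree", "render", True))      # True
```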

using UnityEngine;
using UnityEditor;
using System.Collections.Generic;

// automatically configure game objects imported from a model
public class SetupGameObject : AssetPostprocessor {
	Dictionary<string, object> defaults = new Dictionary<string, object> ();
	Dictionary<GameObject, Dictionary<string, object>> settings = new Dictionary<GameObject, Dictionary<string, object>>();

	private bool GetProperty(GameObject go, string key, bool defaultValue) {
		// per-object settings win over file-global defaults
		Dictionary<string, object> objectSettings = null;
		if (settings.TryGetValue (go, out objectSettings)) {
			object o = null;
			if (objectSettings.TryGetValue(key, out o)) {
				return (int)o != 0;
			}
		}
		object os = null;
		if (defaults.TryGetValue (key, out os)) {
			return (int)os != 0;
		}
		return defaultValue;
	}

	private void ApplySettings(GameObject go) {
		bool castShadows = GetProperty (go, "castshadows", true);
		bool receiveShadows = GetProperty(go, "receiveshadows", true);
		bool generateCollider = GetProperty(go, "collision", true);
		bool isStatic = GetProperty(go, "static", false);
		bool renderMesh = GetProperty (go, "render", true);
		bool navigation = GetProperty (go, "navigation", true);

		if (isStatic) {
			GameObjectUtility.SetStaticEditorFlags (go, StaticEditorFlags.LightmapStatic | StaticEditorFlags.OccluderStatic | StaticEditorFlags.OccludeeStatic | StaticEditorFlags.BatchingStatic | StaticEditorFlags.NavigationStatic | StaticEditorFlags.OffMeshLinkGeneration);
		} else {
			GameObjectUtility.SetStaticEditorFlags (go, 0);
		}

		if (navigation) {
			GameObjectUtility.SetStaticEditorFlags (go, GameObjectUtility.GetStaticEditorFlags (go) | StaticEditorFlags.NavigationStatic);
		} else {
			GameObjectUtility.SetStaticEditorFlags (go, GameObjectUtility.GetStaticEditorFlags (go) & ~StaticEditorFlags.NavigationStatic);
		}

		MeshRenderer meshRenderer = go.GetComponent<MeshRenderer>();
		MeshFilter meshFilter = go.GetComponent<MeshFilter> ();
		SkinnedMeshRenderer skinnedMeshRenderer = go.GetComponent<SkinnedMeshRenderer> ();
		if (renderMesh) {
			if (meshRenderer) {
				meshRenderer.castShadows = castShadows;
				meshRenderer.receiveShadows = receiveShadows;
			}
			if (skinnedMeshRenderer) {
				skinnedMeshRenderer.castShadows = castShadows;
				skinnedMeshRenderer.receiveShadows = receiveShadows;
			}
		} else {
			// deleting components seems to be safe, unlike deleting GameObjects
			if (meshRenderer) {
				Object.DestroyImmediate (meshRenderer);
			}
			if (meshFilter) {
				Object.DestroyImmediate (meshFilter);
			}
			if (skinnedMeshRenderer) {
				Object.DestroyImmediate (skinnedMeshRenderer);
			}
		}

		MeshCollider meshCollider = go.GetComponent<MeshCollider>();
		if (!generateCollider) {
			if (meshCollider) {
				Object.DestroyImmediate (meshCollider);
			}
		}
	}

	private void ApplyToChildren (GameObject go) {
		ApplySettings (go);
		// apply settings to all child gameobjects recursively
		foreach (Transform t in go.transform) {
			ApplyToChildren (t.gameObject);
		}
	}

	// this is called once by Unity3D after OnPostprocessGameObjectWithUserProperties
	void OnPostprocessModel (GameObject go) {
		ApplyToChildren (go);
	}

	// this is only called by Unity3D for game objects with properties
	void OnPostprocessGameObjectWithUserProperties(GameObject go, string[] properties, object[] values) {
		bool isDefault = go.name == "defaults";
		if (isDefault) {
			// I can't seem to delete this without errors, so let's just mark it "Editor only"
			go.tag = "EditorOnly";
		}
		Dictionary<string, object> newDict = new Dictionary<string, object> ();
		for (int i = 0; i < properties.Length; i++) {
			string prop = properties[i].ToLower();
			object val = values[i];
			if (isDefault) {
				defaults[prop] = val;
			}
			newDict[prop] = val;
		}
		settings [go] = newDict;
	}
}

Blender 2.71 is out, and there is good news and bad news…

Blender 2.71 is out and available for download. The good news is that my patch to fix the problems with the space transforms was accepted and is included in the new binary FBX exporter. The bad news is that a later change broke it, so that only the root node has the space transform baked in. For simple meshes that are attached to the root node that may be fine, but on models with a more complicated hierarchy this means that the problem has now moved to the first child node of the model root.

I submitted a patch to fix this but it probably missed the deadline for the 2.71 release. I’ll try to get it accepted into Blender 2.72, but I’ll probably have to provide another option to support people who rely on the 2.71 behaviour.

At the moment Blender crashes when called from Unity3D but I haven’t worked out if the problem lies in a change in Blender or Unity. Running the Unity export script from the command line works fine, so it may be some problem with the way Unity launches Blender. The automatic pipeline still uses the Ascii exporter, so it doesn’t benefit from the fix anyway.

Update: It looks like the crash in Unity3D is related to a Python/MSVC problem. I attached WinDbg and was just about to open a bug, but it is already being looked at. Updating the DLL fixes the crash for me.

Importing Animation Events from Blender into Unity3D

A question that comes up often in Unity3D support forums is how to import animation events from an application. There is no built-in way to do this, but it is possible to create a custom system by writing an exporter and importer for animation events yourself; this example does it for Blender.


This is an example implementation for Blender. It is best imported into an empty project first, because it is likely that each team will want to customise the scripts to suit its workflow. ~265kB, MIT license
This is a new version with support for Mecanim.

Older versions: ~180kB, MIT license

Installation into Blender

Open “File/User Preferences…” and the “Addons” panel.

Addon Preferences screen


  1. Select “Install from file…” and open “Assets\Blender\” from the project you installed the package into
  2. If you can’t easily see the new addon enable the “Import Export” category
  3. Enable “Export Animation Events”
  4. Save your settings

The File/Export menu should now have an “Anim Events (.xml)” entry. The default import script expects event names in a specific format, but you can adjust the import script to use a different naming convention if that suits your workflow better.

Set up in Blender

Blender supports both Timeline Markers, which are part of the scene, and Pose Markers, which are part of an action. The import script only looks for pose markers, but the timeline markers are exported as well. I assume you set up your Blender scene to have one action for each animation, so that they are imported as separate animations into Unity3D.

Blender Markers


By default markers in Blender are created on the timeline, but when “Show Pose Markers” is enabled in the action editor markers are created as part of the action. Pose markers show up as tiny diamonds in the dope sheet, while timeline markers are displayed as tiny triangles. The marker name needs to be appropriate for the animation event you want to trigger. In my example implementation I write them out in a format similar to how they are displayed in Unity’s animation editor, but it is also possible to reserve a few keywords that the importer then processes into animation events.
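This naming convention can be turned into animation events with a small parser. The sketch below is plain Python for illustration, not the actual importer code; it splits a marker name into a function name and a typed parameter:

```python
import re

def parse_marker(name):
    """Split a marker name like 'Pear(1.0)' into (function, parameter).

    The parameter comes back as int, float, or str depending on how it
    is written; a bare name or empty parentheses yield None.
    """
    m = re.match(r'^(\w+)\((.*)\)$', name)
    if not m:
        return name, None          # plain marker, no arguments
    func, arg = m.group(1), m.group(2)
    if arg == "":
        return func, None
    if arg.startswith('"') and arg.endswith('"'):
        return func, arg[1:-1]     # string parameter
    if re.fullmatch(r'-?\d+', arg):
        return func, int(arg)      # integer parameter
    return func, float(arg)        # anything else: try float

print(parse_marker('Banana(1)'))         # ('Banana', 1)
print(parse_marker('Raspberry("Pi")'))   # ('Raspberry', 'Pi')
print(parse_marker('Pear(1.0)'))         # ('Pear', 1.0)
print(parse_marker('Apple()'))           # ('Apple', None)
```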

The example exporter writes to an XML file, but it would be equally possible to use CSV or plain text files.

<?xml version="1.0" ?>
<scene fps="24" version="1">
	<marker frame="77" name="TimeLineMarker"/>
	<action name="Alpha">
		<marker frame="5" name="Banana(1)"/>
		<marker frame="20" name="Raspberry(&quot;Pi&quot;)"/>
		<marker frame="10" name="Pear(1.0)"/>
	</action>
	<action name="Beta">
		<marker frame="0" name="Apple()"/>
	</action>
	<action name="CubeAction"/>
	<action name="Gamma"/>
</scene>
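For illustration, here is a minimal Python sketch of how such a file could be parsed. The XML is an inline, hypothetical copy, and the frame-to-seconds conversion assumes the importer wants event times in seconds, as Unity’s AnimationEvent.time does:

```python
import xml.etree.ElementTree as ET

# Hypothetical inline copy of an exported event file; a real importer
# would load the .xml written by the Blender addon instead.
data = """<?xml version="1.0" ?>
<scene fps="24" version="1">
    <action name="Alpha">
        <marker frame="5" name="Banana(1)"/>
        <marker frame="20" name="Raspberry(&quot;Pi&quot;)"/>
    </action>
    <action name="Beta">
        <marker frame="0" name="Apple()"/>
    </action>
</scene>"""

root = ET.fromstring(data)
fps = float(root.get("fps"))

# Collect (action name, time in seconds, marker name); frames are
# divided by the scene fps to get seconds.
events = []
for action in root.iter("action"):
    for marker in action.iter("marker"):
        seconds = int(marker.get("frame")) / fps
        events.append((action.get("name"), seconds, marker.get("name")))

for e in events:
    print(e)
```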

Import into Unity3D

I implemented three different import methods that are available from a preference pane in Unity3D.

Event Import Preferences


  • Import XML looks for an exported XML event file and applies the events in that file to the animations of a model during import.
  • Import Asset looks for a ScriptableObject asset containing event data. This can be used if you don’t want to specify animation events in Blender. To get an empty asset you need to select the model in the project view and select “Window/Add Event Data”. For production use it would be worth wrapping the class with a custom editor.
  • Import Automatic runs Blender in the background, exports animation events into the temporary directory and imports them right away into the model. This only works with .blend files. To make this work you need to specify the path to the Blender executable.

This is implemented by adding an AssetPostprocessor with an OnPostprocessModel handler that is called whenever a model is imported in Unity. At the point when the handler is called the model and animation data are still writable, so it loads the event descriptions from a second file and adds them to the appropriate animations.

I would use the automatic import if models are kept as .blend files in the project, the XML import if models are manually exported into FBX files, and the asset importer if the animation tool doesn’t support events.

Example event handler

Animation events need to be received by a behaviour. This example behaviour is added automatically by the asset postprocessor, so each team will most likely want to customise it for their projects.

using System;
using UnityEngine;

public class EventReceiver : MonoBehaviour {
	public void Apple() {
		Debug.Log ("Apple()");
	}
	public void Banana(int i) {
		Debug.Log (String.Format ("Banana({0})", i));
	}
	public void Pear(float f) {
		Debug.Log (String.Format ("Pear({0})", f));
	}
	public void Raspberry(string s) {
		Debug.Log (String.Format ("Raspberry({0})", s));
	}
}

Other options for automatic export

Running Blender from the asset postprocessor is the least invasive method, but requires Blender to be run twice, once for exporting the FBX and once for the animation events. An alternative is to run the animation event exporter whenever an FBX is exported, either from the Blender side or the Unity side.

The Blender script to export FBX lives at “Program Files\Blender Foundation\Blender\version\scripts\addons\io_scene_fbx\” and the Unity3D script to run Blender is at “Program Files\Unity\Editor\Data\Tools\”. Customising these scripts may however mean more work across a big team or when updating Unity or Blender.

Another option is to keep Blender running in the background and sending commands to it, similar to how the Max and Maya pipelines work in Unity3D.


Mecanim has an interface to add events during import, so I modified my scripts to add new events alongside events specified this way. Unfortunately there doesn’t seem to be an official API to get the imported animation clips when using Mecanim. I’m using Object.FindObjectsOfType (typeof(AnimationClip)), which picks up some other animation clip objects as well, and there doesn’t seem to be a way to identify which ones belong to the import. At the moment I rely on clip names not having any conflicts. Please let me know if you run into problems.
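The "no conflicts" assumption is easy to check up front. This is a small illustrative Python sketch (not part of the package) of the kind of duplicate-name check that makes matching events to clips by name safe:

```python
from collections import Counter

def find_conflicts(clip_names):
    """Return clip names that occur more than once; matching events to
    clips by name is only unambiguous when this list is empty."""
    return sorted(name for name, n in Counter(clip_names).items() if n > 1)

print(find_conflicts(["Alpha", "Beta", "Alpha"]))  # ['Alpha']
print(find_conflicts(["Alpha", "Beta", "Gamma"]))  # []
```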

Package Contents

Assets\Blender\ Blender exporter addon installation file
Assets\Blender\io_anim_events\ Script to write animation markers into an XML file
Assets\Blender\io_anim_events\ Script to register the exporter into the Blender UI
Assets\Models\cuberot.blend Simple model with animation and markers
Assets\Models\ Manually edited marker using a ScriptableObject
Assets\Models\ XML file exported from Blender containing events
Assets\Models\Materials\Material.mat Diffuse material for test object
Assets\Scenes\TestModel.unity Example Scene. Running this should display log messages to the console whenever an animation event is triggered
Assets\Scripts\EventReceiver.cs Example implementation of a MonoBehaviour that receives animation events. It will be added automatically to an imported model.
Assets\Scripts\Editor\EventData.cs Example ScriptableObject to specify animation events outside of Blender
Assets\Scripts\Editor\EventImporter.cs AssetPostprocessor implementation that applies animation events to a model after it was imported
Assets\Scripts\Editor\EventImporterPreferences.cs Preference panel to select import method and specify Blender path
Assets\Scripts\Editor\XMLEvents.cs Class to load XML events. This uses the C# XmlSerializer class

Importing Layered Textures from Blender into Unity3D


Dwarf in Unity

This is a first experiment to import layered textures from Blender into Unity3D. This still uses the default Unity3D lighting function, but combines the texture layers that are on the Blender shader. The left dwarf is imported with the default exporter, the middle dwarf uses the improved exporter I am working on, and the right dwarf uses an improved material importer.


Dwarf in Blender

To compare, this is the dwarf rendered in Blender.

Unity3D and Blender

Many people are using Blender to create 3D assets to use in Unity3D, but the conversion doesn’t always go smoothly. I decided to spend some time researching and documenting the process and to provide solutions for key problems.

Model orientation

A simple test model in Blender

The first problem people often report is the orientation of models. Blender and Unity3D are using different coordinate systems and every model moving between them has to go through a conversion. To visualise this I made a simple test model in Blender, with arrow models aligned with the coordinate axes. The first thing to note is that Z is up and Blender is using a right handed coordinate system.

The axes model in Unity

This model is easily imported into Unity and shown here with the camera rotated so that the model has the same orientation as in Blender. The model is imported once as a .blend file, once as an FBX file and once as a Collada DAE file. The axes don’t match at all, so we need to have a look at the reasons.

The easiest change to explain is the change to Y and Z: Unity3D works with the Y axis up, so Blender rotates the model by 90 degrees around the X axis. The other change is a bit more complicated: Unity3D uses a left handed coordinate system, so the model has to be mirrored during import to make sure it looks the same as in Blender.


The model in a different FBX viewer.

A quick look in a different FBX viewer shows that this must be done during import into Unity3D and not export from Blender. Annoyingly unintuitive, but this seems to match the Blender exporter code. We can use this information to predict how a model will be oriented when imported into Unity3D: up will stay up, positive Y in Blender will point towards negative Z in Unity3D and positive X in Blender will point towards negative X in Unity3D.
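As an illustration of that stated mapping (this is not code from either tool), a point can be converted like this:

```python
def blender_to_unity(p):
    """Map a Blender point (x, y, z) to its Unity3D orientation:
    up stays up (Blender +Z -> Unity +Y), Blender +Y -> Unity -Z,
    and Blender +X -> Unity -X."""
    x, y, z = p
    return (-x, z, -y)

print(blender_to_unity((0, 0, 1)))  # up stays up: (0, 1, 0)
print(blender_to_unity((0, 1, 0)))  # Blender +Y -> Unity -Z: (0, 0, -1)
print(blender_to_unity((1, 0, 0)))  # Blender +X -> Unity -X: (-1, 0, 0)
```

Note that this mapping has a negative determinant, i.e. it mirrors the model, which matches the handedness change described above.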


Example model with baked space transform.

Unfortunately this is not the whole story: Blender bakes the rotation around X only into the transforms, so if a model is used for example as a terrain tree model it will be drawn sideways. I made a change to the exporter to bake this change into the actual mesh data, which solves this kind of problem. It will need some testing before it can be applied to the official Blender distribution, but now you can use Blender models in a terrain without having too much of a headache about which axis is up.

Test model on terrain.


This is a model exported with baked space transform used as a “tree” in a terrain.

Model materials

Material problems with the current FBX exporter.


The other big problem people are having is with material export. This shows a dwarf model exported with the original exporter, which fails to assign materials even remotely close to the Blender scene. This is no surprise because the shading systems in Unity3D and Blender are very different: Blender’s shading system is meant for offline rendering, and Unity3D’s shading system is meant for efficient hardware rendering on many devices. It is still disappointing that the materials are not even close within the limitations of the two systems. This can be explained by the system that the Blender FBX exporter uses to select textures: You can assign textures directly to the UV data in a mesh, and the exporter combines those textures with the meshes’ material settings.

Fixed materials.


I changed the system so that it goes through a material’s texture slots and writes out the textures. Unity3D is able to pick the correct diffuse and normal maps from that.

There doesn’t seem to be a way for FBX files to specify exactly which UV map has to be used for a texture, so the UV settings in a texture are ignored for now. Unity3D uses only the first two UV maps anyway (one for textures, one for lightmaps), so this shouldn’t be a big problem. I think it would be possible to handle some cases by rebuilding UV data during export, but that may never be able to handle all cases and could cause more problems and unpredictable behaviour down the line.

Fallback for meshes with no materials

Fallback for meshes without materials.

The old system is still useful as a fallback, so it is used for meshes that have no material assigned. The textures in this example come from the OpenGameArt models benny and skeleton.

General Problems

Skinning problem.

Skinning problem with bone envelopes disabled.

There are often problems in models that are not easily visible in a modelling application, but show up in a realtime engine. In this case there are a few vertices near the boots that have no bone attached to them. These don’t show up in Blender because bone envelopes are enabled, but show up in Unity3D because the realtime skinning only uses vertex groups. There is a useful third party script to show vertex weights in Blender that makes it easy to find misbehaving vertices and to add or remove them from vertex groups.

Blender can embed textures into a .blend file. That is very handy for sharing scenes, but Unity3D has no way to access those textures and the FBX exporter doesn’t write them out. The menu option “File/External Data/Unpack into Files” can be used to extract most textures, though textures that were created inside Blender, for example with the texture paint tool, may need to be saved manually. This can be done in the UV/Image Editor with menu option “Image/Save Image”.

Beta test the new exporter

You can beta-test the new version of the Blender FBX exporter so that more problems can be caught before it goes into Blender officially. I developed my changes with Blender 2.68a, but there is a good chance that the script will work with earlier versions.

  • The space transform from Z-up to Y-up is baked into the whole model including the geometry.
    • This MAY break models that have been imported and posed into Unity3D already, so it shouldn’t be changed in the middle of a project.
  • Textures are now taken from the material’s texture slots instead of the UV layers. This means that normal maps transfer over correctly. The code falls back to UV-layer textures only if no material is set.
    • This MAY break models that rely on the old behaviour. If your textures are set on the material you should be fine.

Please remember that this is a beta version and you should be careful with it in production work. Mixing models exported with the new and old exporter is most likely going to cause problems at some point. If you are in a team then please discuss it with teammates or leads before using it, and try it out on a separate copy of the project first.

It is very important that you make a backup of the existing exporter script before installing this, in case you want to return to the old version.


This code replaces Blender’s original export scripts. On Windows they are in %PROGRAMFILES%\Blender Foundation\Blender\<version>\scripts\addons\io_scene_fbx
On OSX they are in the application bundle, in

You can disable baking the space transform if you export an FBX file manually from Blender. The option is the last one at the bottom, called “Bake Space Transform”. To change the default you have to modify the code:

bake_space_transform = BoolProperty(
            name="Bake Space Transform",
            description="Bake the transform from Blender's space into target space into all transforms and the mesh data",
            default=True,
            )

Change the default to False. The settings that Unity3D uses for importing .blend files are hardcoded in the other file:

def defaults_unity3d():
	return dict(global_matrix=Matrix.Rotation(-math.pi / 2.0, 4, 'X'),
		    object_types={'ARMATURE', 'EMPTY', 'MESH'},
		    # ... other settings omitted ...
		    bake_space_transform=True,
		    )

Change bake_space_transform to False. Unfortunately the changes to the material system were too extensive to provide an option to switch back to the old system.
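As a sanity check on that global_matrix (in plain Python instead of Blender's mathutils), a rotation of -90 degrees around X does map Blender's Z-up onto Unity's Y-up:

```python
import math

def rotate_x(p, angle):
    """Rotate a point p = (x, y, z) by angle (radians) around the X axis."""
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (x, y * c - z * s, y * s + z * c)

# same angle as Matrix.Rotation(-math.pi / 2.0, 4, 'X')
angle = -math.pi / 2.0

up = rotate_x((0.0, 0.0, 1.0), angle)       # Blender up (+Z)
forward = rotate_x((0.0, 1.0, 0.0), angle)  # Blender forward (+Y)

print([round(v, 6) for v in up])       # [0.0, 1.0, 0.0]  -> Unity up (+Y)
print([round(v, 6) for v in forward])  # [0.0, 0.0, -1.0] -> Unity -Z
```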

Here is the script for download:

Please let me know about your results!

Update: From what I’ve heard there will be a significant rewrite of the FBX exporter in one of the next versions of Blender. If you’re visiting this from the future, this was tested on Blender 2.68a, and I’ll try to re-test this on the new exporter as soon as it arrives.

Update 2: My patch was accepted into Blender, so Blender 2.71 will come with the “Bake Space Transform” option available.

I had to disable comments again thanks to spam. You can reach me on Twitter and Google Plus.