
Exporting 3D content for Qt 3D with Blender

At the heart of every 3D application is geometry. Qt 3D-based 3D applications are no different and require the user to either generate geometry or provide asset files for Qt 3D to load. This blog post demonstrates how Blender and its Python API could be used to write an exporter that generates geometry for Qt 3D.

For those of you not yet too familiar with Qt 3D, let me remind you that Qt 3D is based on an Entity Component System. A given Entity gets its behavior defined by the various components it aggregates.

Assuming you want to render some 3D content, a renderable entity would be composed like so:

Entity {
    components: [
        GeometryRenderer {
            geometry: ...
        },
        Material {}
    ]
}

Loading Geometries with Qt 3D

There are currently 3 ways to incorporate geometry in a Qt 3D application:

Using the default meshes provided by Qt 3D Extras

QConeMesh, QCuboidMesh, QCylinderMesh, QPlaneMesh, QSphereMesh, QTorusMesh

These are all QGeometryRenderer subclasses which take care of generating geometry based on configurable properties. It’s up to you to provide a Material that will shade your geometry.

Entity {
    components: [
        ConeMesh {
            rings: 16
            slices: 16
            topRadius: 0.1
            bottomRadius: 1.0
        },
        Material {...}
    ]
}


Using QMesh component to load a geometry file

Several backend plugins are available for .obj, .fbx and glTF version 1. The list of plugins will likely grow in the future (work on a glTF version 2 importer is currently ongoing). Since QMesh is also a QGeometryRenderer subclass, you again need to provide a Material.

Entity {
    components: [
        Mesh {
            source: "path/to/my/file.obj"
        },
        Material {...}
    ]
}


Using QSceneLoader component to load a scene file

QSceneLoader is also plugin based. At the time of writing, one such plugin is based on the Open Asset Importer Library (Assimp), which supports a wide range of formats.

There’s also another plugin which supports glTF version 1.

Entity {
    components: [
        SceneLoader {
            source: "path/to/my/scene.obj"
        }
    ]
}

The subtlety between QMesh and QSceneLoader is that QMesh loads only a single mesh, whereas QSceneLoader loads an entire scene. Essentially, QSceneLoader will generate a subtree of QEntity objects with QGeometryRenderer and QMaterial components. The nice thing is that QSceneLoader will also instantiate matching Materials to go with each mesh.

In most cases you’ll either use QSceneLoader to load a rather complex scene or, if you know how the scene content is structured, decide yourself on which parts you need and use several QMesh components.


Now this is all fine but you’ll often end up with one of these issues:

  • Geometry is generated at run-time which can be costly
  • Many 3D formats are text-based (takes up space on disk, slow parsing at run time)
  • Loading a scene subtree requires traversing said subtree to retrieve entities composed of components of interest
  • QSceneLoader may duplicate materials, effects, attributes depending on the scene file
  • Declaring several QEntity objects with QMesh components can be tedious
  • Import plugins for QMesh/QSceneLoader are not available on all platforms and may not cover all formats.


From a general perspective, the current mechanisms make it tedious for the developer to control a complex scene. They either completely hide away the structure of a loaded subtree or, on the contrary, force you to know exactly what composes your scene and let you do the heavy lifting of deciding which meshes you care about.

If you are after performance, you need to know how your scene is structured, which materials are in use and how large your geometries are. With this information you can decide if you need to:

  • rework your geometry to reduce the number of vertices to be drawn
  • group several parts together so that they can all be drawn at once
  • reduce the amount of materials required.

So what if instead of loading something blindly, we generated Qt 3D content in advance, as part of our tooling or asset conditioning work?



Blender is a free and open-source 3D creation suite. I wouldn’t go as far as saying that it’s intuitive for a newcomer, but it is a really powerful tool. In addition, it provides a powerful Python API which can be used to write importers and exporters, or simply to automate processes (such as importing and exporting geometries offline). The nice thing is that the API is documented; the bad thing is that the documentation is mostly a list of methods and members…

How could Blender solve any of the issues we have?

Instead of generating or loading geometry at runtime, I decided to experiment and write an exporter plugin for Blender. Its goal would be to generate the QML content for an application (though we could easily extend it to cover C++ as well) and export the geometry buffers to binary files that just need to be read at runtime, without requiring any parsing.

This solves the issue of slow startup caused by parsing text-based files, and avoids duplicating effects and materials. It also removes the need to deploy import plugins (since the importing is now performed offline): we only ship binary files that can be read with the readBinaryFile function on the QML Buffer element. Finally, it gives us the complete structure of our scene.

Buffer {
    type: Buffer.VertexBuffer
    data: readBinaryFile("qrc:/assets/binaries/bufferdata.bin")
}
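On the tooling side, writing such a binary buffer from Python is straightforward. Here is a minimal sketch; the layout (six little-endian floats per vertex, position plus normal) is an illustrative assumption, not the exporter's actual format:

```python
import struct

# Pack interleaved vertex data (position + normal, six floats per vertex)
# into a little-endian binary file that a QML Buffer can load via
# readBinaryFile() without any parsing at runtime.
def write_vertex_buffer(path, vertices):
    # vertices: iterable of (px, py, pz, nx, ny, nz) tuples
    with open(path, "wb") as f:
        for vertex in vertices:
            f.write(struct.pack("<6f", *vertex))
```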


Creating a Blender Exporter Addon

A Blender addon can easily be created by subclassing bpy.types.Operator and, optionally, bpy_extras.io_utils.ExportHelper, which provides convenience helpers as well as a default UI layout.

Overview of an exporter class

  1. Define members from bpy.types.Operator and ExportHelper
    • bl_idname
      • The addon will be accessible in the Blender API through bpy.ops.bl_idname
    • bl_label
      • The name used on the export button UI
    • filename_ext
      • The name of our format extension if we have one
  2. Set UI properties and export options
    • Blender provides default property types
    • In the Qt3DExporter’s case, properties were added to control:
      • whether to export only the selected objects or the whole scene
      • whether to export only the visible objects in the scene
      • whether we want meshes to be grouped in a collection
      • whether we want materials to be grouped in a collection
      • whether we want to export a full Qt3D application or just the scene
  3. Implement the draw method to lay out our properties
    • Retrieve the operator’s layout and add rows and columns
      • We can add labels and reference properties we have previously created
  4. Define the execute method which will be the entry point of our exporter
class Qt3DExporter(bpy.types.Operator, ExportHelper, OrientationHelper):
    """Qt3D Exporter"""
    bl_idname = "export_scene.qt3d_exporter"
    bl_label = "Qt3DExporter"
    filename_ext = ""

    # We set up exporter UI here
    use_mesh_modifiers = BoolProperty(
        name="Apply Modifiers",
        description="Apply modifiers (preview resolution)",
        default=True,
    )

    use_selection_only = BoolProperty(
        name="Selection Only",
        description="Only export selected objects",
        default=False,
    )

    use_visible_only = BoolProperty(
        name="Visible Only",
        description="Only export visible objects",
        default=False,
    )

    def draw(self, context):
        layout = self.layout
        col = layout.column()
        col.label("Nodes", icon="OBJECT_DATA")
        col.prop(self, "use_selection_only")
        col.prop(self, "use_visible_only")

    def execute(self, context):
        # Actual exporting work to be done here
        return {'FINISHED'}

def createBlenderMenu(self, context):
    self.layout.operator(Qt3DExporter.bl_idname, text="Qt3D (.qml)")

# Register against Blender
def register():
    bpy.utils.register_class(Qt3DExporter)
    bpy.types.INFO_MT_file_export.append(createBlenderMenu)

def unregister():
    bpy.utils.unregister_class(Qt3DExporter)
    bpy.types.INFO_MT_file_export.remove(createBlenderMenu)

Most of the work will be done in the execute method. When reaching that point you’ll want to:

  1. check which options have been selected by the user
  2. retrieve the export path selected
  3. gather data from the blender scene
  4. perform any post processing or conversion
  5. write the exported data in whichever format you’re interested in


Parsing Blender Data


We can do a lot of things with the Blender API; the hard part is really finding out what you need.

In our particular case we only care (for a first version) about:

  • Objects (bpy.data.objects)
    • Collections of objects that reference a datablock
      • name
      • data (reference to a datablock)
      • type (type of datablock being referenced)
      • matrix_local
      • matrix_world
      • select (whether the object is selected)
      • parent (reference to a parent object)
  • Meshes (bpy.data.meshes)
    • Collection of datablocks containing information about a mesh
      • name
      • material slots (references to materials used by the mesh)
      • uv_layers
      • vertex_colors
      • vertices (list of position)
      • edges (an edge references 2 vertices)
      • loops (collection of loops, a loop references a vertex and an edge)
      • polygons (a list of loops forming a polygon)
  • Materials (bpy.data.materials)
    • Collection of datablocks containing information about a material
      • name
      • ambient
      • diffuse_color
      • specular_color
  • Modifiers (object.modifiers)
    • Collection of modifiers an object can reference
    • A modifier is a visual transformation applied to a mesh
      • Mirror
      • Array
      • Solidify
      • Subsurface…
    • An object referencing a Mesh datablock that has modifiers can be transformed into a Mesh with the modifiers applied by calling object.to_mesh()
  • Lamps (bpy.data.lamps)
    • Collection of datablocks containing information about lamps
      • type (POINT, SPOT, SUN)
      • color
      • intensity
      • SPOT
        • spot_size (cut off angle)
        • constant_coefficient (constant attenuation)
        • linear_attenuation (linear attenuation)
        • quadratic_attenuation (quadratic attenuation)
      • POINT
        • constant_coefficient (constant attenuation)
        • linear_attenuation (linear attenuation)
        • quadratic_attenuation (quadratic attenuation)
  • Scene (bpy.context.scene)
    • References objects and render settings for the scene
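As a side note, the three lamp coefficients above plug into the standard attenuation formula used by point and spot lights; a quick sketch of how they combine:

```python
# Standard falloff model for point/spot lights: the constant, linear and
# quadratic coefficients scale the corresponding terms of the divisor.
def attenuation(distance, constant, linear, quadratic):
    return 1.0 / (constant + linear * distance + quadratic * distance ** 2)
```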

Now that we know what we care about, the next part is traversing these collections and converting them to Qt 3D content.


First we need to go over all the meshes in the scene and gather information required to convert these to QGeometryRenderers, QGeometry, QAttributes and QBuffers.

The idea is to go over each Blender mesh and process them as follows:

  1. Triangulate
  2. Apply the modifiers it references
  3. Retrieve vertex data (position, normals, texture coordinates, colors)
    1. Write data into a binary file
    2. Record description of attributes
  4. Compute the indices
    1. For each material being referenced by the blender mesh
      1. Create a submesh (basically a QGeometryRenderer in Qt3D)
      2. For each polygon referenced by the submesh
        1. compute list of indices based on the loops of the polygon
      3. generate and record the IndexAttribute for the submesh
    2. Generate the IndexBuffer based on the sub meshes
    3. Write data into a binary file.

We keep the data we have produced here for later.
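The triangulation and index computation above can be sketched in plain Python (no bpy; the data model here is hypothetical, for illustration only):

```python
# Fan-triangulate one polygon's loop indices (step 1: triangulate).
def fan_triangulate(loop_indices):
    tris = []
    for i in range(1, len(loop_indices) - 1):
        tris.append((loop_indices[0], loop_indices[i], loop_indices[i + 1]))
    return tris

# Group triangle indices into one index list per material slot
# (step 4: one submesh per material referenced by the mesh).
def build_submesh_indices(triangles):
    # triangles: list of (material_index, (i0, i1, i2))
    submeshes = {}
    for material_index, tri in triangles:
        submeshes.setdefault(material_index, []).extend(tri)
    return submeshes
```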


Next, we need to gather information about each instance of Material of the scene to later on create and instantiate QMaterial subclasses.

For now the exporter is only recording the name, ambient, diffuse and specular color. Later on I’d like to extend that to either export a shader directly or switch to PBR materials.
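A sketch of such an intermediary representation (class and field names are illustrative, not the exporter's actual code); deduplicating by name is what avoids the duplicated-materials issue mentioned earlier:

```python
from dataclasses import dataclass

# One record per unique Blender material.
@dataclass
class MaterialRecord:
    name: str
    ambient: float          # ambient factor
    diffuse_color: tuple    # (r, g, b)
    specular_color: tuple   # (r, g, b)

def collect_materials(materials):
    # Keep only the first material seen for each name.
    records = {}
    for m in materials:
        records.setdefault(m.name, MaterialRecord(
            m.name, m.ambient, m.diffuse_color, m.specular_color))
    return list(records.values())
```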


Once we’ve created an intermediary representation for our mesh data and material data, we can proceed with the actual exporting of the scene.

The idea is to retrieve all the objects referenced by the Blender scene. Then, from these objects, we can create a hierarchy.

Finally it’s just a matter of traversing the tree.

For each object, we determine what type of object we are dealing with and generate the matching Qt 3D content.
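That hierarchy pass can be sketched in plain Python as well (flat (name, parent) records stand in for real bpy objects):

```python
# Build a parent -> children map from flat object records, then walk it
# depth first, the natural order for emitting nested entities.
def build_hierarchy(objects):
    # objects: list of (name, parent_name_or_None)
    roots, children = [], {}
    for name, parent in objects:
        if parent is None:
            roots.append(name)
        else:
            children.setdefault(parent, []).append(name)
    return roots, children

def flatten(roots, children):
    order = []
    def visit(name):
        order.append(name)
        for child in children.get(name, []):
            visit(child)
    for root in roots:
        visit(root)
    return order
```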


All of the above work has been implemented in a dedicated Exporter class. It is instantiated and called in the execute function of our addon, which looks like this:

    def execute(self, context):
        exportSettings = self.as_keywords()
        exportSettings["global_matrix"] = axis_conversion(to_forward=self.axis_forward, to_up=self.axis_up).to_4x4()

        self.binaryDirectoryName = "assets/binaries/"
        self.shadersDirectoryName = "assets/shaders/"
        self.qmlDirectoryName = "qml"

        self.userpath = self.properties.filepath
        if not os.path.isdir(self.userpath):
            self.userpath = os.path.dirname(self.userpath)
            msg = "Selecting directory: " + self.userpath
            self.report({"INFO"}, msg)

        # switch to work dir and create directories
        os.chdir(self.userpath)
        if not os.path.exists(self.binaryDirectoryName):
            os.makedirs(self.binaryDirectoryName)
        if not os.path.exists(self.shadersDirectoryName):
            os.makedirs(self.shadersDirectoryName)
        if not os.path.exists(self.qmlDirectoryName):
            os.makedirs(self.qmlDirectoryName)

        # Retrieve the Blender scene
        scene = bpy.context.scene

        exporter = Exporter(scene, exportSettings)

        # Create QML Files
        # Create .qrc file
        # Create main.cpp
        # Create .pro
        return {'FINISHED'}

Does it work?

Well actually it does.

As a test sample, I’ve used a mesh from Blendswap.

Where to get it?

Here. Usage instructions are provided in the repository. It should work with pretty much any Blender scene. Just make sure you have assigned a material to the elements you want to export.

Next steps

  • Export Animations
  • Export Armatures
  • Better Material support
    • export textures
    • generate shader when possible
    • investigate if the QShaderProgramBuilder could be used to export node trees from blender
  • Use instancing for Array modifiers
  • C++ support

Categories: KDAB on Qt / Qt3D

4 thoughts on “Exporting 3D content for Qt 3D with Blender”

  1. Good job! But when I ran the exported project, it crashed with the output: “The process was ended forcefully.”
    I use Qt 5.9.4 and Blender 2.79.

    1. Paul Lemire

      I have only tried with 5.10/5.11 so far. Would you be able to try with any of these versions as well? I’ll try to check if I have a 5.9 build around to test with.

  2. It seems the “Additional Dependencies” option of the “Qt Resource Compiler” section doesn’t work as expected (or, at least, not how it used to work before the plugin change). In my case I created a .qrc file to contain the .qml source files to embed in resource form inside the executable. To automatically force recompilation of the .qrc file every time one of the .qml files inside it changes, I set each .qml file name in the Visual Studio file property “Additional Dependencies” field, and this worked as expected using the standard VS custom compilation mode. Now, after the plugin project update, this feature seems to still be present in the new interface, but inserting the .qml source file names doesn’t have any effect and no compilation is executed. Am I misunderstanding the use of this field?
