
Qt 6: How to Port Shader Effects from Qt 5



Recently, we wrote about porting a Qt 5 application to Qt 6. A large part of the work stems from changes in Qt's graphics stack – especially when you have lots of shaders strewn across your QML code.

In this blog post, we introduce a step-by-step guide on how to move the shader code you might have in your Qt 5 application to Qt 6.

Quick Overview

To sum it up, it mostly comes down to the following changes:

  • Move embedded shader strings from QML to separate files
  • Add a version to the first line in your shader files
  • The attribute and varying qualifiers were removed, so you have to rewrite your variable declarations. All variable declarations get a layout qualifier; this process is described in detail below
  • Remove the precision specifiers (highp and lowp)
  • The fragment shader’s gl_FragColor output becomes an ordinary out variable
  • Port the deprecated GLSL functions to their new counterparts, e.g. change occurrences of texture2D to texture

Porting the shaders

We have prepared a minimal example to demonstrate the porting process. It has a vertex shader deforming the incoming vertices to generate a wiggling effect and a fragment shader to blend back and forth between two abstract images. Below is the whole main.qml:

import QtQuick 2.15
import QtQuick.Window 2.12

Window {
    id: window
    width: melting.implicitWidth
    height: melting.implicitHeight
    visible: true
    color: "black"

    ShaderEffect {
        id: wiggleEffect
        property real strength: 5.0
        property real time: 50.0
        property real blendingProgress: -1
        property variant source1: firstImageSource
        property variant source2: secondImageSource
        anchors.centerIn: parent
        width: triangle.width
        height: triangle.height
        mesh: GridMesh {
            resolution: Qt.size(20, 20)
        }
        UniformAnimator on time {
            from: 0
            to: 100
            duration: 2000
            loops: -1
            running: true
        }
        UniformAnimator on blendingProgress {
            from: -1
            to: 1
            duration: 10000
            loops: -1
            running: true
        }
        vertexShader: "
            float noise(vec2 uv) {
                return fract(sin(dot(uv, vec2(12.9898, 78.233))) * 43758.5453123);
            }
            uniform highp mat4 qt_Matrix;
            uniform lowp float strength;
            uniform lowp float time;
            attribute highp vec4 qt_Vertex;
            attribute highp vec2 qt_MultiTexCoord0;
            varying highp vec2 qt_TexCoord0;
            void main() {
                qt_TexCoord0 = qt_MultiTexCoord0;
                highp vec4 pos = qt_Vertex;
                lowp float angle = 2. * 3.141 * (noise(pos.xy) + time / 100.0);
                lowp float strengthWithVariation = strength * noise(pos.yx);
                pos.x += cos(angle) * strengthWithVariation;
                pos.y += sin(angle) * strengthWithVariation;
                gl_Position = qt_Matrix * pos;
            }"
        fragmentShader: "
            varying highp vec2 qt_TexCoord0;
            uniform lowp float qt_Opacity;
            uniform lowp float blendingProgress;
            uniform lowp sampler2D source1;
            uniform lowp sampler2D source2;
            void main() {
                lowp float alpha = abs(blendingProgress);
                gl_FragColor = ((1. - alpha) * texture2D(source1, qt_TexCoord0)
                                + alpha * texture2D(source2, qt_TexCoord0))
                               * qt_Opacity;
            }"
    }
    ShaderEffectSource {
        id: firstImageSource
        live: false
        hideSource: true
        sourceItem: Image {
            id: melting
            width: implicitWidth
            height: implicitHeight
            source: "qrc:/melting.jpg"
            fillMode: Image.PreserveAspectFit
        }
    }
    ShaderEffectSource {
        id: secondImageSource
        live: false
        hideSource: true
        sourceItem: Image {
            id: triangle
            width: implicitWidth
            height: implicitHeight
            source: "qrc:/triangle.jpg"
            fillMode: Image.PreserveAspectFit
        }
    }
}
As you can see, there is one ShaderEffect component that comes with both a vertex shader and a fragment shader. The shader code is embedded as strings, as is often the case in Qt 5 applications.
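The displacement math in the vertex shader is plain arithmetic, so it can be studied outside of GLSL. Below is a minimal Python translation of the shader's noise() helper and per-vertex offset – an illustration only, not a Qt API; the function names simply mirror the shader code:

```python
import math


def fract(x):
    # GLSL fract(): fractional part of x
    return x - math.floor(x)


def noise(u, v):
    # Same pseudo-random hash as the shader:
    # fract(sin(dot(uv, vec2(12.9898, 78.233))) * 43758.5453123)
    return fract(math.sin(u * 12.9898 + v * 78.233) * 43758.5453123)


def wiggle(x, y, strength, time):
    # Per-vertex offset applied by the vertex shader:
    # a pseudo-random angle rotated over time, scaled by a
    # per-vertex variation of the strength uniform.
    angle = 2.0 * 3.141 * (noise(x, y) + time / 100.0)
    variation = strength * noise(y, x)
    return (x + math.cos(angle) * variation,
            y + math.sin(angle) * variation)
```

Since noise() returns values in [0, 1), a vertex never moves farther than strength pixels – which is why the default strength of 5.0 produces a gentle wiggle on the 20x20 GridMesh rather than tearing the image apart.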

To port the ShaderEffect, we first have to move the shader strings into separate files. Create a directory named shader and add two files to it:
  • wiggleblender.vert
  • wiggleblender.frag

(or whatever names you give to your shaders; use appropriate file extensions to distinguish between fragment and vertex shaders).

Copy the shader strings into the corresponding files. Now change the ShaderEffect properties fragmentShader and vertexShader in your main.qml accordingly:
ShaderEffect {
    id: wiggleEffect
    // ...
    vertexShader: "qrc:/shader/wiggleblender.vert.qsb"
    fragmentShader: "qrc:/shader/wiggleblender.frag.qsb"
}
Note that the file extension we are using here is qsb – which stands for Qt Shader Baker. Qsb files can contain different shader variants – one for each of the supported graphics backends like OpenGL, Vulkan, DirectX or Metal. We are going to create qsb files later on from the ported shader files.

You can reference the qsb files directly, but in production code it's often better to access assets via Qt's resource system.

Porting the shader code is straightforward. Here is the ported vertex shader:
#version 440                                                                       // 1
layout(location = 0) in vec4 position;                                             // 2
layout(location = 1) in vec2 texcoord;
layout(location = 0) out vec2 coord;                                               // 3
layout(std140, binding = 0) uniform buf {                                          // 4
    mat4 qt_Matrix;                                                                // 5
    float qt_Opacity;
    float strength;
    float time;
    float blendingProgress;
} ubuf;
out gl_PerVertex { vec4 gl_Position; };                                            // 6
float noise(vec2 uv) {
    return fract(sin(dot(uv, vec2(12.9898, 78.233))) * 43758.5453123);
}
void main() {
    coord = texcoord;
    vec4 pos = position;
    float angle = 2. * 3.141 * (noise(pos.xy) + ubuf.time / 100.0);                // 7
    float strengthWithVariation = ubuf.strength * noise(pos.yx);
    pos.x += cos(angle) * strengthWithVariation;
    pos.y += sin(angle) * strengthWithVariation;
    gl_Position = ubuf.qt_Matrix * pos;
}
Below is a step-by-step recipe you can use for your own code:
  1. Add a version string
  2. Add these two lines to fetch the vertex data. Qt fills a vertex buffer with vertex positions at location 0 and texture coordinates at location 1. Instead of using the attribute qualifier as before, we now declare inputs to the vertex shader in modern GLSL using the layout and in qualifiers
  3. Declare the outputs you would like to pass to the fragment shader – these are the varying declarations in your non-ported code. For each output variable, add a line of the form layout(location = n) out type name, incrementing n starting at 0.
  4. Declare the uniforms. Qt passes them directly from QML properties to the shader code. All uniforms except samplers are provided via a uniform buffer. Qt always binds this buffer at binding point 0 and stores the variables qt_Matrix and qt_Opacity at the very beginning. Note that you have to add std140, which determines how the uniform buffer is laid out in memory
  5. Remove all highp and lowp specifiers
  6. Optional: the vertex shader writes its output to the built-in gl_Position variable. If you leave this line out, the graphics engine still knows what to do
  7. Don’t forget to access the uniform variables via the uniform buffer (easy to miss)
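Getting the std140 offsets wrong makes the shader silently read garbage, so the rules are worth internalizing: each member starts at the next multiple of its alignment (a mat4 occupies 64 bytes with 16-byte alignment, a float occupies 4 bytes with 4-byte alignment). A small Python sketch of this rule – plain arithmetic, not a Qt API – applied to the ubuf block above:

```python
def std140_offsets(members):
    # members: list of (name, size_in_bytes, alignment_in_bytes).
    # std140 rule: each member starts at the next multiple of its alignment.
    offsets = {}
    offset = 0
    for name, size, align in members:
        offset = (offset + align - 1) // align * align  # round up to alignment
        offsets[name] = offset
        offset += size
    return offsets


# The uniform block shared by the example vertex and fragment shaders:
ubuf = std140_offsets([
    ("qt_Matrix", 64, 16),        # mat4: four 16-byte columns
    ("qt_Opacity", 4, 4),
    ("strength", 4, 4),
    ("time", 4, 4),
    ("blendingProgress", 4, 4),
])
```

Running this places qt_Matrix at offset 0, qt_Opacity at 64, and the remaining floats tightly packed after it – matching the layout Qt writes the QML property values into. Note that vec3 and struct members have trickier alignment rules than the plain floats used here.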

Porting the fragment shader works in a similar fashion:
#version 440                                                 // 1
layout(location = 0) in vec2 texCoord;                       // 2
layout(location = 0) out vec4 fragColor;                     // 3
layout(std140, binding = 0) uniform buf {                    // 4
    mat4 qt_Matrix;
    float qt_Opacity;
    float strength;
    float time;
    float blendingProgress;
} ubuf;
layout(binding = 1) uniform sampler2D source1;               // 5
layout(binding = 2) uniform sampler2D source2;
void main() {
    float alpha = abs(ubuf.blendingProgress);
    fragColor = ((1. - alpha) * texture(source1, texCoord)   // 6, 7
                 + alpha * texture(source2, texCoord))
                * ubuf.qt_Opacity;                           // 8
}
  1. Add a version string
  2. Declare the variables coming from the vertex shader, i.e. the varying declarations in the non-ported code. You can just copy all variable declarations with out qualifiers from the vertex shader and replace all out keywords with in. If you don’t have a vertex shader, just copy the line as given in the code snippet above – Qt passes the texture coordinate from the default vertex shader. Note that the layout locations need to match (this does not apply to the variable names, which may differ).
  3. Declare the output – a vec4 color. Fragment shader output must be declared explicitly; writing the color to gl_FragColor no longer works.
  4. Add the uniform buffer (the same that we added to the vertex shader)
  5. Declare all samplers. Don’t forget to assign different bindings. Note that the binding point 0 is reserved for the uniform buffer by Qt.
  6. Port gl_FragColor to whatever name you have used for the out variable
  7. Change all occurrences of texture2D to texture
  8. Again, access uniforms via the uniform buffer
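The blend itself is a plain linear crossfade, evaluated per color channel. Here it is as a small Python sketch (an illustration of the formula, not Qt API; colors are RGBA tuples in place of texture samples):

```python
def crossfade(c1, c2, blending_progress, opacity=1.0):
    # Mirrors the fragment shader:
    #   alpha = abs(blendingProgress)
    #   fragColor = ((1 - alpha) * source1 + alpha * source2) * qt_Opacity
    # blendingProgress animates from -1 to 1, so alpha runs 1 -> 0 -> 1:
    # the effect shows source2, fades to source1 at 0, then back to source2.
    alpha = abs(blending_progress)
    return tuple(((1.0 - alpha) * a + alpha * b) * opacity
                 for a, b in zip(c1, c2))
```

Using abs() on the animated value is what makes the single -1..1 animation loop fade back and forth instead of snapping when it wraps around.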

Baking the shaders

If everything has been ported correctly, you are finally able to bake the shader files. There are two ways to achieve this:
  • Manually
  • Let the build system handle the details
Of course, delegating the shader baking to your build system is recommended, but in the early Qt 6 days things can still be a bit flaky. We explain both approaches: manually baking the shaders into qsb files, and using CMake as the build system.


Baking manually

Use the command-line tool qsb as described below:
qsb --glsl 100es,120,150 --hlsl 50 --msl 12 -b -o wiggleblender.vert.qsb wiggleblender.vert
qsb --glsl 100es,120,150 --hlsl 50 --msl 12 -o wiggleblender.frag.qsb wiggleblender.frag
This creates the vertex and fragment shader qsb files suitable for rendering with OpenGL, Vulkan, DirectX and Metal backends. The -b parameter adds some magic so that the scenegraph renderer is able to properly batch the ShaderEffect. For more information on qsb and its parameters, please consult the official documentation. Optionally add the files to the Qt resource system:
<RCC>
    <qresource prefix="/">
        <file>shader/wiggleblender.vert.qsb</file>
        <file>shader/wiggleblender.frag.qsb</file>
    </qresource>
</RCC>

Using CMake

Add the following lines to CMakeLists.txt:

qt6_add_shaders(${PROJECT_NAME} "porting-example-shaders"
    BATCHABLE
    PREFIX
        "/"
    FILES
        "shader/wiggleblender.vert"
        "shader/wiggleblender.frag"
)

Unfortunately, as of today there seems to be a bug when the Makefile generator is used: the qsb command line tool is not called. In the future, qsb should just be another build step like running moc on header files, working silently in the background.


Qt made the transition from Qt 5 to Qt 6 a breeze. What we have learned is that if you port from Qt 5 to Qt 6 and have refined your QML code with a lot of shaders, you need to pay attention to this detail. Our guide gives you the right tools to master this challenge – feel free to drop a comment below if any questions arise while doing so.

Berthold Krevert
For more than 10 years, Berthold Krevert has worked as a senior software developer at basysKom GmbH. He has a strong background in developing user interfaces with technologies like Qt, QtQuick and HTML5 for embedded systems and industrial applications. He holds a diploma in computer science from the University of Paderborn. During his studies he focused, amongst other topics, on medical imaging systems and machine learning. His interests include data visualisation, image processing and modern graphics APIs, and how to utilize them to render complex UIs in 2D and 3D.