20 August 2025
The integration of artificial intelligence into software development environments has rapidly evolved, and Qt Creator is no exception. With the introduction of the Qt AI Assistant by Qt Company, developers working with Qt Creator now have access to AI models through the IDE. This post provides an introduction to the Qt Creator plugin.
This is part 1 of an ongoing series about AI coding with Qt.
Qt AI Assistant is a commercial plugin for Qt Creator that brings current AI models to the IDE. Features provided by the plugin include inline code completion, prompt-based chat for explaining and modifying code, generation of test cases, and code review.
This is a step up from the existing GitHub Copilot support in Qt Creator that was focused on code completion only.
Complementing Qt AI Assistant is a publicly available set of models by Qt Group. The models are based on Code Llama and are fine-tuned for use with Qt 6 QML. They are not included with the plugin but need to be set up manually using Ollama.
The setup process for Qt AI Assistant is more involved than some other AI coding tools. The plugin is currently available as part of the commercial Qt distribution. Installation requires enabling the appropriate extension repository within Qt Creator and activating the plugin. Once installed, configuration is necessary to connect the plugin to a large language model (LLM) provider.
Supported LLMs include OpenAI (ChatGPT), Anthropic (Claude), Google (Gemini), and self-hosted models via Ollama. For OpenAI integration, developers must use the OpenAI developer platform to generate an API key, which is different from having an account for ChatGPT. This API key is then entered into the plugin’s settings within Qt Creator. Other models require similar setup using URLs and credentials, depending on the provider or the self-hosting method.
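For the self-hosted route, the basic Ollama workflow looks roughly like the following sketch. It assumes Ollama is already installed; `codellama:13b` is the generic base model tag, not Qt's fine-tuned QML variant, so substitute the tag that Qt Group publishes for its models.

```shell
# Fetch the model weights (generic Code Llama 13B shown; swap in
# Qt Group's fine-tuned QML model tag where applicable).
ollama pull codellama:13b

# Quick smoke test; this also starts the local server if it is not
# already running.
ollama run codellama:13b "Write a short QML Rectangle example."

# The plugin is then pointed at Ollama's local HTTP endpoint,
# which defaults to:
#   http://localhost:11434
```

Because the model runs entirely on the local machine, no API key is needed for this option, which can matter for teams with strict privacy requirements.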
More information can be found in the videos linked at the bottom of this post.
The plugin distinguishes between code completion suggestions as you type and prompt-based interactions, such as asking for code explanations or generating new code. For QML, a specialized Code Llama 13B QML model can be used; for other languages, general-purpose models are employed.
The chat interface allows developers to highlight code and request explanations or modifications. For example, selecting a block of QML or C++ and asking the assistant to "explain the selected code" yields a detailed, context-aware explanation.
A notable feature is the ability to generate test cases from selected QML code. While the generated tests may require manual refinement, this automation can accelerate the initial setup of unit tests and reduce repetitive work. The plugin’s approach is to copy relevant code into the test, which may not always result in optimal reuse, but provides a useful starting point.
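As an illustration of the copy-based approach described above, a generated test might look something like the following Qt Quick Test sketch. The component and property names here are hypothetical, invented for the example; note how the code under test is duplicated inline rather than imported from the original file.

```qml
// Illustrative sketch only -- not actual assistant output.
import QtQuick
import QtTest

TestCase {
    name: "CounterTests"

    // The assistant typically copies the relevant component into the
    // test rather than reusing the original source file.
    Item {
        id: counter
        property int value: 0
        function increment() { value += 1 }
    }

    function test_increment() {
        counter.increment()
        compare(counter.value, 1)
    }
}
```

Refining such output usually means replacing the inlined copy with an import of the real component, so the test keeps tracking the production code.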
Developers can choose different LLMs for the chat and review scenarios than for code completion. For QML, the model choice is separate and includes the fine-tuned models provided by Qt Company. This flexibility extends to hosting options, supporting both cloud and local deployments, depending on organizational needs and privacy considerations.
For a detailed walkthrough and live demonstrations, watch the following episodes of "The Curious Developer" series:
Additionally, the official Qt AI Assistant product page provides up-to-date information on features and availability: https://www.qt.io/product/ai-assistant.
Future posts in this series will look at alternative AI coding tools useful for Qt and will cover the latest developments of the tools mentioned here.