iOS video rendering

The multimedia layer in Qt offers various possibilities for including streaming video in your QtQuick applications. Most commonly there's the Video element, which you include in your QML interface, specifying the source URL, playback options and so on.
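
As a sketch of what that looks like (the source URL and overlay content here are made up for illustration):

    import QtQuick 2.0
    import QtMultimedia 5.0

    Video {
        id: player
        anchors.fill: parent
        autoPlay: true
        source: "http://example.com/clip.mp4"  // hypothetical stream URL

        // Child items declared here are meant to paint on top of the video.
        Text {
            anchors.centerIn: parent
            color: "white"
            text: "Overlay UI goes here"
        }
    }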

Unfortunately, on iOS you'll discover a limitation: the QtMultimedia backend only supports window-level integration. In practice this means that any QtQuick items that are supposed to be positioned on top of the video in fact appear behind it. At KDAB we have several clients who want to show customised playback interfaces on top of the video stream, with application-specific data, and of course to take advantage of all the visual power and flexibility of QtQuick.

Since Qt is based around the idea of people contributing and collaborating, we decided to investigate the work involved in fixing this limitation, and we're pleased to say that we found a high-performance solution that will be included in Qt 5.5. The most complex piece of the problem, getting hardware-decoded video frames into an OpenGL texture on the GPU, is handled for us by a family of CoreVideo APIs and an object called a CVOpenGLESTextureCache. This interfaces with the iOS DRM and hardware-accelerated video decoding layers to give us video frames in a format we can use. Even better, the layer takes care of delivering frames with colour-space conversion from YUV (the standard for video) into the RGB space we need for QtQuick rendering.
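
To give a flavour of the CoreVideo side, here is a minimal Objective-C++ sketch (not the actual Qt Multimedia patch; the helper names are ours, and we assume the decoder delivers BGRA pixel buffers) of creating a texture cache and mapping one decoded frame to an OpenGL ES texture:

    #import <OpenGLES/ES2/gl.h>
    #import <OpenGLES/ES2/glext.h>   // for GL_BGRA_EXT
    #import <CoreVideo/CVOpenGLESTextureCache.h>

    // Create the cache once, bound to the EAGL context QtQuick renders with.
    static CVOpenGLESTextureCacheRef createTextureCache(CVEAGLContext eaglContext)
    {
        CVOpenGLESTextureCacheRef cache = nullptr;
        CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault,
                                                    nullptr,  // cache attributes
                                                    eaglContext,
                                                    nullptr,  // texture attributes
                                                    &cache);
        return (err == kCVReturnSuccess) ? cache : nullptr;
    }

    // Map one decoded frame (assumed kCVPixelFormatType_32BGRA) to a GL texture.
    // The returned CVOpenGLESTextureRef owns the texture: CFRelease() it only
    // once the scene graph has finished drawing with it.
    static CVOpenGLESTextureRef textureForFrame(CVOpenGLESTextureCacheRef cache,
                                                CVPixelBufferRef frame)
    {
        CVOpenGLESTextureRef texture = nullptr;
        CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, frame,
            nullptr,              // texture attributes
            GL_TEXTURE_2D, GL_RGBA,
            (GLsizei)CVPixelBufferGetWidth(frame),
            (GLsizei)CVPixelBufferGetHeight(frame),
            GL_BGRA_EXT, GL_UNSIGNED_BYTE,
            0,                    // plane index (single-plane BGRA)
            &texture);
        if (err != kCVReturnSuccess)
            return nullptr;

        // These are what the scene-graph material ultimately binds:
        glBindTexture(CVOpenGLESTextureGetTarget(texture),
                      CVOpenGLESTextureGetName(texture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        return texture;
    }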

Here’s the result: Qt Quick, Controls and graphical effects on top of video, on iOS.

Of course, interfacing the CoreVideo classes with the Qt Multimedia code required some experimentation, especially to manage the lifetime of the video frames correctly. Since each frame consumes GPU texture memory, it’s important to know when frames can safely be discarded. Fortunately, the existing window-based implementation of video on iOS already provides the OpenGL context needed to initialise the texture cache.
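
As an illustration of that lifetime handling (a hypothetical helper, not KDAB's actual code): the texture backing the frame currently on screen must stay alive until the next frame replaces it, after which it can be released and the cache told to recycle its buffers:

    // Keep the texture for the frame currently on screen; release the previous
    // one only after the swap, then let CoreVideo recycle its buffers.
    static CVOpenGLESTextureRef g_onScreenTexture = nullptr;

    static void presentFrame(CVOpenGLESTextureCacheRef cache,
                             CVOpenGLESTextureRef newTexture)
    {
        CVOpenGLESTextureRef previous = g_onScreenTexture;
        g_onScreenTexture = newTexture;    // the scene graph now samples this one

        if (previous) {
            CFRelease(previous);                   // frame no longer displayed
            CVOpenGLESTextureCacheFlush(cache, 0); // allow buffer reuse
        }
    }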

In the end we’re delighted with the result. In particular, there are no intermediate copies of the video frame data between what CoreVideo produces and what is passed to the scene graph for display, so this solution should be suitable for HD video without excessive power consumption.



3 thoughts on “iOS video rendering”

  1. That’s awesome stuff!

    Did you also find a way to render the contents of a UIView into the QtQuick scene, such as the native WebView, which is currently also just an overlay?

    1. James Turner

      Rendering a UIView into a QtQuick scene would be a separate problem, I think. Ultimately it means capturing the CALayer(s) backing the view into a generic OpenGL texture. That’s what the CARenderer code does, but it only exists on OS X, not iOS.

      Also, rendering the layers is probably solvable but interacting with them is going to be very complex in terms of keyboard focus, accessibility, scrolling and so on. So simply presenting UIViews separately to QtQuick is likely to be much more robust (they handle events natively). Is there a specific scenario that needs rendering UIView content inside the scene? Ignoring the HTML / web content case, which I think Qt WebEngine will solve.

      1. Hello,

        I have a particular use case that illustrates why it would be useful.
        I develop an iOS/Android app (the Squareboard app) and we use gestures to navigate through the pages. A single-finger swipe from the left edge makes the app slide back to the previous hierarchical page.
        Using the WebView in the Article page makes it impossible to capture the gesture, as the native UIWebView is overlaid on top of the QML.
