Re: [Qemu-devel] PATCH: fix bgr color mapping on qemu on Solaris/SPARC


From: Dan Sandberg
Subject: Re: [Qemu-devel] PATCH: fix bgr color mapping on qemu on Solaris/SPARC
Date: Fri, 12 May 2006 18:40:43 +0200
User-agent: Mozilla Thunderbird 1.0.7 (Windows/20050923)

Jamie Lokier wrote:

Dan Sandberg wrote:
When the screen is "painted" the DACs read from the host video buffer (1600x1200) and interpret it as RGB. Somewhere they "hit" the left boundary of the separate viewport that you have set up and bang, on the fly they switch to reading 800x600-organized data from the other video buffer and interpreting it as BGR. Later on the same video line they "hit" the right boundary of the separate viewport and bang, they switch back to reading from the main buffer and interpreting it as RGB.

As a result the 1600x1200 RGB buffer and the 800x600 BGR buffer are equally active and equally often updated on the same physical screen - without any need to move data around, and without any time-consuming activity at all, as all switches are done on the fly in the background by special hardware (if the board supports this).

It is like having two separate physical video boards, each with its own display buffer.

Thanks; I didn't know OpenGL had that function as well as 3d rendering.

That's what the Xv extension does ("X video") - it's to provide an
overlay to be used by video players.  Xv scales the source image and
mixes it with the primary framebuffer in the way you describe.
However, Xv is intended for non-RGB colourspace source formats, so may
not be suitable for Qemu.  I don't know if Xv sometimes can support RGB.

Since Xv is supported by many video cards, even old ones without 3d,
or without working 3d drivers, I'm surprised that particular OpenGL
function isn't commonly implemented with equal performance.

-- Jamie
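
As an aside on the open question of whether Xv can handle RGB at all: below is a minimal sketch, not taken from qemu and with error handling omitted, of how one could ask the X server which image formats its Xv ports advertise and whether any of them are RGB rather than YUV. The Xlib/libXv calls themselves are real; build with -lX11 -lXv.

/* Sketch: list the RGB image formats (if any) advertised by Xv ports. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    unsigned int nadaptors, i;
    XvAdaptorInfo *adaptors;

    XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &nadaptors, &adaptors);

    for (i = 0; i < nadaptors; i++) {
        int nformats, j;
        XvImageFormatValues *fmts =
            XvListImageFormats(dpy, adaptors[i].base_id, &nformats);

        for (j = 0; j < nformats; j++) {
            if (fmts[j].type == XvRGB)
                printf("adaptor \"%s\": RGB image format 0x%x, %d bpp\n",
                       adaptors[i].name, fmts[j].id, fmts[j].bits_per_pixel);
        }
        XFree(fmts);
    }
    XvFreeAdaptorInfo(adaptors);
    XCloseDisplay(dpy);
    return 0;
}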



Oooops,
I just took a look at a list of OpenGL pixel formats. I really thought I had seen BGR there, but I was wrong. Again, I am new to OpenGL, so don't take anything I say as the truth. Here is the list:

GL_ALPHA, GL_ALPHA4, GL_ALPHA8, GL_ALPHA12, GL_ALPHA16,
GL_LUMINANCE, GL_LUMINANCE4, GL_LUMINANCE8, GL_LUMINANCE12, GL_LUMINANCE16,
GL_LUMINANCE_ALPHA, GL_LUMINANCE4_ALPHA4, GL_LUMINANCE6_ALPHA2,
GL_LUMINANCE8_ALPHA8, GL_LUMINANCE12_ALPHA4, GL_LUMINANCE12_ALPHA12,
GL_LUMINANCE16_ALPHA16,
GL_INTENSITY, GL_INTENSITY4, GL_INTENSITY8, GL_INTENSITY12, GL_INTENSITY16,
GL_R3_G3_B2, GL_RGB, GL_RGB4, GL_RGB5, GL_RGB8, GL_RGB10, GL_RGB12, GL_RGB16,
GL_RGBA, GL_RGBA2, GL_RGBA4, GL_RGB5_A1, GL_RGBA8, GL_RGB10_A2, GL_RGBA12, or GL_RGBA16
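
(For context: that list is the set of *internal* formats accepted by glTexImage2D; how the pixels you actually hand to it are laid out in memory is described separately by its format and type arguments. Below is a minimal sketch, not qemu's actual display code and with a made-up function name, of how an emulated RGB framebuffer could be handed to OpenGL as a texture using one of the internal formats above.)

#include <GL/gl.h>

/* Sketch: upload an emulated framebuffer of packed 8-bit R,G,B pixels
 * as an OpenGL texture.  Assumes a GL context is already current and
 * that fb points to width*height*3 bytes. */
GLuint upload_guest_fb(const void *fb, int width, int height)
{
    GLuint tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    /* Rows of the source buffer are tightly packed. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    /* GL_RGB8 is the internal format (one of the values listed above);
     * the GL_RGB / GL_UNSIGNED_BYTE pair describes how the client-side
     * pixels in fb are laid out. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, fb);
    return tex;
}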

Anyway, many people think of OpenGL as just 3D, but it is extremely competent at 2D as well (given a good driver). If you want an example of OpenGL's superior 2D performance compared to the Windows GDI routines, go to
http://www.skinhat.com/lazarus/
and download Lazarus from there with GLScene preinstalled.
(You probably also need to download and install FreePascal from http://www.lazarus.freepascal.org/.) (If you do not have OpenGL drivers installed, you will have to get them from your graphics card manufacturer's homepage.)

There are a large number of OpenGL examples included, complete with source code, ready to compile and test. Open the example project glscene/demos/bench/canvas/canvas.lpr, then compile and run it by hitting the green arrow.

Windows GDI performance looks really bad in comparison. On my computer the 20,000-ellipse test with a line width of 2 took 2268 ms with standard Windows GDI and 145 ms with the OpenGL 2D canvas (and it's just a standard business computer with no fancy graphics card at all).
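
To give a feel for what that canvas test boils down to on the OpenGL side, here is a rough C sketch (the GLscene demo itself is Object Pascal; the function names here are made up for illustration) that draws 20,000 ellipses as line loops with a line width of 2, assuming a GL context and a 2D orthographic projection are already set up:

#include <math.h>
#include <GL/gl.h>

/* Sketch, not GLscene code: draw one ellipse as a 64-segment line loop
 * around center (cx, cy). */
static void draw_ellipse(float cx, float cy, float rx, float ry)
{
    const float two_pi = 6.2831853f;
    int i;

    glBegin(GL_LINE_LOOP);
    for (i = 0; i < 64; i++) {
        float a = two_pi * (float)i / 64.0f;
        glVertex2f(cx + rx * cosf(a), cy + ry * sinf(a));
    }
    glEnd();
}

/* Mimic the demo's workload: 20,000 ellipses with a line width of 2. */
static void draw_many_ellipses(void)
{
    int i;

    glLineWidth(2.0f);
    for (i = 0; i < 20000; i++)
        draw_ellipse(400.0f, 300.0f,
                     20.0f + (float)(i % 200), 10.0f + (float)(i % 120));
}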

There are many other nice examples included as well, so it is well worth the download.

Regards
Dan




