Re: OpenGL sample code on Linux
From: Fred Kiefer
Subject: Re: OpenGL sample code on Linux
Date: Tue, 20 Jan 2009 00:22:35 +0100
User-agent: Thunderbird 2.0.0.19 (X11/20081227)
I don't think that this code in context.c is the problem here. Just
before those lines we try to get a 32-bit visual when XRENDER is
defined, which is the case on my machine. It is rather the other way
round: the GL code uses the first visual, and that one has a lower depth.
Thomas Gamper wrote:
> for (i = numvis - 1, best = -1; i >= 0; i--)
>   {
>     if (vinfo[i].depth == 24) best = i;
>     else if (vinfo[i].depth > 24 && best < 0) best = i;
>   }
>
> Try replacing 24 with 32, since an OpenGL pixel format without an alpha
> channel basically makes no sense.
>
> TOM
>
> Fred Kiefer schrieb:
>> I hacked bestContext() in back/Source/x11/context.c to return False,
>> and with that both examples work again. This suggests the problem has
>> to do with the visual, drawable, or depth selection. On my machine
>> glXChooseFBConfig() finds 18 usable configurations, but the code sticks
>> to the first one.
>>
>> Fred
>>
>