More GLSL...
Does ATI even test their drivers before they release them? :P I've already found two bugs in their GLSL implementation. One is minor: the #version preprocessor directive isn't supported. Supporting #version would be trivial -- a high school intern could add it. The other is more serious: glLinkProgramARB was simply crashing once my program got too long. I must have started exceeding the instruction limits or something. :o
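For the record, the rejected directive is about as simple as preprocessor tokens get -- per the GLSL spec it just declares the language version at the top of the shader. A minimal fragment shader that trips the bug might look like this (the uniform name here is made up for illustration):

```glsl
// Per the GLSL spec, #version belongs before any non-comment code.
// ATI's compiler chokes on this line; deleting it is the workaround.
#version 110

uniform vec4 color; // hypothetical uniform, just to make this complete

void main()
{
    gl_FragColor = color;
}
```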
It's also way too easy to go into software fallback mode. The optimizer kind of sucks -- the following code was throwing me into software:
float getOffset(int i)
{
    float f = vector[i];
    ...
}

void main()
{
    float offset = getOffset(0) + getOffset(1) + getOffset(2) + getOffset(3);
    ...
}
Whereas a simple change took me back into hardware:
float getOffset(float f)
{
    ...
}

void main()
{
    float offset = getOffset(vector[0]) + getOffset(vector[1])
                 + getOffset(vector[2]) + getOffset(vector[3]);
    ...
}
Even *I* could write an optimizer better than that. (And I plan to, by the end of next year. More later.) It's not like the GPU is complicated: there's a tiny instruction set, no branching (well, limited branching now), and no stack or function call logic. Linear execution with tons of registers... a compiler's dream!
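To show just how trivial the missing optimization is: all the driver had to do was inline a call whose argument is a constant. Here's a toy sketch in Python (purely illustrative text substitution, nothing to do with how a real shader compiler works internally):

```python
import re

def inline_get_offset(source):
    """Toy inliner: rewrite getOffset(<int literal>) as vector[<literal>],
    i.e. substitute the constant argument straight into the body expression.
    Real compilers do this on an IR, not on source text."""
    return re.sub(r"getOffset\((\d+)\)", r"vector[\1]", source)

before = "float offset = getOffset(0) + getOffset(1);"
print(inline_get_offset(before))
# -> float offset = vector[0] + vector[1];
```

After this rewrite the two versions of the shader above are identical, which is exactly why it's absurd that one runs in hardware and the other doesn't.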
Meh! I'm standardizing on GLSL nonetheless. Hopefully things start to look up. :)
Yes, I hate ATI's drivers. HATEHATEHATE
chad, you never cease to amaze me. I sometimes regret not completely and utterly embracing the programmer lifestyle. grin