9

Specifying the GLSL version gives a syntax error when using LWJGL. I haven't tried to reproduce this issue outside LWJGL. This is happening on multiple Macs running Lion.

I've gotten both vertex and fragment shaders to work without using #version. But I'm about to use the texture function, which seems to require a #version directive.

Here's the simplest failing example:

#version 120

void main() {
  gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

Compiling this fragment shader and calling glGetShaderInfoLog gives this error:

ERROR: 0:1: '' : syntax error #version
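For reference, the compile-and-check step looks roughly like this (a trimmed-down sketch, not my actual loader; the class and method names are just placeholders, and the file-reading part is omitted):

import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL20;

public class ShaderUtil {
    // Compiles a fragment shader and prints the info log if compilation fails.
    public static int compileFragmentShader(String source) {
        int shader = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
        GL20.glShaderSource(shader, source);
        GL20.glCompileShader(shader);

        if (GL20.glGetShaderi(shader, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE) {
            // This is where the "syntax error #version" message above comes from.
            System.err.println(GL20.glGetShaderInfoLog(shader, 1024));
        }
        return shader;
    }
}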

Replacing 120 with anything else, such as 110, also gives an error. Curiously, though, if I use 130 or higher, it gives the same error plus a complaint that the version isn't supported. (I know my system doesn't have GLSL 1.3, but it's still weird that this complaint appears when the compiler is acting like it doesn't understand the version directive at all.)

I'm on a Mac with an ATI Radeon HD 4670. GL_VERSION is 2.1 ATI-7.12.9 and GL_SHADING_LANGUAGE_VERSION is 1.20.

Given that, I don't see any reason why GLSL 1.20 should be unavailable. And it's really weird to me that it's saying #version is a syntax error, as opposed to saying something about an unsupported GLSL version.

2
  • Where is your shader loading code? Commented Dec 17, 2011 at 0:14
  • I can paste that in once I have internet. (Using my phone right now.) Would that source code be helpful? Commented Dec 17, 2011 at 0:16

2 Answers

18

Solved! It had nothing to do with OpenGL. My file reader code was dropping all line breaks. This was fine in the body of the shader, which had semicolons. But the preprocessor directive had no semicolon to protect it from this error.

So for anyone with this problem, make sure the source you're actually passing to glShaderSource still has its line breaks.
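For example, a loader along these lines preserves them, since BufferedReader.readLine() strips the line terminators and they have to be re-appended explicitly (a sketch; the class and method names are just illustrative):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ShaderLoader {
    // Reads a shader file, re-appending the newline that readLine() strips,
    // so the #version directive stays on its own line.
    public static String readShaderSource(String path) throws IOException {
        StringBuilder source = new StringBuilder();
        BufferedReader reader = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                source.append(line).append('\n');
            }
        } finally {
            reader.close();
        }
        return source.toString();
    }
}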


3 Comments

Thanks a lot, you've saved my night! I'm on Mac with SDL and C++, but with this obscure error message OpenGL was trying to tell me the same thing.
A way to approach this may be found here. The BufferedReader pulls the shader source in line by line, discarding CR/LF characters. You then explicitly reappend a newline character to each line using the StringBuilder (see about 2/3 of the way down that page).
Thanks a lot as well. This was the exact issue I was having. Pesky Java IO removing line breaks!
1

Both the vertex and the fragment shader need to have the same version. So if you add #version 120 to the fragment shader, you should add it to the vertex shader as well. It's a bit strange that this is reported as a syntax error, so maybe there's another problem, but both shaders definitely have to carry the same version tag.

EDIT: Also keep in mind that the version directive needs to be the first line in the shader source code (newlines and comments before it should be OK by the specification, but who knows what the drivers think).
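For illustration, the sources might be set up like this (hypothetical strings; the point is just that both carry the same #version directive and that it comes first):

public class ShaderSources {
    // Hypothetical sources: both begin with the same #version directive,
    // and it is the very first line of each string.
    public static final String VERTEX_SOURCE =
          "#version 120\n"
        + "void main() {\n"
        + "  gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
        + "}\n";

    public static final String FRAGMENT_SOURCE =
          "#version 120\n"
        + "void main() {\n"
        + "  gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);\n"
        + "}\n";
}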

3 Comments

I tried using the same version in both, but no luck. Since this happens in compilation and not linking, I don't think the shaders know about each other anyway.
@Jarrett Considering the ominous vendor of your graphics card, it may also be a driver bug. Though this is really a very simple feature and should be no problem to support, but then again, it's still ATI.
I just verified the same issue with a GeForce 320M. So I'm starting to doubt that both vendors have the exact same bug in such a basic feature.
