OpenGL 4.1 on OS X with Xcode 6 setup
I'm trying to learn OpenGL and quickly found out that OpenGL versions > 3.2 are the more important ones to learn.
So I set up my Mac (OS X 10.10.3)
with Xcode and the command line tools to build some examples.
But it has turned out to be a pain in the butt.
The error I am getting:
Undefined symbols for architecture x86_64:
"LoadShaders(char const*, char const*)", referenced from:
_main in main.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
I have the latest versions of GLFW, GLEW, and CMake
to keep everything in order.
I added my header and library search paths to the project.
Library search paths:
/opt/X11/lib
/usr/lib/
/Users/mac/Dev/workspace/C++/Test/glfw/build/src/Debug
/Users/mac/Dev/workspace/C++/Test/glew/lib
(I added the last two directories just to be doubly sure.)
The header search paths are the same, except with include instead of lib.
My linker flags are huge (collected from random suggestions):
-framework OpenGl -lGLUT -lglew -lglfw -lGL -lGLU -lXmu -lXi -lXext -lX11 -lXt
and I added the libraries under Link Binary With Libraries. Where am I going wrong? My GPU is an Intel HD 4000, which seems to support up to OpenGL 4.1.
Do I need to add compiler flags? Or is it simply not doable?
Here is the tutorial code I'm trying to run:
#include <stdio.h>
#include <stdlib.h>
//GLEW
#define GLEW_STATIC
#include <GL/glew.h>
#include <GLUT/glut.h>
//GLFW
#include <GLFW/glfw3.h>
#include <glm/glm.hpp>
GLFWwindow* window;
//no real explanation of what this is.....
#include <common/shader.hpp>
//MAIN FUNCTION
int main(int argc, char *argv[])
{
    if( !glfwInit() )
    {
        fprintf( stderr, "Failed to initialize GLFW\n" );
        return -1;
    }

    // specify GL information
    glfwWindowHint(GLFW_SAMPLES, 4);                // 4x antialiasing
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);  // We want 4.1 >= OpenGL_version >= 3.3
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);            // To make MacOS happy; should not be needed
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);  // We don't want the old OpenGL

    // Open a window and create its OpenGL context
    window = glfwCreateWindow( 600, 375, "Tutorial 02", NULL, NULL);
    if( window == NULL ){
        fprintf( stderr, "Failed to open GLFW window. If you have an Intel GPU, they are not 3.3 compatible. Try the 2.1 version of the tutorials.\n" );
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    // Initialize GLEW
    glewExperimental = true; // Needed in core profile
    if (glewInit() != GLEW_OK) {
        fprintf(stderr, "Failed to initialize GLEW\n");
        return -1;
    }

    // Ensure we can capture the escape key being pressed below
    glfwSetInputMode(window, GLFW_STICKY_KEYS, GL_TRUE);

    // Dark blue background
    glClearColor(0.0f, 0.0f, 0.4f, 0.0f);

    // Create Vertex Array Object.
    GLuint VertexArrayID;
    glGenVertexArrays(1, &VertexArrayID);
    glBindVertexArray(VertexArrayID);

    // Create and compile our GLSL program from the shaders
    GLuint programID = LoadShaders( "SimpleVertexShader.vertexshader", "SimpleFragmentShader.fragmentshader" );

    // An array of 3 vectors which represents 3 vertices
    static const GLfloat g_vertex_buffer_data[] = {
        -1.0f, -1.0f, 0.0f,
         1.0f, -1.0f, 0.0f,
         0.0f,  1.0f, 0.0f,
    };

    GLuint vertexbuffer;
    glGenBuffers(1, &vertexbuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);

    do{
        // Clear the screen
        glClear( GL_COLOR_BUFFER_BIT );

        // Use our Shader
        glUseProgram(programID);

        // 1st attribute buffer : vertices
        glEnableVertexAttribArray(0);
        glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
        // glVertexAttribPointer(
        //     attribute,
        //     size,
        //     type,
        //     normalized,
        //     stride,
        //     array buffer offset
        // );
        glVertexAttribPointer( 0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);

        // Draw the triangle
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glDisableVertexAttribArray(0);

        // Swap buffers
        glfwSwapBuffers(window);
        glfwPollEvents();

    } // Check if the ESC key was pressed or the window was closed
    while( glfwGetKey(window, GLFW_KEY_ESCAPE ) != GLFW_PRESS &&
           glfwWindowShouldClose(window) == 0 );

    // Cleanup VBO
    glDeleteBuffers(1, &vertexbuffer);
    glDeleteVertexArrays(1, &VertexArrayID);
    glDeleteProgram(programID);

    // Close OpenGL window and terminate GLFW
    glfwTerminate();
    return 0;
}
I mainly followed this tutorial here by Oscar Chavez (the setup instructions).
The function LoadShaders() is not part of OpenGL, and it is not part of any of the libraries you are using (I see GLEW and GLFW). In short, LoadShaders() is missing.
My guess is that LoadShaders() is a function the author of the tutorial wrote himself, but the formatting of the tutorial is a little frustrating (to say the least!), so I'm not sure.
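If you just want to get unblocked, here is a minimal sketch of what such a LoadShaders() helper typically does: read the two files, compile them as a vertex and a fragment shader, and link them into a program object. To be clear, this is not the tutorial author's code, just an illustration of the usual pattern, with error reporting kept to a minimum.

#include <cstdio>
#include <fstream>
#include <sstream>
#include <string>
#include <vector>
#include <GL/glew.h>

// Minimal sketch of a LoadShaders() helper (not the tutorial author's code).
GLuint LoadShaders(const char* vertex_file_path, const char* fragment_file_path)
{
    // Read a whole file into a string.
    auto readFile = [](const char* path) -> std::string {
        std::ifstream in(path);
        std::stringstream ss;
        ss << in.rdbuf();
        return ss.str();
    };

    // Compile one shader stage and print the info log if compilation fails.
    auto compile = [](GLenum type, const std::string& source) -> GLuint {
        GLuint shader = glCreateShader(type);
        const char* src = source.c_str();
        glShaderSource(shader, 1, &src, NULL);
        glCompileShader(shader);
        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (ok != GL_TRUE) {
            GLint len = 0;
            glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
            std::vector<char> log(len > 1 ? len : 1);
            glGetShaderInfoLog(shader, (GLsizei)log.size(), NULL, log.data());
            fprintf(stderr, "Shader compile error: %s\n", log.data());
        }
        return shader;
    };

    std::string vertexSource   = readFile(vertex_file_path);
    std::string fragmentSource = readFile(fragment_file_path);

    GLuint vs = compile(GL_VERTEX_SHADER,   vertexSource);
    GLuint fs = compile(GL_FRAGMENT_SHADER, fragmentSource);

    // Link both stages into a program object; this is the ID the caller keeps.
    GLuint program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);

    // The individual shader objects are no longer needed after linking.
    glDetachShader(program, vs);
    glDetachShader(program, fs);
    glDeleteShader(vs);
    glDeleteShader(fs);

    return program;
}

Whichever version you use, the definition has to be compiled and linked into the executable, otherwise the "Undefined symbols" error stays.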
These lines:
//no real explanation of what this is.....
#include <common/shader.hpp>
suggest the author of the tutorial wrote some code himself and just dropped it into the project sources without saying anything more about it. The most annoying part is the use of angle brackets (<…>) instead of quotes ("…"), because it makes it look as if the header were part of the system libraries rather than something local to the project. Somewhere in the tutorial's source tree there is probably a common/shader.cpp, and it probably contains a function LoadShaders(). That function is something completely ordinary, not part of OpenGL or any of the helper libraries, which is exactly what your linker is telling you.
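So the fix is usually twofold: include the header as something local to the project, and make sure the file that defines LoadShaders() (probably common/shader.cpp) is actually compiled into the target, e.g. by adding it to the target's Compile Sources in Xcode. The file names below are assumptions based on the tutorial's layout:

// common/shader.hpp (sketch) - the declaration the tutorial code relies on.
#ifndef SHADER_HPP
#define SHADER_HPP
#include <GL/glew.h>
GLuint LoadShaders(const char* vertex_file_path, const char* fragment_file_path);
#endif

// In main.cpp: quotes (or a header search path pointing at the tutorial's
// root directory) make it clear the header belongs to the project.
#include "common/shader.hpp"

No extra linker flags are needed for this particular error once the .cpp is part of the target; the -l flags and frameworks only matter for GLFW and GLEW themselves.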