Loading images into OpenGL on the iPhone

So you can’t make a game without images, right? Well, actually you can, but that’s another story.

But how can you load a JPG or a PNG and use them with OpenGL ES?

First, let’s make a simple Texture2D class:

class Texture2D
{
public:
    Texture2D(int id, int width, int height)
        : _textureId(id), _width(width), _height(height) {}

    virtual ~Texture2D()
    {
        // Delete the texture from OpenGL memory
        glDeleteTextures(1, (GLuint*)&_textureId);
    }

    int getTextureId() { return _textureId; }
    int getWidth()     { return _width; }
    int getHeight()    { return _height; }

private:
    int _textureId;   // The reference ID of the texture in OpenGL memory
    int _width;
    int _height;
};

Now for the code that actually loads the image:

Texture2D* IPhoneOperativeSystem::LoadImage(const std::string& imagefile)
{
    // Name for the texture
    GLuint texture;
    // Generate a texture name
    glGenTextures(1, &texture);
    // Bind it
    glBindTexture(GL_TEXTURE_2D, texture);
    // Set a few parameters to handle drawing the image at sizes lower and higher than the original
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Load the file through UIKit
    NSString *path = [[NSString alloc] initWithUTF8String:imagefile.c_str()];
    NSData *texData = [[NSData alloc] initWithContentsOfFile:path];
    UIImage *image = [[UIImage alloc] initWithData:texData];
    if (image == nil)
    {
        [texData release];
        [path release];
        return NULL;
    }

    // Get the image size
    GLuint width = CGImageGetWidth(image.CGImage);
    GLuint height = CGImageGetHeight(image.CGImage);

    // Allocate memory for the image and draw it into an RGBA bitmap context
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    void *imageData = malloc(height * width * 4);
    CGContextRef imgcontext = CGBitmapContextCreate(imageData, width, height, 8, 4 * width,
        colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextClearRect(imgcontext, CGRectMake(0, 0, width, height));
    // Flip the context vertically so the texture does not end up upside down
    CGContextTranslateCTM(imgcontext, 0, height);
    CGContextScaleCTM(imgcontext, 1.0, -1.0);
    CGContextDrawImage(imgcontext, CGRectMake(0, 0, width, height), image.CGImage);

    // Generate the texture in OpenGL
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);

    // Release the context and free everything
    CGContextRelease(imgcontext);
    free(imageData);
    [image release];
    [texData release];
    [path release];

    // Create and return the texture
    return new Texture2D(texture, width, height);
}

So right now you can do something like

   Texture2D* tree_image = IPhoneOperativeSystem::LoadImage("tree_image.jpg");

For drawing, before sending your primitives, just do something like:

glBindTexture(GL_TEXTURE_2D, tree_image->getTextureId());

This code should work on all iOS devices. Just keep in mind that OpenGL ES 1.1 only supports power-of-two texture sizes, so use images with dimensions like 64, 128 or 256.


  • May I suggest a small correction: you need to set the buffer to 0 after the malloc:

    memset(imageData, 0, height * width * 4);

  • I would not malloc+memset. If you really need to zero out memory (in this case, as you say, you don’t!) use calloc. This is from the Memory Usage Performance Guide from Apple:

    When you call memset right after malloc, the virtual memory system must map the corresponding pages into memory in order to zero-initialize them. This operation can be very expensive and wasteful, especially if you do not use the pages right away.

    The calloc routine reserves the required virtual address space for the memory but waits until the memory is actually used before initializing it. This approach alleviates the need to map the pages into memory right away. It also lets the system initialize pages as they’re used, as opposed to all at once.
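    In code, the commenter’s suggestion could look like this (a minimal sketch; `alloc_image_buffer` is a hypothetical helper name, not from the original post):

```c
#include <stdlib.h>

/* Allocate a zero-initialized RGBA buffer for a width x height image.
   calloc reserves the pages but lets the kernel zero them lazily on
   first touch, unlike malloc + memset, which dirties every page now. */
static unsigned char *alloc_image_buffer(size_t width, size_t height)
{
    return calloc(width * height, 4); /* 4 bytes per RGBA pixel */
}
```

    The returned buffer can be passed to CGBitmapContextCreate exactly as the malloc’d one was, and still needs a matching free().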

  • There is a standard Objective-C class published by Apple, called Texture2D, that does the exact same task (and more). Maybe also have a look at that one. It should be easy to port it from Objective-C to C++.


  • Right timing indeed… But I’m having an issue with this…

    I’ve added a .png file (test.png) to the resources and made sure it gets copied to the output (it’s under Copy Bundle Resources in Xcode)…

    Then I debug and when I reach the

    NSData *texData = [[NSData alloc] initWithContentsOfFile:path];

    path has the correct value (“test.png”), but the texData is empty after it runs this line… I went into the “build\Debug-iphoneos\unidir_vod_test.app” directory, and the “test.png” file is there (so I’m guessing it reaches the iPad)…

    I’m wondering if I’m missing any permissions, or if the filepath is not correct (maybe something gets introduced by the system when it builds the local filesystem)…

    Tried a web search, but it’s like looking for a needle in a haystack… :\

    Any ideas? 🙂

  • It helped… took me a while to understand that “resourcePath” on the first line was the method of the class and not a variable… 🙂

    It’s working now (I think, haven’t done anything with the texture yet), so thanks a million…

    Some additional questions:

    1) Do I have to release the NSString *path I got from NSBundle (or is it just a reference)?
    2) Why use “stringByAppendingPathComponent” instead of just appending “/Game_Resources” to the resulting std::string?
    3) How do I tell my resources to go to a specific directory, instead of the root directory of my application?

    Again, thanks for the help, this is confusing for an “old-school” PC/C++ programmer… 🙂 and don’t get me started with getting the certificates up and running! 😀

  • Did you try to compile this?

    Texture2D(int texture id, int width, int height)

    is not valid…

  • Hi David,

    Hope you are doing well.

    I am an iOS developer. I am facing a problem converting a UIImage into a texture and then rendering that texture into a UIView.

    I tried this with the GLImageProcessing example provided by Apple, but there is a bug in it: when the UIImage is converted into a texture, the texture comes out somewhat blurred, so the original image is not converted properly.

    Please help me solve this problem.
    I would be very grateful to you.


  • I want to perform brightness, contrast and sharpness adjustments on a UIImage, but that is not possible without converting it into a texture, because the texture is built from the UIImage’s pixels, and this kind of processing works very fast on pixels, as shown in the GLImageProcessing example.

    Apple also provides the CIFilter class for this, but it is slower and does not work continuously; the user has to wait for it.
