So you can't make a game without images, right? Well, actually you can, but that's another story.
But how can you load a JPG or a PNG and use them in OpenGL ES?
First, let's make a simple Texture2D class:
class Texture2D
{
public:
    Texture2D(int id, int width, int height)
    {
        _textureId = id;
        _width     = width;
        _height    = height;
    }

    virtual ~Texture2D(void)
    {
        // Delete texture from GL memory
        glDeleteTextures(1, ((GLuint*)&_textureId));
    }

    int getTextureId() { return _textureId; }

protected:
    int _textureId; // The reference ID of the texture in OpenGL memory
    int _width;
    int _height;
};
Now for the code that actually loads the image:
Texture2D* IPhoneOperativeSystem::LoadImage(std::string imagefile)
{
    // Id for texture
    GLuint texture;

    // Generate the texture and bind it
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);

    // Set a few parameters to handle drawing the image at sizes lower and higher than the original
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);

    // Load the file data and build a UIImage from it
    NSString *path  = [[NSString alloc] initWithUTF8String:imagefile.c_str()];
    NSData *texData = [[NSData alloc] initWithContentsOfFile:path];
    UIImage *image  = [[UIImage alloc] initWithData:texData];
    if (image == nil)
        return NULL;

    // Get image size
    GLuint width  = CGImageGetWidth(image.CGImage);
    GLuint height = CGImageGetHeight(image.CGImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Allocate memory for the image and draw it into a bitmap context
    void *imageData = malloc(height * width * 4);
    CGContextRef imgcontext = CGBitmapContextCreate(imageData, width, height, 8, 4 * width, colorSpace,
                                                    kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextClearRect(imgcontext, CGRectMake(0, 0, width, height));
    CGContextTranslateCTM(imgcontext, 0, height - height);
    CGContextDrawImage(imgcontext, CGRectMake(0, 0, width, height), image.CGImage);

    // Generate texture in OpenGL
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);

    // Release context and free everything
    CGContextRelease(imgcontext);
    free(imageData);
    [image release];
    [texData release];
    [path release];

    // Create and return texture
    Texture2D* tex = new Texture2D(texture, width, height);
    return tex;
}
So right now you can do something like:
Texture2D* tree_image = IPhoneOperativeSystem::LoadImage("tree_image.jpg");
For drawing, before sending your primitives just do something like:
glBindTexture(GL_TEXTURE_2D, tree_image->getTextureId());
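For instance, a minimal fixed-function OpenGL ES 1.1 sketch of drawing a textured quad could look like the following (the vertex and texture-coordinate values are purely illustrative, not from the article):

// Illustrative sketch: draw a textured quad with OpenGL ES 1.1 client-state arrays
GLfloat vertices[]  = { -1.0f, -1.0f,   1.0f, -1.0f,   -1.0f, 1.0f,   1.0f, 1.0f };
GLfloat texcoords[] = {  0.0f,  1.0f,   1.0f,  1.0f,    0.0f, 0.0f,   1.0f, 0.0f };

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tree_image->getTextureId());

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);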
This loading code should work on all iOS devices.
May I suggest a small correction:
You need to set the buffer to 0 after the malloc:
memset(imageData,0,(height * width * 4 ));
Thanks! Although it's not necessary here, since imageData is going to be fully written, it's always good practice to memset it =)
I would not malloc+memset. If you really need to zero out memory (in this case, as you say, you don't!), use calloc. This is from Apple's Memory Usage Performance Guide:
When you call memset right after malloc, the virtual memory system must map the corresponding pages into memory in order to zero-initialize them. This operation can be very expensive and wasteful, especially if you do not use the pages right away.
The calloc routine reserves the required virtual address space for the memory but waits until the memory is actually used before initializing it. This approach alleviates the need to map the pages into memory right away. It also lets the system initialize pages as they're used, as opposed to all at once.
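Applied to the loader above, that would be a one-line change (just a sketch, not part of the original post):

// calloc reserves zeroed pages lazily, so there is no upfront memset cost
void *imageData = calloc(height * width, 4);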
There is a standard Objective-C class published by Apple, also called Texture2D, that does the exact same task (and more). Maybe also have a look at that one. It should be easy to port from Obj-C to C++.
cheers
Thanks for the tip. I kept using my own because of portability across all my games.
This came at the right time. :)
Thank you :)
Right timing indeed… But I'm having an issue with this…
I've added a .png file (test.png) to the resources and made sure it gets copied to the output (it's under Copy Bundle Resources in Xcode)…
Then I debug, and when I reach this line:
NSData *texData = [[NSData alloc] initWithContentsOfFile:path];
path has the correct value (“test.png”), but the texData is empty after it runs this line… I went into the “build\Debug-iphoneos\unidir_vod_test.app” directory, and the “test.png” file is there (so I’m guessing it reaches the iPad)…
I’m wondering if I’m missing any permissions, or if the filepath is not correct (maybe something gets introduced by the system when it builds the local filesystem)…
Tried a websearch, but it’s like needle in a haystack… :\
Any ideas? :)
Each iOS app has a specific container where it runs and stores its files.
To access the resource path you must query the main bundle; a minimal sketch of that call (using the standard mainBundle / resourcePath API) looks like this:
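// Ask the main bundle for the app's resource directory
NSString *path = [[NSBundle mainBundle] resourcePath];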
Now let's assume you place everything inside a folder called "Game_Resources".
To get a std::string, something along these lines should do (a sketch using stringByAppendingPathComponent and UTF8String):
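// Append the folder to the bundle's resource path and convert it to a std::string
NSString *path = [[[NSBundle mainBundle] resourcePath]
                     stringByAppendingPathComponent:@"Game_Resources"];
std::string resourceDir([path UTF8String]);

// e.g. load a texture from that folder
Texture2D* tex = IPhoneOperativeSystem::LoadImage(resourceDir + "/tree_image.jpg");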
Hope it helps :)
It helped… took me a while to understand that "resourcePath" on the first line was a method of the class and not a variable… :)
It’s working now (I think, haven’t done anything with the texture yet), so thanks a million…
Some additional questions:
1) Do I have to release the NSString *path I got from NSBundle (or is it just a reference)?
2) Why use “stringByAppendingPathComponent”, instead of just adding “/Game_Resources” to the resulting std::string?
3) How do I tell my resources to go to a specific directory, instead of sitting in the root directory of my application?
Again, thanks for the help, this is confusing for an "old-school" PC/C++ programmer… :) And don't get me started on getting the certificates up and running! :)
1) You should release it, yes: [path release];
2) It's just for convenience. Older iOS SDKs would return the path without the trailing /, others with it. This way you don't have to bother with that. If you have some kind of folder structure inside, you can append it the regular way.
3) The best way is to place everything inside a folder in your project root; let's call it game_assets. Add that folder to your Xcode project inside Resources, but when adding it use the "folder references" option. The folder should show up blue and replicate your structure even if you change it outside Xcode.
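With a folder reference like that, loading keeps the same structure; a sketch (the subfolder and file name here are just examples):

// A folder reference is copied into the bundle as-is, so subpaths survive
NSString *path = [[[NSBundle mainBundle] resourcePath]
                     stringByAppendingPathComponent:@"game_assets/textures/tree_image.jpg"];
Texture2D* tex = IPhoneOperativeSystem::LoadImage([path UTF8String]);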
Thanks a million, David, you just saved me lots of time! :)
This poster: http://afsharious.wordpress.com/2011/07/27/loading-textures-in-opengl-on-ios/ seems to have stolen your code!
He credited the source, so it's all good. Thanks for the tip.
Did you try and compile this?
Texture2D(int texture id, int width, int height)
is not valid…
There was a small typo; it was supposed to be either int texture_id or int id. Just use int id. I've updated the article. Thanks for letting me know.
It seems this method doesn't handle alpha correctly. At least, it does alpha premultiplication.
Hi David,
Hope you are doing well.
I am an iOS developer. I am facing a problem converting a UIImage into a texture, which should then be rendered into a UIView.
I tried this with the GLImageProcessing example provided by Apple, but there is a bug in this example: when the UIImage is converted into a texture, it comes out somewhat blurred, so the original image does not convert properly into a texture.
Please help me solve this problem.
I would be very grateful to you.
Regards,
Kapil
I never worked much with UIImage, but can it be initialized with some sort of data buffer? Because if so, you can load the image like in the example and then pass it along. Also, is this for use with GL? If so, why are you using a UIImage to convert to a texture instead of loading it directly from disk into GL like the article says?
I want to perform brightness, contrast, and sharpness adjustments on a UIImage, but that is not possible without converting it into a texture, because the texture is built from the UIImage's pixels and this functionality works very fast on pixels, as shown in the GLImageProcessing example.
Also, Apple provides the CIFilter class for this, but it is slower; it does not work continuously and the user has to wait for it.
Ah, I see. Well, sorry, I can't help you there; I have more knowledge of GL than of the iOS SDK itself.