Lately, I've been trying to port one of my C++ engines to the DS. Everything is working great; however, I'm having some issues with textures, mostly with glTexImage2D().
You see, all my textures are stored as a char* of RGB data, but the last parameter of glTexImage2D asks for a uint8*. I've tried converting my char* into a uint8*. It compiles, but every pixel in the texture comes out the wrong color, as if the colors had been shuffled at random.
I'm guessing that the last parameter of glTexImage2D doesn't want RGB data, or at least not formatted the way I have it. Could someone tell me where I went wrong? Here is some source code if it helps:
"image.pixel" is a character pointer (char*). As you can see, I tried to make a function called "charToUint8" that tries to convert the data. Here is the "charToUint8" function:Code:int loadTexture(AU_Texture image) {int texId; glGenTextures(1, &texId); glBindTexture(GL_TEXTURE_2D, texId); glTexImage2D(}GL_TEXTURE_2D, 0, GL_RGB, TEXTURE_SIZE_128, TEXTURE_SIZE_128, 0, TEXGEN_TEXCOORD, charToUint8(image.pixels,image.width,image.height));return texId;
Code:
uint8* charToUint8(char* charPList, int w, int h) {
    int i = 0, ii = 0, r = 0, g = 0, b = 0;
    uint8* outUL = new uint8[int(w * h * 2)];
    while (i < int(w * h * 3)) {
        r = int(charPList[i]);
        g = int(charPList[i + 1]);
        b = int(charPList[i + 2]);
        // here is the problem
        outUL[ii] = r;
        outUL[ii + 1] = b;
        i += 3;
        ii += 1;
    }
    return outUL;
}
Edit: After some experimentation, I noticed that the native DS texture format wants a 2-byte color value per pixel. That means I would need to convert each 3-byte RGB value into a 2-byte DS color value.
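Here is a minimal sketch of what I think that conversion should look like. I'm assuming libnds (which the TEXGEN_TEXCOORD flag suggests I'm already using) provides the uint16 type and the RGB15 and BIT macros, and rgb24ToRGB15 is just a name I made up, so treat this as a rough idea rather than a tested fix:
Code:
#include <nds.h>  // uint16, RGB15(), BIT()

// Hypothetical helper: packs 24-bit RGB data (3 bytes per pixel)
// into the DS's native 16-bit format (5 bits per channel).
uint16* rgb24ToRGB15(const unsigned char* src, int w, int h) {
    uint16* out = new uint16[w * h];
    for (int p = 0; p < w * h; p++) {
        // Reading through unsigned char keeps byte values above 127
        // from sign-extending into negative ints.
        uint16 r = src[p * 3]     >> 3;  // 8 bits -> 5 bits
        uint16 g = src[p * 3 + 1] >> 3;
        uint16 b = src[p * 3 + 2] >> 3;
        // RGB15 packs r into bits 0-4, g into bits 5-9, and b into
        // bits 10-14; bit 15 marks the texel as opaque.
        out[p] = RGB15(r, g, b) | BIT(15);
    }
    return out;
}
loadTexture would then cast image.pixels to const unsigned char*, pass the result to glTexImage2D as a uint8*, and delete[] the buffer once the texture data has been copied.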
I would be grateful if someone could point me in the right direction.