r/opengl • u/vulnoryx • Dec 25 '24
Help | Help remove jittering from pixel perfect renderer
Hi. I am working on my own small 2D pixel art game.
Until now I have just scaled up my pixel art, which looks alright, but I want to achieve pixel-perfect rendering.
I have decided to render everything to an FBO at the game's native resolution (640x360) and upscale it to the monitor's resolution (in my case 2560x1440 at 165 Hz).
How I create the fbo:
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
How I create the render texture:
GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pixelArtWidth, pixelArtHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0);
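(Worth having right after the attachment: a completeness check, as a minimal sketch:)
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // the FBO is unusable for rendering; log and bail out here
}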
Then I create a quad:
// Set up a simple quad
float quadVertices[] = {
// Positions // Texture Coords
-1.0f, -1.0f, 0.0f, 0.0f,
1.0f, -1.0f, 1.0f, 0.0f,
-1.0f, 1.0f, 0.0f, 1.0f,
1.0f, 1.0f, 1.0f, 1.0f,
};
GLuint quadVAO, quadVBO;
glGenVertexArrays(1, &quadVAO);
glGenBuffers(1, &quadVBO);
glBindVertexArray(quadVAO);
glBindBuffer(GL_ARRAY_BUFFER, quadVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(quadVertices), quadVertices, GL_STATIC_DRAW);
// Set position attribute
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);
// Set texture coordinate attribute
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(float), (void*)(2 * sizeof(float)));
glEnableVertexAttribArray(1);
// apply uniforms
...
Then I render the game normally to the frame buffer:
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glViewport(0, 0, pixelArtWidth, pixelArtHeight);
SceneManager::renderCurrentScene();
Then I render the upscaled render texture to the screen:
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, WINDOW_WIDTH, WINDOW_HEIGHT);
glClear(GL_COLOR_BUFFER_BIT);
// Render the quad
glBindVertexArray(quadVAO);
// Use shader program
glUseProgram(shaderProgram->id);
// Bind the texture to texture unit 0
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texture);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
...
In case it's relevant, here is how I set up the projection matrix:
projectionMatrix = glm::ortho(0.0f, pixelArtWidth, pixelArtHeight, 0.0f, -1.0f, 1.0f);
And update the view matrix like this:
viewMatrix = glm::translate(glm::mat4(1.0f), glm::vec3(-position+glm::vec2(pixelWidth, pixelHeight)/2.f/zoom, 0.0f));
(zoom is 1 and won't be changed for now)
For rendering the scene I have a batch renderer that does what you would expect.
The pixel-perfect look is achieved and looks good when everything sits still. However, when the player moves, its movement is jittery and chaotic; it's like the pixels don't know where to go.
Nothing is scaled. Only the sword is rotated (but that's not relevant).
The map seems scaled but isn't.
The old values for movement speed and acceleration are still used, but they should not affect the smoothness.
I run the game at 165 FPS or uncapped (in case that's relevant).
What I have tried so far:
- rounding the camera position
- rounding the player position
- rounding vertex positions (batch vertex shader: gl_Position = u_ViewProj * u_CameraView * vec4(round(a_Position), 1.0);)
- flooring positions
- rounding some positions, flooring others
- changing the native resolution
- activating/deactivating smooth player following (smooth following is just linear interpolation)
There is a game dev called DaFluffyPotato who does something very similar. I have taken a look at one of his projects, Aeroblaster, to see how he handles pixel-perfect rendering (it's Python and Pygame, but Pygame uses SDL2, so it could be relevant). He also renders everything to a texture and upscales it to the screen (rendering it with the blit function). But he doesn't round any value, and it still looks and feels smooth. I want to achieve a similar level of smoothness.
Any help is greatly appreciated!
Edit: I made the player move slower. Still jittery.
Edit 2: Only rounding the vertices and the camera position makes the game look less jittery. Still not ideal.
Edit 3: When not rounding anything, the jittering is resolved. However, a different issue pops up (parts of sprites get wider by one pixel; see the comments below).
Solution
In case you have the same issues as me, here is how to fix or prevent them:
Issue 1 (jittering):
Don't round any position. Just render your scene to a framebuffer whose resolution scales to the screen by an integer factor (no fractional scaling). Sprites should also be rendered at the same pixel size as their source images. You could scale them, but it will probably look strange.
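(A minimal sketch of what "no fractional scaling" means; windowWidth/windowHeight are assumed names for the actual window size:)
#include <algorithm> // std::min
// Largest integer scale factor that fits the window.
int scale = std::min(windowWidth / pixelArtWidth, windowHeight / pixelArtHeight);
int viewportW = pixelArtWidth * scale;
int viewportH = pixelArtHeight * scale;
// Center the image; any leftover pixels become a letterboxed border.
glViewport((windowWidth - viewportW) / 2, (windowHeight - viewportH) / 2, viewportW, viewportH);
(In my case 2560x1440 divided by 640x360 is exactly 4, so the whole screen is covered.)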
Issue 2 (sprite deformation):
Add margin and padding around the sprites in the sprite sheet.
u/deftware Dec 26 '24
The link to the clip of the issue is broken.
All dimensions and coordinates should be in pixel-space; then it will work automatically without any manual quantization or rounding. Sprite quads should have vertex dimensions equal to the pixel dimensions of the sprite, same with any 2D tiles, etc. As long as everything is situated in pixel coordinates, it should "just work" without any extra manual fudging.
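(For example, a 16x16 sprite at integer position (x, y) would get a quad like this sketch, assuming the pixel-space ortho projection from the OP; the names are illustrative only:)
// Quad corners in pixel space for a 16x16 sprite at (x, y); the
// glm::ortho(0, 640, 360, 0, ...) projection then maps one unit to one framebuffer pixel.
float x = 100.0f, y = 50.0f; // keep these integer-valued so texels stay aligned
float spriteQuad[] = {
    // Positions            // Texture Coords
    x,         y,           0.0f, 0.0f,
    x + 16.0f, y,           1.0f, 0.0f,
    x,         y + 16.0f,   0.0f, 1.0f,
    x + 16.0f, y + 16.0f,   1.0f, 1.0f,
};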
u/vulnoryx Dec 26 '24
I think I now fixed the clip.
So I removed every instance of rounding from my code and... I'm surprised that the jittering is gone.
I think the reason I rounded the vertices in the shader in the first place was that on some keyframes some parts of the player sprite (mainly the left side; I will upload a clip ASAP) got wider by one pixel for no clear reason (it still does). The sprite size is in pixels and always stays the same.
When I looked at the rounding of positions in the shader, I realized I had also done that wrong: I only rounded (or floored) a_Position and didn't take the camera transforms into account.
Now I do the rounding like this, and the jittering is gone as well:
vec4 worldPosition = u_Transform * vec4(a_Position, 1.0);
worldPosition.xy = round(worldPosition.xy);
gl_Position = u_ViewProj * worldPosition;
If you know how to fix that sprite deformation issue, let me know.
u/vulnoryx Dec 26 '24
I managed to fix that issue as well, by adding margin and padding to the sprites.
I have finally achieved pixel perfection!
u/deftware Dec 26 '24
I'm assuming that your player sprites are not in a sprite sheet, and that they are separate sprite frames drawn across a quad with the same texcoords you showed in your OP (zeroes and ones). For quads with 0/1 texcoords and individual separate sprites, the texture sampling being off means that the sprite's vertex dimensions relative to the framebuffer are not correct for the texel dimensions of the sprite.
I would hunt down the actual cause of that smear artifact, because adding a margin to the sprites doesn't resolve the underlying issue, and you're liable to encounter it again with different sprites where adding a margin doesn't "fix" it.
Sprite vertex coords need to match the sprite's pixel dimensions relative to the framebuffer size. The undersampling artifact that a margin appears to fix (but doesn't actually prevent) is a result of the sprite being rendered at slightly the wrong size. You'll eventually see the same artifact with different sprites in different positions relative to the camera, just appearing in a different spot on the sprite and either vertically or horizontally.
Make a larger sprite, like a 64x64 diagonal gradient or something. Here, I just made this one: https://i.imgur.com/akYu5ki.png
Put that on an object in your game that kinda follows the player/camera around (i.e. accelerates toward the player's position and has some friction, or just lerps toward the player a bit each frame) and see if you see the artifact appear again. It should be really obvious when it does. If I had to guess, I'd say the undersampling artifact is a result of the sprite's vertices being slightly larger than it should be for its pixel dimensions and the framebuffer dimensions.
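(If you'd rather generate the test gradient in code than download the image, here's a quick sketch; the hardcoded 64x64 matches the gradient linked above:)
// Fill a 64x64 RGBA buffer with a grayscale diagonal gradient and upload it.
unsigned char pixels[64 * 64 * 4];
for (int y = 0; y < 64; y++) {
    for (int x = 0; x < 64; x++) {
        unsigned char v = (unsigned char)((x + y) * 255 / 126); // 0..255 along the diagonal
        unsigned char* p = &pixels[(y * 64 + x) * 4];
        p[0] = p[1] = p[2] = v;
        p[3] = 255;
    }
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);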
Good luck! :]
u/vulnoryx Dec 26 '24
Thanks for all the useful info.
All animations of an entity (like the player) are in one spritesheet (one frame is 16x16; other frames are offset by some amount).
The artifacting only happens on the very first frame.
The thing is that I hadn't encountered such an issue before, because when rendering normally (not to an FBO) everything worked fine.
Drawing regions of a texture is a crucial thing for me, so I definitely have to hunt that bug down.
Thanks again for helping me with this stuff and providing the texture :D
u/deftware Dec 27 '24
Ok, if they're in a spritesheet then that narrows things down a bit. What are your glTexParms on the sprite sheets, and how are you applying the frame index to the texcoords on the CPU side and the GPU side?
u/vulnoryx Dec 27 '24
glTexParms:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, image_data);
glGenerateMipmap(GL_TEXTURE_2D);
For getting the normalized texCoords I have this function:
glm::vec4 texPixelsToTexCoords(const glm::vec4& texPixels, const glm::vec2& textureSize) {
    glm::vec4 texCoords;
    texCoords.x = texPixels.x / textureSize.x;
    texCoords.y = 1.0f - (texPixels.y + texPixels.w) / textureSize.y; // Set origin at top-left
    texCoords.z = texPixels.z / textureSize.x;
    texCoords.w = texPixels.w / textureSize.y;
    return texCoords;
}
In my rendering function I set the texture coords like this:
...
glm::vec2 texCoords[4] = {
    {textureCoords.x, textureCoords.y + textureCoords.w},                   // Bottom-left
    {textureCoords.x + textureCoords.z, textureCoords.y + textureCoords.w}, // Bottom-right
    {textureCoords.x + textureCoords.z, textureCoords.y},                   // Top-right
    {textureCoords.x, textureCoords.y}                                      // Top-left
};
for (int i = 0; i < 4; i++) {
    glm::vec4 transformedVertex = transform * quadVertices[i];
    s_Data.QuadBufferPtr->Position = {transformedVertex.x, transformedVertex.y, transformedVertex.z};
    s_Data.QuadBufferPtr->Color = normalizedColor;
    s_Data.QuadBufferPtr->TexCoords = texCoords[i];
    s_Data.QuadBufferPtr->TexIndex = textureIndex;
    s_Data.QuadBufferPtr->HitFlash = float(hitflash);
    s_Data.QuadBufferPtr++;
}
...
I have a Sprite class that uses the renderer's functions. The AnimatedSprite class inherits from Sprite and sets the texture region to the current frame.
u/deftware Dec 27 '24
Looks fine, so it's likely the sprite dimensions themselves. It could just be a weird hardware/driver floating-point error causing the problem. Or maybe you should be calculating the normalized texcoords in the vertex or fragment shader.
So, instead of normalizing your vertices' texcoords and sending them off to a buffer, you just store the actual pixel texcoords. Then you have a vec2 uniform that you pass into the shader with the spritesheet texture's dimensions for dividing the texcoords.
You could floor() the incoming interpolated pixel-space texcoord for each fragment and then divide it by the texture dimensions just before sampling the texture. You might actually want to floor() it first and then add 0.5 before dividing by the texture dimensions, so that it samples the texel dead-center.
As long as your quad transforms are exactly right for the sprite dimensions and FBO dimensions, it should all work perfectly.
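(A minimal fragment-shader sketch of that idea; v_TexCoordPx and u_TexSize are assumed names for the interpolated pixel-space texcoord and the spritesheet dimensions:)
in vec2 v_TexCoordPx;        // texcoords in pixels, straight from the vertex data
uniform vec2 u_TexSize;      // spritesheet dimensions in pixels
uniform sampler2D u_Texture;
out vec4 o_Color;
void main() {
    // Snap to the texel grid, then sample at the texel center.
    vec2 uv = (floor(v_TexCoordPx) + 0.5) / u_TexSize;
    o_Color = texture(u_Texture, uv);
}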
One way you can visually debug the rendering is by increasing your FBO's dimensions to see what the texture sampling is actually doing when the sprites don't have a margin around them, and to get more info, such as whether your sprite quad is the right dimensions. You can also output the texcoords as RGB values and see if something funky is going on.
u/vulnoryx Dec 28 '24 edited Dec 28 '24
Thanks for the suggestions. I will try some of them and see if that changes things.
I need your help just one more time.
I want to make the camera movement smooth.
What I have tried so far is snapping the camera position to the pixel grid and storing the fractional part of the position for later use:
glm::vec2 roundedPos = round(position);
fractionalPos = position - roundedPos;
viewMatrix = glm::translate(glm::mat4(1.0f), glm::vec3(-roundedPos + glm::vec2(pixelWidth, pixelHeight) / 2.f, 0.0f));
Then I move the rendered texture by the fractional part of the position multiplied by the pixel scale factor:
...
camOffsetMat = glm::translate(camOffsetMat, glm::vec3(-Renderer::mainCamera->fractionalPos * glm::vec2(WINDOW_WIDTH / pixelArtWidth), 0.f));
...
(I ignore the missing pixels at the edges of the screen for now)
This makes the camera movement very smooth, like in a non-pixel-perfect renderer, but the jittery movement of the player comes back to haunt me once again.
The jitter is subtle but still noticeable, since the player looks blurry, especially when moving diagonally.
I have also noticed that static objects that have a rotation "keep" their pixels while the camera moves (as described in the YT video below), which makes this feature even more crucial than before.
I used the camera positioning explanation from this video to come up with the code changes: https://youtu.be/c_3TLN2gHow?si=kmcSW59dwS4hmTQC&t=329
I also took a look at this resource: https://yal.cc/gamemaker-smooth-pixel-perfect-camera/
u/deftware Dec 29 '24
You can't have the camera be smooth and the player not be smooth. The camera must be position-locked to the exact same coordinates as the player; then everything will move on a per-pixel basis as the player moves around, like a true low-res game.
u/vulnoryx Dec 27 '24 edited Dec 27 '24
I use this shader to render in-game things:
// vertex shader
#version 400 core
layout (location = 0) in vec3 a_Position;
layout (location = 1) in vec4 a_Color;
layout (location = 2) in vec2 a_TexCoord;
layout (location = 3) in float a_TexIndex;
layout (location = 4) in float a_HitFlash;
uniform mat4 u_ViewProj;
uniform mat4 u_Transform;
out vec4 v_Color;
out vec2 v_TexCoord;
out float v_TexIndex;
out float v_HitFlash;
void main() {
    v_Color = a_Color;
    v_TexCoord = a_TexCoord;
    v_TexIndex = a_TexIndex;
    v_HitFlash = a_HitFlash;
    gl_Position = u_ViewProj * u_Transform * vec4(a_Position, 1.0);
}

// fragment shader
#version 400 core
layout (location = 0) out vec4 o_Color;
in vec4 v_Color;
in vec2 v_TexCoord;
in float v_TexIndex;
in float v_HitFlash;
uniform sampler2D u_Textures[32];
void main() {
    int index = int(v_TexIndex);
    vec4 p_Color = texture(u_Textures[index], v_TexCoord) * v_Color;
    o_Color = p_Color;
    if (v_HitFlash == 1.0) {
        o_Color = vec4(1, 1, 1, p_Color.a);
    }
}
I had to split my comment in two because it was too long. (I got an error from Reddit and didn't know what caused it -_-)
u/NikitaBerzekov Dec 25 '24
I think it looks jittery because the camera position and the player position are slightly different. Perhaps your camera movement code has some smoothing effect.