
I have run into a weird problem: the alpha values of sprites are not respected when the sprites are in motion.

I was trying to do particles in 3D, so to start I just rendered them onto quads (I am using Raylib). The texture is the usual circle fading from white to black used for particle systems. Somehow black border artifacts show up around the particles. Even after playing around with the various blend functions, I can't figure out how to remove them.

The code looks like this:

void DrawBlurSphere(Texture2D text, Vector3 position, f32 Radius, Color color)
{
    f32 x = position.x, y = position.y, z = position.z;
    f32 width  = 3.0f;
    f32 height = 3.0f;

    rlSetTexture(text.id);
    rlBegin(RL_QUADS);
    rlColor4ub(color.r, color.g, color.b, color.a);
    // Front Face
    rlNormal3f(0.0f, 0.0f, 1.0f); // Normal Pointing Towards Viewer
    rlTexCoord2f(0.0f, 0.0f);
    rlVertex3f(x - width / 2,
               y - height / 2,
               z); // Bottom Left Of The Texture and Quad
    rlTexCoord2f(1.0f, 0.0f);
    rlVertex3f(x + width / 2,
               y - height / 2,
               z); // Bottom Right Of The Texture and Quad
    rlTexCoord2f(1.0f, 1.0f);
    rlVertex3f(x + width / 2,
               y + height / 2,
               z); // Top Right Of The Texture and Quad
    rlTexCoord2f(0.0f, 1.0f);
    rlVertex3f(x - width / 2,
               y + height / 2,
               z); // Top Left Of The Texture and Quad
    rlEnd();

    rlSetTexture(0);
}

while (!WindowShouldClose())
{
    f32 dt = GetFrameTime();
    update_particles(&origin, dt);
    BeginDrawing();
    ClearBackground(BLANK);

    DrawFPS(SCREEN_WIDTH - 100, 10);

    BeginMode3D(camera);
    rlSetBlendFactors(RL_SRC_ALPHA, RL_ONE, RL_FUNC_ADD);
    BeginBlendMode(BLEND_CUSTOM);
    for (size_t i = 0; i < origin.used; ++i)
    {
        particle_3d *p1 = origin.pool + i;
        if (p1->lifetime > 0)
        {
            DrawBlurSphere(text, p1->position, 15, p1->color);
        }
    }
    EndBlendMode();
    EndMode3D();

    EndDrawing();
}

I have also tried the usual GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA pair, and that doesn't work either. The other thing is, the objects don't have that boxy look when they are stationary.

Can you please help me with this one?


  • Do you have depth writing enabled? This should usually be turned off when rendering translucent particles to ensure they don't clip one another. Commented Aug 16, 2024 at 8:02
  • @DMGregory Thank you so much, man. I played around with it and got it to work. Looks like I need to read up more about the functions in OpenGL. This feels like a rookie mistake. Commented Aug 16, 2024 at 22:22
  • Glad it helped! I wrote up an answer below explaining this in a bit more depth (pun intended 🥁). Let me know if you need anything clarified/corrected; else, you can click the ✅ icon to mark the answer as "Accepted" if it covers what you need. Commented Aug 17, 2024 at 0:25

1 Answer


The way rasterizers get content to layer correctly with depth is to use a "depth buffer".

This associates each pixel/fragment on the screen/destination buffer with a "depth" value related to the distance to the nearest surface in that direction. (I say "related" because this depth is usually stored with a non-linear mapping that keeps more precision available close to the camera, but we can ignore that wrinkle for our current purposes.) We clear this buffer to the farthest possible depth at the start of rendering.

When we're drawing opaque geometry, we first check the depth stored in the buffer for that spot (this is the "depth test"). If the stored value is closer than the fragment we're trying to draw, we abort shading/blending this fragment, because it should be occluded behind that closer surface. Otherwise, we shade and draw this fragment over whatever was already in the colour buffer, and replace the stored depth with this fragment's depth (this is the "depth write"). That prevents farther fragments that might be drawn later in the frame from drawing over this surface.
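The opaque-geometry rule above can be sketched as a tiny per-fragment function. This is a simplified model for illustration, not real driver code; here, smaller depth values mean closer to the camera:

```c
#include <assert.h>
#include <stdbool.h>

/* Simplified model of the per-fragment depth test + depth write for
   opaque geometry. `stored_depth` is this pixel's slot in the depth
   buffer; smaller values are closer to the camera. */
bool depth_test_and_write(float *stored_depth, float fragment_depth)
{
    if (fragment_depth > *stored_depth)
        return false;               /* depth test fails: fragment is occluded */
    *stored_depth = fragment_depth; /* depth write: reserve this pixel */
    return true;                    /* shade and blend this fragment */
}
```

A buffer cleared to the farthest depth (say, 1.0) accepts the first fragment drawn at a pixel, then rejects anything later that lands farther away than it.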

This gets more complicated, though, when we're rendering translucent blended effects: we don't want such a simple "closest fragment wins" rule; we want the output colour to be a blend of potentially several surfaces that all overlap at that point.

If our translucent geometry is not sorted back to front from the camera's perspective, a quad close to the camera can render first, with a quad behind it rendering later in the frame. If depth writes are enabled, the earlier quad reserves its depth in the depth buffer and prevents later geometry behind it from rendering, effectively punching out a square hole.

To fix this, we usually disable depth writing for translucent geometry and, as much as is practical, ensure it's sorted back to front, or use blend modes that are order-independent. This is not always perfectly solvable, so translucent rendering is full of odd compromises.
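The hole-punching scenario and the fix can both be reproduced in a one-pixel toy model. The names below are illustrative, not any real API; it assumes additive blending and that smaller depth means closer:

```c
#include <assert.h>
#include <stdbool.h>

/* One-pixel toy model: two translucent fragments covering the same pixel,
   drawn out of order (near quad first, far quad second), blended
   additively. `depth_writes` mimics the glDepthMask toggle. */
typedef struct { float color; float depth; } Pixel;

void blend_fragment(Pixel *px, float frag_color, float frag_depth,
                    bool depth_writes)
{
    if (frag_depth > px->depth) return;       /* depth test stays enabled */
    px->color += frag_color;                  /* additive blend */
    if (depth_writes) px->depth = frag_depth; /* optional depth write */
}
```

With depth writes on, the near quad reserves the pixel and the far quad is rejected (the "hole"); with depth writes off, both fragments pass the test against the cleared background and blend together.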

In OpenGL, you can toggle depth writing on or off with glDepthMask(GL_TRUE) / glDepthMask(GL_FALSE).
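In raylib, rlgl exposes the same toggle via rlEnableDepthMask() / rlDisableDepthMask(), so a sketch of your render loop with the fix applied might look like this (an untested fragment, not a drop-in implementation):

```c
BeginMode3D(camera);
rlSetBlendFactors(RL_SRC_ALPHA, RL_ONE, RL_FUNC_ADD);
BeginBlendMode(BLEND_CUSTOM);
rlDisableDepthMask();  // translucent particles: still depth-test, but don't write depth
// ... draw the particles here ...
rlEnableDepthMask();   // restore depth writes for any later opaque drawing
EndBlendMode();
EndMode3D();
```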

