Now that I am finally not sick, it’s time to try to get things done! The first thing I’m getting to this week is a simple representation of the particles: I want to draw scaled textures at the points. I have two choices for this. I could make a shader for binding and drawing simple point sprites. These would be very cheap to draw, but as far as I understand, they have two issues:
- When the center of a GL_POINT is culled off screen, the whole sprite is discarded, even if it would still cover some screen space. This could lead to strange popping artifacts at the screen edges. I have yet to test the severity of this, so I can’t speak for how it would look. Chances are it could look all right, but I can find no direct example of it online. I’ll just trust what I read on this.
- GL_POINTS are not scaled with depth, so if I’ve understood this right, point sprites will all be drawn at a fixed pixel size, regardless of the depth of the point in the point cloud. Again, this is something I haven’t tried, but if it works like this, it would be an effect I wouldn’t want. I could do my own distance attenuation in the shader by scaling gl_PointSize with distance to the camera, but there might be a simpler solution altogether.
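Just to make the second point concrete, the distance attenuation I’m talking about could look something like the sketch below. It mirrors the formula OpenGL’s old fixed-function GL_POINT_DISTANCE_ATTENUATION used; the coefficient values and the clamp range here are placeholder assumptions of mine, not anything from a real implementation:

```python
import math

def attenuated_point_size(base_size, distance,
                          a=1.0, b=0.0, c=0.01,
                          min_size=1.0, max_size=64.0):
    """Fixed-function style attenuation:
    size = base_size * sqrt(1 / (a + b*d + c*d^2)),
    clamped to a sane pixel range. Coefficients are made up."""
    scale = math.sqrt(1.0 / (a + b * distance + c * distance * distance))
    return max(min_size, min(max_size, base_size * scale))

# A nearby point stays close to full size, a distant one shrinks.
near = attenuated_point_size(32.0, 1.0)
far = attenuated_point_size(32.0, 100.0)
```

The same expression dropped into a vertex shader and written to gl_PointSize would give depth-scaled sprites, at the cost of tuning the coefficients by hand.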
I first thought that point sprites would be a nice little solution, and true, they may be one. But the points presented above made me consider just writing a simple geometry shader (at first) which would take a point and make a quad facing the camera out of it. (A fragment shader can’t emit new vertices, so this expansion has to happen in the geometry stage.) This would mean I wouldn’t have to worry about either culling or distance attenuation, as both would be handled by OpenGL itself, the quad being ordinary geometry. This shader could later be changed to create actual, rotated particle geometry, something that might be interesting to look into later. However, that is nothing I will focus on during the run of this project.
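As a sketch of the quad idea (in Python rather than GLSL, and with names of my own choosing), expanding a point into a camera-facing billboard just means offsetting the point along the camera’s world-space right and up axes:

```python
def billboard_corners(center, cam_right, cam_up, half_size):
    """Expand one point into the four corners of a camera-facing quad.
    cam_right / cam_up are the camera's world-space right and up axes
    (the rotation rows of the view matrix); all names are hypothetical."""
    cx, cy, cz = center
    rx, ry, rz = (half_size * v for v in cam_right)
    ux, uy, uz = (half_size * v for v in cam_up)
    return [
        (cx - rx - ux, cy - ry - uy, cz - rz - uz),  # bottom-left
        (cx + rx - ux, cy + ry - uy, cz + rz - uz),  # bottom-right
        (cx - rx + ux, cy - ry + uy, cz - rz + uz),  # top-left
        (cx + rx + ux, cy + ry + uy, cz + rz + uz),  # top-right
    ]

# A point 5 units in front of an axis-aligned camera, quad of size 1.
corners = billboard_corners((0.0, 0.0, -5.0),
                            (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), 0.5)
```

In a geometry shader this would be four EmitVertex calls per input point, and since the result is regular triangles, frustum culling and perspective scaling come for free.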