Modern OpenGL makes it easy to render simple 3D graphics: just put each model in a vertex buffer object (VBO) then throw them all at the graphics card, in any order. It's fast, it's fun, and it works. Unless the models include transparency, in which case it breaks horribly. Here is a video showing what we tried:
Our tree models have clusters of polygons textured with rough brushstrokes to approximate a hand-painted look. The textures have areas around the edges that are fully or partially transparent, i.e. an alpha channel with values less than 1.0. Without this, the edges of the polygons would be too obvious and the trees would look too geometrical. While we do like the low-poly style, we are trying to add more hand-made detail for Patchwork Empire.
Blending and Z-Buffering
If we just enable blending and render the tree, it doesn't work. The problem is that polygons closer to the camera can be rendered before polygons further away, and blending interacts badly with z-buffering: the transparent areas of a polygon still update the z-buffer, so subsequent polygons are not displayed in that area, even though they should be visible through the first polygon. This causes very strange gaps to open and close in the tree as it is viewed from different angles.
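To make the failure concrete, here is a tiny CPU-side model of one pixel's z-buffered blending (the struct and function names are ours, purely for illustration, not engine code). The key line is the depth write: it happens even when the fragment is fully transparent.

```cpp
// Minimal single-pixel model of z-buffered alpha blending.
// Depth convention: smaller z = closer to the camera.
struct Pixel {
    float color = 0.0f;   // greyscale stand-in for RGB
    float depth = 1.0f;   // cleared to the far plane
};

// Draw one fragment with standard alpha blending and depth writes enabled.
void drawFragment(Pixel& p, float color, float alpha, float z) {
    if (z >= p.depth) return;  // depth test fails: fragment is hidden
    p.color = alpha * color + (1.0f - alpha) * p.color;  // blend over
    p.depth = z;               // a transparent fragment STILL writes depth
}
```

Draw a fully transparent near fragment (a see-through leaf edge) first, then an opaque branch behind it: the branch fails the depth test and the pixel keeps the background color, which is exactly the gap we saw.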
You can fix this problem by sorting the polygons from furthest to nearest before rendering them. However, you have to do this every frame, for every transparent polygon in the scene. It can make it impossible to batch models by texture ID, so you can't just throw VBOs at the graphics card any more, and sorting can't handle intersecting polygons at all. Far too much trouble!
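The sort itself is simple enough; the cost is repeating it every frame. A sketch of the usual approach, sorting by distance from the camera to each polygon's centroid (hypothetical types, and note that centroid sorting is only approximate):

```cpp
#include <vector>
#include <algorithm>

// One transparent polygon, reduced to its centroid for sorting purposes.
struct TransparentPoly {
    float cx, cy, cz;  // centroid in world space
    int   id;          // handle back into the real mesh data
};

// Squared distance to the camera; cheaper than sqrt, same ordering.
static float distSq(const TransparentPoly& p, float camX, float camY, float camZ) {
    float dx = p.cx - camX, dy = p.cy - camY, dz = p.cz - camZ;
    return dx * dx + dy * dy + dz * dz;
}

// Sort furthest-first so nearer polygons blend over farther ones.
// This must run every frame, and still can't fix intersecting polygons.
void sortBackToFront(std::vector<TransparentPoly>& polys,
                     float camX, float camY, float camZ) {
    std::sort(polys.begin(), polys.end(),
              [&](const TransparentPoly& a, const TransparentPoly& b) {
                  return distSq(a, camX, camY, camZ) > distSq(b, camX, camY, camZ);
              });
}
```

Since the draw order now depends on the camera, polygons with different textures end up interleaved, which is what kills batching by texture ID.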
OpenGL supports alpha testing, where fragments with alpha below a certain threshold are discarded entirely, avoiding the z-buffer writes and fixing the problem. Yay! But what threshold should we use? If we test that the alpha is greater than zero, we get strange color fringing and gaps around the edge of the texture, where the alpha falls from one to zero. It's not as bad as straight blending, but it's definitely still noticeable.
If we test that the alpha is greater than 0.5, it looks better. We still get some blending, from 1.0 down to 0.5, but we drop fragments before they become transparent enough to reveal too much of the background and open gaps in the object. While this approach is definitely acceptable, there is an even better option!
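In a core-profile context the fixed-function alpha test is gone, so the test becomes a discard in the fragment shader. A minimal sketch (the uniform and variable names are ours, not anything standard):

```glsl
#version 330 core

uniform sampler2D tex;
in  vec2 uv;
out vec4 fragColor;

void main() {
    vec4 color = texture(tex, uv);
    if (color.a <= 0.5)  // drop mostly-transparent fragments:
        discard;         // no color write, and no z-buffer write
    fragColor = color;
}
```

Changing the 0.5 here is how you'd experiment with the threshold trade-off described above.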
Sample Alpha to Coverage
If you are using multisampling, you can enable sample alpha to coverage, and transparent objects will just work (up to a point)! Multisampling collects multiple samples per pixel (surprise), then combines them to smooth out hard edges on polygons. However, it can also perform a kind of blending by treating the alpha value of a fragment as the coverage for that pixel. It's not perfect, and if you have too many layers of overlapping transparent polygons it won't look right. But if you only have a couple of layers, which we do, then it's fantastic!
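On the GL side this is a single state change on a multisampled framebuffer: `glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE)`. The effect can be sketched on the CPU: the hardware turns a fragment's alpha into a coverage bitmask, so alpha 0.5 lights up roughly half of the pixel's samples. Real GPUs dither the mask in implementation-specific patterns; this simple low-bits model is an assumption for illustration only.

```cpp
#include <cstdint>
#include <cmath>

// Approximate model of sample-alpha-to-coverage for one fragment.
// Real hardware may dither which samples are chosen; here we just set
// the low 'covered' bits. Assumes samplesPerPixel < 32.
uint32_t alphaToCoverageMask(float alpha, int samplesPerPixel) {
    if (alpha <= 0.0f) return 0u;
    if (alpha >= 1.0f) return (1u << samplesPerPixel) - 1u;
    int covered = (int)std::lround(alpha * samplesPerPixel);
    return (1u << covered) - 1u;  // mask with 'covered' low bits set
}
```

With only 4 or 8 samples per pixel the "blend" is coarse, which is why stacking many transparent layers stops looking right: there simply aren't enough samples to represent all the intermediate alphas.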