I was quite happy with the dynamism in the test scene at this point, as well as the pixelated experiment, so I decided to work on another purely aesthetic feature. I had never implemented bloom in previous rendering experiments and thought it would be a fun one to try: it adds an interesting dimension to the lighting of the scene, and could perhaps be extended to make a local star appear much brighter.


The process for this was to update the fragment shaders to output two colours to two different textures, specified using different colour attachments. The first texture receives the regular colour, and the second only receives the colour if it’s above some threshold value, resulting in a masked version of the image with either the regular colour or black.
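A minimal sketch of what that fragment shader change looks like, assuming hypothetical names throughout (the `FragmentOut` struct, the lighting placeholder, and the threshold uniform are illustrations, not the renderer's actual code):

```metal
#include <metal_stdlib>
using namespace metal;

// Two outputs, one per colour attachment.
struct FragmentOut {
    float4 colour [[color(0)]]; // regular render target
    float4 bright [[color(1)]]; // thresholded bloom mask
};

fragment FragmentOut litFragment(constant float &bloomThreshold [[buffer(0)]]
                                 /* ... usual shader inputs ... */) {
    float4 colour = float4(1.0); // stand-in for the usual lighting calculation

    FragmentOut out;
    out.colour = colour;
    // Write the colour to the mask only if it is bright enough, else black.
    float luminance = dot(colour.rgb, float3(0.2126, 0.7152, 0.0722));
    out.bright = (luminance > bloomThreshold) ? colour : float4(0, 0, 0, 1);
    return out;
}
```

The `[[color(n)]]` attributes are what route each output to the matching colour attachment on the render pass descriptor.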

To this second texture I applied a compute kernel that blurred the result, using the MPSImageGaussianBlur kernel provided by Apple. The idea with bloom is to take this texture, which contains a blurred version of only the bright parts of the original image, and add it to the original image. This sum has the effect of giving the bright parts of the image a glowing border. I tried using MPSImageAdd, but encountered issues with this, so instead opted to write a custom kernel that reads both images and replaces the contents of the first one with the sum of both.
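A custom add kernel of that shape can be sketched as follows; the kernel name, texture indices, and the read-write usage of the first texture are assumptions:

```metal
#include <metal_stdlib>
using namespace metal;

// Hypothetical kernel: add the blurred bright texture into the main image,
// overwriting the main image with the sum.
kernel void addBloom(texture2d<float, access::read_write> image [[texture(0)]],
                     texture2d<float, access::read>       bloom [[texture(1)]],
                     uint2 gid [[thread_position_in_grid]])
{
    // Guard against threads dispatched past the texture edge.
    if (gid.x >= image.get_width() || gid.y >= image.get_height()) { return; }

    float4 sum = image.read(gid) + bloom.read(gid);
    image.write(float4(sum.rgb, 1.0), gid);
}
```

The first texture needs `read_write` access (and the corresponding `MTLTextureUsage` flags on the CPU side) since it is both an input and the destination.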

I allowed the bloom threshold to be specified per shader, so that less bright objects can bloom without making the image too bright, or not bloom at all. The end result was a glowing aura around emissive or highly specular surfaces such as background stars, engine trails, and the planetary rings. I really like the effect, and it was surprisingly easy to implement.

Bloom effect applied to ringed planet, stars, and spaceship engine trails


Continuing the theme of making existing things look more visually interesting, I decided next to experiment with noise. I didn’t have a concrete plan for how I would apply it, but had a couple of ideas - varying planet height, and varying planet texture.

I previously experimented with generating noise while following along with The Art of Code’s fantastic shader tutorial on Value Noise using Shaderific on my iPad, but for my Metal renderer I chose to implement Perlin noise. Ken Perlin’s 2002 reference implementation of his eponymous noise algorithm is defined in 3 dimensions. In addition to the noise generation, I wrote a function that accepts a 3 dimensional coordinate, a number of octaves, and an initial scale. This function allows noise values at different “frequencies” for the same coordinate to be summed to add finer detail to the noise. I’m interested in implementing fractional Brownian motion for this purpose later on.
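The octave-summing function roughly follows this shape; this is a sketch rather than my exact code, and `perlin(_:)` stands in for the Perlin noise implementation (assumed to return values in roughly [-1, 1]):

```swift
import simd

// Sum noise at successively doubled frequencies and halved amplitudes,
// then normalise back to the base range.
func octaveNoise(at p: SIMD3<Float>, octaves: Int, initialScale: Float) -> Float {
    var sum: Float = 0
    var amplitude: Float = 1
    var frequency = initialScale
    var totalAmplitude: Float = 0
    for _ in 0..<octaves {
        sum += amplitude * perlin(p * frequency) // `perlin` is the base noise
        totalAmplitude += amplitude
        amplitude *= 0.5 // each octave contributes half as much...
        frequency *= 2   // ...at double the frequency
    }
    return sum / totalAmplitude
}
```

Each extra octave layers finer detail on top of the broad shapes from the lower frequencies, which is the same idea fractional Brownian motion formalises.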

Even though the noise is defined in 3 dimensions, you can easily get a 2d slice of the 3d noise by passing a constant value for one of the dimensions. To verify my Swift implementation I generated a texture in 2d, writing sampled noise at each pixel, and mapped the texture onto a quad in clip space. With the multisampling and pixelated filter, it looked like this:
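Generating such a slice texture can be sketched like this, assuming a hypothetical `perlin(_:)` noise function and an existing `device`; the size, scale, and pixel format are arbitrary choices for illustration:

```swift
import Metal
import simd

let size = 256
let descriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .r32Float, width: size, height: size, mipmapped: false)
let texture = device.makeTexture(descriptor: descriptor)!

// Sample the 3d noise with z held constant to get a 2d slice,
// remapping from [-1, 1] to [0, 1] for display.
var values = [Float](repeating: 0, count: size * size)
for y in 0..<size {
    for x in 0..<size {
        let p = SIMD3<Float>(Float(x), Float(y), 0) / Float(size)
        values[y * size + x] = perlin(p * 8) * 0.5 + 0.5
    }
}
texture.replace(region: MTLRegionMake2D(0, 0, size, size),
                mipmapLevel: 0,
                withBytes: values,
                bytesPerRow: size * MemoryLayout<Float>.stride)
```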

Perlin noise after low resolution filter

As previously stated, two applications for noise are to generate a texture for the planet, and to vary the height of the surface in a continuous way. Although the surface of the planet is two dimensional, I could not think of a way to map the surface of the sphere to the continuous two dimensional coordinates required to sample the noise. Instead, for each position on the surface that I wanted a noise value for, I sampled the noise at the point’s 3 dimensional coordinate, which is continuous.

I wanted to do the noise sampling for each fragment of the planet’s surface, and since the calculation for noise could be intensive depending on the number of fragments, as well as the number of octaves, I thought about ways to generate noise on the CPU and make it available on the GPU. The first approach I took was to try to generate a 3d texture. Having never used 3d textures before, I also thought it would be an interesting experiment. The first thing I noticed when I did this was that generating the texture on the CPU at runtime took a noticeable amount of time. Up until this point, the time it had taken to load or generate anything at runtime had been negligible, and this was a first indication of why games prepare assets in advance instead of procedurally generating everything, or have loading screens, or clever ways of loading in the background. Why did it take a long time? Well, because it’s a lot of data to generate. A 32-bit 1024x1024 texture will take up 4MB, but a 32-bit 1024x1024x1024 texture will take up 4GB!

This was a bit of a blocker so after a while I considered a different approach. The reason I wanted to generate the texture on the CPU was because I assumed the noise would be too expensive to calculate on the GPU. Since I didn’t have any data to back this up, I decided to see if the assumption was valid. I implemented the same noise generation code as Metal shader functions and as a test, I updated the planet’s vertex shader to add a small offset to each vertex in the direction of the vertex from the planet’s centre. I also varied the planet’s surface by changing the colour based on the noise in the planet’s fragment shader. This resulted in a planet with a bumpy, textured surface, and contrary to my performance assumptions, ran absolutely fine with noise sampling in both vertex and fragment shaders. Obviously my scene is incredibly simple so it won’t necessarily scale, but it was a good reminder to validate assumptions before pursuing an “optimisation”.
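The vertex displacement can be sketched like this; the `VertexIn`/`VertexOut` structs, the `octaveNoise` shader function, and the offset amount are all hypothetical, and the sketch assumes the planet is modelled with its centre at the origin:

```metal
#include <metal_stdlib>
using namespace metal;

struct VertexIn  { float4 position [[attribute(0)]]; };
struct VertexOut { float4 position [[position]]; };
struct Uniforms  { float4x4 mvp; };

float octaveNoise(float3 p, int octaves, float initialScale); // defined elsewhere

vertex VertexOut planetVertex(VertexIn in [[stage_in]],
                              constant Uniforms &uniforms [[buffer(1)]])
{
    // Direction from the planet's centre to this vertex (model space).
    float3 radial = normalize(in.position.xyz);
    // Push the vertex outwards by a small, noise-driven amount.
    float n = octaveNoise(in.position.xyz, 4, 2.0);
    float3 displaced = in.position.xyz + radial * n * 0.02;

    VertexOut out;
    out.position = uniforms.mvp * float4(displaced, 1.0);
    return out;
}
```

The fragment shader can sample the same noise function at the interpolated surface position to vary the colour, so the terrain shading stays consistent with the displaced geometry.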

Applying Noise

The bumpy planet looked a bit odd since it’s supposed to look like a gas giant which wouldn’t have terrain. I removed the changes to the planet and instead decided to apply noise to an icosahedron with faces wound inwards, that would move with the camera in the same way as the stars. This would become part of the “skybox”, with the stars drawn in front of it. My initial plan was to make the noise quite sparse to represent galaxies in the far distance, but after some initial tweaks I came upon a denser setting than planned. This had the effect of making the surrounding space a lot more visually interesting, as though the camera is in the middle of a nebula or dust cloud. The bloom effect amplified this even further as it made the brighter parts of the cloud glow. I plan to make this more easily configurable so that I can test different noise settings and colours, but for now I’m pretty satisfied with the effect:

This entry is part of a series on writing a Metal Renderer in Swift Playgrounds.