2D Particles Animation with WebGL

05.12.2020

Let's draw some particles and animate them on our screen using shaders and raw WebGL. Frameworks such as three.js or pixi.js bring a lot of ease into expressing yourself creatively, but they also add considerable size and dependencies to your JS codebase. Writing lower-level WebGL code keeps your program lean and even lets you do things libraries like three.js don't allow.

We are going to create some 2D particles and animate them in our shaders, which are small programs compiled and run on the device GPU. Here is the final result:

See the Pen "WebGL Particle Animation" by Georgi Nikoloff (@gbnikolov) on CodePen.

Let's start by setting up some boilerplate code. Here is what we will do:

1. create a canvas and resize it according to the device screen

2. obtain its WebGL context and create an update loop that will update and redraw our graphics

/* Create a canvas and append it to the DOM */
const canvas = document.createElement('canvas')
document.body.appendChild(canvas)

/* Obtain WebGLRenderingContext for drawing */
/* 'experimental-webgl' is needed for IE11 */
const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl')

/* Initialize our program */
document.addEventListener('DOMContentLoaded', init)

function init () {
  /* Handle canvas resize */
  window.addEventListener('resize', resizeCanvas)
  resizeCanvas()

  /* Schedule first paint */
  requestAnimationFrame(renderFrame)
}

/* `ts` is the timestamp in milliseconds that requestAnimationFrame passes to its callback. */
/* We don't need it yet, but we will use it later to drive the animation */
function renderFrame (ts) {
  /* Paint over the canvas with a rgba color */
  gl.clearColor(0.8, 0.8, 0.8, 1.0)
  gl.clear(gl.COLOR_BUFFER_BIT)

  /* Schedule next particles redraw */
  requestAnimationFrame(renderFrame)
}

function resizeCanvas () {
  /* Multiply width and height by the device pixel ratio to prevent aliasing */
  canvas.width = devicePixelRatio * innerWidth
  canvas.height = devicePixelRatio * innerHeight

  /* Scale back down to the actual layout width and height */
  canvas.style.width = `${innerWidth}px`
  canvas.style.height = `${innerHeight}px`

  /* Set the WebGL context viewport */
  /* gl.drawingBufferWidth and gl.drawingBufferHeight represent the actual size of the current drawing buffer */
  gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight)
}

We have our basic setup ready and now need something to draw. WebGL has a rendering pipeline that determines how your data ends up as pixels on the screen. A full explanation of the rendering pipeline is out of scope for this demo, but the gist of it is that you get to write two small programs called shaders that will be run on the GPU. They are written in GLSL, a C-like language.

WebGL lets you prepare and issue render calls to the GPU, where you get to program the vertex shader and the fragment shader. Both are represented as WebGLShader objects and control, respectively, the final position and the appearance of your object on the user's screen.

Here is the minimal set of steps we need to take to draw our particles:

  1. Create a gl.VERTEX_SHADER and a gl.FRAGMENT_SHADER and supply our GLSL shader programs as strings. Compile the two shaders on the GPU and check their gl.COMPILE_STATUS for errors. These two programs will be executed on the device GPU and expect certain input variables that we will supply via the WebGL API.
  2. Create a WebGLProgram, attach the two successfully compiled shaders to it, link the program and check its gl.LINK_STATUS
  3. Query our newly created program for the locations allocated to the input variables it expects. Once we obtain their addresses on the GPU, we can supply data to them using the WebGL API.
  4. Once the program is linked successfully and we have provided our data, we can issue the draw call. WebGL supports different primitives such as gl.LINES, gl.TRIANGLES, gl.TRIANGLE_FAN, etc. For our demo we will use gl.POINTS and gl.LINES

1. Creating our vertex and fragment shader objects:

Let's create a helper method for creating a shader with a certain shaderType:

function makeShader (gl, { shaderType, shaderSource }) {
  /* Create a shader of type gl.VERTEX_SHADER or gl.FRAGMENT_SHADER */
  const shader = gl.createShader(shaderType)

  /* Supply the shader source as a JS string to the newly created shader */
  gl.shaderSource(shader, shaderSource)

  /* Compile the shader on the GPU */
  gl.compileShader(shader)

  /* Check if the shader compiled successfully by querying its gl.COMPILE_STATUS */
  const success = gl.getShaderParameter(shader, gl.COMPILE_STATUS)
  if (success) {
    return shader
  }

  /* If the compilation was not successful, log the shader info */
  console.error(gl.getShaderInfoLog(shader))

  /* Delete the invalid shader object */
  gl.deleteShader(shader)
}

2. Creating our WebGL Program:

Let's write another helper method that creates a WebGLProgram from our two shader sources:

function makeProgram (gl, { vertexShaderSource, fragmentShaderSource }) {
  /* Create a WebGLShader object with type gl.VERTEX_SHADER using our helper method */
  const vertexShader = makeShader(gl, {
    shaderType: gl.VERTEX_SHADER,
    shaderSource: vertexShaderSource,
  })

  /* Create a WebGLShader object with type gl.FRAGMENT_SHADER using our helper method */
  const fragmentShader = makeShader(gl, {
    shaderType: gl.FRAGMENT_SHADER,
    shaderSource: fragmentShaderSource,
  })

  /* Create a new WebGLProgram object */
  const program = gl.createProgram()

  /* Attach both shaders to the program */
  gl.attachShader(program, vertexShader)
  gl.attachShader(program, fragmentShader)

  /* Link the program on the device GPU */
  gl.linkProgram(program)

  /* Check if the program was linked successfully by checking its gl.LINK_STATUS */
  const success = gl.getProgramParameter(program, gl.LINK_STATUS)
  if (success) {
    return program
  }

  /* Log the program info for debugging */
  console.error(gl.getProgramInfoLog(program))

  /* Delete the invalid program */
  gl.deleteProgram(program)
}

3. Writing our shaders

We can now construct a new WebGLProgram, but first, let's write our shader sources as simple strings at the top of our program:

Vertex Shader:

/* precision of our floats in GLSL */
/* See https://webglfundamentals.org/webgl/lessons/webgl-precision-issues.html for more info */
precision highp float;

/* The vertex index as a float. */
/* We can calculate the final vertex position using this index */
attribute float a_indice;

/* Updated uniform time will be passed every frame so we can run our animation */
uniform float time;

/* Radius scale of movement for our particles. */
/* We will update it every frame with a new value animated in JavaScript using simple easing */
uniform float radiusScale;

/* Pass the particle count in the shader construction step so we don't have to pass an extra uniform */
const float PARTICLES_COUNT = ${PARTICLES_COUNT}.0;

/* GLSL does not have a built-in PI constant, so we need to supply it ourselves */
const float PI = ${Math.PI};

void main () {
  /* Construct a vec2 (x, y) position using the index we supplied as a Float32Array */
  /* Calculate the position using our index and time as inputs to simple sin / cos formulas. */
  /* Change the variable values to see the effect */
  float step = PI * radiusScale / PARTICLES_COUNT;
  float timeScale = 0.001;
  float minRadius = 0.1;
  float x = sin(a_indice * radiusScale * 0.01 * step - time * timeScale) * (a_indice / PARTICLES_COUNT + minRadius);
  float y = cos(a_indice * step - time * timeScale) * (a_indice / PARTICLES_COUNT + minRadius);

  vec2 position = vec2(x, y);

  /* WebGL expects us to supply the final position in 4 dimensions - xyzw */
  gl_Position = vec4(position, 0.0, 1.0);

  /* Set our particle size depending on how far it is from the center */
  float distFromCenter = distance(position, vec2(0.0));

  gl_PointSize = distFromCenter * 7.0 + 1.0;
}
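To get a feel for the motion, the same position formula can be mirrored in plain JavaScript. This mirror is for inspection only; in the demo the computation happens per-vertex on the GPU:

```javascript
/* JS mirror of the vertex shader position formula, for illustration only */
const PARTICLES_COUNT = 500
const minRadius = 0.1
const timeScale = 0.001

function particlePosition (indice, time, radiusScale) {
  const step = Math.PI * radiusScale / PARTICLES_COUNT
  const x = Math.sin(indice * radiusScale * 0.01 * step - time * timeScale) * (indice / PARTICLES_COUNT + minRadius)
  const y = Math.cos(indice * step - time * timeScale) * (indice / PARTICLES_COUNT + minRadius)
  return [x, y]
}

/* The radius of particle i is i / PARTICLES_COUNT + minRadius, so the */
/* particles spread outward from 0.1 near the center to 1.1 at the edge, */
/* while the `- time * timeScale` terms rotate them as time advances */
```

Because x and y use slightly different angles (note the extra `radiusScale * 0.01` factor on x), the particles don't sit on perfect circles, which is what produces the warped, organic look of the final animation.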

Fragment Shader:

precision highp float;

/* Our fragment shader is super simple: color each fragment uniformly with the same color */
void main () {
  gl_FragColor = vec4(vec3(0.6), 1.0);
}
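One optional variation: gl.POINTS are rasterized as square sprites by default. If you want round particles, the fragment shader can use the built-in gl_PointCoord input to discard fragments outside a circle. This is a sketch of such a variation (stored as a JS string just like the shaders above), not the shader the demo actually uses:

```javascript
/* Alternative fragment shader source: round particles via gl_PointCoord */
/* (an optional variation, not the shader used in the demo) */
const roundPointFragmentShaderSource = `
  precision highp float;

  void main () {
    /* gl_PointCoord runs from 0.0 to 1.0 across the rasterized point sprite */
    vec2 fromCenter = gl_PointCoord - vec2(0.5);

    /* Throw away fragments further than 0.5 from the sprite center */
    if (length(fromCenter) > 0.5) {
      discard;
    }

    gl_FragColor = vec4(vec3(0.6), 1.0);
  }
`
```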

We can then use our helper method and create our program using our shaders:

const drawProgram = makeProgram(gl, {
  vertexShaderSource,
  fragmentShaderSource,
})

4. Preparing our indices array for the particles draw call

Once the program has been successfully linked, let's supply it with some data so we can render it on screen. In the next snippet we will:

  1. Prepare our indices as a Float32Array, so we can use them on the GPU for our animation.
  2. Look up the a_indice variable location on our GPU and enable it
  3. Supply our particle indices as a WebGLBuffer to the a_indice variable on our GPU

const PARTICLES_COUNT = 500

/* Prepare our indices as a Float32Array: [0, 1, 2, ..., 499] */
const indices = new Float32Array(PARTICLES_COUNT).map((_, i) => i)

/* Create a WebGLBuffer to hold our indices array */
const indicesBuffer = gl.createBuffer()

/* Bind our indicesBuffer to the active gl.ARRAY_BUFFER */
gl.bindBuffer(gl.ARRAY_BUFFER, indicesBuffer)

/* Supply data to our indicesBuffer via the current gl.ARRAY_BUFFER binding point */
gl.bufferData(gl.ARRAY_BUFFER, indices, gl.STATIC_DRAW)

/* Query the a_indice input variable in our vertex shader from the linked program running on our GPU */
const positionLocation = gl.getAttribLocation(drawProgram, 'a_indice')

/* Enable the variable on our GPU */
gl.enableVertexAttribArray(positionLocation)

/* Point the position location to the active buffer and specify its layout */
/* Second argument is "1" because we have one index per vertex */
/* Please check https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/vertexAttribPointer */
gl.vertexAttribPointer(positionLocation, 1, gl.FLOAT, false, 0, 0)
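The last two arguments of gl.vertexAttribPointer are the stride and offset in bytes. With a single float attribute per vertex both can stay 0, but for a hypothetical interleaved layout (say a vec2 position plus a float index per vertex, which this demo does not use) they would be computed like this:

```javascript
/* Hypothetical interleaved vertex layout: [x, y, index] per vertex */
const FLOAT_SIZE = Float32Array.BYTES_PER_ELEMENT // 4 bytes

const floatsPerVertex = 3
const stride = floatsPerVertex * FLOAT_SIZE // 12 bytes from one vertex to the next
const positionOffset = 0                    // x, y start at byte 0
const indexOffset = 2 * FLOAT_SIZE          // the index float starts after the two position floats

/* With such a layout the pointer calls would become: */
/* gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, stride, positionOffset) */
/* gl.vertexAttribPointer(indiceLocation, 1, gl.FLOAT, false, stride, indexOffset) */
```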

We need to explicitly tell WebGL to use our program before supplying any uniforms or issuing draw calls:

gl.useProgram(drawProgram)

Now that we have supplied a per-vertex index attribute, we can pass global uniforms, which affect all vertices uniformly:

/* Look up our uniform locations on the GPU */
const timeLocation = gl.getUniformLocation(drawProgram, 'time')
const radiusScaleLocation = gl.getUniformLocation(drawProgram, 'radiusScale')
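The vertex shader comments mention that radiusScale is animated in JavaScript with simple easing. A minimal sketch of such an exponential ease toward a target value; the 0.1 easing factor and the target are assumptions for illustration, not values taken from the demo:

```javascript
/* Sketch: ease radiusScale toward a target a little every frame */
let radiusScale = 1
const targetRadiusScale = 2 // assumed target, e.g. changed on user interaction
const EASE_FACTOR = 0.1     // assumed value; smaller means slower easing

function updateRadiusScale () {
  /* Move a fixed fraction of the remaining distance each frame */
  radiusScale += (targetRadiusScale - radiusScale) * EASE_FACTOR
  return radiusScale
}

/* After enough frames radiusScale converges on targetRadiusScale */
for (let frame = 0; frame < 100; frame++) updateRadiusScale()
```

Calling updateRadiusScale once per frame before uploading the uniform gives the animation its smooth, decelerating feel.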

5. Issuing draw calls and the update loop

At this point we have set up all of the WebGL state we need: we have compiled and linked our WebGL program, set the viewport and supplied correctly formatted input variables to our shaders. With all this in place we can issue our render calls inside renderFrame:

/* Paint over the canvas with a rgba color */
gl.clearColor(0.8, 0.8, 0.8, 1.0)
gl.clear(gl.COLOR_BUFFER_BIT)

/* Supply the elapsed time to our shader */
/* `ts` is the timestamp requestAnimationFrame passes to renderFrame */
gl.uniform1f(timeLocation, ts)

/* Supply the updated radiusScale to our shader */
gl.uniform1f(radiusScaleLocation, radiusScale)

/* Issue a render call using gl.LINES */
gl.drawArrays(gl.LINES, 0, PARTICLES_COUNT)

/* Issue a render call with gl.POINTS */
gl.drawArrays(gl.POINTS, 0, PARTICLES_COUNT)

/* Issue next render */
requestAnimationFrame(renderFrame)

6. Conclusion

This demo is not production-ready code, but rather a small prototype that showcases core WebGL concepts while giving a high-level overview of the process. Without the comments it is ~130 lines. With a little more code we could handle user interactions, change colors depending on position, convert it to 3D, and so on.

Happy coding!
