How to WebGL

Ever wanted to get into building something with WebGL but stopped because of how daunting it seemed? That's exactly where I was. Through this article, however, I hope you won't have to go through the same steep learning curve I faced, and will maybe, hopefully, come to like WebGL for its quirks the way I have.

What we will be building in this article is shown below. Move your mouse from left to right to see the effect.

See the Pen How to WebGL by Ashwin P Chandran (@Werty7098) on CodePen.

Before reading this article, I would highly recommend being familiar with vertex and fragment shaders and what they do on the GPU. This post assumes you already know their roles and just want to get a simple WebGL app up and running.

Here is a simple introduction to them both: https://www.youtube.com/watch?v=C1ZUeHLb0YU

The Code

The code to build this may seem daunting at first and can be seen below. Not to worry; when broken down, it's pretty simple.

I am showing the final code first, as it will give you a better idea of what the end product will look like. If at any point you don’t understand what’s happening, just scroll down to the appropriate section to see what that part does.

First let’s look at the HTML markup.

<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <title>WebGL Basics</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <link rel="stylesheet" type="text/css" media="screen" href="main.css">
    
</head>
<body>
    <canvas id="web-gl" width="300" height="300"></canvas>
    <script src="main.js"></script>
</body>
</html>

The only thing to really worry about in the markup above is the <canvas> element; all the WebGL code renders inside it. To have it nicely centred, we have the following in main.css:

body, html {
  display: flex;
  justify-content: center;
  height: 100%;
  align-items: center;
}

canvas {
  border-radius: 30px;
  box-shadow: 0px 20px 30px -10px #00000047;
}

Now with our markup in place, let's get to the meat of this article: how to set up the WebGL environment. Below is the whole main.js file, but I will break it down so that we can understand how each section contributes to what we want to render.

let canvas;
let gl;
let program;
let vertexBuffer;
let vertexCount;
let scale = 0.5;
let theta = 0;
const vertexNumComponents = 2;

// Load Shaders
Promise.all([
  fetch("./shaders/vertex-shader.glsl").then(v => v.text()),
  fetch("./shaders/fragment-shader.glsl").then(v => v.text())
]).then(source => {
  setup(source[0], source[1]);
  requestAnimationFrame(draw);
}).catch(err => console.log(err));

// Setup WebGl environment
function setup(vertexSource, fragmentSource) {

  // Initialize canvas and webgl context
  canvas = document.getElementById("web-gl");
  gl = canvas.getContext("webgl");

  // Compile both shaders
  const vertexShader = gl.createShader(gl.VERTEX_SHADER);
  gl.shaderSource(vertexShader, vertexSource);
  gl.compileShader(vertexShader);
  if (!gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS))
    console.log(gl.getShaderInfoLog(vertexShader)); 

  const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
  gl.shaderSource(fragmentShader, fragmentSource);
  gl.compileShader(fragmentShader);
  if (!gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS))
    console.log(gl.getShaderInfoLog(fragmentShader)); 

  // Create WebGL Program
  program = gl.createProgram();
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS))
    console.log(gl.getProgramInfoLog(program));

  // Create shape
  const vertices = [
    -1.0, -1.0,
    1.0, -1.0,
    1.0, 1.0,
    -1.0, -1.0,
    1.0, 1.0,
    -1.0, 1.0,
  ]
  vertexCount = vertices.length / vertexNumComponents;

  vertexBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
  gl.bufferData(
    gl.ARRAY_BUFFER, 
    new Float32Array(vertices), 
    gl.STATIC_DRAW
  ); 

  // Set rotation based on mouse position
  const setRot = e => theta = 2 * Math.PI * e.pageX / innerWidth;
  const setRotTouch = e => theta = 2 * Math.PI * e.touches[0].pageX / innerWidth;
  document.addEventListener("mousemove", setRot);
  document.addEventListener("touchmove", setRotTouch);
}

function draw() {
  gl.viewport(0, 0, canvas.width, canvas.height);
  gl.clearColor(0.8, 0.9, 1.0, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);

  gl.useProgram(program);

  // Transformation Matrices
  const ScaleMatrix = [scale, 0, 0, scale]
  const RotationMatrix = [
    Math.cos(theta), -Math.sin(theta), 
    Math.sin(theta), Math.cos(theta)
  ]
  gl.uniformMatrix2fv(
    gl.getUniformLocation(program, "scaleMatrix"), 
    false, 
    ScaleMatrix
  );
  gl.uniformMatrix2fv(
    gl.getUniformLocation(program, "rotationMatrix"), 
    false, 
    RotationMatrix
  );

  // Load square to Vertex Shader (aka Draw)
  const posPointer = gl.getAttribLocation(program, "aVertexPosition");
  gl.enableVertexAttribArray(posPointer);
  gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
  gl.vertexAttribPointer(
    posPointer, 
    vertexNumComponents, 
    gl.FLOAT, 
    false, 
    0, 
    0
  );
  gl.drawArrays(gl.TRIANGLES, 0, vertexCount);

  requestAnimationFrame(draw);
}

Now we have just two more files to deal with: vertex-shader.glsl and fragment-shader.glsl.

// Vertex Shader
attribute vec2 aVertexPosition;

uniform mat2 scaleMatrix;
uniform mat2 rotationMatrix;

varying vec4 pixelPosition;

void main() {
  gl_Position = vec4(scaleMatrix * rotationMatrix * aVertexPosition, 0.0, 1.0);
  // interpolated pixel position
  pixelPosition = vec4(aVertexPosition, 0.0, 1.0);
}
// Fragment Shader
// Set the default precision
precision highp float;
uniform mat2 rotationMatrix;
varying vec4 pixelPosition;
void main() {
    vec3 color = vec3((pixelPosition.x + 1.0) / 2.0, 0.0, 0.5);
    gl_FragColor = vec4(color, 1.0);
}

Breakdown

The WebGL program here can be broken down into four major parts:

  • Setup
  • Animation Loop
  • Vertex Shader
  • Fragment Shader

Let’s look at what each of these sections does in detail.

Declare the global variables and load shaders

The first few lines of main.js declare some global variables and load the shader files. This step can be done in a multitude of ways and isn’t central to our goal here. These globals just share context between the setup stage and the run stage, since our application is very simple. What each one does is explained in the Setup section.

let canvas;
let gl;
let program;
let vertexBuffer;
let vertexCount;
let scale = 0.5;
let theta = 0;
const vertexNumComponents = 2;

// Load Shaders
Promise.all([
  fetch("./shaders/vertex-shader.glsl").then(v => v.text()),
  fetch("./shaders/fragment-shader.glsl").then(v => v.text())
]).then(source => {
  setup(source[0], source[1]);
  requestAnimationFrame(draw);
}).catch(err => console.log(err));

Setup

Having declared the globals and loaded the shader files, we call the setup function. The setup function here is simply the way I have implemented it; what matters is understanding what happens inside it.

It has the role of setting up the WebGL environment and can be thought of as the function that puts everything in place for the draw loop to use while drawing. To set up the environment we need to do the following steps:

  • Get the WebGL context to be able to use the WebGl API
  • Compile the Shaders
  • Link the shaders to a WebGL program instance

The first thing we need to do to set up WebGL is get the WebGL context from the canvas element as gl:

// Initialize canvas and webgl context
canvas = document.getElementById("web-gl");
gl = canvas.getContext("webgl");

The gl variable will point to the WebGL context, which allows us to use the WebGL API. The next step is to compile both shaders (the vertex and fragment shaders) using the API.

// Compile both shaders
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertexSource);
gl.compileShader(vertexShader);
if (!gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS))
  console.log(gl.getShaderInfoLog(vertexShader)); 

const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragmentSource);
gl.compileShader(fragmentShader);
if (!gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS))
  console.log(gl.getShaderInfoLog(fragmentShader)); 

To compile a shader we do the following steps:

  • Create a shader instance for the type of shader we want to compile.
  • Load the source code for that shader into its instance
  • Compile and check if the compile was successful
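
The three steps above can be wrapped in a small helper. This is a sketch of my own (the function name compileShader is not part of the article's code); it throws on failure instead of just logging, which is one reasonable way to surface compile errors early.

```javascript
// Hypothetical helper wrapping create / source / compile / check,
// usable for both gl.VERTEX_SHADER and gl.FRAGMENT_SHADER.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    // Grab the log before freeing the failed shader object
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error("Shader compile failed: " + log);
  }
  return shader;
}
```

With this, the two near-identical compile blocks collapse into two one-line calls.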

Now we create a WebGL program that will use these shaders to render graphics when data is passed into it. This section is self-explanatory.

// Create WebGL Program
program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS))
  console.log(gl.getProgramInfoLog(program));

This essentially sets up the WebGL environment for us. However, to do something useful with it we need to load some data into the program and describe how the program should interpret that data. So let’s load some data.

// Create shape
const vertices = [
  -1.0, -1.0,
  1.0, -1.0,
  1.0, 1.0,
  -1.0, -1.0,
  1.0, 1.0,
  -1.0, 1.0,
]
vertexCount = vertices.length / vertexNumComponents;

vertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.bufferData(
  gl.ARRAY_BUFFER, 
  new Float32Array(vertices), 
  gl.STATIC_DRAW
); 

The code above makes a square, with every pair of values representing the x, y coordinates of one vertex. If you look closely, there are 6 points instead of 4 for the square.

This is because the program will draw the square as two triangles. It is common in WebGL to represent complex shapes as a collection of triangles.
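
The quad-to-triangles expansion can be made explicit with a small sketch (the helper quadToTriangles is hypothetical, not part of the article's code): four corners become six vertices, with two corners shared between the triangles.

```javascript
// Expand a quad's four corners (each an [x, y] pair) into the six
// vertices of two triangles, in the same order as the vertex list above.
function quadToTriangles(bl, br, tr, tl) {
  return [
    ...bl, ...br, ...tr, // first triangle
    ...bl, ...tr, ...tl, // second triangle
  ];
}

// Reproduces the article's vertices array for the unit square:
const vertices = quadToTriangles([-1, -1], [1, -1], [1, 1], [-1, 1]);
```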

To load the data, we first create a buffer instance that will hold this vertex information and then pass the information into it. To learn what the two major functions here, bindBuffer and bufferData, do and what parameters they take, see https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/bindBuffer and https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/bufferData.

Finally, let’s introduce some mouse and touch interaction into our test app.

// Set rotation based on mouse position
const setRot = e => theta = 2 * Math.PI * e.pageX / innerWidth;
const setRotTouch = e => theta = 2 * Math.PI * e.touches[0].pageX / innerWidth;
document.addEventListener("mousemove", setRot);
document.addEventListener("touchmove", setRotTouch);
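
Both listeners reduce to the same mapping, which can be pulled out for clarity (the function name positionToAngle is my own): a horizontal page position from 0 to the page width maps linearly onto a full rotation from 0 to 2π.

```javascript
// Map a horizontal position in [0, width] to an angle in [0, 2π].
function positionToAngle(pageX, width) {
  return 2 * Math.PI * pageX / width;
}
```

So the far left edge gives no rotation, the centre gives a half turn, and the far right edge gives a full turn.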

With our setup ready, we are all set to define how the application should draw this data on our canvas.

Animation Loop

This is where we tell our WebGL program what to draw for every frame of the animation. It involves a few standard steps:

  • Reset the canvas so that the previous frame data is cleared
  • Set the WebGL program to be used
  • Load the data (there are 3 kinds of data generally loaded here)
    • Uniforms (affine matrices, constants, values to be applied to each vertex)
    • Textures
    • Vertices (the points that will be individually operated on by the vertex shader)
  • Draw and Repeat

So let’s start by resetting the canvas:

gl.viewport(0, 0, canvas.width, canvas.height);
gl.clearColor(0.8, 0.9, 1.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);

Then we activate the program. Multiple programs can be used to enable different vertex and fragment shaders; here we will use only one.

gl.useProgram(program);

Now we can load the matrices and constants that the program uses independently of the vertices.

// Transformation Matrices
const ScaleMatrix = [scale, 0, 0, scale]
const RotationMatrix = [
  Math.cos(theta), -Math.sin(theta), 
  Math.sin(theta), Math.cos(theta)
]
gl.uniformMatrix2fv(
  gl.getUniformLocation(program, "scaleMatrix"), 
  false, 
  ScaleMatrix
);
gl.uniformMatrix2fv(
  gl.getUniformLocation(program, "rotationMatrix"), 
  false, 
  RotationMatrix
);
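
To see what these uniforms will do, here is a CPU-side sketch (not part of the article's code; mat2MulVec and transformVertex are names of my own) of the product the vertex shader computes, scaleMatrix * rotationMatrix * vertex. The arrays are laid out column-major, which is the order uniformMatrix2fv expects.

```javascript
// Multiply a column-major 2x2 matrix [m00, m10, m01, m11] by a vec2.
function mat2MulVec(m, v) {
  return [m[0] * v[0] + m[2] * v[1], m[1] * v[0] + m[3] * v[1]];
}

// Mirror of the shader expression: scaleMatrix * rotationMatrix * vertex
function transformVertex(scale, theta, vertex) {
  const S = [scale, 0, 0, scale];
  const R = [
    Math.cos(theta), -Math.sin(theta),
    Math.sin(theta), Math.cos(theta),
  ];
  return mat2MulVec(S, mat2MulVec(R, vertex));
}
```

With theta = 0 the vertex is simply scaled, which is why the square starts at half size.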

And finally we load the vertices themselves and draw them. In this example we will not deal with textures, but they follow a similar procedure.

// Load square to Vertex Shader (aka Draw)
const posPointer = gl.getAttribLocation(program, "aVertexPosition");
gl.enableVertexAttribArray(posPointer);
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.vertexAttribPointer(
  posPointer, 
  vertexNumComponents, 
  gl.FLOAT, 
  false, 
  0, 
  0
);
gl.drawArrays(gl.TRIANGLES, 0, vertexCount);

This is one of the trickiest parts of this setup so let me go over it in detail.

  • We first need to get access to the vertex attribute declared in the vertex shader using getAttribLocation.
  • Next we need to ensure that the attribute is activated; it is deactivated by default. We do this using enableVertexAttribArray. Technically this can be done just once if we never disable it, but relying on that is bad practice, so it is done on every iteration of the draw loop.
  • Next we need to bind the buffer that we loaded earlier back to the target. Binding a buffer to the target breaks any previous binding to the same target, so this is done before each draw operation to ensure the right buffer is drawn.
  • Then we define how the attribute must read and interpret the buffer we are about to draw. Details of this function’s parameters can be found here: https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/vertexAttribPointer
  • Finally we draw this buffer as triangles. Details of its parameters can be found here: https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/drawArrays
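
The stride and offset arguments of vertexAttribPointer matter once a buffer holds more than one attribute. As a hypothetical illustration (the article's buffer is tightly packed positions only, so it passes 0 for both), here is how the byte values would be derived for an interleaved [x, y, r, g, b] layout:

```javascript
// Byte arithmetic for a hypothetical interleaved [x, y, r, g, b] vertex.
const FLOAT_BYTES = Float32Array.BYTES_PER_ELEMENT; // 4 bytes per float
const posComponents = 2;   // x, y
const colorComponents = 3; // r, g, b

const stride = (posComponents + colorComponents) * FLOAT_BYTES; // bytes per vertex
const posOffset = 0;                             // position starts each vertex
const colorOffset = posComponents * FLOAT_BYTES; // color follows the position
```

Each attribute would then get its own vertexAttribPointer call sharing the same stride but a different offset.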

With this, the program knows exactly what needs to be drawn for every frame of the animation. We then request this function again on the next animation frame to update the canvas.

requestAnimationFrame(draw);

Vertex Shader and Fragment Shader

Now that we have defined what it is that our app must draw, let’s define how it should go about drawing it. For this we have two shaders that operate at different levels in the graphics pipeline.

Graphics Pipeline

WebGL follows the OpenGL ES specification here. By writing GLSL files we can define our two shaders. First let’s define our vertex shader, which operates on the individual vertices of the square that we passed in earlier.

attribute vec2 aVertexPosition;

uniform mat2 scaleMatrix;
uniform mat2 rotationMatrix;

varying vec4 pixelPosition;

void main() {
  gl_Position = vec4(scaleMatrix * rotationMatrix * aVertexPosition, 0.0, 1.0);
  // interpolated pixel position
  pixelPosition = vec4(aVertexPosition, 0.0, 1.0);
}

This shader does not do much except apply the rotation and scaling transformations to every vertex. It also passes the untransformed position along to the fragment shader through the varying pixelPosition variable.

Finally, in our fragment shader we color the square based on each pixel’s untransformed position on the x axis, as a simple way to observe the transformation.

// Set the default precision
precision highp float;
uniform mat2 rotationMatrix;
varying vec4 pixelPosition;
void main() {
    vec3 color = vec3((pixelPosition.x + 1.0) / 2.0, 0.0, 0.5);
    gl_FragColor = vec4(color, 1.0);
}
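
The shader's red channel is just a linear remap, shown here in plain JavaScript (xToRed is a name of my own for illustration): clip-space x runs from -1 to 1, while color channels run from 0 to 1.

```javascript
// Remap clip-space x in [-1, 1] to a red channel value in [0, 1],
// matching (pixelPosition.x + 1.0) / 2.0 in the fragment shader.
function xToRed(x) {
  return (x + 1.0) / 2.0;
}
```

So the square's left edge renders with no red and its right edge with full red, which is the gradient you see before any rotation.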

You can experiment by assigning the transformed gl_Position value to pixelPosition instead, to see what effect this has on the result.

Conclusion

With this we have built a simple WebGL application. We can see how, with four simple components, we can set up and run WebGL in the browser to render a myriad of applications. Many of the steps seen here give a level of granularity that may be too much for many use cases; in such a scenario we can use one of many third-party WebGL libraries to abstract most of these details away. That being said, it is always useful to know what goes on under the hood. I hope this was a helpful tutorial. Leave a comment if you feel something can be added or changed in the presentation of this article. Cheers.
