Georgi Nikolov
https://archive.georgi-nikolov.com
Website for blog articles and works by Georgi Nikolov, a frontend developer living in Berlin, Germany.

Snake in WebAssembly
Mon, 04 Mar 2024 15:25:20 +0000
https://archive.georgi-nikolov.com/project/webassembly-snake

url - https://gnikoloff.github.io/wasm-snake/

code - https://github.com/gnikoloff/wasm-snake

Snake preview

Snake written in WebAssembly Text (WAT) format and compiled to WASM bytecode.

All graphics, game state and logic are written in WAT. The module is loaded in JS (the host environment), which is responsible for the game tick, user input and seeding the random number generator.

The WASM memory consists of 3 virtual pages (64 KB each), 192 KB total. Within it live the pixel buffer, the snake and food positions and the character data. The memory is shared with JS, which blits it to the screen using WebGL2.

You can refer to src/snake.wat for a detailed memory breakdown and lots of comments.
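To make the module/host split concrete, here is a minimal sketch of what the JS host side can look like. The export names, grid dimensions and blit helper are hypothetical stand-ins; see the repo for the real interface.

js*/* Hypothetical host-side sketch - export names and sizes are illustrative */
const WIDTH = 100 /* hypothetical screen dimensions */
const HEIGHT = 100

/* 3 pages x 64 KB, shared between JS and the WASM module */
const memory = new WebAssembly.Memory({ initial: 3, maximum: 3 })

const { instance } = await WebAssembly.instantiateStreaming(
  fetch('snake.wasm'),
  { env: { memory } }
)

/* WAT has no built-in source of randomness, so the host seeds the PRNG */
instance.exports.seedRandom(Date.now())

/* The pixel buffer lives at a known offset inside the shared memory */
const pixels = new Uint8Array(memory.buffer, 0, WIDTH * HEIGHT * 4)

function frame (ts) {
  /* Advance the game state inside the WASM module */
  instance.exports.tick(ts)
  /* blitToWebGL2 is a hypothetical helper: gl.texSubImage2D + fullscreen quad */
  blitToWebGL2(pixels)
  requestAnimationFrame(frame)
}
requestAnimationFrame(frame)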

References and readings

WebGPU Raytracer
Sun, 14 Jan 2024 12:23:55 +0000
https://archive.georgi-nikolov.com/project/webgpu-raytracer

url - https://gnikoloff.github.io/webgpu-raytracer/
code - https://github.com/gnikoloff/webgpu-raytracer

Technologies Used: Typescript, WebGPU, WGSL
Role: Development

Front view of pathtraced scene

Side view of pathtraced scene

After doing rasterization for years, I was very intrigued by raytracing. After all, it is the holy grail of computer graphics, producing incredible photorealistic imagery with soft shadows, ambient occlusion and blurred reflections. These effects are difficult to achieve using a real-time 3D rasterizer but here you essentially get them for free with little to no rendering tricks involved.

At the end of the day I ended up with what's called a path tracer. It fires a large number of rays through each pixel in a stochastic manner so that the image converges and the noise is gradually removed.

The app is composed of two parts: code that runs on the CPU and code that runs on the GPU.

The CPU side is managed by Typescript. It loads Wavefront object and material files, creates the bounding volume hierarchy, packs all the scene triangles into buffers and uploads them to the GPU. It also handles user interaction and camera movement.

The GPU side is managed by WGSL shader code. This is the heart of the raytracer and runs in parallel on the GPU via a compute shader. It bounces rays around the scene and gathers the accumulated color, which it finally writes to the corresponding pixel in an image buffer. The image buffer is then blitted to the device screen.
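For reference, the host-side code for dispatching such a compute pass each frame is quite small. Here is a generic sketch with a hypothetical shader string and bind group, not the repo's actual setup:

js*const pipeline = device.createComputePipeline({
  layout: 'auto',
  compute: {
    module: device.createShaderModule({ code: raytraceWGSL }), /* hypothetical WGSL source */
    entryPoint: 'main',
  },
})

const commandEncoder = device.createCommandEncoder()
const pass = commandEncoder.beginComputePass()
pass.setPipeline(pipeline)
/* Hypothetical bind group: triangles, BVH nodes, materials, accumulation buffer */
pass.setBindGroup(0, sceneBindGroup)
/* One 8x8 workgroup per 8x8 pixel tile */
pass.dispatchWorkgroups(Math.ceil(width / 8), Math.ceil(height / 8))
pass.end()
device.queue.submit([commandEncoder.finish()])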

You can check out more info at the github repo.

References and Readings:
  1. Raytracing in a Weekend - this is where everybody seems to start with raytracing. I followed Books 1 and 2 and implemented them in C++ before switching to a compute shader approach.
  2. Intel Path-Tracing Workshop - Raytracing in a Weekend runs on the CPU and does not really explain how to port it to the GPU (where recursion is not allowed). These two videos show very well how to do the same task via loops in GLSL. The theory and math presented are also really good.
  3. Weekend Raytracing with wgpu - Porting "Raytracing in a Weekend" Book 1 to WebGPU. I got the idea to use storage buffers for the frame pixel contents here.
  4. WebGL Ray Tracer - Path tracer written in WebGL. I studied the code and implemented my model parsing and BVH generation based on it.
  5. WebGPU Spec
  6. WGSL Spec

alternative render

Happy new 2024!
Tue, 19 Dec 2023 16:32:10 +0000
https://archive.georgi-nikolov.com/project/2d-phy-dojo

url - https://gnikoloff.github.io/2d-phy-dojo/
code - https://github.com/gnikoloff/2d-phy-dojo/tree/main

Technologies Used: WebGL2, JS, C++
Role: Development

Happy new 2024!

2D physics from scratch with WebGL2, JS and C++ to celebrate the new year.

project render

Dependencies

  • 2d-phy - lightweight 2D physics engine written in C++. Constructs and methods are exposed to JS via embind
  • hwoa-rang-gl2 - drawing utilities for WebGL2

Rendering

This demo aggressively instances all shapes, batched by shape type:

Boxes

boxes only

Circles

circles only

Triangles

triangles only

Outlines

outline only

References and readings

Apple Metal Sketch Dojo
Thu, 02 Mar 2023 10:27:43 +0000
https://archive.georgi-nikolov.com/project/apple-metal-sketch-dojo

Available for iOS and iPadOS

try the public beta - https://testflight.apple.com/join/neaw91ke
source code - https://github.com/gnikoloff/apple-metal-sketch-dojo

Technologies Used: Swift, SwiftUI, Apple Metal API
Role: Development

iPhone preview of the app

This is my first iOS app and a playground for me to explore Swift and the Apple Metal rendering API.

This project has no external libraries: all animations, physics and graphics are written from scratch. I worked on it in my spare time for almost 2 months, learning a ton about Swift, Metal and different rendering techniques in the process. For studying I used these resources (in the following order):

Coming from web development with Javascript and WebGL / WebGPU, I had a lot of fun building this as my first iOS app.

Check out the technical behind the scenes breakdown at https://github.com/gnikoloff/apple-metal-sketch-dojo

App home screen view

Software Rasterizer
Mon, 25 Sep 2023 14:12:02 +0000
https://archive.georgi-nikolov.com/project/software-rasterizer-in-c

url - https://gnikoloff.github.io/software-renderer/
code - https://github.com/gnikoloff/software-renderer

Technologies Used: C, SDL2, WebAssembly
Role: Development

A 3D renderer written as an exercise to learn C and how graphics APIs such as OpenGL are implemented under the hood. Supports native and web via Emscripten.

Features

  1. Software rasterization
  2. Pure C17 implementation
  3. Right-handed Y-up coordinate system
  4. Uses SDL for access to graphics hardware, mouse, keyboard, etc.
  5. Supports native and web via Emscripten
  6. Perspective-correct interpolation (sketched after this list)
  7. Support for vertex and fragment shaders
  8. Depth buffer
  9. Face culling
  10. Viewport clipping
  11. Loading and parsing Wavefront .OBJ files and PNG images
  12. Cube map sampling
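The idea behind perspective-correct interpolation (feature 6 above) is that attributes cannot be interpolated linearly in screen space; instead attr/w and 1/w are interpolated, and the division is undone per fragment. A sketch of the math in JS for illustration (the actual renderer implements this in C):

js*/* bary: barycentric weights of the fragment inside the triangle */
/* attrs: the attribute value at each of the three vertices */
/* invWs: 1/w at each vertex, taken after the perspective divide */
function interpolatePerspective (bary, attrs, invWs) {
  const invW = bary[0] * invWs[0] + bary[1] * invWs[1] + bary[2] * invWs[2]
  const attrOverW =
    bary[0] * attrs[0] * invWs[0] +
    bary[1] * attrs[1] * invWs[1] +
    bary[2] * attrs[2] * invWs[2]
  /* Dividing the two linearly interpolated quantities restores perspective */
  return attrOverW / invW
}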

References and readings

  1. 3D Computer Graphics Programming
  2. Modern C
  3. niepp/spbr
  4. Lode's Computer Graphics Tutorial


Geometries demo

Geometry demo

Depth buffer demo

Visualised depth buffer

Environment mapping

Environment mapping

Plasma demo

Plasma demo

Tunnel demo

Tunnel demo

2D Physics demo

2D Physics demo

2022 Portfolio
Fri, 28 Jan 2022 16:24:37 +0000
https://archive.georgi-nikolov.com/project/2022-portfolio

url - https://www.2022.georgi-nikolov.com/
code - https://github.com/gnikoloff/2022-portfolio

Technologies Used: WebGL2, GLSL, Typescript
Role: Design & Development

portfolio render with webgl2

I released my new portfolio for 2022 as a pure WebGL2 scene. I wrote all of the graphics, animation, interaction and layouting system from scratch. I set myself three challenges for this portfolio:

  1. wrote my own WebGL2 framework, hwoa-rang-gl2, for the rendering
  2. wrote my own minimal tweening library
  3. implemented raycasting against axis-aligned bounding boxes, planes and spheres with hwoa-rang-math
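The ray vs AABB test from the third challenge is the classic slab method: intersect the ray against each axis-aligned pair of planes and check that the intervals overlap. A minimal sketch of the idea (illustrative, not the hwoa-rang-math source):

js*function intersectRayAABB (origin, invDir, boxMin, boxMax) {
  let tMin = -Infinity
  let tMax = Infinity
  for (let axis = 0; axis < 3; axis++) {
    /* Distances along the ray to the two slab planes on this axis */
    const t1 = (boxMin[axis] - origin[axis]) * invDir[axis]
    const t2 = (boxMax[axis] - origin[axis]) * invDir[axis]
    tMin = Math.max(tMin, Math.min(t1, t2))
    tMax = Math.min(tMax, Math.max(t1, t2))
  }
  /* A hit requires the intervals to overlap in front of the ray origin */
  return tMax >= Math.max(tMin, 0) ? tMin : null
}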

debug mode

WebGL2

  • vertex array objects (VAO) are supported in core WebGL2 and are no longer hidden behind an optional extension. I interleaved my buffer data as much as possible and put it in VAOs to reduce CPU - GPU communication and per-frame calls into GL
  • used uniform buffer objects (UBO) for shader uniforms that are shared across all meshes, such as time, view matrix, projection matrix, etc. This way, instead of calling <code>gl.uniform1f()</code>, <code>gl.uniformMatrix4fv()</code> and so on for each mesh individually, all meshes receive the correct data via a few simple UBO commands, dramatically reducing the per-frame calls into GL (see the sketch after this list)
  • used texture atlases, combining all of my textures. When adding new images, I updated subregions of the mega texture with <code>gl.texSubImage2D()</code>, calculated the correct UV subregion offsets and passed them as extra shader attributes for the meshes to sample with
  • WebGL2 supports mipmapping NPOT textures, so I used that too
  • to draw all the texts, I used hidden canvases with <code>CanvasRenderingContext2D</code>, since it's trivial to render text with the <code>fillText()</code> API. I then fed the hidden canvases to my texture atlas and used them in my shaders
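The UBO flow from the second bullet looks roughly like this; the block name and layout are illustrative, not the portfolio's actual code:

js*const ubo = gl.createBuffer()
gl.bindBuffer(gl.UNIFORM_BUFFER, ubo)
/* Allocate room for two mat4s (view + projection) plus a float, padded for std140 */
gl.bufferData(gl.UNIFORM_BUFFER, (16 + 16 + 4) * 4, gl.DYNAMIC_DRAW)
gl.bindBufferBase(gl.UNIFORM_BUFFER, 0, ubo)

/* Each shader binds its "Camera" uniform block to binding point 0 once... */
const blockIndex = gl.getUniformBlockIndex(program, 'Camera')
gl.uniformBlockBinding(program, blockIndex, 0)

/* ...and per frame the shared data is uploaded exactly once for all meshes */
gl.bindBuffer(gl.UNIFORM_BUFFER, ubo)
gl.bufferSubData(gl.UNIFORM_BUFFER, 0, viewMatrix) /* Float32Array(16) */
gl.bufferSubData(gl.UNIFORM_BUFFER, 16 * 4, projectionMatrix)
gl.bufferSubData(gl.UNIFORM_BUFFER, 32 * 4, new Float32Array([time]))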

mega texture render

Here is one slice of my texture atlas. As you can see, new images are allocated via a packing algorithm using the cool texture-atlas library and rendered to the correct texture subregion with <code>gl.texSubImage2D()</code>. Notice the extra small black-and-white images at the top right - they are used for the label masking effect animations.

As usual, I used Redux for state management and Typescript for neat code. I also tried out ViteJS as a new bundler and it worked out swimmingly.

Debug mode

You can enter a special debug mode by appending ?debugMode=1 to the end of the url

References and readings

Mipmaps demo
Wed, 11 Oct 2023 10:16:03 +0000
https://archive.georgi-nikolov.com/project/mipmaps-demo

Project URL - https://gnikoloff.github.io/webgl-mipmaps-explainer/
Source code - https://github.com/gnikoloff/webgl-mipmaps-explainer

Preview of the mipmap demo

A few weeks back somebody on Reddit showed me this great article by Ben Golus on texture mipmapping, anisotropic filtering and overall best practices.

It helped me a lot to solidify my knowledge on the matter. Up until that point, I had only a high-level idea of bilinear, trilinear and anisotropic filtering and all these fancy terms.

I finally got the time to port this new knowledge to WebGL2. The techniques and visuals are lifted straight from the article. I will use this project as a future reference for all these techniques.
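For reference, the WebGL2 side of trilinear plus anisotropic filtering, which the demo compares against plain bilinear sampling, boils down to a few calls (a minimal sketch):

js*gl.bindTexture(gl.TEXTURE_2D, texture)
gl.generateMipmap(gl.TEXTURE_2D)

/* LINEAR_MIPMAP_LINEAR = bilinear sampling within a mip level plus blending
   between the two nearest levels, i.e. trilinear filtering */
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR)
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR)

/* Anisotropic filtering is still behind an extension, even in WebGL2 */
const ext = gl.getExtension('EXT_texture_filter_anisotropic')
if (ext) {
  const maxAniso = gl.getParameter(ext.MAX_TEXTURE_MAX_ANISOTROPY_EXT)
  gl.texParameterf(gl.TEXTURE_2D, ext.TEXTURE_MAX_ANISOTROPY_EXT, maxAniso)
}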

WebGPU Metaballs
Sun, 26 Jun 2022 18:23:12 +0000
https://archive.georgi-nikolov.com/project/webgpu-metaballs

url - https://gnikoloff.github.io/webgpu-compute-metaballs/
code - https://github.com/gnikoloff/webgpu-compute-metaballs

Technologies Used: Typescript, WebGPU, WGSL
Role: Development

I wrote this demo to get better at WebGPU, as well as to learn different graphics techniques. Here is a bunch of things I implemented:

  • Metaballs isosurface via marching cubes calculated entirely on the GPU with compute shaders via a <a href="https://www.w3.org/TR/webgpu/#compute-pipeline" target="_blank"><code>GPUComputePipeline</code></a>
  • Deferred rendering using 2 <code>rgba16float</code> textures (allocated as sketched after this list). I encoded the normals into only two channels, as seen here. My GBuffer layout looks like this:
  1. Texture 1: RG: Normal, B: Metallic, A: Mesh ID (ID is used in the deferred pass to apply conditional shading to different meshes)
  2. Texture 2: RGB: Albedo, A: Roughness
  • Physically based shading with a Cook-Torrance BRDF taken from LearnOpenGL, though I copy-pasted most of the GLSL and rewrote it in WGSL
  • Shadow mapping
  • HDR and tone mapping
  • Bloom postprocessing done with a compute shader, again via a <a href="https://www.w3.org/TR/webgpu/#compute-pipeline" target="_blank"><code>GPUComputePipeline</code></a>. I composited the bloom on top of the scene with additive blending.
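Allocating the two-texture GBuffer described in the bullet above takes only a couple of WebGPU calls; the names here are illustrative, not the demo's source:

js*const gBufferDescriptor = {
  size: [canvas.width, canvas.height],
  format: 'rgba16float',
  usage: GPUTextureUsage.RENDER_ATTACHMENT | GPUTextureUsage.TEXTURE_BINDING,
}
/* Texture 0: RG normal, B metallic, A mesh ID */
const gBufferTexture0 = device.createTexture(gBufferDescriptor)
/* Texture 1: RGB albedo, A roughness */
const gBufferTexture1 = device.createTexture(gBufferDescriptor)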

WebGPU Compute Metaballs demo screenshot

WebGPU Sketch Dojo
Mon, 11 Oct 2021 19:11:16 +0000
https://archive.georgi-nikolov.com/project/webgpu-sketch-dojo

url - https://gnikoloff.github.io/webgpu-dojo/
code - https://github.com/gnikoloff/webgpu-dojo

Technologies Used: WebGPU, WGSL, Typescript
Role: Development

A collection of experiments with the emerging WebGPU API.

Most of the samples are built with my micro WebGPU rendering & compute library hwoa-rang-gpu

Please use the latest Google Chrome, Google Chrome Canary or Firefox Nightly (Instructions). Linux users might need to take extra steps for Chrome, as explained here.

WebGPU Blinn-Phong material example

Blinn-Phong Material

WebGPU Shadow Mapping example

Shadow Mapping

WebGPU Complex glTF model example

Complex glTF Model

Portfolio 2021
Tue, 01 Jun 2021 21:56:22 +0000
https://archive.georgi-nikolov.com/project/portfolio-2021

url - https://2021.georgi-nikolov.com/
code - https://github.com/gnikoloff/2021-portfolio

Technologies Used: hwoa-rang-gl, typescript, redux
Role: Development
Category: WebGL, GLSL, animation

Render of homepage

I created my new 2021 portfolio as a pure WebGL scene written from scratch. I used my own lightweight rendering engine to power the 3D graphics and wrote my own layouting system for the text and boxes.

I used redux for state management, popmotion for animation and Typescript for neat code.

Development mode

To access the FPS meter, shadow map debug view and texture atlas debug view, simply append ?debugMode=1 as a query param at the end of the url

Debug mode of the website

Houdini Paint CSS Dojo
Sun, 24 Jan 2021 21:11:29 +0000
https://archive.georgi-nikolov.com/project/houdini-css-dojo

url - https://css-houdini-dojo.georgi-nikolov.com/
code - https://github.com/gnikoloff/houdini-dojo

Technologies Used: CSS Paint API
Role: Development
Category: CSS, JS

Renders of my CSS Houdini experiments

I made a demo page to explore the emerging CSS Houdini Paint API, porting some of my old canvas works to the new API. CSS Houdini is great and easy to use, since it is essentially a stripped-down version of the canvas2D API. Two major things are missing: text methods, and direct pixel access and manipulation.
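For reference, the general shape of a paintlet - a minimal sketch, not one of the published packages:

js*/* paintlet.js, loaded via CSS.paintWorklet.addModule('paintlet.js') */
registerPaint('demo-dots', class {
  static get inputProperties () {
    return ['--dot-spacing']
  }
  paint (ctx, size, props) {
    /* ctx is the stripped-down canvas2D context: no fillText, no getImageData */
    const spacing = parseInt(props.get('--dot-spacing')) || 16
    for (let y = 0; y < size.height; y += spacing) {
      for (let x = 0; x < size.width; x += spacing) {
        ctx.fillRect(x, y, 2, 2)
      }
    }
  }
})

It is then consumed from CSS with background-image: paint(demo-dots).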

The API is unfortunately still not supported in Firefox and Safari, so a polyfill is needed.

I published the Paintlets as standalone npm packages for easy embedding and use.

This project was featured on the Tympanus Codrops Collective 646. Some of the paintlets are also featured on https://houdini.how/.

GPGPU boxes
Fri, 30 Jul 2021 10:17:06 +0000
https://archive.georgi-nikolov.com/project/gpgpu-boxes

url - https://gpgpu-boxes.georgi-nikolov.com/
code - https://github.com/gnikoloff/gpgpu-hwoa-rang-gl

Frontend Technologies Used: hwoa-rang-gl
Role: Development
Category: Animation, Physics

demo render

WebGL demo written with my personal library hwoa-rang-gl

Uses deferred shading to render up to 100,000 shaded boxes influenced by up to 200 dynamic point lights. Animation of the boxes' positions and velocities is offloaded entirely to the GPU using framebuffer ping-ponging.
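The gist of framebuffer ping-ponging: two floating point textures swap roles every frame, so the previous frame's simulation output becomes the next frame's input. A sketch with hypothetical helpers, not the demo's source:

js*/* createSimFramebuffer is a hypothetical helper returning { framebuffer, texture } */
let read = createSimFramebuffer(gl)
let write = createSimFramebuffer(gl)

function simulate () {
  gl.bindFramebuffer(gl.FRAMEBUFFER, write.framebuffer)
  /* Sample last frame's positions / velocities... */
  gl.bindTexture(gl.TEXTURE_2D, read.texture)
  /* ...and draw a fullscreen quad whose fragment shader writes the updated
     values into the texture attached to "write" (hypothetical helper) */
  drawFullscreenQuad(gl)

  /* Swap roles for the next frame */
  const tmp = read
  read = write
  write = tmp
}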

Key features:

  • Renders all boxes in one draw call using geometry instancing
  • For deferred shading, uses the WEBGL_draw_buffers extension if available to write position, color and normal data to different textures simultaneously in one framebuffer. If the extension is not available (mainly on mobile hardware), falls back to rendering each texture to a different framebuffer
  • Supports GPGPU animation by rendering to floating point textures via OES_texture_float. If not present, falls back to rendering to half-float textures via OES_texture_half_float
  • Uses Vertex Array Objects (OES_vertex_array_object) to group bindings for easier manipulation
hwoa-rang-gl
Sun, 25 Apr 2021 11:39:32 +0000
https://archive.georgi-nikolov.com/project/hwoa-rang-gl

url - https://gnikoloff.github.io/hwoa-rang-gl/
code - https://github.com/gnikoloff/hwoa-rang-gl

Technologies Used: WebGL, Typescript
Role: Development

I developed a lightweight WebGL library for personal use and for my work. I used it to learn the API better, get more experience with Typescript and as a playground to learn more linear algebra.

I plan on developing it further, adding features like skeleton animations, different material types and physically based rendering.


01. Shadow maps demo

shadow map rendering example


02. Deferred shading demo

Deferred shading demo


03. Fluid dynamics demo

Fluid dynamics demo


04. Linear & Exponential fog example

Linear & Exponential fog example


05. GPGPU Particles

gpgpu particles

New Year Animation
Wed, 30 Dec 2020 16:28:31 +0000
https://archive.georgi-nikolov.com/project/happy-new-2021-animation

url - http://happy-new-2021.georgi-nikolov.com/
code - https://github.com/gnikoloff/happy-2021-animation

Frontend Technologies Used: WebGL
Role: Development
Category: Animation, Physics

Metaballs render

Ball physics animation written in direct WebGL to celebrate the end of 2020.

  • Uses hardware instancing with <code>ANGLE_instanced_arrays</code> for a minimum of draw calls (see the sketch after this list)
  • Uses VAOs via <code>OES_vertex_array_object</code> to organise buffer and attribute state and reduce WebGL calls
  • Post-processing step using a fullscreen quad for the metaball gooey effect
  • Custom animation and physics written from scratch
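The instancing setup from the first bullet looks roughly like this (illustrative names, not the demo's source):

js*const ext = gl.getExtension('ANGLE_instanced_arrays')

/* Per-instance offset attribute: a divisor of 1 means the attribute advances
   once per instance instead of once per vertex */
gl.bindBuffer(gl.ARRAY_BUFFER, offsetsBuffer) /* one vec2 per ball */
gl.enableVertexAttribArray(offsetLocation)
gl.vertexAttribPointer(offsetLocation, 2, gl.FLOAT, false, 0, 0)
ext.vertexAttribDivisorANGLE(offsetLocation, 1)

/* One draw call renders every ball with the same shared circle geometry */
ext.drawArraysInstancedANGLE(gl.TRIANGLE_FAN, 0, vertsPerCircle, ballCount)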

References

I studied 2D ball physics and collision detection and created this Codepen collection to document my progress.

Codepen Physics collection shot

My CodePen ball physics collection

ACRNM Website
Sat, 28 Nov 2020 18:06:14 +0000
https://archive.georgi-nikolov.com/project/acronym-prototype

url - https://acrnm-product-explorer.now.sh/

Technologies Used: THREE.js, GLSL, SCSS, vanilla JS for DOM manipulation
Role: Design, Development, Animation
Category: Website

Acronym Product Explorer Home Page

I made this website as an experimental demo for a favourite brand of mine - ACRNM from Germany.

I used THREE.js and GLSL for the layout, sliders, cursor and loader of the website. The rest is built using vanilla JS for DOM manipulation and the popmotion npm package for complex animations.

The project was challenging because most of the website is drawn in WebGL, so custom implementations of mouse hovering, scrolling and layouting were necessary.

I developed a separate mobile version to account for touch and spent a significant amount of time making sure the demo runs smoothly on various devices, testing on low-budget Android devices and different iPhones and iPads to make sure each one gets the best performance. A dynamic Performance Manager frees up resources and degrades the quality if the framerate drops significantly.

NOTE: This website was made for presentational purposes and is not affiliated with Acronym GmbH. All of the materials on it belong to them.

I built a custom slider supporting clicks and dragging, using shaders for a nice image effect.

Once a product is highlighted, the user can press & hold to enter single view. The slider smoothly transitions from the main view to the single view and back.

All of the website layout is made using THREE.js with an orthographic projection. An "explore" layout and a regular 2D grid layout are both supported. I made sure scrolling is also supported and in sync with the WebGL scene.

I developed an augmented version for handheld devices to account for touch interactions, while preserving the WebGL scene for playful interaction.

Web AR Furniture Demo
Sun, 29 Nov 2020 00:43:58 +0000
https://archive.georgi-nikolov.com/project/web-ar-furniture-demo

url - https://model-viewer-frontend.nikoloffgeorgi.vercel.app/

Technologies Used: model-viewer
Role: Design, Development
Category: WebAR

Slideshow displaying a 3D model of a sofa

I developed this AR demo shop for Apple Quick Look and ARCore during the first COVID quarantine here in Berlin. It lets you easily preview furniture models in Augmented Reality from the comfort of your home and gracefully falls back to regular 3D using WebGL if your device does not have AR capabilities.

I used the "<model-viewer>" package for this project and wrote custom logic on top of it to allow "Buy Now" links within AR mode.

A 3D model of a chair positioned in Augmented Reality

Cross Iframe RPC
Sat, 28 Nov 2020 18:11:45 +0000
https://archive.georgi-nikolov.com/project/cross-iframe-communication

url - https://cross-iframe-communication.vercel.app/

This is a small project I did for fun to illustrate the principles of cross-iframe communication and the remote procedure call (RPC) pattern, heavily used at my job to coordinate and synchronise different iframe elements sitting on the same page.
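The pattern boils down to tagging every postMessage with a call id, so a reply can be matched back to the promise that initiated it. A minimal sketch of the idea (illustrative, not the project's source):

js*/* Parent page: call a method inside the iframe and await the result */
let nextCallId = 0
const pending = new Map()

function rpcCall (iframe, method, params) {
  return new Promise((resolve) => {
    const id = nextCallId++
    pending.set(id, resolve)
    iframe.contentWindow.postMessage({ id, method, params }, '*')
  })
}

window.addEventListener('message', (event) => {
  const { id, result } = event.data
  if (pending.has(id)) {
    pending.get(id)(result)
    pending.delete(id)
  }
})

/* Inside the iframe: execute the method and reply with the same call id */
window.addEventListener('message', (event) => {
  const { id, method, params } = event.data
  const result = handlers[method](...params) /* hypothetical handler table */
  event.source.postMessage({ id, result }, '*')
})

A real implementation would also validate event.origin instead of using the '*' wildcard.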

Chat in WebVR
Sun, 29 Nov 2020 03:04:04 +0000
https://archive.georgi-nikolov.com/project/chat-in-webvr

Read the article

Technologies Used: Delight Engine
Role: Development
Category: WebXR

I was tasked with creating a tipping bar and a chat window as UI components that let the user tip streamers and read messages while in VR mode, using our in-house Delight 3D WebGL engine.

Our client would supply messages from their chat backend to our engine, formatted with simple BBCode, which we in turn would convert to HTML, insert into an SVG <code>&lt;foreignObject&gt;</code> and render the resulting inlined SVG image as a WebGL texture on a quad.

Since foreignObject uses the normal browser rendering pipeline, rich text styles, layout, images and emojis can easily be converted to a bitmap and drawn in WebGL.
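A minimal sketch of that rasterization step - inline the HTML into an SVG, load it through an Image, then upload the bitmap as a texture (illustrative, not the engine's code):

js*const html = '<div xmlns="http://www.w3.org/1999/xhtml">amazing stream!!!</div>'
const svg =
  '<svg xmlns="http://www.w3.org/2000/svg" width="256" height="64">' +
  '<foreignObject width="100%" height="100%">' + html + '</foreignObject></svg>'

const img = new Image()
img.onload = () => {
  gl.bindTexture(gl.TEXTURE_2D, texture)
  /* The browser has already done the text layout; we just upload the bitmap */
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img)
}
img.src = 'data:image/svg+xml;charset=utf-8,' + encodeURIComponent(svg)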

Separate engine API functionality for receiving and displaying tip values was also developed.

Here is sample code for:

  1. displaying a newly received message from the server to our user while in VR mode
  2. letting the user tip the streamer

Markup:

html*<!-- Initialize live video powered by Delight Engine -->
<dl8-live-video
  format="STEREO_180_LR"
  autostart
  no-seeking
  display-mode="force-inline"
  not-pausable
>

  <!-- Initialize live video chat as a UINode in our engine -->
  <dl8-live-video-chat />

  <!-- Initialize live video tipping bar as a UINode in our engine -->
  <dl8-live-video-tipping-bar
    default-budget="8"
    on-submit-tip="window.onSubmitTip"
  />

  <source src="path/to/your/hls.m3u8" type="application/x-mpegurl" />
</dl8-live-video>

JS API:

js*/* ------ Adding a chat message to the stream ------ */

const $liveChat = document.getElementsByTagName('dl8-live-video-chat')[0]

/* The message poster's username formatted using simple BBCode */
const messagePosterBBCode = '[color=white][big]Georgi Nikolov[/big][/color]'

/* The message contents formatted */
const messageContentsBBCode = '[small]wow, cant believe I am here for this [big][color="#f00"]amazing stream!!![/color][/big][/small]'

/* Display the new message positioned in 3D space and oriented towards the user camera */
$liveChat.addMessage(messagePosterBBCode, messageContentsBBCode)

/* ------ Tipping the streamer ------ */

const $tippingElement = document.getElementsByTagName('dl8-live-video-tipping-bar')[0]
const tippingBudget = 100

/* Update the on-screen remaining tip budget using a custom engine call */
$tippingElement.updateBudget(tippingBudget)
Blob
Sat, 12 Dec 2020 00:03:22 +0000
https://archive.georgi-nikolov.com/project/blob-animation

Technologies Used: WebGL2, GLSL

A small animation done as an artistic expression.

<p class="codepen" data-height="265" data-theme-id="dark" data-default-tab="js,result" data-user="gbnikolov" data-slug-hash="NEyrOM" style="height: 265px; box-sizing: border-box; display: flex; align-items: center; justify-content: center; border: 2px solid; margin: 1em 0; padding: 1em;" data-pen-title="blob">

<span>See the Pen <a href="https://codepen.io/gbnikolov/pen/NEyrOM">

blob</a> by Georgi Nikoloff (<a href="https://codepen.io/gbnikolov">@gbnikolov</a>)

on <a href="https://codepen.io">CodePen</a>.</span>

</p>

<script async src="https://cpwebassets.codepen.io/assets/embed/ei.js"></script>

<p class="codepen" data-height="265" data-theme-id="dark" data-default-tab="js,result" data-user="gbnikolov" data-slug-hash="MzQgjp" style="height: 265px; box-sizing: border-box; display: flex; align-items: center; justify-content: center; border: 2px solid; margin: 1em 0; padding: 1em;" data-pen-title="webgl2 particles">

<span>See the Pen <a href="https://codepen.io/gbnikolov/pen/MzQgjp">

webgl2 particles</a> by Georgi Nikoloff (<a href="https://codepen.io/gbnikolov">@gbnikolov</a>)

on <a href="https://codepen.io">CodePen</a>.</span>

</p>

<script async src="https://cpwebassets.codepen.io/assets/embed/ei.js"></script>

2018 Portfolio
Sat, 28 Nov 2020 18:07:53 +0000
https://archive.georgi-nikolov.com/project/portfolio-2018

url - https://2018.georgi-nikolov.com/

Project code is here

Technologies Used: WebGL, GLSL
Role: Design, Development, Animation
Category: Personal Portfolio

I used vanilla WebGL as a personal challenge for my 2018 portfolio.

I used canvas2d to render the animation labels and animated the balls with physics in a separate framebuffer. The resulting framebuffer is used as a texture on a plane, which is displaced by the mouse position using Perlin noise in the vertex shader.

The physics and animation easings for this project are custom implementations of mine.

Simple motion blur was implemented as a post processing step by swapping two framebuffers.

Akira GL
Sat, 28 Nov 2020 21:36:24 +0000
https://archive.georgi-nikolov.com/project/akira-gl

code - https://github.com/gnikoloff/akira-gl

I wrote my own micro WebGL framework for personal needs. Its codebase is the culmination of multiple custom WebGL projects I had done before it.

It supports features like

  • Perspective & Ortho Camera
  • Primitive geometry such as planes, cubes, spheres
  • Custom materials using vertex shader & fragment shader snippet system similar to THREE.js
  • Instancing using <code>ANGLE_instanced_arrays</code> WebGL extension where available
  • VAOs using <code>OES_vertex_array_object</code> WebGL extension where available
  • Mouse picking

Examples are coming soon.

Tania Gleave
Thu, 08 Mar 2018 17:25:17 +0000
https://archive.georgi-nikolov.com/project/tania-gleave

For: Wonderland Industry

Link currently unavailable

Technologies Used: WebGL, Vue, Wordpress, WooCommerce
Role: Development, Animation
Category: Website

I made the front-end and WebGL effects for the website of Tania Gleave, a Canadian jewellery maker.

I built all the styling, interactions and transitions on the website, using the WordPress REST API for content.

I wrote a bunch of different vanilla WebGL modules for the special effects:

<p class="codepen" data-height="265" data-theme-id="dark" data-default-tab="js,result" data-user="gbnikolov" data-slug-hash="LOrMvq" style="height: 265px; box-sizing: border-box; display: flex; align-items: center; justify-content: center; border: 2px solid; margin: 1em 0; padding: 1em;" data-pen-title="webgl exercise">

<span>See the Pen <a href="https://codepen.io/gbnikolov/pen/LOrMvq">

webgl exercise</a> by Georgi Nikoloff (<a href="https://codepen.io/gbnikolov">@gbnikolov</a>)

on <a href="https://codepen.io">CodePen</a>.</span>

</p>

<script async src="https://cpwebassets.codepen.io/assets/embed/ei.js"></script>

Video masking: I rendered circular shapes to a separate framebuffer and used the result's channels as masking values in my main shader.

Discarded WebGL slider

Shopping basket render

Discarded navigation animation

Fat Cat
Thu, 17 May 2018 11:39:44 +0000
https://archive.georgi-nikolov.com/project/fat-cat

Project Link
Project Link II (unstable physics)
Project Link III (unstable physics)

Technologies Used: canvas2D
Role: Development, Animation
Category: Animation

I did this experiment as an exercise in blob physics. I researched Verlet integration, blob physics and collision detection, and wrote all the math myself. The graphics are drawn using vanilla canvas2D.
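The core Verlet step is worth showing, since it is what keeps such a soft-body simulation stable: velocity is implicit in the difference between the current and previous position. A sketch of the idea (illustrative, not the demo's source):

js*function verletStep (node, dt, gravity) {
  /* The implicit velocity is the distance travelled during the last step */
  const vx = node.x - node.prevX
  const vy = node.y - node.prevY
  node.prevX = node.x
  node.prevY = node.y
  /* Advance the node; spring constraints between adjacent nodes are relaxed afterwards */
  node.x += vx
  node.y += vy + gravity * dt * dt
}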

Blob physics debugger view

The blob's mass-spring system illustrated in the debugger view. Each blob node is connected to its adjacent nodes.

I used this article as a reference.

DXC Technology
Mon, 09 Apr 2018 17:35:28 +0000
https://archive.georgi-nikolov.com/project/dxc-technology

For: Wonderland Industry

Offline Project

Technologies Used: Three.js, GLSL
Role: Development
Category: Installation

I was tasked with creating a big-screen interactive presentation for the opening of a new DXC business center in Germany, following the artist's and client's art direction.

I used three.js to build a total of 12 slides (i.e. 3D scenes) to convey the message and spirit of DXC. A mixture of predefined models made in Cinema4D and generative geometry and animation was used for each scene.

In order to optimise and run WebGL on a big screen at 60 FPS, I used interleaved buffers and instancing as much as I could to save CPU-GPU bandwidth. I also lazily allocated new slides on clicking next, loading their models and animating them in as the presentation progressed, and disposed of old ones instead of allocating everything at once.

As another optimisation step, I combined all postprocessing effects into a single pass and implemented a custom QualityManager that downgrades the experience if the framerate starts dropping.
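The idea behind such a QualityManager fits in a few lines: watch the frame delta and drop one quality tier after sustained slowness. A sketch of the concept (illustrative, not the production code):

js*const qualityLevels = ['high', 'medium', 'low']
let level = 0
let slowFrames = 0
let lastTime = performance.now()

function onFrame (now) {
  const delta = now - lastTime
  lastTime = now
  /* Count frames that blow the ~16.6ms budget by a wide margin */
  if (delta > (1000 / 60) * 1.5) slowFrames++
  else slowFrames = Math.max(0, slowFrames - 1)

  /* Sustained slowness: drop one tier (fewer postprocessing passes,
     lower render resolution, simpler materials) */
  if (slowFrames > 30 && level < qualityLevels.length - 1) {
    level++
    slowFrames = 0
    applyQuality(qualityLevels[level]) /* hypothetical hook into the scenes */
  }
  requestAnimationFrame(onFrame)
}
requestAnimationFrame(onFrame)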

Scene #1 -> Scene #2

Scene #4 -> Scene #5

Aesthetic Holding
Thu, 08 Mar 2018 17:28:39 +0000
https://archive.georgi-nikolov.com/project/aesthetic-holding

Technologies Used: Three.js, GLSL, Wordpress Backend
Role: Development, Animation
Category: Website

I did the interactions and animations for this landing page. I used three.js to load models from a 3D designer and apply custom logic to them. Godrays and an FXAA antialiasing shader are implemented as postprocessing steps.

Godrays interaction

Hover interaction

Scroll interactions

Dimebox
Tue, 17 Apr 2018 16:22:56 +0000
https://archive.georgi-nikolov.com/project/dimebox

url - https://dimebox.com/

Technologies Used: Prismic.io CMS, Node.js, custom frontend
Role: Development
Category: Website

I was tasked with creating the marketing website of the Amsterdam-based company Dimebox. I built it using Prismic.io as a content backend, with Pug, SCSS, JS and Express on the frontend.

Akoya Creative
Sat, 28 Nov 2020 18:04:36 +0000
https://archive.georgi-nikolov.com/project/client-work

url - http://akoyacreative.com

Technologies Used: Wordpress, Vue, THREE.js
Role: Development, Animation
Category: Website

Akoya Website Homepage Render

I developed the website for AKOYA, a small design studio in Sofia, Bulgaria. I used Wordpress as a headless CMS and queried its API from a Vue frontend. I added different 3D WebGL animated backgrounds, working alongside the company's motion designer.

CodeSketch
Sat, 28 Nov 2020 18:01:04 +0000
https://archive.georgi-nikolov.com/project/codesketch

url - https://codesketch.georgi-nikolov.com/

Technologies Used: canvas2D, Threejs, GSAP
Role: Design, Development, Animation
Category: Website

I made this interactive showcase of my CodePen collection using drag interactions and CSS3D for the layout and WebGL / canvas2d for the animations. I used it as a tool to explore animation principles and math.

Oliver Wicks
Sun, 29 Nov 2020 01:24:52 +0000
https://archive.georgi-nikolov.com/project/oliver-wicks

url - https://www.oliverwicks.com/

I helped with the frontend and design for this project. I developed the home page and a bunch of internal campaign pages, working closely with the backend team.

Nomad Creative
Sat, 28 Nov 2020 22:34:04 +0000
https://archive.georgi-nikolov.com/project/nomad-creative

url - http://nomadcreative.bg/

Technologies Used: SVG, CSS, canvas2d, Wordpress
Role: Design, Development, Animation
Category: Website

Nomad Creative Website Hero Unit Screenshot

Nomad Creative is a small boutique design agency based in Sofia, Bulgaria.

I created the frontend for this website, set up WordPress for it and installed the needed plugins.

CameraGL
Mon, 07 Dec 2020 15:02:12 +0000
https://archive.georgi-nikolov.com/project/camera-gl

url - https://camera-gl.georgi-nikolov.com/

Technologies Used: Threejs, GLSL
Role: Development, Animation
Category: Camera FX

A small exercise project using Three.js and getUserMedia() to practice my shader skills.

A screenshot of the Camera GL photo filter app

Portfolio 2014
Mon, 07 Dec 2020 16:52:58 +0000
https://archive.georgi-nikolov.com/project/portfolio-2014

url - https://2014.georgi-nikolov.com/

Technologies Used: GSAP, Backbone.js
Role: Design, Development, Animation
Category: Personal Portfolio

A simple portfolio I built back when I had just started programming.

Drawing with Metal
Sun, 31 Dec 2023 08:40:20 +0000
https://archive.georgi-nikolov.com/project/rendering-with-metal-presentation

In May 2023 I gave a talk at CocoaHeads Berlin about my experience learning the Metal rendering API and building a bunch of graphics demos with it. A live recording will be uploaded soon.

<div class="cp_embed_wrapper">

<iframe src="https://www.slideshare.net/slideshow/embed_code/key/3idrYdoKk5Ytrf?hostedIn=slideshare&page=upload" width="476" height="400" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe>

</div>

Dynamic animation with JS
Fri, 13 Aug 2021 11:34:21 +0000
https://archive.georgi-nikolov.com/project/dynamic-animation-with-js

Course link - https://www.awwwards.com/academy/course/dynamically-scripted-animations-with-javascript

Demos, examples and source code - https://js-animation-awwwards-course.georgi-nikolov.com/

I created a course on AWWWARDS about creating dynamic animations and physics with Javascript using the HTML5 canvas.

Practical WebGL
Thu, 21 Oct 2021 21:24:44 +0000
https://archive.georgi-nikolov.com/project/practical-webgl-from-scratch

Course Link - https://www.awwwards.com/academy/course/practical-webgl-from-scratch-for-frontend-developers

Demos, examples and source code - https://practical-webgl-from-scratch-awwwards-course.georgi-nikolov.com/

Practical WebGL from scratch poster

I created a course on AWWWARDS on working with vanilla WebGL for web developers.

It covers everything one needs to know to build highly interactive and engaging visuals:

  1. WebGL rendering is covered in detail, including all kinds of drawing primitives
  2. Different drawing modes in WebGL
  3. Matrix transformations
  4. Procedural and image textures
  5. Building sliders, camera effects, particle animations and scroll effects

SVG From Scratch
Fri, 13 Aug 2021 11:21:55 +0000
https://archive.georgi-nikolov.com/project/svg-fundamentals-course

Course link - https://www.awwwards.com/academy/course/building-vector-graphics-uis-and-animations-with-svg-from-scratch

Course examples - https://svg-fundamentals-awwwards-course.georgi-nikolov.com/

Github repo - https://github.com/gnikoloff/svg-fundamentals-awwwards-course

I created a course on AWWWARDS for building vector graphics, UIs and animations with SVG from scratch

It covers everything one needs to know to build highly optimised and interactive vector graphics:

  1. SVG rendering is covered in detail, including all kinds of drawing primitives
  2. Different kinds of charts and graphs
  3. Interactive drawing apps
  4. CSS / SMIL Animations
  5. Advanced SVG filters and special effects
  6. Importing and optimising graphics built by GUIs like Adobe Illustrator
  7. Generating procedural random graphics with JS and exporting them to Adobe Illustrator
Creating a Typography Motion Trail Effect with Three.js
Wed, 21 Jul 2021 10:47:38 +0000
https://archive.georgi-nikolov.com/blog/creating-a-typography-motion-trail-effect-with-three.js

Learn how to use WebGL framebuffers via Three.js to create an interactive motion trail effect for text.

Adding a Persistence Effect to Three.js Scenes
Tue, 28 Dec 2021 11:33:32 +0000
https://archive.georgi-nikolov.com/blog/adding-a-persistence-effect-to-three.js-scenes

How to Get a Pixel-Perfect, Linearly Scaled UI
Thu, 15 Jul 2021 15:00:25 +0000
https://archive.georgi-nikolov.com/blog/dynamically-scaling-css-values-based-on-the-viewport

Dynamically scaling CSS values based on the viewport width is hardly a new topic. You can find plenty of in-depth coverage right here on CSS-Tricks in articles like this one or this one.

Most of those examples, though, use relative CSS units and unitless values to achieve fluid scaling. That loses pixel perfection and usually introduces text wrapping and layout shifts once the screen goes below or above a certain threshold.

How to Cancel Pending API Requests to Show Correct Data
Sat, 26 Jun 2021 07:16:07 +0000
https://archive.georgi-nikolov.com/blog/how-to-cancel-pending-api-requests-to-show-correct-data

Too Many SVGs Clogging Up Your Markup? Try `use`.
Wed, 10 Mar 2021 20:13:03 +0000
https://archive.georgi-nikolov.com/blog/too-many-svgs-clogging-up-your-markup-try-use.

Rendering a 3D textured cube with hwoa-rang-gl
Sun, 18 Apr 2021 08:23:13 +0000
https://archive.georgi-nikolov.com/blog/rendering-a-3d-textured-cube-with-hwoa-rang-gl

Drawing 2D Metaballs with WebGL2
Tue, 19 Jan 2021 21:13:04 +0000
https://archive.georgi-nikolov.com/blog/metaballs-in-webgl2

Drawing Graphics with the CSS Paint API
Fri, 18 Jun 2021 09:35:18 +0000
https://archive.georgi-nikolov.com/blog/drawing-graphics-with-the-css-paint-api

2D Particles Animation with WebGL
Sat, 05 Dec 2020 14:58:58 +0000
https://archive.georgi-nikolov.com/blog/webgl-particles

Let's draw some particles and animate them on our screen using shaders and raw WebGL. While using frameworks such as three.js or pixi.js brings a lot of ease to expressing yourself creatively, they also add a lot of size and dependencies to your JS codebase. Writing lower-level WebGL code means your program will be much more optimised and even lets you do things libraries like three.js don't allow you to do.

We are going to create some 2D particles and animate them in our shaders, which are small programs compiled and run on the device GPU. Here is the final result:

<p class="codepen" data-height="265" data-theme-id="dark" data-default-tab="js,result" data-user="gbnikolov" data-slug-hash="dypMOmP" style="height: 265px; box-sizing: border-box; display: flex; align-items: center; justify-content: center; border: 2px solid; margin: 1em 0; padding: 1em;" data-pen-title="WebGL Particle Animation">

<span>See the Pen <a href="https://codepen.io/gbnikolov/pen/dypMOmP">

WebGL Particle Animation</a> by Georgi Nikoloff (<a href="https://codepen.io/gbnikolov">@gbnikolov</a>)

</p>

<script async src="https://cpwebassets.codepen.io/assets/embed/ei.js"></script>

Let's start by setting up some boilerplate code. Here is what we will do:

1. create a canvas and resize it according to the device screen

2. obtain its WebGL context and create an update loop that will update and redraw our graphics

js*/* Create a canvas and append it to the DOM */
const canvas = document.createElement('canvas')
document.body.appendChild(canvas)

/* Obtain WebGLRenderingContext for drawing */
/* 'experimental-webgl' is needed for IE11 */
const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl')

/* Initialize our program */
document.addEventListener('DOMContentLoaded', init)

function init () {
   /* Handle canvas resize */
   window.addEventListener('resize', resizeCanvas)
   resizeCanvas()

   /* Schedule first paint */
   requestAnimationFrame(renderFrame)
}

function renderFrame () {
   /* Paint over the canvas with a rgba color */
   gl.clearColor(0.8, 0.8, 0.8, 1.0)
   gl.clear(gl.COLOR_BUFFER_BIT)

   /* Schedule next particles redraw */
   requestAnimationFrame(renderFrame)
}

function resizeCanvas() {
   /* Multiply width and height to account for the device pixel ratio and prevent aliasing */
   canvas.width = devicePixelRatio * innerWidth
   canvas.height = devicePixelRatio * innerHeight

   /* Scale down to the actual layout width and height */
   canvas.style.width = `${innerWidth}px`
   canvas.style.height = `${innerHeight}px`
   
   /* Set the WebGL context viewPort */
   /* gl.drawingBufferWidth and gl.drawingBufferHeight represent the actual width and height of the current drawing buffer */
   gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight)  
}

We have our basic setup ready and now need something to draw. WebGL has a rendering pipeline which describes how your data ends up as pixels on the screen. An entire explanation of the rendering pipeline is out of scope for this demo, but the gist of it is that you get to write two small programs called shaders that will be run on the GPU. They are written in GLSL, a C-like language.

WebGL lets you prepare and issue render calls to the GPU, where you have access to programming the vertex shader and the fragment shader. These shaders are represented as <code>WebGLShader</code> objects and let you control the final position and the appearance of your object on the user's screen, respectively.

Here are the minimal steps we need to take to draw our particles:

  1. Create a <code>gl.VERTEX_SHADER</code> and a <code>gl.FRAGMENT_SHADER</code> and supply our GLSL shader programs as strings. Compile the two programs on the GPU and check for errors via the <code>gl.COMPILE_STATUS</code>. These two programs will be executed on the device GPU and expect certain variables as inputs, which we will supply with the WebGL API.
  2. Create a <code>WebGLProgram</code> and supply the two successfully compiled shaders to it. Link the program and check its <code>gl.LINK_STATUS</code>
  3. We can query our newly created program and obtain the allocated slots for the input variables it expects. Once we obtain their addresses on the GPU, we can supply data to them using the WebGL API.
  4. Once the program is linked successfully and we have provided our data, we can issue the draw call. WebGL supports different primitives such as <code>gl.LINES</code>, <code>gl.TRIANGLES</code>, <code>gl.TRIANGLE_FAN</code>, etc. For our demo we will use <code>gl.POINTS</code> and <code>gl.LINES</code>

1. Creating our vertex and fragment shader objects:

Let's create a helper method for creating a shader with a certain shaderType:

js*function makeShader (gl, { shaderType, shaderSource}) {
  /* Create a shader with a gl.VERTEX_SHADER or gl.FRAGMENT_SHADER */
  const shader = gl.createShader(shaderType)

  /* Supply the shader source as a JS string to the newly created shader */
  gl.shaderSource(shader, shaderSource)

  /* Compile the shader on the GPU */
  gl.compileShader(shader)

  /* Check if the shader compiled successfully by querying the gl.COMPILE_STATUS for this shader */
  const success = gl.getShaderParameter(shader, gl.COMPILE_STATUS)
  if (success) {
    return shader
  }

  /* If the compilation was not successful, log the shader info */
  console.error(gl.getShaderInfoLog(shader))

  /* Delete the invalid shader object */
  gl.deleteShader(shader)
}

2. Creating our WebGL Program:

Let's write another helper method to create a <code>WebGLProgram</code> that expects our shaders as strings

js*function makeProgram (gl, { vertexShaderSource, fragmentShaderSource }) {
  /* Create WebGLShader object with type gl.VERTEX_SHADER using our helper method */
  const vertexShader = makeShader(gl, {
    shaderType: gl.VERTEX_SHADER,
    shaderSource: vertexShaderSource,
  })

    /* Create WebGLShader object with type gl.FRAGMENT_SHADER using our helper method */
  const fragmentShader = makeShader(gl, {
    shaderType: gl.FRAGMENT_SHADER,
    shaderSource: fragmentShaderSource,
  })

  /* Create new WebGLProgram object */
  const program = gl.createProgram()

  /* Attach both shaders to the program */
  gl.attachShader(program, vertexShader)
  gl.attachShader(program, fragmentShader)

  /* Link the program on the device GPU */
  gl.linkProgram(program)

  /* Check if the program was linked successfully by checking its gl.LINK_STATUS   */
  const success = gl.getProgramParameter(program, gl.LINK_STATUS)

  if (success) {
    return program
  }

  /* Log the program info for debugging */
  console.error(gl.getProgramInfoLog(program))

  /* Delete invalid program */
  gl.deleteProgram(program)
}

3. Writing our shaders

We can now construct a new <code>WebGLProgram</code>, but first, let's write our shader sources as simple strings on the top of our program:

Vertex Shader:

glsl*/* precision of our floats in GLSL */
/* See https://webglfundamentals.org/webgl/lessons/webgl-precision-issues.html for more info */
precision highp float;

/* The vertex index as a float. */
/* We can calculate the final vertex position using this index */
attribute float a_indice;

/* Updated uniform time will be passed every frame so we can run our animation */
uniform float time;

/* Radius scale of movement for our particles. */
/* We will update it every frame with new value animated in javascript using simple easing */
uniform float radiusScale;

/* Pass the particle count in shader construction step so we don't have to pass an extra uniform */
const float PARTICLES_COUNT = ${PARTICLES_COUNT}.0;

/* GLSL does not have PI primitive, so we need to supply it ourselves */
const float PI = ${Math.PI};

void main () {
  /* Construct a vec2 (x, y) position using the index we supplied as a Float32Array */
  /* Calculate the position using our indice and time as inputs to simple sin / cos formulas. */
  /* Change the variable values to see the effect */
  float step = PI * radiusScale / PARTICLES_COUNT;
  float timeScale = 0.001;
  float minRadius = 0.1;
  float x = sin(a_indice * radiusScale * 0.01 * step - time * timeScale) * (a_indice / PARTICLES_COUNT + minRadius);
  float y = cos(a_indice * step - time * timeScale) * (a_indice / PARTICLES_COUNT + minRadius);

  vec2 position = vec2(x, y);

  /* WebGL expects us to supply the final position in 4 dimensions - xyzw */
  gl_Position = vec4(position, 0.0, 1.0);

  /* Set our particle size depending on how far it is from the center */
  float distFromCenter = distance(position, vec2(0.0));

  gl_PointSize = distFromCenter * 7.0 + 1.0;
}

Fragment Shader:

glsl*precision highp float;

/* Our fragment shader is super simple: color each fragment with the same uniform color */
void main () {
  gl_FragColor = vec4(vec3(0.6), 1.0);
}

We can then use our helper method and create our program using our shaders:

js*drawProgram = makeProgram(gl, {
   vertexShaderSource,
   fragmentShaderSource,
})

4. Preparing our indices array for the particles draw call

Once the program has been successfully linked, let's supply it with some data so we can render things on screen. In the next snippet we will:

  1. Prepare our indices as <code>Float32Array</code>, so we can use them on the GPU for our animation.
  2. Look up the <code>a_indice</code> variable location on our GPU and enable it
  3. Supply our particle indices as a WebGLBuffer to the <code>a_indice</code> variable on our GPU
js*const PARTICLES_COUNT = 500

/* Fill a Float32Array with the indices 0, 1, 2 ... PARTICLES_COUNT - 1 */
const indices = new Float32Array(PARTICLES_COUNT).map((_, i) => i)

/* Create a WebGLBuffer to hold our indices array */
const indicesBuffer = gl.createBuffer()

/* Bind our indicesBuffer to the active gl.ARRAY_BUFFER */
gl.bindBuffer(gl.ARRAY_BUFFER, indicesBuffer)

/* Supply data to our indicesBuffer using the current gl.ARRAY_BUFFER hook */
gl.bufferData(gl.ARRAY_BUFFER, indices, gl.STATIC_DRAW)

/* Query the a_indice input variable in our vertex shader from the linked program running on our GPU */
const positionLocation = gl.getAttribLocation(drawProgram, 'a_indice')

/* Enable the variable on our GPU */
gl.enableVertexAttribArray(positionLocation)

/* Point the position location to the active buffer and specify its layout */
/* Second argument is "1" because we have one index per vertex */
/* Please check https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/vertexAttribPointer */
gl.vertexAttribPointer(positionLocation, 1, gl.FLOAT, false, 0, 0)

We need to explicitly set our program as the active one before drawing with it:

js*gl.useProgram(drawProgram)

Now that we have supplied individual index attributes to all our vertices, we can pass global uniforms, which affect all vertices uniformly:

js*/* Look up our uniform locations on the GPU */
timeLocation = gl.getUniformLocation(drawProgram, 'time')
radiusScaleLocation = gl.getUniformLocation(drawProgram, 'radiusScale')

5. Issuing draw calls and the update loop

At this point we have set all of the WebGL state we need. We have compiled our WebGL program, set the viewport correctly and supplied correctly formatted input variables to our shaders. With all this in place we can issue a render call:

js*/* Paint over the canvas with a rgba color */
gl.clearColor(0.8, 0.8, 0.8, 1.0)
gl.clear(gl.COLOR_BUFFER_BIT)

/* Supply the new elapsed time uniform to our shader */
/* ("ts" is the timestamp our requestAnimationFrame callback receives) */
gl.uniform1f(timeLocation, ts)

/* Supply the updated radiusScale to our shader */
gl.uniform1f(radiusScaleLocation, radiusScale)

/* Issue a render call using gl.LINES */
gl.drawArrays(gl.LINES, 0, PARTICLES_COUNT)

/* Issue a render call with gl.POINTS */
gl.drawArrays(gl.POINTS, 0, PARTICLES_COUNT)

/* Issue next render */
requestAnimationFrame(renderFrame)

6. Conclusion

This demo is not production-ready code, but rather a small prototype to showcase core WebGL concepts while giving a high-level overview of the process. Without the comments it is ~130 lines. With a little more code we can handle user interactions, change colors depending on the positions, convert it to 3D, etc.

Happy coding!

Rendering HTML as a WebGL Texture
Wed, 02 Dec 2020 22:24:09 +0000
https://archive.georgi-nikolov.com/blog/rendering-html-to-webgl

Article link

A small demo showing how to turn HTML markup into a bitmap that can be used in canvas2d / WebGL for fancy effects.

I used this technique to create a chat where the user can read incoming messages from a streamer's chat while in VR. It also has substantial performance benefits if applied correctly.
