Renderer



Controls

Drag to move the camera, scroll to zoom. Use the menu to view different models and materials.

Overview

This project is a flexible real-time physically based forward renderer based on the GLTF 2.0 specification.

GLTF 2.0

GLTF 2.0 is a popular asset transfer format which allows PBR models to be shared in an easy and compact way. The project currently supports: attribute and index extraction (sparse and dense), traversing nodes and primitives, PBR material extraction (with properties and textures), animations, morph targets and skinning. The engine also implements the transmission, IOR, sheen and texture transform extensions.
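Sparse accessors are one of the less obvious parts of attribute extraction: instead of storing a full dense array, the accessor stores base data plus a small list of (index, value) overrides. A minimal sketch of resolving one, using plain arrays in place of the engine's GPU-bound buffer views (function and parameter names are illustrative, not the engine's API):

```javascript
// Resolve a glTF 2.0 sparse accessor: start from the dense base data
// (zero-filled if the accessor has no bufferView) and overwrite the
// entries listed in sparse.indices with the data in sparse.values.
function resolveSparseAccessor(base, componentCount, indices, values) {
  const out = base.slice();
  for (let i = 0; i < indices.length; i++) {
    const dst = indices[i] * componentCount;
    for (let c = 0; c < componentCount; c++) {
      out[dst + c] = values[i * componentCount + c];
    }
  }
  return out;
}
```

For example, a VEC3 position accessor (componentCount 3) with a single sparse entry at vertex 1 only rewrites that vertex's three components and leaves the rest of the base data untouched.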

The renderer supports environment based lighting, animation, transmission, emission and other GLTF 2.0 features.

Importing data

We traverse and process the GLTF scene graph of nodes, generate parameters for the internal engine representation and allocate GPU buffers for mesh attributes. An internal engine node graph is created which represents the hierarchy of the scene. Some nodes are drawable mesh objects which correspond to geometry, morph and skinning data coupled with a shader program. To avoid duplicating data and work, the engine keeps track of buffer descriptions and shader programs in repository maps. As different primitives are not aware of shared data, whenever one seeks to create an already allocated buffer or a previously compiled shader program, we reuse the existing resource. This cuts down on GPU memory use and shader compilation times.
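The repository idea can be sketched as a cache keyed by a serialised resource description; the class and method names here are hypothetical, not the engine's actual API:

```javascript
// Repository map sketch: primitives request buffers or programs by a
// description object; the factory runs only on a cache miss, so shared
// data is uploaded (or compiled) once and reused thereafter.
class Repository {
  constructor(create) {
    this.create = create;      // factory invoked on a cache miss
    this.entries = new Map();  // serialised description -> resource
  }
  getOrCreate(description) {
    const key = JSON.stringify(description);
    if (!this.entries.has(key)) {
      this.entries.set(key, this.create(description));
    }
    return this.entries.get(key);
  }
}
```

Two primitives that describe the same buffer layout (or request the same shader feature set) therefore receive the same resource object, which is where the memory and compile-time savings come from.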

Materials

In the engine, materials correspond to GLSL programs. Each drawable primitive is created with a PBR material which is generated to match exactly the features the primitive uses. The material system is modular and easily extensible, and also includes textured, attribute-visualising and Lambertian materials. There are also materials for the environment map, the generation of IBL data and a depth pre-pass. Shader authoring and integration is simple, and new materials can be added to the engine easily.
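Generating a material to match a primitive's features typically means emitting only the preprocessor defines that primitive needs, so the compiled GLSL contains no dead feature paths. A hedged sketch of that idea (the define names and primitive shape follow glTF conventions but are illustrative, not the engine's exact scheme):

```javascript
// Per-primitive shader specialisation sketch: inspect the primitive and
// build the list of #defines to prepend to the GLSL source.
function materialDefines(primitive) {
  const defines = [];
  if (primitive.attributes.NORMAL) defines.push("#define HAS_NORMALS");
  if (primitive.attributes.TEXCOORD_0) defines.push("#define HAS_UVS");
  if (primitive.material.baseColorTexture) defines.push("#define HAS_BASE_COLOR_TEXTURE");
  if (primitive.targets && primitive.targets.length > 0) {
    defines.push(`#define MORPH_TARGET_COUNT ${primitive.targets.length}`);
  }
  return defines.join("\n");
}
```

Because the define string doubles as a shader-repository key, primitives with identical feature sets naturally compile to a single shared program.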

Physically Based Rendering

The engine uses physically based rendering (PBR) and image based lighting (IBL). We use a Lambertian diffuse lobe and the Cook-Torrance BRDF for the specular lobe. We use the Trowbridge-Reitz (GGX) microfacet distribution function, the Smith visibility/shadowing function and the Schlick Fresnel approximation.
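The three specular terms named above can be written out numerically. This is a scalar sketch (one channel of F0, perceptual roughness remapped to alpha = roughness squared, as in Filament); the engine's GLSL equivalents may differ in naming:

```javascript
// D: Trowbridge-Reitz (GGX) normal distribution function.
function D_GGX(NdotH, roughness) {
  const a2 = roughness ** 4; // alpha = roughness^2, a2 = alpha^2
  const f = NdotH * NdotH * (a2 - 1) + 1;
  return a2 / (Math.PI * f * f);
}

// V: height-correlated Smith visibility term (folds in 1 / (4 NdotV NdotL)).
function V_SmithGGX(NdotV, NdotL, roughness) {
  const a2 = roughness ** 4;
  const ggxV = NdotL * Math.sqrt(NdotV * NdotV * (1 - a2) + a2);
  const ggxL = NdotV * Math.sqrt(NdotL * NdotL * (1 - a2) + a2);
  return 0.5 / (ggxV + ggxL);
}

// F: Schlick Fresnel approximation.
function F_Schlick(VdotH, f0) {
  return f0 + (1 - f0) * Math.pow(1 - VdotH, 5);
}

// The specular lobe is then f_spec = D * V * F.
```

Sanity checks fall out directly: at grazing angles Schlick tends to 1 regardless of F0, and at roughness 1 the GGX distribution flattens to 1/pi for any half-vector.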

The renderer makes use of data and colour textures to render spatially varying materials illuminated by the environment.

HDR environment maps act as the source of illumination. Whenever a new HDR environment texture is selected, a pre-processing program convolves the environment data and generates spherical harmonics matrices for diffuse ambient light. Another shader generates a cubemap used for ambient specular light. The lower mip-levels store pre-integrated specular reflections following the approaches found in Frostbite and Filament. We also generate a BRDF LUT as described in those renderers. The red and green channels hold the scale and bias for the GGX BRDF and the blue channel holds the DG term for the Charlie BRDF used for rendering sheen.
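At shade time, the spherical harmonics matrices make the diffuse ambient lookup a single quadratic form per colour channel (the Ramamoorthi-Hanrahan formulation): for a unit normal n, irradiance is nT M n with n extended to [x, y, z, 1]. A sketch, with M as a row-major 4x4 array as an assumed layout:

```javascript
// Evaluate diffuse ambient irradiance for one colour channel from a
// pre-computed 4x4 spherical harmonics matrix M (row-major) and a unit
// normal n = [x, y, z]. Computes hT * M * h with h = [x, y, z, 1].
function shIrradiance(M, n) {
  const h = [n[0], n[1], n[2], 1];
  let e = 0;
  for (let r = 0; r < 4; r++) {
    let row = 0;
    for (let c = 0; c < 4; c++) row += M[r * 4 + c] * h[c];
    e += h[r] * row;
  }
  return e;
}
```

In GLSL this collapses to one `dot(h, M * h)` per channel, which is why the matrix form is so cheap to evaluate in a fragment shader.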

The mip-levels of the environment cubemap store convolved specular reflections for varying roughness. Diffuse illumination is stored as spherical harmonics matrices.

The renderer uses the popular split-sum approach for image based lighting. The BRDF LUT holds data which depends only on the view angle and material roughness. Note that this is the multiple-scattering variant based on Filament.
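The run-time side of split-sum is two cheap lookups combined: a prefiltered cubemap sample at a roughness-dependent mip, and the LUT's scale/bias applied to F0. A scalar sketch under the common linear roughness-to-mip mapping (the function names and that mapping are assumptions, not necessarily this engine's exact choices):

```javascript
// Combine the two halves of the split sum for one colour channel:
// prefilteredSample comes from the environment cubemap, lutScale/lutBias
// from the BRDF LUT's red/green channels indexed by (NdotV, roughness).
function specularIBL(prefilteredSample, lutScale, lutBias, f0) {
  return prefilteredSample * (f0 * lutScale + lutBias);
}

// Common linear mapping from perceptual roughness to the cubemap mip:
// mip 0 holds mirror reflections, the last mip the roughest lobe.
function roughnessToMip(roughness, mipCount) {
  return roughness * (mipCount - 1);
}
```

The LUT lookup is what makes the approach view-dependent without re-integrating the environment per frame: everything that varies with the lighting lives in the prefiltered cubemap, everything that varies with the material and view lives in the LUT.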

Render order and performance

Although the primitives exist in a common node graph, we distinguish between opaque, transmissive and transparent meshes and use render passes to achieve correct blending. The PBR materials can be quite heavy for a forward renderer in a browser. We use AABB culling to exclude primitives which fall outside the camera view frustum; the culling takes node animations into account and updates the bounding volumes whenever anything in the scene changes. We also run a depth pre-pass over all opaque geometry to avoid shading fragments that would later be occluded by other primitives.
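The culling test itself reduces to checking each world-space AABB against the six frustum planes. A sketch using the standard "positive vertex" trick (plane representation and names are illustrative):

```javascript
// AABB vs frustum: a box is culled when it lies entirely on the negative
// side of any plane. Planes are { normal: [x, y, z], d } with inward
// normals, so a point p is inside a plane when dot(normal, p) + d >= 0.
function aabbOutsideFrustum(min, max, planes) {
  for (const { normal: n, d } of planes) {
    // Test only the box corner furthest along the plane normal: if even
    // that corner is behind the plane, the whole box is.
    const p = [
      n[0] >= 0 ? max[0] : min[0],
      n[1] >= 0 ? max[1] : min[1],
      n[2] >= 0 ? max[2] : min[2],
    ];
    if (n[0] * p[0] + n[1] * p[1] + n[2] * p[2] + d < 0) return true;
  }
  return false; // intersects or lies inside every plane
}
```

This is conservative in the usual way: a box that straddles any plane is kept and drawn, which is fine because the depth pre-pass limits the cost of the occasional false positive.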

Morph targets and skinning use data textures to avoid attribute limits. Morphed geometry has a texture array where each level is a square texture storing the interleaved attribute offset data for one target. Skins store and update a texture which holds the transform matrices of their joints.
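The addressing behind those data textures is simple: each vertex index maps to one texel of a square texture just big enough to hold the vertex count. A sketch of that mapping (names are illustrative; the vertex shader performs the same arithmetic on `gl_VertexID`):

```javascript
// Smallest square texture with at least one texel per vertex.
function textureSizeFor(vertexCount) {
  return Math.ceil(Math.sqrt(vertexCount));
}

// Map a vertex index to its texel coordinate in a size x size texture;
// for morph targets the third coordinate is the target's array layer.
function vertexTexel(vertexIndex, textureSize) {
  return {
    x: vertexIndex % textureSize,
    y: Math.floor(vertexIndex / textureSize),
  };
}
```

Fetching attribute offsets this way sidesteps the hardware vertex-attribute limit, which is what allows an unbounded number of morph targets: adding a target adds a texture layer, not an attribute binding.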

The renderer supports sheen, unlimited morph targets and skinning. Vertex skin joint influence is visualised in the last image.