Particles Guide

How to use the fluid field with procedural and GPGPU particles, why camera data is passed to the particle step, and how to tune the motion.

Procedural and GPGPU

The library does not ship a particle engine. It gives you a flow field. A particle system becomes fluid-reactive when it samples that field and turns the sample into displacement or acceleration.

There are two useful patterns in this repository:

  • Procedural particles: no persistent particle simulation. Positions are computed from a shape formula and displaced at render time.
  • GPGPU particles: position and velocity live in GPU textures. A compute or fragment update pass advances them every frame.

The contract from the fluid side

For particles, the core contract is small:

fluid.step(dt)

particles.step({
  velocityField: fluid.velocityTexture,
  // particle-specific data...
})

velocityTexture.xy stores the flow after the fluid step. That is the texture you want when particles should move with the simulated field.

Procedural particles

The trefoil example is the procedural path. It does not maintain particle position and velocity textures. Instead, each instance computes its base position from a knot formula, samples the fluid, and offsets the render-time position.

const particles = createTrefoilParticles(fluid.velocityNode, {
  count: 4000,
  tubeRadius: 0.22,
  scale: 0.65,
  pointSize: 7,
})

scene.add(particles.mesh)

The update path only changes uniforms:

particles.update({
  elapsed,
  modelRotation,
  displacement,
  dispThreshold,
  dispRange,
  dragStrength,
  maxFlowSpeed,
})

What is happening

A procedural particle has no memory: there is no integrated velocity carried over from the previous frame. Every frame starts from a clean procedural position:

vec3 base = trefoilPosition(instanceId);
FluidSample sample = fluidSample(fluid, base);
vec3 displaced = base + sample.flow * uDisplacement;

This is best when the visual has a strong underlying shape: a knot, ribbon, sphere shell, grid, portrait mask, typography, or any procedural field. The fluid acts like a brush that bends the shape.
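The knot formula itself is not shown by the library docs; a standard trefoil parametrization works as an illustration (the actual formula inside createTrefoilParticles may differ):

```javascript
// A standard trefoil knot parametrization (illustrative only).
// t runs over [0, 2*PI); each instance maps its id to a t value.
function trefoilPosition(t) {
  return [
    Math.sin(t) + 2 * Math.sin(2 * t),
    Math.cos(t) - 2 * Math.cos(2 * t),
    -Math.sin(3 * t),
  ];
}
```

In the real system each instance would also jitter its position within tubeRadius around this curve to give the knot thickness.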

Procedural particle controls

| Parameter | What it controls | How to tune it |
| --- | --- | --- |
| displacement | How far the flow can push the procedural position. | Raise until the shape reacts clearly. Lower if the shape loses its identity. |
| dispThreshold | Minimum density or flow energy before displacement starts. | Raise it to keep weak background noise from moving the object. |
| dispRange | Soft transition width above the threshold. | Wider ranges feel smoother. Narrow ranges feel sharper and more digital. |
| dragStrength | Extra screen-plane pull from flow direction. | Use lightly. Too much turns a shaped field into a smear. |
| maxFlowSpeed | Clamp for hot pointer input. | Lower it when fast gestures explode the shape. |
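The threshold-plus-range pair suggests a soft gate on the flow energy. A minimal sketch, assuming a smoothstep over the transition band (the displacementGain helper is hypothetical, not library API):

```javascript
// Soft gate for displacement: 0 below dispThreshold, 1 past
// dispThreshold + dispRange, smooth in between (smoothstep assumption).
function displacementGain(energy, dispThreshold, dispRange) {
  const t = Math.min(Math.max((energy - dispThreshold) / dispRange, 0), 1);
  return t * t * (3 - 2 * t);
}
```

Widening dispRange stretches the ramp, which is why wide ranges read as smoother motion.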

GPGPU particles

The 2D and 3D flow-particle examples are GPGPU systems. They store persistent state on the GPU: each frame, the particle update reads the old position, old velocity, and destination along with the fluid field, then writes the new position and velocity into the next ping-pong targets.
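The ping-pong pattern itself is simple. A minimal sketch with hypothetical names (the real system swaps WebGL render targets, not plain objects):

```javascript
// Two state buffers: the update pass samples `read` and renders
// into `write`, then the pair is swapped for the next frame.
class PingPong {
  constructor(a, b) {
    this.read = a;   // sampled by the update pass
    this.write = b;  // rendered into by the update pass
  }
  swap() {
    const tmp = this.read;
    this.read = this.write;
    this.write = tmp;
  }
}
```

Each particles.step() ends with a swap so the freshly written state becomes next frame's input.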

particles.step({
  dt,
  velocityField: fluid.velocityTexture,
  viewMatrix: camera.matrixWorldInverse,
  projectionMatrix: camera.projectionMatrix,
  cameraRight,
  cameraUp,
  modelRotation,
  pointSize: 6,
  spring: 4,
  zeta: 1.15,
  dragLin: 0.28,
  dragQuad: 0.05,
  aMax: 24,
  vMaxScale: 1,
  flowStrength: 1.2,
  flowThreshold: 0.02,
  maxFlowSpeed: 12,
  responseGamma: 2,
  depthLift: 0,
  perpendicularAngle: 0,
  sideVariation: 0,
  depthAttenuationScale: 1,
})

Why camera data is passed

This is worth making explicit. The fluid field is a 2D screen-space texture, while particle positions live in local or world space. The particle update has to answer one question:

Which pixel of the fluid field is behind this particle on screen?

That requires the same projection path used by rendering:

vec3 worldPos = uModelRotation * pos;
vec4 clip = uProjectionMatrix * uViewMatrix * vec4(worldPos, 1.0);
vec2 ndc = clip.xy / clip.w;
vec2 uv = ndc * 0.5 + 0.5;
vec2 flow = texture2D(uFlow, clamp(uv, 0.0, 1.0)).xy;
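The same math can be mirrored on the CPU for intuition. Plain arrays in WebGL's column-major order; mulMat4Vec4 and worldToFluidUv are hypothetical helpers, not library API:

```javascript
// Multiply a column-major 4x4 matrix by a vec4 [x, y, z, w].
function mulMat4Vec4(m, v) {
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    out[row] =
      m[row] * v[0] + m[4 + row] * v[1] + m[8 + row] * v[2] + m[12 + row] * v[3];
  }
  return out;
}

// World position -> fluid-texture UV, mirroring the shader path:
// clip space, perspective divide, NDC to [0, 1], clamp.
function worldToFluidUv(viewProjection, worldPos) {
  const clip = mulMat4Vec4(viewProjection, [worldPos[0], worldPos[1], worldPos[2], 1]);
  const u = (clip[0] / clip[3]) * 0.5 + 0.5;
  const v = (clip[1] / clip[3]) * 0.5 + 0.5;
  return [Math.min(Math.max(u, 0), 1), Math.min(Math.max(v, 0), 1)];
}
```

With an identity view-projection, a particle at the origin maps to the center of the fluid texture, as expected.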

The sampled flow is still a screen-space vector: flow.x means “move right on the screen” and flow.y means “move up on the screen”. Particles need a world-space acceleration, so the update converts screen axes back into world directions:

vec3 flowWorld = flow.x * uCameraRight + flow.y * uCameraUp;
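In the same spirit, the conversion on the CPU (flowToWorld is a hypothetical helper; cameraRight and cameraUp are the basis vectors passed into the step):

```javascript
// Screen-space flow (x = screen right, y = screen up) -> world direction.
function flowToWorld(flow, cameraRight, cameraUp) {
  return [
    flow[0] * cameraRight[0] + flow[1] * cameraUp[0],
    flow[0] * cameraRight[1] + flow[1] * cameraUp[1],
    flow[0] * cameraRight[2] + flow[1] * cameraUp[2],
  ];
}
```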

That is why the step call needs:

| Parameter | What it controls | How to tune it |
| --- | --- | --- |
| viewMatrix | Moves world positions into camera space. | Pass camera.matrixWorldInverse every frame. |
| projectionMatrix | Projects camera-space positions into clip space. | Pass camera.projectionMatrix after updating the camera aspect. |
| cameraRight | World direction that maps to screen X. | Read column 0 from camera.matrixWorld. |
| cameraUp | World direction that maps to screen Y. | Read column 1 from camera.matrixWorld. |
| modelRotation | Transforms particle local positions and directions when the particle object rotates. | Update from the particle mesh matrix before the step. |

Typical frame setup:

particles.points.updateMatrixWorld(true)
modelRotation.setFromMatrix4(particles.points.matrixWorld)

cameraRight.setFromMatrixColumn(camera.matrixWorld, 0)
cameraUp.setFromMatrixColumn(camera.matrixWorld, 1)

particles.step({
  viewMatrix: camera.matrixWorldInverse,
  projectionMatrix: camera.projectionMatrix,
  cameraRight,
  cameraUp,
  modelRotation,
  velocityField: fluid.velocityTexture,
  // ...
})

Why particles do not fly away forever

The example particle system combines flow acceleration with a spring-damper that pulls each particle back toward its destination.

vec3 error = destination - position;
vec3 aSpring = spring * spring * error;
vec3 aDamp = -2.0 * zeta * spring * velocity;

Without this, a single pointer gesture eventually scatters the cloud. With it, the cloud can react, drift, and settle back into its designed layout.
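The same spring-damper can be integrated on the CPU with semi-implicit Euler. A 1D sketch (the real update runs per particle on the GPU and adds the flow and drag terms on top):

```javascript
// One spring-damper step toward a destination.
// aSpring = spring^2 * error, aDamp = -2 * zeta * spring * velocity.
function springStep(position, velocity, destination, spring, zeta, dt) {
  const error = destination - position;
  const accel = spring * spring * error - 2 * zeta * spring * velocity;
  const v = velocity + accel * dt; // semi-implicit: update velocity first
  return [position + v * dt, v];
}
```

With the example defaults (spring = 4, zeta = 1.15) the motion is overdamped, so a displaced particle settles back without oscillating.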

| Parameter | What it controls | How to tune it |
| --- | --- | --- |
| spring | How strongly a particle returns to its destination. | Higher values snap back faster. Lower values drift longer. |
| zeta | Damping ratio for the spring motion. | Around 1 is controlled. Lower values wobble. Higher values feel heavy. |
| dragLin | Linear velocity drag. | Use it to calm normal motion. |
| dragQuad | Quadratic drag for fast particles. | Use it to tame spikes from strong pointer input. |
| aMax | Acceleration clamp. | Prevents single samples from launching particles. |
| vMaxScale | Velocity clamp multiplier. | Lower it when particles streak too far after interaction. |

Flow response controls

Fluid velocity values can be noisy and uneven. The particle example shapes that input before applying it.

| Parameter | What it controls | How to tune it |
| --- | --- | --- |
| flowStrength | Main multiplier for sampled fluid flow. | Raise for stronger interaction. Lower for subtle background motion. |
| flowThreshold | Dead zone before flow affects particles. | Raise to ignore weak residual fluid. |
| maxFlowSpeed | Clamp for sampled flow magnitude. | Lower it for stability on quick pointer gestures. |
| responseGamma | Curve applied to normalized flow response. | Higher values make weak flow quieter and strong flow more selective. |
| depthAttenuationScale | How depth reduces influence in 3D layouts. | Raise when rear particles react too strongly. |
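Put together, one plausible shaping pipeline for the flow magnitude (a sketch under the parameter names above; shapeFlow is hypothetical, and the library's exact curve may differ):

```javascript
// Clamp the magnitude, cut the dead zone, normalize to [0, 1],
// then apply the response gamma.
function shapeFlow(magnitude, flowThreshold, maxFlowSpeed, responseGamma) {
  const clamped = Math.min(magnitude, maxFlowSpeed);
  const above = Math.max(clamped - flowThreshold, 0);
  const normalized = above / (maxFlowSpeed - flowThreshold);
  return Math.pow(normalized, responseGamma);
}
```

The shaped value would then scale flowStrength times the flow direction, which is why responseGamma above 1 quiets weak flow while leaving strong gestures intact.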

2D plane vs 3D cloud

The 2D plane uses the same screen-space flow projection but keeps depth shaping off:

particles.step({
  mode: 'plane2d',
  depthLift: 0,
  perpendicularAngle: 0,
  sideVariation: 0,
})

The 3D cloud starts particles on a sphere-like layout and uses extra controls to make a 2D flow texture feel volumetric.

particles.step({
  mode: 'cloud3d',
  depthLift: 0.65,
  perpendicularAngle: 0.75,
  sideVariation: 0.35,
})

depthLift adds motion along depth. perpendicularAngle introduces side motion around the screen-flow direction. sideVariation varies that side motion per particle so the cloud does not move as one flat sheet.

Which particle path should you use?

Use procedural particles when:

  • The final shape is more important than physical continuity.
  • You want a knot, mask, surface, text shape, or field to react to the brush.
  • You want fewer moving parts and no persistent particle state.

Use GPGPU particles when:

  • The particles should carry momentum between frames.
  • You need a cloud, dust, sparks, confetti, or a sheet that drifts and settles.
  • You want spring, damping, drag, and flow-response controls.

Examples in this repository:

  • Procedural particles sample the fluid and displace a render-time shape.
  • The 2D GPGPU example stores position and velocity in textures.
  • The 3D GPGPU example uses the same flow contract with depth shaping.