Overview
SermoBot is an interactive web experience built around a luminous “orb” persona: a Three.js 3D scene with a custom GLSL shader, post-processing (bloom and vignette), and optional ambient audio whose analysis feeds back into the visuals.
Visitors can chat with the orb in the browser. Messages are sent to Pollinations’ OpenAI-compatible chat API; replies are shown on the page and read aloud with the Web Speech API, and the shader’s speed uniform is briefly raised while a request is in flight, as visual feedback.
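A minimal sketch of that round trip is below. The exact endpoint path, model id, and system-prompt wording are assumptions for illustration; the source only states that the API is OpenAI-compatible and that replies are spoken with the Web Speech API.

```javascript
// Assumed endpoint path and model id -- the source only says
// "OpenAI-compatible chat API" hosted at gen.pollinations.ai.
const CHAT_URL = "https://gen.pollinations.ai/openai";

function buildChatRequest(history, userText, apiKey) {
  // OpenAI-style payload: system prompt plus the running transcript.
  return {
    url: CHAT_URL,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: "openai", // assumed model id
        messages: [
          { role: "system", content: "You are a wise, slightly mysterious luminous orb." },
          ...history,
          { role: "user", content: userText },
        ],
      }),
    },
  };
}

function speak(text) {
  // Web Speech API: read the reply aloud (no-op where unsupported).
  if (typeof speechSynthesis !== "undefined") {
    speechSynthesis.speak(new SpeechSynthesisUtterance(text));
  }
}
```

In the page, the request object would be passed to `fetch(req.url, req.options)` and the reply text handed to `speak`.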
The app is bundled with Vite, deployed as static assets (for example on Netlify), and uses environment variables for the API key so secrets stay out of the repository.
The UI keeps chrome to a minimum: one focal presence (the orb) and one line of input, so the experience reads as a dialogue with a character rather than a conventional app shell.
Design philosophy
As both designer and developer, the goal was to align look, motion, and behavior: the orb is not decoration—it embodies the system prompt (“wise, slightly mysterious luminous orb”). High-contrast neon on deep navy supports legibility and mood; bloom and slow drift add atmosphere without competing with the message area.
Mystical minimalism: no sidebars, chat bubbles, or dense controls. Removing visual noise directs attention to the entity and the single input, which makes the product feel intimate and focused.
Design–engineering synergy: the wireframe torus, procedural halo, and audio-reactive shader are authored in GLSL and Three.js; typography, layout, and micro-interactions live in HTML/CSS and GSAP. The same person shaped the narrative tone, the color story, and the implementation tradeoffs (e.g. post-processing vs performance).
Architecture
Everything runs in the browser. There is no custom backend: the static site talks directly to Pollinations over HTTPS. The diagram below summarizes how UI, graphics, audio, and chat connect.
AI Halo shader
The orb is drawn with a custom WebGL2 GLSL program titled AI Halo (author credit in source). The look is entirely procedural: no texture maps—only UVs, time, simplex-style noise, distance fields, and a handful of uniforms (uTime, uSpeed, tAudioData) supplied from JavaScript.
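The JavaScript side of that contract can be sketched as a uniform block updated once per frame. The uniform names come from the source; the plain four-component object standing in for the audio vector, and the update function, are illustrative assumptions.

```javascript
// Uniforms handed to the shader; tAudioData is modeled here as a plain
// 4-component object (an assumption -- the real material may use a vector
// or texture type).
const uniforms = {
  uTime: { value: 0 },
  uSpeed: { value: 1.0 },
  tAudioData: { value: { x: 0, y: 0, z: 0, w: 0 } },
};

function tick(elapsedSeconds, audioLevels) {
  // Advance time and copy the latest (already-smoothed) audio levels in.
  uniforms.uTime.value = elapsedSeconds;
  Object.assign(uniforms.tAudioData.value, audioLevels);
}
```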
Vertex stage
The vertex shader is a straight pass-through transform: gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0), with vUv = uv. The interpolated vUv gives every fragment stable 0–1 coordinates, so the fragment shader can treat the surface as a 2D canvas in UV space.
Fragment stage: four halo layers (draw2)
UVs are centered and scaled (vUv * 3.0 - 1.5). The same routine draw2 runs four times with different inner-radius arguments (0.6, 0.8, 0.9, 0.3); outputs are added so multiple soft rings interleave.
In polar form (atan for angle, length for radius), a 3D noise sample snoise3(vec3(uv * 1.2, time * 0.5)) modulates where the bright ring sits between the inner radius and the outer edge—so the rim shimmers instead of staying a perfect circle. An inverse-linear falloff (light1) from that moving ring creates the glow; additional smoothstep terms darken the center into a “hole” and taper the exterior. Hue shifts around the orb with cos(angle + time * 2.0), mixing the magenta, cyan, and deep blue constants. extractAlpha turns faint RGB into transparent pixels by normalizing bright regions into alpha, which keeps stacked layers from muddying each other.
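The two building blocks named above can be ported to JavaScript for illustration. The names light1 and extractAlpha come from the source; the exact formulas here are assumptions consistent with the described behavior (inverse-linear falloff, brightness normalized into alpha), not the shader’s literal code.

```javascript
// Inverse-linear falloff: brightest at the (noise-shifted) ring,
// fading with distance from it.
function light1(intensity, attenuation, dist) {
  return intensity / (1.0 + dist * attenuation);
}

// Normalize bright RGB into alpha so stacked translucent layers add
// light cleanly instead of muddying each other.
function extractAlpha(r, g, b) {
  const maxC = Math.max(r, g, b, 1e-5);
  const a = Math.min(maxC, 1);
  return { r: r / maxC, g: g / maxC, b: b / maxC, a };
}
```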
Audio-reactive rings
Four annuli are built with paintCircle: each uses a smooth band (smoothstep pair) at a base radius, but the radius is nudged per channel by tAudioData (e.g. r from tAudioData.x, and similarly for y/z/w on sibling rings) so FFT or analyser energy visibly swells the hoops. Small sine perturbations (variation) wobble the distance field so the rings feel hand-drawn rather than perfectly round. The pass is then multiplied by a time-rotated UV mask: red and green are zeroed, blue carries a quadrant-shaped factor 1.0 - v.y * v.x, and alpha is scaled—so this layer reads as a directional blue highlight over the halo stack.
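The smoothstep band at the heart of paintCircle, and the JavaScript-side smoothing that keeps tAudioData from flickering, can be sketched as follows. The band construction matches the description above; the widths and smoothing factor are assumptions.

```javascript
// GLSL-style smoothstep, reproduced in JavaScript.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// A soft annulus at radius `rad`: a smoothstep pair, bright on the ring
// and fading to zero on either side (width is an assumption).
function ringBand(dist, rad, width) {
  return smoothstep(rad - width, rad, dist) - smoothstep(rad, rad + width, dist);
}

// Exponential smoothing of raw analyser energy before it reaches the
// uniform, so the rings swell without harsh flicker (alpha is assumed).
function smoothLevel(prev, target, alpha = 0.1) {
  return prev + (target - prev) * alpha;
}
```

In the real pass, the ring radius would be `baseRadius + smoothedAudio * gain` per channel, so louder frames push each hoop outward.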
Wandering specular spots
A separate path warps UVs with 3.0 - cos(uTime * uSpeed) * 0.3 for slow breathing. Inside the unit disk, lighting is built from products of inverse distances to several points that orbit on independent sin/cos paths (different periods). Multiplying those terms yields sharp moving hotspots—metaphorically a small “constellation” sliding behind the glass—tinted with low-amplitude purple RGB and wrapped in extractAlpha for soft edges.
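The product-of-inverse-distances idea can be demonstrated in JavaScript. The orbit radii, angular speeds, and gain below are illustrative assumptions; only the structure (several independently orbiting points, multiplied inverse distances) comes from the source.

```javascript
// Sharp moving hotspots: multiply inverse distances to a few points
// orbiting on independent sin/cos paths (radii/speeds are assumptions).
function spotlight(x, y, t) {
  const orbits = [
    { r: 0.5, w: 0.7 },
    { r: 0.3, w: 1.3 },
    { r: 0.6, w: 0.4 },
  ];
  let light = 1.0;
  for (const { r, w } of orbits) {
    const px = r * Math.cos(t * w);
    const py = r * Math.sin(t * w);
    const d = Math.hypot(x - px, y - py);
    // Each factor is an inverse distance; the product spikes where a
    // pixel is close to several moving points at once.
    light *= 0.05 / Math.max(d, 1e-3);
  }
  return light;
}
```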
Features
WebGL2 rendering with RawShaderMaterial and GLSL ES 3.00 vertex and fragment shaders, plus GSAP-driven UI and camera motion tied to pointer input.
FFT-based audio analysis smoothed in JavaScript to drive shader parameters without harsh flicker; background music with consent-style flow and an optional easter-egg track swap.
Resilient chat integration: normalized API message content (string or structured parts), capped conversation window, retries with shorter context, and a higher completion token budget for reasoning-style models behind the openai route.
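The normalization and windowing behaviors above can be sketched like this. The function names and the window size are assumptions; the behaviors (string-or-parts content, a capped conversation window that preserves the system prompt) are from the description.

```javascript
// API replies may carry a plain string or an array of structured parts;
// flatten either shape to display text.
function normalizeContent(content) {
  if (typeof content === "string") return content;
  if (Array.isArray(content)) {
    return content
      .map((part) => (typeof part === "string" ? part : part.text ?? ""))
      .join("");
  }
  return "";
}

// Cap the conversation window: keep the system prompt plus the most
// recent turns (the limit of 12 is an assumption).
function capWindow(messages, max = 12) {
  if (messages.length <= max) return messages;
  const [system, ...rest] = messages;
  return [system, ...rest.slice(rest.length - (max - 1))];
}
```

On a failed request, a retry would call `capWindow` with a smaller `max` to shorten the context before resending.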
Links
Live SermoBot site — deployed experience.
Pollinations powers the chat via the OpenAI-compatible endpoint at gen.pollinations.ai; API keys are created at enter.pollinations.ai.
Core libraries: three.js for WebGL, Vite for the dev server and production build, and GSAP for animation.
Setup (for developers)
Clone the repository, run npm install, copy .env.example to .env, and set VITE_POLLINATIONS_API_KEY to your key. Use npm run dev for local development and npm run build for production output in dist/.