There’s this photo that’s been sitting on my inspiration board for a while: the Space Shuttle Endeavour silhouetted in low Earth orbit at sunset. Behind it, Earth’s upper atmosphere forms a backdrop of beautiful, colorful layers, ranging from dark orange to blue before fading into the deep black of space. Not only is that gradient of color aesthetically pleasing, but the phenomenon behind those colors, atmospheric scattering, becomes an even more interesting topic once you start looking into how it works and how to reproduce it.
Shuttle Silhouette https://www.nasa.gov/image-article/shuttle-silhouette-2/
I wanted to build my own version of this effect with shaders, rendering the sky’s distinctive blue color and realistic sunsets and sunrises directly in the browser. The goal was to get as close as I could to that photo, while also moving toward the kind of atmospheric rendering often seen in games and other shader-based media.
Here’s a compilation of what came out of this month-long journey, all running in real time:
I didn’t originally plan on writing about this subject, but the enthusiasm around the recent Artemis II mission, combined with my own interest in all things space, made it feel worth exploring in depth. It also felt like the perfect opportunity to build an interactive experience that could make the topic more accessible. In this write-up, we’ll implement an atmospheric scattering post-processing shader step by step: first the different building blocks (raymarching, Rayleigh and Mie scattering, as well as ozone absorption) to render a realistic sky dome, then an adaptation of that result to render an atmospheric shell around a planet. Finally, we’ll look into Sébastien Hillaire’s LUT-based approach for a more performant result, or at least my attempt at implementing it, as that part was very much the stepping-outside-my-comfort-zone phase of this project.
Enjoying my writing and feeling like supporting my work? You can show your appreciation by buying me a coffee (I really really really do like coffee) which will give me the much-needed energy (and fuel my caffeine addiction) to take on more ambitious/high-quality articles and (probably over-engineered but fun) projects. As a token of gratitude, your name will be featured on this little screen below! MADE IN NYC - @MAXIMEHECKEL - 2025 - Thank you for reading!
## How to Render a Sky

You may have, at some point or another, slapped a blue gradient background behind some of your work in an attempt to give it a more "atmospheric" look and called it a day, only to quickly notice that doing so never feels quite right 1. For a more true-to-life implementation, we must treat the sky and its color as the result of light interacting with air and its constituents, in a volume, while taking into account several variables such as the altitude of the observer, the amount of dust, the time of day, etc. With that established, our goal for this first part is to use this as a guiding principle to lay the foundation for our atmosphere shader, and to get to a result that feels almost indistinguishable from a real sky, at any time of the day.

### Sampling Atmospheric Density

Much like how we’d approach volumetric clouds or volumetric light, one easy way to sample the atmosphere is through raymarching. We can cast rays from the camera’s position into the scene and step through the transparent medium to answer the two following questions:

- How much light survives traveling through the atmosphere? This is the transmittance term.
- How much light is redirected toward the camera at each sample? Also known as scattering.

Catch-up: If you need a quick refresher on raymarching with some simple examples, I invite you to check out Painting with Math: A Gentle Study of Raymarching.

To answer the first one, we need to accumulate the atmospheric density encountered along the ray to obtain what is known as the optical depth. We will model the density using the Rayleigh density function, which tells us how much "air" there is at a given altitude h. This matters because the atmosphere gets thinner as altitude increases.
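For reference, the density profile and the optical depth we just described can be written compactly (this is the standard exponential-atmosphere formulation; here $H_R$ is the Rayleigh scale height, roughly 8 km for Earth):

```latex
% Rayleigh density falls off exponentially with altitude h
\rho_R(h) = \exp\!\left(-\frac{h}{H_R}\right), \qquad H_R \approx 8\,\text{km}

% Optical depth: density accumulated along the view ray, which the
% raymarch approximates as a sum over samples i with step size \Delta s
\tau = \int_{0}^{t} \rho_R\big(h(s)\big)\,\mathrm{d}s \;\approx\; \sum_{i} \rho_R(h_i)\,\Delta s
```

The raymarching loop in the snippet below is a direct translation of that sum.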
Sampling Rayleigh density and accumulating optical depth

```glsl
const float RAYLEIGH_SCALE_HEIGHT = 8.0;
const float ATMOSPHERE_HEIGHT = 100.0;
const float VIEW_DISTANCE = 200.0;
const int PRIMARY_STEPS = 24;
const vec3 SUN_DIRECTION = normalize(vec3(0.0, 1.0, 1.0));

float rayleighDensity(float h) {
  return exp(-max(h, 0.0) / RAYLEIGH_SCALE_HEIGHT);
}

void main() {
  vec2 p = vUv * 2.0 - 1.0;

  vec3 color = vec3(0.0);
  vec3 viewDir = normalize(vec3(p.x, p.y, 1.0));
  vec3 skyDir = normalize(vec3(viewDir.x, max(viewDir.y, 0.0), viewDir.z));

  float stepSize = VIEW_DISTANCE / float(PRIMARY_STEPS);
  float viewOpticalDepth = 0.0;

  for (int i = 0; i < PRIMARY_STEPS; i++) {
    float t = (float(i) + 0.5) * stepSize;
    float h = t * skyDir.y;

    if (h < 0.0) break;
    if (h > ATMOSPHERE_HEIGHT) break;

    float dR = rayleighDensity(h);
    viewOpticalDepth += dR * stepSize;
  }

  color = ACESFilm(color);

  fragColor = vec4(color, 1.0);
}
```

Then, from the optical depth, we can compute the transmittance T at a given point along the ray: the fraction of light that survives while traveling through the atmosphere.

- T = 1.0 means that there is no loss of light.
- T = 0.0 means that the light is totally extinguished.

If you’ve read my article on volumetric clouds 2, we’re using a formula that may look familiar for this:

Beer's Law: Computing transmittance

```glsl
// Inside the raymarching loop, after sampling the density:
float dR = rayleighDensity(h);
viewOpticalDepth += dR * stepSize;

// Beer's Law: the fraction of light surviving the accumulated optical depth
vec3 transmittance = exp(-rayleighBeta * viewOpticalDepth);
scattering += dR * transmittance * stepSize;
```

Rayleigh Beta: The `rayleighBeta` variable, or Rayleigh scattering coefficient, tells us how much red, green, and blue light gets scattered by air molecules over a given distance. In shader code, we store it as `vec3(0.0058, 0.0135, 0.0331)`.

With this in place, we can now describe how light is attenuated as it travels through the atmosphere. However, density and transmittance only tell us how much light is available to scatter, not how that light is distributed toward the viewer. For that, we need to account for the angle between the incoming sunlight and the view ray, which is what the Rayleigh phase function models.

Rayleigh phase function

```glsl
const vec3 SUN_DIRECTION = normalize(vec3(0.0, 1.0, 1.0));

float rayleighPhase(float mu) {
  return 3.0 / (16.0 * PI) * (1.0 + mu * mu);
}

void main() {
  // ...
  float phase = rayleighPhase(dot(skyDir, SUN_DIRECTION));

  // ...raymarching loop accumulating `scattering`...

  scattering *= SUN_INTENSITY * phase * rayleighBeta;

  float horizon = smoothstep(-0.12, 0.05, skyDir.y);
  vec3 color = mix(SPACE_COLOR, scattering, horizon);
  color = ACESFilm(color);

  fragColor = vec4(color, 1.0);
}
```

Putting all this together, we get a somewhat accurate representation of how much scattered light accumulates along a given ray at any given altitude. The widget below represents the process we just described, showing you:

- The sample steps along a single ray
- The resulting pixel color obtained from this process (an approximation)

(Interactive widget: Raymarch Steps and Altitude controls.)

As you can see, we’re accumulating shades of blue at lower altitudes! This is mostly due to the Rayleigh scattering coefficient’s value:

- Red scatters very little
- Green a bit more
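As a quick sanity check on the numbers above, here's a small Python sketch (not part of the shader, purely illustrative; the per-channel wavelengths are assumed values, roughly 680/550/440 nm for R/G/B) verifying two properties of the Rayleigh pieces we used: the phase function is normalized over the sphere of directions, and the `rayleighBeta` channels follow the well-known λ⁻⁴ wavelength dependence:

```python
import math

# Rayleigh phase function from the shader: P(mu) = 3/(16*pi) * (1 + mu^2)
def rayleigh_phase(mu: float) -> float:
    return 3.0 / (16.0 * math.pi) * (1.0 + mu * mu)

# 1) A phase function must integrate to 1 over all directions:
#    2*pi * integral of P(mu) dmu for mu in [-1, 1] (midpoint rule)
n = 10000
integral = sum(
    rayleigh_phase(-1.0 + (i + 0.5) * (2.0 / n)) * (2.0 / n) for i in range(n)
)
total = 2.0 * math.pi * integral
print(f"phase integral over sphere ≈ {total:.4f}")  # ≈ 1.0

# 2) Rayleigh scattering strength goes roughly as 1/lambda^4, which is why
#    the beta coefficients favor blue. Wavelengths here are assumptions.
beta = (0.0058, 0.0135, 0.0331)      # the article's rayleighBeta, per channel
wavelengths = (680.0, 550.0, 440.0)  # nm, illustrative R/G/B choices

ratio_beta = beta[2] / beta[0]                       # how much more blue scatters than red
ratio_lambda = (wavelengths[0] / wavelengths[2]) ** 4
print(f"beta blue/red = {ratio_beta:.2f}, (λ_red/λ_blue)^4 = {ratio_lambda:.2f}")
```

The blue/red ratio of `rayleighBeta` (about 5.7) lines up almost exactly with (λ_red/λ_blue)⁴, which is the physical reason the accumulated scattering above skews blue.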