Latest Tech News

Stay updated with the latest in technology, AI, cybersecurity, and more

Filtered by: shader

Ten Years of D3D12

For those of us who have been using it from the start, it can be hard to believe that Direct3D 12 has been around for ten years now. Windows 10 was released on July 29th, 2015, and D3D12 has been with us ever since. While it’s true that this is the longest we’ve gone between major D3D version updates, it’s not fair to say that the API has remained static. On the contrary, D3D12 has received a steady stream of new interfaces, functions, and shader models. …
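The teaser mentions the steady stream of new shader models; the standard way an application discovers which one a device actually supports is ID3D12Device::CheckFeatureSupport with D3D12_FEATURE_SHADER_MODEL. Below is a minimal sketch of that documented pattern, not code from the article: error handling is trimmed and the default adapter is assumed.

```cpp
// Minimal sketch (not from the article): query the highest shader model
// a D3D12 device supports, via the documented CheckFeatureSupport API.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Ask for the newest model we compiled against; the runtime rounds the
    // value down to what the driver supports. An older runtime that does not
    // recognize the requested enum value returns E_INVALIDARG, so robust code
    // retries with progressively older models.
    D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_6_6 };
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_SHADER_MODEL,
                                              &sm, sizeof(sm))))
        std::printf("Highest shader model: 0x%X\n",
                    (unsigned)sm.HighestShaderModel);
    return 0;
}
```

The same CheckFeatureSupport call family is how apps probe the other incremental D3D12 additions the article alludes to, such as optional feature tiers.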

The Evolution of Shaders

Nvidia launched what it billed as the first GPU in 1999 with the GeForce 256, pioneering hardware T&L. In 2001, the GeForce 3 introduced programmable shaders, marking the start of the shader era. Over 24 years, GPUs have advanced massively, from 57 million transistors in NV20 to 92 billion in Blackwell (GB202). Shader counts have exploded, from 16 in 2007 to over 21,000 in 2025. Unified shaders appeared in 2007, and AI-focused Tensor Cores arrived in 2017. Despite huge performance gains, GPU prices rose modestly: a high-end card cost $800 …

When to make LODs: Understanding model costs

Jason Booth · Dec 26, 2021 · I’m always amazed at how often I hear people talk about “poly counts” in modern rendering, as if that’s even a thing. But lore has a way of sticking around forever, far past its point of being useful. And quite frankly, I’m constantly seeing artwork created in substandard ways because of this lore. First, the cost of rendering something doesn’t really have much relation to how many polygons it has …
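The claim that raw polygon count is a poor cost proxy is usually explained via quad shading: GPUs shade pixels in 2x2 quads, so once triangles shrink to a few pixels each, much of the shading work is wasted along triangle edges. The sketch below is not the article's method; it is a hypothetical heuristic that estimates on-screen pixels per triangle and flags when a lower LOD would likely pay off. The function name and the 4 px/triangle cutoff are assumptions for illustration.

```cpp
// Hypothetical heuristic (assumption, not the article's method): estimate
// how many screen pixels each triangle of a model covers, and flag when a
// lower LOD would likely pay off because quad overdraw dominates.
#include <cmath>
#include <cstdio>

// Approximate on-screen area (in pixels) covered by a bounding sphere.
double projectedPixelArea(double radiusM, double distanceM,
                          double fovYRadians, double screenHeightPx) {
    double angular = 2.0 * std::atan(radiusM / distanceM); // angular diameter
    double diameterPx = (angular / fovYRadians) * screenHeightPx;
    return 0.25 * 3.14159265358979 * diameterPx * diameterPx; // disc area
}

int main() {
    const double triangles = 50000.0;
    // Assumed scene values: 1 m bounding radius, 40 m away, 60-degree FOV, 1080p.
    double area = projectedPixelArea(1.0, 40.0, 1.047, 1080.0);
    double pxPerTri = area / triangles;

    // Triangles only a few pixels wide waste much of their 2x2-quad shading
    // work; the 4 px cutoff below is an illustrative threshold, not measured.
    std::printf("~%.0f px on screen, %.3f px/triangle -> %s\n", area, pxPerTri,
                pxPerTri < 4.0 ? "a lower LOD would help"
                               : "full-res mesh is fine");
    return 0;
}
```

Run against the assumed values above, a 50,000-triangle model covering roughly 2,000 pixels lands far below the cutoff, which is exactly the situation where vertex work and quad overdraw, not "poly count" per se, dominate the cost.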