
The Great Unracking: Saying goodbye to the servers at our physical datacenter


Since October 2010, all Stack Exchange sites have run on physical hardware in a datacenter in New York City (well, New Jersey). These have had a warm spot in our history and our hearts. When I first joined the company and worked out of the NYC office, I saw the original server mounted on a wall with a laudatory plaque like a beloved pet. Over the years, we’ve shared glamor shots of our server racks and info about updating them.

For almost our entire 16-year existence, the SRE team has managed all datacenter operations, including the physical servers, cabling, racking, replacing failed disks and everything else in between. This work required someone to physically show up at the datacenter and poke the machines.

We’ve since moved all our sites to the cloud. Our servers are now cattle, not pets. Nobody is going to have to drive to our New Jersey data center and replace or reboot hardware. Not after last week.

That’s because on July 2nd, in anticipation of the datacenter’s closure, we unracked all the servers, unplugged all the cables, and gave these once mighty machines their final curtain call. For the last few years, we have been planning to embrace the cloud and move our infrastructure there entirely. We moved Stack Overflow for Teams to Azure in 2023 and proved we could do it. Then we just had to tackle the public sites (Stack Overflow and the Stack Exchange network), which are hosted on Google Cloud. Early last year, our datacenter vendor in NJ decided to shut down that location, and we needed to be out by July 2025.

Our other datacenter—in Colorado—was decommissioned in June. It was primarily for disaster recovery, which we didn’t need anymore. Stack Overflow no longer has any physical datacenters or offices; we are fully in the cloud and remote!

Major kudos to the SRE team, along with many other folks who helped make this a reality. We’ll have a few blogs soon to talk about migrating the Stack Exchange sites to the cloud, but for now, enjoy the pictures.

We had about 50 servers altogether in this location. Here’s what the servers looked like at the beginning of the day:

Eight (or more) cables per machine multiplied by over 50 machines is a lot of cables! In the above picture you can see the large mass of cables. Even though they are neatly packaged in a little cage (called an “arm”), one per server, it was a lot of work to de-cable so many hosts.
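To put a rough number on it, here’s a quick back-of-the-envelope calculation using the figures from the post (at least eight cables per machine, roughly 50 machines):

```python
# Back-of-the-envelope cable count, using the lower-bound
# figures mentioned above: 8+ cables per machine, ~50 machines.
cables_per_machine = 8
machines = 50

total_cables = cables_per_machine * machines
print(total_cables)  # at least 400 cables to unplug
```

So even at the minimum, that’s hundreds of individual connections to trace and unplug by hand.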

Why so many cables per machine? Here’s a staged photo that shows each cable individually:

Blue: 1x 1G ethernet cable for the management network (remote access).
