I’ve been using GitLab for my private projects for years now. I just started using it at some point and never had a reason to switch.
How I ended up here
Back when GitHub still charged for private repositories, GitLab offered them for free. That was the initial hook. I had a bunch of small projects and experiments that I didn’t want to publish but also didn’t want to pay for. GitLab was the obvious choice.
GitHub eventually made private repos free too, but by then my workflow was already built around GitLab. All my CI pipelines, Docker images, deployment scripts - everything pointed there.
The Docker registry thing
Every GitLab project comes with a Container Registry. This is probably the feature I use most.
The workflow is simple. Build an image locally or in CI, push it to the registry, pull it wherever you need it.
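For the CI side, a job roughly looks like the sketch below. It's just an illustration, not my exact pipeline: the job name, stage, and Docker image versions are placeholders, while the `CI_REGISTRY_*` and `CI_COMMIT_SHORT_SHA` variables are predefined by GitLab.

```yaml
# .gitlab-ci.yml - minimal build-and-push job (illustrative sketch)
build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  variables:
    DOCKER_TLS_CERTDIR: "/certs"   # recommended for docker-in-docker on shared runners
  script:
    # Registry URL and credentials come from GitLab's predefined CI variables
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # Tag with the project's registry path plus the short commit SHA
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```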
No separate Docker Hub account. No thinking about pull rate limits (remember when Docker Hub introduced those and broke half the internet’s CI pipelines?). No managing access tokens for yet another service.
For my private projects this is perfect. I don’t need the discoverability of Docker Hub. I just need a place to store images that integrates with my existing authentication.
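On a machine that only needs to pull, authentication can be a one-off `docker login` against the GitLab registry, for example with a deploy token that has the `read_registry` scope. The token name and project path below are placeholders:

```bash
# One-time login with a read-only deploy token, then pull like from any other registry
docker login registry.gitlab.com -u my-deploy-token -p "$DEPLOY_TOKEN"
docker pull registry.gitlab.com/my-group/my-project:latest
```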
The 10GB limit per project sounds small on paper, but I’ve never come close to hitting it. Old tags get cleaned up, base layers are shared, and most of my images aren’t that big anyway.