I have a similar experience, though only now am I facing the existential crisis that "2013" will be "a decade ago" in a few months. My Arch Linux install started life as a VMware Workstation image. It has survived a major init-system change (sysvinit -> systemd), three audio subsystems (ALSA -> PulseAudio -> PipeWire), four desktops/WMs (GNOME 2 -> KDE 4 -> i3 -> Sway), three rounds of filesystem migration (ext3 -> ext4, ext4 -> btrfs -> ext4, then ext4 -> ZFS), several versions of VMware Workstation (7 through 14, I believe), several different storage substrates, etc. It has also lived on three different microarchitectures (AMD Bulldozer c. 2012, Intel Skylake c. 2016, and AMD Ryzen c. 2020), but VMware abstracted most of that away, of course.

Eventually I got fed up with Windows, so I decided to `zfs send` the install to a real disk and boot it on bare metal. It has been my daily driver ever since, for the last two years or so. (I did drop into the Arch installer last year to unfuck my bootloader while trying to get rEFInd and ZFS Boot Menu working, but that was just building a new initramfs; I haven't run `pacstrap` since I built the image c. 2013.)

The flexibility this operating system has provided me is nothing short of amazing. I do have to say, though: since switching to Wayland and the in-kernel amdgpu driver, I can't remember the last time my system was rendered unbootable. (Excepting the one time I tried to change my bootloader, but that was user error.) In hindsight, I feel the vast majority of Arch's reputation for breaking systems is overblown, and the blame rests mostly with DKMS plus NVIDIA's proprietary drivers.
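For anyone curious what the `zfs send` migration looks like, here's a rough sketch. The pool and dataset names ("vmpool", "tank") are hypothetical, and the script only echoes the commands rather than running them, since the real thing is destructive:

```shell
# Sketch of a ZFS VM-to-bare-metal move. Names are placeholders;
# DRY_RUN=1 prints each command with a "+ " prefix instead of executing it.
DRY_RUN=1
run() { if [ "$DRY_RUN" = 1 ]; then echo "+ $*"; else "$@"; fi; }

# 1. Inside the VM: take a recursive snapshot of the root dataset.
run zfs snapshot -r vmpool/ROOT@migrate

# 2. Replicate the snapshot stream onto the pool on the physical disk
#    (assumed already created and imported as "tank").
run sh -c 'zfs send -R vmpool/ROOT@migrate | zfs recv -F tank/ROOT'

# 3. Tell the new pool which dataset to boot from, then fix up the
#    bootloader/initramfs from a live environment as described above.
run zpool set bootfs=tank/ROOT tank
```

The `-R` flag sends the whole dataset tree with its properties, which is what makes the install land on the new disk intact rather than as a file-level copy.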