Seems to me that any company that existed before "IT departments", especially those with stellar computing abilities, only adopted modern IT bureaucracy when it was already too late in more ways than one.

Sometimes too late to make a positive contribution at all. More often you get an overworked IT layer from the get-go that can't make any progress itself unless everything conforms 100% to some imaginary "best practice", where progress toward an idyllic IT operation must be tangible before anybody else ever gets anything they want. The people who lament that the company could accomplish far more before computers than after either move on or age out and retire, and all that's left are those who accept the BS for one reason or another.

Then you get "modern" companies, formed after show-stopping IT was already common, that perceive "best practice" as imitating the bigger, more well-established failures, for lack of any truly shining examples.

I was in a small company imitating a big one, and the right move for me was to prioritize something simple that anybody could physically do: running an ethernet wire to an additional location so the same laptop could be used from either desk, requiring no server action or behind-the-scenes effort from any IT employee whatsoever. After IT proved incapable of timely performance, the site manager justified the relatively negligible cost of more cables, which we ran ourselves to a dozen PCs that got no benefit from being on the internet, since they were not "office machines" in any way. We got this little network air-gapped less than a year before IT got hacked, and we came out smelling like a rose. After that we could do anything we wanted on the isolated network. IT only procured the hardware and software we asked for, and since it wasn't a company-wide expense, they came out ahead, and our small profit center could absorb the full cost easily.