In a typical BigCo, many of the "soft service" processes are run by their own group, far removed from the automation implementers, whether those implementers are internal or external.<p>A top-down mandate for increased automation requires the implementers and the business unit to meet, agree on requirements, and exchange detailed information about the existing process, which may or may not be formally codified. This exchange frequently happens without the involvement of frontline service workers, so the people trading information are at least one step removed from the details of the work, and it only gets worse from there, like a meeting-filled, high-stakes variant of the telephone game.<p>The opposite case is when a business unit develops its own highly computerized solution, but the process, once discovered, is unacceptable to the BigCo. The subject matter expert can seldom contribute technical artifacts to the sanctioned rewrite, so the rewrite becomes an imperfect facsimile, and the subsequent migration causes issues.<p>In both cases, much of the conflict comes from the users of the software not being contributors to it, in a broad sense of that word. And service unit employees are often already tiered between process administrators and task executors, with no obvious path of advancement between them, such that the frontline workers can be automated out of a job entirely. On top of the organizational barriers, this creates a structural disincentive against contributing to a solution.
An additional fourth point: understand the ethical tail risk, if it exists!<p>Is there an assumption (explicit or otherwise) that the human process you are automating includes a capability that the general public would consider "ethical"?<p>In the worst-case scenario, where:
1. that capability is removed by the automation
2. there are negative consequences
3. the whole world discovers the consequences,<p>will your business survive? Can you "blame the vendor"?<p>e.g. <a href="https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G" rel="nofollow">https://www.reuters.com/article/us-amazon-com-jobs-automatio...</a><p>Amazon made a smart move in pulling this project -- the public-perception risk of selling, or even deploying, a product that could be seen as "bias, automated" was too great.
In my experience, the best policy is simply to ask engineers where they are spinning their wheels. Engineers know best where to automate. Relying on top-down automation decisions produces ineffective automation.
For those who, like me, don't understand why people make their websites hard to read with low-contrast text: try "select all". The highlighting tends to be easier on the eyes.