Someone got contract money to report that nuclear is scary and cyber is scary, so cyber-nuclear is therefore really scary; conclusion: more money needs to be sent their way.

The interesting parts are not being covered, aside from the reported grubbing for money. I skimmed the 50+ page report, and it's very unusual to have IT security staff on site 24x7. The infosec folks don't share incident reports the way the nuclear physics community always has (at least in the USA). Nobody does drills where they assume the computers are pwned and go manual/verbal. IDS systems are not usually deployed. Patching fixes security holes, which are not tracked, but causes downed systems and uptime hits, which are tracked, so you get one guess as to the priority of patching. The IT supply chain is not managed to the military-aerospace level of scrutiny that, say, welding gear is managed to at the plants.

The journalists' reports of the problems are bogus. However, pages 14 through 17 of the actual report were pretty interesting reading. The PLC at Browns Ferry is a typical story: they accidentally DoS'd the VFD controller for a circulation pump, so the other eighty billion procedures that protect the plant kicked in and they shut the plant down; an intentional attack would have had the same result. The Hatch story is a good example of just why plant operators hate patches: a poorly applied patch shut down the plant for days due to a SCADA misreading, so if they hadn't patched, the plant would not have been shut down (or maybe it would have been pwned later?). There's also a somewhat instructive story about the Korean plant that got its HR database completely pwned, and it's treated as a "nuclear plant attack" even though it was just boring HR pwnage of the kind that could happen to a food store or something.

The report explains in great detail how toxic the current approach to security is: the nuclear engineers set everything up, and once it all works, as the last step, the infosec guys try to sprinkle magic security pixie dust and checkboxes on top. It then explains in manager language why that's possibly the dumbest way to build a secure system. It's really pretty well written; around page 31 of the report.

Cloud and virtualization confuse people, both nuke and infosec, so you'll have some PLC sitting in a janitor's closet while the top-secret, guarded-by-armed-men button that commands it sits up in the control room behind lock and key.

Optical data diodes are widespread and they need more of them, but journalists will report successful attacks on the insecure side as being as dangerous as a hit on the secure side, and management loves to write insecure-side monitoring into procedure, making ops react as if the primary coolant system just broke even though it's just some harmless metric-gathering webserver.

I know it's cheating, but verbally, from talking to people, there is a terror around changing default passwords: some jack*ss will change the control rod PLC password to "R@$Gfgsdg" and promptly get hit by a bus moments before someone needs to log in and change something off shift, and now no one can shut down the reactor without a seance. Well, not literally, but close. If the gear is physically air-gapped, changing the default password accomplishes nothing except slowing down emergency response; there shouldn't even be a password on a router console port, etc.
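
To make the data diode point above concrete, here's a rough software analog of what the hardware enforces: the secure side only ever pushes data out, so even a fully pwned metric collector on the insecure side has no channel to answer back into. This is just a sketch; the host, port, and metric names are made up for illustration.

    # Software analog of the one-way flow an optical data diode enforces
    # in hardware. The secure side sends and never receives.
    import json
    import socket
    import time

    # Hypothetical historian/collector on the insecure monitoring side.
    COLLECTOR = ("192.0.2.10", 9000)

    def read_pump_metrics():
        # Stand-in for whatever actually polls the plant gear on the secure side.
        return {"pump_a_rpm": 1780, "pump_a_temp_c": 54.2, "ts": time.time()}

    def main():
        # UDP with no recv() calls: there is no request/response channel,
        # so a compromised collector has nothing to talk back to.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        while True:
            sock.sendto(json.dumps(read_pump_metrics()).encode(), COLLECTOR)
            time.sleep(5)

    if __name__ == "__main__":
        main()

A hit on the collector in that setup is an annoyance, not a plant event, which is exactly why writing insecure-side alarms into plant procedure is backwards.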