FC is one of the "cheapest" checks.

"Maintaining Arc Consistency" (MAC) is actually asymptotically equivalent (in the same O(n^2) complexity class as FC), but textbooks typically only list AC3, which is suboptimal but pretty easy to understand (I'd compare it to "insertion sort"). AC4 is difficult to program and understand (and especially difficult to extend beyond binary constraints), but it is worst-case asymptotically optimal.

Today there's AC2001, a simple extension to AC3, which has best-case behavior similar to AC3 but is worst-case asymptotically optimal. It's also "obvious" how to extend it to GAC (the GAC2001 algorithm) to handle non-binary constraints.

This is an NP-complete problem, so there's really no way to know whether FC, MAC (with AC2001), or even path consistency (a consistency level above MAC) is optimal. I guess the rule of thumb is that if there are lots of solutions out there, then you want to spend less time checking, and maybe FC is better.

If there are very, very few solutions, then MAC (and even path consistency) may be superior. A stronger check cuts more of the search tree away, but requires more and more "preprocessing" at every step of the search.

The textbooks I've read seem to default to FC. But given the discovery of AC2001, I think the modern default should probably be MAC using AC2001.

----------------

Or in "Sudoku terms": how long do you want to spend searching the tree, versus how long do you want to spend proving that the current search isn't "hopeless"?

The longer you spend ensuring that your current choice is "not hopeless", the longer your best-case search takes. The less time you spend checking for "hopelessness", the longer the worst-case search takes (because you'll run into more hopeless situations but still spend tons of time searching through them).

FC, MAC (aka 2-consistency), path consistency (aka 3-consistency), 4-consistency, 5-consistency, 6-consistency... you can spend arbitrarily much time checking for hopelessness. If you have 81 variables (like the 81 cells of a Sudoku grid), checking for 81-consistency is equivalent to solving the puzzle in every possible way (if multiple solutions are possible, 81-consistency would find them all!). And 82-consistency can't exist when you only have 81 variables, so 81-consistency is the limit on Sudoku.

The "Big O" is something like O(d^k), where d is the domain size (9 in Sudoku) and k is the consistency level you're aiming for. So FC and MAC are roughly O(9^2), while 81-consistency is O(9^81). Aiming for higher levels of consistency scales poorly, to say the least... and textbooks only mention path consistency (3-consistency) and k-consistency in passing.

In practice, if you really are trying to find all solutions, brute-force 81-consistency is suboptimal. You might instead find, say, a 55-consistency relationship that's backtrack-free (exponentially faster than 81-consistency), but that involves solving some rather nasty graph problems. It's absolutely worth the tradeoff, but there be dragons here. (Finding the optimal variable ordering means computing the minimum induced width of the constraint graph, which is itself an NP-complete problem.)
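To make the AC3-versus-AC2001 difference concrete, here's a minimal Python sketch of the two "revise" steps plugged into the usual AC worklist. This is a teaching sketch under my own naming (revise_ac3, revise_ac2001, check, the "last" table), not a tuned implementation: the only thing AC2001 adds is a table remembering, per (variable, value, neighbour) triple, where the last support was found, so the scan resumes instead of restarting.

    from collections import deque

    def revise_ac3(domains, check, x, y):
        # AC3-style revise: for every value a in D(x), rescan all of D(y)
        # looking for a support.  Simple, but repeats work across calls.
        removed = False
        for a in list(domains[x]):
            if not any(check(x, a, y, b) for b in domains[y]):
                domains[x].discard(a)
                removed = True
        return removed

    def revise_ac2001(domains, order, check, x, y, last):
        # AC2001-style revise: same loop, but "last" holds the index (in a fixed
        # value ordering of D(y)) of the most recent support found for (x, a, y),
        # so the search resumes from there instead of restarting from scratch.
        removed = False
        for a in list(domains[x]):
            i = last.get((x, a, y), -1)
            if i >= 0 and order[y][i] in domains[y]:
                continue                             # previous support still valid
            for j in range(i + 1, len(order[y])):
                b = order[y][j]
                if b in domains[y] and check(x, a, y, b):
                    last[(x, a, y)] = j              # remember the new support
                    break
            else:
                domains[x].discard(a)                # a has no support left in D(y)
                removed = True
        return removed

    def enforce_ac(domains, arcs, check, use_2001=True):
        # The usual AC worklist; both revisions plug into it unchanged.
        # Returns False on a domain wipe-out, i.e. the current state is "hopeless".
        order = {v: sorted(domains[v]) for v in domains}   # fixed value ordering for AC2001
        last = {}
        queue = deque(arcs)
        while queue:
            x, y = queue.popleft()
            if use_2001:
                changed = revise_ac2001(domains, order, check, x, y, last)
            else:
                changed = revise_ac3(domains, check, x, y)
            if changed:
                if not domains[x]:
                    return False
                queue.extend((z, w) for (z, w) in arcs if w == x and z != y)
        return True

    # Tiny usage example: two Sudoku-style cells that must differ.
    doms = {"A": {1, 2}, "B": {2}}
    ok = enforce_ac(doms, [("A", "B"), ("B", "A")], check=lambda x, a, y, b: a != b)
    # -> ok is True, and doms["A"] has been pruned to {1}

Delete the "last" table and the two revisions collapse into the same function, which is why AC2001 is usually described as a drop-in upgrade to AC3. Run at every node of the search (which is what MAC does), the False return is exactly the "this branch is hopeless" signal that prunes the tree.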
--------

There's also DAC (Directional Arc Consistency), which is "more consistent" than FC but "less consistent" than full arc consistency (i.e. MAC, Maintaining Arc Consistency). A naive implementation of DAC is already asymptotically optimal.

Needless to say, researchers have spent a lot of time benchmarking these various "consistency" levels on backtracking problems.
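For completeness, here's what such a "naive implementation" of DAC can look like, again as a sketch under my own naming (dac, order, check): each constraint is revised exactly once against a fixed variable ordering, later variables first, so there is no worklist and no re-queueing. One pass over e constraints at O(d^2) per revision gives O(e * d^2) total, which is the sense in which even the naive version is asymptotically optimal.

    def dac(domains, order, constraints, check):
        # Directional arc consistency, sketched: walk the variables from last to
        # first in the chosen ordering, and prune each *earlier* variable against
        # each *later* neighbour exactly once.  No re-queueing is needed, because
        # a later domain is never shrunk again after it has been used for pruning.
        pos = {v: i for i, v in enumerate(order)}
        for y in reversed(order):
            for (u, v) in constraints:               # constraints = [(var1, var2), ...]
                if y not in (u, v):
                    continue
                x = u if v == y else v               # the other endpoint of the constraint
                if pos[x] >= pos[y]:
                    continue                         # only revise earlier against later
                for a in list(domains[x]):
                    if not any(check(x, a, y, b) for b in domains[y]):
                        domains[x].discard(a)        # no support for a later in the ordering
                if not domains[x]:
                    return False                     # wipe-out: no consistent assignment
        return True

    # Tiny usage example: three Sudoku-style cells that must all differ.
    doms = {"A": {1, 2, 3}, "B": {1, 2}, "C": {2}}
    ok = dac(doms, ["A", "B", "C"],
             [("A", "B"), ("B", "C"), ("A", "C")],
             check=lambda x, a, y, b: a != b)
    # -> ok is True, and the domains have been pruned to A={3}, B={1}, C={2}

The price, compared to the full AC worklist above, is that only arcs pointing "forward" along the ordering are made consistent, which is why DAC sits between FC and MAC in pruning strength.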