I have seen numerous articles endorsing dependency injection (DI) in software development. Frameworks such as Spring and Guice provide excellent support for dependency injection, and I am a fan of designing and writing code that minimizes coupling and provides more flexibility.<p>What are some practices to follow when we do "not" use dependency injection (DI)? What arguments exist for not using DI in these scenarios?
When the added costs of integrating a DI framework (added complexity, getting the rest of the team up to speed, and debugging problems directly related to the framework) do not provide sufficient value. This could be the case for smaller projects, or for a system whose classes have no more than one or two dependencies each.<p>You can always get away with "sinful hacks" (in the eyes of some DI evangelists) like wrapping a dependency in a getter/setter if you just want to inject a mock version for testing.
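<p>A minimal sketch of that setter approach, with hypothetical names (`SignupService`, `Mailer` are illustrative, not from any framework): the class constructs its real dependency by default, and a test overwrites it through the setter with a fake.

```java
// Hypothetical single-method dependency; a lambda can stand in as a mock.
interface Mailer {
    void send(String to, String body);
}

// Real implementation used in production.
class SmtpMailer implements Mailer {
    public void send(String to, String body) {
        System.out.println("SMTP send to " + to);
    }
}

class SignupService {
    // Sensible production default: no framework wiring required.
    private Mailer mailer = new SmtpMailer();

    // The "sinful hack": this setter exists mainly so a test can
    // swap in a mock without any DI container.
    void setMailer(Mailer mailer) {
        this.mailer = mailer;
    }

    void register(String email) {
        // ... persist the user, then notify ...
        mailer.send(email, "Welcome!");
    }
}

public class Main {
    public static void main(String[] args) {
        // In a test, replace the real mailer with a recording fake:
        StringBuilder log = new StringBuilder();
        SignupService service = new SignupService();
        service.setMailer((to, body) -> log.append("sent:").append(to));
        service.register("alice@example.com");
        System.out.println(log); // prints "sent:alice@example.com"
    }
}
```

No container, no annotations: the trade-off is that the seam is visible in the class's public API, which is exactly what some DI purists object to.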
This post has some pointers on this:<p><a href="http://tutorials.jenkov.com/dependency-injection/when-to-use-dependency-injection.html" rel="nofollow">http://tutorials.jenkov.com/dependency-injection/when-to-use...</a>