Post Made Community Wiki by Michael Shaw
jhocking

I've found that the problem with premature optimization mostly happens when rewriting existing code to be faster. I can see how writing some convoluted optimization in the first place could be a problem, but mostly I see premature optimization rearing its ugly head when people go around fixing what ain't (known to be) broke.

And the worst example of this is whenever I see someone re-implementing features from a standard library. That is a major red flag. For example, I once saw someone implement custom routines for string manipulation because he was concerned that the built-in functions were too slow.
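As a hypothetical illustration of that kind of re-implementation (not the actual code I saw), here's a hand-rolled string join in Python next to the built-in it duplicates. The custom version is longer, easier to get wrong, and repeated `+=` concatenation can even be slower than `str.join` on large inputs:

```python
def join_strings(parts, sep):
    # Re-implements what str.join already does, with a classic
    # pitfall: each += may copy the growing result, so this can
    # degrade to quadratic time on long lists.
    result = ""
    for i, part in enumerate(parts):
        if i > 0:
            result += sep
        result += part
    return result

parts = ["red", "green", "blue"]
# Identical output, more code to maintain:
assert join_strings(parts, ", ") == ", ".join(parts)
```

The built-in is a single, well-tested expression; the custom routine adds a dozen lines of code that every future reader has to verify does the same thing.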

This results in code that is harder to understand (bad) and burns a lot of time on work that probably isn't useful (bad).