Often repeated, seldom explained, and most of the time misunderstood: premature optimization.
With CPU power, RAM, and SSDs, most of what I learned when I started in software development seems obsolete. Our machines have all of that in abundance, and so do our servers. Why worry about an integer field in a database being long instead of short? Why think about reducing protocol overhead to cut transmission time by a mere 50 ms? Why think about persisting a text replace instead of doing a regular-expression search & replace over 200 strings on the client, via JavaScript?
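To make the last of those questions concrete, here is a minimal sketch in TypeScript (the replacement pattern and function names are invented for illustration): the same regex replace, paid for on every client render versus once at write time.

```typescript
// Illustrative only: the pattern and the names are hypothetical.

// Client-side approach: every visitor pays for the regex replace
// over all 200 strings on every single render.
function renderNaive(strings: string[]): string[] {
  return strings.map(s => s.replace(/\bcolour\b/g, "color"));
}

// Persisted approach: run the replace once when the content is saved;
// clients only ever read the already-corrected strings.
function persistCorrected(raw: string[]): string[] {
  return raw.map(s => s.replace(/\bcolour\b/g, "color"));
}
```

The work is identical; the multiplier is not. The naive version runs the replace strings × renders × visitors times, the persisted one exactly once.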
Let me tell you: this knowledge is not obsolete. There is no such thing as “too early” when you are optimizing the right parts of your code or architecture. Doing so is important, and it is prudent. Most of the time, it is even necessary.
In the heat of the crunch, with the dreaded deadline inching ever closer and tickets, stories, or stickies piling up, everyone is excused for not producing good, sustainable code. It had to be done fast, and the cost of speed is almost always quality, performance included.
Good code, though, is not only more readable, better documented, and more thoroughly tested; it also performs better than its peers. Never believe anyone who says otherwise: they do not write good code and cannot teach you how to write it.
Never trust anyone who dismisses optimizing code because “the user base isn’t that big” or “we’ll care about scale later”. You will end up with technical debt that costs you far more than you saved. And never trust anyone who cites Knuth but doesn’t know what Knuth writes about “the critical 3%” one sentence later: “We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.”
Optimization is about a deep understanding of your surroundings. Without knowing what is going on, you have no idea whether a change will make any difference at all. Understand the problem domain, explore the context, profile your code, and optimize the critical parts.
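As a minimal sketch of that last step, assuming a Node.js environment (the helper and the two workloads are invented for illustration): measure the candidates first, then spend your effort only where the numbers point.

```typescript
import { performance } from "node:perf_hooks";

// Minimal timing helper: run a function once and report the elapsed time.
function timeIt<T>(label: string, fn: () => T): T {
  const start = performance.now();
  const result = fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(2)} ms`);
  return result;
}

// Compare two candidate hot spots before deciding where to optimize.
const payload = JSON.stringify(
  Array.from({ length: 100_000 }, (_, i) => ({ id: i, name: `item-${i}` }))
);
timeIt("JSON.parse", () => JSON.parse(payload));
timeIt("regex scan", () => payload.match(/"id":\d+/g));
```

A one-off timer like this is only a stand-in: for real work, reach for a proper profiler such as `node --prof` or your browser’s performance tools, and let the measurements pick the critical parts for you.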