But get data structures and algorithms right
Far too many codes are developed according to the developer's first idea:
the author blindly goes ahead with the first approach that probably works,
without stopping to think about what data structures, or what algorithms, are appropriate.
For modern (deeply pipelined) processors, there are two primary performance inhibitors: dependencies and memory access.
In particular, deeply nested logic and deeply linked data structures carry big costs, as do huge data structures.
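To make the dependency/memory point concrete, here is a minimal C sketch (purely illustrative; the names and structures are mine, not from any of the codes discussed): summing a linked list serializes on a load-to-load dependency chain and chases pointers through scattered memory, while summing a contiguous array gives the hardware independent, prefetchable loads.

    #include <stddef.h>

    /* Pointer chasing: each load depends on the previous node's
       'next' field, so the pipeline stalls on every link, and the
       nodes may be scattered across memory. */
    struct node { double val; struct node *next; };

    double sum_list(const struct node *p) {
        double s = 0.0;
        for (; p != NULL; p = p->next)  /* serialized dependency chain */
            s += p->val;
        return s;
    }

    /* Contiguous array: the loads are independent and predictable,
       so the processor can pipeline and prefetch them. */
    double sum_array(const double *a, size_t n) {
        double s = 0.0;
        for (size_t i = 0; i < n; i++)
            s += a[i];
        return s;
    }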
I just finished rewriting an environmental-simulation code, which (for a primary operational case) went from a working-set
size of 3.4 GB to 640 MB, and from a run-time of 130 CPU-minutes/simulation-day to 7 CPU-minutes/simulation-day. The primary optimization was the elimination of several huge scratch arrays (with their associated memory traffic), and the replacement of "dumb" run-time searches with setup-time sparse matrix construction. *Not* rocket science.
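The flavor of that last optimization, as a hedged sketch in C (the struct and names here are hypothetical, not the actual code): rather than searching for contributing cells at every timestep, record the (target, source, weight) pattern once at setup, in compressed-sparse-row form; every subsequent step then reduces to a sparse matrix-vector multiply with no searching at all.

    #include <stddef.h>

    /* Built once at setup time: which sources affect each target,
       and with what coupling weight, stored in CSR form. */
    typedef struct {
        size_t  nrows;
        size_t *rowptr;  /* nrows+1 offsets into colind[] and coef[] */
        size_t *colind;  /* source index of each nonzero             */
        double *coef;    /* coupling/interpolation weight            */
    } sparse_t;

    /* Applied every timestep: stream through the precomputed
       pattern, an ordinary sparse matrix-vector multiply,
       with no run-time searching. */
    void sparse_apply(const sparse_t *m, const double *src, double *dst)
    {
        for (size_t i = 0; i < m->nrows; i++) {
            double s = 0.0;
            for (size_t k = m->rowptr[i]; k < m->rowptr[i + 1]; k++)
                s += m->coef[k] * src[m->colind[k]];
            dst[i] = s;
        }
    }

The setup cost is paid once; the per-step cost drops from a search per target to a handful of multiply-adds per nonzero.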
In another example, fifteen years ago, a code I worked on went from a more extreme 12 Cray-hours to 163 SPARC2-seconds
when dumb array searches were replaced with sparse matrix arithmetic.
And in both these cases, the results are simpler, clearer, and more maintainable than the originals.
"Think (alternatives for data structures and algorithms) before you code" should be Principle Zero.