True - but
Optimization done carefully can often yield better than order-of-magnitude improvements. Many years ago (on Oracle 7!!!) we had an isolated batch job that had to do calculations over a 3-day window of a database holding several years of data. The table's primary index included the date and time towards the end of the index definition. The initial implementation did selects and joins against this large table, and it ran like a one-legged dog: over 6 hours per customer, with over 200 customers to process.
Making a private copy of the main table containing only the desired date range (with the same indexes as the main table) and querying that instead reduced the run time to under 5 minutes per customer.
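A minimal sketch of that trick, using SQLite and an invented schema (the original system was Oracle 7, and the table and column names here are hypothetical): copy just the date window into a private working table, rebuild the same index on the copy, and run the batch queries against it.

```python
import sqlite3

# Hypothetical schema standing in for the original multi-year table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (cust_id INTEGER, ts TEXT, amount REAL)")
conn.execute("CREATE INDEX idx_txns ON txns (cust_id, ts)")
rows = [(c, f"2024-01-{d:02d}", 10.0) for c in range(3) for d in range(1, 31)]
conn.executemany("INSERT INTO txns VALUES (?, ?, ?)", rows)

# Copy only the 3-day window into a private working table...
conn.execute(
    "CREATE TABLE work AS SELECT * FROM txns "
    "WHERE ts BETWEEN '2024-01-10' AND '2024-01-12'"
)
# ...and give the small copy the same index as the big table.
conn.execute("CREATE INDEX idx_work ON work (cust_id, ts)")

# The batch calculations now scan days of data instead of years.
total = conn.execute(
    "SELECT SUM(amount) FROM work WHERE cust_id = 1"
).fetchone()[0]
```

The win comes from the working set shrinking to a few days of rows, so even full scans and joins against the copy are cheap.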
My rules of optimization:
1) Is the system fast enough as it is? If so, do not optimize.
2) Would an affordable hardware improvement make it fast enough? If so, upgrade the hardware and leave the working software alone.
3) If you decide that optimization is necessary, start by instrumenting the system to find out where the bottlenecks are - there is a good chance that they are not where you thought.
4) If there are multiple bottlenecks, do not start optimizing - you probably need a system redesign first.
5) Give the optimization job to the best programmer you have available, and make sure that the sources have all the optimizations explained well enough that the system can still be supported if support is outsourced to a third world country.
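Rule 3 is the one people skip. A sketch of what "instrumenting" can look like at its simplest, using Python's built-in profiler on an invented workload (the function names are hypothetical, not from the original system):

```python
import cProfile
import io
import pstats

# Hypothetical batch job: two stages, one of which may be the real bottleneck.
def parse_rows(n):
    return [str(i).rjust(8, "0") for i in range(n)]

def summarise(rows):
    return sum(len(r) for r in rows)

def batch_job():
    rows = parse_rows(200_000)
    return summarise(rows)

profiler = cProfile.Profile()
profiler.enable()
batch_job()
profiler.disable()

# Sort by cumulative time to see where the job actually spends its time,
# rather than where you assumed it did.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
report = out.getvalue()
print(report)
```

Reading the report before touching any code is the point: the measured hot spot drives the optimization work, not intuition.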