Some more tech common sense
"it's better to USE any unused memory rather than let it sit there in your system going to waste."
That's sound thinking in a simple single-tasking OS. Or maybe in a phone OS with one dominant foreground task.
But in a modern multitasking desktop OS, an application cannot really know which memory is truly unused: other applications, and the OS's own file cache, are competing for the same RAM. Nor do these OSes have any protocol for telling applications "could you please give me a bit of memory back, if you don't mind?". Applications that presume to own all memory make for a horrible user experience: try to switch to another application, or even open the desktop "start" menu or equivalent, and it's swappety-swap for a while.
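Since the kernel never asks, the cooperation has to come from the application's side. As a minimal sketch (Linux-specific, and the 64 MiB cache and its usage are invented for the example), a program holding a big internal cache can use madvise() to hand cold pages back to the kernel on its own initiative:

```c
/* Minimal sketch (Linux-specific): an application that keeps a large
 * internal cache can hand the pages back to the kernel once the cache
 * goes cold, instead of squatting on them until exit.
 * The cache size and usage here are made up for illustration. */
#define _DEFAULT_SOURCE
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

#define CACHE_BYTES (64 * 1024 * 1024)  /* hypothetical 64 MiB cache */

int main(void)
{
    /* An anonymous mmap gives page-aligned memory we can release later. */
    char *cache = mmap(NULL, CACHE_BYTES, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (cache == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    memset(cache, 0xAB, CACHE_BYTES);   /* ...use the cache for a while... */

    /* The cache has gone cold: tell the kernel it may reclaim the pages. */
    if (madvise(cache, CACHE_BYTES, MADV_DONTNEED) != 0)
        perror("madvise");

    return 0;
}
```

The mapping stays valid after the call; for an anonymous private mapping the pages simply come back zero-filled if touched again, so the application trades a bit of recomputation for not hogging RAM it isn't using.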
Also, like the Mozilla guys note, shrinking the memory footprint can improve performance even when swap is not a problem: more of the data stays hot in the processor cache. There is nowadays a very large speed difference between the two cases: a cache hit costs a few processor cycles, while a trip all the way out to main memory costs hundreds.
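To see that effect in isolation, here is a minimal sketch (the sizes and the helper name sweep are arbitrary choices, and the exact ratio depends on the hardware) that performs the same number of memory accesses over a small working set and a large one:

```c
/* Minimal sketch: touch the same number of bytes in total, but with two
 * different working-set sizes. The small set stays hot in the CPU cache;
 * the large one keeps going back to main memory. Sizes are arbitrary. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double sweep(volatile char *buf, size_t set_size, size_t total)
{
    struct timespec t0, t1;
    unsigned long sum = 0;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < total; i++)
        sum += buf[i % set_size];       /* wraps within the working set */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    (void)sum;                          /* keep the loop from being elided */
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void)
{
    size_t small = 64 * 1024;           /* comfortably fits in cache */
    size_t large = 256 * 1024 * 1024;   /* far bigger than any cache */
    size_t total = 1UL << 30;           /* same access count both times */
    char *buf = malloc(large);
    if (!buf)
        return 1;
    for (size_t i = 0; i < large; i++)
        buf[i] = (char)i;

    printf("small working set: %.2f s\n", sweep(buf, small, total));
    printf("large working set: %.2f s\n", sweep(buf, large, total));
    free(buf);
    return 0;
}
```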
Yes, applications should allocate more memory when they can put it to good use, and no, they may not waste it as if it were a free resource. Most of what the Mozilla developers did was hunting down memory leaks and other cases where memory stayed allocated for no good reason, and that is the kind of cleanup every good software engineer should do from time to time.
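For a flavor of what such a hunt turns up, here is a minimal sketch of one classic leak pattern, an error path that skips the free(); all names are invented for the example, and in real code one would typically find this with a tool like valgrind --leak-check=full:

```c
/* Minimal sketch of a classic leak: an early return that skips free().
 * The function and buffer size are invented for the example. */
#include <stdio.h>
#include <stdlib.h>

static char *read_line(FILE *f)
{
    char *buf = malloc(4096);
    if (!buf)
        return NULL;
    if (!fgets(buf, 4096, f)) {
        free(buf);      /* the fix: without this, every failed read leaks 4 KiB */
        return NULL;
    }
    return buf;
}

int main(void)
{
    char *line;
    while ((line = read_line(stdin)) != NULL) {
        fputs(line, stdout);
        free(line);     /* the caller owns the buffer and must release it */
    }
    return 0;
}
```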