Yet another distinction is the UNIX hacker's focus on constant-time optimisations as opposed to the Lisp hacker's focus on complexity analysis.
I don't recall ever reading about this quality before, and yet I'm a Lisp hacker it describes well. Where I can, I focus on the time and space complexity of my programs, in addition to the more concrete qualities, rather than merely testing and measuring them on my few machines, and this usually leads to good results; I tend to do this in every language I use, because it seems the most reasonable method. Am I to understand UNIX hackers try to write constant-time operations with fallbacks, which of course has the mentioned disadvantage that the fallbacks can cascade and cause unacceptable performance? In plenty of instances, I recognize the opportunity to add a special case, but weigh it against the added code complexity and the likelihood of it ever being triggered, which usually has me leave the fine general code alone.
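Roughly the shape I picture, as a small Common Lisp sketch of my own invention (the function and its name are illustrative, not taken from any particular program): a constant-time special case sitting in front of a linear-time general path, and the way the fallback can quietly dominate.

```lisp
;;; Illustrative only: a constant-time special case with a linear fallback.
(defun final-element (sequence)
  "Return the last element of SEQUENCE, or NIL when it is empty."
  (etypecase sequence
    (vector (when (plusp (length sequence))
              (aref sequence (1- (length sequence))))) ; O(1) special case
    (list (car (last sequence)))))                     ; O(n) general fallback

;;; The cascade: a caller who assumes the O(1) path and calls this in a
;;; loop over a growing list turns an O(n) job into an O(n^2) one.
```

Whether the vector branch earns its keep depends on how often callers actually pass vectors; if they rarely do, the general list path alone would be simpler and no slower in practice.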