In today's software ecosystem, our software tends to treat us as statistics of a user community. It forces us to fit a set of assumptions and constraints. We are, to be blunt, managed by the constraints and assumptions of our software and, to be perfectly honest, our software sucks. Sometimes we get software that is designed for the average user of our profession, but even in these rare cases the software still sucks. If you don't want to be managed by your software, you must deal with the complexity of writing your own. This is Linux and GNU in a nutshell: if you don't like it, change it yourself. Nonetheless, most of us don't want to live with the tedium of constantly managing our user interface, so we throw up our hands and give in to being managed by our user interfaces. There appears to be no other way.
There are significant cognitive differences between individuals. The software developers and computer systems we build usually ignore these differences, treating individuals as population metrics. The resulting designs may be optimal for the 'average' use case, but they are far from optimal for anyone who doesn't fall near the mean. Human factors testing and human-machine usability evaluation are neither simple nor easy, and the distributions involved are not neat little normal curves. Just because you can get 'decent' performance from a test user population doesn't mean that your solution is optimal for anyone, not even the individuals in your test population.
The software of the future must be better than "optimal on average". Don't just test for a user community; test for individual users. Test for the weird users. Do the opposite of what human factors tells us: test and design for the outliers. We need to begin asking ourselves harder questions about what makes an effective user interface. How can we write software that adapts to individual cognitive characteristics? How can we ensure that, instead of software that functions in spite of user differences, we create software that leverages those differences? In essence, we need adaptive user interfaces, not a plethora of adaptably confusing configuration options. This means that our computers need to start paying attention to us; learning us. We are beyond the point where "learning the machine" is even remotely practical. There are simply too many possibilities for average junk, but there is only one "me" and one "you".
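The idea of an interface that "learns us" can be made concrete with a toy sketch. Assuming nothing more than a per-user record of selections, a menu can drift away from the one-size-fits-all default toward each individual's actual habits. The AdaptiveMenu class, its items, and its frequency-based policy below are illustrative assumptions, not any real toolkit's API:

```python
from collections import Counter


class AdaptiveMenu:
    """Toy adaptive menu that reorders items toward one user's habits.

    Hypothetical sketch: a real adaptive interface would model far more
    than click counts, but the principle is the same -- observe the
    individual, then adapt the presentation to that individual.
    """

    def __init__(self, items):
        self.items = list(items)   # the default "average user" ordering
        self.usage = Counter()     # this user's per-item selection counts

    def select(self, item):
        # Record one observation of this particular user's behavior.
        if item not in self.items:
            raise ValueError(f"unknown item: {item}")
        self.usage[item] += 1

    def render(self):
        # Most-used items float to the top; ties keep the default order
        # because Python's sort is stable.
        return sorted(self.items, key=lambda i: -self.usage[i])


menu = AdaptiveMenu(["Open", "Save", "Export", "Print"])
for _ in range(3):
    menu.select("Export")
menu.select("Print")
print(menu.render())  # ['Export', 'Print', 'Open', 'Save']
```

The point of the sketch is the direction of adaptation: the software changes to fit the user, rather than offering the user a configuration dialog and calling it a day.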
The last decade has seen remarkable developments in user interface design, including some of the best "optimal on average" interfaces that have ever existed (think iPhone/iPad). Nonetheless, there is a wealth of opportunities to improve user interfaces. I know this because, although we all have different cognitive efficiencies and deficiencies, we are all still using essentially the same small set of user interfaces.
Until user interfaces learn my individual cognitive characteristics and adapt themselves to be optimally suited to me, my mantra to my fellow computer users will stay the same.
"Never blame yourself. Always blame the machine."