Many years ago at Cornell, Richard Feynman taught a course called Mathematical Methods of Physics. Notes from those lectures grew into a book of the same title by Jon Mathews and Robert Walker at Caltech. That book is a starter toolkit for graduate students and journeyman physicists. It covers differential and integral equations, probability and statistics, matrix and tensor algebra, series expansions, and a host of other topics. The subject is mathematics, but it is aimed squarely at real day-to-day problems, without a lot of arcane derivations or unnecessary footnotes. As Mathews once said when teaching from the book, "My goal is to show you a lot of tricks, so that when we're done you'll be a tricky person!" Those "tricks" were really a repertoire of thinking tools for physicists.
In a similar fashion, Z. A. Melzak of the University of British Columbia has written several splendid books that explore the connections and interactions among techniques from pure mathematics, but with a concrete focus on potential applications to engineering and science. Melzak also has a more general and philosophical book, Bypasses, concentrating on ideas that radiate out from the concept of similarity transformations. In practical terms, one doesn't have to crash through a brick wall; one can tunnel down, move forward, and then come back up in order to get to the other side. The same concept applies to solving Rubik's Cube, or coordinate changes, or quantum mechanical measurements. If a problem is too hard when viewed from one aspect, think about transforming it into a universe where it becomes simple, solving it there, and then inverting the original transformation to get back to the real world. Melzak's musings about the implications of such "bypasses" may point the direction to new thinking tools.
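The transform-solve-invert pattern can be written down directly. As a minimal sketch (the `bypass` helper and the log-multiplication example are illustrative choices, not from Melzak's book), here is the similarity-transformation idea in miniature: multiplication is "hard," but in log-space it becomes addition, so we tunnel down with `log`, walk forward with `+`, and come back up with `exp`:

```python
import math

def bypass(transform, operate, invert):
    """Solve a problem by mapping it into an easier domain:
    apply `transform` to each input, do the easy operation there,
    then `invert` to return to the original domain."""
    return lambda *args: invert(operate(*(transform(a) for a in args)))

# Multiplication via the logarithm bypass: exp(log(x) + log(y)) == x * y.
multiply = bypass(math.log, lambda x, y: x + y, math.exp)

# multiply(6.0, 7.0) ≈ 42.0 (up to floating-point roundoff)
```

The same three-step shape covers the other examples in the text: conjugating a Rubik's Cube move sequence, changing coordinates before integrating, or diagonalizing a matrix before exponentiating it.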
Polya long ago wrote How To Solve It, a road map of tools for thinking through mathematical problems. Tufte of Yale is famous for The Visual Display of Quantitative Information and its sequels, which collect and analyze a multitude of ways to think with data, all designed to make the patterns in those data emerge with stunning clarity. Abelson and Sussman wrote the freshman MIT computer science text Structure and Interpretation of Computer Programs. Their book aims to provide students with an arsenal of tools for thinking about managing complexity, creating interfaces, isolating unknowns, and designing computational systems.
Moving from books to the computer software world, a number of candidate thinking environments have emerged within the past few decades. Among the more noteworthy are:
- the UNIX operating system — built on a metaphor of pipes that join together programs, each of which filters or transforms data;
- the spreadsheet (VisiCalc et seq.) — built on a metaphor of intercommunicating cells which hold formulae and values;
- the Apple HyperCard "software erector set" — built on a metaphor of stacks of cards, with dynamic backgrounds, buttons, and fields;
- the GNU Emacs editor — built on a metaphor of windows into files, with an integrated Lisp extension language;
- the PROLOG logic programming system — built on a metaphor of Horn clauses, resolution, and unification, a foundation of mathematical logic that turns conventional computer programming into theorem-proving.
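The pipe metaphor in particular is easy to make concrete. As a rough sketch (the `pipeline` helper is an illustrative assumption, not part of any of these systems), here is the UNIX idea of small filters joined end to end, written as composable stages over an iterable:

```python
def pipeline(*stages):
    """Compose filters left to right, UNIX-pipe style: each stage
    consumes a stream of items and yields transformed items."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

# Roughly the shell pipeline:  grep p | tr a-z A-Z | sort
words = ["pipe", "filter", "program", "cell"]
shout_ps = pipeline(
    lambda xs: (x for x in xs if "p" in x),   # keep lines matching "p"
    lambda xs: (x.upper() for x in xs),       # upcase each line
    sorted,                                   # sort the stream
)
print(shout_ps(words))  # → ['PIPE', 'PROGRAM']
```

Each stage knows nothing about its neighbors; the power comes from the uniform stream interface between them, just as in the shell.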
All of these environments share one or more of the characteristics of extensibility, responsiveness, metaphorical richness, and foundational power.
Numerous other candidate thinking tools exist. Jeff Conklin of GDSS (formerly at MCC) has long worked on argumentation theory and has developed tools to structure and facilitate group decision-making (using computers, whiteboards, paper, or other media). Simulation and modeling can provide great insight, especially when used to generate alternative-futures scenarios to explore and provoke thought. Game theory often features prominently in studies of situations where actors have competing goals and must resolve their conflicts. Computer tools for the analysis of linkages among discrete events can reveal patterns of large-scale order which are invisible to the unaided observer, as can statistical tools that work to cluster and correlate data sets.
On a broader conceptual level, thinking tools are needed to capture the process of cognition — the Why of a conclusion, and the How it was reached, not just the What of a proposed "answer" — so that researchers can collaborate more effectively on complex tasks and can create a "corporate memory" for their successors to build upon. Thinking tools work on the boundary between knowledge and wisdom; they aim to help reveal meaning in mountains of information.
Thursday, April 08, 1999 at 21:46:30 (EDT) = 1999-04-08
TopicThinking - TopicProgramming
(correlates: BooksToConsider, CommonCompSciSense, AppliedBypasses, ...)