- Tuesday, November 23, 1999 at 06:07:32 (EST)
If the numbers are too big to be grasped, divide by the number of people involved and think on a per capita basis. If low-probability but high-impact events are potential scenarios, compare weighted benefits to costs. But don't just study part of the system; don't fall for the wizard's patter and "pay no attention to the man behind the curtain". Look at the whole picture.
Many years ago, Henry Hazlitt's Economics in One Lesson described good economics as simply the understanding of all effects, on all people, of human actions --- not just the local consequences, or the results for a short interval of time, or the benefits to a privileged few. Hazlitt called for the analysis of what didn't happen --- for the possible futures that were cut off by a bad policy choice, for the jobs that never came into existence, and for the money that wasn't spent on one thing because it was forced to go pay for another.
The principle applies far beyond economic activity. For any decision, what are the large-scale and long-term consequences? Do we feel good today at the expense of our grandchildren, or by taking resources from people on the other side of the globe, or via ignoring the risk of disaster in an (apparently) unlikely outcome? Beware! Particularly take care when the proponents of a choice are those who will profit directly, while the costs are diffuse and spread widely over people who may never be aware of what they are missing or the source of their loss.
We can't all become wealthy by picking each other's pockets, or through selling promises to one another. Once in a while, somebody must work, think, create, build, teach, learn, grow....
- Monday, November 22, 1999 at 06:37:10 (EST)
- Sunday, November 21, 1999 at 12:34:25 (EST)
So the larger bodies get, the smaller their surface-to-mass ratio. Little creatures are "all skin" --- so they can get away without the elaborate internal structures that big animals need to circulate oxygen and nutrients. A lump of coal isn't very reactive, but grind it into dust and the surface area becomes huge --- now it can burn quickly, maybe even explode. A baby gets chilled or overheated much faster than an adult; the same holds for asteroids and planets and stars.
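The surface-to-mass scaling is easy to verify numerically; here is a tiny Python sketch for spheres (uniform density is assumed, so volume stands in for mass, and the radii are purely illustrative):

```python
# Surface-to-volume ratio of a sphere is 3/r: it shrinks as the body grows.
# A toy illustration of the square-cube law described above.
import math

def surface_to_volume(radius):
    """Ratio of surface area (4*pi*r^2) to volume ((4/3)*pi*r^3)."""
    area = 4 * math.pi * radius ** 2
    volume = (4 / 3) * math.pi * radius ** 3
    return area / volume

for r in [0.1, 1.0, 10.0]:
    print(f"radius {r:5.1f}: surface/volume = {surface_to_volume(r):.2f}")
```

Small radius, huge ratio: the coal-dust and baby examples in one formula.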
To make something change, think surface and think small; to preserve something, think mass and think big.
- Saturday, November 20, 1999 at 08:04:25 (EST)
Knowledge removes fear and reveals what can and can't be done. There's no need for magic incantations or divine intervention. Planes keep flying thanks to the equations of hydrodynamics; medicines work through straightforward molecular interactions; processors do as they're told because of their circuitry and the data structures and algorithms of the code that they're executing. When things begin to fail, no amount of wishing (or panic) will change matters; reason grounded in an understanding of the situation, however, may help.
- Friday, November 19, 1999 at 21:08:37 (EST)
But there's a Big Secret from physics that applies to life: everything is local. Planets don't need to know about the Sun --- they just respond to the local curvature of spacetime and take the shortest path available. ("Matter tells space how to curve; space tells matter how to move" --- locally.) Laws and edicts don't matter if nobody here bothers to enforce them. ("All politics is local.") Electromagnetic waves ripple through space and induce voltages only when they reach an antenna. My sainted Mother's control over my actions comes through my memory of her and her teachings.
Influences certainly propagate from point to point. Gravity and electromagnetism do their magic via fields (and/or via quantum mechanical particles). People give and take orders, and we share ideas as we interact with one another. Our remembrance of the past carries concepts into the future; written records speak to us from centuries ago. But look more closely, and it's all local --- moment by moment, person to person, one baby step after another.
- Thursday, November 18, 1999 at 05:37:46 (EST)
The basic design principle of a kenning is the subtle suggestion of an analogy, by an outline of its shadow or a sketch of its edges. A ship crosses the sea as a stallion crosses the land --- that is, ship is to sea as stallion is to land, or more compactly "ship : sea :: stallion : land". So sea-stallion implies ship. Inverting the equation in various ways, land-ship means stallion, a stallion's sea is the land, and a ship's land is the sea. Some patterns work better than others. If "cow : grass :: car : gasoline", then grass is a cow's gasoline. (Saying that gasoline is a car's grass is more obscure. Grass has too many diverse meanings; gasoline is relatively specific.)
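The inversions above are mechanical enough to sketch in code; a toy Python function (the analogy terms are just the ship/sea/stallion/land example from the text, and the four patterns are the ones spelled out above):

```python
# Toy kenning generator: from an analogy a : b :: c : d,
# derive the compound metaphors by the inversions described above.

def kennings(a, b, c, d):
    """Given a : b :: c : d, return the four derived kennings."""
    return {
        f"{b}-{c}": a,         # sea-stallion implies ship
        f"{d}-{a}": c,         # land-ship means stallion
        f"a {c}'s {b}": d,     # a stallion's sea is the land
        f"a {a}'s {d}": b,     # a ship's land is the sea
    }

print(kennings("ship", "sea", "stallion", "land"))
```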
Besides being fun in themselves, kennings may suggest escape routes from conventionality in diverse circumstances. For instance, inverting the "turtle-house" pattern yields a cute term for a stay-at-home person: house-turtle!
- Wednesday, November 17, 1999 at 06:38:31 (EST)
There's a concept called engineering reserve, an advance budget for the unanticipated (and unanticipatable). How big a reserve to keep depends on how risk-averse one is, as well as how serious the consequences of failure may be. An engineering reserve is like an insurance policy: one hopes never to need it.
- Monday, November 15, 1999 at 21:44:02 (EST)
- Sunday, November 14, 1999 at 09:29:08 (EST)
People argue that you cannot achieve power by adding feature upon feature to a design. Instead, they say, you move forward by removing unnecessary elements, ruthlessly simplifying structures, cleaving off flaws, and polishing facets. The result is what mathematicians call completeness and orthogonality. A set is complete when it spans the space of all possibilities; its components are orthogonal when they are all independent of one another. In other words, the crystalline ideal does everything in a single pure way, with zero redundancy.
That's one approach to perfection. Perhaps Euclid did glimpse beauty bare via geometric logic. But the actual universe is complex! The information content of a crystal is tiny; every atom's location is entirely determined by the arrangement of a single unit cell. A ball of mud, on the other hand, seethes with life from the microscopic level on up. Mud sticks together and is resilient; it doesn't shatter with a tap at the wrong angle; it's unpredictable, full of surprises. Mud has a high information content. It's the place to find beauty, and power, in the real world.
- Saturday, November 13, 1999 at 10:40:18 (EST)
- Friday, November 12, 1999 at 15:27:59 (EST)
PROLOG has two foundations: unification and resolution. Unification is a system for filling in slots in formulæ; resolution is a system for making deductions from a list of facts. Both are simple, almost obvious --- but of course, figuring out how to express the obvious in a correct and useful way is utterly nontrivial.
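Unification really is almost obvious once written down; here is a minimal Python sketch (the term representation is an assumption for illustration — uppercase strings as variables, tuples as compound terms — and, like many real PROLOGs, it skips the occurs check):

```python
# A minimal sketch of PROLOG-style unification (resolution omitted).
# Convention: strings beginning with an uppercase letter are variables;
# tuples are compound terms; everything else is a constant.

def walk(term, subst):
    """Follow variable bindings to the term's current value."""
    while isinstance(term, str) and term[:1].isupper() and term in subst:
        term = subst[term]
    return term

def unify(x, y, subst=None):
    """Return an extended substitution, or None if unification fails."""
    subst = dict(subst or {})
    x, y = walk(x, subst), walk(y, subst)
    if x == y:
        return subst
    if isinstance(x, str) and x[:1].isupper():
        subst[x] = y
        return subst
    if isinstance(y, str) and y[:1].isupper():
        subst[y] = x
        return subst
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            subst = unify(xi, yi, subst)
            if subst is None:
                return None
        return subst
    return None

# Fill in the slot X so that likes(mary, X) matches likes(mary, wine):
print(unify(("likes", "mary", "X"), ("likes", "mary", "wine")))
```

A resolution engine then applies unification repeatedly, matching goals against facts and rules until the query succeeds or fails.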
Most computer languages are procedural: a programmer writes out all the steps needed to reach the answer, and tells the machine exactly what to do in what order. There may be choices along the way, branches to take or avoid depending on partial results, but the process to follow at every point is clear. PROLOG and a few other languages, in contrast, are declarative: a programmer describes a situation and then asks the computer to achieve a goal. The computer decides what path to take to get there. This is an extraordinarily powerful technique for solving many problems, since it doesn't demand that the human know all the details of how to get the answer in advance.
So why hasn't PROLOG taken over the world (of programming, that is)? Maybe logic is too clean, too pure, too abstract to apply to messy real-life tasks. Maybe the early implementations of PROLOG were too inefficient, mathematically sound but so slow as to be practically irrelevant to anything beyond toy problems. (The joke used to be that with PROLOG you could get something for nothing, but only if you were willing to wait forever to get it!)
But maybe PROLOG was just too radical a paradigm shift. People are used to telling others how to do their jobs; we constantly give and receive orders (albeit not always willingly) and we know how to follow recipes. It's quite a twist to say, "Here are the ingredients and the rules for their reactions with each other; now bake me a cake!" One must trust the other person or the computer to do the right thing, to discover which raw materials to use and how to combine them. That takes a leap of faith. And sometimes the leaper falls; there aren't any solutions, or at least no solutions within the given rules.
That happens a lot in everyday life. It's tough to entrust others with jobs without looking over their shoulders. It's tougher to forgive them when they fail because there was no way to succeed.
- Thursday, November 11, 1999 at 11:08:36 (EST)
- Wednesday, November 10, 1999 at 05:25:23 (EST)
It's weird enough already --- wavelike behavior of particles, and particle-like behavior of waves --- but then it really gets bizarre. Remember that the paths of charged bodies curve when they're in a magnetic field. (That's how the picture tube in a TV set focuses electrons and paints images on the screen.) So, pipe a magnetic field between the two holes in our barrier, but confine it carefully so that no magnetism leaks out. Then send electrons through the holes again. The electrons never go through any place where a magnet can affect them --- but the pattern of fringes they produce on the wall has nevertheless moved!
That's totally unexplainable by normal forces and fields. In fact, the effect (named for Aharonov and Bohm) doesn't even exist in the non-quantum world. Invisible interactions control the electrons --- interactions that have no classical counterpart. The details get averaged out when we take lots of particles or when we don't observe precisely enough. But the invisible forces are real, and particles respond to them.
The same thing happens in other contexts, if we pay close attention. Influences on a fine scale blur together and aren't apparent in the big picture. Nevertheless, they are real and significant. Nations may not seem to react to such forces, but individuals do. Money talks big, and everybody pays attention to it. Subtle factors --- such as morality, justice, and charity --- act unseen, on individuals, over long periods of time. They matter more.
- Monday, November 08, 1999 at 21:33:44 (EST)
Issues of meaning come from the use of symbols. Worry about meaning in turn leads to the root question of life --- the "Why?" that we wrestle with when it's quiet and we're alone late at night, when death stares us in the face, when we can't avoid serious thought by the distractions of work and play. What's the point of existence? Is all this just meaningless symbol manipulation? How can mind emerge from mere matter? Is there more to the universe, and if so, what is it?
- Sunday, November 07, 1999 at 18:23:39 (EST)
Then there's the focused method: "each one reach one", individual action. This is slower, but more plausible --- particularly when working toward a complex goal, something that requires free choice and deep thought to accept. It demands great patience, humor, and the ability to communicate on a personal level with those who disagree with one's ideas. And there's always the risk of counter-conversion, of being persuaded to give up and join the other side. (The missionary visits the pagan's house for a feast, and likes it too much to come home again.)
But there's an even better way: reflect --- look in the mirror. Discard shotgun and rifle; both will break the glass. Whom can one directly influence? Only oneself. What good does it do to work on oneself? Well, assuming that there's room for improvement (no nirvana yet) it would seem to be a good example for others (but don't think of them!) to see someone try to do rather than just admonish, advise, and critique. No proselytizing; when asked, however, it's OK to explain. And even if absolutely no one else notices, there are the hidden rewards of making the right choices, of striving against impossible odds, and of taking at least baby steps toward the goal.
Yes, it's frustrating beyond belief --- but working on oneself is paradoxically the only way to make any progress toward the hardest and most important ends in life ... to glimpse answers that can't be found in the back of any book ... and to really help other people.
(So I had better shut up now and get to work, eh?!)
- Saturday, November 06, 1999 at 05:52:19 (EST)
If you're lucky, after a few weeks you start to develop an unconscious model of the remote acquaintance. You catch yourself in silent conversations ... having gentle arguments ... organizing remarks to persuade and explain ... asking questions and responding in turn. Good! Write out what you can of the discussion and use it in your letters. The model of your friend will deepen. Having a rich internal dialogue helps clarify and improve thinking on any subject --- particularly on issues that are so delicate (or tough!) that they're impossible to verbalize with someone present.
- Friday, November 05, 1999 at 06:16:25 (EST)
People cluster, based on common backgrounds and interests. Someone who fortuitously lands near the center of a dominant group can more easily speak to (or for) the "masses", write a best-seller, perhaps get rich or win an election. But while cluster centroids represent large raw numbers of individuals, they do not stand for as big a volume of possibility-space as do people who live in the deserts between clusters. Those outliers are critical, though they're often overlooked. They probe the wilderness and pioneer new trails between zones of stability. When radical change threatens or disrupts established groups, the adventurers offer a chance for survival.
- Thursday, November 04, 1999 at 05:54:19 (EST)
But the result is an unintentional bias in the literature --- and that slants results. Researchers who are less than honest, or who observe a statistical fluctuation, get the headlines. "Expert witnesses" are paid to take one side of an argument, and cheerfully do so. The more articulate (or noisy, or photogenic) a pundit, the likelier that the media will favor that side of the technical debate.
So beware reported "breakthroughs" until they have been verified by neutral parties. Cold fusion? Polywater? A host of front-page "discoveries" in medicine, sociology, and psychology? Take with a grain of salt. Most are honest mistakes; some are deliberate frauds; a few will turn out to be important and are worth studying, after confirmation. Apply extra skepticism to proofs or disproofs of cherished beliefs, religious or social, where unconscious bias is greatest.
- Tuesday, November 02, 1999 at 21:02:58 (EST)
In simulations it turns out that a modest strategy called "Tit for Tat" is extraordinarily effective. Tit for Tat begins by acting cooperatively and thereafter just echoes back whatever the opposition did in the previous round. Tit for Tat is kind in that if the other player acts politely, so does it. Tit for Tat is responsive in that any attempt to take unfair advantage provokes an immediate retaliation. Tit for Tat is forgiving in that it doesn't hold grudges or remember past slights (beyond a single turn). And Tit for Tat is simple in that it is easily recognized and understood, in contrast to more sophisticated tactics which may confuse the other side.
Tit for Tat is not a bad strategy for real-world situations. Better in many contexts, however, is the more gracious "Tit for Two Tats": turn the other cheek to the first blow, reply only once to the next betrayal, and then try to be friends again. Tit for Two Tats thus adds pacifism to the list of qualities. An accidental misunderstanding (or a little noise in the system) won't trigger a never-ending feud.
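Both strategies are a few lines each; here is a minimal Python sketch of the iterated game (the 3/5/1/0 payoffs are the conventional prisoner's-dilemma values, and the always-defect opponent is an illustrative addition for contrast):

```python
# A minimal iterated prisoner's dilemma, with Tit for Tat and
# Tit for Two Tats as described above.

C, D = "cooperate", "defect"
PAYOFF = {(C, C): (3, 3), (C, D): (0, 5), (D, C): (5, 0), (D, D): (1, 1)}

def tit_for_tat(history):
    """Cooperate first; thereafter echo the opponent's previous move."""
    return history[-1] if history else C

def tit_for_two_tats(history):
    """Defect only after two consecutive defections by the opponent."""
    return D if history[-2:] == [D, D] else C

def always_defect(history):
    """A purely exploitative opponent, for comparison."""
    return D

def play(strategy_a, strategy_b, rounds=200):
    """Run the iterated game; return each side's total score."""
    hist_a, hist_b = [], []   # each side's record of the *other's* moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_a), strategy_b(hist_b)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print("TFT  vs TFT  :", play(tit_for_tat, tit_for_tat))
print("TFT2 vs TFT  :", play(tit_for_two_tats, tit_for_tat))
print("TFT  vs ALL-D:", play(tit_for_tat, always_defect))
```

Two cooperators rack up the maximum mutual score; against the pure defector, Tit for Tat loses the first round and then holds even.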
So game theory comes down in favor of some quite elementary virtues: kindness, responsiveness, forgiveness, simplicity, and pacifism. They make sense!
- Sunday, October 31, 1999 at 18:00:50 (EST)
The focus is always external. Somebody else changes, makes a discovery, provides a service, takes action. How odd ... and how rarely it works. Each of us can directly affect one person. Yet we spend vast amounts of time and money and personal energy on attempts at lobbying, advertising, promoting, haranguing, and indoctrinating. We close our eyes and turn away from our selves.
- Saturday, October 30, 1999 at 21:17:35 (EDT)
"Real-time" means that simple retrieval requests should be answered in less than a second. (Complex Boolean proximity searches can take a bit longer to set up and execute, but even there, speed is important.) "High-bandwidth" means that the person has to get information back from the computer in a useful form which can be evaluated quickly. It's not good enough to return a list of documents or files, each of which has to be paged through in order to find the few relevant items. "Free-text" means that the input stream can't be assumed to include any regular structure beyond words separated by delimiters (spaces, punctuation, etc.).
People need tools for the earliest stages of research, when they have to be able to browse and free-associate without even knowing what are the right questions to ask. Users have to be able to work with a database without intimate knowledge of the details of what's in it, since nobody will have time to read much of the accelerating flood of information. A big database in a particular area has to be a "corporate memory" for groups of scholars who are working on related topics. A new person has to be able to come in and do research without extensive training, and experienced users have to find the system transparent, so that it becomes an extension of their memories, not a barrier to getting at the data and thinking with it.
Good free-text IR tools coexist with other types of tools, such as structured databases which are appropriate for answering well-formulated queries at later phases of the research effort. There are four fundamental operations that a good real-time high-bandwidth large-scale free-text IR system must support:
See Free Text Information Retrieval Philosophy for notes on how one simple (and free!) IR system was implemented over a decade ago.
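The "free-text" requirement above --- no structure assumed beyond words between delimiters --- can be sketched with a toy inverted index in Python (the sample documents are invented; a real system adds proximity data, ranking, and far more):

```python
# A minimal sketch of free-text indexing: split on delimiters,
# build an inverted index, answer simple retrieval requests instantly.
import re
from collections import defaultdict

def tokenize(text):
    """Free-text: words between delimiters, nothing more assumed."""
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(docs):
    """Map each word to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in tokenize(text):
            index[word].add(doc_id)
    return index

docs = {
    1: "Gravity tells matter how to move.",
    2: "Matter tells space how to curve.",
    3: "All politics is local.",
}
index = build_index(docs)
print(index["matter"])                   # documents mentioning "matter"
print(index["matter"] & index["space"])  # a simple Boolean AND
```

Set intersection makes simple Boolean queries fast, which is the heart of the sub-second "real-time" goal.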
- Friday, October 29, 1999 at 21:15:14 (EDT)
What happened? The true average is built upon how much time we spent on each leg of the journey, not how far we went. Slow parts of the trip count for a lot more than fast segments. The same holds for supercomputer speeds: a multiprocessor machine that does trillions of operations per second in parallel parts of a computation, but only gets 1% of that pace on non-parallel code, won't really average anything near its maximum performance (except on special problems that are more than 99% parallel).
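The arithmetic is easy to check; here is a small Python sketch (the trip legs are invented for illustration, and the 99%-parallel figure is the one from the text --- this weighted-average formula is also known as Amdahl's law):

```python
# The true average speed weights each leg by time, not distance:
# total distance divided by total time.

def average_speed(legs):
    """legs: list of (distance, speed) pairs."""
    total_distance = sum(d for d, s in legs)
    total_time = sum(d / s for d, s in legs)
    return total_distance / total_time

# Drive 30 miles at 60 mph, then 30 miles at 20 mph:
# the average is 30 mph, not the naive (60 + 20) / 2 = 40.
print(average_speed([(30, 60), (30, 20)]))

def amdahl_speedup(parallel_fraction, processors):
    """Overall speedup when only part of the work parallelizes."""
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / processors)

# 99%-parallel code on 1000 processors: speedup only about 91x.
print(amdahl_speedup(0.99, 1000))
```

The slow leg dominates, exactly as the text argues.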
And the same holds for us. We focus on peak experiences --- vacations, final exams, weddings, births.... We forget about the long stretches before and after --- home, school, marriage, growing up.... But those "slow" parts of existence are where we spend most of our time. It's fine to remember the view from the summit; that doesn't mean that the climb and the descent are less important. They're critical. They add up to the stories of our lives.
- Wednesday, October 27, 1999 at 20:36:01 (EDT)
- Tuesday, October 26, 1999 at 05:47:59 (EDT)
That approach (the "Copenhagen Interpretation") makes the quantum world of fuzzy waves match up with the classical world of everyday life. It predicts the results of laboratory experiments with astounding accuracy. But it's profoundly unæsthetic and confusing. Why should QM stop working and wavefunctions suddenly collapse just because a macroscopic observer peeks in? And what is an observer, anyway? Can a machine make an observation, or does it take consciousness to clobber a wavefunction? How much consciousness? The legendary Schrödinger's Cat story highlights the paradoxes. Connect a kitten to a QM system so that the creature is killed with a probability tied to the evolution of a wavefunction. Does the kitty take on the utterly nonclassical attributes of the quantum world? Does it exist in a superposition of ghostly states until a human looks at it?
There's a radical alternative, a perspective that cuts through these knotty problems and still gives the same good answers to lab experiments. It's called the Many-Worlds Interpretation (or "Everett-Wheeler" after two physicists who developed it). From the Many-Worlds viewpoint, there is no classical universe --- everything is quantum mechanics, all the way from the tiniest particles to the largest clusters of galaxies. It's like the old model of the flat Earth as a disk resting on the back of a gigantic turtle, which in turn stands on another turtle, which stands on another, etc. "It's turtles all the way down!"
In the Many-Worlds Interpretation people have wavefunctions too, and our wavefunctions become correlated with one another as we interact. The same holds for electrons, atoms, molecules, light, and everything else. The cosmos is one big interlocking set of pieces, continuously and intimately connected. When you make a measurement of something, the correlations fit together so that my observations thereafter will line up with what you saw, to an extraordinarily high probability. No wavefunctions collapse; all possible events actually happen, simultaneously.
If we picture the course of history as a branching tree, with forks where decisions have been made, then all the branches are alive at once. These are the "many worlds". New worlds spring forth every moment, in vast profusion, as particles move and interact. QM equations keep track of the relationships among wavefunctions and prevent paradoxes or arguments among observers.
What does the Many-Worlds Interpretation mean for us personally? Everything that can happen, will happen, or has already happened. Big things (like people) still seem "classical"; no measurements see alternate universes. Schrödinger's cat is both dead and alive, and we all agree on which --- because we ourselves are in the same mix of states. Everything is bound together through complex numbers and probability amplitudes, linked in a single harmonious dance.
There's no need for mysticism; this is just the way the world works. It's simple, consistent, and beautiful. Check the equations, and get used to it!
- Sunday, October 24, 1999 at 19:12:06 (EDT)
So imagine how dangerous it would be for an animal to have access to its own code! Suppose people could decide to rewire their brain circuitry, to make improvements (or for a spasm of pleasure) ... and suppose that the mere act of thinking about it made it so? The human race would be extinct in no time. (Who among us would have survived adolescence?) Even the wisest of the wise would hesitate before doing irrevocable mods. "Measure twice, cut once" would seem utter recklessness; one would test and evaluate proposed mental changes thousands of ways to make sure there were no hidden side effects. Even then, the transition to a new state would be terrifying to undertake.
But stranger still: we can, and we do, reprogram our minds daily. We modify our code when we learn, particularly when picking up new skills or methods of problem-solving. Thank goodness (or Darwin), we're denied access to the depths of our own operating system. We can't flip bits or connect and disconnect neurons at will, lest we trigger hyper-epileptic seizures and suffer instant death. But we're constantly adjusting high-level priorities and procedures in our mental subroutines. We choose to watch TV, or read a newspaper, or practice the piano, or study Sanskrit ... and our choices act like flowing water, cutting channels, leaving our minds with increased or decreased potentials for further changes.
Some decisions leave us where we are --- or worse, box us in until few exits remain. When we choose well, new doors open.
- Saturday, October 23, 1999 at 17:20:45 (EDT)
But the same exaggerated contrast, in other circumstances, leads to the most striking and serious metaphors. Newton speaks of himself as a boy on the seashore, picking up pebbles while the vast ocean of truth lies undiscovered in front of him. People are turned to stone by terror; hearts soar like hawks from joy; bodies and minds melt together in passion. Civilizations crash down, and new ones arise from their ashes.
Why are some comparisons laughable, and others of utmost gravity? Is it the context of the discussion? The choice of subject matter? The careful control of metaphor, avoiding overuse? (Does purple prose arise simply from too great a density of similes?) How can writing stay within the circles of power, and not cross into ridiculousness?
If there were a simple formula, once it was discovered it would cease to work. Novelty is essential, as is diverse vocabulary and a conscious regard for vivid imagery. Self-criticism helps. So does a keen eye and a sharp pencil, poised to edit out misplaced or distracting turns of phrase. Above all, it's critical to empathize with the reader and to care deeply about one's subject. A sympathetic audience will sense and respect the author's sincerity, and will in return forgive many sins --- in exchange for a chance to learn, to witness, and to share true love.
- Thursday, October 21, 1999 at 21:37:00 (EDT)
- Tuesday, October 19, 1999 at 05:43:41 (EDT)
So it helps to look back every so often --- back to times when things were other than they are now. Changes that happen on generational timescales then become apparent. Some are undeniably for the better --- health, wealth, individual liberty, technological progress --- but others are shocking when we make the comparison with bygone days. Seeing those changes in sharp focus is the first step to understanding where we are, judging our situation, and then choosing the right path for action.
- Sunday, October 17, 1999 at 11:08:51 (EDT)
And then ... we die. Can it be that death is in some sense a transition to a yet larger context? If so, what kind? Or is this just a pleasant conceit? Do we delude ourselves to avoid the real issue ... of Why?
- Friday, October 15, 1999 at 22:04:15 (EDT)
For a successful society, it's not enough to have brilliant geniuses, dynamic balls of fire who shoot off ideas in all directions. You need moderators: people to slow down the frenzied conversation, so that new concepts can be absorbed, mulled over, improved, and then used to trigger further good ideas. Moderators aren't nay-sayers who fight against novelty; rather, they must work carefully not to kill innovation, to see the feasible components in even the most flakey proposals.
Being a moderator isn't an easy job, particularly in times of creative ferment when new notions --- in science, technology, the arts, and commerce --- form an avalanche, jostling and prompting each other in an explosive chain reaction. But moderation is critical in all things, not least in turning wild ideas into reality.
- Thursday, October 14, 1999 at 05:50:43 (EDT)
The wellspring of power is direct manipulation --- the metaphor that we're working with physical objects, real things themselves, when we're on a computer. Of course, we're not. Our actions are translated by the user interface into low-level commands: fetch those bytes, put them over there, set that flag, multiply these numbers. Thousands of atomic operations add up to files copied, displays refreshed, words moved, and sounds generated --- mid-level activities. Thousands of those, in turn, give us e-mail, web pages, music, and animation. So we press buttons, grab documents, drop them in the trash, copy/paste photographs, and do all the other top-level actions that serve our needs as people. We (most of us, anyway!) don't get our jollies by flipping bits. Direct manipulation is an illusion that clothes a deeper reality.
But in an important sense, we use the same illusion in real life. When we see a pin and pick it up, at a low level electromagnetic waves (photons) interact with free electrons in the pin, propagate through space, are absorbed by molecules, and cause further rearrangements of atoms. At a mid-level, cells in the retina absorb light, then trigger a cascade of neural firings which lead ultimately to conversion of chemical energy to motion. High-level, we recognize an object, decide to act, bend over, and grasp it. We ignore the curved-space gravitational forces that keep the pin on the ground; the solid-state physics that makes its atoms cohere; the biochemistry that fuels our bodies; the mechanics of bones and joints, tendons and muscles; and the psychology, philosophy, and mathematics of thinking and choosing.
So direct manipulation computer interfaces aren't such a conjurer's sleight-of-hand trick after all!
- Wednesday, October 13, 1999 at 07:49:48 (EDT)
Nowadays, everything is so messy! There are infinite shades of gray (plus all those darn pastels!). Our eyes somehow can't distinguish them any more. We grow old.
The real world isn't axiomatic, regardless of how much we once thought it was. Logic has limits; data are incomplete. Wisdom demands that we hesitate before passing judgment. Devout worship of rules doesn't work. Mercy is an essential part of justice --- complex, individual, case-by-case, and riddled with exceptions. There's a time for rigor, and a time for its opposite. Growing up is learning to be inconsistent.
- Tuesday, October 12, 1999 at 06:01:57 (EDT)
But how to avoid unconscious pretense in our own relationships? Honesty, tempered by kindness, is an essential starting point. Self-deprecating humor helps --- especially if it springs from genuine humility, based on a recognition of one's limitations. (Compared to the infinite, we're nothing, eh?!) It's wise not to assume that others have the same background and capabilities, barring evidence otherwise. Offer context and explanations; invite feedback and questions. Be strict with one's self, but cut others some slack. Above all, empathize. Strive to understand. Listen, first and last.
- Monday, October 11, 1999 at 15:06:21 (EDT)
The more we look at objects, the more obvious it is that they're all one, intimately bound via a spiderweb of relationships. "A is B" either says nothing (if it's just a relabeling) or something infinitely deep. (Or both!) Names give us power --- because a name is a link, a pointer, a first step into the hyperspatial network of ideas that is the universe.
- Sunday, October 10, 1999 at 15:51:57 (EDT)
- Saturday, October 09, 1999 at 20:38:37 (EDT)
"Having been brought up in a serf-owner's family, I entered active life, like all young men of my time, with a great deal of confidence in the necessity of commanding, ordering, scolding, punishing and the like. But when, at an early stage, I had to manage serious enterprises and to deal with [free] men, and when each mistake would lead at once to heavy consequences, I began to appreciate the difference between acting on the principle of command and discipline and acting on the principle of common understanding. The former works admirably in a military parade, but it is worth nothing where real life is concerned, and the aim can be achieved only through the severe effort of many converging wills."

--- Peter Kropotkin, from Memoirs of a Revolutionist
These words still resonate for us today, because they cut to the core issue of complexity management. How can people work together effectively on large multifaceted projects? How can true teamwork develop? --- rather than the mindless "I'll serve my time", the inertial "They pretend to pay us, we pretend to work", or the outright adversarial stance of "labor" vs. "management".
When problems are complex, root answers are simple but not easy. Members of a team need to communicate, empathize, offer ideas, and above all listen. They must build systematic mental models together on many levels, from immediate individual tasks to long-range global goals. They have to know not just what to do but why. Then when something goes astray --- as inevitably happens! --- they can diagnose and fix local problems locally, instead of propagating bad news up the chain of command and awaiting orders from on high.
To unleash creative human energy requires Kropotkin's common understanding --- a shared set of metaphors among the people who are collaborating --- a mutual vision of the group's mission. Visions can't be imposed; they emerge from free action by honest, open individuals. Encouraging (but not ordering!) is the real job of everyone on the project.
And what's the Project that we're all working on? Maybe it's called society ... civilization ... life ....
- Friday, October 08, 1999 at 06:27:27 (EDT)
But to an idealist, anarchy implies peace, not violence. In theory, governments exist to keep the peace. They do so by claiming a monopoly on the use of force in a geographical region. In practice, anarchists observe, governments themselves are the entities overwhelmingly responsible for death and destruction throughout history. Governments attract bad, power-hungry individuals, and give them the tools to inflict huge damage. The philosophical anarchist therefore argues that complete voluntarism, with no State to organize and initiate repression, is a far safer and better (and more moral) system.
Maybe --- especially if one models society on oneself and assumes that others are similarly inclined. A person who seeks consistency will be repelled by messy real-world situations, and will prefer the clean and logical laboratory of anarchy. A person who hasn't met with physical abuse, and who doesn't live in a neighborhood where he (alas, it's usually he, not she) fears for his safety, will have difficulty imagining a culture in which the threat of interpersonal violence is commonplace. And a person from a relatively privileged class, who by default gets respect and deference in social interactions, will not have an easy time empathizing with those who are habitually scorned, denigrated, or ignored. To such a person, moving effortlessly through life, the State seems an unnecessary evil.
Life is, for most of us, more complex. When Yeats wrote (in The Second Coming) the lines:
Mere anarchy is loosed upon the world, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.
he saw the structures of his universe falling apart after the first World War. Yeats was right to contrast the "passionate intensity" of some with the loss of certainty and conviction of others. Simple answers are seductive. In simple circumstances, simple answers are appropriate: the greatest triumphs of science come from seeing through surface distractions to the underlying simplicity of Nature. It's easy to get passionate about a simple fix; it's easy to shout slogans. It's hard to study a problem for years, develop an understanding of the feedback loops and interactions that make it difficult, and come to a realization that, at best, only slow improvement is possible.
In complicated situations, it's wrong to impose a false simplicity. Idealistic anarchism, like totalitarianism ("Trust us; we know what's best for you."), defines away the diversity of human life. And so it fails.
- Wednesday, October 06, 1999 at 21:08:13 (EDT)
- Tuesday, October 05, 1999 at 06:12:48 (EDT)
Some of these special numbers are positive but closer to zero than any regular numbers, tinier than anything one could name. Some are fuzzy, in that they lie near each other but their order is indeterminate; they quasi-overlap, like wavefunctions of quantum-mechanical particles. When such numbers are added to each other, still more bizarre relationships arise among the results.
Infinitesimal and fuzzy numbers are valuable in understanding difficult-to-compare or conflicting options in real life. Sometimes we have to deal with goals that don't lie along a single axis --- they exist in higher-dimensional spaces, with critical aspects that are independent of each other. Some other goals are of infinite concern --- so important that all others pale to insignificance. The mathematics of incomparables gives us a vocabulary and a toolkit to analyze such situations.
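The flavor of incomparability can be sketched in code. A minimal illustration (my own hypothetical example, not from the original text): score each option along several independent axes and use Pareto dominance, under which one option beats another only if it is at least as good on every axis. Pairs where each option wins on a different axis come out "incomparable" --- the fuzzy, order-indeterminate situation described above.

```python
def dominates(a, b):
    """True if option a is at least as good as b on every axis
    and strictly better on at least one (Pareto dominance)."""
    return all(x >= y for x, y in zip(a, b)) and \
           any(x > y for x, y in zip(a, b))

def compare(a, b):
    """Return '<', '>', '=', or 'incomparable' for two scored options."""
    if a == b:
        return '='
    if dominates(a, b):
        return '>'
    if dominates(b, a):
        return '<'
    return 'incomparable'   # neither dominates: a "fuzzy" pair

# Hypothetical options scored on two independent axes, e.g. (thrift, safety):
print(compare((3, 1), (2, 1)))  # '>' -- better on one axis, no worse on the other
print(compare((3, 1), (1, 5)))  # 'incomparable' -- each wins on a different axis
```

Unlike ordinary numbers on a single line, such options form only a partial order: no amount of cleverness collapses a genuinely multi-dimensional choice onto one axis.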
So Edgar Rice Burroughs's "incomparable Dejah Thoris", beautiful Martian maiden, perhaps was more than a cliché!
- Sunday, October 03, 1999 at 20:28:59 (EDT)
We trade in ghosts. We pay not for things, but for ideas.
Yet unlike physical goods and services, information isn't exclusionary: when one person buys data, the seller doesn't lose it; both parties still have it, all of it. After you teach me something, we both know it. The original creation of a data structure may be vastly expensive --- billions of dollars for a new high-performance semiconductor device or a next-generation operating system --- but once one exists, subsequent copies can be made for next to nothing.
So for information, many of the axioms of scarcity-based social systems become irrelevant. If you take my cow, I am poorer; I have one fewer cow. If I've written an essay (say this one!) and you read it, I still have the essay; I am no worse off. In fact, if you take my ideas and build upon them, improving and extending them, we are both far better off. Our sharing of knowledge has created wealth that neither of us could have produced alone.
"But wait!" the publishers of information interrupt, "how will we encourage the development of knowledge without intellectual property rights?" --- by which they mean, upon closer examination, special rules to maintain high prices for the old data they are now selling. Note that authors and artists rarely press for special rules; they're too busy working on something new. Lobbying for privilege comes from the traditional purveyors of existing ideas: agents, organizers, producers, distributors, wholesalers, retailers, and managers of the marketing process. The voices of knowledge creators are rarely heard, though often their heirs are vocal enough in asking for extended periods to profit from their "property".
Human society has, over the centuries, experimented with several alternatives to encourage creativity while also promoting the sharing and reuse of discoveries.
But with the shift to information-dominated economic activity, perhaps it is appropriate to think about new mutually-defined systems to balance individual and societal interests. Look, for example, at the past several centuries of science. Since the time of Newton, a cultural tradition of radical sharing has arisen. Honor and glory go to the first who publishes a discovery --- but not to the person who finds something out and keeps it hidden, or who only shows tantalizing fragments. Once a discovery is made, other scholars are free to build upon it without cost or criticism as long as they acknowledge the original publication (e.g., via footnotes in their writings). Scorn goes to those who cheat --- who fake results, who steal other people's ideas without credit, or who attempt to profit by taking shared information, adding a tiny increment, and putting a mercenary shell around the results. That sort of behavior is viewed as low, dishonorable, and unworthy of a scholar.
Could we consider a similar radical shift in our shared social models for rewarding information creation? Do discoverers have to restrict use of new ideas in order to maximize profitability? Have the traditional concepts of "public domain" and "fair use" been squeezed beyond recognition by short-term selfishness on the part of publishers? Are lawmakers being unduly influenced by contributions flowing from artificial monopoly profits? How much has the overall progress of humanity --- total good for society --- been hamstrung by unbalanced rules for "intellectual property"?
In other words, have we foolishly caged the ghosts of our ideas, and so imprisoned our own creative spirits?
- Friday, October 01, 1999 at 20:31:15 (EDT)
Our Hero, young ^z, was by all reports amiable, bookish, and clever from an early age. The family moved to Austin, where he attended a private first grade and thereafter public schools. His Father worked for the Carnation Co., and rose from milkman to branch manager over the years; his Mother was a secretary who spent most of her career at the Federal Aviation Administration helping air traffic controllers with their paperwork. The marriage broke up in 1968. Werner wed again, left the milk company, and started his own small business manufacturing campers and pick-up truck covers.
Meanwhile, ^z reveled in his studies, especially science and mathematics, but above all reading. Beginning in the fourth grade he served in his elementary school library, shelving and helping with the check-out and card catalog process. This pleasant book-centric line of work became something of a personal industry: Mark was a student page (junior aide) for the Austin Public Library at his neighborhood branch; he shelved books in both his junior high and high school libraries; and when he attended Rice University he assisted in the Fondren Library on campus.
^z also worked irregularly at his Father's business (doing minor carpentry and inflicting minor injuries upon himself during inadvertent encounters with power tools). During summer months he held a job as a soda jerk at the local drive-in movie theater. Sporadic trips to his Grandfather's farm revealed that picking cotton, going fishing, hunting varmints, and tractor-driving were not his forte.
In short, ^z had fun and avoided hard physical labor. He got around town on a bicycle and later a tiny motorcycle. He was crew-cut, clean-shaven, and carried an attaché case. He looked and acted the straight arrow kid that he was.
At school, ^z prospered. Through books and magazines he immersed himself in science fiction and fantasy, devoured recreational math (especially Martin Gardner's columns in Scientific American), and inhaled popular science. He stargazed through a small backyard telescope. Mark garnered some local scholastic awards, and during the summer of '69 attended a National Science Foundation mathematics study program at Southern Methodist University in Dallas. This was his first time away from home of any duration, as well as the first of his many happy experiences in higher academia. At SMU, in addition to learning some number theory, probability, and statistics, ^z got his initial hands-on exposure to computing machinery and the glories of programming algorithms in FORTRAN. He was hooked.
The two young Zimmermann brothers avoided athletics but were avid gamers. Mark and Keith learned chess from (what else?) books, and throughout the 1960's competed in state-level tournaments and by mail. Neither was a serious player; both enjoyed gambits and trappy openings; they earned Class A (Keith) and B (Mark) ratings. With friends and cousins they pushed cardboard tokens around hexagonally-gridded maps of Europe and other theaters of combat, playing wargames for hours on end.
Your Disobedient Correspondent draws a merciful veil over ^z's lack of sophistication (if not outright naïveté) in matters social during his younger years (if not to this day). ^z's voracious reading (no surprise!) gave him a theoretical appreciation of the facts of life, but he had zero real-world experience. Dating was an alien ritual. He took care to treat young ladies and gentlemen alike as colleagues, and to respect women for their minds. In the technical vocabulary of love, he favored philia, admired agape, but dodged the darts of eros.
It is thus only a slight exaggeration to say that ^z acted the rôle of cheerful computing machine incarnate: a Mr. Spock (but with a smile) years before Star Trek first aired. He was the rational Stoic in demeanor if not in the heart. He struggled to categorize, understand, and suppress his untidy emotions. (He failed, of course --- but found it an interesting and educational battle.) On the bright side, he had many fine female friends, but no "girlfriends". He was lucky to stumble onto a lesson early: it is wiser to be friends with those toward whom one feels passion, than to get passionate with those who may not prove worthy of friendship.
Such was his sheltered adolescence. Young ^z read hard, played hard, hardly worked, paid attention in class, did homework with good cheer, scored high on exams, pleased his teachers and parents, kept to a comfortable pace, and generally enjoyed growing up. Comrades who were private pilots took him puddle-hopping and gave him stick time; he dabbled at amateur radio, coin collecting, and other conventional hobbies. But books remained his focus.
Politically and philosophically, ^z began as a stout libertarian but became less doctrinaire (or more appropriately confused!) over time. In religion he was raised a Lutheran, and that faith similarly evolved to an eclectic (if intellectual) reverence for life and the universe. ^z read (as expected!) widely in the more elementary works of Chicago and Austrian School economists, with side trips into anarchism and traditional American conservatism. The strongest influences on his thinking came via the gentle writings of Leonard Read et al. of the Foundation for Economic Education. (But alas, not immune to the siren calls of currency cranks and make-money-fast hucksters, ^z threw away part of his small savings speculating in silver before he came to his senses. Lesson learned: skepticism.)
School passed quickly and pleasantly. While sailing through the math and science curriculum, Mark studied Spanish for four years and German for three. One of his best classes for the Information Age proved to be typing, taken as a lark during a summer term.
^z's circle of friends included a cross section of all available races, sexes, religions, and lifestyles (the sample was far from complete, even though Austin was relatively cosmopolitan for a mid-sized Texas town at that time). Much to the credit of his parents and educational environment, ^z grew up relatively uncluttered with prejudice. "All men are created equal" seemed a good working hypothesis. He was startled when apparently-normal fellow students revealed, in private conversation, racism or misogyny. Thankfully, the schools Mark attended had exceptionally little cliquishness among jocks, nerds, freaks, artistes, slackers, and the like. Conflict was low; times were good; people got along. The downside of this upbringing, however: ^z was slow to learn empathy for the dispirited, for those people (especially cultural minorities and women) subject to constant and enervating discrimination. (He still works on that lesson.)
In the spring of 1970, ^z graduated first in his class from John H. Reagan High School, giving one of the shortest valedictory speeches on record. (It took less than two minutes. He admonished his fellow graduates not to curse the darkness but rather to light candles --- though not at both ends. To this day, the point of that advice remains unclear.) In the fall, shortly before his 18th birthday, ^z entered the University of Texas at Austin to study physics and, perhaps, continue to grow up.
- Wednesday, September 29, 1999 at 05:52:15 (EDT)