Wednesday, March 31, 2021

Roy's "linguistic cosmopolitanism" series


     What is it to “beg the question” (BtQ)? Why, it is to “raise a question or point that has not been dealt with.” It is to “invite an obvious question” (New Oxford American Dictionary).

     Obviously! 

     But it is not so obvious to me, oldster—and philosopher—that I am. The term “beg the question” has a traditional meaning of significant importance in the study of logic or reasoning—and, I would suggest, in critical thinking instruction generally. It is the meaning that I was taught when I went to university in the early 70s. 

     To beg the question in this sense is to commit a particular fallacy. More specifically, it is to do this: in the course of arguing for proposition P, one assumes—without realizing it—the truth of P. 

     Logically, that's a total train wreck. 

     BtQ is an egregious fallacy, like arguing for the proposition that God exists by simply repeating “God exists.”

     QED!

     Here’s a standard example of this fallacy. Suppose I argue for God’s existence as follows: 

Of course God exists! After all, the Bible speaks of God, and we can trust the Bible, since it is divinely inspired! 

      — I.e., we know that God exists because the Bible assumes God’s existence; and anything the Bible assumes is true, since the Bible has been made truthful by God.

      — Which is to assume that God exists in one’s argument to establish the proposition that God exists. 

      Embarrassing! Absurd! Ridiculous!

     Begging the question is alternatively called “circular” reasoning, for its starting point and end point are, in a way, the same point: God exists; thus, God exists.
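
     For those who enjoy seeing the skeleton of the thing laid bare, here is a minimal sketch in the Lean proof assistant (my own illustration, not anything from the dictionaries or the OED) of why a question-begging “proof” establishes nothing:

-- Lean 4: to "prove" proposition P, we simply assume P as hypothesis h.
example (P : Prop) (h : P) : P := h
-- The checker accepts this, but only because P was granted at the outset;
-- substitute "God exists" for P and you have the fallacy in miniature.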

     Critical thinkers are taught to be on the lookout for this fallacy because people do commit it. So it is important. 

     But, nowadays, there is no easy way to refer to it. 

     Fifty years ago, among some significant subset of educated persons, one could declare that Jones is “begging the question” and be understood. One was saying that, logically speaking, Jones has messed up bigtime, for he is assuming the very thing he is supposed to be supporting or defending. Intolerable!

     But, nowadays, unless one is among philosophers or logicians, one's remark about Jones will not be understood. —Without the old sense of "BtQ," one is compelled to scramble to make oneself understood.

* * * 

     Let’s turn to the Oxford English Dictionary, which provides the following entry for “beg the question”: 

     To take for granted without warrant; esp. in to beg the question: to take for granted the matter in dispute, to assume without proof. 

     1581 W. Charke in A. Nowell et al. True Rep. Disput. E. Campion (1584) iv. sig. F f iij I say this is still to begge the question. 

     1680 Bp. G. Burnet Some Passages Life Rochester (1692) 82 This was to assert or beg the thing in Question. 

     1687 E. Settle Refl. Dryden's Plays 13 Here hee's at his old way of Begging the meaning

     1788 T. Reid Aristotle's Logic v. §3. 118 Begging the question is when the thing to be proved is assumed in the premises. 

     1852 H. Rogers Eclipse of Faith (ed. 2) 251 Many say it is begging the point in dispute

     1870 F. C. Bowen Logic ix. 294 The vulgar equivalent for petitio principii is begging the question. 

     So the OED recognizes the traditional meaning but does not recognize the newer “raise the question” meaning. But most modern (and likely more up-to-date) dictionaries do recognize it, though they also recognize the traditional meaning. 

     In fact, one can find three meanings in contemporary dictionaries: 

• To assume precisely what one is supposed to be arguing for  (traditional)

• To raise or invite an obvious question (newfangled)

• To evade an issue (newfangled)

     The last meaning is mentioned only occasionally. It appears to be a creature that has crawled a few inches from the traditional meaning. 

     The New Oxford American Dictionary (which is loaded on all Apple computers) offers this usage note: 

     USAGE - The original meaning of the phrase beg the question belongs to the field of logic and is a translation of the Latin term petitio principii, literally meaning ‘laying claim to a principle’ (that is, assuming something that ought to be proved first), as in the following sentence: by devoting such a large part of the anti-drug budget to education, we are begging the question of its significance in the battle against drugs. To some traditionalists, this is still the only correct meaning. However, over the last 100 years or so, another, more general use has arisen: ‘invite an obvious question,’ as in some definitions of mental illness beg the question of what constitutes normal behavior. This is by far the more common use today in modern standard English. 

     So the NOAD holds that “some traditionalists,” including, evidently, the OED, regard what I’m calling the traditional meaning of BtQ as the only correct meaning. On the other hand, the NOAD is likely correct in saying that the “invite an obvious question” meaning “is by far the more common use today in modern standard English.” 

     The American Heritage Dictionary offers this usage note: 

     Historically, logicians and philosophers have used the phrase beg the question to mean "to put forward an argument whose conclusion is already assumed as a premise." Usually, when people beg the question in this sense, the conclusion and the assumed premise are put in slightly different words, which tends to obscure the fact that such an argument is logically meaningless. For instance, to argue that caviar tastes better than peanut butter because caviar has a superior flavor is to beg the question—the premise that is taken as given (that caviar's flavor is superior) is essentially identical to the point it is intended to prove (that caviar tastes better).· But since at least the early 1900s, laypeople have been using beg the question in slightly different senses, to mean "raise a relevant question" or "leave a relevant question unanswered." When used in these senses, beg the question is usually followed by a clause explaining what the question in question is, as in That article begs the question of whether we should build a new school or renovate the old one or The real estate listing claims that the kitchen is spacious, which begs the question of what "spacious" means. These senses of beg the question are so well established that they have nearly displaced the original sense in everyday usage, but they are still often frowned on by traditionalists, especially those with training in philosophy; in our 2013 survey, the sentences above were judged acceptable only by slim majorities of the Usage Panel—55 and 58 percent, respectively. By contrast, a sentence using the phrase in its original sense (When I asked him why we must protect every endangered species regardless of the cost, he said it was because every species is priceless, but that just begs the question) was considered acceptable by 79 percent of the Panel. The newer senses of beg the question will probably continue to flourish because "begging a question" suggests "begging for," or "raising" a question. However, this broader usage will also probably continue to draw the ire of philosophers and others who use the "circular reasoning" sense of the term, for which there is no good substitute, and do not want to see its technical meaning lost. 

       — “For which there is no good substitute.” Hear, hear! 

     AHD’s note strikes me as particularly helpful. It shares with the NOAD the notion that the “new” meaning of BtQ is about a century old. 

     On the other hand, Dictionary.com asserts that 

     This phrase [meaning “assuming what is to be proved”], whose roots are in Aristotle's writings on logic, came into English in the late 1500s. In the 1990s, however, people sometimes used the phrase as a synonym of “ask the question” (as in The article begs the question: “What are we afraid of?”). 

     Wiktionary seems to take a similar view when it states, “The sense ‘raise a question, prompt a question’ is more recent….” My impression is that the "raise the question" sense suddenly spread in the 80s. Or maybe the great outbreak occurred in the 90s. Could be.

     Wiktionary further asserts that “The [traditional] sense is not well understood except in specialized contexts, such as in academic and in legal argument. It is based on a sense of beg which is no longer common.” 

     Yes, but the traditional concept or category of BtQ is important, and it is a shame that we are losing—or have lost—this valuable verbal tool. 

     As I’ve grown older, I’ve come to see myself—and users of language generally—as travelers along a long and diverse road, where the folk encountered invariably imagine that their tools are the right tools, the only tools, for whatever needs doing. The traveler, however, to the extent that he has traveled, would never conceive such a notion. 

     I recommend such cosmopolitanism.

 

* * *


     I love thinking about words as having surprising and curious histories as they snake through time. I cringe at the popular notion that a word’s meaning—that is, its “real” or “true” meaning—is its original meaning, a static thing. To mix metaphors: the original meaning of a word is just one of the ingredients in the stew that is that word—or perhaps it is the starting point of a complex transformative journey. 

     It is not the essence. It is not the core. 

     The word may even have left its one-time core far behind, a betrayal of its original self. 

  * * * 

     Speaking of metaphors: one kind of change to which a word is susceptible concerns its metaphorical quality. 

     It can lose that. 

     Take the word “based.” I like that word. There is the noun, “base,” which refers (often) to “the bottom of something considered as its support : FOUNDATION” (Merriam-Webster). 

     Then there is the verb “base,” as in “have as the foundation for (something)” or “to find a foundation … for : to find a base … for —usually used with on or upon.” 

     And so there is a base: something upon which some other thing rests or is supported. And then there is that which is “based” on (or upon) it. 

     Sensible. Logical. 


     But illogic has long been afoot. 

     In my long career as a teacher—i.e., an evaluator of young people’s verbal efforts—I’ve noticed a definite change in how they think about the verb “base.” Because I think of something being based on something else in relation to the metaphor of a base and that for which it is a support, I have always spoken in this way: 

• Led Zeppelin’s “Stairway to Heaven” is based on Spirit’s “Taurus.” 

• The 1960s TV series “The Fugitive” was loosely based on Victor Hugo’s Les Misérables.

• Based on your accent, I gather that you’re from crazy town. 

     What else? 

     Here’s how the youth of today speak: 

• Led Zeppelin’s “Stairway to Heaven” is based off of Spirit’s “Taurus.” 

• The 1960s TV series “The Fugitive” was loosely based off of Victor Hugo’s Les Misérables.

• Based off of your accent, I gather that you’re from crazy town. 

The horror

     I was horrified when, maybe thirty or so years ago, I first began to encounter in student writing the construction “based off.” 

     Based off?! A base is something that supports something on top of it. Hence, one must speak of X being BASED ON Y. Talk of X being BASED OFF of Y is nonsense! It’s confused! 

     I crossed “based off” out whenever I found it. “That’s not English,” I’d say or write. 

     But I kept encountering that construction. It started as a trickle but became a flood. 

     A few years ago, I spoke with some of my younger colleagues at the college. Yep, they, too, said “based off.” When I suggested that “based off” makes no sense, they just stared at me, uncomprehending. 

      (This reminds me of the time that I asked a fresh new colleague who her favorite musical artist was. “Justin Timberlake,” she said. I was nonplussed.) 

     Since then, I’ve paid attention—on TV and elsewhere—and, sure enough, the new standard—at least in my part of the world (Southern California)—is BASED OFF, not that musty old BASED ON.  

     I spoke about all this with an old colleague and friend. I asked her if she is still horrified upon encountering “based off” in her students’ speech and writing. 

     She is. 

     Nay, she is incensed

     But it seems clear that our efforts to draw a line in the sand about “base” are hopeless. Somehow, “base” has been torn from its once familiar moorings (or base). The torch has been passed—and transformed—and “base” is no longer the glowing metaphor it once was. 

     It’s just a word. 

     Without pictures. [END]

     —No, wait. I want to end by citing two entries from the New Oxford American Dictionary: "nonplussed" and "public school":

     Nonplussed

     1 … PERPLEXED  

USAGE In standard use, nonplussed means ‘surprised and confused’: the hostility of the new neighbor's refusal left Mrs. Walker nonplussed. In North American English, a new use has developed in recent years, meaning ‘unperturbed’—more or less the opposite of its traditional meaning: hoping to disguise his confusion, he tried to appear nonplussed. This new use probably arose on the assumption that non- was the normal negative prefix and must therefore have a negative meaning. It is not considered part of standard English. 

     Public school

1 (chiefly in North America) a school supported by public funds. 

2 (in the UK) a private fee-paying secondary school. [I.e., the opposite meaning]

* * * 

Coming soon: On the adjective “psychic”

* * *


     As a fan of popular music, I am a fan of lyrics. 

     Let’s talk about the word. 

     “Lyrics,” I mean. 

     If I want to refer to the words of a particular song, I usually write or speak of the song’s lyric. (I'm aware of how strange that sounds to people; so, really, I waver on this, using "lyrics" frequently.)

     Does my use of "lyric" surprise you?

     I am now officially old and, as one who has seen and heard much, I can report a gradual shift in the use of the word “lyric/lyrics,” even in my lifetime. As a young person, I recall being corrected—I don’t recall who did the correcting—when I referred to a song’s “lyrics.” 

     “‘Lyric,’ not ‘lyrics,’” they said, authoritatively. 

     I vaguely recall being impressed by this authority, whoever it/she/he was. And so I mostly went along with that advice. And so, for the most part, I have referred to a song’s “lyric,” and thus to the “lyrics” on an album (i.e., many a lyric; hence lyrics, plural). 

     Still, it soon became obvious to me—even by the 1970s—that I had joined a shrinking minority. I was saying and thinking "lyric," but all I ever heard was "lyrics." 

     "Lyric" even sounded funny.

     It sounds gravely funny today, and it’s not hard to find putative authorities correcting those who say “lyric.” 

     “‘Lyrics,’ not ‘lyric,’” they say, authoritatively. 

     My issue with the use of “lyrics” instead of “lyric,” to the extent that I have one, is now clearly a case of “shoveling shite against the tide,” to use my late dad's favorite phrase. 

* * * 

     Let's consult the Oxford English Dictionary, which is pretty dang authoritative about the use of English words. It paints the following picture: “lyric,” the adjective, was originally simply the adjectival form of “lyre,” the musical instrument. But, as always happens, things got complicated in the English language. Among other things, “lyric” eventually became a noun that referred to poets and singers and, well, the words of a song or poem. 

     OK, HERE’S THE THING. By the late 19th Century, it appears that speakers of the language used the word “lyric” to refer to the words of songs. 

     Actually, the OED provides this definition: “The words of a popular song; frequently plural.” 

     – That last part ("frequently plural") perhaps implies that, according to the OED, “lyric” and “lyrics” were used interchangeably back then.

     But wait! In truth, most of the OED’s early examples of the use of the noun, starting in 1876, involve “lyric,” not “lyrics.” Here are the first three:

     1876 J. STAINER & W. A. BARRETT Dict. Musical Terms 276/2  Lyric, poetry or blank verse intended to be set to music and sung. 

     1927 Melody Maker Aug. 759/3  On July 8 Edgar Leslie, the prolific and most successful lyric writer in America, arrived in London. 

     1933 Punch 16 Aug. 180/3  The gramophone plunged fervently into that lyric called ‘I've Got a Date with an Angel’. 

     The first use of “lyrics” (plural) that the OED mentions is in 1934: 

      1934 C. LAMBERT Music Ho! IV. 272  The lowbrow poet—the type of writer who in the nineteenth century produced ‘Champagne Charlie’ and now produces revue lyrics

     So I’m guessing that, in the late 19th Century, “lyric” was the word used for the words of a song, but then, by the 1930s, “lyrics” started being used too. 

     Here are the rest of the OED entries; they seem to reveal a pattern:

     1938 Oxf. Compan. Music 526/2  Another well-known poet constantly advertises himself in the British musical press as ‘Lyric Author…2,000 songs…not one failure to give great pleasure’. 

     1946 E. O'NEILL Iceman Cometh II. 150  They all join in a jeering chorus, rapping with knuckles or glasses on the table at the indicated spot in the lyric

     1958 Times 2 Aug. 7/4  Teenagers in Minneapolis, believing that the words of some ‘pop’ songs can encourage juvenile crime, have..‘opened a nation-wide “better lyrics” contest’. 

     1967 Listener 3 Aug. 130/1  Having introduced a new sound in the music, they saw that they had next to change the type of lyric

     1968 Listener 7 Nov. 610/1  According to Mick Farren, lyric-writer of the Deviants: ‘Pop music is..the last free medium.’ 

     1972 Jazz & Blues Sept. 12/1  The banality of the lyrics

     1973 Listener 19 Apr. 522/1  The bo' weevil fugues..in blues lyrics

     I’m guessing, then, that the English language experienced a shift from the dominance of “lyric” in the late 19th Century to the dual use of "lyric/lyrics" by about the 1930s; and, in subsequent decades, the use of “lyric” slowly faded, replaced, especially since the 70s—that's when I stumbled onto the scene—with the use of “lyrics.” 

* * * 

      Wikipedia—I know, I know—has a “lyrics” entry, including a section on “etymology,” which tends to confirm my guess: 

"Lyric" derives via Latin lyricus from the Greek λυρικός (lyrikós), the adjectival form of lyre. It first appeared in English in the mid-16th century in reference to the Earl of Surrey's translations of Petrarch and to his own sonnets. Greek lyric poetry had been defined by the manner in which it was sung accompanied by the lyre or cithara, as opposed to the chanted formal epics or the more passionate elegies accompanied by the flute. The personal nature of many of the verses of the Nine Lyric Poets led to the present sense of "lyric poetry" but the original Greek sense of "lyric poetry"—"poetry accompanied by the lyre" i.e. "words set to music"—eventually led to its use as "lyrics", first attested in Stainer and Barrett's 1876 Dictionary of Musical Terms. 

Stainer and Barrett used the word as a singular substantive: "Lyric, poetry or blank verse intended to be set to music and sung". By the 1930s, the present use of the plurale tantum [plural only] "lyrics" had begun; it has been standard since the 1950s for many writers. The singular form "lyric" is still used to mean the complete words to a song by authorities such as Alec Wilder, Robert Gottlieb, and Stephen Sondheim. However, the singular form is also commonly used to refer to a specific line (or phrase) within a song's lyrics.

     If this is correct, it appears, then, that the person who corrected me 50 years ago was already shoveling shite—by which I mean, not that he was slinging crap, but that he was bucking a trend that was already decades old and approaching dominance. 

     If so, my present use of “lyric” is an instance of hyper- or super-trend buckery, aka abject fuddy-duddery. I may as well be referring to "palaver" or "lingo" or even "blarney."

     On the other hand, I'm with Stephen Sondheim on this one. How uncool could that be?

* * *

     NEXT TIME: if I hear one more kid say that X is "based off" of Y, I'm gonna scream.

     STILL LATER: I remember when "begging the question" was an informal fallacy!



Houdini perhaps benefited from interest in Spiritualism, but he became its chief debunker by the time he died.

woo-woo

noun 
unconventional beliefs regarded as having little or no scientific basis, especially those relating to spirituality, mysticism, or alternative medicine…. 
adjective 
relating to or holding unconventional beliefs regarded as having little or no scientific basis, especially those relating to spirituality, mysticism, or alternative medicine…. (New Oxford American Dictionary)

     It is still possible—just barely—to use the adjective “psychic” to mean “of the psyche,” i.e., “of the mind.” But that meaning has long been squeezed into obscurity by the “having to do with woo-woo” meaning. 
     My issue: what’s with the use of the term “psychic” to refer to alleged woo-woo phenomena? Why isn’t a psychic phenomenon simply a mental phenomenon? 

* * * 
     With regard to the concept of the “psychic,” the Oxford English Dictionary (OED) demands attention to two adjectives: “psychical” and “psychic.” The less-familiar “psychical” is the earlier word, having been used as early as 1642 (to refer to the spiritual). The noun or adjective “psychic” didn’t appear until two hundred years later.
     But let’s start with the root: the noun, “psyche.” 

The noun “psyche”

     The Ancient Greek word “psyche” originally referred to breath or life, but it is usually translated as soul or mind or spirit. According to the OED, the “name was extended by Plato and other philosophers to the anima mundi…, conceived as animating the general system of the universe, as the soul animates the individual organism.” 
     The earliest English use of the noun “psyche” recognized by the OED occurs in 1647: 

The animating principle of the universe; = anima mundi n. Obsolete. 

     That meaning didn't last long. 
     The OED also cites a longer-lasting use of the noun meaning mind/soul/spirit, dating from the same time (1648):

The mind, soul, or spirit, as distinguished from the body. 

     That, I think, is what most of us would expect to be the case. In the English language (one might suppose), “psyche” has tended to refer to the “mind or soul or spirit,” which is what the Ancient Greeks meant. By the 19th Century, a more scientific sense of psyche—referring to the mind, not to spirit or soul—emerged. 
     All true. 

The adjective “psychic/psychical”

     The OED dates the first use of the adjective “psychical,” meaning “of or relating to the spirit, spiritual,” to 1642. That meaning, it says, is “now rare.” It largely gave way to the concerning-the-mind/soul meaning, though it was still possible to use the term “psychic” for the “spirit” meaning in the mid-19th Century. 
     I should mention that the OED’s examples of the “spirit” meaning of “psychical” often seem to refer ambiguously to the spiritual or mental. (See for yourself.) 
     The OED lists an odd, early-18th-Century, theological (“now chiefly historical”) meaning of “psychical”: “Of or relating to the animal or natural life of man, esp. the natural or animal soul, as opposed to the spiritual.” Here, it seems, the psychical is the mental that is not the spiritual—e.g. (I suppose), our desire (a mental phenomenon) for sex or food in contrast with, say, our love or respect for God (also a mental phenomenon, but one presumably not rooted in our bodily nature). 
     This would seem to be a very specialized meaning, very like Kant's notion of the self as possessing natural "inclinations," apart from the faculty of reason.

* * *
The concept arrives:

     By 1836, it was possible to use the adjective “psychical” to mean psychic in the contemporary woo-woo sense. But then, by 1871, the term “psychic” replaced “psychical” to refer to all things woo-woo:

     Psychical: (3) Of, relating to, or designating faculties or phenomena, such as telepathy and clairvoyance, that are apparently inexplicable by natural laws and are attributed by some to spiritual or supernatural agency; involving paranormal phenomena of the mind, parapsychological. [1836 - ] 

     PSYCHIC, adj. 
     3. 
     a. = PSYCHICAL adj. 3. [1871 - ] 

     So the concept of woo-woo related stuff—our contemporary sense of “psychic”—traces back to about the 1830s. The use of the term "psychic" to refer to that stuff begins in the 1870s or so. 
     Meanwhile, as one might expect given the origins of “psyche” and the rapid emergence of the science of psychology (in the 19th Century), the term “psychic” was also used to refer to “the mental” (“Of, relating to, or generated by the human mind or psyche”) starting around 1845: 

     PSYCHIC adj. 1. 
     1. Of, relating to, or generated by the human mind or psyche; psychological; mental. Also, of an illness or condition: psychogenic (now rare)…. 

     1845 Dublin Univ. Mag. Jan. 496/1 The nightmare..may indeed be a mere phantasm or psychic image. 
     1873 W. WAGNER tr. W. S. Teuffel Hist. Rom. Lit. I. 422 In its refined descriptions of psychic events the poem recalls Virgil's manner. 
     1883 Brit. Q. Rev. July 14 The varied stimuli, psychic and physical. 
     1896 Alienist & Neurologist 17 520 Hysteria, is a constitutional psycho-neuropathy with morbid impulsions, caprices, delusions, hallucinations, and illusions, psychic and sensory. 
     1902 J. BUCHAN Watcher by Threshold II. 131 Among women his psychic balance was so oddly upset that he grew nervous and returned unhappy. 
     1910 Jrnl. Abnormal Psychol. 5 68 I have successfully treated by Freud's psychoanalytic method cases of homosexuality, psychic impotence..and many other so-called perversions. 
     1925 J. LAIRD Our Minds & their Bodies ii. 32 ‘Psychic’ tumours or false pregnancies have deceived skilled observers. 
     1968 New Scientist 2 May 226/1 The so-called ‘psychic poisons’, capable of inducing temporary or even permanent insanity. 
     1974 M. MENDELSON Psychoanalytic Concepts of Depression (ed. 2) vii. 254 Unlike the energy of science..psychic energy is directional. 
     2004 D. BIRKSTED-BREEN et al. In Pursuit of Psychic Change vi. 106 His psychic life was dominated by this phantasy which was suffused with such hatred toward his sibling..that it had led to an unconscious belief that he had actually murdered him. 

     Interestingly, the noun “psychic,” referring to a practitioner or "manifester" of woo-woo, dates back to 1860:  

PSYCHIC. n.
B. n.
1.a. A person who is regarded as particularly susceptible to supernatural or paranormal influence; a medium; a clairvoyant. 
     1860 W. D. HOWELLS Let. 14 Nov. in Sel. Lett. (1979) I. 64 We talked chiefly about psychics... I am going largely into skepticism at present. 

     (I should mention one further, but obscure, term, that of “psychics” (akin, I guess, to "mathematics") to refer to the study of the human psyche. Evidently, the OED has found such a use from 1832 and subsequently. It competed for a time with the term "psychology.") 

     OK, so here are the facts, I believe: 

• In English, the noun “psyche” has long referred to the mind or soul, though esoteric theological meanings have come and gone.
• The adjective “psychical” (17th Century) is older than the term “psychic,” which only emerged in the 19th Century. 
• The contemporary concept of “the psychic” (i.e., an adjective referring to woo-woo matters) emerged in the early 19th Century, though the term then used was “psychical,” not “psychic.” The term (adjective) “psychic” replaced “psychical” by the 1870s or perhaps a bit earlier (1860s?). 
• Given the emergence of the scientific experimental study called “psychology” (often viewed as formally commencing with Wundt in 1879), one naturally supposes that a noun, “psyche,” referring, not to the soul or spirit, but simply to the mind, has existed from at least that time. According to the OED, the term, with that constrained meaning, has existed since the mid-19th Century, as has the corresponding adjective "psychic."

     This leaves me with a bit of a mystery. Given the history of psyche-related English terms, the emergence, in the 19th Century, of a noun referring to the mind and an adjective referring to the mental makes perfect sense. 
     But why would the study or appreciation of alleged woo-woo phenomena come to be called “psychic”? And why has that use of "psychic" almost entirely eclipsed the sensible of-the-mental use?

     Two obvious points, I suppose: 

     1. The Ancient Greek term “psyche” referred to the mind or soul or spirit. The soul and the spirit are, of course, supernatural entities. 

     2. The 19th Century saw (in Britain and the US) the rapid rise of a powerful movement called “Spiritualism,” which concerned alleged supernatural communications between people and their dead relatives (i.e., spirits). 

     It is possible that a keen interest in the “spirit world,” especially among the British and American upper classes, in the mid- and late-19th Century yielded a kind of competition for use of the term “psychic” between the (often powerful and connected and endlessly watched) woo-woo crowd and the scientific/academic world that gave rise to the field of psychology. 
     In case you are unfamiliar with the Spiritualist movement, here’s the Encyclopedia Britannica’s account: 

     Spiritualism, in religion, [is] a movement based on the belief that departed souls can interact with the living. Spiritualists sought to make contact with the dead, usually through the assistance of a medium, a person believed to have the ability to contact spirits directly....
     Modern spiritualism traces its beginnings to a series of apparently supernatural events at a farmhouse in Hydesville, N.Y., in 1848. The owner and his family, as well as the previous occupants of the house, had been disturbed by unexplained raps at night. After a severe disturbance, the owner’s youngest daughter, Kate Fox, was said to have successfully challenged the supposed spirit to repeat in raps the number of times she flipped her fingers. Once communication had apparently been established, a code was agreed upon by which the raps given could answer questions, and the spirit was said to have identified himself as a man who had been murdered in the house. 
     The practice of having sittings for communication with spirits spread rapidly from that time, and in the 1860s it was particularly popular in England and France. Kate Fox (afterward Mrs. Fox-Jencken) and one of her sisters, Maggie Fox, devoted much of their later lives to acting as mediums in the United States and England. Many other mediums gave similar sittings, and the attempt to communicate with spirits by table turning (in which participants place their hands on a table and wait for it to vibrate or rotate) became a popular pastime in Victorian drawing rooms…. 
     …Spiritualism also inspired the rise of the discipline of psychic research to examine the claims made by mediums and their supporters. A variety of techniques were developed to study not only basic psychic experiences (telepathy, clairvoyance, and precognition) but the more complex phenomenon of spirit contact. By the end of the 19th century, significant efforts were being made to verify the phenomena of mediumship, especially the occasional materialization of spirit entities. ... Among the most prominent supporters of spiritualist claims was the chemist Sir William Crookes (1832–1919), a president of the Royal Society ..., who investigated and pronounced genuine the materialization phenomena produced by medium Florence Cook. 
     Those who placed their hopes in physical phenomena, however, were destined for disappointment. One by one, the mediums were discovered to be engaged in fraud, sometimes employing the techniques of stage magicians in their attempts to convince people of their clairvoyant powers…. Spiritualism fared better in Britain, especially in the 1950s after the repeal of the witchcraft laws, which had been used against mediums quite apart from any charges of fakery....

* * * 

The Fox sisters' home


     Today, "psychic research" has a bad name among scientists and academics—for good reason. But, during the era of Spiritualism (1840s-1920s), it was taken seriously by much of the population, especially members of the upper classes. I suppose we shouldn't be surprised that, when the woo-woo crowd blundered their way towards appropriation of the term "psychic," they might well succeed in "owning" the term, despite the remarkable successes of the scientifically minded in elevating psychology (the study of the psyche, i.e., the mind) to a science.

Friday, August 26, 2016

On embracing your own facts

Everyone is entitled to his own opinion, but not his own facts. 
–Daniel Patrick Moynihan   
     Rational discourse—i.e., useful discussion—depends on several things. Among other things, it depends on people assigning the same meanings to the words they use. Society could not debate, say, the morality of abortion if people were allowed to use such words as “abortion,” “fetus,” “trimester,” and “life” in their own eccentric ways. All discussion would come to a standstill—or would simply become noise.
     To a certain extent, of course, some do use these words in eccentric, and even deceitful, ways. And useful discussion becomes that much harder.

* * *
     I'm reminded of a curious chapter in the anti-tax movement. Ronald Reagan, of course, is an anti-tax hero among Republicans. In fact, however, he did raise taxes.
     But he was inclined to deny this fact.
     Here's how Joseph J. Thorndike tells the story:
     The modern history of GOP linguistic gymnastics [re taxation] begins with Ronald Reagan, who famously began his presidency with a dramatic tax cut. The Economic Recovery Tax Act of 1981 gave conservatives … a huge [anti-tax] victory.
     Before the ink was even dry on the bill, however, Reagan was floating plans for a tax increase. Except he wasn't calling it that. "The administration, carefully attempting to avoid any implication that it would raise taxes, described the proposals as an effort to 'curtail certain tax abuses and enhance tax revenues,'" explained The Washington Post.
     Reagan's attempt to rebrand his tax increases as "revenue enhancements" did not go unnoticed. "They've all sold out, every one of them," complained Jude Wanniski of Reagan's economic advisers….
. . .
     Many loyal Reaganites … embraced the new distinction. Taxes "are not revenue enhancers," declared Rep. Jack Kemp in a typical comment.
     Still, [the new term] … didn't fool most observers. "The Reagan Administration calls it 'revenue enhancement,' but the ranking Republican on the House Ways and Means Committee, Rep. Barber B. Conable Jr. of upstate New York, calls it 'raising taxes' and says he is against it," reported the Times. (See Tax Analysts, 6/30/11)
     So, Reagan didn't raise taxes after all. He merely pursued revenue enhancements.
     No, he raised taxes.
. . . 
     Another necessary condition of rational discourse is the availability of facts upon which participants can agree. To think about and discuss an issue competently and usefully, one needs to have the truth, the facts. Given the facts, one can construct a position in terms of those facts. And, in the course of debate and discussion, the best view has a chance to emerge.
     How does one go about acquiring facts? In recent years, I've encountered people who immediately express a stark skepticism: "everyone knows that there are no facts; there's just different opinions, different ways of spinning reality."
     That, of course, is an absurd and unwarranted skepticism.
     The truth is that, in the case of most issues, one can discern the facts, or at least some of the facts, with information that is available. The budding critical thinker learns, for example, about the relatively objective nature of academia, the basis of academic reputations (evidence, stronger arguments), and reasons to be drawn to opinions there that achieve consensus standing. One learns about standards of reliability within healthy expert communities—refereed publishing, replication, attainment of consensus, etc. One learns about differences in professionalism between various sources, including news sources. One learns to read far and wide, comparing reports. One learns not to "cherry pick" evidence or expertise. With such skills at hand, participants in discussions can indeed discern "the facts" that serve as the necessary background.
     For the most part, public discourse in this country has proceeded against the backdrop of the availability of objective, uncontroversial facts. Admittedly, sometimes, it takes effort to find them. There's a certain amount of distracting noise, at least for a while. Still: Did Ronald Reagan raise taxes? Yes, he did. Was Terri Schiavo in a persistent vegetative state? Well, yes, as it turns out, she was. Did Hillary Clinton keep classified information on the private server that she set up in her house? Facts is facts: yes, she did.
     What would happen if participants in discussions of issues had no way to discern these facts? Well, in that case, perhaps debate would never cease. The lack of a standard of truth or fact would mean that no view would ever come out on top and everyone would endlessly bray forth their position. It would be Babel.
     I recall the debate, more than thirty years ago, over whether one could be infected with HIV through casual contact. That debate quickly dissolved, of course, since, after a couple of years, the truth—the facts—eventually came into view, even for the staunchest of conservatives. At the time of our invasion of Iraq in 2003, the Bush Administration's statements to the contrary notwithstanding, it was unclear whether Saddam Hussein was carrying on development of WMDs. But, when we invaded the country and commenced looking desperately for evidence of the WMD production that was the stated reason for the invasion, none was ever found, a fact numerous and various news sources duly reported.
     Among rational observers, the debate ended: no, as a matter of fact, WMDs or programs for the development of WMDs were not found in Iraq. Bush was mistaken (was he deceptive? Well, that's a different issue). End of controversy.

* * *
     Oddly, however, many Americans continued to believe that WMDs were found upon the invasion of Iraq. They also believed, falsely, that the nations of the world supported our invasion and that Iraq was involved in the 9-11 attack and was in league with al-Qaeda—all demonstrably false or dubious claims.
     Here are the results of work done at the University of Maryland concerning the state of Americans' thinking in 2003:
     From January through September 2003, [Program on International Policy Attitudes] /Knowledge Networks conducted seven different polls that dealt with the conflict with Iraq. Among other things, PIPA/KN probed respondents for key perceptions and beliefs as well for their attitudes on what US policy should be. In the course of doing this, it was discovered that a substantial portion of the public had a number of misperceptions that were demonstrably false, or were at odds with the dominant view in the intelligence community.
     In the January poll it was discovered that a majority believed that Iraq played an important role in 9/11 and that a minority even expressed the belief that they had seen “conclusive evidence” of such involvement. The US intelligence community has said that there is not evidence to support the view that Iraq was directly involved in September 11 and there has clearly never been any observable “conclusive evidence.”
     In February, by providing more fine-grained response options it became clearer that only about one in five Americans believed that Iraq was directly involved in 9/11, but that a majority did believe that Iraq had given substantial support to al-Qaeda—both propositions unsupported by the US intelligence community. Other polls found even higher numbers responding positively to the idea that Iraq was involved in September 11 or had some type of close involvement with al-Qaeda. These perceptions of Iraq’s involvement with al-Qaeda and 9/11 persisted largely unchanged in numerous PIPA/KN polls through September 2003, despite continued disconfirmation by the community.
     More striking, in PIPA/KN polls conducted after the war—in May, July, and August-September—approximately half of the respondents expressed the belief that the US has actually found evidence in Iraq that Saddam was working closely with al-Qaeda. While administration figures have talked about a purported meeting in Prague between an al-Qaeda member and an Iraqi official, this does not constitute evidence that Saddam was working closely with al-Qaeda and, in any case, this purported meeting had been discredited by the US intelligence community during the period of these polls.
     One of the most striking developments in the postwar period was that once US forces arrived in Iraq, they failed to find the weapons of mass destruction that had been a major rationale for going to war with Iraq. Nonetheless, in PIPA/KN polls conducted May through September, a substantial minority of the public said they believed that weapons of mass destruction had been found. A substantial minority even believed that Iraq had used weapons of mass destruction in the war. Polls from other organizations repeated these questions and got similar results.
     In polls conducted throughout the world before and during the war, a very clear majority of world public opinion opposed the US going to war with Iraq without UN approval (see page 8 for details). However, PIPA/KN found in polls conducted during and after the war that only a minority of Americans were aware of this. A significant minority even believed that a majority of people in the world favored the US going to war with Iraq. Other perceptions of European public opinion and Islamic public opinion also contradicted numerous polls.
. . .
     The extent of Americans’ misperceptions vary significantly depending on their source of news. Those who receive most of their news from Fox News are more likely than average to have misperceptions. Those who receive most of their news from [the liberal] NPR or PBS are less likely to have misperceptions. These variations cannot simply be explained as a result of differences in the demographic characteristics of each audience, because these variations can also be found when comparing the demographic subgroups of each audience.
. . .
     An analysis of those who were asked all of the key three perception questions does reveal a remarkable level of variation in the presence of misperceptions according to news source. Standing out in the analysis are Fox and NPR/PBS--but for opposite reasons. Fox was the news source whose viewers had the most misperceptions. NPR/PBS are notable because their viewers and listeners consistently held fewer misperceptions than respondents who obtained their information from other news sources. (From Misperceptions, the Media and the Iraq War, 10/2/03, Program on International Policy Attitudes [PIPA] [A joint program of the Center on Policy Attitudes and the Center for International and Security Studies at the University of Maryland])
From the PIPA report
     It is not possible, of course, to carry on rational discourse about an issue—e.g., justification for the 2003 invasion of Iraq—if we cannot determine the actual facts about it. Despite the right's history of distaste for non-absolutist doctrines such as relativism and skepticism, many ordinary citizens on the right (i.e., conservatives) seem to have become firm skeptics* of news media, or at least of most news media, which they take to be liberally biased (such as, say, PBS or NPR). Unfortunately, the dominant conservative news source of our time—Fox News—is notoriously unprofessional and unreliable compared to its many "liberal" alternatives. And so many conservative Americans, these curious new anti-absolutists, believe rubbish.
     For years, many of us have feared that media skepticism among the political right has become so common, and runs so deep, that political discussion and debate with members of that group will soon cease to be possible. How do you argue with people who have, not only their own views and arguments, but their own—demonstrably erroneous—"facts"?
     You can't. You can only shake your head.


* * *
     Judging by the current presidential election race, this crisis is now upon us. Candidate Trump routinely utters demonstrable falsehoods that his followers, of whom there seem to be many, unquestioningly accept. And given their views about liberal media bias, how might these Trumpsters ever appreciate their error? They can't.
     That a skeptical crisis—an immunity from facts among a certain range of "conservatives"—is upon us is being recognized even by some members of the right:
     Back in the early 2000s, right-wing talk radio was a juggernaut that influenced American politics so thoroughly that all mainstream GOP leaders genuflected to their power. Rush Limbaugh was, of course, the king, a man so powerful that he was given substantial credit for the Gingrich Revolution in 1994 with the freshman Republican class going so far as to award him an honorary membership in their caucus.

…After 9/11 the [right-wing talk radio] format exploded with new voices both nationally and locally. Combined with the ascendance of Fox News, Drudge and total Republican control of the government, right wing media completely dominated the political landscape.
     This phenomenon had a number of bedrock assumptions but the first, and most important, was the notion that the mainstream media suffered from a liberal bias so extreme that it was completely untrustworthy.… [I.e., one could not discover the facts by watching the news, unless it was right-wing news, i.e., Fox]
. . .
     There was even a famous quote from a Bush official to reporter Ron Suskind which perfectly characterized the prevailing right wing ethos of the period. He said that reporters like Suskind lived in the “reality based community” which was made up of people who believe “solutions emerge from a judicious study of discernible reality.” He and his cohorts on the right, however, were not constrained by such restrictions: “We’re an empire now, and when we act, we create our own reality.”
. . .
     The right’s great noise machine just kept chugging along, however. And the rise of social media turned it into an even louder megaphone that simultaneously blocked out any competing information. It was this environment that has made it possible for Donald Trump to emerge. We know he is a twitter and Instagram addict and uses all social media more casually and more intimately than any presidential candidate in history....
     And today, for the first time, some conservatives in the #NeverTrump camp are seeing where their decades-long attacks on the mainstream media and the “reality based community” have led. Right-wing radio talk show host Charlie Sykes from Wisconsin gave an interview lamenting the situation with reporter Oliver Darcy who put up an excerpt on twitter. Sykes also appeared on MSNBC’s “All In” last night where he said this:
     Over the years conservative talk show hosts, and I’m certainly one of them, we’ve done a remarkable job of challenging and attacking the mainstream media. But perhaps what we did was also [to] destroy any sense of a standard. Where do you go to have any sense of the truth? You have Donald Trump come along and the man says things that are demonstrably untrue on a daily basis. My experience has been look, we live in an era when every drunk at the end of the bar has a Twitter account and maybe has a blog and when you try to point out “this is not true, this is a lie” and then you cite the Washington Post or the New York Times, their response is “oh that’s the mainstream media.” So we’ve done such a good job of discrediting them that there’s almost no place to go to be able to fact check.
     Welcome to the reality-based community. (The Danger of the Right's Noise Machine: Years of Misinformation Led to Trump's Rise, Salon, 8/16/16)
SEE ALSO:

Why many Americans hold false beliefs about WMDs in Iraq and Obama's birth place (Christian Science Monitor, January 7, 2015)
Although it has been proven false, more than four in ten Americans – and more than half of Republicans – still believe that the US found weapons of mass destruction in Iraq. After six years in office and the release of his long-form birth certificate confirming his Hawaii birth, one third of Republicans continue to believe President Barack Obama was born outside the US. . . . “People who think we did the right thing in invading Iraq seem to be revising their memories to retroactively justify the invasion,” said Mr. Cassino [a professor of political science]. “This sort of motivated reasoning is pretty common: when people want to believe something, they’ll twist the facts to fit it.”….

*Well, they are skeptical about news media unless the news media tell them what they want to hear. Fox News, of course, is "fair and balanced."

According to politifact.com, a relatively even-handed fact checker. (See here, here, and here.)

Tuesday, December 30, 2014

Sorting out what it is to be respectful and disrespectful (“The obvious”)


     I try to be careful about how I talk about people—I mean when they’re not present.
     I have a friend who seems truly to live by the rule of saying nothing about a person that they aren't willing to say to that person's face.
     I often think about that rule. Is it wise? If so, why?
     We live in a culture that is big on respecting others. The concept of “rights” has much to do with that, I suppose. We know to respect others—for instance by not interfering with their affairs, not taking what is theirs, and so on. Often, it is obvious what is demanded by respect for others.
     But not always. I think that, in the past, we made a greater effort to provide a kind of catechism of respect and politeness and morality. Children were taught how to behave, what to do and not do. The content of such teachings must have seemed arbitrary to children (often even to objective observers!), but much of it does make sense relative to the overriding idea that one is to respect others as having a kind of significant moral standing, requiring constraint on our behavior relative to them.

     (We can view the somewhat [or very] rigid package of dos and don’ts sometimes recognized in a culture as the product of an effort to arrive at a way of life that constitutes “respect for others,” among other things. [Here, the elements of right action are made meaningful by the goals or values that are expressed by them.] We might feel an obligation to honor every element of such a package, even knowing that the package, and many of its elements, is likely flawed, imperfect. [“We’ve got to stick to the plan,” says the general, in the face of mounting losses.])

     In the wild and woolly U.S., the land of never-ending unconscious social experimentation, much that is traditional is lost, including much of the kind of instruction referred to above.
     I often think about this.
     Parents, of course, are conscious of a responsibility to instill in their children a proper regard for others. They might even consciously suppose that many of the “dos and don’ts” taught to their children are aspects or manifestations of “respect for others.” That is, these details are in the service of that larger goal.
     We can imagine a society in which an ongoing “sorting out” of what it means to treat others with respect goes on. This would be sensible especially in a society that is accustomed to endlessly changing roles, practices, etc.
     In a society much more bound by tradition (especially in the interactions between persons) than our own, it might seem obvious that the traditional teachings are prima facie adequate to anything that might come along. There might not be a consciousness of the need to sort anything out. A respecter of persons might simply insist on doing things as we’ve always been taught to do them.
     That’s not our society.
     It seems obvious to me—though it is clearly not obvious to everyone—that respect of others demands that one tread carefully in discussing others’ lives, especially the lives of those one knows. Most of us, I think, recognize that “gossip” is vicious, though we might not conceptualize this in terms of respect. Freely speculating about others’ lives, even when it is not attended by schadenfreude or malice or envy, also strikes me as an obvious “sin” as regards the obligation to respect others. It is perhaps a natural extension of the notion of gossip understood as a vice.
     My dictionary defines gossip as “idle talk or rumor, especially about the personal or private affairs of others.” Other dictionaries seem to provide the same meaning.
     Gossiping is not just talk, but “idle” talk. After all, one might have a very good reason for discussing a rumor or delving into others’ affairs. (A psychologist, a parent, a spouse.) Such discussing and delving isn’t always objectionable.
     The gossip gossips because doing so is enjoyable, not because it is necessary. We want to say that gossips are enjoying themselves at others’ expense. And that’s wrong.
     “But I’m not saying it to them!” insists the gossip who is called out. “What they don’t know won’t hurt them!” they add.
     I’m not so sure about that. In any case, such talk—behind someone’s back—feels like deception. It also seems to be an instance of using a person. One who discovers that one has been the object of gossip feels offended and disrespected, and is inclined to say of the gossipers that one’s affairs are none of their business.

* * *

     My family poses challenges for me with regard to gossip and related talk, for they talk about other people all the time. I have come to find such talk to be objectionable and disrespectful. I say that I have “come to” find it objectionable because I was raised by these people, and they freely—not quite unashamedly—gossiped about others routinely when I was growing up. I do not recall participating in it much, but I certainly heard quite a lot of it.

     (A few years ago, my cousin J moved to Kentucky after a divorce from her husband of many years. Her only child, a son, was 19 years old and seemingly on track to become a policeman. Given these facts, my folks immediately drew the conclusion that J had “abandoned her family.” It was not obvious to me that she had done any such thing. None of us really knew the details of J’s situation. It seemed to me that my folks had no basis for such a judgment. I said so, and that ruffled some feathers.
     (After a while, it became clear that J was living on a ranch there in Kentucky. There was some other woman around for a while. This fact immediately inspired my folks to speculate that J was “now a lesbian.” They would discuss J with knowing and disapproving looks.
     (“Good grief!” I said. “First of all, you have no basis for that conclusion, and second, why are you speculating about what goes on in her life? It’s none of your business.”
     (Let’s just say that my folks responded to my remark as though I had told them that they were fishwives. Naturally, they were offended.)

     My mom more or less gossips routinely. She also enjoys discussing the lives of famous people, people in the news, et al. (Is that gossip?) When she and I are alone, I usually respond to such blather with obvious indifference or with the remark, “I don’t want to talk about these people’s lives. Could we please talk about something else?”
     My father seems generally disinclined to participate in these discussions, but he does not object to them either.
     He has no compunction about criticizing people, including famous people, that he does not like. He is from Europe, and I suspect that there is an older and more settled practice of pontificating about politics and current events—typically at the dinner table—than exists in the U.S. In any case, one obviously attractive activity for many people is to spout off without reservation about the failings of famous and important people—while it is plain that the spouters make no effort to get their facts straight or even to know at all what they’re talking about.
     Perhaps owing to the influence of his children (?), my father has grown less crass in this regard, more likely to soften his judgments of politicians, et al., and to consider alternative views.
     My mother is not a political pontificator. On the other hand, she still gossips and discusses the lives of others (to be fair, she has never been what one might call a terrible gossip).

* * *

     My folks, and especially my father, have always seemed utterly uninhibited about noting others’ physical beauty or lack thereof. “God, she’s ugly,” my dad would say about the famous comedic actress on the screen. Anybody whose face might flash upon the TV would get an automatic attractiveness (especially an unattractiveness) assessment. “Imagine waking up to that face!” my dad would announce. My mom would just smile. This is what people do, in their world.
     For whatever reason, they are less liable to do this now—possibly because of my years of pushback—but they clearly still feel no compunction about assessing people’s attractiveness.
     Admittedly, this failing (if that is what it is) is very common. I have good friends—seriously decent people—who routinely note others’ physical beauty or ugliness. I always cringe. I rarely say anything. I’m always thinking, “Poor dear. He (or she) can’t help having the face that he has!”
     Why doesn’t that fact inhibit people more? I hesitate, though, to launch into a moral correction of my friends.
     Somehow, with my folks, it’s different.

     Mostly, people are what they are. That is, their features are not really matters of choice. Isn’t that obvious? (Apparently not.) I am horrified to think that people are shunned or treated badly or “talked about behind their backs” owing to some feature provided by indifferent nature, something they had absolutely no say in. And, really, most of us pretty much are what we are. The notion that our moral and physical natures are “choices” strikes me as an ugly and stupid and deeply unfortunate fiction, a source of endless oppression.
     We are here to fight such things, not to participate in them!

* * *

     Today, at lunch, my mom referred to a holiday postcard from an old couple my family knows but hasn’t seen for many years. The postcard had a photo on its front. At one point, my mom, referring to the photo, mentioned that Mrs. X “seems sick, doesn’t she?” (It was a gossipy remark, not an expression of concern. –I could be wrong, I guess.)
     Well, first of all, Mrs. X is 88 years old, and my mother doesn’t often see pictures of her. So, likely, mom was struck by how much older Mrs. X looks compared with the last photo of her that she saw.
     Second, everyone at the table was well aware that Mrs. X has been suffering various ailments that might make her look old and tired, etc.
     So just what was the point of mom’s remark?

     “Oh, come on!” I said. “Why do you have to say that?”
     “What? It’s true!” said mom.
     “She can’t help the way she looks, so why mention it?” I said.
     Mom sputtered forth some explanation.
     I walked over to the adjoining room to visit with my cat, Teddy, whom I had brought with me. My dad got up and said something pleasant about Teddy’s attitude. I said: “At least he doesn’t talk trash.”
     –This was meant to be lighter than it came out. I was pretty sure my mom heard me.
     Worried that mom had misunderstood my remark (she’s an immigrant with a sometimes tenuous understanding of English) and that she might be offended and even hurt, I explained that “talking trash” refers to talking about other people.
     “I know what it means!” she said, obviously annoyed.
     Well, I know my mom. She is very inclined to take offense based on misunderstandings. Happens all the time. So I clarified my remark further. I said, “Saying that you’re talking trash doesn’t mean that you’re trash; it means that you are talking about other people, criticizing them. OK?”
     “That’s not what it means to me!” she roared. I knew then that I had lit the fires of inevitability. As I feared, she “understood” my remark to imply that, in some sense, she is trash. But no, that’s not what I meant.
     It matters not.
     I said: “Listen, what matters is what I meant, and I meant what people normally mean by saying that somebody is talkin’ trash; I meant that they were putting down others. It in no way implies that the talker is trash. OK?”
     “I have my own meaning of the word!” roared my mother. “And that’s not what it means to me!”
     Good grief. I said: “You can’t have a private meaning for a word or phrase. A word means what people normally mean by it, not what some oddball hearer misunderstands it to mean.”
     “You’re just using your meaning, and I’m just using mine!” said mom.
     –Yes, yes, I know. I am an idiot. I should learn to walk away in silence, cut my losses. Obviously.
     I said: “No, I’m not using my meaning, I’m using the meaning of the phrase ‘talkin’ trash.’ It’s the meaning you’ll find in a dictionary.”
     It was plain that, to my mom, I was just pulling things out of my ass. From my perspective, I was doing anything but that; it was as though I were saying, “the sky is blue.”
     By now, mom was disgusted. It was then that the wisdom of silence finally took control of me. I grabbed Teddy and headed home.
     But there’ll be hell to pay. “He has no respect of his own mother!” she’ll say. And there is nothing to do about that except to wait for time and events to wash away the whole business from mom’s or anyone else’s attention.

* * *

     Owing to my training and my profession, I think a lot about such things as respecting others and what that entails. I think about rules such as “never say anything about someone you’re not willing to say to their face.”
     So some things seem obvious to me.
     Often, they’re not so obvious to others.
     And so, once again, I’ve got my aged mother upset; I’ve got her thinking that I have no respect for her. She feels that way because I tagged her yet again for behavior that, in my view, is disrespectful of others.
     I dunno.
     At one point today, I told mom, “I don’t think you understand how hard it is for me to hear this stuff you say all the time.”
     But, obviously, she can’t possibly understand a remark like that. It is hard to listen to my folks say some of the things they say and to watch them do some of the things they do. But objections accomplish nothing.
     I understand that they are what they are—that they lived in a world very unlike my own, one that produced certain ways of being and acting and thinking and feeling.
     But some of this stuff—it just won’t do, will it?

* * *

     We should have an ongoing sorting out of the implications of our values. That would be a good thing.
     But we need to do it together.

     That’s not always possible.