Thinking About Leap Year

The prompt from my Florida writers’ group was to write a short piece about Leap Year. This is what I came up with—

“Sometimes,” Gus says, “I think to myself this whole Leap Year thing is nothin’ but a boondoggle.”

“That’s redundant,” I say absently.

“What?” Gus says.

“I said what you said is redundant.  Repetitive, superfluous.  How else could you think, except to yourself?”

“What?”

“Gus, think about it!  When you’re thinking, it’s just you communicating inside your brain.  Nobody else is privy to it.  If other people knew what you were thinking, it wouldn’t be thinking.  It would be talking.”

“You think so?” Gus says, brow furrowed.  “Yeah, I think you’re right.”

“Of course I’m right!” I say.  “So, it’s unnecessary to say you were thinking to yourself.  Needless, pointless.  All you have to say is, I think Leap Year is a boondoggle.”

“Yeah, that is what I think!” Gus says.  “You an’ me agree.”

“No, no,” I say, a tad exasperated.  “It’s you who thinks that, not I.”

“What?”

“It’s you who thinks Leap Year is a boondoggle!”

“Yeah, that’s what I said, an’ you wanta know why?  It’s them damn calendar-makers!”

“What?” I say.

“Think about it!” Gus says.  “Every seven years, the dates fall on the same days of the week, like clockwork.  So, if it warn’t for them calendar-makers, after seven years, nobody would hafta buy calendars no more.  We could just recycle ‘em.  Them greedy SOBs up an’ stuck an extra day in Febeeary every four years to make sure we’d hafta keep buyin’ their products.”

“Gus, Leap Year has nothing to do with calendar-makers!” I say.  “It has to do with Earth’s orbit around the sun, which takes three hundred and sixty-five days, plus six hours, for a complete cycle.  After four years, that’s a whole extra day.  Your theory is poppycock!”

“What?” Gus says, forehead crinkling.

“Your theory is flawed, mistaken, incorrect.”

“Naw, I don’t think so,” Gus says.  “Leap year is just a marketin’ ploy.”

“No, it’s not!” I say.  “It’s a scientifically proven manifestation.” 

“A what?”

“A manifestation, an occurrence, a fact!”

Gus stares at me for a long moment, then points a bony finger in my face.  “That’s redundant,” he says.  “An’ mark my words, Leap Year is nothin’ but a boondoggle!”  Then, with a sly grin, he adds, “A sham, a scam!  Leastways, that’s what I think!”

I smile weakly as he finishes, “To myself!”
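(A technical aside, for any reader inclined to check my rebuttal to Gus: the extra quarter-day per year is only an approximation, and the modern Gregorian calendar corrects for it with a slightly fussier rule.  Here is a minimal sketch of that rule in Python; the function name is mine, purely for illustration, and the standard library’s calendar.isleap does the same job.)

    def is_leap_year(year: int) -> bool:
        # A year divisible by 4 is a leap year, except century years,
        # which must also be divisible by 400 (so 2000 was one, 1900 was not).
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    # A few examples: 2024 -> True, 1900 -> False, 2000 -> True
    for y in (2024, 1900, 2000):
        print(y, is_leap_year(y))

(Those century-year exceptions exist because a year is actually a few minutes shy of 365 days and 6 hours; I wasn’t about to tell Gus that.)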

On Thinking

The French philosopher René Descartes is remembered, among other things, for his thesis: I think, therefore I am.  The notion is most commonly expressed, not in French or English, but in Latin: Cogito, ergo sum.

His premise was not, as is widely believed, that he exists because he can think; rather, that he is aware he exists because he is able to think.  That assumption presupposes that so-called lower forms of animal life, being non-sentient as far as we know, exist without knowing they exist.

Descartes appears not to have considered the possibility that some humans may also exist without full awareness, largely because of their demonstrated inability or unwillingness to think rationally.  But I digress.

In conversation with other folks, I occasionally hear them offer their opinion by beginning with the phrase, So, I think to myself…  I find that phrase redundant, because I can contemplate no other way of thinking; by definition, all thinking is to oneself, is it not?  Unless, as some would have it, a person is thinking out loud, which strikes me as verbalizing, not thinking.  Better, I suggest, to think first, speak second.

But as a counterpoint to that, people might deem praiseworthy the ability some folks have to think on their feet—to offer an opinion, receive feedback, and modify that opinion, all in the course of one conversation.  That facility is admirable, I suppose, but it can happen, of course, only if they’re standing; if they were seated, they would surely be thinking…well, on their tush, right?  And somehow, blowing it out their…you know…doesn’t seem as impressive.

I’ve long thought of thinking as a fluid process, a constant progression, a multi-directional flow, rather than as a static, linear plod from point A to point B.  And if that is so, then a graphic tracing of my thinking pattern would appear, not as a straight line, but as a higgledy-piggledy, zig-zagging line—frequently interrupted and intercepted, but always arcing upward toward higher illumination, I would hope.

As a writer, I find it’s my thinking that takes me far from my physical surroundings, even to the point of forgetting all about time and place.  As I wrote in haiku verse some time ago—

my thoughts, unbridled,
take me to worlds I ne’er will see,
nor have ever seen

my boundless thoughts are
like hot air balloons, slipping
bonds that tie me down

I wander freely
throughout the universe, yet
never leave my chair

There are two adages on thinking that I try to hold to, at least presently, and they both grace the résumés and bios that appear on my online social-media sites.  The first is, Certainty is the enemy of an open mind…I think.  And the second is, Don’t believe everything you think.  Regular readers of this blog will know whether or not I’m successful in living up to those.

Certainty plagues many people who, after thinking a subject through—or even when they have not—adopt a position they think is accurate or true, and then stubbornly cling to that opinion, come hell or high water.  But I think every opinion we hold should be subject to periodic, critical study, the more frequently the better, in order to test its validity in the face of facts and evidence that can change from time to time.  Being overly certain about one’s opinion can stifle that sort of examination.

The irony with this adage, however, is that I can’t be certain it’s correct, for to be so would violate its basic premise.  Like every other opinion I hold, it requires my constant scrutiny…at least, I think it does.

The notion of believing everything we think, just because we think it, likewise can lead to cognitive stagnation.  In everyday interactions, our behaviours are governed by what we think we should say or do at any given time, and there’s nothing wrong with that.  Guidelines are preferable to social anarchy.  But if, for example, I believe it’s safe to jaywalk across a busy thoroughfare just because I think it’s safe, and if I persist in that belief, the consequences to me could be catastrophic.  Better, I think, to examine my thinking in the light of facts before committing it to belief-status.

The irony with this second adage is that it presents a danger: one might never commit to believing in anything.  I think that, too, could be a problem.

For those who’ve read this far, let me finish with an anecdote about two people engaged in a mild argument over some inconsequential subject.  “So, is that what you really think?” the woman asks, a touch of incredulity tinging her tone.

“I don’t think!  I know!” the man replies smugly. 

With barely a pause, the woman smiles condescendingly and says, “You know what?  I don’t think you know, either.”

And that could well be the case for all of us.  Even when we think we know, even when we are absolutely certain of it, we still might be mistaken.  The wise carpenter’s advice—measure twice, cut once—could easily be adapted and applied to our thinking process: think, rethink, then act.

I’ve done just that in this post…I think.

What say you?

Cogito, Ergo Sum

Cogito, ergo sum—I think, therefore I am. 

So opined René Descartes in 1637, in his famous work, Discourse on Method, demonstrating what he regarded as the first step in the acquisition of knowledge.

Of course, we don’t know that he was right, but because enough of us have come to believe his postulate, it is almost universally accepted.  Left unanswered is the question of whether other living organisms are sentient, whether they also can think.

Some people believe they can—that creatures such as elephants, whales, and dogs are capable of thought—and they cite observed actions by these animals as proof of their belief.  But what of other animals, or fish, and what of plants and rocks?  To my knowledge, no one has as yet been able to prove (or disprove) the thesis that any lifeform other than human is capable of thought.

Regardless, it does seem likely that no form of life on our planet has attained the same level of high-order thinking that the human species has.  And if any have, they have hidden it from us remarkably well.  With brains that are neither the largest nor the smallest among all living creatures, we humans appear to have outstripped them all in our capacity to think rationally.

The capacity to think is what allows many of us to read widely, listen to diverse sources of information, and weigh the relative merits of differing schools of thought before deciding on a course of action—critical thinking.  Alas, it is also what allows us to read narrowly (if at all), listen carelessly, and reject schools of thought that do not reflect our own preconceived notions.

Either way, thinking broadly or narrowly allows us to form opinions.  And those opinions, whether supported by evidence or not, often morph into staunch beliefs if we don’t continue to think about them, to test them against emerging information.  And inference plays a big role in that.

For example, if I waken one morning to the sound of thunder, and if I see flashes of lightning illuminating the drawn curtains of my bedroom, I might well infer that it’s raining outside.  But I have no proof of that until I actually see (or feel, or smell, or taste) the tangible rain.  I might throw open the curtains to discover there is no rain falling, despite the harbingers of storm; merely hearing and seeing those from inside my room would have led me to a false conclusion, one I believed until faced with proof to the contrary.

It points out the danger of choosing to believe everything we think, at least before we have evidence to support (or deny) our premises.  As sentient beings, we are compelled to seek answers to the baffling phenomena we observe around us, to find reasons why situations unfold as they do, to explain the arcane mysteries that bedevil us—like where we came from and where we’re going.

Our world is replete with examples of how we have gone about this—in religion, science, engineering, medicine, music, literature, and so many other fields.  The list of human accomplishments over the millennia is long and laudable.  Errors have been made along the way, and corrections applied, but the steady march of knowledge-acquisition has been relentless.

Many of our ancestors, for instance, once believed (and some still do) that the earth was flat, that any who got too close to the edge would topple off, fall into the void, and be lost forever.  We know now, of course, that this belief was untrue.  As an amusing aside, the Flat Earth Society still boasts today in its brochures of having chapters of believers around the globe!

Still other folks believed once upon a time that our planet was at the centre of the known universe, that the moon and sun revolved around us; those people’s skills of observation, primitive by today’s standards, and their earnest thinking about those observations, led them to that conclusion.  Yet it, too, was untrue.

Nevertheless, despite our many errors and missteps along the way, our capacity to think rationally—and to forever question our thinking—has allowed us to advance our collective knowledge.  A key factor in continuing that progress is to avoid investing complete faith in any one thesis, regardless of its appeal at any given time; we must retain an appropriate level of skepticism in order to keep from falling into the acceptance of rigid dogma and blind ideology. 

As George Carlin is reputed to have said, “Question everything!”

Another key is to continue testing theses to reinforce their viability, to find evidence of their truth (or falsity).  But at the same time, we must remember that an absence of evidence of truth in the moment does not mean the same thing as evidence of an absence of truth.  In other words, just because we don’t have the facts to prove the legitimacy of a thesis right now does not mean that thesis is untrue; it may mean simply that we haven’t as yet discovered the facts to validate it.

The advance of knowledge is, to paraphrase Hemingway, a movable feast.

An exception that proves the truth of any thesis can always be found, of course—something that demonstrates the general truth of a thing by seeming to contradict it.  For instance: most of the teachers in that elementary school are female, and the one male teacher on staff is the exception that proves the rule.  The general thesis remains factual.

We also know the true merit of any pudding is put to the test in the eating.  But with one person’s taste being different from another’s, whose opinion is to be accepted as the truth?  Any decision there must be regarded as opinion, not fact.

It is interesting to note that Descartes did not write, Cogito, credo quod cogito, ergo rectum est—I think, I believe what I think, therefore it is right.  Apparently, he understood that because we think something, even to the point of believing it, that does not necessarily make it true.

He also wrote, Non satis est bonam mentem habere; praecipuum est ea bene uti—It is not enough to have a good mind; the main thing is to use it well.

Quaere, semper quaere—question, always question!

Asking Questions

“Anyway, what do you think, Gramps?”

We’re in the midst of a long conversation where my granddaughter has been explaining the options lying ahead as high school graduation approaches.  She’s university-bound for sure, but where and to do what are still up in the air.  She already has acceptances from five schools, pending submission of final marks and other documentation, and the choice really is hers.  An array of forms from the different schools is scattered on the table in front of us.

My first post-secondary foray began more than sixty years ago, so I’m hardly an informed source for her to be consulting, but this conversation has more to do with our relationship than with my expertise.  All five of my grandchildren—siblings and cousins—have always afforded me this courtesy when faced with decisions affecting their lives.

I attribute that to the upbringing they’ve received from their parents—my two daughters and their husbands.  My wife and I benefit from the affection and respect for elders that has been inculcated in the children in both families.  Even as we become increasingly irrelevant, we remain cherished.

The kids have always been encouraged by their parents to make intelligent choices when they face significant decisions, but more importantly, they’ve been helped to learn strategies for doing that.  They’ve learned to distinguish between fact and opinion, between truth and falsehood, between goodwill and venality.  They’ve learned to assess the multitude of sources of information they encounter—and to favour those that are fact-based, that are truth-oriented, that appear to advance the common good.

They were encouraged to learn from their mistakes, too, and to understand that failure can be a springboard to important learning.

Along the way, their parents also learned an important lesson, just as my wife and I did while raising our girls: when you help children learn to think for themselves, be prepared for the fact that they may eventually think differently on certain issues than you do.

In any event, here I am being asked my thoughts about my granddaughter’s options going forward.  Stroking my chin thoughtfully, I say, “Do you have a particular favourite at this point?”

“I like a couple better than the others, I guess.  But they’re all good.”

“What are the things you like that might sway your thinking?”

After a moment, she begins talking about how the academic opportunities at each school might best blend with her as-yet-unfinalized career decisions, including co-op work experience.  She talks about where her friends might be going; about the advantages of living in residence, away from home; about the extra-curricular opportunities at each school; about part-time job possibilities around campus; and about the costs associated with each choice.

“Well, you’re certainly considering a lot of factors,” I say.  “Are there any deal-breakers or must-haves?”

“There were,” she says.  “And I’ve already eliminated schools that don’t offer things I feel are important.”

“What about dead-ends?” I ask.  “What are the chances you could find yourself constrained at any of the schools if you decide to switch majors a year or two in?”

She nods as she takes this in, jots a quick note to herself on a sheet of paper listing all the schools.

“That could happen,” I add, reflecting on my own experience those many years ago, when I switched universities after finally deciding on a teaching career following graduation from a journalism program.

“Yeah, and I need to consider the possibility of post-grad work, too,” she says, circling the names of two of the schools.

“For sure!” I say, marvelling at her long focus.

“Okay, Gramps, thanks for your advice!” she says, gathering up her papers.  With a kiss on my cheek and a loving hug, she bounces out of the room.

Advice?  All I did was ask a few questions.  You don’t need advice from me!

“Let me know what you decide,” I call after her.  And I comfort myself that perhaps asking questions was the best thing I could have done because, like my other four grandchildren, this little girl knows how to think for herself.

And what do I think?  I think that’s good!

Perhaps We Need to Think More About That

Perhaps we need to think about this.  And a lot harder than we seem to be thinking at present.

Do you know what the items in the following list are, and what they have in common: Macrostylis villosa, Galapagos Amaranth, Courtallum Wendlandia, Viola cryana, and Fitchia mangarevensis?

All of them are species of plants that once upon a time thrived in, respectively, Africa, the Americas, Asia, Europe, and Oceania.  Before the dawn of the twenty-first century, all of them had become extinct.

How about the items in this list:  Acipenser naccarii, Coregonus johannae, Cyprinodon arcuatus, Gila crassicauda, and Platytropius siamensis?

These are species of sturgeon, cisco, pupfish, chub, and catfish that, likewise, have disappeared from the face of the earth.  It is beyond obvious to say that we shall never see them again.

Here’s an easier list:  Pachycephalosaurus, Dreadnoughtus schrani, Velociraptor, Ankylosaurus, and Therizinosaurus.  Do you know what these species have in common?

As you might have guessed, all are dinosaur species that became extinct at least 66 million years ago.

Try this one:  Dromaius minor, Camptorhynchus labradorius, Pinguinus impennis, Sceloglaux albifacies, and Ectopistes migratorius.

These are bird species that have ceased to exist—in order, the King Island emu, the Labrador duck, the great auk, the laughing owl, and the iconic passenger pigeon.

And now, perhaps the easiest list of all:  Balaenoptera musculus, Panthera tigris tigris, Elephas maximus sumatranus, Gorilla beringei graueri, and Diceros bicornis.

These are endangered or critically endangered animal species, some on the cusp of extinction—the blue whale, the Bengal tiger, the Sumatran elephant, the eastern lowland gorilla, and the black rhino.

Science estimates that approximately 99.9% of all the species that have ever inhabited this planet of ours are extinct.  Indeed, Charles Darwin theorized that evolution and extinction go hand in hand.

Or, as Annie Dillard put it, more poetically, in Pilgrim at Tinker Creek—

    Evolution loves death more than it loves you or me.  This is easy to write, easy to read, and hard to believe.

Still, if we can believe our planet has hosted some sort of life for more than 3.5 billion years, it’s staggering to think that less than one-tenth of one percent of all those lifeforms survive today.

Here’s a final list to ponder:  Homo habilis, Homo ergaster, Homo erectus, Homo heidelbergensis, Homo neanderthalensis, and Homo sapiens.

These, of course, are all species of human life, the first of which, scientists believe, appeared around 2.5 million years ago.

Those of us alive today are members of Homo sapiens sapiens, a sub-species of the last one in the list, which is thought to have emerged within the last half-million years—not very long when compared to the 3,500 million years life has existed on earth.

But here is the critical implication arising from that final list:  of the six species listed, the first five have vanished.  We are the only ones not yet extinct.

Not.  Yet.  Extinct.

Perhaps we need to think more about that.