The simpler things seem, the more complicated they become

The single best machine to measure trust is a human being. We haven’t figured out a metric that works better than our own sort of, like, ‘There’s something fishy about you’.
– Simon Sinek

The dehumanization I referenced yesterday at the conclusion of my blog is commonplace in modern society. Its most recognizable manifestation is the familiar complaint that we are, as individuals, too often treated as numbers, or that the system (whatever system you might choose as an example) doesn’t deal with us personally. In the latter instance, all you need recall is the last time you phoned any organization and were treated to a series of voice prompts that may have, if you persisted long enough, finally led to a live responder. Increasingly, though, as the technology advances, it is entirely possible to conduct your business without ever actually conversing with anyone. I suspect we don’t take much time thinking about the pros and cons of such a system, other than to complain, perhaps, about how annoying it is. But then, why should we? Irritating sometimes? Yes. Reason for a crusade against automation? Hardly.

I mention this, however, as an example of how we have come to accept technology as a complement to virtually every activity we undertake as human beings. I received a Fitbit for Christmas, so now I can monitor my physical activity. I can see how many steps I’ve walked and how “active” I was (beyond simply walking), check my sleep patterns, link to other apps, and track food intake and calories consumed. I’m sure it does other things of which I am unaware.

Some thirty-five years or so ago, I moved in the middle of a strike by some union or other at the phone company and ended up without a phone for the better part of six months. If anyone wanted to contact me, or me them, a real effort had to be made. I actually enjoyed the sense of freedom that came from being so “offline,” if you will.

These days, I am very familiar with cellphone anxiety (if it isn’t a recognized condition, give it a while). It can strike me when I’ve gone out, regardless of the reason, and forgotten my phone at home. I’ve seen it at work in others on any number of occasions. If anxiety is related in any way to frequency of use, then young people are in greatest danger of infection. Who has not witnessed a room full of teens and twenty-somethings where virtually everyone is, in some way, attached to his/her phone, even while involved in conversation or some other activity?

I no longer need to use a key in my car; if I want to watch a show, I have any number of avenues that will allow me to watch it whenever I feel like it; if I’m traveling, a physical map – if I even have one – is there exclusively as a back-up to my GPS; I don’t have to remember an appointment – I simply enter it in my phone and make sure I set an alert. Add anything that occurs to you – by way of technological innovation – that has significantly changed the way you live your life.

While much discussion could be had over the positive or negative elements of all, or any one, of these changes, whatever they might be, I am more interested in what I believe is the “meta-message” (to coin a term) that arises from the growing predominance of technological advance in public consciousness over the last fifty years or so. While naysayers exist, the dominant conclusion that is trumpeted again and again – whether through news outlets, corporate entities, governments, individuals or some other means – is that technology is GOOD.

And I’m not suggesting for a moment that it is bad. What concerns me, however, is the application of that meta-message, “technology is good”, to areas where it should be viewed somewhat more critically. From the outset, I have wanted to use this blog to argue for the realization/recognition that few things are simple. When so many of the commonplace features of daily activity are made “simple”, “simple” has a tendency to become a value in itself. Technology, in its application, aims to simplify tasks, to lessen the need for human activity or thought. No need to remember a phone number: put it in your phone. Televisions without a remote control? I’m not sure if you even CAN change a channel on a TV these days without one.

Complications
If technology can provide increasingly successful simplification of onerous tasks, it becomes all the more likely that we, as a society, would accept the idea that technology can be employed to “improve” pretty much anything. Our acceptance of that notion is all the more understandable when just such a claim is made by the “experts”, regardless of the field in which they are working. The result of this process is the “techno-faith” I’ve tried to outline in earlier blogs.

And this is where “techno-faith” and education collide. When we use our cellphones, we rarely stop to ponder – if we ever do – how they work. They work, and such is the nature of modernity. If we think about it for even a moment, though, I’m sure we can summon at least a smidgen of awe over what it is we are able to do. From virtually anywhere in the world, I can be in contact with anyone else who has a phone on him/her (providing I have the number), no matter where he/she happens to be. The people who design such things understand how it all works, but we don’t have to bother ourselves with such matters.

When it is an object such as a phone that we are concerned with, our faith in the technologists and the developed technology seems justified. Today’s phones are better than those from even 2 or 3 years ago (maybe even 6 months ago). But what happens in schools if an idea takes root that we can treat children in the same manner as any other “thing” that we would hope to make “better”? In the peculiar doublethink of modern educational theory, we talk about embracing diversity even as we strive to develop “tools” (the system’s word, not mine) that – should the technology of delivery/instruction be perfected – will lead to near-uniform “outcomes.”

The “experts” in education want you to have the same faith in the system they say they are building (fixing, tweaking, creating, modifying, reforming – choose your participle) that we are asked to have in Apple as it releases its latest iPhone. In this instance, Apple alone has credibility.

Samuel Taylor Coleridge said it best . . .

Every reform, however necessary, will by weak minds be carried to an excess, that itself will need reforming.
– Samuel Taylor Coleridge

To this point, I’ve tried to avoid education as a topic on this blog, primarily because of my fear that if I start, I’ll consign myself to it exclusively. That being said, I cannot deny the role that education has played in so many facets of my life. Professionally, I spent 28 years teaching English language and literature at three high schools in southern New Brunswick. I became involved in provincial politics out of a desire to potentially influence educational policy. While I do not feel I made much progress in that regard, I do believe the experience helped me to gain a deeper understanding of the nature of the dilemma facing education in New Brunswick and elsewhere. More on that later.

If teaching has been my vocation, then my own education has long been a parallel avocation, although, connotatively, that might suggest something of lesser importance. Perhaps a better way to put it would be, simply, that education – and the growth inherent in its pursuit, regardless of the form that pursuit might take – has been my passion, professionally and personally, in one way or another for most of my life.

I can’t remember a time when I didn’t read. I grew up with an aunt and a grandmother who always had something on the go, my grandmother having read all of Dickens (she would tell me) when she was young, even though she had only attended school to the end of grade 8. And it’s not that the reading material around me was exclusively highbrow or especially intellectual; it was simply that reading was as much a part of the fabric of everyday life as eating and sleeping. It’s what people DID.

Certain books had to be read on the sly. I still remember being in my bedroom, underneath the sheets with a flashlight, reading The Exorcist. I couldn’t have been more than 13 or 14 at the time and, strict Catholic household that mine was, a book on such a subject would not have been tolerated. All I really remember is that the novel managed to “creep me out” as they say. Realistically, I can’t even offer an opinion of its quality. All I know is that I felt compelled to read it and it doesn’t haunt me today so it couldn’t have been all bad. I suspect it falls quite nicely into that category of reading I continue to pursue even now: books that I read and forget for the most part within a few days or a week of reading them. I’m a firm believer in the value of both books and visual media that serve as diversion. My grade 8 and 9 English teacher, Miss Petersen, if she could hear me saying such a thing, would be delivering her most severe “tut-tut” I’m sure. Anyone who had her might remember her admonition: “Don’t read good books – life is too short for that; read only the best”.

While I appreciate the sentiment, I’m afraid Miss Petersen and I will have to agree to disagree on this one. While I’ve been diverting myself through the years, I’ve also been reading more than my fair share of serious literature as well as any number of volumes on virtually any subject that happens to grab my attention. I think of myself as persistently curious: just about anything can pique my interest every once in a while. Beyond literature, I’ve had an abiding interest in history, biography, physics, politics, theology, and art, especially, but I’m liable to wander into anything given the right circumstances.

Why this reminiscence, you might ask? I offer myself as a product of a system of education that has fallen into serious disrepute over the last few decades. For that matter, I offer my entire generation, as well as the generations older than mine, as examples of what that supposedly disreputable system of education from times past managed to achieve. However much the dominance of personal technological innovation has intruded on modern life within the last twenty years, the foundations for that advance were laid by those raised in an educational system currently and persistently under attack. Strange, isn’t it, that within educational circles, the very notion of wisdom accumulated from experience is virtually absent? “Enduring” has become virtually synonymous with “antiquated”, and the lone defensible hallmark of all that is good is “change”.

Almost from the outset of my career in teaching, the clarion call has been for reform of a system repeatedly characterized as rigid, “teacher-centred” as opposed to “child-centred”, archaic and out of touch with the modern world, and increasingly irrelevant. In an effort to counter these purported failings, I have watched our school system be subjected to a litany of reforms almost universally disparaged by anyone immediately involved in the day-to-day work of actually teaching young people in a classroom. The defenders and promoters of reform have been (and continue to be), for the most part, individuals who long ago left classrooms behind (if they ever practiced in one at all) for the world of educational theory, whether in the university, the school district, the Department of Education, the corporate boardroom, or wherever else such “policy” happens to be formulated.

My contention (and I’m really only following in the footsteps of others who have said the same before me) is that most current reforms are the product of what might best be called “junk” science, a science that is given whatever credence it has by virtue of its ability to confirm a predominant ideology. Sadly, the change that is being pursued, regardless of the current manifestation (“current” because change in education is a constant, a statement that should strike you as inherently paradoxical), is universally directed by someone’s idea of what a “good outcome” would be, both for the individual and for the system at large. Modern educational theory and its attempted application in schools are, perhaps, the most vivid proof of the old adage “the road to hell is paved with good intentions”. (to be continued)

Words, words, words

If I were to be asked what my definition of a classic was, I would say it was a work that won’t go away. It just stands in front of you until you deal with it.
– Northrop Frye

I’ve been focused on travel blogging for the last couple of weeks but today I was pleased to read – courtesy of a friend – an account of a teacher in the U.S. (http://www.theatlantic.com/education/archive/2015/02/lets-talk-about-sexin-english-class/385135/?utm_source=SFFB) who ended up in considerable trouble because he dared to allow for frank discussions of sexuality and sexual behaviour through the medium of literary fiction or, to use the term I prefer, simply “literature”.

If literature isn’t your field then you might not know that one of the great debates of the last century concerned the notion of the “classic”. More specifically, elements within academia questioned whether or not anything could rightly be considered a classic, the argument being, in essence, that any such designation was the product of societal and cultural biases.

Perhaps you can see why this particular question matters to me. While it may not appear so initially, further reflection reveals that we have here another manifestation of the dominant techno-faith I’ve spent so many blogs trying to outline. The scientific premise demands clear criteria and observable proof as validation for an assertion. And so it should, if the assertion under consideration is subject to physical laws. In the case of the determination of a “classic”, something else is at play: human judgement.

Judgement is not fashionable these days. By its very nature, it cannot be subjected to objective verification. It demands subjective interpretation and argument, neither of which can be validated using the scientific method. So it is that a certain element can argue that Shakespeare should not be considered any “better” than even a Harlequin romance. Preferring one to the other becomes a matter of time and taste. Such logic underlies the contention that Shakespeare shouldn’t be taught in schools. His work is irrelevant and passé and not to be thought of as any better than what is preferred by high school students today, say The Hunger Games or Divergent. After all, who are “we” to judge?

Happily, I spent my entire career judging such things. As someone who studied literature for many years and who has been an avid reader for as long as I can remember, I will happily argue the merit of any work (and the designation of Shakespeare et al as “classics”). In the process of doing so, I may end up identifying certain of the criteria that form the basis of my judgement. That, to my way of thinking, is the very core of reading (and the reason why book clubs exist).

I’ve said many times that I am not someone who can teach anyone to read. If forced, I’m sure I would work my way through it as best I could and, hopefully, achieve a reasonable result, but my expertise was/is in helping students to read better.

As an example, consider the novel The Adventures of Huckleberry Finn (HF). I choose this particular “classic” because of its singular place as a frequent choice of book-banners in schools. I taught it many times throughout the years and did so because of what it can tell us, what it can make us FEEL, about prejudice and racism and the human condition in general.

Condemnations of HF tend to focus on the use of the word “nigger” and the general perception that African-Americans in the novel are portrayed in a negative light. In the case of “nigger”, the developed argument suggests that the word itself is so inherently prejudicial that it can’t be freed of such an encumbrance and so will cause harm regardless of any supposed justifications for Twain’s use of it.

Nonsense! My task in teaching was to help students see that the only decent characters in the novel are the ones most despised by whites. The MOST villainous characters, in fact, are ALL white. To seriously understate Twain’s achievement – it deserves a much deeper and more thorough discussion – the novel is a vilification of the supposed morality and superiority of those most ready to condemn another based solely upon the colour of his/her skin. This wasn’t some radical reading I dreamt up; it is the considered and carefully reasoned conclusion of multitudes of readers who have picked up this volume and read it attentively. As for the word “nigger” itself, no student left my class feeling empowered to use it. Through teaching the novel, the word’s power to harm, and its place in demeaning others, were made very clear.

Understanding such things, however, requires time, attention, and a measure of deference to those who can make a supportable claim to being better readers. The nature of judgement and of literary analysis being what they are, such claims can be overthrown by better, more persuasive arguments, perhaps, but the last 130 years of careful reading has supported a predominant belief that Twain’s motives were as I’ve outlined.

That’s why “teaching literature” is what we need in our schools. We do not need “facilitators” and “child-centred learning” (what learning ISN’T child-centred?!?) as much as we need people, in whatever discipline, who know more than those they teach about whatever it is they are teaching. And that might mean choosing a work that tackles challenging, uncomfortable topics. The best teachers help us to see that we do not need to be afraid of such things, neither the books themselves, nor the thinking they might engender.

Our very own Brave New World

Humanity will not make much further progress until it learns to embrace complexity and contradiction instead of instinctively trying to solve them.
– Rick Docksai (http://www.wfs.org/content/cautions-about-techno-faith)


Anyone who has read the last several of these blogs might be wondering by now: just what does this guy mean by “an ideology of technological humanity”? As my last blog on ideology argued, I am suggesting we have a fixed and increasingly pervasive view of humanity that shapes public discourse, policy, institutions and much of the day-to-day of modern life. If you suspect hyperbole is at work in such a statement, I want to assure you that I am not exaggerating. This ideology has gained the power and prominence it possesses because it supplies a roadmap of sorts through our increasingly complex lives.

If you consider the paradigm shifts I talked about in an earlier blog – the loss of Earth’s place at the centre of the universe, the advent of the theory of evolution, and Freud’s unveiling of our inscrutable inner life – each one was an undermining of a particular certainty or deeply held truth. The effects of each may have taken time to be felt (the ripples continue to spread as far as I’m concerned), but by the middle of the 20th century, the developed Western world was living with a deeply felt sense of uncertainty. God’s guarantee of human primacy had been compromised, we couldn’t claim inherent ascendancy in the “creation”, and we couldn’t even expect to really know ourselves, especially in light of all the misery humanity had visited upon itself in the first half of the century.

Above all else, ideology promises certainty. It lays out a set of assumptions, held to be inviolable truths, which give purpose and meaning to the world. Putting it another way, it promises that the world is comprehensible and meaningful, even if that meaning does not extend beyond our immediate existence.

Technological humanity is an ideology supremely suited to an increasingly post-religious Western world. Some will object to that characterization but I won’t withdraw it. While a majority in the world today would claim some brand of belief in a deity, anyone of my generation would know that religious practice, the formulation of laws and policies based on religious precepts, and daily life in general are not suffused with the religious sensibility that was waning but still strong when we were growing up in the 1950s and 60s.

In the midst of this crisis, technological innovation advanced with increasing speed and the standard of living of the developed world rose to heights previously unimaginable. Even if external validations of our superiority were fading, advances in standard of living and the phenomenal development of consumer society with its endless array of new and better products couldn’t help but have us swelling with pride. Sure there were problems: pollution and environmental concerns, poverty and famine, inequities, wars, extremism, etc. But still, for all of the mayhem, things were getting better and better. The notion of “progress” that had emerged with the Renaissance, regardless of any setbacks, was reaffirmed based upon the wondrous new things we were able to achieve.


The ideology taking root as a consequence of all this is what I can only call a “faith” in the power of science to unlock every “mystery”, whether current or yet to come. I’m happy to see that the word “scientification” has made the urban dictionary, but I prefer the term “scientism” to describe the trend I am suggesting. This is science as faith, as belief, as ideology. In my use, the term asserts that science – an empirical method – provides the best – maybe even the only – avenue to truth. It would hold that all things (most importantly, human beings) can be viewed as mechanisms open to adjustment and to improvement. All we need do is find what makes the mechanism “tick”.

One result of this increasingly triumphant viewpoint is the subjection of all human behavior to “scientific” analysis. I have long been suspicious of the notion of “research” in the social sciences. From all I remember of my high school science classes (and my ongoing interest in such things ever since), the scientific method demands two things more than anything else: control and clarity. In order for the results of any experiment to be considered valid, anything that might affect the result must be identified and controlled. Failure to recognize potentially intrusive elements can lead to the purported results of any experiment being discounted or, at the very least, seriously undermined.

When applied to human behavior – the purview of psychology, sociology and other social sciences – such a research model inherently lacks certainty. Valid social science research speaks more of trends and of possibilities than it does of absolutes. When we imagine human beings as nothing more than a rather complex amalgam of neurons firing and chemicals mingling in order to produce an “outcome”, we can claim a scientific validity for our research that doesn’t really exist. Still, when an entire society has become increasingly satisfied with such a technological model of humanity, voices raising objections aren’t given much of a hearing. Purveyors of the new orthodoxy of ideological techno-faith feel no obligation to listen to heretics.

Ideology and the ideologue

The ultimate end of any ideology is totalitarianism. – Tom Robbins


 

At this point I need to revisit my use of the term “ideology”, especially because I want to avoid confusing the ideologue with the idealist, something I would encourage anyone to become. Ideology is one of those concepts open to various shades of meaning. At its most innocuous, it simply refers to any body or system of belief held as a guide to living by individuals, groups, peoples, parties, etc. Inasmuch as we are all governed, whether consciously or not, by certain assumptions about how the world and everything and everyone in it works, we can all be considered followers of some kind of ideology, even if most of us probably don’t spend much time trying to define the particulars of ours.

Ideology becomes of concern (and potentially terrifying) when it leads to the creation of ideologues, individuals who see the world exclusively through the lens of a particular, fixed point of view. The harmless ideologue might be the person who sees absolutely everything through a positive lens. While that can sound innocuous, the potential for detrimental consequences exists even in that instance. Such an ideologue becomes, potentially, the person we might all know who refuses to look at reality and constructs elaborate facades and rationales in order to preserve a “positive outlook”.

More distressing examples are easy to find. I’ve referenced in an earlier blog my continuing interest in the Holocaust and, in particular, my quest to somehow understand how anyone could reach a place where the systematic murder of millions could be viewed not only as acceptable but even as a benefit. I still don’t understand it on a deep level; however, my investigations lead me to believe that the Hitlers and the Himmlers and their many imitators, for all that they might have recognized that the world at large might disagree with them, were “true believers”. Somehow, they were able to embrace an ideology defined by the belief that eradication of the world’s Jewish population was “good”. One account I’ve read of a visit by Himmler to a concentration camp where he watched the execution of inmates tells of how upset Himmler was and, further, how concerned he was for the well-being of the executioners who were tasked with “this unpleasant but necessary work”.

We are living in an era when radical Islamism is able to justify broadcast beheadings, suicide bombings, targeted executions, or any other method its adherents imagine advances the possibility of some kind of success. The ideology and the fulfillment of its ends trump all other considerations.

Another example that has always been powerful for me: Josef Stalin. Stalin was purportedly directly responsible for some 20 million deaths over the course of his thirty-odd years as absolute ruler of the Soviet Union. Many have written at length of these “atrocities”, but if you can imagine, for even a moment, that all that Stalin did was in service of the advancement of Soviet communism, do his crimes become necessary and good? Certainly a central tenet of Leninist communist ideology is that the individual is not as important as the furthering of the ultimate goal, the creation of the communist utopia. This falls into the same category as the Holocaust for me. Could anyone actually believe that mass murder on such a scale could be justified? The answer to me is far more straightforward than you might think: if people can strap explosives to children and detonate them in the midst of an unsuspecting crowd because they believe God wants them to do so, what CAN’T ideology find a way to justify?

I know I have been depending on extreme examples in this blog but I want to make clear how much the embracing of an ideology can trump what the rest of us might think of as “reality” or clear moral positions, positions that most of us take for granted. For example, surely no one can justify the deliberate sacrifice/murder of children to a cause? And yet, it happens with frightening regularity these days.

Ideologies, when they become the lens through which we view the world, can predetermine what we see. Of greatest concern for my purposes is what I have come to regard as the ideology of technological humanity – a belief, in simplest terms, that human beings are mechanisms subject to adjustment and to continuous improvement. While that doesn’t sound so bad, I want to argue it is at the root of any number of problems we encounter these days. As you might expect, considering my background, I became convinced of this, initially, through my experiences in education.

We have the technology

The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.
– Isaac Asimov, Isaac Asimov’s Book of Science and Nature Quotations, 1988


So if you’ve stuck with me over these last few blogs, I want to say a heartfelt thank you. When I started on this topic, I suggested it was complicated, and a central point of my starting this blog continues to be my contention that we have little time for complexity and the subtlety it demands. That being said, my almost thirty years as a teacher, taken in conjunction with my experiences in politics, have convinced me that the trends accumulating through the process I have described over the last few blogs have come to dominate important elements of modern life (education being the one of longest-standing concern for me). So where are we?

As the 20th century was dawning, we were on the verge of a technological revolution that would make all previous marvels pale in comparison. My grandmother was born in 1889 and died in 1982. Compare the changes she saw in her lifetime to those of someone who lived a similar span a hundred years earlier. We’ve probably all heard some version of how advancement has accelerated as time has passed. One I recall is that, between the dawn of civilization and 1850, the sum of human knowledge doubled. Since then, that doubling has increased in speed. Check out the link I’ve included below for one version based on Buckminster Fuller’s “Knowledge Doubling Curve”. Regardless of the details, it isn’t hard to see that technology has come to dominate our lives to an ever-increasing degree.
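For anyone who likes to see the arithmetic, here is a minimal sketch of what such doubling claims imply. The intervals come from the headline linked at the end of this post (every 12 months, then every 12 hours); they are assumptions for illustration only, not a claim that “knowledge” can really be measured this way.

# A rough, purely illustrative sketch of what "knowledge doubling" implies.
# The doubling intervals below are taken from the linked headline (12 months,
# then 12 hours); they are assumptions for illustration, not measurements.

def growth_over(doubling_interval_days, span_days=365.0):
    """Return how many times a quantity multiplies over span_days
    if it doubles every doubling_interval_days."""
    doublings = span_days / doubling_interval_days
    return 2.0 ** doublings

print(f"Doubling every 12 months: x{growth_over(365.0):.0f} in a year")
print(f"Doubling every 12 hours:  x{growth_over(0.5):.2e} in a year")

Whatever the real numbers might be, the point stands: anything that doubles on a shrinking interval very quickly dwarfs everything that came before it.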

In the meantime, the Copernican revolution, the theory of evolution, and Freud had undermined virtually every assumption regarding humanity’s status as the chosen ones. Little wonder that as the century progressed, however much it might be disguised, humanity’s latest efforts to reinforce just how special we are would ultimately find a home in the most apparent evidence of our superiority: technological advancement.

Just so you know, I love technology and all the gizmos it has given us. While I still find being tied to a cellphone unsettling at times, I would never suggest that we throw it all away and go back to some imagined idyllic past where all was well with the world or, at least, substantially better. As you may have surmised, my contention is that this existence is always problematic for human beings. Alone among the living on this planet, we are conscious of mortality and we seek meaning and purpose for our lives. Where we go to find that meaning and how we grapple with the mystery of existence is far more central to living than we probably realize. As conventional religious practice has waned in the Western world, we have looked for an alternative.

So, to put it bluntly, I believe technology has become ideological. The 20th century saw incredible advancements – at ever increasing speed – in virtually every sphere of human activity. Scientific understanding of the natural world made such progress possible and, initially, it was regarded as a fundamental good. In the midst of all of this, however, the human race saw examples of just how destructive technology could be in the hands of those who wanted to use it for destructive purposes, either deliberately or indifferently. Two world wars, the development of nuclear arms, global warming, irresponsible uses of technology in general, all have one thing in common: human beings are the ones at the controls. If we can’t put technology back in the box, what are we to do? How do we avoid self-destruction, something that we seem to be prone to given the proper circumstances?

And so we have the IDEOLOGY of technological humanity. What has technology told us, especially in the last 30 years or so? Put simply, everything that exists in the world can be made better. We have come to expect that technological innovation is without boundaries. Take the television. Just when you think you have purchased the TV for the ages, the next upgrade comes along. The same applies to cars, phones, crop yields, etc, etc, etc. As science focuses on human beings themselves, is it any wonder that a similar “technological model” of personhood should take root in our consciousness? We have arrived at a place where we imagine ourselves as organic commodities which, through rigorous application of a scientific method, can be “improved”. In this uncertain world, we have found a “belief system” (an ideology) that tells us we can control our destiny – make humanity intrinsically “better”. Needless to say, I don’t agree.

 

Knowledge Doubling Every 12 Months, Soon to be Every 12 Hours

If only we could be perfect!

I have no faith in human perfectibility. I think that human exertion will have no appreciable effect upon humanity. Man is now only more active – not more happy – nor more wise, than he was 6000 years ago.
― Edgar Allan Poe


Continuing from yesterday, I want to tell you a story. It’s probably told much better somewhere else, but I’ve been pursuing it over many years and it makes sense to me. Call it “Carl’s brief history of human identity”. I don’t think the title is the best either. If you read this through and come up with a better one, I’m open to suggestions.

As you saw yesterday, Alexander Pope posits humanity as a series of contradictions, not simply to suggest contradictions for their own sake but, more significantly, to contend that human beings are gloriously complicated and by times infuriating, majestic, petty, generous, etc. It’s a viewpoint with which I have great sympathy but one, I think, that has become rather passé in the 21st century. As I wrote a day or two ago, the emerging “technological model” is something quite different (something I have yet to define clearly, as well, I know – told you this would be long). So how did we arrive where we are?

As an early 18th century writer, Pope lived during a time when science – and an accompanying focus on all things rational – was taking hold in a big way, at least among the 1% in a position to take time to think about such things. Nowadays, we think of the scientific method as the primary way to truth: we want evidence and we want it to be clear. In Pope’s time, alternative approaches to determining reality were accepted and pursued equally. Probably the best example would be Sir Isaac Newton, arguably one of the greatest scientists of all time, who was also a longstanding student of alchemy and other “occult” studies.

The eighteenth century “Age of Reason” or “The Enlightenment” saw some adopt a view of human beings that made no allowance for anything other than rationality. If people could examine their habits and persons with a critical eye, adjustments could be made to that “object”, the result being an intrinsically “better” human being. Certain eminent figures of the day kept daily journals which, so the argument went, could lead to a better understanding of where improvements and adjustments were needed.

Granted, this was an extreme, an entirely mechanistic view which imagined individuals as little more than empty vessels waiting to be filled: with knowledge, with experience, with determining factors. In modern psychological terms, proponents of such a view could be said to be on the extreme “nurture” side of the nature-nurture debate, so fundamental to psychology.

Pendulums always having to swing, a brief period toward the end of the 18th century saw a significant move to the opposite pole. A number of writers and thinkers thought it necessary to emphasize a more mystical and unseen path to “truth” or understanding. Human beings possessed intangible qualities of imagination, insight, intuition, and empathy, to name a few, and any understanding of what it meant to be human had to take such features into account.

It should come as little surprise that the 19th century became a rather fractious time where these two tendencies commonly conflicted. That being said, science was beginning to develop both an objective credibility and a momentum that would have profound consequences, not only for the trappings of modernity (inventions, advancements, etc), but for our understanding of ourselves as well. (to be continued)

The glory, jest and riddle of the world


At the conclusion of yesterday’s blog, I mentioned a topic that is dear to my heart, while not saying just what that topic might be. Even now, clearly articulating that topic in a simple phrase is difficult. Before sitting down to write today, I did a little reading online and landed on an article (see the link at the end), taken from an online source called the International Socialist Review. I’m sure better sources exist but this piece offered some of the thinking that I believe has landed us in a dire condition in many spheres. While it does not use the language I prefer, the article, once its implications and assumptions are recognized, outlines what I have come to call a “technological model of humanity”.

Consistent with oversimplification (and the accompanying polarization), the author contends that the choice is stark: either accept capitalism which holds to a fixed view of human nature (supposedly) or embrace socialism which sees human nature as adaptive (another supposedly) – something coming into being; something that is always malleable.

While this may seem to be one of those “who cares” moments for many, it shouldn’t be. As I will try to explain as this blog develops further, ideas that strike us as irrelevant to the reality of our day-to-day lives do, in fact, have the potential to determine far more than we might imagine. So what is this technological model of which I speak?

By way of contrast, in the last blog I referenced the “indefinable character of ideals such as virtue, goodness, etc”. In a broader context, I do not hold with a somewhat dominant view that human beings are fundamentally mechanisms that can be adjusted and “improved” in much the way one might hope to build a better car or computer. It may very well be a result of my background in literature but I do not apologize for continuing to believe that there is something impenetrable about both this universe and us.

As a simple illustration of this notion, I would often ask students to consider a sunrise or a favourite piece of music. We can all grasp the idea that, in either case, we find something attractive that draws us. The exercise was my attempt to show just how difficult it is to nail down, in concrete terms, exactly what it is that MAKES them attractive. In the case of the sunrise, we might speak of the colours, or the overall “beauty of the scene”. Yet even if it were possible to isolate the colour or the perceived “beauty” from the event itself, everyone agreed that the component on its own wouldn’t account for the scene’s “power”. In other words, to revert to the old gestalt notion, in some indefinable way, the whole is frequently greater than the sum of its parts.

The authors of the article I’ve referenced imagine any view of a fixed human nature as death to human development. They discuss at great length how social structures are reinforced by assumptions about human nature, those assumptions originating in the desire of the powerful within society to remain at the top. Chief among these assumptions is that human nature is fundamentally bad. If we didn’t oppress (so the argument goes), the wheels would come off the bus and all would descend into anarchy.

As I’ve argued repeatedly, such simplistic overstatement lends little credence to an article or an argument. So where do I begin to present my own view, and why do I care? I’ll leave you today with one of my favourite renderings of our mysterious humanity, courtesy of Alexander Pope.

from An Essay on Man

Placed on this isthmus of a middle state,
A Being darkly wise, and rudely great:
With too much knowledge for the Sceptic side,
With too much weakness for the Stoic's pride,
He hangs between; in doubt to act, or rest;
In doubt to deem himself a God, or Beast;
In doubt his mind or body to prefer;
Born but to die, and reas'ning but to err;
Alike in ignorance, his reason such,
Whether he thinks too little, or too much;
Chaos of Thought and Passion, all confus'd;
Still by himself abus'd, or disabus'd;
Created half to rise and half to fall;
Great Lord of all things, yet a prey to all,
Sole judge of truth, in endless error hurl'd;
The glory, jest and riddle of the world.

http://www.isreview.org/issues/47/wdss-humnature.shtml

And now for something completely different.


To date, my blogs have been exclusively my musings on whatever struck my fancy but now I’m going to mix it up a bit. I’ve been involved in a very interesting exchange (with someone who shall remain anonymous) over my blog “Grey is the best colour” from a couple of days ago. The exchange is one I find heartening. It’s fun and my fellow conversationalist displays the subtlety that I find so lacking in so many spheres these days. I suspect the conversation will continue but I thought you might like to see how things have gone so far.

_____________: I understand the notion of contextualization and the philosophical importance you are attaching to grey, however, I am reminded of a statement I often make in classes I teach. “Moral relativism is intellectual cowardice.” Sometimes the world is about blacks and whites, and standing up for black, or standing up for white, are the only two choices, because any shade of grey is de facto white or black. Get my drift? Sometimes the aberration from the standard is to fall holus bolus into moral vacuity. Another thoughtful piece. So glad I saw the first one.

Me: I agree with your general contention: positions – clear and unequivocal – are frequently required. What bothers me in our pronouncement-happy world is that such pronouncements commonly take the place of a supporting argument. While the motto, appeal or “call to action” might appear clearly defensible, without an understanding of how one arrived at such a place, actions become shallow and little more than the herd mind at work. In such an environment, especially when media coverage/frenzy is added to the mix, the result can be a radicalized group of slogan chanters fueled by emotion without understanding. Taking a position – clear and unequivocal – should be the end of a process, not where one begins. And thank you for the thoughtful comment.

_____________: I agree with the idea that position should be based on some understanding rather than blind acceptance. But if we applied that logic to many things, the outcomes I believe would be deeply distressing to most. Like the Allegory of the Cave, the truth is hard to take, like turning the lights on in the morning. This relates to John Rawl’s philosophic notion of the “veil of ignorance,” or even John Lennon’s song “Imagine” for that matter. But here’s a thought. What if we take a position on an issue without entering the process of meditation you allude to, and instead take the position by stripping ourselves of our preconceived notions, biases and hang-ups? Chances are, the just or virtuous position would be on the only option left on the table. The magic is then realized by reassuming our inherent biases and still ending up at that point- whether it be black or white.

Me: “Know thyself, presume not God to scan” to coin a phrase. I have a fundamental, core conception of life that includes an element of mystery in all things, including the self. It is difficult for me to imagine recognizing all of my “preconceived notions, biases and hang-ups” let alone being able to rid myself of them, even for a moment. To my way of thinking, being rational requires that we recognize the fluidity and indefinable character (at the very least, some small part) of ideals such as virtue, goodness, etc. While I will forego the opportunity for the moment, this exchange offers me an excellent springboard into a topic that is dear to my heart. As for just what that topic is, tune in another day.

Grey is the best colour

I was talking to a former student of mine last week and she was bemoaning the state of the world in general but, more particularly, how difficult it is to change the way people think and act regarding things that have more impact on their lives than most would credit. Politics is the obvious example but that is the subject of more than a few blogs to come. For now, I prefer to deal with the issue in more general terms.

This issue matters to me because, as I mentioned in an earlier blog, friends have often accused me of cynicism, a charge I refute at every opportunity. The cynic, as defined by Dictionary.com, is “a person who believes that only selfishness motivates human actions and who disbelieves in or minimizes selfless acts or disinterested points of view.” The nature of the definition itself serves to illustrate what annoys me about being called one. This “person”, in order to conform strictly to the definition, would need to spend all of his/her time assessing ALL human action in such a way. In truth, while I know the definition isn’t intended to suggest such a thing, it serves as a useful illustration of what simplistic labeling tends to promote.

Who among us has not been cynical about something? If cynicism is regarded as a simple feature of human behavior – a common one, in fact – is it so difficult to believe that self-interest is an element in the things we do? What frustrates me on a regular basis is how readily people, especially public figures, are dismissed (even condemned) if so much as a hint of self-interest is evident in anything they do. In this absolutist view of charitable work, for example, one is either the next best thing to Mother Teresa or a grasping con artist seeking to sway public opinion in one’s favour even while busy pursuing personal gain.

As an idealist, I prefer to take the long view. Finding examples of bad behavior becomes easier as more and more of our lives become public. Facebook and other social media invite certain of us, it seems, to put on display – potentially for all the world to see – concrete examples of pettiness, prejudice, misogyny, and all varieties of extremes that most of us would choose to hide even if we were subject to them. That being said, surely we do not accept the idea that any person’s totality can be captured by one facet revealed through an intemperate remark or thoughtless action? I suspect all of us have had thoughts of which we would prefer no one ever become aware.

And yet, for all of the world’s flaws, we still manage to make occasional progress here and there, usually as a collective first, and then as individuals. Attitudes toward slavery provide an obvious example. I’m hard-pressed to think of anyone who would defend the concept today even as I acknowledge that prejudice is alive and well in the world. The reality of the latter does not support the notion that things haven’t changed. To my mind, the most remarkable shift of the last twenty odd years remains attitudes toward sexual orientation. To be openly gay twenty years ago was to invite all kinds of trouble. In Canada today, as a societal principle, it hardly causes a ripple.

All of which is to say, this idealist continues to believe that goodness is real and, in fact, evident in the world. And so, too, is evil. Somewhere in the mixing of the two arises the dynamic of day-to-day experience – in each of us and in society at large. We do not live, either in our own lives or in that larger world, in black or white. Everywhere you look, expect to find grey.