
eLearning and Digital Cultures: untangling human values from the dispensations of technology #edcmooc

This blog post forms my response to the #EDCMOOC topic that asks “what does it mean to be human within a digital culture, and what does this mean for education?” – well, the first part anyway, as the question is considered from two perspectives, humanist and posthumanist, and I have yet to fully digest the resources that present the posthumanist perspective. As such, I’ll briefly consider the term “humanism”, then review Lowell Monke’s (2004) article The Human Touch, which is offered as a response to the apparent threat that technology poses to essential ways in which we learn and exist as “human”, before considering the views of Jaron Lanier on the subject.

Sharing – it’s human nature.

Humanism presupposes the existence of “human nature”. Often aligned with secularism, it is concerned with beliefs and ideas about the meaning and purpose of life, and it emphasises the value and agency of human beings. Undoubtedly, much of our educational philosophy and practice today is underpinned by humanist thinking.

In his article, Monke observes that, despite persistent claims, educational transformation fashioned by a variety of technologies has, thus far, largely failed to materialise. He believes the reason for this can be attributed to an “uncritical faith” in technology that fails to acknowledge the interaction between dispensations inherent in technology and human values.

Although technology can provide information and simulations of experience, Monke sees this as merely the decontextualised consumption and manipulation of abstract symbols on a two-dimensional screen – no real substitute, in his view, for the first-hand, concrete experience a person must have of nature, everyday objects, people and community in order to make meaning out of experience. Similarly, the quantum physicist Amit Goswami, in a video I watched recently, observes that “people are interested in processing meaning and values” (26.00 min).

Monke claims that the values embedded in computer-simulated experiences are distinct from the values inherent in real, concrete experiences. Here, he cites the example of a computer game called “Oregon Trail” that teaches children about the exploration of the American frontier, and whose implicit message, according to Monke, is one of resourcefulness – resourcefulness based on the rational and calculated decisions of the pioneers regarding the appropriation of goods and commodities, rather than any resourcefulness derived from the pioneers’ determination, courage, ingenuity and faith in the face of adversity. As a consequence, “the resilient souls of the pioneers are absent” from the game, because such technological simulations cannot engage with these deep human qualities.

The description of the “Oregon Trail” computer game put me in mind of the “dinosaur” sequence in the Corning Glass video we saw in Week 2; the experience of nature is mediated and augmented by technology, and accordingly demonstrates “the ambiguity of technology” – its ability to promote certain qualities and relegate others. It also put me in mind of Angela Towndrow’s blog post, which beautifully makes the same point.

Monke goes on to relate how straightforward it is, in his experience, to teach computer skills to students who have little or no prior experience of computers, but who instead have rich life experiences, gained through traditional play, on which to build those skills.

“Ironically, it was the students who had curtailed their time climbing the trees, rolling the dough, and conversing with friends and adults in order to become computer “wizards” who typically had the most trouble finding creative things to do with the computer”.

Climbing trees – hands on learning of deep human qualities.

“Certainly, many of these highly skilled young people (almost exclusively young men) find opportunities to work on computer and software design at prestigious universities and corporations”.

In his New York Times article, the virtual reality pioneer turned digi-tech critic Jaron Lanier seems to echo similar sentiments when he asks, “how do we use the technologies of computation, statistics and networking to shed light — without killing the magic? […because] it goes to the heart of what we are after as humans”. The magic Lanier alludes to is that which is quintessential to human nature and being human. Lanier recognises that many technological design decisions today are made by the individuals Monke describes above – the geeks of Silicon Valley – and that, ultimately, their decisions can either lock in or lock out elements that speak to human values.

Take music as a case in point – something Sharon Flynn also picked up on in her reflective post – and the digital music standard known as MIDI (short for Musical Instrument Digital Interface). Here, Lanier explains that MIDI

 “was conceived from a keyboard player’s point of view…digital patterns that represented keyboard events like ‘key-down’ and ‘key-up.’ That meant it could not describe the curvy, transient expressions a singer or a saxophone note could produce. It could only describe the tile mosaic world of the keyboardist, not the watercolor world of the violin” (p.7).
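As an aside, the key-down/key-up events Lanier describes really are this sparse at the byte level. A minimal sketch of my own (not from the book): a MIDI note is just a status byte, a note number in fixed semitone steps, and a single velocity value – there is nowhere in the format for the continuous, “curvy” expression of a voice or a violin.

```python
# Illustrative sketch: raw MIDI channel-voice messages.
# A note event is three bytes: a status byte (0x90 = note-on, 0x80 = note-off,
# low nibble = channel), a note number (0-127), and a velocity (0-127).

def note_on(note, velocity, channel=0):
    """Build a MIDI note-on message: 'key-down' with a single velocity value."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    """Build a MIDI note-off message: 'key-up'. The note simply ends."""
    return bytes([0x80 | channel, note & 0x7F, 0])

# Middle C (note 60) struck at velocity 100, then released.
# Everything between these two events -- the bend, breath or vibrato a singer
# or saxophonist would add -- has no representation in the note itself.
print(note_on(60, 100).hex())  # 903c64
print(note_off(60).hex())      # 803c00
```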

Jaron Lanier
Jaron Lanier – music making, the human way.

Software development, it transpires, is particularly prone to the phenomenon known as “lock-in”, often in an extremely rigid form. “Lock-in” happens when software is designed to work with other, already established programs, and when design decisions in the original program become increasingly difficult to modify because more and more software has come to depend upon it.
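To make the mechanism concrete, here is a hypothetical sketch (the programs and names are invented, not Lanier’s): an early, keyboard-centric event format that dependent programs read directly, so that enriching it later risks breaking everything built on top of it.

```python
# Hypothetical sketch of software "lock-in".

# Original program: stores a note as a key number plus an on/off flag --
# a keyboardist's view of music, chosen early in the design.
def encode_note(key, is_on):
    return {"key": key, "on": is_on}

# Dependent program #1: a sequencer that reads the "key"/"on" fields directly.
def sequencer_step(event):
    return f"key {event['key']} {'down' if event['on'] else 'up'}"

# Dependent program #2: a notation display that assumes the same two fields.
def render(event):
    return "note" if event["on"] else "rest"

# To add a continuous pitch-bend field now, encode_note's output format would
# have to change -- but both dependents above read it directly, so the discrete
# key-down/key-up model stays "locked in".
print(sequencer_step(encode_note(60, True)))  # key 60 down
```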

Software attempts to express many ideas, from the nature of a musical note to the nature of personhood itself. But digital designs don’t merely tend to promote or relegate certain qualities; they’re almost inherently predisposed to lock certain qualities in and lock others out.

I wonder, what qualities are we locking in and what qualities are we locking out in our new digital culture, and where does the balance lie between the human and the technological? In order to better understand the technological dimension, I’m going to have to give serious consideration to the “posthuman” concept.

Image sources:

#17 - Sharing!

Climbing trees

TEDxSF 2010 Edge of What we Know - Jaron Lanier ©Suzie Katz #2582

References:

Monke, L. (2004) The Human Touch. Available at: http://educationnext.org/thehumantouch/

Lanier, J. (2010) You Are Not A Gadget: A Manifesto. Penguin UK. Kindle Edition.

Lanier, J. (2010) Does the Digital Classroom Enfeeble the Mind? Available at: http://www.nytimes.com/2010/09/19/magazine/19fob-essay-t.html?pagewanted=1&_r=3&

Rosenbaum, R. (2013) What Turned Jaron Lanier Against the Web? Available at: http://www.smithsonianmag.com/arts-culture/What-Turned-Jaron-Lanier-Against-the-Web-183832741.html

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Published in eLearning and Digital Cultures

5 Comments

  1. Good posting bringing up a lot of questions. As someone who works with computers at a very surface level it may not be right for me to theorize but a fundamental quality of interacting with people is the ability to accept, understand and draw value from someone being “wrong” or mistaken. We seem to be adept at navigating a personal relationship that can be aggravating, frustrating or even a bit destructive that were it with a computer would cause us to chuck it out the window.

    There’s something sophisticated and important in our accounting for the quirky in humans that can’t be tolerated in machines. Even if we built a machine with “character” like the ones in Hitchhikers Guide to the Galaxy, Star Wars or Blade Runner I doubt we could ever accept them as anything more than machines. Anyway, what value is there in applying human qualities to assemblies of mechanical and electronic junk?

    • Thanks for reading my blog post and for taking the time to comment. I appreciate it. I’m glad you liked it. Interesting points you make too, Scott, about how we “tolerate” human imperfections and foibles. And interesting too that the main “characteristic” given to the robot in Hitchhiker’s Guide to the Galaxy was paranoia – Marvin, the paranoid android, with a “brain the size of a planet”, yet not given the opportunity to use it.

      • Had forgotten about Marvin. Reading over what I said, “tolerate” might be too strong a word for how we adjust our expectations and judgments of others. More like a literacy based on personal connection, experience, hopefulness and empathy. Having raised two daughters, there seemed to be no operating system they could display that didn’t drive me crazy, but never beyond endurance and hope. Even our graphics geek at work, who seems to be merely enacting the characteristics of normal humans, has a sort of fallback behaviour that couldn’t possibly have been trained into him in his early years spent off-planet:-)

        Read the Human Touch article and really liked “What is a Person” by Christian Smith http://www.brainpickings.org/index.php/2011/10/27/christian-smith-what-is-a-person/

        Could be that with the rise of online learning and the wholesale insistence on tech-everything we need to study humans at a deeper level. Makes sense that teachers should be in the middle of this. It’s not really about technology anyway but the teaching of people through whatever means reaches them in a genuine way. Can you imagine a heated debate among accountants over the de-humanizing aspects of Simply Accounting?
