Monthly Archives: August 2014

Educational Technology: It’s Not Black and White

There has been some debate for many years now — decades and counting — about educational technology: its effectiveness, its cost, its benefits and its drawbacks.

The questions go something like this: What measurable evidence do we have that technology improves education? Does it provide any benefits over traditional pencil-and-paper learning? Is it worth the cost? Could there be deleterious effects — how will it affect the physical, psychological, or emotional well-being of students?

Let me say right up front that I unequivocally believe that we should be adopting and adapting new technologies in the classroom.

But don’t call me an edtech cheerleader.

The debate over educational technology is too often treated as a black-and-white situation: Should we or shouldn’t we? And I understand the logical impetus for that: administrators and policy-makers need to know whether to “flip the switch” or greenlight such adoptions. Because of this viewpoint, all modern (i.e., electronic) technologies are clumped together as simply “educational technology” or “technology in the classroom”, and we are asked to justify its presence.

Because of that dichotomy, you have a couple of opposing camps:

There are the technology pessimists, those who fear change. They will point out the inherent costs of purchasing and using technology, as well as some of its potential ills and dangers: the effects of too much screen time, the risks of social networking and online communities. Of course, there is also the old adage that “if it ain’t broke, don’t fix it.” Another term for these people could be “Luddites.”

Then there are the edtech optimists, those progressives who adopt, embrace, and evangelize the shiny new promises of better learning through modern technology. They jump to use the newest hardware gadgets or the buzziest new app they have heard about. These are the early adopters, the beta testers, the first in line for the newest techno-toys. They attend conferences that, in many cases, are glorified pep rallies for technology in the classroom. But they don’t always acknowledge or give credence to the potential pitfalls, challenges, and limitations inherent in said technology. I would call these folks “The Cheerleaders.”

However, there is a third camp, and I like to think this is the one to which I belong: the edtech realists. I really resonate with this quote by William Arthur Ward: “The pessimist complains about the wind; the optimist expects it to change; the realist adjusts the sails.” … and not just because I’m a fan of sailing!

The way I see it, there is very little benefit to being either entirely for or entirely against educational technology because, like most things in life, both sides have some valid points. To blindly reject — or blindly accept — these tools into our lives is foolish or, at the very least, uneducated. Only by acknowledging valid arguments on both sides can we make wise and informed decisions that solve problems — and isn’t that what technology is all about? Merriam-Webster defines technology as “the use of science in industry, engineering, etc., to invent useful things or to solve problems.” How can we do that, use tech effectively to solve problems, if we aren’t willing to look at both the benefits and the concerns or limitations? To put it another way: being a quarterback is going to help you win the game more than being a cheerleader will.

What are the benefits of educational technology, and how do we measure them?

This isn’t a straightforward question to answer, because an answer that is acceptable to some may not be acceptable to others. There are a variety of benefits that I, and many others, have witnessed in the classroom, including:

  • Increased motivation — sometimes due to sensory-stimulating mechanisms like multimedia videos, animations, and games, but even more so from simply doing authentic, “grown up” tasks using authentic “real life” tools.
  • Increased efficiency — by efficiency, I mean a reduction in the time and materials required to do various tasks. Clearly, it is faster and easier to instantly find something on the World Wide Web than to wait until you get a chance to head to the library. Written work can be revised by spell-checking, cutting and pasting, moving sentences around, and eliminating words… without having to continually erase, rewrite, etc. Using a virtual math manipulative, or submitting a paper at the click of a button, saves time compared to handing out or turning in papers, or cleaning up materials and supplies.
  • Individualized instruction — things like practice games, self-paced tutorials, and “flipped classroom” videos allow students to get the remediation, review, or practice they need… without slowing anyone else in the class down. Students who are more advanced and capable of extending their learning also have the opportunity to soar even higher, and push into “above grade level” tasks and challenges that would not normally be available to them without a computer.
  • Authentic problem-solving and real-world “21st century skills” — in many professions, computers and technology are not optional; they are the tools of the trade. Being able to apply knowledge while using the same methods and techniques as the pros use allows students to produce “real world” products, as well as better preparing them for college and careers. Being able to collaborate and communicate on a global level is something made possible by technology, and something the modern workforce needs to know how to navigate.

The problem comes when we ask “How do we know the above things are actually happening? How do we measure these results?” That’s a tricky question to answer because, even though these tasks and benefits are guided by standards (such as those set forth in Common Core, as well as technology standards like those proposed by ISTE and The Partnership for 21st Century Skills), the argument could be made that a lot of the above things are subjective, and that the evidence is merely anecdotal.

Policy-makers and administrators tend to prefer the concrete: something that can be easily and objectively measured, and compared to peers or to past performance. So, how can we measure the effect of technology on learning? Well, when it comes to learning core subject content — math, language arts, sciences, social studies — there are already tools and metrics used to measure these things. Tests from textbook curricula and, more validly, state-wide and nation-wide standardized tests can be used. However, even relying on “standardized tests” as a measuring stick for the efficacy of technology is a sticky wicket for two different reasons.

First of all, those standardized tests — and even the standards they are measuring — are currently in the middle of a revolution. The Common Core State Standards initiative has introduced not only new (and, in some cases, more rigorous) standards, but also entirely new testing methods that require new ways of demonstrating knowledge. This means it will be harder than ever to isolate “technology” as a variable; you can’t really compare results of the current testing to results obtained in previous years, because the playing field itself has changed. Not only that, but the results in many cases aren’t even accessible — for example, in California we have no way to know how well our students performed on the inaugural administration of our new state “Smarter Balanced” test of Common Core standards; the state is treating it as a pilot/practice year, so schools don’t even have actual standardized-testing benchmarks from this year. (Speaking of Common Core testing, one fact that can be seen as a testament to the value and importance of technology is that the tests are now to be done electronically, administered entirely on computers or tablets! Electronic testing isn’t new, but it has never been so widely adopted — mandated, even — in public K-12 schools.)

However, there is another danger inherent in looking to some sort of standardized metric for the value of technology: it assumes that the technology (hardware or software) itself can be an isolated variable. This doesn’t make logical sense. A computer or an iPad is a tool — just because it is an electronic device doesn’t make it any different from other tools such as, say, a hammer… or a pencil, or a book. Yes, even those, as antiquated as they may be, are “technology.” Would it be valid to draw conclusions about the efficacy of a hammer by simply giving it to some different people and measuring how well they build a house? No, because the big variable here is not just the tool, but the implementation of that tool; a skilled carpenter is going to be able to do more with a hammer than an unskilled person would be able to do even if given power tools. Likewise, a skilled teacher is likely to get better educational results using “old-school” pencil-and-paper tech than an unskilled teacher using computers or tablets would achieve. In such a scenario, some people would draw the conclusion that “technology is not effective” for learning, but it would be an invalid conclusion because “technology” was not an isolated variable — the real variable was the teacher’s skill and proficiency with implementing a given set of tools.

Unfortunately, there are some studies that promote the idea that technology can lead to learning in a vacuum, even in the absence of human intervention or guidance — for example, Sugata Mitra’s “Hole-in-the-Wall Project.” This is misleading and unfortunate, not only because it reinforces an all-or-nothing “black and white” approach to educational technology, but because it is simply false. Sure, certain (limited) learning occurred autonomously when children were presented with technology; for that matter, studies have shown some learning to occur in children who just have a book plopped in front of them to read. Does this mean we should just hand books to children and take a hands-off approach to education? No. There’s a reason why we haven’t been doing that, and the reason is that it simply doesn’t work. Even when “learning occurs”, as reported in Dr. Mitra’s work, the studies show that both the extent of knowledge and the speed at which it is acquired are improved through the presence of teaching and guidance. (Note: Dr. Mitra’s original study showed only that users can figure out a user interface on their own… which anybody who has played a video game without first reading the instruction manual can tell you. His subsequent studies actually found that for acquiring content knowledge, such as biology, users/subjects benefitted from some guidance from an adult.)

A vast body of edtech research shows that you can’t simply decouple “technology” from “skilled use.” This is not to say that all educational technology is created equal, or that all tools will be worthwhile or viable investments of time, effort, or money. As I have explored in previous posts on this blog, some tools are more effective for certain tasks than others. However, very few tools will be effective at all if not used the right way. The takeaway here is that simply investing in a device and sticking it in a classroom may not achieve desired results — you also have to invest in the training and skilled human resources required to make effective use of that technology.


Are there drawbacks to educational technology? What about the well-being of students? What about the cost? 

Concerns about educational technology should not merely be dismissed — they are warranted and, in many cases, valid. Change is difficult and scary… partly because, even though we know change can mean betterment, we also know that in most cases “it could always be worse.” Change presents that as a real risk: things could actually get worse.

The reality is that there are drawbacks to using technology — especially in excess, or without proper precautions or training. For example, there can be negative psychological and sociological aspects to online interactions and social networking, including cyber-bullying, depression, and anxiety. However, these are aspects that can be mitigated through precaution, knowledge, and proper training.

But what about physical/health effects? One example of this is the rising incidence of carpal tunnel syndrome. Other modern (but not entirely new) concerns include screen time and radiation (before, it was microwaves, radio, and TV… now it is smartphones).

Science supports some of these fears, so they are something to keep in mind and try to mitigate. Having said that, what is the solution? To ban all electronic appliances? To thrust ourselves back into the stone age?

If children are prevented from using modern devices due to these sorts of concerns, they will be placed at a crippling disadvantage for entering an information economy and a globally competitive workforce where, in many cases, they will be required to use them. If we propose that it is unhealthy to be in front of a screen for much of the day, then we also have to exclude access to those jobs that require extensive screen time. According to the United States Department of Commerce, in 2011 ninety-six percent of working Americans used new communications technologies as part of their daily life, while sixty-two percent of working Americans used the Internet as an integral part of their jobs (and those numbers have probably grown in the past few years).

As for the cost of adopting technology: meditate on the facts stated above and consider the cost of not adopting technology. What cost to society does an unemployed person present? What does it cost our country when we are globally non-competitive in areas like research and invention, all because of a hesitance to adopt modern tools that help us achieve these goals? Aside from these rhetorical questions, I have already shown in previous posts that going paperless can actually save money. So, while it is possible to waste money by overspending on expensive or unwarranted technology purchases, with wise decision-making it is possible to obtain effective edtech that is actually extremely cost-effective.

Solution: EdTech is not “black and white” — it may be more of a gray fog, but one that we can navigate if we just adjust our sails! 

At the end of the day, it comes down to this: change is scary, but it’s going to happen. That’s a guarantee. And, despite the problems and challenges that may often be introduced, change is good. Who would say we should go back to riding horses everywhere, to hunting down food with a wooden spear, to sleeping in caves? There are a few who would, but most people have embraced modern conveniences because they make our lives easier and more enjoyable. When it comes to computer technology, the world has already made that transition over the past 30 years. It’s time for the classroom to follow suit.

Educational Websites That Use Flash

Four years ago, with the ongoing advent of HTML5 and the lack of Flash support on iPads, many people declared that “Flash is dead.” However, anybody who has been paying attention will realize by now that killing Flash is easier said than done: Flash, Java, and similar technologies were used for more than 20 years of rich internet application development — especially in the world of educational websites. While that has started to change, many (if not most) of those educational resources still have not migrated to cross-platform technologies such as HTML5. So, in short, it is still important for a paperless device to be able to access all sorts of educational websites — including those that use Flash (and preferably even those that rely on other plug-ins like Unity, Java, Shockwave, and more). Chromebooks can access most educational websites, including those that use Flash. Windows and Mac computers can access even more, including sites that use Java, Shockwave, Silverlight, Unity, etc. But iPads and Androids are much more limited; neither can access Java, Shockwave, or other plug-ins, and Android devices can only access Flash by going through some special setup routines.
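For what it’s worth, you can check whether a given browser will run Flash content at all before sending students to a site. Below is a minimal sketch of such a check, written in TypeScript; it assumes a 2014-era browser that reports plug-ins under the conventional names (“Shockwave Flash” and “application/x-shockwave-flash”), and it is an illustrative example rather than a definitive compatibility test:

    // Minimal sketch: does this browser appear to support Flash content?
    // Assumes the conventional plug-in and MIME-type names used by 2014-era browsers.
    function hasFlashSupport(): boolean {
      // Chrome, Firefox, and Safari list installed plug-ins here.
      const plugin = navigator.plugins && navigator.plugins.namedItem("Shockwave Flash");
      if (plugin) {
        return true;
      }
      // Fall back to the registered MIME type, which some browsers report instead.
      const mime = navigator.mimeTypes && navigator.mimeTypes.namedItem("application/x-shockwave-flash");
      return Boolean(mime && mime.enabledPlugin);
    }

    // Example: warn students on devices (like iPads) that cannot render a Flash activity.
    if (!hasFlashSupport()) {
      console.warn("This activity requires Flash, which this device does not appear to support.");
    }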

Currently, Flash is used on about 15% of websites — down from the roughly 25% that required Flash four years ago. However, the percentage of educational websites that require Flash is much higher. Here’s a list of some examples…

Textbook-Publisher Curriculum Resources

Other Educational Resources That Use Flash
(just a few examples — there are many, many more!)

And these are just the tip of the iceberg…

As you can see, even in 2014 students are missing out on many resources and opportunities if they are unable to access educational content (including Flash) online — and all of the above can be used for free, without installing any apps.

Some people believe you can get around this problem by simply using a “Flash app” for iPad (or Android). These are actually cloud browsers, and they are not great to use for a variety of reasons. For one, they are laggier and not as smooth an experience as simply using Flash would be. The bigger problem is that these cloud browsers — including iSwifter, Rover, Puffin, Photon, CloudBrowse, and others — are actually streaming video to your device, which requires a ton of bandwidth; they can cause serious problems when used on multiple devices sharing an internet connection, which is exactly the situation at schools.
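To get a rough sense of just how much bandwidth that is, here is a back-of-the-envelope sketch; the per-device bitrate, class sizes, and uplink speed below are illustrative assumptions, not measurements:

    // Back-of-the-envelope estimate of cloud-browser bandwidth demand.
    // All numbers below are illustrative assumptions, not measured values.
    const streamMbpsPerDevice = 1.5;  // assumed video-stream bitrate per device
    const devicesPerClass = 30;       // assumed size of a 1:1 class set
    const classesOnline = 4;          // assumed classes cloud-browsing at the same time

    const totalMbps = streamMbpsPerDevice * devicesPerClass * classesOnline;
    console.log(`Estimated demand: ${totalMbps} Mbps`); // 180 Mbps in this scenario

    // Compare against a hypothetical school internet connection:
    const schoolUplinkMbps = 100;
    if (totalMbps > schoolUplinkMbps) {
      console.log("Cloud browsing alone would saturate the school's connection.");
    }

By contrast, running Flash (or HTML5) content natively downloads the assets once and renders them on the device, rather than streaming a continuous video of a remote session.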


The Perfect 1:1 Paperless Device

(TL;DW – Too Long; Didn’t Watch… here’s the Cliffs Notes version: There is no such thing as a perfect device. Every device currently available has some strengths but also some weaknesses. To learn more about those specific strengths and weaknesses, see the Google Presentation below.)

10 Needs / Considerations for a 1:1 Paperless Device

Ten 1:1 Paperless Classroom Needs