Further testing required
Specifically in computer science.
Students who have access to computer devices in the classroom do significantly worse than colleagues without them, a study has found. In a study by MIT's School Effectiveness & Inequality Initiative, titled The Impact of Computer Usage on Academic Performance: Evidence from a Randomized Trial at the United States Military …
You have to weigh the questionable value of the activities performed on computers, and the effect those have on learning and learning approaches, against the extended set-up time required by barely IT-literate teachers and the arguments in class from children who become increasingly de-socialised by obsessive, repetitive use of computers.
Consider the value of IT skills learned by a five- or six-year-old who will enter the workplace in, say, 10 or 15 years' time, and the rate of change in IT, against the value of learning lifetime skills in reading, writing and simple mental arithmetic.
I see computers used to great success in keeping bothersome children occupied (playing on Mum's smartphone while she chats to friends, or keeping them quiet on a long car journey), and laptops and tablets in schools serve much the same invaluable role, especially where there are children who don't have English as a first language or who have behavioural problems - the laptop doesn't require any real communication and keeps them nicely occupied.
I'm a big fan of using these invaluable teacher aids/activities for other people's children at school or when we're out and about, but personally I'd rather my little kids (aged 3 & 6) get a real education and spend meaningful, fun time with us.
Currently we have to write our CS-type exams on paper, since those are the rules for exams.
The only "good" thing is that while it's a real pain in the arse to code on paper*, it's an even bigger one to mark it.
I don't mind too much, since I came of age in the previous millennium, so tend to hand write my notes anyway.
The class which only allowed computers in the first two rows, and had a TA sitting in the back row, did the best, simply from making people actually have to pay attention.
The classes which have breaks longer than an hour apart tend to leave people with terrible recall of anything past the 60-minute mark, based on what I need to help explain to otherwise very on-to-it kids, anyway. Four-hour calc classes with a single break led to the lowest passing rate.
* actual code. Design stuff and pseudocode are fine being hand written for exams IMHO
We further cannot test whether the laptop or tablet leads to worse note taking, whether the increased availability of distractions for computer users (email, facebook, twitter, news, other classes, etc.) leads to lower grades, or whether professors teach differently when students are on their computers.
If the difference is between students who chose to use a computer and those who chose not to, it implies an attitude difference that may translate into greater self-reliance and determination to succeed.
Control group with normal laptop/tablet on-net, no prohibitions.
Group with f***book, tw*tter, instant messaging and email blacklisted, but with internet access allowed (rationale: so they can search for the stuff they are being taught).
Group with electronic devices off-net, just for notes taking.
I suspect that the difference observed in the experiment will show up between groups 1 and 2, with group 3 giving a marginally better improvement in an academic setting (albeit one that is easier to police).
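Group 2's blacklist would indeed be easy to police at the machine level. A minimal sketch, assuming the lab machines use a standard hosts file; the domain list is illustrative, and the default path here is a scratch copy so the sketch is safe to run (on real machines it would be /etc/hosts):

```shell
# Null-route the blacklisted services by appending entries to the hosts file.
HOSTS="${HOSTS:-./hosts.blacklist}"
for domain in facebook.com www.facebook.com twitter.com www.twitter.com; do
    echo "0.0.0.0 $domain" >> "$HOSTS"
done
```

Apps pinned to IP addresses, or browsers doing DNS-over-HTTPS, would sidestep a hosts-file block, so a real deployment would more likely filter at the network edge.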
That would have been my thinking too - it's not the devices being present or not that would be the problem, but how those devices are used.
Also not tested for is users' proficiency with their devices. I've got a Bluetooth thing that unfolds into a nearly full-sized keyboard, and I can slap down truncated notes/reminders into a phone while carrying on a conversation. I couldn't do that if I had to rely on the on-screen keyboard. Even with the right kit, though, proficient typists and hunt-and-peck typists are going to devote more or less attention to the note-taking process depending on how good they are. And the more attention you have to put into working your device, the less you have to spare for input from the teacher.
> That would have been my thinking too - it's not the devices being present or not that would be the problem, but how those devices are used.
While I agree in principle, can we expect computers to be used in a non-distracting manner?
A paper and pencil are passive, whereas a computer is more interactive: things happen asynchronously there, diverting your attention, so there is likely to be a mental cost. Unless you are just using the computer for fast typing; then it can be beneficial. But I am not convinced that would ever be the typical use.
A paper and pencil are passive, whereas a computer is more interactive: things happen asynchronously there, diverting your attention, so there is likely to be a mental cost.
Basic learning theory states that the closer to the actual performance conditions the learning conditions are, the better the performance will be. If computers are used to take notes while retention is measured using only paper and pen or oral tests, it should be no surprise that students who use computers in class do worse than those who use more traditional note-taking methods.
Well, in college quite a few years ago, when I ended up in a room that had PCs at the desks, I will say I generally paid way less attention to the class, as I was busy looking at the internets, chatting with people I knew over IM, or even playing games against others in the same class.
So I can see how this probably is true.
Now, in classes where there was no internet access and the desks were too small even for my laptop (we're talking over 15 years ago, so they were quite large), I paid way more attention.
But then she is quite severely dyslexic, so I am going to spend half term starting her touch typing - which apparently does wonders for their language processing by using different bits of the brain.
Finding an appropriately sized keyboard was relatively hard work though - and at least the staff at school are supportive.
In the general case I suspect it depends far more on the attitude towards the machine than on the machine itself. Denying oneself internet access is a simple switch on most devices; it just needs to be applied at the appropriate times.
Airplane mode in meetings is always good - you come out and people ask why you haven't replied to some inane email...
"including the ease at which students could be distracted by surfing the internet"
Taking away their capability to use Facebook, Instagram, Gmail, IM apps and similar timewasters creates a surprising increase in productivity.
Not just for schoolkids either - their middle-aged parents seem to be more productive at their jobs too.
I was taught to touch type in the nineties, whilst still in primary school, so I could use a computer during class as a way of getting round my terrible dyslexia (seriously, whoever decided on that name is a c***). But as it turns out, 90s laptops cost a small fortune, had crap screens and were loud enough to irritate just about every other classmate (that, and would you give something worth more than some people's cars to someone under the age of 11?), so that got thrown by the wayside.
These days, despite how relatively quickly I can type, I like a good ol' fashioned fountain pen for taking notes. Yet even that seems like a dying art (seriously, does no-one do joined-up writing anymore?)
use a computer during class as a way of getting my round my terrible dyslexia
Years ago, a friend of mine was tested for dyslexia when applying for university.
The test came out positive - he was dyslexic, so they gave him a laptop.
He's not dyslexic. He was stoned when he went for the test.
[Who is dyslexic, and very glad not to have found out until mid-30s]
I started learning computer programming in the 1970s. FORTRAN, mostly. As programming became my main focus, the result is that I now tend to print in block capitals, slow down for printing lower case, and have real trouble with script writing. Even my signature is an illegible scrawl, though that's mostly laziness.
Keypunch machines with their asymmetric controls contributed to my deteriorating touch-typing skills. Actually, I still do pretty well there, but my form is terrible. Fortunately speed and accuracy aren't paramount. (Yes, accuracy is, but errors only need to be corrected; I'm not typing on paper, so I can easily correct after I type. In case anybody misunderstood what I meant.)
Anyway, yes, I represent what is wrong with the world: people losing skills that are less relevant than they used to be.
I saw where someone noted college students prefer real textbooks over e-books for some of their courses, usually majors or key prereqs. There is something to be said for paying attention to the lecture and taking notes by hand, versus the potential distractions of a tablet or laptop.
Yes, but why? Real books more familiar? Newer generations will be more comfortable with electronics. Easier to scribble in the margins or otherwise take notes? Always on, no problem with charging, specific to the task, frees up the computer for looking up references on the side, easy to add (and see) sticky bookmarks? Doesn't react to stray touches when you're reading in an odd position and shift positions? Or, for that matter, if you set it down and pick it back up. Also, the screen doesn't go dark to save power.
There are lots of possible contributing factors, and some may merely be problems to solve. Others may be inherent. I'd be curious to know why, not just what.
anything less than 3 standard deviations is undecided. Only believe anything above 5.
Computers have their uses (for engineering and science) to grasp the meaning of the maths and examine real data.
I personally never understood the Runge-Kutta method until I could graph it on a computer.
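For anyone who wants to poke at it the same way: the classical fourth-order Runge-Kutta step fits in a few lines. A minimal sketch, applied to y' = y so the exact answer e is known:

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate y' = y from t = 0 to t = 1 with y(0) = 1; the exact answer is e.
t, y, h = 0.0, 1.0, 0.1
while t < 1.0 - 1e-12:
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h

print(y, math.e)  # RK4 with h = 0.1 lands within ~1e-5 of e = 2.71828...
```

Plotting y at each step against math.exp(t), then shrinking h, is where the method's fourth-order convergence becomes visible on a screen in a way it never does on paper.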
anything less than 3 standard deviations is undecided. Only believe anything above 5.
No, you're misinterpreting what has been said.
This isn't a statistical test yielding 0.2 sigma; this is a variation from the norm of 0.2 sigma. Assuming the original sample was statistically significant, that's a measured change, not a statement of uncertainty.
 The article didn't mention significance, and I didn't check any further.
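To make the distinction concrete, a sketch with illustrative numbers (the study's actual sample size isn't given here): the test statistic grows with sample size, so a 0.2-sigma difference between group means can still be a multi-sigma detection.

```python
import math

# Effect size vs significance: a 0.2-sigma *difference in means* is not a
# 0.2-sigma *detection*. For two groups of n students each, the two-sample
# z-statistic is roughly d / sqrt(2 / n). Numbers below are hypothetical.
d = 0.2    # difference between group means, in standard deviations
n = 800    # students per group (illustrative)
z = d / math.sqrt(2 / n)
print(z)  # 4.0: a small effect, measured with ~4-sigma confidence
```

So "believe nothing below 5 sigma" applies to the detection statistic, not to the size of the effect itself.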
So, I've been in the actual military course and environment described and for me, taking notes by hand is painful, slow, hard to read later, and far more distracting in terms of losing track of what the lecturer was saying. I can type upwards of 200 wpm (when you write code for a living, you get pretty good at that particular skill :) so the advent of computing in information-intensive activities such as education was a godsend. I took much better notes when referring to textbooks and later, other material, rather than in lectures. I haven't seen any studies (but haven't looked recently), but where computing devices seem to excel is in providing the interactive experience of presenting problems for students to solve and patiently helping them to work toward understanding.
This can be done lots of ways, and is easiest to apply to subjects where there tend to be a few right answers and mostly wrong answers, so that software can be used to ascertain exactly which skills the student is deficient in. I've been suitably impressed by the increasing sophistication of software that isn't just looking for syntactic correctness (exactly the right answer using a very limited acceptable vocabulary, starting with single correct words), but can actually deduce semantic correctness, without being tripped up by linguistic structures that may be odd, particularly from local-language learners (English, where I am) and special education students.
The software being able to assess a student's starting level, progress, trouble spots in skills, etc., can be a great aid to the educator because the latter can spend most of their time on improving skills that the students actually need help with, instead of giving boring lectures about things that the students already understand ... and will tend to drift off to the distractions, missing the parts of the lectures they don't understand. We need studies about those kinds of computing tools, not whether the presence or absence of technology is a boon (no, I did not mean boom) or a bust.
That's like studying whether pencils, paper, textbooks, slide rules, etc., are worthless because some people wind up doodling, playing slipstick hockey, or passing notes to each other. Anything can be used or abused - we already know that. Heck, oxygen is lethal above a certain concentration, as is water if it gets above your mouth and nose. This is yet-another cockamamie study that some fuzzy-headed psychology degree-holder came up with that completely lacks rigor, logic, or a useful outcome.
I now teach, and I can say categorically that we're in trouble because of social media alone, with kids' phones flashing the camera flash every time they get a Tweet, Like, text, fart, etc. The attention span of dogs averages around 30 seconds, except when squirrels abound. The attention span of gnats is probably fractions of a second, given close personal observation. The kids today are down into the microsecond range, I swear. They are completely addicted to social media, foofy games that require no real skill beyond virtual button-mashing, and listening to mindless drivel (our parents thought rock 'n' roll was mindless drivel, but a lot of this stuff doesn't even have a melody, harmony, or lyrics that promote anything except sex, drugs, and violence, usually combined).
We are truly worried in education about who's going to run, let alone design and maintain, the nuclear power plants, aircraft, networks, food and water production and distribution, and everything else our "civili(z/s)ation" depends on to a degree that those without science, technology, engineering, and math(s) (STEM) backgrounds can't adequately fathom. These disciplines (which is why they're described by that word) require a level of concentration that those who can't/won't perform in them can't appreciate. Having severe attention deficit disorder that you weren't born with, but is induced via an environment of social media, flashy games and videos with no redeeming value, etc., is dangerous, as it produces exactly the same brain activity in animals and humans that those addicted to drugs exhibit.
If I were going to get a degree now, I'd get one in addiction disorders treatment ... except that the people afflicted don't realize/believe they are, and can't earn the income to pay for the treatment because they're unemployable with no useful skills. We are in biiiiig trouble ...
Suppose the students are using computers the way I use them in my job - I don't need to remember facts, just principles - I can find the facts online. Suppose the exams are still of the rote learning type, where memory wins over understanding. I would then argue that it is the exams losing relevancy to the modern world, with the students best equipped for that world being penalised by an archaic examination system.
Remember, these are military officers-in-the-making, and you're generally not sitting at a desk with infinite Internet bandwidth where Google is at your beck-and-call, along with Archive.org and everything else. One of the biggest logistics challenges for troops deployed to the pointy edge of the spear in hell-holes in places like Afghanistan is keeping a steady stream of batteries going to them that they have to carry around in numbers that would depress you ... no, I mean as in pushing you into the ground/mud, you're carrying around so many of them. Then you have to figure out what to do with them when they're dead, since armchair generals/lawyers working at environmental groups insist on leaving one's campground as one found it (which apparently includes a place remaining overrun by Taliban, ISIS, Al Qaeda, etc.). Some people would say, oh, no problem, just use solar power to recharge them, except that the size of panels needed for all of the equipment wouldn't fit on a Humvee, let alone your back. Plus, that stuff is notoriously fragile, even in protective cases, which add weight and bulk, and decrease efficiency substantially. At the right Sun angle, they additionally provide a gleaming target for the bad guys to shoot right at, hitting you and/or the critical power source.
I thought that having to learn obscure facts very quickly in one read, often not much more than a glance, was a complete pain in the patootie, but once I got out into the Real World, I realized that I had acquired skills that left my civilian counterparts in the dust, quite literally in cases where construction was being performed, when not on a battlefield in Southwest Asia or the Middle East. When you start to get into special operations scenarios, this advantage becomes even more pronounced, and your ability to know a lot and learn very quickly means the difference between life and death for you and your teammates, vs. the bad guys and people you're supposed to be protecting and/or rescuing.
We are evolving into Homo Digitalus and that may not be a good thing. An indication of this is what happens when things go to crap and the power, water, food, medicine, and sanitation quickly disappear. It doesn't take very long to go from a First World lifestyle to a Fourth World one - in the Third World, you get help from the First World - in the Fourth World ... there is no First World that can get to you, or is willing to expend the resources needed to do so, especially over the long term. Just look at Nepal a year after the killer earthquake that dropped all of that ancient architecture in seconds - it's never going to be what it was before, and the people who have lost everything are beginning to realize that, much to their long-term dismay that's becoming what would be called clinical depression here. There, as in much of the world, it's called "Life".
People with access to games, the internet, and so on will fuck about with distractions rather than paying attention to their work. The results would no doubt be different for a course which actually required computers, but "people provided with distractions can become distracted" is not some stunning new revelation.
Is a novel use of the word 'magnitude'
To put the results in context, if performance was expressed as a standard score (like IQs) with a mean of 100 for the no IT group, the wasters with their tablets and laptops averaged 97.
This is a good example of confusing statistical with real world significance.
Not at all surprising. Why bother to think and learn when you can just look up the answer on Google? Get in the real world, and they haven't a clue how to think. It's been shown that handwriting class notes is more effective than tapping computer keys, as more parts of the brain are involved.
It's a lot harder to pick up someone else's social science essay, rename it, do a quick find-and-replace to get around plagiarism checks and head off back to the student bar if you only have your private machine to find other people's coursework on.
Biting the hand that feeds IT © 1998–2019