Because the Edublogs site (which hosts this blog) now requires me to pay to remove advertisements from all of my posts, and to enable basic blog features such as the ability to embed videos (which is included with EVERY other free blogging service, and should be considered a basic essential in an educational blog), I have decided to move this blog to http://mlearning.wordpress.com. I used to keep just a backup of this site over there, but from now on I’ll be updating it with all of the content from this blog, as well as reinvigorating my posting! 🙂 There is a LOT happening in mobile learning, now that others have caught on to the potential and power of mobile devices, and I have much to share… 🙂
If a kitten can learn with an iPad… how much more can we learn? 🙂
This video shows the positive use of mobile devices in the classroom to provide a “back channel”, and also discusses other aspects of mobility, such as the teacher being able to interact with her class even though she was physically away that day.
I’ve been engrossed in an article on the tech blog Gizmodo this morning, which reveals that Dr. Alan Kay made the following statement about the iPhone when it was launched:
I’ve written several posts about Dr Alan Kay in the past, but to summarise, Dr Kay is one of the greatest minds in the history of computer science. He predicted (and invented) mobile computing and the windowed GUI, and was a pioneer of object-oriented programming and social constructivist learning. A large proportion of his work was in pursuit of a computing device to support learning – he is indisputably the first person to research and develop digital m-learning, and was involved with much of the design of the OLPC.
Dr Alan Kay’s prediction that a large, multitouch tablet would be an incredibly popular device must therefore, I think, be read in the context of his life’s work. I believe that he sees significant potential for a device like this to make a powerful impact on the way students learn, both formally and outside the classroom (Dr Kay has also been a proponent of “informal learning” since the 1970s).
Kay’s prediction for the iPad’s success is further supported by the work of another luminary in the computing world, Jef Raskin, whose work pointed to a simple, easy-to-use “information appliance” as having the most chance of success: a computer as easy to use as a toaster. The iPhone was a significant milestone towards that goal – and some believe the iPad will advance it even further.
When a pioneer of computing and mobile learning makes a prediction like this about the iPad, it is worthwhile taking note.
Over the last week or so, I’ve been keeping up with the story of the US school that activated the webcam on a student’s MacBook while the student was at home, and took photographs to allege that the student was handling drugs (which the student asserts were actually candies).
In the half-decade I’ve been involved with mobile learning, the issue of student etiquette in classrooms and schools has surfaced frequently. It is sometimes asserted, for example, that mobile phones and other portable digital devices are “intrusive” in classrooms; and they are cited as problematic when it comes to the recording of playground fights and bullying, or the secret, inappropriate recording of peers and teachers. While these issues concerning student use of mobile, portable, and ubiquitous devices are frequently discussed, the inverse responsibilities of schools and teachers are rarely, if ever, addressed.
But these issues now need to be properly addressed. This incident will almost certainly whip up fear in educational communities worldwide – particularly amongst students and their families. Many educational institutions have long-standing “student policies” on the use of mobile devices on campus; but almost none would have public policies on how mobile devices may be used by organisations when the student leaves the campus.
Turning on webcams when students and their families have a reasonable expectation of privacy is just one way mobile devices might be abused by educational organisations. Unsolicited or overly frequent instant or SMS messaging, GPS tracking, and content/communications monitoring are among the other issues that may need to be addressed in the wake of this incident.
The internal enforcement of policy would be another issue to address. The school being sued for this particular incident has claimed that these laptop webcams were only used to try to retrieve lost or stolen laptops, could only be accessed by two personnel, and they were activated exactly 42 times, ever. But none of that explains how someone else gained access to a laptop that was NOT stolen or lost, used said device to watch a student’s activities, and ultimately decided to take photos of those activities to confront the student.
I’m concerned that unless public mobile technology policies are put into place and enforced, this incident will have a chilling effect on the growth of mobile learning. Students and families will be suspicious of institution-issued or -accessed devices, and educational institutions will be afraid of issuing such devices for fear of resistance and/or accusations of inappropriate use. In the words of this article on Ars Technica:
“School-issued laptops are becoming more and more common these days, but thanks to the action of one high school, students and parents might have second thoughts about bringing them home.”
That would be a terrible shame. This school may have a lot to answer for, given the damage they’ve done to the reputation and advancement of mobile learning.
The FLiP, made by VTech, is an e-book reader targeted at children aged 3-7. It features a 4.3-inch colour touchscreen, QWERTY keyboard, rugged design and over 100 downloadable titles.
A programmable, touchscreen mobile learning device with a QWERTY keyboard for $60!? Things are getting very exciting for mobile learning. 🙂
The Mirus Schoolbook Convertible looks like it’s a decent step forward in the design of a low-cost mobile computing device for education.
It’s a step up from existing netbooks and even the OLPC because it features a convertible design – the screen can be swivelled and locked flat so that it turns into a “tablet” computer, which responds to both the built-in stylus and to finger touch (like an iPhone).
The finger-responsive touch screen is particularly useful in an educational tablet computer because of the device’s utility as an ebook reader. Nobody wants to be holding a stylus to turn pages while reading an ebook, and this innovation allows stylus-free ebook reading. However, a fully finger-based design (like an iPad or iPhone) wouldn’t be optimal: the stylus is much better than a finger for more precise tasks such as drawing accurate diagrams or writing handwritten notes. And while I’m listing features that make this device better than Apple’s upcoming iPad for education, I should also mention that this device has a built-in webcam (the iPad will not). 🙂
The device is also liquid-resistant, just in case there’s an occasional spill or a run through the rain – scenarios that are possible (or probable!) in a classroom, school yard, or school bag.
Like most netbooks, however, this device doesn’t have a massive amount of processing power or hard drive space; but it doesn’t really need them for the tasks it would most commonly be used for: accessing web-based activities and resources, working on homework or assignments, reading ebooks, and basic communications and connectivity. Indeed, the reduced processing power means it’s far less likely to be used for playing the latest computer games instead of as a learning tool.
After analysing the product features and reviews of it that are sprinkled across the web, I suspect that this device could be better in a classroom than any edu-netbook I’ve previously seen. The one specification that could probably do with improvement is the claimed 5.5-hour battery life. Some netbooks now achieve usable durations of 8-10 hours, which would allow the device to be used for a whole day without requiring a charge.
All in all though, this is a most capably specified device – so much so that I’m considering buying one myself to try out more rigorously.
There has been considerable activity at the University of Canberra with the implementation of Apple-based systems for supporting teaching and learning. With the University installing a new lecture recording system, staff here in the Teaching & Learning Centre have been focused on ways to optimise the capture, editing, and delivery of videos from all sources (including learner-created, teacher-created, and lecture-recorded).
Amongst the many ideas for content delivery we have been investigating iTunesU and the use of iPod Touch and iPhone devices for accessing content on-campus (or at home) for later review and reflection. With that in mind, I applied for one of the Apple University Consortium (AUC) scholarships to attend last week’s iPhone Software Developer’s Kit (SDK) Workshops in Sydney, and was delighted to be accepted.
The three-day event was hosted at Cliftons Training on George Street, and the facilities were excellent. There simply wasn’t a technical glitch the whole time we were there, which meant we could focus on learning instead of troubleshooting. The facilities were adequately spacious, well-lit, quiet, clean and modern. A shiny new Apple MacBook was provided to each participant from the AUC’s own “Classroom(s) in a Box” – a simple and flawless way of ensuring all participants were up and running in mere minutes.
The main trainer was Nicholas Circosta, a 21-year-old Honours student from Murdoch University and a founding partner in start-up software development company Codelity. Nick’s interest in all things Apple has naturally led him to apply his studies in Software Engineering to developing all manner of cool, useful, and wacky iPhone apps. It was a privilege to have someone so knowledgeable and talented as our trainer, and he made learning iPhone development heaps of fun. I’m no Apple fanboy, but talking with Nick I couldn’t help but be somewhat infected with his enthusiasm for all things Apple! No surprise, then, that he’s been headhunted by Apple themselves and will shortly be heading over to begin working for them in Cupertino.
Nick was assisted by Louis Cremen, a mobile developer and teaching member at the University of Wollongong’s Faculty of Informatics. Louis provided excellent support during the “hands on” practical coding parts of the course, as well as great perspectives during teaching and discussion. When Nick goes off to Cupertino, Louis will be taking on the main teaching role for future iPhone SDK Workshops run by the AUC, and we were very lucky to have both experts supporting our class during this handover period of the course.
The course was divided into 10 modules of varying size and increasing technical complexity. The course content was designed to be approachable for those with little experience coding Apple applications in Objective-C, and was really ideal for the mixed experience levels in the class (which contained everything from post-doctoral researchers through to minimally-experienced developers!). The first day focused on fundamental concepts of iPhone development (I shall never forget the Model/View/Controller Song from last year’s WWDC), the language (Objective-C) and the development environment (Xcode + Interface Builder + iPhone Simulator).
We finished the day with a look at the basic structure of an app in development and the concept of “Views” created through both code and Interface Builder. On Day 2, we got into the guts of development and did plenty of coding based on Nick’s examples, achieving things like storing data between sessions, enabling multitouch, and having a look at the various ways to implement 2D, 2.5D, and 3D graphics. By the third day our brains were pretty much bursting… but we were pushed harder conceptually, exploring the Core Animation and Core Location frameworks. Nick allowed us some free programming time at the end of the session, even putting up a nice prize for the participant who could code the best app in the last 3 hours of the day. 🙂
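To give a flavour of the Day 2 material, here’s a minimal sketch of persisting a value between app sessions using NSUserDefaults – one of the standard iPhone SDK techniques for storing simple data. This is my own illustrative example (the key name “lastScore” and the function names are made up), not code from the workshop itself:

```objectivec
// Illustrative sketch: storing simple data between app sessions
// with NSUserDefaults. The @"lastScore" key is a made-up example.
#import <Foundation/Foundation.h>

void saveScore(NSInteger score) {
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    [defaults setInteger:score forKey:@"lastScore"];
    // Force an immediate write to disk (otherwise saving happens periodically).
    [defaults synchronize];
}

NSInteger restoreScore(void) {
    // Returns 0 if the key has never been set.
    return [[NSUserDefaults standardUserDefaults] integerForKey:@"lastScore"];
}
```

In a real app you’d typically save state like this when the app is about to terminate (e.g. in the app delegate’s applicationWillTerminate: method) and restore it on the next launch.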
This was only my second ever AUC event (the first being CreateWorld09), but if this is an indication of the quality of AUC events I will definitely be hoping to attend more in future. First class training begins with first class trainers, and Nick’s ascendancy into the realms of Apple itself provides some indication of his energy, enthusiasm and talent in iPhone development.
This iPhone SDK workshop is being held again several times this year – in Melbourne, Brisbane, and Perth. While I don’t believe it’s possible to get into the Melbourne workshop any more, if you are able to attend the Brisbane or Perth workshops I would highly recommend them. See the AUC website for more details.
Gear Diary has informed me of some good news, just in time for Christmas. Historically, Apple have maintained strict control of the capabilities of the iPhone by restricting the use of certain functions and preventing developers from using them in “approved” apps. This is the reason that older model iPhones (the original iPhone and the previous model, the 3G) could not install software to record or stream video, despite having a built-in camera that was quite capable of the task.
It seems that Apple have recently relaxed their control of some private APIs, and this means that developers have been able to create approved apps that can be installed even on older iPhones to allow them to record and even stream video.
Hopefully, this signifies a change of heart at Apple that will allow developers to more fully embrace and exploit the full power of iPhones past and present!
(via Gear Diary)
Back in 2006, I made some predictions about where mobile learning might be heading, including the use of augmented reality or “heads up” data displays to provide information on a learner’s environment and allow learning “in situ”. Augmented reality really took off during 2009, with a number of apps on various GPS-enabled mobile phones (notably the iPhone) providing information layered over a camera view of the world; one example of this is the Layar application.
I also predicted the use of image recognition that would effectively enable “visual searches” of objects and images in the real world (and indeed, I reiterated this belief in a comment just yesterday on Stephen Downes’ blog). Want more information on that bridge over there? No worries! Just point your camera at it, and image recognition will provide some suggestions on appropriate websites to look at.
When I blogged that idea, however, I’m not sure I expected this technology to actually become available quite so fast. Today, Google announced a new beta application they’ve coined “Google Goggles”. And guess what? Their concept illustrations even feature a bridge as the subject of the example – even if it is an American one rather than an Australian one. 🙂
The official Google site for the project (which is still in development) lists a number of things Goggles can “visually search”, including landmarks, books, contact information, artwork, places, logos, and even wine labels (which I anticipate could go much further, to cover product packaging more broadly).
So why is this a significant development for m-learning? Because this innovation will enable learners to “explore” the physical world without assuming any prior knowledge. If you know absolutely nothing about an object, Goggles will provide you with a start. Here’s an example: you’re studying industrial design, and you happen to spot a rather nicely-designed chair. However, there’s no information on the chair about who designed it. How do you find out some information about the chair, which you’d like to note as an influence in your own designs? A textual search is useless, but a visual search would allow you to take a photo of the chair and let Google’s servers offer some suggestions about who might have manufactured, designed, or sold it. Ditto unusual insects, species of tree, graphic designs, sculptures, or whatever you might happen to be interested in learning.
Just watch this space. I think Google Goggles is going to rock m-learning…
(via Mobility Site)