Lak11 Week 1: Introduction to Learning and Knowledge Analytics

Every week I will try to write down some reflections on the Open Online Course: Learning and Knowledge Analytics. These will be written for myself as much as for anybody else, so I have to apologise in advance: there will be hardly any narrative, just a mix of thoughts on the contents of the course and on the process of the course.

So what do I have to write about this week?

My tooling for the course

There is a lot of stuff happening in these distributed courses and keeping up with the course requires some setup and preparation on my side (I like to call that my “tooling”). So what tools do I use?

A lot of new material to read is created every day: tweets with the #lak11 hashtag, posts in all the different Moodle forums, Google Groups and Learninganalytics.net messages from George Siemens, and Diigo/Delicious bookmarks. Thankfully all of these information resources provide RSS feeds and I have been able to add them all to a specially made Lak11 folder in my Google Reader (RSS feed). That folder sorts its messages based on time (oldest first), giving me some understanding of the temporal aspects of the course and making sure I read a reply after the original message. A couple of times a day I use the excellent MobileRSS reader on my iPad to read through all the messages.
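The oldest-first merge is simple enough to sketch in code. Here is a minimal, hypothetical example (the feed names and entries are made up; a real version would fetch the actual feeds with a library like feedparser first):

```python
from datetime import datetime

# Hypothetical entries from the different Lak11 feeds: (published, source, title)
feeds = {
    "#lak11 tweets": [(datetime(2011, 1, 12, 9, 30), "Re: week 1 readings")],
    "Moodle forums": [
        (datetime(2011, 1, 11, 14, 0), "Introductions thread"),
        (datetime(2011, 1, 12, 10, 15), "Reply: Hunch activity"),
    ],
    "Diigo bookmarks": [(datetime(2011, 1, 12, 8, 5), "Educational data mining survey")],
}

def merge_oldest_first(feeds):
    """Merge all feed entries into one list, oldest first,
    so a reply is always read after the original message."""
    merged = [
        (published, source, title)
        for source, entries in feeds.items()
        for published, title in entries
    ]
    return sorted(merged)  # tuples sort by their first element: the timestamp

for published, source, title in merge_oldest_first(feeds):
    print(published.isoformat(), source, title, sep=" | ")
```

This is essentially what the Google Reader folder does for me automatically.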

There is quite a lot of reading to do. At the beginning of the week I read through the syllabus and make sure that I download all the PDF files to GoodReader on the iPad. All web articles are stored for later reading using the Instapaper service. I have given both GoodReader and Instapaper Lak11 folders. I do most of the reading of these articles on the train. GoodReader allows me to highlight passages and store bookmarks in the PDF file itself. With Instapaper this is a bit more difficult: when I read a very interesting paragraph I have to highlight it and email it to myself for later processing.

Each and every resource that I touch for the course gets its own bookmark on Diigo. In addition to the relevant tags for each resource, I also tag them with lak11 and weekx (where x is the number of the week) and share them to the Learning Analytics group on Diigo. These bookmarks will provide me with a history of the interesting things I have seen during the course and should help me in writing a weekly reflective post.

So far the “consumer” side of things. As a “producer” I participate in the Moodle forums. I can easily find all my own posts through my Moodle profile and I hope to use some form of screen-scraper at the end of the course to pull a copy of everything that I have written. I use this WordPress.com hosted blog to write and reflect on the course materials and tag my course-related posts with “lak11” so that they show up on their own page (and have their own feed in case you are interested). On Twitter I occasionally tweet with #lak11, mostly to refer to a Moodle or blog post that I have written or to try and ask the group a direct question.
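Such a scraper would not need to be complicated. A sketch of the extraction step, using only the standard library; note that the `forumpost` class name is a guess at Moodle’s markup, and the real profile URL and login handling would have to be worked out against the actual course site:

```python
from html.parser import HTMLParser

class ForumPostExtractor(HTMLParser):
    """Collect the text of every element carrying a given CSS class.
    The default class name 'forumpost' is an assumption about Moodle's
    markup; check the actual page source before relying on it."""

    def __init__(self, post_class="forumpost"):
        super().__init__()
        self.post_class = post_class
        self.depth = 0   # > 0 while we are inside a matching element
        self.posts = []

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        if self.depth:
            self.depth += 1          # nested tag inside a post
        elif self.post_class in classes:
            self.depth = 1           # entering a new post
            self.posts.append("")

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.posts[-1] += data

# In practice the HTML would come from the Moodle profile pages,
# e.g. urllib.request.urlopen(...) after authenticating.
sample = '<div class="forumpost"><p>My first reply</p></div><div class="other">skip</div>'
parser = ForumPostExtractor()
parser.feed(sample)
print(parser.posts)
```

Whether I actually get around to this at the end of the course remains to be seen.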

What is missing? The one thing that I don’t use yet is a mind mapping or concept mapping tool. The syllabus recommends VUE and CMAP, and one of the assignments each week is to keep updating a map for the course. These tools don’t seem to have an iPad equivalent. There are some good mind mapping tools for the iPad (my favourite is probably iThoughtsHD, watch this space for a mind mapping comparison of iPad apps), but I don’t seem to be able to fit one into my workflow for the course. Maybe I should just try a little harder.

My inability to “skim and dive”

This week I reconfirmed my inability to “skim and dive”. For these things I seem to be an all or nothing guy. There are magazines that I read completely from the first page to the last page (e.g. Wired). This course seems to be one of these things too. I read every single thing. It is a bit much currently, but I expect the volume of Moodle and Twitter messages to go down quite significantly as the course progresses. So if I can just about manage now, it should become relatively easy later on.

The readings of this week

There were quite a few academic papers in the readings of this week. Most of them provided an overview of educational data mining or academic/learning analytics. Many of the discussions in these papers seemed quite superficial to me. They are probably good references to keep and have a wealth of bibliographical material that I could look at at some point in the future. For now, they lacked any truly new insights for me and appeared to be pretty basic.

Live sessions

Unfortunately I wasn’t able to attend any of the Elluminate sessions and I haven’t listened to them yet either. I hope to catch up this week with the recordings and maybe even attend the guest speaker live tomorrow evening.

Marginalia

It has been a while since I last actively participated in a Moodle facilitated course. Moodle has again proven to be a very effective host for forum based discussions. One interesting Moodle add-on that I had not seen before is Marginalia, a way to annotate forum posts in Moodle itself; the annotations can be private or public. Watch the following video to see it in action.

[blip.tv ?posts_id=4054581&dest=-1]

I wonder if I will use it extensively in the next few weeks.

Hunch

One thing that we were asked to try out as an activity was Hunch. For me it was interesting to see all the different interpretations that people in the course had about how to pick up this task and what the question (What are the educational uses of a Hunch-like tool for learning?) actually meant. A distributed course like this creates a lot of redundancy in the answers. I also noted that people kept repeating a falsehood (that you need to use Twitter/Facebook to log in). My explanation of how Hunch could be used by the wary was not really picked up. It is good to be reminded at times that most people in the world do not share my perspective on computers and my literacy with the medium. Thinking otherwise is a hard-to-escape consequence of living in a techno-bubble with the other “digerati”.

I wrote the following on the topic (in the Moodle forum for week 1):

Indeed the complete US-centricness of the service was the first thing that I noticed. I believe it asked me at some point what continent I live on. How come it still asks me questions to which I would never have an answer? Are these questions crowdsourced too? Do we get them randomly or do we get certain questions based on our answers? It feels like the former to me.

The recommendations that it gave me seemed to be pretty random too: the occasional hit and then a lot of misses. I had the ambition to try out the top 5 music albums it would recommend me, but couldn’t bear the thought of listening to all that rock. This did sneak a little thought into my head: could it be that I am very special? Am I so eclectic that I can defeat all data mining efforts? Am I the Napoleon Dynamite of people? Of course I am not, but the question remains: does this work better for some people than for others?

One other thing I noticed was how the site seemed to use some of the tricks of an astrologer: who wouldn’t like “Insalata Caprese”? That seems like a safe recommendation to me.

In the learning domain I could see an application as an Electronic Performance Support System. It would know what I need in my work and could recommend the right website to order business cards (when it sees I go to a conference) or an interesting resource relating to the work that I am doing. Kind of like a new version of Clippy, but one that works.

BTW, in an earlier blog post I have written about how recommendation systems could turn us all into mussels (although I don’t really believe that).

Corporate represent!

Thanks to a very good intervention by George Siemens, the main facilitator of the course, we are now starting to have a good discussion about analytics in corporate settings here. The corporate world treats learning as a secondary process (very much a means to an end) and that creates a slightly different viewpoint. I assume the corporate people will form their own subgroup in some way in this course. Before the end of next week I will attempt to flesh out some more use cases following Bert De Coutere’s examples here.

Bersin/KnowledgeAdvisors Lunch and Learn

At the end of January I will be attending a free Bersin/KnowledgeAdvisors lunch and learn titled Innovation in Learning Measurement – High Impact Measurement Framework in London (one day before the Learning Technologies 2011 exhibit/conference). I would love to meet other Lak11 participants there. Will that happen?

My participation in numbers

Every week I will try to give a numerical update about my course participation. This week I bookmarked 33 items on Diigo, wrote 10 Lak11 related tweets, wrote 25 Moodle forum posts and 2 blog posts.

The Future of Moodle and How Not To Stop It (iMoot 2010)

Yesterday morning I got up at 6:30 to deliver a presentation at the very first virtual Moodlemoot: iMoot 2010. All in all it was a hugely enjoyable experience. I had people attending from, among other places, the United States, Ireland, Zambia, Australia and Japan.

The platform for delivery of the session was Elluminate, which worked flawlessly. I am still amazed at the fact that we now have easy access to the technology that makes a virtual conference with a worldwide audience possible.

My talk was titled “The Future of Moodle and How Not to Stop It”, an adaptation of the title of the book by Zittrain.

The Future of...

I first recapped the recent discussion about the death of the VLE:

[wpvideo htbfDWKU]

I showed how Moodle was conceived and developed when the web was less mature than it is now (the social web as we know it was basically non-existent) and how a teacher can create a learning experience for his or her students using nothing but loosely coupled free tools. Horses for courses.

I then looked at the two mental models that Moodle could adapt from Drupal:

  1. Drupal’s tagline is “Community Plumbing”. I believe Moodle’s could be “Learning Plumbing”.
  2. Drupal sees itself as a platform. This is exactly what Moodle should reinvent itself as.

In the final part of the presentation I looked at how the new Moodle 2.0 APIs (repository, portfolio, comments and web services) will be able to help make the shift towards a platform. I finished by asking people to imagine what an appstore for repository plugins and an appstore for learning activities would look like.

The slides are on Slideshare and embedded below (you can also download a 2MB PDF version). The session has been recorded. Once that recording comes online, I will update this post and try and share that here too.

[slideshare id=3065049&doc=100204imootthefutureofmoodle-100203180009-phpapp01]

The one difficult thing about a virtual conference, by the way, is communicating the dates and times. Timezones add a lot of complexity. iMoot, for example, provides users with a custom schedule for their timezone and replays each session twice after the live event. I am starting to believe in the Swatch Internet Time concept again. Wouldn’t a single metric .beat be great? See you @850!
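(For the record: a .beat is 1/1000 of a day measured in Biel Mean Time, which is UTC+1, so the conversion is just seconds-since-BMT-midnight divided by 86.4. A quick sketch:

```python
from datetime import datetime, timezone, timedelta

# Biel Mean Time, the Swatch reference meridian, is UTC+1 year-round
BMT = timezone(timedelta(hours=1))

def to_beats(moment):
    """Convert a timezone-aware datetime to Swatch Internet Time (.beats)."""
    local = moment.astimezone(BMT)
    seconds = local.hour * 3600 + local.minute * 60 + local.second
    return int(seconds / 86.4)  # 1 beat = 86.4 seconds, 1000 beats per day

# 19:24 UTC is 20:24 in Biel, i.e. @850
print(to_beats(datetime(2011, 1, 28, 19, 24, tzinfo=timezone.utc)))
```

No timezone tables, no daylight saving: the same @850 everywhere in the world.)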