eLearning analytics - are your courses effective?

We're looking for one or two companies who are interested in finding out whether:

  • your Storyline courses are effective
  • your students are having success or difficulty with particular concepts/topics
  • embedded survey question results indicate corporate culture OR security/compliance issues
  • and much more

We're finishing up a data capture and evaluation technology that works seamlessly with Storyline, analyzing student data and reporting the results in a series of web-based dashboards.

Storyline integration is simple and straightforward, and we'll help you all along the way.

Once the course is published, course and student data is transparently captured and ready for you to review.

I've included some report screen grabs below. Please feel free to respond here and/or contact me directly ( peters@quizzicle.com )

[Screenshots: Dashboard Report, Student Course Report, Student Surveys Report, Clustering Report]

10 Replies
Matthew Bibby

That's interesting Peter, I look forward to seeing this tool develop.

What tech are you using to pull this off? Am I correct in assuming that this runs independently of an LMS, yet is pulling some data from the LMS if one is used? Does this work with both SCORM content and xAPI? 

I'm also interested in hearing about the calculations behind these metrics - for example, how are you measuring attention, reading rate, motivation and experience? To me, understanding how these are calculated is important, because if data doesn't accurately represent the reality of the situation, then it's just noise.

I'm happy to test and provide some feedback, if you'd like, but if you are looking for companies to work with, then I'm not a good fit.

peter sorenson

SCORM content certainly - or NO LMS at all.

xAPI requires the developer to identify what needs to be captured and to write the statements that record each activity. We have only a few set requirements for the developer to include in their course - then we let the course and each student's interactions determine the data to be recorded. In some ways, hand-writing xAPI statements is like the days when developers had to wrap text in tags like <b> and <i> just to tell the content how to display.
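For readers who haven't hand-written xAPI, here is roughly what "writing the statements" looks like - a minimal statement built by hand (the helper function and identifiers are illustrative, not from the product):

```javascript
// Hypothetical helper - builds a minimal xAPI statement by hand,
// the kind of authoring work the engine is meant to avoid.
function buildStatement(actorEmail, verbId, verbName, activityId) {
  return {
    actor: { objectType: "Agent", mbox: "mailto:" + actorEmail },
    verb: { id: verbId, display: { "en-US": verbName } },
    object: { objectType: "Activity", id: activityId },
    timestamp: new Date().toISOString()
  };
}

const stmt = buildStatement(
  "learner@example.com",
  "http://adlnet.gov/expapi/verbs/answered",
  "answered",
  "http://example.com/course/quiz-1/question-3"
);
```

The developer has to decide, up front, which verbs and activities matter - which is exactly the tagging burden Peter is comparing to `<b>` and `<i>`.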

Most gauges at the top of the reports (except for reading rate) result from more complex analysis of student interactions. They are "indicators" of categorized behaviors - nothing is absolute - though the analysis gets better and better as the engine understands the intricacies of each student.

Each report provides a certain level of "drill-down" - tests, for example, allow drill-down into individual questions. We want developers to begin to think of training as a "conversation" they can have with their students: the students help the developers evaluate the efficacy of their training modules, and the developers help the students by personalizing the training experience - letting technology do what it does best.

As an example, the "Cluster Report":

1. Daily Launches identifies the number of courses launched each day as clustered circles - bigger circles = more launches.

2. Daily Scores indicates whether there is a correlation between the day of the week on which a course is launched and the students' scores.

3. Daily Experiences indicates the number of unique course launches by day of the week, to help identify patterns that might point to network stress.

4. Browser Usage displays the browser usage per course - helpful in determining whether people are adhering to browser standards or whether browser usage is unmanaged.

5. Sessions and Seat Time provides insight into the number of sessions required to complete a course, identifying outliers by session count and time.
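The Daily Launches view could be sketched as a simple day-bucketing pass - my own illustration of the idea, not the actual reporting engine:

```javascript
// Illustrative sketch (my guess at the approach, not the vendor's
// algorithm): bucket launches by calendar day and size each circle
// by its launch count.
function dailyLaunchClusters(launchTimestamps) {
  const counts = {};
  for (const ts of launchTimestamps) {
    const day = ts.slice(0, 10);              // "YYYY-MM-DD"
    counts[day] = (counts[day] || 0) + 1;
  }
  // Area-proportional sizing: bigger circles = more launches.
  return Object.entries(counts).map(([day, launches]) => ({
    day,
    launches,
    radius: Math.sqrt(launches)
  }));
}

const clusters = dailyLaunchClusters([
  "2024-03-04T09:00:00Z",
  "2024-03-04T10:15:00Z",
  "2024-03-05T08:30:00Z"
]);
// clusters → two circles, with the 2024-03-04 circle the larger one
```

Using the square root of the count keeps the circle *area* (rather than diameter) proportional to launches, which is the usual convention for bubble charts.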

I'll post a link to a test course soon so people can test it on their own - then I'll provide login credentials so they can review their results as compared with all other testers.

peter sorenson

Another issue to consider: we can "classify" the difficulty of a question based on how the student approaches answering it.

  • How long did it take for the student to make a decision?
  • Did he consider other answers before he selected and submitted?
  • How long did he consider each of the other answers?
  • Was his consideration of each answer good (based on its potential for being correct) or poor?
  • Was his execution good - he considered an answer that was part of the correct answer (set) and selected it - or poor?

It's more about understanding the potential consequences of actions as well as understanding the actual consequences.
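A rough illustration of how timings like these could be derived from an interaction log - the event format and function here are my own invention, not the product's:

```javascript
// Hypothetical event-log format: derive how long the student spent
// considering each answer choice before submitting.
function considerationTimes(events) {
  // events: [{ t: ms, type: "focus" | "submit", choice?: string }],
  // ordered by time
  const dwell = {};
  let current = null;
  let since = 0;
  for (const e of events) {
    if (current !== null) {
      dwell[current] = (dwell[current] || 0) + (e.t - since);
    }
    if (e.type === "focus") { current = e.choice; since = e.t; }
    else { current = null; }  // "submit" ends consideration
  }
  return dwell;  // milliseconds spent on each choice
}

const dwell = considerationTimes([
  { t: 0,    type: "focus",  choice: "A" },
  { t: 2000, type: "focus",  choice: "B" },
  { t: 5000, type: "submit" }
]);
// dwell → { A: 2000, B: 3000 }
```

Judging whether that consideration was "good" would then require joining these dwell times against the answer key - which choices were part of the correct set.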

I live for this stuff....

Brian Allen
peter sorenson

SCORM content certainly - or NO LMS at all.

xAPI requires the developer to identify what needs to be captured and to write the statements that record each activity. We have only a few set requirements for the developer to include in their course - then we let the course and each student's interactions determine the data to be recorded. In some ways, hand-writing xAPI statements is like the days when developers had to wrap text in tags like <b> and <i> just to tell the content how to display.

So, your system doesn't rely on SCORM or xAPI to capture and record interaction data? It can do the same thing for a Storyline course not published for LMS?

Does this system only work with Storyline content or can a company use/track any kind of content?

peter sorenson

It works with Lectora as well, but could be used in any web environment. We even track and capture the mouse x,y position twice per second to plot the user's on-screen mouse path - so it could even potentially be used to evaluate how users engage with web-based applications.
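The twice-per-second capture could be as simple as a throttled recorder like this (my sketch, assuming timestamps in milliseconds; in a real course it would be wired to a `mousemove` listener):

```javascript
// Sketch of 2-samples-per-second capture: record a point only when at
// least 500 ms have passed since the last recorded one.
function makeSampler(intervalMs = 500) {
  let last = -Infinity;
  const path = [];
  return {
    record(t, x, y) {          // call from a mousemove handler
      if (t - last >= intervalMs) {
        path.push({ t, x, y });
        last = t;
      }
    },
    path
  };
}

const sampler = makeSampler();
[0, 100, 499, 500, 1001].forEach((t) => sampler.record(t, t, t));
// sampler.path keeps only the points at t = 0, 500, 1001
```

Throttling at the capture point keeps the payload small: 2 Hz is enough to reconstruct a path without flooding the server with every pixel of movement.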

We've initially focused on web based training as it's the one area where ROI has been determined "theoretically" rather than practically - by analyzing data.


peter sorenson

That's correct. The technology does not require either to capture data.

For courses we host on our own LMS - we haven't used SCORM for years to capture and record "SCORM" style data ( completion, etc ). Although our technology can replace an LMS ( which is how we use our tech ) we recommend that clients use the traditional LMS recording and reports for compliance needs ( audits ) and our reports for competency evaluation.

We're actually looking to incorporate a new connection method to our database that creates a "secure handshake" between the client and the server. Requests no longer require opening and closing a connection each time; the connection stays open for the entire session, so client requests complete immediately instead of being stacked behind other requests. It speeds up response by roughly 10x ( beneficial even when you're talking milliseconds ).
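This sounds like a persistent (WebSocket-style) connection. As a toy illustration of why removing the per-request setup helps - with made-up numbers, not Peter's measurements:

```javascript
// Back-of-envelope model: per-request connection setup vs. one
// persistent connection held open for the whole session.
function totalLatencyMs(requests, handshakeMs, requestMs, persistent) {
  const setup = persistent ? handshakeMs : handshakeMs * requests;
  return setup + requestMs * requests;
}

const perRequest = totalLatencyMs(50, 40, 5, false); // 50 * (40 + 5) = 2250 ms
const keptOpen   = totalLatencyMs(50, 40, 5, true);  // 40 + 50 * 5  = 290 ms
// ~7.8x faster in this toy model; the exact multiple depends entirely
// on the handshake-to-request time ratio.
```

The bigger the handshake cost relative to the request itself, the closer the speedup gets to the 10x figure quoted above.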

peter sorenson

Thanks for the opportunity Leslie - we want to help Storyline Users get even more out of their courses - and get to know more about their students in the process.

We find Storyline much more user friendly than your "competitors" - even though we work with those tools as well - and its variable-based approach provides an easy way to communicate between Storyline and other technologies.
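For anyone curious about the variable-based approach: Storyline exposes course variables to JavaScript through `GetPlayer()`, with `GetVar`/`SetVar` to read and write them. The stub below stands in for Storyline's runtime so the snippet is self-contained; in a published course, `GetPlayer()` is provided by Storyline itself.

```javascript
// Stub standing in for Storyline's runtime player object - purely for
// illustration outside a published course.
function GetPlayer() {
  const vars = { quizScore: 80 };
  return {
    GetVar: (name) => vars[name],
    SetVar: (name, value) => { vars[name] = value; }
  };
}

// Inside an "Execute JavaScript" trigger, external code can read and
// write course variables by name:
const player = GetPlayer();
const score = player.GetVar("quizScore");                    // read
player.SetVar("feedback", score >= 70 ? "Pass" : "Review");  // write
```

Because any course variable can cross this boundary, external analytics code can observe course state without touching SCORM or xAPI at all - which is presumably how the tool hooks in.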

We've even been able to include a method for students to transparently submit a question about course content to a monitored email address should something not be clear - just like raising your hand in a classroom.