Quiz attempts, Variables and Success (according to the LMS)

I need the expertise of this group to solve an issue my colleagues and I have been wrestling with. Here's the scenario:

  • All assessment questions are at the end of this 2-hour course.
  • I give the learner 3 tries to get a passing score and use a variable within Storyline to count the attempts. (In SABA, our LMS, they have unlimited attempts.) Here's a sample of the variable on the Results slide.
  • Each time they click “Retry Quiz” it adds one to the number of attempts. When the number is greater than 2, I change the state of the Retry Quiz button to read “Finished”; when it's clicked this time, it directs them to a scene that tells them what to do next.
  • When publishing the course for the LMS, I use that one and only Results slide to pass the score, and I have the LMS Reporting status set to “Completed/Incomplete” (our typical setting for courses in SABA).
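For clarity, the retry-counting logic above can be sketched in plain JavaScript. This is illustrative only (names like onRetryClicked and MAX_RETRIES are mine, not Storyline's); in the actual course it's a number variable plus "adjust variable" and "change state" triggers.

```javascript
// Illustrative sketch of the attempt-counting logic described above.
// In Storyline this is a number variable (CountQuizRetries) plus
// triggers, not JavaScript; the names here are hypothetical.
const MAX_RETRIES = 2; // a count greater than this flips the button to "Finished"

function onRetryClicked(state) {
  state.countQuizRetries += 1;
  if (state.countQuizRetries > MAX_RETRIES) {
    // Mirrors the trigger: when the count is greater than 2, the
    // Retry Quiz button's state changes to "Finished", and the next
    // click routes the learner to the wrap-up scene instead.
    state.retryButtonLabel = "Finished";
  }
  return state;
}

let state = { countQuizRetries: 0, retryButtonLabel: "Retry Quiz" };
state = onRetryClicked(state); // count = 1
state = onRetryClicked(state); // count = 2
state = onRetryClicked(state); // count = 3, label flips to "Finished"
```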

That works like a charm, BUT… and here's the part that has us baffled: if the student starts and stops the course more than twice, they get marked as “unsuccessful” with a score of 0, even if they have never reached the Results slide. This removes the course from their In-Progress Learning Activities (causing some panic and a deluge of email). It appears that, somehow, the CountQuizRetries variable is being used, even though the Retry Quiz button is never clicked.

As a workaround, I've changed the number of quiz retries allowed to 5 (hoping that learners won't start and stop the course more than 5 times), but any advice on how to truly correct the issue within the content would be greatly appreciated.


6 Replies
Ashley Terwilliger-Pollard

Hi Michael,

Thanks for the detailed description of what's happening. Were you also able to test this outside of Saba at a site such as SCORM Cloud, which is a free, industry-standard testing environment for SCORM content? If the behavior persists there, we'll want to take a closer look. It may also help in both situations if you can pull the debug logs so we can see what information is being sent and when.

Michael Palko

Thanks for the quick reply, Ashley. I wasn't aware of SCORM Cloud, so that's a bonus answer for me.

I uploaded my content there and didn't see the same issue. I was able to start and stop the course more than 3 times and wasn't marked as unsuccessful. That's what I would expect.

I've attached the debug document for you to help interpret.

If I publish the course to our internal LMS and leave the debug enabled, will the learner see that window? I'd like to compare them, but obviously don't want to impact the student experience.

Thanks again!

Ashley Terwilliger-Pollard

Hi Michael,

If you didn't see the same behavior at SCORM Cloud, then it sounds like something you'll want to raise with your LMS team, since the debug log shows the content reporting correctly. And yes, if you leave debug enabled and someone runs the course, they'll see that window as well. Could you publish a version of the course that isn't accessible to your users, run it with the debug log on, and share the output with your LMS team to help pinpoint where the “unsuccessful” status is coming from? I'd share the behavior you saw at SCORM Cloud with them as well.
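For anyone scanning that debug log, the SCORM 1.2 traffic behind an “unsuccessful, score 0” record would look roughly like the calls below. This is a sketch with a mock API object (a published course instead locates the API object the LMS exposes), and the exact status string depends on the Completed/Incomplete reporting setting and how Saba translates it.

```javascript
// Sketch only: a stand-in for the LMS-provided SCORM 1.2 API object.
// A published course walks up the window hierarchy to find the real
// one; this mock just records what gets sent.
const API = {
  data: {},
  LMSSetValue(element, value) {
    this.data[element] = value;
    return "true";
  },
  LMSCommit(param) {
    return "true";
  }
};

// Calls like these appearing in the log BEFORE the learner has ever
// reached the Results slide would explain the premature record
// (the "failed" status string here is illustrative).
API.LMSSetValue("cmi.core.lesson_status", "failed");
API.LMSSetValue("cmi.core.score.raw", "0");
API.LMSCommit("");
```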

Michael Palko

That's the conclusion I came to as well, Ashley: our LMS is treating the content a little differently, so the issue likely isn't in the content but in the LMS. I'm going to publish the course in our TEST environment with debug on and take a closer look.

Thanks as always for the quick reply.