I want to find the total time spent by a user in a course. Kindly help me with an SQL query.
There is a report for this, but it is inaccurate. We submitted a Behind the Blackboard ticket and were told that content which opens in a new window cannot be tracked. I'm very curious whether a query can track that, because if it can, that means the data is available in the DB and the report needs fixing.
Here you go! It can be modified to show more than the last 30 days in a particular course. I'm interested to see someone run with it and total up the days/hours/minutes automatically.
SELECT min(timestamp), max(timestamp), user_pk1, course_pk1, session_id
FROM bblearn.activity_accumulator
WHERE timestamp > sysdate-30
AND user_pk1=(select pk1 FROM bblearn.users WHERE user_id='MY_USER_ID')
AND course_pk1=(select pk1 FROM bblearn.course_main WHERE course_id='MY_COURSE_ID')
GROUP BY user_pk1, course_pk1, session_id
ORDER BY min(timestamp);
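To "total up the days/hours/minutes automatically" as suggested above, one way (a sketch, assuming Oracle, where subtracting two DATE values yields a difference in days) is to wrap the per-session query in an outer SUM:

```sql
-- Hypothetical extension of the query above: sum per-session durations.
-- Assumes Oracle and the bblearn.activity_accumulator table.
SELECT user_pk1,
       course_pk1,
       SUM(max_ts - min_ts)                    AS total_days,
       ROUND(SUM(max_ts - min_ts) * 24 * 60)   AS total_minutes
FROM (
  SELECT MIN(timestamp) AS min_ts,
         MAX(timestamp) AS max_ts,
         user_pk1, course_pk1, session_id
  FROM bblearn.activity_accumulator
  WHERE timestamp > sysdate - 30
    AND user_pk1 = (SELECT pk1 FROM bblearn.users WHERE user_id = 'MY_USER_ID')
    AND course_pk1 = (SELECT pk1 FROM bblearn.course_main WHERE course_id = 'MY_COURSE_ID')
  GROUP BY user_pk1, course_pk1, session_id
)
GROUP BY user_pk1, course_pk1;
```

Note this inherits all the caveats discussed below: it measures the span between the first and last click of each session, not actual reading time.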
Mark, if the student launched a SCORM package which opens in a new window, is that time captured by bblearn.activity_accumulator? I was told it was not.
Unsure. This is something I use for an overall picture, but I'm sure there are things that cannot be tracked, or tracked easily.
I think tracking the overall time a user spent in a course is always going to be tricky, as Blackboard (and any other online system) can only track clicks, not what happens between them. A student could open some content from Blackboard, read it for two hours, and then just close everything; Blackboard would only know about the initial click to open the resource. Equally, the user may open that same resource, decide they don't want to read it, and close it straight away. How do you measure the different time spent in these two situations?
Gail Watson - I thought of the report you mentioned too - it's interesting that Bb Support say it's not accurate. Other than not being able to track resources opened in a new window, did they say any more about the reason for the inaccuracy?
Chris, my ticket was on the student overview for single course. Here is exactly what Blackboard said.
"Thank you for taking my call in regards to this case. Per our conversation, we reviewed the information provided by the Subject Matter Expert. The issue with the current discrepancy in the report is that Blackboard can only record the activity for the Scorm untill it the user access it and is then sent outside of Blackboard. Once outside of Blackboard we would not be able to track user activity for the Scorm. Per our conversation; I would encourage you to please place an enhancement request to see if Blackboard may consider a way to track the user activity in the Scorm in full."
Thanks for this Gail Watson, that's really helpful - that makes sense that the report doesn't include time spent in a SCORM package, although it would be nice if the report could include this. Thanks for sharing.
An organisation in the US created a large B2 to estimate the time spent based on clicks in the activity accumulator table. Blackboard records multiple entries per UI request, which had to be filtered out, and the different types of requests required different weightings; even so, it's not technically possible to record engagement reliably. A guide at best.
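The click-based estimation idea can be sketched in SQL. This is a hypothetical illustration, not the B2's actual logic: it treats the gap between consecutive clicks as "active" time only up to a cutoff (here 30 minutes), and omits the per-request-type weighting and de-duplication mentioned above. It assumes Oracle (for the LAG analytic function) and the bblearn.activity_accumulator table from earlier in the thread.

```sql
-- Estimate active minutes per user per course by capping inter-click gaps.
-- Gaps longer than 30 minutes are counted as 30 minutes ("walked away").
SELECT user_pk1, course_pk1,
       ROUND(SUM(LEAST(gap_days, 30 / (24 * 60))) * 24 * 60) AS est_active_minutes
FROM (
  SELECT user_pk1, course_pk1,
         -- days elapsed since the previous click in the same session
         timestamp - LAG(timestamp) OVER (
           PARTITION BY user_pk1, course_pk1, session_id
           ORDER BY timestamp
         ) AS gap_days
  FROM bblearn.activity_accumulator
  WHERE timestamp > sysdate - 30
)
WHERE gap_days IS NOT NULL
GROUP BY user_pk1, course_pk1;
```

The 30-minute cutoff is arbitrary; the choice of cap is exactly the kind of weighting judgment that makes these numbers a guide rather than a measurement.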
It is often unclear how much time a student spent reading content after the last click was made. This gets even more difficult if the student went to VirtualThread or Panopto or another vendor site, where they continue course activities.
It is relatively easy to tell how many days a week the user logged into Blackboard, but how much time they spent on course materials is an approximation.
Because of all the "fun" of trying to "know" what the user was doing, the fact that outside activities aren't tracked, and the fact that a user could click on one thing, wander away for most of their session, then click on something else and have that whole interval counted as 'active', time-in-course and/or time-on-activity is very hard to calculate. Just like everyone else mentioned.

If you're trying to get a metric that correlates to success, we've found two things: a) the simple number of clicks in a course correlates pretty closely to engagement in the course, which ties directly to success in the course, and b) forum posts and replies in a course correlate to success in the course (more posts/replies == more engagement == higher rate of success). We've boiled that down to an "engagement" index, which is ultimately just a count of clicks in a course based on the AA table. This has been very helpful for us! YMMV
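A minimal version of that click-count "engagement index" could look like the following. This is a sketch assuming the same bblearn.activity_accumulator (AA) table and Oracle syntax used earlier in the thread, not the poster's actual implementation:

```sql
-- Raw click counts per user per course over the last 30 days,
-- as a simple proxy for engagement.
SELECT user_pk1, course_pk1, COUNT(*) AS click_count
FROM bblearn.activity_accumulator
WHERE timestamp > sysdate - 30
GROUP BY user_pk1, course_pk1
ORDER BY click_count DESC;
```

In practice you would likely normalise this per course (e.g. rank within a course) so the index is comparable across courses with different amounts of content.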