More and more people want to be able to collect and analyse assessment submissions based on a form. While the test tool can be used for some of the simpler submissions, it's not really designed for analysing form-based submissions. Perhaps this could be solved by expanding the capabilities of the test tool, but either way, here are some examples:
1. In-class worksheets - students must complete a worksheet while participating in an activity, such as a laboratory or other observational activity. The worksheet might contain standard sections that could be handled with a test, but it may also contain other parts, like specifying which activity is being completed (as this is often a student choice), which might adjust the questions being asked. It may also include additional questions that are not part of the activity itself but are used for analysis, such as demographics. In addition, questions are often linked to each other, e.g. choose something (maybe a radio list), then explain/expand on your choice (free-text field). Finally, a worksheet will often have sections, which tests don't cater for at the moment: a test can only display questions 'All at once' or 'One at a time', but there are many cases where you need groups of questions displayed together under an overarching instructions section. When marking a worksheet like this, responses are often grouped by a particular answer (e.g. all worksheets where 'Activity type' was selected as 'b. photosynthesis'), and on occasion a particular delegated grader will mark these.
2. Reflections - these are similar to the above, but instead of being in class, the students may be out on prac or placement. The instructors have specific questions to ask them, and the students have to submit answers weekly. Often the students will have supervisors who are also required to provide similar feedback on the student's learning, which needs to be linked to the student's attempt.
Students would be required to complete the same form numerous times. At the moment it's difficult to track when students completed these, so you can't easily determine what was missed. You could use the journal tool, but there is no way to prompt for a specific response, and it's difficult to extract the submissions and track their completion because students often post supplementary messages that shouldn't be included in the assessment. On top of that, there is no way to categorise the posts if students have to specify what they have been doing. You could create multiple tests, but that makes it difficult to collate a single student's submissions. At the moment only one attempt of something can be used as the mark, whereas there are many cases where all attempts must be completed to receive a final mark; managing attempts is difficult right now. Also, because these are completed without supervision, the instructors often ask for originality checking. Since students would be completing the form, the check should cover only the actual responses and not the templated questions (which is an issue with document submissions based on a template). Finally, the assessment may require a time restriction between attempts, or a specific period in which to complete each attempt.
While both of these could be completed by filling in a document and submitting it weekly to an assignment, that unfortunately doesn't make it easy to retrieve the data for analysis, and it doesn't solve the problem of calculating a mark based on the total of multiple attempts.
Finally, when extracting these submissions from the system (e.g. downloading), the export needs to include not only the submission itself but also its metadata, such as the date and time it was submitted and any other relevant submission data.
As I said, this might be possible by expanding existing tools, but for us it is being requested more and more often.
Product Version (if applicable): 0