The 2023-24 D3:QRS initiative has concluded. The following is a report on the program and its activities.
Program Overview
Funded by the Vice-Provost, Innovations in Undergraduate Education, the Data Driven Design: Quercus Record Store (D3:QRS) initiative ran from August 2023 through January 2024. Six instructors formed a peer network with ongoing support to deliver:
- Participation in three virtual workshop meetings to explore the use of a new dashboard in an applied course planning context.
- Provision of feedback to the dashboard development team to inform their work.
- Completion of an instructor survey/report on their experience using analytic data from the dashboard.
- Contribution to a showcase of insights and outcomes on the Open UToronto website.
D3:QRS Instructor Cohort
- Judi Laprade (Department of Surgery, Anatomy)
- Laura Dempster (Faculty of Dentistry)
- Jeannette Sanchez-Naranjo (Department of Spanish and Portuguese, FAS)
- Liyang Dai-Hattrick (Department of Materials Science and Engineering, FASE)
- Alex Rennet (Department of Mathematical and Computational Sciences, UTM)
- Dan Weaver (Department of Physical & Environmental Sciences, UTSC)
D3:QRS Facilitation Team
- Laurie Harrison (Director, Digital Learning Innovation, ITS)
- Will Heikoop (Coordinator, Digital Learning Innovation, ITS)
- Alan da Silveira Fleck (Data Analyst, Learning Analytics, CTSI)
- Cora McCloy (Faculty Liaison Coordinator, SoTL, CTSI)
D3:QRS Programming
The full D3:QRS instructor cohort met with the facilitation team three times between August 2023 and January 2024. These full-team workshops focused on program onboarding, familiarization with dashboard tools, development of inquiry questions, and reporting on dashboard exploration. Between meetings, instructors had continuous access to the Quercus Data Insights dashboard, which was updated daily, and to support from the facilitation team. The progression of the workshops is captured in the table below:
Table 1: Workshops delivered during the D3:QRS project.
| Workshop | Date | Objective |
| --- | --- | --- |
| Workshop 1 | August 31, 2023 | D3:QRS orientation and familiarization with dashboard tools. |
| Workshop 2 | October 19, 2023 | Development of inquiry questions and identification of supporting dashboard visualizations. |
| Workshop 3 | January 17, 2024 | Reporting on dashboard exploration and feedback to the development team. |
Project Impact and Inquiry Questions
This was the first cohort of instructors to use the learning analytics dashboard connected to Quercus Record Store (QRS) data to answer course-level inquiry questions. The dashboard was designed to monitor and analyze student activity on Quercus, allowing instructors to gain insights into their students’ aggregated activity with course elements. It is organized into four pages of visualizations, each displaying a different aspect of student activity:
- General Student Activity: Provides an overview of activity metrics (i.e., views, participation, or downloads) throughout the course’s term.
- First Access and Review of Course Elements: Provides insights into first and later access to one or more course elements.
- Views of Resources by Modules: Provides insights into student views of items inside each module.
- Summary: Provides an overview of key metrics from other dashboard pages in the form of easy-to-read cards.
Instructors had access to up to 20 months of data, allowing them to analyze the current offering of a course alongside a past iteration of the same course for comparative purposes.
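This report does not document the underlying QRS data model, so the following is only a rough, hypothetical sketch of the kind of aggregation behind the dashboard pages described above. All column names (student_id, course_offering, event_type, timestamp) and values are invented for illustration; Python with pandas is used here simply as convenient notation.

```python
# Illustrative sketch only: the actual QRS schema and dashboard queries are
# not described in this report. All columns and values below are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "student_id":      [101, 102, 101, 103, 102, 101],
    "course_offering": ["2023F", "2023F", "2023F", "2022F", "2022F", "2022F"],
    "event_type":      ["view", "download", "participation",
                        "view", "view", "download"],
    "timestamp": pd.to_datetime([
        "2023-09-05", "2023-09-06", "2023-09-12",
        "2022-09-06", "2022-09-13", "2022-09-13",
    ]),
})

# Aggregate activity counts per metric per week, the kind of term-long
# overview the "General Student Activity" page presents.
weekly = (
    events
    .assign(week=events["timestamp"].dt.to_period("W"))
    .groupby(["course_offering", "week", "event_type"])
    .size()
    .rename("count")
    .reset_index()
)

# Side-by-side totals for the current and past offerings: the kind of
# comparison the 20-month data window makes possible.
comparison = weekly.pivot_table(
    index="event_type", columns="course_offering",
    values="count", aggfunc="sum", fill_value=0,
)
print(comparison)
```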
The following examples of inquiry questions were developed by instructors to guide dashboard exploration and address their pedagogical needs (a sketch of how one such question might be explored follows the list):
- Is there a difference in student engagement (e.g., downloads of weekly readings, timing of first access and review of resources) between different delivery modes (Hybrid Type A vs. Hybrid Type B vs. In-Person)?
- How often and when do students access or use the links to visual resources?
- What are the patterns of participation during the course’s term?
- Do the badges in modules motivate module completion?
- What types of announcements, discussions and module items do students access during the course’s term? Is there a difference in engagement patterns between distinct sections of the same course?
- How many students have viewed/downloaded homework solutions before quizzes, or slides before lectures?
- What are the periods of peak activity throughout the course’s term?
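As a concrete illustration of how such a question might be answered from activity data, here is a minimal, hypothetical sketch of the "solutions viewed before the quiz" inquiry. The event schema and quiz schedule below are invented for this example and do not reflect the dashboard’s actual data model:

```python
# Hypothetical sketch: how many distinct students accessed homework
# solutions before the corresponding quiz? Columns and dates are invented.
import pandas as pd

accesses = pd.DataFrame({
    "student_id": [201, 202, 203, 201, 202],
    "resource":   ["hw1_solutions"] * 3 + ["hw2_solutions"] * 2,
    "accessed_at": pd.to_datetime([
        "2023-09-18", "2023-09-21", "2023-09-19",   # around quiz 1
        "2023-10-02", "2023-10-06",                 # around quiz 2
    ]),
})

# Invented quiz schedule keyed by the matching solutions resource.
quiz_dates = {
    "hw1_solutions": pd.Timestamp("2023-09-20"),
    "hw2_solutions": pd.Timestamp("2023-10-04"),
}

# Flag each access as before or after the matching quiz, then count
# distinct students who accessed solutions ahead of time.
accesses["before_quiz"] = accesses.apply(
    lambda row: row["accessed_at"] < quiz_dates[row["resource"]], axis=1
)
early = (
    accesses[accesses["before_quiz"]]
    .groupby("resource")["student_id"]
    .nunique()
)
print(early)  # distinct students per solution set who looked before the quiz
```

In the dashboard itself, the First Access and Review of Course Elements page presents this kind of access-timing information visually and in aggregate.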
When asked which report/dashboard page was most useful to them, instructors’ answers were evenly distributed among all available reports (including New Analytics). This result suggests both the diversity of inquiries related to student activity and the capacity of the dashboard’s multiple-page design to answer such a diverse set of questions.
Finally, most instructors were surprised to find that student engagement levels were generally lower than they had expected. This suggests the dashboard could be an important tool for validating instructors’ assumptions about student engagement and for supporting interventions on this front.
Plans are underway for future collaboration with CTSI, in the context of its Scholarship of Teaching and Learning (SoTL) program, to share these examples more broadly and to highlight learning analytics as a potential data source in SoTL studies.
Instructor Feedback: Dashboard Development
During the program’s final workshop, instructors shared feedback on three aspects of their experience with the dashboard by rating the following statements:
- Navigating the dashboard elements was intuitive (e.g., filters, visualizations, metrics).
- Dashboard visualizations were easy to interpret.
- The user guide was user-friendly and instructive.
Overall, instructors reported a positive experience with the dashboard: all either agreed or strongly agreed with the statements, with the exception of one instructor who qualified that they had rarely accessed the user guide.
Enhancements During the D3:QRS Initiative
Instructors also provided written feedback on dashboard navigation, resources they believed could be added, and dashboard elements that could be included. They requested improvements to navigation and to the behavior of filter settings, more resource examples, and a glossary of terms. Over the course of the program, many of these issues were resolved and incorporated into the current iteration of the dashboard and its supports. For example:
- The new app environment has the page navigation on the left, making it easier to navigate the dashboard.
- The new app includes a bookmark option to save filter settings.
- There is now a supporting website with examples, images, case studies, and other resources to assist users of the dashboard.
- A definitions table and tooltips throughout the dashboard provide clear definitions of the tools and functions.
Recommendations for Further Development
In the short term (i.e., before the dashboard’s full launch), development efforts should focus on addressing high-priority feedback. This includes a possible new dashboard page for comparing multiple sections and courses within the same visualization.
Because this cohort had access only to the User Guide, and not to the website and other resources made available to Early Adopters in January 2024, their suggestions regarding missing supporting resources could be compared and combined with feedback from Early Adopters.
Items that require substantial infrastructure development (e.g., marks and grades data) or that are restricted by current data governance agreements (e.g., identifiable individual information, the ability to download underlying data) are currently considered low priority for short-term development. However, these items can be viewed as important additions to instructor dashboard functionality in future phases of the dashboard.