
SMARTTeach

An application to assist university instructors in lecture content creation

MacBook Pro Mockup Front view.png

Project Background

This project reimagines Lumio, K-12 software by SMART Technologies, for university instructors, particularly in Engineering. Through research and design, we aimed to enhance classroom material creation and improve educator satisfaction. This case study outlines our approach, findings, and design recommendations for adapting Lumio into a more effective tool for higher education.

Timeline

20 weeks

My Contributions

Research, Design, Design Systems, Usability Testing, Interaction Design

Platform

Desktop Web App

Collaborations

I collaborated with a UX Researcher (Pranali), two UX Designers (Leo and Natalie), a Product Manager (Jisu), and our client, SMART Technologies.

Problem 🧐

Instructors face challenges in creating engaging and interactive classroom experiences due to time constraints and difficulty tailoring content for students with diverse educational backgrounds. They also lack streamlined methods to gather student feedback and are dissatisfied with the quality of their current teaching materials.

Our Solution 🥁

1. Upload existing content and continue editing
Imported Slides.png

Instructors can upload their existing materials. Once a file is uploaded, instructors can edit it just as they normally would.

2. AI recommended activities
Adding Activity.png

Instructors get AI-recommended activities based on the content of their existing material. They can easily add these activities directly to their materials, saving time and effort in creating new content from scratch.

3. Editing AI recommended activities
Editing Activity.png

Instructors can edit the AI-generated activities to better suit their class and teaching style. This feature lets instructors make several kinds of edits to the recommended activities.

4. Presenting content and monitoring in-class activities
Monitoring.png

Instructors can present content while a separate screen with activity monitoring and speaker notes lets them track students' performance during in-class activities. A live chat feature allows students to post questions to the professor, either by name or anonymously.

The process we followed...

Firstly, Research 🔬

Our research methods

  • Exploratory Research

  • Literature Review and Competitor Research

  • Online Surveys

  • Semi-structured Behavioral Interviews

What we found

01

Instructors expressed the need to make classroom teaching more engaging and interactive.

02

Instructors want feedback on classroom learning and activities from the students.

03

Instructors feel that they do not have enough time to create lecture content.

04

Tailoring class content for all students is challenging for them since students come from diverse educational backgrounds prior to university.

05

Many instructors shared that they were not entirely satisfied with the quality of their current teaching content.

Thinking Styles / Personas 🧑

Thinking Styles.png

Ideation #1 💡

Based on our research insights, we conducted an initial brainstorming session to identify important features and user flows.

Initial brainstorm.png
Frame 10.png

A user flow for capturing students' backgrounds relative to the course syllabus and understanding their learning styles.

Frame 10-1.png

A user flow to assist users in forming contextual groups based on students' backgrounds and learning preferences.

A user flow for monitoring student progress during in-class activities and assisting them whenever they are stuck.

Frame 10-3.png

A user flow to add activities to existing static course materials to make lectures more engaging and interactive.

Eliminating user flow #2 to scope down

We opted to exclude user flow #2, which involved forming contextual groups of students for in-class activities. This decision was made because group formation can be highly subjective and varies greatly from class to class.

 

Additionally, group formation may not always be managed solely by the instructor and can involve numerous variables beyond just students' backgrounds and preferences.

Concept Test 📋

To ensure that these overarching concepts resonated with our users and to gauge their understanding of each concept, we conducted concept tests with three professors. 

concept test.png

Artifact shown to participants during concept test

What we found

01

Instructors want to maintain authority over their content even while appreciating AI assistance. They value having control over the content and would like the ability to edit AI recommendations.

02

Instructors want feedback on each activity so they can understand its successes and failures, and want to gauge how much students learned in the lecture.

Ideation #2 💡💡

Building on the first ideation, concept test feedback, and initial research insights, we further explored and brainstormed each flow individually.

01. Student background and preferences

Frame 10.png

Instructors can create forms to gather information about students’ backgrounds and preferences.

Frame 10-1.png

Instructors can share the created forms with students.

Instructors can view student responses and get insights that can help them tailor content for the upcoming lectures.

Student Bkg.png

02. Content creation

Frame 10.png

Instructors can upload existing materials and open them in an editable format.

Frame 10-1.png

Instructors can get AI recommendations for activities based on the content of the uploaded material as well as on student backgrounds and preferences.

Instructors can make edits to the recommended activities by changing the controls or the AI prompt.

Content creation ideation#2.png

03. Monitoring in-class activities

Frame 10.png

A separate monitoring screen, visible only to instructors, where they can see students working on the activity.

Frame 10-1.png

A live chat feature where students can post questions and instructors can answer them.

Ability to share activity results with the class.

In-class monitoring.png

Design through Iterations ✏️

We completed three iterations, progressing from low-fidelity to mid-fidelity prototypes, incorporating feedback from stakeholders and the internal team at each step, along with insights from usability testing.

01. Student background and preferences

What design decisions did we make?

01

A survey format that resembles popular platforms like Google Forms and Microsoft Forms, minimizing the learning curve for instructors.

02

Options for instructors to regenerate and manually edit survey questions, giving them greater control and authority over the content.

02. Content creation

What design decisions did we make?

01

Upload Functionality – Instructors often have lecture materials they've developed over the years. This feature allows them to easily reuse existing content, making only the necessary edits.

02

Most instructors use Google Slides or PowerPoint for lecture materials. To match their editing habits and reduce the learning curve, we kept the functionalities consistent and designed a familiar layout.

03. Editing AI recommended activities

What design decisions did we make?

01

Quick edit features, along with manual editing options, providing instructors greater freedom and control over their content.

02

The editing controls and prompt editor are designed to align with popular LLM platforms, helping to reduce the learning curve.

04. Presenting and monitoring in-class

What design decisions did we make?

01

Updated the 'Live Chat' feature to follow a threaded reply format, with the added ability to save and revisit past conversations.

02

Enabled the option to 'Share' results with students and export activities/scores in various formats for grading purposes.

Final Solution 🥁

Lastly, Reflections and Learnings ✍🏼

01

The university learning tech space presents a diverse range of users, learning styles, and teaching methods, making design a significant challenge. One of the key lessons from this project was the importance of scoping down and prioritizing user needs to address this complexity effectively.

02

Due to time constraints, we had to leave out research and concept testing with secondary users like TAs and students, but if we had more time, gathering their perspectives and feedback would have been valuable.

03

This project focused more on research and concept refinement than developing a final high-fidelity product, as it was intended to help SMART enter a new market space. The deliverables serve as a baseline, providing numerous recommendations for SMART to move forward.
