The Learning Dashboard was created as a corporate prototype to show learners (and their instructors, coaches, or mentors) at a glance how each learner is progressing in a course. All data points are consolidated into one screen.

PROBLEM:

A large, multi-billion-dollar corporation with hundreds of learners per year needed a way for learners and their instructors, coaches, and mentors to see at a glance how each learner was progressing. Adult learners were frustrated because expectations were unclear, and one-on-ones held unwelcome surprises: learners often discovered they weren’t doing as well in the course as they’d previously thought. Learners who were not meeting expectations or were failing the program had particular concerns about performance improvement plans. Because job success was tied to the outcome of this training program, we needed a better way to communicate expectations clearly and transparently and to give adult learners the opportunity to self-evaluate.

RESEARCH & ANALYSIS:

After conducting research, I was inspired by case studies and recommendations from the following two institutions:

Carnegie Mellon recommends a learner performance dashboard with a red/yellow/green status so you can quickly identify where a learner is struggling and where they are not. The dashboard should map each demonstrated skill to its learning objectives and, for each skill, identify the difficulty level, the baseline background knowledge learners are expected to bring into the course, and how long the skill should reasonably take to acquire given that background. If several learners are struggling in the same area, the instructional designers, learning strategists, and facilitators or coaches should be able to drill down into the assessments and activities, examine the questions being asked and the answers learners are giving, and determine where the disconnect is. Whenever possible, identify the misconceptions your learners are forming so you can explain the concept in a different, clearer way. The dashboard’s cognitive model should be tuned regularly against learner data. Carnegie Mellon takes this a step further with its Prediction Engine, which is based on a Bayesian hidden Markov model and helps predict mastery of skills.

[Figure: A Bayesian hidden Markov model. Source: https://turing.ml/dev/tutorials/04-hidden-markov-model/]
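
To make the prediction idea concrete, here is a minimal sketch of Bayesian knowledge tracing, a simple two-state hidden Markov model over a latent “skill mastered” state. This illustrates the general technique only; it is not Carnegie Mellon’s actual Prediction Engine, and all parameter values and status thresholds below are made-up placeholders.

```python
# Minimal Bayesian knowledge tracing (BKT) update: a two-state hidden
# Markov model over a latent "skill mastered" state. All parameters
# are illustrative placeholders, not tuned values.

def bkt_trace(observations, p_init=0.2, p_learn=0.15, p_slip=0.1, p_guess=0.25):
    """Return the estimated mastery probability after each observed attempt.

    observations -- sequence of booleans: True = learner answered correctly.
    """
    p_mastery = p_init
    trajectory = []
    for correct in observations:
        # Posterior P(mastered | evidence), via Bayes' rule.
        if correct:
            evidence = p_mastery * (1 - p_slip) + (1 - p_mastery) * p_guess
            posterior = p_mastery * (1 - p_slip) / evidence
        else:
            evidence = p_mastery * p_slip + (1 - p_mastery) * (1 - p_guess)
            posterior = p_mastery * p_slip / evidence
        # Transition: the learner may acquire the skill between attempts.
        p_mastery = posterior + (1 - posterior) * p_learn
        trajectory.append(p_mastery)
    return trajectory

if __name__ == "__main__":
    attempts = [False, False, True, True, True]
    for i, p in enumerate(bkt_trace(attempts), start=1):
        # Illustrative mapping to a red/yellow/green status.
        status = "green" if p >= 0.95 else "yellow" if p >= 0.6 else "red"
        print(f"attempt {i}: P(mastered) = {p:.2f} -> {status}")
```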

The Pittsburgh Science of Learning Center uses learner data to create a learning curve tool: for a specific skill, it charts how many errors each student makes across successive practice opportunities. You can generally expect that the first time a learner encounters a problem, they will make more errors, and that subsequent exposures will show a decrease in errors. This yields a data-driven prediction model for performance, which you can then compare to actual learner performance. When actual performance diverges from the model, you know adjustments are needed. Potential adjustments include switching the learning model, examining the activities and assignments, providing different assignments, assessing demonstration in a different way, or providing more skill scaffolding prior to demonstration and assessment.

[Figure: Pittsburgh Science of Learning Center learning curve analysis. Source: LearnLab]
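
As a rough sketch of how such a learning-curve model can be built and checked, the snippet below fits a power-law curve (error rate decaying with each practice opportunity) by least squares in log-log space, then flags opportunities where observed errors diverge from the fitted curve. The data points and the divergence threshold are invented for illustration.

```python
# Fit a power-law learning curve, error_rate ~ a * opportunity**(-b),
# in log-log space, then flag points that diverge from the model.
import numpy as np

opportunities = np.arange(1, 9)          # 1st through 8th exposure to the skill
observed_errors = np.array([0.62, 0.45, 0.38, 0.30, 0.29, 0.31, 0.30, 0.28])

# Linear regression on log(error) vs. log(opportunity).
slope, intercept = np.polyfit(np.log(opportunities), np.log(observed_errors), 1)
a, b = np.exp(intercept), -slope
predicted = a * opportunities ** (-b)

for n, obs, pred in zip(opportunities, observed_errors, predicted):
    # 0.05 is an arbitrary illustrative tolerance for "divergence".
    flag = "  <-- diverges from model" if abs(obs - pred) > 0.05 else ""
    print(f"opportunity {n}: observed {obs:.2f}, predicted {pred:.2f}{flag}")
```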

A recent study funded by the U.S. Department of Education, Carnegie Learning Inc., the Pittsburgh Science of Learning Center and DataShop team, and Ronald Zdrojkowski proposes that “tracing student knowledge more effectively” can help “sequence practice in tutoring/training software” (source).

Dashboard Inspiration – In addition to reviewing the Carnegie Mellon and Pittsburgh Science of Learning Center models, I was also inspired by the sleek, modern, and intuitive dashboards often found within the tech industry, including examples on portfolio sites like Dribbble and Behance, as well as some Bootstrap dashboard interfaces.

KEY FEATURES:

The client requested an Overall Score along with detailed breakdowns of individual progress to help clarify the areas where each learner needed to focus, deepen their knowledge, or improve their skills. To add an element of fun for more competitive learners, we also gamified the course and reflected bonus points for gamified tasks within the dashboard.

Dashboard overview:

Sections of the dashboard include:

  • Training Hours Completed
  • Total Courses Completed
  • Required Tasks Completed (Out of 63)
  • BONUS Points Earned
  • Final Project Grade (0 to 100 on a grading scale with a rubric)
  • Quiz Scores
  • Peer Feedback (0 to 100 on a grading scale with a rubric)
  • Overall Score (average of the 0 to 100 scores above) – On the 0 to 100 scale, 85 and above is passing, 75-84 is borderline, and 74 and under is failing. Color-coding shows learners whether they’re passing, borderline, or failing, and signals to instructors and coaches that remediation, and possibly extra 1:1 training, may be needed. A scoring sketch follows this list.
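
The sketch below encodes the scoring rules described above: the Overall Score averages the 0 to 100 component scores, and the 85+ / 75-84 / 74-and-under bands map to green, yellow, and red. The component names and values are sample data, not real learner records.

```python
# Overall Score and color-coding rules from the dashboard spec:
# average the 0-100 component scores, then band the result.

def overall_score(scores):
    """Average all component scores (each on a 0-100 scale)."""
    return sum(scores.values()) / len(scores)

def status_color(score):
    if score >= 85:
        return "green"    # passing
    if score >= 75:
        return "yellow"   # borderline: flag for possible 1:1 remediation
    return "red"          # failing

learner = {"Final Project": 88, "Quiz Average": 79, "Peer Feedback": 91}
score = overall_score(learner)
print(f"Overall Score: {score:.1f} -> {status_color(score)}")
```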

ITERATIONS:

The dashboard was rolled out in two-week sprints in a fast-paced Agile development environment within a tech corporation:

Version 1.0 – We needed to deliver our first iteration quickly, then refine it later. For our pilot program, we created a working prototype of the learner dashboard in a Microsoft Excel spreadsheet. Learners filled out the spreadsheet each week, self-reporting and self-evaluating their progress. The spreadsheets were stored in a shared folder on Microsoft SharePoint, where the instructors, coaches, and learning team (instructional designers and learning strategists) could view the data in real time. Learners also discussed the data points with their instructors and coaches during weekly 1:1 coaching meetings.

Version 2.0 – Because the spreadsheet approach required time-consuming manual data entry, I collaborated with the company’s Data Engineering team. We planned to script the data collection and connect to internal APIs, reducing the number of manual spreadsheet updates each week.
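
A hedged sketch of what that automation looked like in spirit: pull learner progress from an internal API and write it into the shared workbook. The endpoint, field names, and workbook layout below are hypothetical stand-ins, not the company’s actual systems.

```python
# Sketch: pull learner progress from a (hypothetical) internal API and
# write it into the shared Excel workbook, replacing manual entry.
import requests
from openpyxl import load_workbook

API_URL = "https://learning.example.com/api/v1/progress"   # placeholder endpoint
WORKBOOK = "learner_dashboard.xlsx"                        # placeholder file

resp = requests.get(API_URL, timeout=30)
resp.raise_for_status()

wb = load_workbook(WORKBOOK)
ws = wb["Progress"]   # hypothetical sheet name
# Row 1 is assumed to hold headers; data starts at row 2.
for row, learner in enumerate(resp.json()["learners"], start=2):
    ws.cell(row=row, column=1, value=learner["name"])
    ws.cell(row=row, column=2, value=learner["tasks_completed"])
    ws.cell(row=row, column=3, value=learner["quiz_average"])
wb.save(WORKBOOK)
```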

Version 3.0 – For the final iteration, we needed a clean, intuitive dashboard that clearly communicated expectations and course standing to learners and instructors. I drafted a mockup in Photoshop, then hired a professional User Experience (UX) and User Interface (UI) Designer who specializes in dashboards and data visualizations. The designer brought our ideas to life in Figma, creating an intuitive, clean, and modern layout built for usability.

NEXT STEPS:

  1. We formed a small team of key stakeholders to collaborate with the company’s Data Engineers on fully automating the Learning Dashboard. Once the fully functional, clean, and intuitive dashboard launches, we will no longer need the spreadsheets or any manual updates.
  2. My future vision for this dashboard is to allow the option of “drilling deeper” into the reasons behind each score, similar to the Carnegie Mellon and Pittsburgh Science of Learning Center models. For instance, if a learner failed a quiz or assessment, you could select that score to view all the questions that were missed. This provides key data points to support growth, studying, and deepening skills. Likewise, if a majority of learners miss the same questions on the same assessments, that is a helpful clue for the instructional design team to reevaluate those assessments (see the first sketch after this list).
  3. Future iterations will follow learning engineering best practices: integrate the Experience API (xAPI, formerly called Tin Can API) and the newer cmi5 standard to store data points in a Learning Record Store (LRS) database with robust reporting features; provide customized xAPI statements using JavaScript and JSON; and enable more personalized, adaptive learning and data-driven decisions (see the second sketch after this list).
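
For step 2, the drill-down aggregation could be as simple as counting missed questions across the cohort and flagging items that a majority of learners got wrong. A minimal sketch, with hypothetical data shapes:

```python
# Aggregate missed questions across learners so the design team can
# spot assessment items a majority of the cohort got wrong.
from collections import Counter

missed_by_learner = {          # hypothetical drill-down data
    "learner_01": ["Q3", "Q7"],
    "learner_02": ["Q3", "Q9"],
    "learner_03": ["Q3", "Q7", "Q9"],
}

miss_counts = Counter(q for missed in missed_by_learner.values() for q in missed)
cohort_size = len(missed_by_learner)

for question, count in miss_counts.most_common():
    if count / cohort_size > 0.5:   # missed by a majority of learners
        print(f"{question}: missed by {count}/{cohort_size} learners -- review this item")
```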
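
For step 3, the snippet below shows the shape of a minimal xAPI “completed” statement posted to an LRS, sketched in Python for consistency with the earlier examples (the production version would use JavaScript and JSON, as noted). The LRS endpoint, credentials, and identifiers are placeholders.

```python
# Post a minimal xAPI "completed" statement to a Learning Record Store.
# Endpoint and credentials are placeholders; a production version would
# also follow the cmi5 profile's session rules.
import requests

LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"   # placeholder
AUTH = ("lrs_user", "lrs_password")                        # placeholder

statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Sample Learner"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "https://learning.example.com/courses/dashboard-101",
               "definition": {"name": {"en-US": "Dashboard Pilot Course"}}},
    "result": {"score": {"scaled": 0.87}, "completion": True},
}

resp = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
resp.raise_for_status()
print("Statement stored:", resp.json())   # LRS returns the new statement ID(s)
```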