PROJECT
Learning Management System for students & educators
SUMMARY
My team and I redesigned our academic institution's LMS (Learning Management System) to make it a convenient, understandable system that mediates between students and the services of the educational institution.
RESPONSIBILITIES
During my Bachelor's in Psychology with a focus on Human-Computer Interaction, I took part in a course on UX design for complex systems. This project, completed with my team, aimed at enhancing the existing student portal.
We were aware that, as students ourselves, it might be challenging to set aside our own preconceptions and ideas about the ideal functionality and appearance of such a product.
Company Objective & Purpose
Our team has established a clear objective and purpose to guide us throughout the project:
Objective: Developing an intuitive and user-friendly interface that serves as a liaison between the student and the educational institution's services.
Purpose: Ensuring that students have straightforward access to information, tasks, and assignments.
System Objectives
Primary Goal: To merge current college platforms, enabling students to quickly and easily carry out tasks and access information.
Supplementary Goals:
- Enhance the score of the user satisfaction survey.
- Minimize redundant interactions with technical support.
- Boost the rate of system sign-ins.
- Decrease the time taken to perform tasks.
- Lower the rate of user errors within the system.
Background
Competitors Analysis
After conducting an in-depth analysis of our competitors, which involved evaluating similar systems used by various institutions, we compiled our observations into advantages and disadvantages:
Advantages:
- Streamlined and user-friendly system navigation.
- Easily accessible contact details for lecturers.
- A calendar enriched with relevant and contextual details.
- Class-specific and time-specific file accessibility.
Disadvantages:
- Placement of irrelevant information and actions in crucial areas of the system.
- Irrelevant information and actions for different user types (students/lecturers).
- Poor user interface and layout design.
System Value Addition
- Simplicity: Ensuring user-friendly access by highlighting key features prominently, thereby eliminating confusion.
- Efficiency: Reducing the volume of support requests, thus saving institutional resources.
- Certainty: Offering users a reliable and uniform experience across the system, thereby assuring them of their actions.
Key Success Factors
We have identified the following key performance indicators to evaluate the effectiveness of our design once it is launched and in use by students:
- Decrease the time it takes for lecturers to find information to 1.5 minutes or less.
- Shorten the time required to download presentations/documents to under 1.5 minutes.
- Reduce the time needed for submitting assignments and exercises to less than 2 minutes.
- Raise the average system satisfaction score to a minimum of 4 out of 5.
Stakeholders
Target Users: Students, faculty members, technical support staff.
Client Base: Higher education institutions such as universities and colleges.
Key Stakeholders: The founding team.
Research Methods
We recognized that thorough and in-depth research was crucial to inform the strategic decisions for our product.
To achieve this, we employed a mix of quantitative and qualitative research methods. This included conducting surveys to collect user feedback and performing in-depth interviews. We also engaged with the technical support team and several lecturers at our institution to gain comprehensive insights.
Our research methodologies were as follows:
User Feedback Survey with 50 Students: We collected both quantitative and qualitative data through an online survey, gathering a range of responses from a representative group of student users. This survey included both open-ended and closed-ended questions.
Interviews with 5 Students and 3 Lecturers: Conducted by a researcher, these interviews provided an opportunity for direct conversation with both students and lecturers. The aim was to delve deeper into their thoughts and opinions on the subject matter.
Qualitative Research - User Interviews
Initially, we prepared a set of questions to guide our discussions with students.
The feedback we received from students regarding their use of the current system included:
- "I've come to accept that this is the only system available to us, even though it's far from ideal."
- "The lack of clarity on what buttons do has been a major source of frustration for me; it feels like a waste of time."
- "Considering the high tuition fees, it's disheartening to deal with a system that doesn't meet expectations."
- "Getting enrolled in the courses has always been a struggle, almost like a constant battle."
- "Next time, I'll try to be more patient, but I'm not sure if there will be any improvements to look forward to."
Quantitative Research
Our survey was structured around two primary themes: actions and information items, with each theme containing 17 questions. For instance, within the "Actions" theme, one of the questions was about "Sending a message to a lecturer".
Participants were asked to evaluate each question based on two criteria: importance and frequency, with a rating scale from 1 (very low importance/frequency) to 4 (very high importance/frequency).
Based on the responses, we determined the number of participants (N) who gave each item each rating on the 1-4 scale. To assess which items were deemed more important or more frequent, we normalized each item's ratings against a maximum possible value of 200 (the total number of participants, N=50, multiplied by the maximum rating of 4), giving each item a numerical score for its significance or commonality.
The formula used was: score = [(N1×1) + (N2×2) + (N3×3) + (N4×4)] / 200, where Ni is the number of participants who gave rating i.
For instance, consider an information item rated on frequency: 15 participants rated it 1, 17 rated it 2, 11 rated it 3, and 7 rated it 4. Applying the formula, (15×1 + 17×2 + 11×3 + 7×4) / 200 = 110 / 200 = 0.55, which is the item's frequency score. This value was then combined with the importance score (calculated using the same formula). The final ranking of items, based on the sum of their frequency and importance scores, was organized hierarchically and visualized through color coding for easy interpretation.
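The scoring and ranking procedure above can be sketched in a few lines of Python. This is an illustrative reconstruction, not our actual analysis script; the item names and rating counts (other than the worked example from the text) are hypothetical.

```python
def weighted_score(counts, n_participants=50, max_rating=4):
    """Normalize rating counts to a 0-1 score.

    counts[i] is the number of participants who gave rating i+1.
    The maximum possible value is n_participants * max_rating (here 200).
    """
    max_possible = n_participants * max_rating
    return sum(count * (i + 1) for i, count in enumerate(counts)) / max_possible

# Worked example from the text: 15 rated it 1, 17 rated 2, 11 rated 3, 7 rated 4.
frequency = weighted_score([15, 17, 11, 7])  # 110 / 200 = 0.55

# Items are ranked by the sum of their frequency and importance scores.
# (Counts below are made-up placeholders for illustration.)
items = {
    "Sending a message to a lecturer": ([15, 17, 11, 7], [5, 10, 20, 15]),
}
ranked = sorted(
    items,
    key=lambda name: weighted_score(items[name][0]) + weighted_score(items[name][1]),
    reverse=True,
)
```

Sorting by the combined score reproduces the hierarchical ranking we color-coded in the survey analysis.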
Task Analysis
Upon collecting all the data from user interviews and feedback, we transformed this information into a task analysis framework. This approach allowed us to prioritize the data items and actions that our users are most likely to engage with on the platform, providing a clear overview of their needs and preferences.
Concepts
Concept 01
Action Flow
Downloading a presentation file of a certain course:
References:
College of Management Academic
- Each course is showcased with its essential information.
- The organization of information is logical and user-friendly.
Hadassah Academic College
- The design of the courses was straightforward and effective.
- Content was uniform across all courses.
Concept 01 Wireframe
Advantages
- The interface is straightforward, easy to navigate, and comprehensible.
- Displays the most pertinent information prominently.
- Aligns closely with real-world expectations.
- Maintains consistency and adheres to standards.
- Prioritizes recognition over memory recall.
- Offers flexibility and efficiency in usage.
- Provides accessible help and guidance.
Disadvantages
- The visual design might be overly simplistic or whimsical.
- Lacks detailed personal information about students, such as academic year, department, and grades.
Concept 02
Action Flow
Downloading a presentation file of a certain course:
References:
- Streamlined, minimalist, and contemporary dashboard interface.
- Well-defined structure with a dedicated profile section.
- Intuitive, easy-to-navigate, and clear layout.
- Incorporates a system-wide search feature and readily available support.
Concept 02 Wireframe
Advantages
- Effective presentation of information on the dashboard
- Efficient functionality of the search bar
- Highlighting of critical actions (e.g., course registration)
- User-friendly and efficient operation
- Attractive and minimalist interface
- Tailored solutions within the system
- Comprehensive help and guidelines
Disadvantages
- Potential for cognitive overload
- Possible confusion in navigation
- Adjustment period may be required
Concept Conclusion 🎉
We showcased both designs to our peers and instructor, gathering their insights.
Concept 01 – Preferred Choice
- Navigation was straightforward
- Visually appealing and welcoming
- Images facilitated clearer decision-making
- User-centric interface design
- Potential language barrier issues
- Possibly overly simplistic
Concept 02
- Visually attractive
- Clear separation of information and actions
- Positive reception of the search functionality
- Customizable layout was well-received
- Information on the dashboard led to some confusion
- Concerns raised about cognitive overload
Following this feedback, we refined and combined the most favored features and layouts from both concepts to create the final product version.
Next Steps
Questions currently beyond the scope of our process that need additional exploration include:
- Adapting the final design for compatibility with both mobile and tablet platforms.
- Developing comprehensive user journeys for all system actions and screens.
- Establishing additional Key Performance Indicators (KPIs) for further features.
- Conducting A/B tests for primary screens and functionalities.
- Developing a profile system for lecturers.
- Gathering feedback from a broader range of stakeholders.
Main Dilemmas
Research Methods and Approaches
In our quantitative survey distributed to participants, we aimed to collect detailed information while minimizing the risk of survey abandonment. To achieve this, we structured questions related to user experience on a scale from 1 to 4, rather than using open-ended questions, to avoid participant fatigue.
The choice of a four-point scale was deliberate, based on research methodology that suggests an even-numbered scale can mitigate the risk of neutral responses, which often yield inconclusive data. This approach encouraged participants to provide a more decisive response.
In our exploration of competitor systems to understand the landscape of student platforms, we initially lacked detailed knowledge about the technical aspects of these systems.
This gap in understanding prompted us to engage in qualitative research with our campus's online support team. Through this inquiry, we delved into the 'Moodle' system, discovering its widespread use across Israeli universities.
Design and User-Centered Design (UCD)
In developing our final design concepts, we faced the challenge of how to display courses occurring on the same day on the homepage.
While considering various options, including the use of distinct images for each course, we ultimately embraced the principle of 'Aesthetic and minimalist design.' This led us to present each course with its title and a unique color gradient for easy differentiation.
During peer reviews, concerns were raised about the potential for a high cognitive load due to an abundance of details.
To address this, we aimed to decrease the cognitive burden on the homepage by indicating the number of tasks, exams, and other obligations next to each title.
This design choice, rooted in the 'Visibility of System Status' principle, enables users to quickly ascertain their outstanding tasks without feeling overwhelmed by information.
Endorsements
Credits
Design Team
Shay Cohen Ambalo
Dana Sergeev
Noya Ariel
Omer Aviram