ePST



Executive Summary

BrightVision Consultants agreed to evaluate the current lessons of the “Electronic Problem-Solving Training” (ePST) developed at the University of Georgia. This online intervention aims to support survivors of traumatic brain injury (TBI) by enhancing their problem-solving skills. The project goal is to provide insight into the lessons' strengths and weaknesses and offer constructive suggestions for improvements. The evaluation focused on factors such as language quality, accessibility, user engagement, effectiveness of interactive elements, and the microlearning approach.

BrightVision Consultants were granted operational freedom regarding the deliverable containing evaluation feedback. The long-term goal is to utilize the feedback and evaluation tools provided by BrightVision Consultants to enhance the quality of both future microlearning modules and those currently in development for TBI survivors.

To meet the project goal, we developed an evaluation form with a mixed-methods data collection approach, presenting both quantitative and qualitative data. The evaluation questions were informed by Kirkpatrick’s Model, specifically Level One, focusing on learner engagement, and by usability principles, emphasizing the learner experience, effectiveness, and satisfaction. Quantitative data was collected using survey questions with a Likert scale ranging from one (strongly disagree) to five (strongly agree). Qualitative data came from open-ended survey questions. These questions helped review the content and design for quality, relevance, clarity, ease of navigation, visual appeal, the microlearning approach, and the effectiveness of the interactions. We captured experiential data from both mobile devices and desktop computers.
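To make the descriptive statistics reported in Appendices C and D concrete, the short sketch below shows how a set of Likert responses for one evaluation criterion can be reduced to the mean, standard deviation, and mode used throughout this report. It is a minimal illustration only: the criterion labels and scores are hypothetical, and we assume the sample (n − 1) standard deviation was used.

    # Minimal sketch: summarizing Likert-scale responses (1 = strongly disagree,
    # 5 = strongly agree) into the mean, standard deviation, and mode reported
    # in Appendix D. Criterion labels and scores below are hypothetical.
    import statistics

    responses = {
        "Content is presented clearly": [5, 4, 5, 4, 4, 5, 4],
        "Sufficient help or documentation": [2, 1, 4, 3],
    }

    for criterion, scores in responses.items():
        mean = statistics.mean(scores)
        sd = statistics.stdev(scores)  # sample (n - 1) standard deviation; an assumption
        mode = statistics.mode(scores)
        print(f"{criterion}: mean={mean:.2f}, sd={sd:.2f}, mode={mode}")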


Key Findings and Recommendations

• Strengths: High user engagement and effective interactive elements.

• Weaknesses: Identified areas for improvement in language clarity and visual appeal.

• Recommendations: Specific suggestions for enhancing language quality, accessibility, and interactive components.

Impact and Future Steps

The feedback provided will be instrumental in refining the current ePST modules and guiding the development of future microlearning modules for TBI survivors. The continuous improvement process aims to ensure that the ePST remains an effective and engaging tool for its users.

Content Evaluation

The reviewed lessons generally present high-quality and relevant content that aligns with the learning objectives and activities.

Some examples include:

1. In all of the lessons, the segmented structure is marked by visual graphics such as check marks, which help users track their progress effectively.

2. In all of the lessons, the use of a persona adds a personal touch and makes the content more relatable.

3. Several lessons feature introductory testimonials from individuals affected by Traumatic Brain Injury (TBI), which helps engage learners from the outset and makes the content more relatable.

4. Several lessons included music with logo intros and multimedia themes, which enhanced the overall learning experience through an engaging and dynamic presentation.


Language quality, accessibility, and user engagement are critical themes in the feedback.

Some examples include:

1. Issues with language clarity and awkward phrasing were prevalent across all lessons.

2. There were significant grammatical and spelling errors across all lessons.

• Ensure thorough proofreading is performed to eliminate these errors.


3. The interactive elements contributed to effective user engagement; however, their implementation was not always successful, especially on mobile devices.

• For example, the interactive elements in Lessons 2.6 and 3.4 were sometimes dysfunctional.

4. Several lessons included interactive quizzes or simulations that aligned well with the currently stated learning objectives; however, we encourage revising the objectives themselves.

Although constructive alignment exists between the assessments, knowledge checks, and learning objectives, best practices are not being followed.

The stated learning objectives are not measurable and resemble a list of instructions rather than specific learning objectives.

Using the image to the left as an example, the learning objective should be to ‘create a problem statement’. The numbered text that follows resembles instructions or an introduction to the section.

In summary, while the microlearning lessons reviewed generally present high-quality and relevant content, several areas require improvement to enhance language quality, accessibility, user engagement, and overall user experience. Clear instructions, sufficient support resources, thorough proofreading, functional interactive elements, and consistency in content delivery are crucial for improving the effectiveness and professionalism of these lessons.

[Figure: the microlearning experience for TBI survivors sits at the intersection of curated content, user experience design (UxD), and learning experience design (LxD).]

Design Evaluation

The evaluated lessons exhibited aspects of both effective and ineffective learning design.

Positive aspects included effectively using multimedia elements such as images, audio, and video to enhance learning. The structure of many lessons was generally intuitive with ample opportunities for interactive engagement, which will be well-received by learners. The visual design of the lessons featured appealing layouts that aid in focusing on content. Elements like badges will increase learner engagement and provide a motivational impact.

In addition, the microlearning format, with its bite-sized chunks of information and interactive elements, is effective and will be appreciated by learners. This approach makes learning more manageable and allows learners to consume content at their own pace.


However, we highlighted several common design issues across multiple lessons:

1. Navigation problems were a recurring concern, especially on mobile devices.

• We frequently encountered difficulties locating and interacting with elements referenced in the content, such as arrows and tabs.

• Instructions were frequently unclear, with inconsistent navigational cues, which may lead to confusion among learners.

• For example, in Lesson 2.1, the instructions indicate to select each tab for more information; however, no tab interaction is present.


2. Inconsistencies in design elements may cause frustration among learners and fall short of compliance with Section 508 accessibility standards.

• These inconsistencies included varying font sizes, alignment, and color schemes, which impacted the readability and aesthetic coherence of the learning materials, specifically for those with low vision or color blindness.

• Improving accessibility features for learners with disabilities, including those with fine motor skill challenges or traumatic brain injury (TBI), is crucial. Ensuring navigable elements across various devices will create a more inclusive learning environment.

• For example, the font sizes in the screenshot below would be considered too small for a learner with low vision.

3. Functionality issues also detracted from the overall user experience.

• These included malfunctioning interactive elements, problems with scrolling functions, and issues with multimedia interactions on mobile devices.

4. In some cases, AI voices and facial expressions did not align with the tone of the content, which may be distracting for learners.


Despite these challenges, the structured approach of the lessons and the effectiveness of interactive elements when functioning correctly contribute significantly to the overall effectiveness and engagement of the learning experience. While the microlearning approach shows promise in engaging learners and delivering content effectively, the persistent usability and technical issues undermine its full potential.

Addressing these challenges, such as improving navigation consistency, optimizing design for mobile usability, and ensuring technical functionalities work seamlessly across devices, is crucial for enhancing user experience, maximizing the effectiveness of the microlearning format, and increasing learner satisfaction. Additionally, maintaining accessibility standards across all aspects of the lessons will improve inclusivity and accommodate diverse learner needs.
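Because Section 508 compliance incorporates the WCAG contrast requirements noted above, low-contrast text can also be flagged programmatically before a design review. The sketch below is illustrative only and is not part of the evaluated lessons: the color values are hypothetical stand-ins, and it applies the WCAG 2.x relative-luminance formula with the AA thresholds of 4.5:1 for normal text and 3:1 for large text.

    # Illustrative WCAG 2.x contrast check (Section 508 incorporates WCAG 2.0 AA).
    # The hex colors below are hypothetical stand-ins for a lesson's text/background.
    def relative_luminance(hex_color: str) -> float:
        """Relative luminance per WCAG 2.x from an 'RRGGBB' hex string."""
        def linearize(c: int) -> float:
            c = c / 255
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
        return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

    def contrast_ratio(fg: str, bg: str) -> float:
        lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    ratio = contrast_ratio("767676", "FFFFFF")  # mid-gray text on a white background
    print(f"{ratio:.2f}:1  AA normal text: {ratio >= 4.5}  AA large text: {ratio >= 3.0}")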

Recommendations

Several key recommendations emerged from the feedback across various lessons to improve the overall quality and user experience of the lessons. These recommendations focus on enhancing clarity, consistency, and interactivity.

Throughout the lessons, unclear instructions and content should be revised to ensure they are straightforward and easy to follow.

Specific recommendations include:

1. Avoid references to non-existent features.

• For example, in the screenshot below from Lesson 2.1, the learner is told to click and open tabs that do not exist in the lesson interface, and there is also an error in the lesson number.


2. Use clear, actionable language.

3. Make specific phrases clearer and more concise.

• For example, change “how to flex in problem-solving” to “how to be flexible when problem-solving.”

4. Maintain consistency in design elements, such as font sizes, button styles, and overall visual style to enhance readability and user experience.

• For example, in the screenshot below, the font size is too small for readability on mobile.

5. Develop a style guide to ensure consistency of specific sections in a lesson.


6. Improve and standardize navigational elements to ensure intuitive and consistent cues throughout the lessons.

• This includes addressing navigation issues, particularly on mobile devices.

7. Review mobile compatibility to ensure that interactive elements and multimedia components work seamlessly across different platforms.

• This includes addressing functional issues and optimizing text size and interactive elements for mobile devices.

• The screenshot below illustrates a common interaction in the lessons that, when used on mobile, has a non-intuitive mechanic that makes moving to the next slide difficult.


8. Refine interactive scenarios to prevent premature progression or confusion among learners.

• This includes simplifying navigation between sections and introducing interactive elements early to maintain learner motivation.

9. Increase the frequency of knowledge checks to reinforce learning and provide additional support.

10. Provide detailed and helpful feedback for assessment questions to guide learners on how to improve and reinforce key concepts.

11. Proofread to eliminate grammatical and spelling errors and ensure all content is accurate and clear.

12. Ensure that all video content includes closed captions to maintain consistency and accessibility across the lessons.

13. Increase the availability of help resources and documentation within the lessons to assist users, reduce frustration, and enhance learning outcomes.

14. Consider changing the PDF to an editable Google document upon download. This will assist mobile users who might have problems downloading and viewing the PDF. If the PDF remains, provide clear instructions on how mobile users can download, view, and return to the lesson.

15. Improve the quality and naturalness of the voiceovers and align facial expressions and body language with tone.


16. Consider having lesson titles remain consistent and avoid arbitrary division into parts (e.g., refrain from using "3.1, Part 1" and "3.1, Part 2" if lessons are not originally segmented into parts within the initial course structure).

17. Generally, microlearning lessons have only one outcome each; thus, consider reducing the number of outcomes per lesson to one.

By addressing these recommendations, the lessons can be significantly improved, offering a more cohesive, intuitive, and engaging learning experience for all users.

Additional Areas for Evaluation

The evaluation of the training lessons identified several key areas for improvement to enhance their overall quality and effectiveness. Recommendations focus on user engagement, content clarity, instructional design, and technical functionality.

In addition, there are several other areas to consider for evaluation:

1. Consider only providing a time estimate for the full lesson.

• It is not necessary to provide time estimates for individual sections of the lesson, as this is extraneous information that may confuse learners.

• Ensure time estimates for lessons are realistic and achievable for the desired audience.

2. In the knowledge checks and assessments, provide detailed feedback that clarifies why answers are correct or incorrect to enhance comprehension.

3. Ensure that the learning objectives are measurable and that knowledge checks, assessments, and feedback align with the learning objectives.

• Avoid using words such as “learn,” “understand,” or “know.”

• Formulate learning objectives as actionable statements rather than partial ideas.

4. User experience should be carefully reviewed and improved.

• Consider including participants from the desired audience in user experience testing to collect feedback to help identify and address any points of difficulty.


In conclusion, the training lessons have strong potential. By addressing these areas for improvement, the lessons can be significantly enhanced, leading to a better user experience and more effective learning outcomes. The next steps include a comprehensive content review, resolving technical issues, enhancing interactive elements, and ensuring alignment with learning objectives. Continuous user feedback mechanisms will help refine the lessons for an even more effective and engaging learning experience.

Appendix A. Original client request:

Background

I'm in the UGA CoE in Workforce Ed and Instructional Tech, specifically in Learning, Design, and Technology: https://coe.uga.edu/academics/concentrations/learning-design-and-technology/

Slack page for ePST UGA project: https://app.slack.com/client/T01KPK0RZPV/C078QSAD4BE

The primary objective for your team is to evaluate the existing ePST modules. Because the microlearning modules are in active development, you will need to request the latest versions from Yueqi Weng, the PhD student who is designing them. I have informed her about your work and copied her here. As to the scope of the evaluation, specifically, I would like you to:

Review the Content: Assess the quality, relevance, and clarity of the information presented in the modules to the best of your abilities. Provide feedback on any areas that might require revision or enhancement, particularly focusing on language quality, accessibility, and user engagement.

Evaluate the Design: Analyze the learning design and user interface of the modules. Consider aspects such as ease of navigation, visual appeal, microlearning approach, and the effectiveness of the interactive elements.

Provide Recommendations: Based on your evaluations, offer constructive suggestions for improvements. This could include recommendations for content adjustments, design changes, or the addition of new features to enhance the learning experience.

Identify Additional Areas for Evaluation: Be proactive and use your knowledge and skills to identify other potential areas where evaluation would be useful. Your fresh perspective is invaluable in uncovering aspects we might have overlooked.

Your team will not be responsible for creating new modules. However, your insights and feedback will be invaluable for refining and improving the existing content.

I am available via email or via Slack (I added all of your team to the "epstevaluation" channel) to discuss this further and answer any questions you might have. Unfortunately, I am traveling extensively this month, so setting up a Zoom meeting is not going to work for me in the short term. Longer term, I am available after Jul. 8. I am happy to provide detailed responses to any questions you have.

Looking forward to working with you and your team!

Dr. S.

Appendix B. The Core Team:

Julie Wyatt, Project Manager

Julie holds an M.P.A. from the University of North Georgia and is now pursuing her M.Ed. in Instructional Design and Development at the University of Georgia. Julie is currently an academic advisor at Kennesaw State University and plans on developing e-learning modules to support higher education departmental trainings. She enjoys breaking down complex ideas into easily understandable material for adult learners and takes pride in her organizational skills.

Brooke Oliver, Instructional Designer

Brooke holds a Bachelor of Science in Interaction Design and a Minor in Technical Communication from Kennesaw State University. She is now pursuing her M.Ed. in Instructional Design and Development at the University of Georgia. Brooke is a full-time Instructional Designer and Developer in the Medical Device Education industry. Brooke’s area of expertise is designing and developing interactive eLearning and continuing education modules for healthcare professionals.

Christelle de Beer, Instructional Designer

Christelle holds a Bachelor of Commerce in Industrial and Organizational Psychology, a Bachelor of Arts in Psychology, and a Postgraduate Diploma in Education Technology, and is currently pursuing an M.Ed. in Learning, Design, and Technology with an emphasis on Instructional Design and Development from the University of Georgia. Christelle is currently a Learning Experience Designer at North-West University in South Africa. She specializes in online training and development and content creation.


Jennifer Hoosier, Instructional Designer

Jennifer holds an MA in TESOL from Azusa Pacific University and is currently pursuing an M.Ed. in Learning, Design, and Technology with an emphasis on Instructional Design and Development from the University of Georgia. Jennifer has extensive experience with international students and second language learners and has been in higher education administration since 2017. Her professional work focuses on creating training that is both high-interest and pedagogically sound.

Maria Paula Borras-Patino, Instructional Designer

Paula received her BA in Financial Engineering from UNAB in Colombia in 1999. After working in the banking industry, she transitioned to HR and Safety, which served as a bridge into the training world. She is currently pursuing her M.Ed. in Learning, Design, and Technology with an emphasis in Instructional Design and Development from the University of Georgia, with an expected graduation in December 2024. Paula now works for Coca-Cola as a full-time Training Consultant. She enjoys working with SMEs and excels in active listening, conflict resolution, and fostering a positive and engaging work environment.

Appendix C. Quantitative Data:

Appendix D. Qualitative Data:

Overview Summary

Overall, the lessons are praised for their clear, engaging, and well-structured content, with robust support for learning outcomes through effective multimedia use and interactive engagement. However, to create a more comprehensive and user-friendly learning environment, it is crucial to address the weaknesses identified in support documentation, grammatical accuracy, navigational aids, and instructional clarity. Enhancing these aspects will significantly improve the overall user experience and learning effectiveness.

2.1: Assess the Problem

The lesson content excels in clarity, engagement, and alignment with learning objectives, as evidenced by high mean scores (4.40-4.57) and low standard deviations (0.53-0.89) for these criteria, indicating consistency among reviewers. However, the availability of help or documentation needs improvement, with a low mean score of 2.25 and a higher standard deviation of 1.31, showing variability in reviewers' perceptions. Design elements received high marks for visual appeal and structure, with mean scores of 4.00-4.33 and standard deviations of 0.52-0.90, but there were significant differences in ease of navigation (mean score 2.75, standard deviation 1.71) and clarity of instructions (mean score 2.50, standard deviation 1.89), indicating areas for improvement. While multimedia usage scored well (mean 4.25, standard deviation 0.83), some variability in its impact was noted. Consistent high scores for assessment alignment and engagement opportunities (mean scores of 5.00 and 4.25, respectively, with low standard deviations) demonstrate robust support for learning outcomes. Overall, the lesson is well-received for its clear, engaging, and well-structured content, but enhancing navigational aids and instructions, as indicated by the statistical data, will improve the user experience. Addressing these weaknesses will create a more comprehensive and user-friendly learning environment.

2.2: Brainstorm Solutions

The lesson content is presented clearly and is engaging, with high mean scores (4.50-3.71) and modes of 5, indicating strong reviewer agreement. However, the availability of help or documentation is a weakness, reflected in a low mean score of 2.00 and a high standard deviation of 1.31, showing variability in perceptions. The design elements are generally well-received, with high scores for multimedia usage (mean 4.25) and structure (mean 3.71), but ease of navigation and clarity of instructions have lower mean scores (3.14 and 3.57) and higher variability, indicating areas needing improvement. Consistent high scores for assessment alignment and engagement opportunities (mean 4.25, mode 5) demonstrate robust support for learning outcomes. Overall, the lesson is praised for its clarity, engagement, and structured content, but enhancing help documentation and navigation, as indicated by statistical data, will improve the user experience. Addressing these weaknesses will create a more comprehensive and user-friendly learning environment.


2.3: Consider and Choose

The lesson content is clearly presented and engaging, with mean scores ranging from 3.43 to 4.43 and low standard deviations (0.79 to 1.50), indicating strong reviewer agreement. However, the availability of help or documentation is a notable weakness, with a low mean score of 2.50 and a higher standard deviation of 1.31, showing variability in reviewers' perceptions. The design elements are well-received overall, with high scores for multimedia usage and structure (mean scores of 4.57 and 4.57) and low standard deviations (0.53), but ease of navigation (mean score 3.00, standard deviation 1.12) and clarity of instructions (mean score 3.71, standard deviation 0.95) require improvement. Consistent high scores for assessment alignment and engagement opportunities (mean scores of 4.25 and 4.25) demonstrate robust support for learning outcomes. The lesson is praised for its clarity, engagement, and structured content, but enhancing documentation and navigation, as indicated by the statistical data, will improve the user experience. Addressing these weaknesses will create a more comprehensive and user-friendly learning environment.

2.4: Develop and Do

The content in the "Develop and Do" lesson is generally clear and understandable, with mean scores of 3.43 to 3.71 and low standard deviations (0.79 to 1.70), indicating consistent reviewer feedback. Opportunities for active engagement are well-integrated and focused on learning objectives, scoring high across the board (mean of 4.50). However, the availability of sufficient help or documentation remains a weakness, as reflected by a low mean score of 2.50 and higher variability (standard deviation of 1.58). Design elements are highly effective, with multimedia use and structure scoring consistently high (mean of 4.00 to 4.57) and low standard deviations (0.79 to 1.00). Navigation ease, however, shows room for improvement, with a lower mean score of 2.75 and a higher standard deviation (1.32). Reviewers consistently agree that the lesson's visual design, layout, and instructional clarity enhance the learning experience, but more accessible help documentation and navigation improvements will further benefit users.

In the "Evaluate the Outcome" lesson, content is presented clearly and understandably, as reflected by a high mean score of 4.29 and a standard deviation of 0.95. Active engagement opportunities are highly effective, with perfect mean scores of 5.00 and zero variability. Sufficient help documentation is rated lower, with a mean of 3.50, indicating a need for improvement. Measurability of learning objectives is consistent, but only moderately rated at a mean of 3.00. While grammatical accuracy is generally good (mean of 3.25), the higher variability (standard deviation of 1.29) suggests inconsistent quality. Design-wise, multimedia use and lesson structure are rated highly (mean scores of 4.50 and 4.25, respectively), though visual design variability (mean of 3.50 and standard deviation of 0.87) indicates room for enhancement. Navigation and instructional clarity show mixed feedback, averaging 3.75 and 3.75, respectively.

2.5: Evaluate the Outcome

Overall, the lesson effectively engages learners and supports learning objectives, but improvements in help documentation and visual consistency could further enhance the user experience.


In the "Flexibility in Problem Solving" lesson, the content is generally clear and understandable, with a mean score of 3.86 and a standard deviation of 0.98, showing consistent positive feedback. Opportunities for active engagement are rated highly, with a mean score of 4.25, although help documentation needs improvement, indicated by a lower mean of 3.50. Measurability of learning objectives is stable but moderate, with a consistent mean of 3.00. Grammatical accuracy is the weakest area, with a mean score of 2.00 and no variability, suggesting a consistent issue. The assessment questions focusing on learning objectives and content support both score moderately, with mean scores of 3.50 and 4.25, respectively.

2.6: Flexibility in Problem Solving

Design elements are highly praised, particularly the use of multimedia and overall structure, both scoring a mean of 4.50. Visual design and navigation also receive positive feedback, averaging 4.25 and 3.75, respectively. The time estimate accuracy is slightly lower at 4.00, while instructional clarity and opportunities for interactive engagement are rated positively, with mean scores of 4.25 each. Overall, the lesson excels in engaging and visually appealing design, but addressing grammatical issues and enhancing help documentation could further improve its effectiveness.

3.1, Part 1: Assess the Problem In Depth

In the "Assess the Problem In Depth" lesson, the content is generally clear and understandable, with a mean score of 3.82 and a standard deviation of 1.01, indicating consistent positive feedback. Opportunities for active engagement are highly rated, with a mean score of 4.25, though help documentation is seen as somewhat variable, with a mean of 3.50 and a standard deviation of 1.08. The measurability of learning objectives is stable but moderate, consistently scoring 3.00. Grammatical accuracy is the weakest aspect, with a low mean score of 1.75 and minimal variability, suggesting a persistent issue. The assessment questions focusing on learning objectives and content support both score moderately, with mean scores of 4.25 each.

In terms of design, multimedia elements are effectively used, scoring a mean of 4.00. The overall structure and visual design are generally well-received, with mean scores of 3.00 and 4.00 respectively. Navigation ease is rated lower at 3.25, with significant variability. The accuracy of time estimates scores moderately, with a mean of 3.50, while clarity of instructions is a notable weakness, with a low mean score of 2.00. Opportunities for interactive engagement receive positive feedback, with a high mean score of 4.75. Overall, the lesson excels in engagement and multimedia usage, but addressing grammatical issues and improving instructional clarity could enhance its effectiveness.

3.1, Part 2: Creating a Problem Statement

The findings reveal that the lesson content and design generally meet the learning objectives with some areas for improvement. Content clarity scored highly, with a mean of 4.00 and a median of 5.00, indicating strong comprehension. Active engagement opportunities also received high marks, with a mean of 4.75, suggesting the lesson effectively supports interactive learning. However, the presence of grammatical or spelling errors was a noted issue, scoring a low mean of 2.25. In terms of design, the effective use of multimedia elements received a mean of 4.50, enhancing the learning experience. Navigation ease, however, had a lower mean of 2.75, indicating potential user experience issues. Overall, the lesson structure was deemed intuitive (mean of 3.25), but clarity of instructions could be improved, as reflected by a mean score of 2.00. These statistical data highlight the lesson’s strengths in engagement and multimedia use, while pointing out specific areas needing enhancement in grammatical accuracy and navigational ease.



3.2, Part 1: Effective Brainstorming Techniques

The findings from the evaluation of the "Effective Brainstorming Techniques" lesson indicate that the content is generally well-received, with clear presentation (mean of 4.25) and strong support for learning objectives through active engagement (mean of 4.25). However, the availability of sufficient help or documentation scored lower (mean of 2.50), indicating a need for more supportive resources. The lesson also faced issues with grammatical accuracy (mean of 2.25). In terms of design, the effective use of multimedia elements received high marks (mean of 4.75), enhancing the overall learning experience. The intuitive structure of the lesson and the visual design were positively rated, both with a mean of 4.25. However, navigation ease had a mean score of 3.00, suggesting room for improvement. Overall, the lesson’s design elements were well-regarded, but the content could benefit from additional documentation and error-free presentation to further enhance effectiveness.

3.2, Part 2: Considering Inaction as a Viable Option

The findings from the evaluation of the lesson "Considering Inaction as a Viable Option" highlight that while the content is generally clear and understandable (mean of 3.67), there are significant areas for improvement. The opportunities for active engagement were highly rated (mean of 4.67), indicating that interactive elements effectively support the learning objectives. However, the availability of sufficient help or documentation scored lower (mean of 2.67), and grammatical accuracy was a concern with a mean of 2.00. In terms of design, the use of multimedia elements received excellent feedback (mean of 5.00), enhancing the learning experience. The overall structure and visual design were well-regarded, both with a mean of 3.67, but navigation ease had a lower mean of 3.00, suggesting some usability issues. Additionally, the clarity of instructions varied (mean of 2.67), indicating room for improvement in preventing confusion. Overall, while the lesson excels in engagement and multimedia use, it would benefit from improved documentation, error-free content, and more intuitive navigation.

3.3: Make Informed Choices

The findings from the evaluation of the "Make Informed Choices" lesson indicate that the content is generally clear and understandable (mean of 3.71). Opportunities for active engagement were consistently rated positively (mean of 4.00), highlighting effective interactive elements. However, the availability of sufficient help or documentation received a lower score (mean of 3.00), and grammatical accuracy also needs improvement, with a mean score of 2.00. The design aspects of the lesson were highly rated, particularly the use of multimedia elements (mean of 4.50) and the overall structure (mean of 4.00), which were found to be intuitive and visually appealing. Navigation ease had a slightly lower mean of 3.50, suggesting some usability concerns. Additionally, clarity of instructions varied (mean of 2.50), indicating a need for clearer guidance. Overall, the lesson excels in engagement and design but could benefit from enhanced documentation and grammatical accuracy to further improve the learning experience.


3.4: Detailed Planning and Execution

The evaluation of the "Detailed Planning and Execution" lesson reveals that the content is generally well-received for clarity and engagement, with mean scores of 3.71 and 4.33, respectively. However, there is a notable need for better support documentation (mean of 3.00) and grammatical accuracy (mean of 1.67). The lesson's design is praised for its effective use of multimedia elements (mean of 4.50) and overall intuitive structure (mean of 3.67), which enhances the learning experience. Navigation ease scored a mean of 3.50, indicating some room for improvement in user experience. Additionally, while the lesson's visual design was appealing (mean of 4.25), the clarity of instructions varied (mean of 2.50), suggesting a need for clearer guidance. Overall, the lesson is strong in engagement and multimedia use but could benefit from improved documentation, grammatical accuracy, and clearer instructions to further enhance the learning experience.

Appendix E. Recommendations

Lesson 2.1: Assess the Problem

• Conflicting numbering: the slide says "Welcome to Lesson 3.1" but it should be 2.1.

• The "Select each tab below for more information" is a non-existing feature.

2.1 Slide 3


• The signal principle is used effectively.

• Recommendation: The color contrast could be enhanced for the signal words.

2.1 Slide 6

• This video explains how to go to the next slide. Recommendation: If the down arrow had more contrast, there would be no need to show this video.

• The screencast is somewhat busy/distracting.

2.1 Slide 6, at the end of the tutorial.


• Aesthetics: Alignment and font size.

• Instruction on "click continue…" could be closer to Ruth to avoid clicking the arrow and miss the assess the problem information.

2.1 Slide 7

• Spatial contiguity could be enhanced by moving the options close to the question.

• Aesthetics Recommendation: Make answer boxes the same size.

2.1 Slide 7


• Recommendation: Remove the continue button and add the next step as part of the conversation; for example, Ruth may say "ok, here is a summary of all steps" and show the list. Having continue next to the down arrow could send the user to the wrong slide.

2.1 Slide 7

Recommendation: Remove "at once" from title.

2.1 Slide 7


• The back arrow takes you out of the conversation in slide 7 into slide 6.

• Recommendation: There is an opportunity to include a practice here.

• Recommendation: Make font sizes more homogeneous.

• Design feels inconsistent (including font sizes).

• Punctuation could be modified to make the computer voice sound better.

2.1 Slide 7

Lesson 2.2: Brainstorm Solutions

• This is a better way to give the instruction to select the bottom right arrow to continue, compared to the one used in 2.1.

• Design feels inconsistent.

• Font size is better in 2.2.

2.2

• The A-B-C-D-E-F concept was never described before.


• Recommendation: Rephrase to “They might help you overcome those barriers.”

2.2

• Recommendation: Consider adjusting the ratio of fonts in boxes 1 and 2. The font is too small.

2.2


• Recommendation: Change punctuation to make it sound better.

2.2

• Recommendation: Consider narrowing the space between the person and the text.

2.2


• Make complete sentences.

• Recommendation: Include the word "ideas" in step 1: "List as many IDEAS as you can."

2.2


Lesson 2.3: Consider and Choose

• The font is bigger on the 3rd option.

2.3

• Recommendation: Consider rephrasing the text: “Jane is an individual with TBI. She feels frustrated when things are out of her control or schedule. Let's hear her story.”

2.3 Video in slide 10.


• Recommendation: Consider rephrasing the text for the computer voice to sound more natural: “Yesterday, I felt accomplished because I completed many tasks from my checklist. I enjoy having this feeling throughout my day.”

Video in Lesson 2.3 slide 10.

• The pop-up window feels inconsistent with the previous visual design.

• Recommendation: Consider adding a transparency banner similar to the one used in the previous slide to maintain consistency.

Video in Lesson 2.3 slide 10.


• Recommendation: Consider rephrasing.

• Make complete sentences by adding a period at the end of each sentence.

2.3

• Typo: change "sped" to "spend."

• Add periods at the end of each sentence.

2.3


• Recommendation: Consider rephrasing to “I will have less stress and improve my well-being.” and “I will have to accept the fact that it might not happen every day.”

• Recommendation: Rephrase #5 to: "Is it difficult to do it?".


• Recommendation: “Congratulations! You earned a badge for Lesson 2.3!”

• Consider rephrasing: "you are able" to "In this lesson, you learned to:"

Lesson 2.4: Develop and Do

• Recommendation: Use the same size as in the previous introductory slides.

2.4

• The first screenshot is the end of Lesson 2.3, where it states that Lesson 2.4 is named "Evaluate the Outcome," but the name of Lesson 2.4 is "Develop and Do."

• Recommendation: Title Case for titles.

2.4


• Recommendation: Rephrase to “Think about what the outcomes will be if you achieve your goal fully, partially, or not at all."

• Incomplete sentences.

2.4

• Step 2 has the wrong title.

• Recommendation: Rephrase the step title to "How to Develop and Execute."

2.4


• Select (the) Continue button below… or Select “continue” to talk with Ruth…

2.4

• Incomplete sentences. Recommendation: Rephrase them.

2.4


• Recommendation: Provide the option to go back.

• Recommendation: Rephrase "Does it difficult to do it?" to "Does it seem difficult to do?" or "Is it difficult to do?"

2.4

• Recommendation: Move the note pad and person to the left to unclutter the right side.

2.4


• Recommendation: Title Case for “How to Evaluates the Outcome.”

2.4

• Recommendation: Title Case for Learning Assessment.

• Typo on 3rd black box: Explain how he will determine if "the"…

2.4


Lesson 2.5: Evaluate the Outcome

• Recommendation: Here is an opportunity for recognition rather than recall: perhaps show her quote, or simply remove the sentence that mentions the quote. Based on Nielsen's 10 usability heuristics (recognition rather than recall).

2.5

• The steps seem to be in the incorrect order; step 1 should be step 3.

• Recommendation: Title case.

2.5 Slide 6.


• Recommendation: Increase the font size.

2.5 Slide 7

• Recommendation: Rephrase the sentence; remove "that" from "So, I told my colleagues that…"

2.5


• Recommendation: Keep the design of the pop-up boxes consistent.

2.5 Story in slide 7


• Recommendation: Increase the font size; in the current example, it is too small.

2.5 Story in slide 7


Lesson 2.6: Flexibility in Problem-Solving

• Recommendation: Change "flex" to "flexible."

2.6

• Recommendation: Add “the” to all mid-banners in slide 6, or remove “button.”

2.6


• Recommendation: Increase the font size.

2.6 Slide 8

• Recommendation: Rephrase to "It's really good to let Jeremy keep trying to figure out how to handle his emotions on his own. But what if he's still having a tough time with it?"

2.6 Slide 8


• Aesthetically, there is a lot of white space underneath the character. Recommendation: Move the character down so that her head is not cut off, creating a better sense of space.


Lesson 3.1 Part 1: Assess the Problem in Depth Using the "5 W’s"

• The first screenshot is from Lesson 3.1 slide 7 and the second screenshot is from Lesson 2.5 slide 7. Design consistency is affected in these slides: the player and the back and next buttons are different.

• Recommendation: Maintain player design consistency.

3.1 slide 7


• Recommendation: Add word “a” to “I'm going to introduce to you the importance of a detailed problem statement.”

3.1.1

• Player design is not consistent.

• The player is missing the back and forward arrows.

3.1.1 Slide 7


• Next button does not work.

3.1.1 Slide 12

• Recommendation: Align text inside brown boxes.

3.1.1 Slide 13


Lesson 3.1 Part 2: Creating a Problem Statement

• The first screenshot above is from Lesson 3.1 Part 2, compared to the bottom screenshot from 3.1 Part 1. The slide sidebar is no longer part of the design. This bar is a great addition, and we recommend keeping it: it allows the user to navigate freely throughout the module and informs the user how much has been completed or is left to complete.

• Also, the up and down arrows were removed.

• All subsequent modules and slides from here on have omitted the bar and the up and down arrows.

3.1, Part 2


• This should read "Welcome to Lesson 3.1 Part 2," not "Part 1."

3.1, Part 2

• Recommendation: Remove the 'Select Next' tab, as it creates a cluttered design, especially since the 'NEXT>' option is already available just below.

2.1 Slide 7


• The arrows are no longer there; therefore, the video showing how to advance with the arrow is irrelevant. Recommendation: Add the arrow back, as scrolling down does not always work.

3.1 Part 2

• The font is too small and difficult to read. Recommendation: Remove or hide Ruth to make room for the activity information.

3.1 Part 2


• The initial design of this screen is not consistent with previous ones. Second screenshot.

• Recommendation: Title case for this title. First screenshot

3.1 Part 2


• Recommendation: Rephrase to "Great job so far; you have thoroughly assessed the problem. Now, let's help others assess their problems."

3.1 Part 2


• Aesthetics: Alignment and font size.

• Instruction on "click continue…" could be closer to Ruth to avoid clicking the arrow and miss the assess the problem information.

3.1 Part 2


• In 3.1 Part 2 the font size is not consistent; compare the font in 2.1 to the font in 3.1 Part 2.

3.1 Part 2


Lesson 3.2 Part 1: Effective Brainstorming Techniques

• Add "s" to Technique_.

3.2. Part 1

• Recommendation: "There is a community always ready to support you" or "There is a community that is always ready to support you."

3.2. Part 1


• Add "s" to Technique_

3.2. Part 1

• Recommendation: Slow down pace of video.

3.2. Part 1


• Recommendation: Add "s" to ones: in Result 1: "Carefully consider the pros and cons of each solution you come up with so that you can save time by not including unhelpful ones later.

3.2. Part 1

• Recommendation: Add "s" to Technique_.

3.2. Part 1


Lesson 3.2 Part 2: Considering Inaction as a Viable Option

• Recommendation: Should the title be "Doing Nothing Strategy"? See Red box.

3.2 Part 2

• Recommendation: Center the top text box above the small boxes and match the font size of the small text boxes.

3.2 Part 2


• Recommendation: Enable the previous button.

3.2 Part 2


Lesson 3.3: Make Informed Choices

• Recommendation: Add: "an" In this lesson, you will learn how to make an informed choice by:


• Recommendation: Rephrase to "You have completed this section."

• Recommendation: Enlarge the font of the information shown in the red box.


• Recommendation: Rephrase to “Making an Informed Choice” or “Making Informed Choices.”

Lesson 3.4: Detailed Planning and Execution

• Recommendation: Verify whether the intent is to have the questions (in the red box) come from the user and not from Taylor.


• Recommendation:… will you provide to him?

• Recommendation: "Take it easy; you may try more than once to solve the problem."

3.4

Recommended rephrasing:

#1: "Get details and organize your thoughts as much as possible."

#2: "Be concrete with the plan for you to follow through."

#3: "Decide what could happen based on the achievement of your goal." or "Decide what will happen based on the achievement of your goal."

#4: "Try it once you have finished the plan of action."

3.4

Appendix F. ePST Mobile Evaluation Form
