
Testing & QA of e-learning experiences across multiple platforms – Making remote learning effective


We learn every day, and eLearning is an engaging way to reinforce that learning in a self-paced, easy-to-access, pick-up-where-you-left-off mode. We have come a long way from attending training sessions at fixed times with assigned instructors to switching content providers within minutes. With attention spans down to a few seconds, it is imperative to keep the audience engaged and encouraged to complete each learning block. As the technology continues to evolve, we are in an interesting phase of eLearning that spans different learning styles (auditory, visual, kinesthetic).

Qapitol QA has had the opportunity to work with several leading e-learning organizations as their platform testing partner. We employed creative experiential testing techniques and implemented automated solutions to add velocity to product releases. Some of the requirements, solutions and learnings are outlined here.

Key challenges and solutions with eLearning platforms

Learning without restrictions – Compatibility testing to the rescue

Compatibility across a variety of devices, platforms and customized content is a real challenge for many eLearning platforms. With the audience accessing the same content from multiple devices and under varying network conditions, testing on the right set of devices is essential.
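One practical way to keep that device matrix manageable is to generate the combinations programmatically and prune them against usage data. The sketch below is a minimal illustration; the device names, usage shares and the 10% cutoff are assumptions for the example, not figures from our engagements:

```python
from itertools import product

# Illustrative usage-share data (assumed values, not real analytics)
device_share = {"Pixel 7": 0.18, "Galaxy S22": 0.22, "iPhone 13": 0.35, "Moto G Play": 0.05}
os_versions = ["Android 13", "Android 12", "iOS 16"]
networks = ["wifi", "4g", "3g"]

# Which OS versions each device can actually run (assumed pairing rules)
compatible = {
    "Pixel 7": {"Android 13", "Android 12"},
    "Galaxy S22": {"Android 13", "Android 12"},
    "iPhone 13": {"iOS 16"},
    "Moto G Play": {"Android 12"},
}

def build_matrix(min_share: float = 0.10) -> list:
    """Keep device/OS/network combos where the device clears a usage-share cutoff."""
    return [
        (dev, osv, net)
        for dev, osv, net in product(device_share, os_versions, networks)
        if osv in compatible[dev] and device_share[dev] >= min_share
    ]

matrix = build_matrix()
print(f"{len(matrix)} configurations to test")  # prints "15 configurations to test"
```

Pruning by usage share keeps the suite focused on the configurations most of the audience actually uses, while the compatibility map prevents impossible pairs (an iPhone on Android) from inflating the run.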

Look, there is a mistake in the lesson itself! – Functional testing

Even a small functional issue is a big NO in the education sector, where the trust placed in authentic, useful content is very high. Any miss there erodes brand value immediately.

It just keeps loading – Performance tests are a must

As the customer base keeps growing, performance tests are no longer a luxury. They are a must-have in your test suite to ensure that you continue to scale without issues.
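At its simplest, a load check fires concurrent requests and looks at tail latency rather than the average, since a slow 95th percentile is what learners experience as "it just keeps loading". A minimal Python sketch follows; `fetch_lesson` is a stub standing in for a real HTTP call, and the user count is an illustrative assumption:

```python
import time
from concurrent.futures import ThreadPoolExecutor
from statistics import quantiles

def fetch_lesson(lesson_id: int) -> float:
    """Stub for a real HTTP request; returns elapsed seconds for one call."""
    start = time.perf_counter()
    time.sleep(0.01)  # stand-in for network plus server time
    return time.perf_counter() - start

def run_load(users: int = 50) -> dict:
    """Simulate `users` concurrent learners and report the p95 latency."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(fetch_lesson, range(users)))
    # quantiles(n=100) yields 99 cut points; index 94 is the 95th percentile
    p95 = quantiles(latencies, n=100)[94]
    return {"requests": len(latencies), "p95_s": p95}

result = run_load()
print(result)
```

In a real suite the stub would be replaced by an HTTP client call and the run driven by a dedicated load tool; the point of the sketch is the shape of the check: concurrency plus a percentile threshold, not an average.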

Mobile App Testing

Almost everyone has a smartphone, and testing across a variety of manufacturers, OS versions and screen sizes is a problem every business with a mobile app in the market has to solve. eLearning is no exception.
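In practice, each device/OS combination in the pool becomes a capability set handed to the automation driver. The shape below follows W3C WebDriver/Appium conventions (vendor-prefixed `appium:` keys), but the device list and app paths are illustrative assumptions:

```python
# Build per-device capability dicts for a driver such as Appium (illustrative values)
device_pool = [
    ("Android", "Pixel 7", "13"),
    ("Android", "Galaxy A14", "13"),
    ("iOS", "iPhone 13", "16.4"),
]

def make_caps(platform: str, device: str, version: str) -> dict:
    """One capability set per device; keys follow the W3C vendor-prefix convention."""
    app = "/builds/app.apk" if platform == "Android" else "/builds/app.ipa"  # assumed paths
    return {
        "platformName": platform,
        "appium:deviceName": device,
        "appium:platformVersion": version,
        "appium:app": app,
    }

capability_sets = [make_caps(*entry) for entry in device_pool]
for caps in capability_sets:
    print(caps["appium:deviceName"], caps["appium:platformVersion"])
```

Generating the capability sets from one pool definition keeps the device list in a single place, so adding a manufacturer or OS version is a one-line change rather than an edit in every test.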

Automation in Testing

Some of the key challenges in eLearning testing, such as test data setup and executing cases across a variety of students, devices and modules, can be solved by automating the checks. When new code is added, you can be confident that regression is taken care of by the automated checks.
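The repetitive pattern described above, the same rule checked across many student/module combinations, is exactly what lends itself to automation: generate the data, loop the check. A sketch with plain Python assertions; the pass-mark rule and its values are hypothetical, not from any client platform:

```python
# Hypothetical business rule under regression: pass mark varies by difficulty
PASS_MARK = {"easy": 0.5, "medium": 0.6, "hard": 0.7}

def has_passed(score: float, difficulty: str) -> bool:
    return score >= PASS_MARK[difficulty]

# Generated test data: every difficulty crossed with boundary scores
cases = []
for difficulty, mark in PASS_MARK.items():
    cases += [
        (mark - 0.01, difficulty, False),  # just below the cutoff
        (mark, difficulty, True),          # exactly on the cutoff
        (mark + 0.01, difficulty, True),   # just above
    ]

failures = [(s, d) for s, d, expected in cases if has_passed(s, d) != expected]
print(f"{len(cases)} checks, {len(failures)} failures")
```

In a real suite the same table would typically feed a parametrized test runner such as pytest, so that each combination reports as its own pass or fail; the boundary-value pattern (just below, on, just above each cutoff) is what catches off-by-one regressions when the rule changes.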

Qapitol QA eLearning Testing Expertise

Our testing teams are experienced in testing for key clients in the education sector. Working with educational content for students across age groups and grades, the teams understand the importance of age-appropriate content and difficulty levels. End-to-end testing of the student journey across learning plans on a variety of platforms has added significant value to these platforms.

Attention to detail, setting up the right test data, automating the checks best suited to automation, executing test cases across platforms and configurations, and ensuring that load does not affect the APIs or the UI layer have been a rich experience for the teams.

Overview of the different types of testing reports

The purpose of testing reports is to help stakeholders make informed decisions about their products and projects. A test report is complete only with the accompanying story behind the numbers. As numbers alone can be misinterpreted, we at Qapitol QA take care to provide the story along with the numbers in every report.

Following are some of the key test reports generated as part of our testing activities during the different stages:

  • Requirement Analysis: A questionnaire and an understanding document are shared with the stakeholders so that everyone is on the same page.
  • Daily Plan and Status Update: This is communicated via the medium set up for communication with the development team and POC. The update is shared at the end of the day with a list of tasks accomplished during the day, the plan for the next day and any artefacts (bug reports, questionnaires, test reports) created during the testing sessions.
  • Feature Listing: Every feature is listed in a spreadsheet or mind map along with the variables involved so that the feature mapping can be done in a systematic way.
  • Test Cases: Based on the feature, the smoke and functional suites of test cases are written, reviewed internally and then shared with the stakeholders for their review. Depending on the context, test cases are also written for other quality criteria.
  • Bug Reports: A consolidated list of bugs across features is shared, filtered by severity and the current state (open, reopened, closed, to be verified) of each bug.
  • Feature Test Reports: Every feature tested has a report generated by the testing team, often similar to a test case execution report. It also collects the reports generated during the testing sessions, along with tracking of the builds on which the tests were executed.
  • Test Data Sheets: We also track and document the test data used across a variety of tests, including functional, automation, performance and other quality criteria.
  • Weekly Testing Summary: This gives a summary of the testing activities conducted throughout the week, highlighting task status, impediments and action points for the stakeholders' perusal.

Write to [email protected] for a detailed case study.

Access our e-learning testing talent, automation tools and quality engineering expertise.

Author: Qapitol QA