Crowd sourced evaluation process for mobile learning application quality

Asharul Islam Khan, Zuhoor Al-Khanjari, Mohamed Sarrab

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Crowd sourcing is a novel method for requirements elicitation, development, testing, and evaluation of software in a dynamic environment. Crowd sourced evaluation is a new technique that overcomes the limitations of traditional methods, which usually require the co-presence of stakeholders during software quality evaluation. Learning with mobile devices takes place in a dynamic and heterogeneous environment where learners have their own learning styles and needs. The distinguishing characteristic of Mobile Learning (M-Learning) is accessibility anyplace, anytime, and by anyone. Therefore, evaluating M-Learning application quality with traditional techniques such as interviews is tedious, costly, and time consuming. Hence, this article proposes a Crowd sourced evaluation process for M-Learning application quality. The process comprises four steps: analysis and classification of the M-Learning application, definition of a customized quality standard based on the application category, categorization of the Crowd (e.g., end users and the application development team), and, lastly, use of Crowd sourcing platforms. The main idea behind the process is the involvement of end users in the assessment of M-Learning application quality rather than merely inviting experts to evaluate the application. The proposed approach is theoretical in nature and is based on findings from the existing literature on Crowd sourcing, M-Learning, evaluation of M-Learning applications, and software quality. The proposed approach will be implemented in future work to test its feasibility in real settings.
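
For illustration only, the sketch below models the abstract's four-step process as a simple data pipeline: classify the application, pick a category-specific quality standard, categorize the crowd, and dispatch the task to a crowd sourcing platform. Every class, field, and example value is a hypothetical assumption and is not taken from the paper.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of the four-step Crowd sourced evaluation process.
# Names and values are illustrative assumptions, not the authors' design.

@dataclass
class MLearningApp:
    name: str
    category: str          # Step 1: outcome of analysis and classification

@dataclass
class QualityStandard:
    category: str
    criteria: List[str]    # Step 2: customized quality criteria per category

@dataclass
class CrowdMember:
    identifier: str
    role: str              # Step 3: e.g. "end user" or "development team"

@dataclass
class EvaluationTask:
    app: MLearningApp
    standard: QualityStandard
    crowd: List[CrowdMember]
    platform: str          # Step 4: the crowd sourcing platform used

def build_evaluation(app: MLearningApp,
                     standards: List[QualityStandard],
                     crowd: List[CrowdMember],
                     platform: str) -> EvaluationTask:
    """Assemble an evaluation task by matching the app's category to a standard."""
    standard = next(s for s in standards if s.category == app.category)
    return EvaluationTask(app=app, standard=standard, crowd=crowd, platform=platform)

if __name__ == "__main__":
    app = MLearningApp(name="VocabTrainer", category="language learning")
    standards = [QualityStandard(category="language learning",
                                 criteria=["usability", "content accuracy", "offline access"])]
    crowd = [CrowdMember("u1", "end user"), CrowdMember("d1", "development team")]
    task = build_evaluation(app, standards, crowd, platform="generic crowd platform")
    print(task.standard.criteria)
```

The point of the sketch is simply that end users appear in the crowd alongside the development team, reflecting the paper's emphasis on user involvement rather than expert-only evaluation.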

Original language: English
Title of host publication: Proceedings - 2017 2nd International Conference on Information Systems Engineering, ICISE 2017
Editors: Houssain Kettani, Chen-Huei Chou
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-5
Number of pages: 5
ISBN (Electronic): 9781509048793
DOIs
Publication status: Published - Dec 27 2017
Event: 2nd International Conference on Information Systems Engineering, ICISE 2017 - Charleston, United States
Duration: Apr 1 2017 - Apr 3 2017

Publication series

Name: Proceedings - 2017 2nd International Conference on Information Systems Engineering, ICISE 2017
Volume: 2017-January

Other

Other: 2nd International Conference on Information Systems Engineering, ICISE 2017
Country/Territory: United States
City: Charleston
Period: 4/1/17 - 4/3/17

Keywords

  • Crowd sourcing
  • crowd sourced evaluation
  • mobile learning quality
  • mobile learning
  • mobile learning application quality

ASJC Scopus subject areas

  • Hardware and Architecture
  • Information Systems
  • Computer Networks and Communications
