Published in

2009 Ninth International Conference on Quality Software

DOI: 10.1109/qsic.2009.45

Evaluating the Ability of Novice Analysts to Understand Requirements Models

Proceedings article published in 2009 by Silvia Abrahão, Emilio Insfrán, José A. Carsí, Marcela Genero, Mario Piattini
This paper is available in a repository.


Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

This paper is aimed at evaluating the ability of novice analysts to understand models specified using a RUP extension for modeling requirements. The evaluation is guided by a theoretical model for IS design methods, the Method Evaluation Model (MEM). In this work, we present the empirical testing of the MEM in the evaluation of a RUP extension for modeling requirements. The testing was conducted through an experiment with 39 novice users, whose primary goal was to assess their ability to understand requirements models. The results provide a strong indication that our RUP extension is indeed both easy to use and useful, and that participants intend to use the method in the future.