Supporting material

Incorporating Decision Maker Preferences in a Multiobjective Approach for Software Release Planning


Raphael Saraiva, Allysson Allex Araújo, Altino Dantas, Italo Yeltsin and Jerffeson Souza

Optimization in Software Engineering Group (GOES.UECE) | State University of Ceara - Brazil

1 - Abstract

Background: Release Planning (RP) is one of the most complex and relevant activities in iterative and incremental software development, because it addresses all decisions related to the selection and assignment of requirements to releases. A number of existing approaches are based on the belief that RP can be formalized as an optimization problem. In this context, Search Based Software Engineering (SBSE) proposes applying search techniques to solve complex software engineering problems. Since RP is an ill-defined problem with a large reliance on human intuition, the preferences and engagement of the Decision Maker (DM) are key factors in the resolution process. Thus, we emphasize the importance of gathering human preferences and, consequently, guiding the search process toward solutions that are also subjectively valuable to the DM.

Methods: In this paper, we explore a multi-objective approach which considers the human preferences as another objective to be maximized, alongside maximizing overall client satisfaction and minimizing project risk. Basically, the DM defines a set of preferences about the requirements allocation, which are stored in a Preferences Base responsible for influencing the search process. The empirical study was divided into two experiments, respectively named (a) Artificial Experiment and (b) Participant-based Experiment. The first aims to analyze the approach using different search-based algorithms (NSGA-II, MOCell, IBEA and SPEA-II) over artificial and real-world instances, while the second aims at evaluating the proposal in a scenario composed of real human evaluations.
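To illustrate the idea of treating the DM's preferences as a third objective, the sketch below shows one possible encoding. All function and variable names here are assumptions for illustration, not the paper's actual formulation: a solution assigns each requirement to a release (0 = not allocated), and the Preferences Base is modeled as a list of predicates over the allocation.

```python
# Hypothetical sketch of the three objectives described above; the exact
# formulations used in the paper may differ.

def client_satisfaction(solution, importances, n_releases):
    """Sum of requirement importances, weighted so that allocation to
    earlier releases contributes more (to be maximized)."""
    return sum(imp * (n_releases - release + 1)
               for release, imp in zip(solution, importances)
               if release > 0)

def project_risk(solution, risks, n_releases):
    """Riskier requirements placed in earlier releases increase this
    objective (to be minimized)."""
    return sum(r * (n_releases - release + 1)
               for release, r in zip(solution, risks)
               if release > 0)

def preference_score(solution, preferences_base):
    """Fraction of DM preferences attended by the allocation (to be
    maximized). Each preference is a predicate over the solution,
    e.g. 'requirement 0 must be in release 1'."""
    if not preferences_base:
        return 1.0
    attended = sum(1 for pref in preferences_base if pref(solution))
    return attended / len(preferences_base)

# Example: 4 requirements, 2 releases; the DM prefers requirement 0
# in release 1 and requirement 3 left out of the plan.
solution = [1, 2, 0, 0]
prefs = [lambda s: s[0] == 1, lambda s: s[3] == 0]
print(preference_score(solution, prefs))  # → 1.0
```

A multi-objective algorithm such as NSGA-II would then evolve a population of such allocations, optimizing the three objectives simultaneously.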

Results: The Artificial Experiment points out that NSGA-II obtained overall superiority in two of the three datasets investigated, positioning itself as the superior search technique for smaller scenarios, while IBEA showed to be better for larger ones. Regarding the Participant-based Experiment, two thirds of the participants evaluated the preference-based solution as better than the non-preference-based one.

Conclusions: Given the achieved results, we may assume that it is feasible to investigate the approach in a real-world scenario. In addition, we made available a prototype release planning tool able to incorporate the human preferences about the requirements allocation into the final optimized solution.

Keywords: Release Planning, Genetic Algorithm, Human Preferences, Search Based Software Engineering.


2 - Instances

Three instances were used in this experiment, respectively named dataset-1, dataset-2 and dataset-3. Both dataset-1 and dataset-2 are based on real-world data, while dataset-3 was generated artificially with 600 requirements and 5 clients.

Instance name Download View
Dataset-1 Download View
Dataset-2 Download View
Dataset-3 Download View

3 - Source Code

For the artificial experiment (RQ1), a simple version of the algorithms was used to evaluate the behavior of the multiobjective metaheuristics:

Source Code Algorithm

For the experiment with human participants (RQ2 and RQ3), a version with a graphical interface was used to evaluate the behavior of the approach in a real context:

Source Code Tool


4 - Empirical Studies

Three research questions were designed to assess and analyze the behavior of our approach. Due to the article's space limitations, some information regarding RQ1 and RQ2 could not be presented there. Therefore, extra data are shown below for a better understanding of the approach.

RQ1: Which search-based techniques, among the evaluated ones, produce better solutions?

The results produced by all algorithms for each instance and scenario presented in the article are available below.

Instance name HighPrefs LowPrefs
Dataset-1 Download Download
Dataset-2 Download Download
Dataset-3 Download Download

The table below shows, for dataset-1, the average result achieved by each algorithm in the Hypervolume (HV), Spread (SP) and Generational Distance (GD) metrics for the lowPrefs and highPrefs scenarios.

Dataset-1 (lowPrefs)
Metric   NSGA-II   MOCell    SPEA-II   IBEA      Random
HV       0.83740   0.83619   0.82953   0.82953   0.22143
SP       0.78020   0.65790   0.65790   0.63950   0.65681
GD       0.00127   0.00147   0.00411   0.00053   0.11061

Dataset-1 (highPrefs)
Metric   NSGA-II   MOCell    SPEA-II   IBEA      Random
HV       0.72371   0.70914   0.71869   0.64942   0.13676
SP       0.66785   0.64870   0.41861   1.18683   0.62849
GD       0.00180   0.00243   0.00164   0.00203   0.08058

The table below shows, for dataset-2, the average result achieved by each algorithm in the Hypervolume (HV), Spread (SP) and Generational Distance (GD) metrics for the lowPrefs and highPrefs scenarios.

Dataset-2 (lowPrefs)
Metric   NSGA-II   MOCell    SPEA-II   IBEA      Random
HV       0.84700   0.84238   0.84585   0.81186   0.26363
SP       0.99211   0.73748   0.78670   1.44466   0.59559
GD       0.00124   0.00147   0.00102   0.00034   0.09962

Dataset-2 (highPrefs)
Metric   NSGA-II   MOCell    SPEA-II   IBEA      Random
HV       0.76198   0.75858   0.75930   0.70028   0.16678
SP       0.74885   0.63225   0.46772   1.45275   0.58432
GD       0.00189   0.00212   0.00162   0.00175   0.08700

The table below shows, for dataset-3, the average result achieved by each algorithm in the Hypervolume (HV), Spread (SP) and Generational Distance (GD) metrics for the lowPrefs and highPrefs scenarios.

Dataset-3 (lowPrefs)
Metric   NSGA-II   MOCell    SPEA-II   IBEA      Random
HV       0.61776   0.51512   0.61163   0.65923   0.03241
SP       0.63511   0.64056   0.54903   0.63064   0.69089
GD       0.01290   0.02203   0.01336   0.00365   0.29462

Dataset-3 (highPrefs)
Metric   NSGA-II   MOCell    SPEA-II   IBEA      Random
HV       0.52480   0.42575   0.51901   0.59654   0.01978
SP       0.65354   0.65738   0.50282   0.63285   0.70770
GD       0.01132   0.01931   0.01045   0.00362   0.25938
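For reference, Hypervolume measures the region of objective space dominated by a front relative to a reference point (higher is better). The published values were presumably computed with an established framework; the sketch below is only a minimal two-dimensional illustration, assuming both objectives are maximized, normalized to [0, 1], and the reference point is the origin.

```python
# Minimal 2-D hypervolume sketch (both objectives maximized). Assumes the
# input front is already non-dominated; not the implementation used in the
# experiments.

def hypervolume_2d(front, ref=(0.0, 0.0)):
    """Area dominated by a non-dominated set of points, w.r.t. ref."""
    # Sort by the first objective descending; since the front is
    # non-dominated, the second objective then increases monotonically,
    # so the sweep adds disjoint rectangles.
    pts = sorted(front, key=lambda p: p[0], reverse=True)
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y > prev_y:
            hv += (x - ref[0]) * (y - prev_y)
            prev_y = y
    return hv

front = [(0.9, 0.2), (0.6, 0.6), (0.2, 0.9)]
print(hypervolume_2d(front))  # ≈ 0.48
```

Spread (SP) instead measures how evenly the solutions are distributed along the front (lower is better), and Generational Distance (GD) measures the average distance from the obtained front to the reference front (lower is better).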

RQ2: What is the subjective benefit when considering the DM’s preferences as an objective to be optimized?

In order to evaluate the subjective benefit, we analyzed the subjective evaluations obtained during the second and third stages of the participant-based experiment. The table below shows, for each participant, the objective values and the subjective evaluation attributed to the final solution presented.

Participant   Second Stage (Non-Interactive)   Third Stage (Interactive)
              Value  Satisfaction  Risk        Value  Satisfaction  Risk  Preferences  Nº of Attended Preferences
#1 75 22896 264 75 17993 222 0.80 5
#2 25 22945 284 50 17020 169 0.50 6
#3 75 21116 221 25 18665 164 0.00 0
#4 75 22932 279 100 16014 117 1.00 4
#5 50 23686 372 75 23903 357 0.56 3
#6 25 24164 338 50 24495 534 1.00 6
#7 75 22714 286 75 20186 187 1.00 3
#8 25 23601 303 100 17206 166 1.00 2
#9 25 21625 237 50 22112 496 0.68 13
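The "two thirds" result reported in the abstract can be reproduced directly from the Value columns of the table above, comparing each participant's evaluation of the interactive (preference-based) solution against the non-interactive one:

```python
# Subjective evaluations copied from the table above.
second_stage = [75, 25, 75, 75, 50, 25, 75, 25, 25]    # non-interactive
third_stage  = [75, 50, 25, 100, 75, 50, 75, 100, 50]  # interactive

better = sum(1 for a, b in zip(second_stage, third_stage) if b > a)
print(better, "of", len(second_stage))  # → 6 of 9
```

Six of the nine participants (two thirds) rated the preference-based solution higher; two rated both solutions equally, and one rated the non-interactive solution higher.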

January 2017 (last update: January 2017)