
quasi-Systematic Review – Simulation Studies in Software Engineering

COPPE/PESC

Breno Bernard Nicolau de França
Guilherme Horta Travassos

January 2011


Contents

1 INTRODUCTION

2 QUASI-SYSTEMATIC REVIEW
2.1 Research Question Formulation
2.1.1 Question Focus
2.1.2 Question Quality and Breadth
2.2 Source Selection
2.2.1 Definition of Source Selection Criteria
2.2.2 Studies Language
2.2.3 Search String
2.2.4 Source Identification
2.2.5 Source Selection after Evaluation
2.3 Study Selection
2.3.1 Study Definition
2.3.2 Selection Execution
2.4 Information Extraction
2.4.1 Definition of Information Inclusion and Exclusion Criteria
2.4.2 Information Extraction Form
2.4.3 Extraction Execution
2.4.4 Resolution of Divergences among Reviewers
2.5 Results Summarization
2.5.1 Results Presentation in Tables
2.5.2 Sensitivity Analysis
2.5.3 Plotting
2.5.4 Final Remarks

3 RESULTS ANALYSIS
3.1 Simulation Approaches
3.2 Software Engineering Domains in Simulation Studies
3.3 Simulation Tools for Software Engineering
3.4 Characteristics of Simulation Models
3.5 Verification and Validation (V&V) Procedures for Simulation Models
3.6 Simulation Output Analysis
3.7 Study Strategies involving Simulation

4 CONCLUSIONS
4.1 Threats to Validity
4.2 Open Questions
4.3 State of the Art and Future Directions

5 REFERENCES


1 Introduction

The term simulation can be defined as "the imitation of the operation of a real-world process or system over time. Simulation involves the generation of an artificial history of the system and the observation of that artificial history to draw inferences concerning the operating characteristics of the real system that is being represented" (BANKS, 1999). Computer simulation¹ emerged during World War II with the use of continuous mathematical models and Monte Carlo methods. There is no precise evidence about the origin of techniques such as discrete-event simulation, since these techniques are based on older mathematical models, from a period in which they were still proposed without a computational implementation. However, without the advances in computing, both in hardware and software, it was unfeasible for some researchers and practitioners to consider simulation as a problem-solving technique, let alone as the most widely used technique for most of their problems (NANCE and SARGENT, 2002). Nance and Sargent (2002) state that the wide use of discrete-event simulation models in several areas, throughout the whole period of its evolution, shows how dominant this technique is compared to the others.

A simulation model is a useful tool for studying the behavior of systems and processes, whether for observation/understanding or even for optimization of the simulated object. Simulation models are built upon classical simulation approaches² (or variations of them), which represent an abstraction that can be translated into computational structures. According to Birta and Arbez (2007), the main simulation approaches are discrete-event simulation and continuous-time simulation, or simply continuous simulation. Simulation-based studies are defined as a series of steps, such as data collection, coding and verification, model validation, experimental design, output data analysis, and implementation (ALEXOPOULOS, 2007).
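To make the distinction between the two classical approaches more concrete, the sketch below illustrates the core mechanism of discrete-event simulation mentioned above: a simulation clock that jumps from one scheduled event to the next, updating state only at those instants. It is a minimal, self-contained Python sketch; the single-server arrival/service scenario, the rates, and the function names are illustrative assumptions for this text, not taken from any of the cited works.

```python
import heapq
import random

def discrete_event_sketch(n_arrivals=10, seed=1):
    """Minimal discrete-event loop: a single server processing arriving jobs.

    The future-event list is a priority queue ordered by event time; the
    simulation clock jumps directly to the next scheduled event.
    """
    random.seed(seed)
    clock = 0.0
    events = []        # heap of (time, kind, job_id)
    busy_until = 0.0   # time at which the server becomes free

    # Schedule all arrivals up front (illustrative exponential inter-arrivals).
    t = 0.0
    for job in range(n_arrivals):
        t += random.expovariate(1.0)           # mean inter-arrival time = 1.0
        heapq.heappush(events, (t, "arrival", job))

    completions = []
    while events:
        clock, kind, job = heapq.heappop(events)
        if kind == "arrival":
            start = max(clock, busy_until)      # wait if the server is busy
            service = random.expovariate(1.25)  # mean service time = 0.8
            busy_until = start + service
            heapq.heappush(events, (busy_until, "departure", job))
        else:  # departure: record when the job left the system
            completions.append((job, clock))
    return completions

if __name__ == "__main__":
    for job, t in discrete_event_sketch():
        print(f"job {job} finished at simulated time {t:.2f}")
```

In a continuous approach, by contrast, the state would be advanced in small time steps by integrating rate equations, as illustrated later for system dynamics.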

Simulation has interesting aspects from an experimentation point of view. It allows a high degree of control over the environment and, with that, the observation of facts and the validation of hypotheses or theories. Moreover, since the environment is virtual, the time and effort spent running the simulations are low when compared to experiments with real-world systems/processes, making it feasible to execute all possible combinations of the variables under investigation (ÖREN, 2009). Maria (1997) presents a schema of what a simulation-based study would be, as shown in figure 1. It is an iterative procedure, in which the number of cycles depends on changes to the system under study, changes in the perspective from which the system is observed, or inconclusive results. After being changed, the system becomes the target of the study once again, and so on. Each rectangle in the figure represents a stage of the study. These stages always require decision making. The only stage that does not require human intervention is the execution of the simulations, which can be carried out by computational packages.

¹ In this text, the terms "simulation" and "computer simulation" are used as synonyms, in the same way as they are used in the technical simulation literature.
² Here we use the expression "simulation approach" because it has a broader semantics. However, in the specialized literature other terms may appear, such as paradigm, method, and technique, among others.


Figure 1. Schema for Simulation-Based Studies (MARIA, 1997).

Maria (1997) also lists eleven steps for developing a simulation model, designing the experiment, and analyzing the results:

1. Problem identification;

2. Problem formulation;

3. Collection and processing of real-system data;

4. Model formulation and development;

5. Model validation;

6. Model documentation for future use;

7. Selection of an appropriate experimental design;

8. Establishment of the experimental conditions for the runs;

9. Execution of the simulations;

10. Interpretation and presentation of the results;

11. Recommendation of future directions.

Simulation has been successfully used in different disciplines for the most diverse purposes. Examples of such disciplines are engineering, economics, biology, and the social sciences (MÜLLER and PFAHL, 2008). In Software Engineering (SE), model building is also diverse regarding purpose and domains, mainly in the context of complex software systems and processes.


According to Travassos and Barros (2003), in virtuo and in silico studies are the classes of Software Engineering studies in which simulation is applicable. In in virtuo studies the object of study is simulated, but the participant is real. In in silico studies both the object of study and the participants are simulated. With simulation-based studies it is possible to reduce risks, time, and cost, considering that the environment in which the study is executed is a virtual one. Furthermore, simulation makes the replication of studies easier, due to the aforementioned virtual nature of the environment. Another advantage is that simulation-based studies allow hypotheses to be tested before implementing them in real experiments, making it possible to "predict" the effects of such implementations. In Software Engineering there are already simulation models based on the existing approaches, such as discrete-event simulation and system dynamics. However, there is a series of decisions related to choosing the appropriate approach for building a simulation model of the behavior one wants to observe. In the following, some Software Engineering examples are presented to illustrate how certain approaches have been used to solve problems in sub-areas of this discipline.

Luckham et al (1995) present the Rapide ADL (Architecture Description Language), which allows the simulation and behavioral analysis of system architectures. This language describes an architecture in an executable way so that simulations can be performed in early phases of software development, before implementation decisions are made.

The model used for simulation with the Rapide language, also called the execution model, is based on a set of events, which are generated together with their causes and the time of their occurrence, forming partially ordered sets of events that describe the dependencies among them.

When a simulation starts, this set of events is generated and observed by a set of processes (control threads that are part of the components of an architecture). These processes react (execute some action) to the generation of an event and also generate new events through trigger-like mechanisms.

Arief and Speirs (1999) present an approach for the automated generation of simulation models from UML models, with the purpose of predicting the performance of the modeled system. The input model consists of class and interaction diagrams (for example, sequence diagrams) at the design level. The mapping is performed by a parser developed to translate the UML elements into discrete-event simulation elements, more specifically into the object-oriented language JavaSim, a Java implementation of the C++Sim toolkit. This language has constructs related to elements of the discrete-event approach, such as processes, queues, input parameters, (pseudo-)random variables, and events, among others. The generated model executes through the generation of events, according to theoretical distribution functions (such as uniform, exponential, normal, and Erlang, among others), which are inputs for the process-based discrete-event simulation. Each process represents an instance of a class of the model, and processes exchange messages with one another until the model execution ends.

The performance results for the system represented by the input UML model make it possible to evaluate the existence of bottlenecks, the processing time, and the scalability conditions of the system.

The DynaReP model (AL-EMRAN et al, 2008) follows the discrete-event paradigm for software process simulation and focuses on release planning and re-planning. It makes it possible to know the implications of adding or removing a feature from a release in terms of cost, effort, schedule, and available resources. Of the release planning process, this model only intends to address the Operational Release Planning phase, that is, the allocation of resources to the tasks in each release, and Dynamic Re-planning, that is, the revision of plans to handle unexpected changes imposed on the product/project managers responsible for implementing the individual releases.

Another simulation approach widely used in Software Engineering is System Dynamics (FORRESTER, 1961). It is a continuous approach capable of representing the behavior of complex systems by means of causal diagrams based on feedback loops, stock-and-flow diagrams, and mathematical equations that determine the relationships between variables and the rates associated with the flows.

According to Madachy (2008), the basic elements of system dynamics are the levels (or stocks), the flows, and the sources/sinks. These elements can be observed in figure 2, which presents an example of a system dynamics model representing the relationships among some software development variables, such as productivity and defect injection and detection. Levels are represented by rectangles, flows by valves, and sources/sinks by "clouds".

Figure 2. Example of a system dynamics model (MADACHY, 2008)
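As an illustration of the stock-and-flow mechanics described above, the sketch below numerically integrates a deliberately simplified pair of levels (undetected and detected defects) with constant rates, in the spirit of the defect injection/detection example of figure 2. The equations, rate values, and variable names are illustrative assumptions for this text, not Madachy's (2008) actual model.

```python
def system_dynamics_sketch(dt=0.25, horizon=40.0):
    """Tiny stock-and-flow model integrated with Euler steps.

    Stocks (levels): undetected and detected defects.
    Flows: defect injection (source -> undetected) and defect detection
    (undetected -> detected). Rates are illustrative constants.
    """
    injection_rate = 4.0       # defects injected per time unit (assumption)
    detection_fraction = 0.15  # fraction of undetected defects found per time unit

    undetected, detected = 0.0, 0.0
    t = 0.0
    trajectory = []
    while t <= horizon:
        trajectory.append((t, undetected, detected))
        injection = injection_rate
        detection = detection_fraction * undetected
        # Euler integration: level(t + dt) = level(t) + dt * (inflows - outflows)
        undetected += dt * (injection - detection)
        detected += dt * detection
        t += dt
    return trajectory

if __name__ == "__main__":
    for t, und, det in system_dynamics_sketch()[::20]:
        print(f"t={t:5.1f}  undetected={und:7.2f}  detected={det:7.2f}")
```

The time step dt plays the role of the continuous-time advance, in contrast to the event-driven clock of the discrete-event sketch shown earlier.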

The model by Abdel-Hamid and Madnick (1991) is perhaps the most widespread system dynamics model in Software Engineering. It was proposed as a model to simulate software projects and is quite comprehensive regarding the subsystems it considers:

- Human Resource Management: deals with variables such as training, personnel turnover at the organizational and project levels, experience level, and developer productivity, among others;
- Software Production: represents the allocation of effort to the project;
- Software Development: the largest subsystem, which treats productivity as a complex variable representing project progress; there is a potential productivity and an actual productivity, the latter being affected by variables such as motivation and communication;
- Quality Assurance and Rework: as the name suggests, represents defect injection and detection rates, the impact of schedule pressure on these rates, and the rework resulting from those defects;
- Testing: represents the cycles of testing activities during the project, as well as the impact of defects not corrected or not detected in later phases;
- Control: related to the measurement of performed tasks, productivity, and rework, among others, in addition to adjustments to effort allocation and workload;
- Planning: represents schedule stability and the completion of activities on time.

Based on this model, many others have been proposed; examples are the models by Madachy (1996) and Martin and Raffo (2000).

Other examples of models that use system dynamics are the model by Barros et al (2003) for Risk Management, which uses scenario-based simulation to model the impact of risks and the effectiveness of resolution strategies, supporting decision making throughout the development process, and the model by Araújo (2004) for observing software decay trends, based on Lehman's Laws of Software Evolution (LEHMAN, 1980).

Although the advantages of simulation-based studies are numerous, the models built must be evaluated before studies are carried out with them, especially regarding construct validity, that is, how closely the built model approximates the real system (WOHLIN et al, 2000). In addition, there is a considerable cost/effort associated with building these models. Given the great diversity of simulation approaches and the possibility of applying them in simulation-based studies in Software Engineering, the goal of this quasi-Systematic Review of the literature is to characterize how the various simulation approaches available in the literature have been applied in simulation-based studies in the Software Engineering area. From this characterization, we expect to point out the advantages and disadvantages of each approach, as well as the characteristics that determine its applicability to a given problem. With that, it becomes possible to identify the most suitable approach for simulating certain characteristics of systems or processes, or even a particular system or process, reducing the risk of building a model that cannot return the desired results.


2 quasi-Systematic Review

This section presents the protocol of the systematic review that was carried out. The protocol contains the definition of the study goals, as well as the organization of the procedures to be followed by those executing the review. The template proposed by Biolchini et al (2005) was used to prepare this document. The study has a characterization purpose, that is, there is no previous knowledge that would allow comparisons to be made. Therefore, we call it a quasi-Systematic Review, following Travassos et al (2008).

2.1 Research Question Formulation

2.1.1 Question Focus

To execute a quasi-Systematic Review with the goal of characterizing how the different computer simulation approaches available in the literature have been applied in simulation-based studies in Software Engineering (SE), intending to:

o Identify which SE domains use simulation approaches and, from that, determine the frequency of use per domain;

o Identify the methodologies used in the planning and conduction of these studies.

2.1.2 Question Quality and Breadth

Problem: Simulation-based studies offer advantages in the preliminary investigation of hypotheses. However, it is necessary to use the appropriate simulation approach to build the model of the system to be simulated. By characterizing the different simulation approaches, we expect to improve the guidance for planning and conducting simulation-based studies and for building simulation models in SE, so that the appropriate approach and experimental design can be chosen objectively.

Question: µ0: How have the different computer simulation approaches available in the literature been applied in simulation-based studies in Software Engineering (SE)?

Keywords and Synonyms:

o Simulation-based studies: simulation studies, computer simulation, modeling and simulation, simulation and modeling, In Virtuo, In Silico, Sampling, Monte Carlo, Stochastic Modeling, System Dynamics, Discrete-event Simulation, State-based Simulation, Agent-based Simulation

o Software Engineering: systems engineering, application engineering, software development

o Simulation Model: system dynamics model, discrete-event model, agent model, state model.

To structure the search string, we used the PICO approach, as defined by Pai et al (2004). In this approach the research question (search string) is split into four parts: Population of interest, Intervention or exposure being evaluated, Comparison (if applicable), and Outcome.

Population: Papers presenting simulation-based studies in SE.

Intervention: Computer simulation models used in the studies.


Comparison: None.

Outcome Measure: The goal, domain, and experimental aspects (experimental design) of the study, in addition to characteristics of the model. The outcome coincides with the information to be extracted from the papers (see the extraction form). However, the abstracts and titles of the papers do not use the key terms that identify such information; for example, a paper abstract may describe the characteristics of the proposed model without explicitly using the term "characteristic" to identify them. For this reason, the outcome was suppressed from the search string and will only be considered at the moment of information extraction. This decision was made because a good coverage could not be achieved with the outcome terms and, since the effort to handle the number of papers returned when considering only the population and intervention is feasible, it was decided to keep all papers for the selection phase.

Effect: Characterization of simulation approaches in the context of SE.

Application: Software Engineering researchers, Software Engineers.

Experimental Design: No statistical method will be applied to the results.

Control: The following papers were used as control for building the search string:

o Martin, R.; Raffo, D. Application of a hybrid process simulation model to a software development project. Journal of Systems and Software, Volume 59, Issue 3, 2001, Pages 237-246;

o Khosrovian, K.; Pfahl, D.; Garousi, V. GENSIM 2.0: A customizable process simulation model for software process evaluation. Lecture Notes in Computer Science, Volume 5007 LNCS, 2008, Pages 294-306;

o Drappa, A.; Ludewig, J. Simulation in software engineering training. Proceedings - International Conference on Software Engineering, 2000, Pages 199-208;

o Madachy, R. System dynamics modeling of an inspection-based process. Proceedings - International Conference on Software Engineering, 1995, Pages 376-386;

o Al-Emran, A.; Pfahl, D.; Ruhe, G. A method for re-planning of software releases using discrete-event simulation. Software Process Improvement and Practice, Volume 13, Issue 1, January 2008, Pages 19-33;

o Al-Emran, A.; Pfahl, D.; Ruhe, G. DynaReP: A discrete event simulation model for re-planning of software releases. Lecture Notes in Computer Science, Volume 4470 LNCS, 2007, Pages 246-258;

o Luckham, D. C.; Kenney, J. J.; Augustin, L. M.; Vera, J.; Bryan, D.; Mann, W. Specification and analysis of system architecture using Rapide. IEEE Transactions on Software Engineering, Volume 21, Issue 4, April 1995, Pages 336-355;

o Arief, L. B.; Speirs, N. A. A UML tool for an automatic generation of simulation programs. Proceedings Second International Workshop on Software and Performance WOSP, 2000, Pages 71-76;

o Choi, K.; Bae, D.-H.; Kim, T. An approach to a hybrid software process simulation using the DEVS formalism. Software Process Improvement and Practice, Volume 11, Issue 4, July 2006, Pages 373-383.

2.2 Source Selection

2.2.1 Definition of Source Selection Criteria

The sources used for this quasi-Systematic Review of the literature are the digital libraries available on the web whose papers are accessible. In addition, the libraries must allow online querying through a search engine in which logical expressions can be used to define the search string. Furthermore, the search engine must allow searching by title, abstract, and keywords of the paper. The sources must contain papers from several Software Engineering domains.

2.2.2 Studies Language

English.

2.2.3 Search String

µ0: How have the different computer simulation approaches available in the literature been applied in simulation-based studies in Software Engineering?

P: (("simulation modeling" OR "simulation modelling" OR "in silico" OR "in virtuo" OR "simulation based study" OR "simulation study" OR "computer simulation" OR "modeling and simulation" OR "modelling and simulation" OR "simulation and modeling" OR "simulation and modelling" OR "process simulation" OR "discrete-event simulation" OR "event based simulation" OR "system dynamics" OR sampling OR "monte carlo" OR "stochastic modeling" OR "agent based simulation" OR "state based simulation") AND ("software engineering" OR "systems engineering" OR "application engineering" OR "software development" OR "application development" OR "system development"))

I: ("simulation model" OR "discrete event model" OR "event based model" OR "system dynamics model" OR "agent model" OR "state model")

C: Not applicable.


O: Suppressed. Terms representing what the output should be: ("area" OR "domain" OR "context" OR "discipline" OR "study planning" OR "study design" OR "experimental support" OR "experimental planning" OR "experimental design" OR "experimental study" OR "goal" OR "target" OR "objective" OR "purpose" OR "problem" OR "aim" OR "characteristic" OR "property" OR "feature" OR "attribute" OR "aspect" OR "factor" OR "dimension" OR "perspective" OR "advantage" OR "disadvantage" OR "benefit" OR "approach" OR "technique" OR "method" OR "paradigm" OR "mechanism" OR "instrument" OR "methodology" OR "procedure")

2.2.4 Source Identification

Source search method: The sources will be accessed via the search engines of the digital libraries.

List of Sources:

o Scopus;

TITLE-ABS-KEY((("simulation modeling" OR "simulation modelling" OR "in silico"

OR "in virtuo" OR "simulation based study" OR "simulation study" OR "computer

simulation" OR "modeling and simulation" OR "modelling and simulation" OR

"simulation and modeling" OR "simulation and modelling" OR "process simulation"

OR "discrete-event simulation" OR "event based simulation" OR "system dynamics"

OR sampling OR "monte carlo" OR "stochastic modeling" OR "agent based

simulation" OR "state based simulation") AND ("software engineering" OR "systems

engineering" OR "application engineering" OR "software development" OR

"application development" OR "system development")) AND ("simulation model" OR

"discrete event model" OR "event based model" OR "system dynamics model" OR

"agent model" OR "state model"))

o Web of Science (ISI Knowledge);

TS=((("simulation modeling" OR "simulation modelling" OR "in silico" OR "in virtuo"

OR "simulation based study" OR "simulation study" OR "computer simulation" OR

"modeling and simulation" OR "modelling and simulation" OR "simulation and

modeling" OR "simulation and modelling" OR "process simulation" OR "discrete-

event simulation" OR "event based simulation" OR "system dynamics" OR sampling

OR "monte carlo" OR "stochastic modeling" OR "agent based simulation" OR "state

based simulation") AND ("software engineering" OR "systems engineering" OR

"application engineering" OR "software development" OR "application

development" OR "system development")) AND ("simulation model" OR "discrete

Page 12: Artigo Breno

Abordagens para Simulação em Engenharia de Software Revisão Sistemática

COPPE/PESC 12

event model" OR "event based model" OR "system dynamics model" OR "agent

model" OR "state model"))

o Engineering Village (Ei Compendex)

((("simulation modeling" OR "simulation modelling" OR "in silico" OR "in virtuo" OR

"simulation based study" OR "simulation study" OR "computer simulation" OR

"modeling and simulation" OR "modelling and simulation" OR "simulation and

modeling" OR "simulation and modelling" OR "process simulation" OR "discrete-

event simulation" OR "event based simulation" OR "system dynamics" OR sampling

OR "monte carlo" OR "stochastic modeling" OR "agent based simulation" OR "state

based simulation") AND ("software engineering" OR "systems engineering" OR

"application engineering" OR "software development" OR "application

development" OR "system development")) AND ("simulation model" OR "discrete

event model" OR "event based model" OR "system dynamics model" OR "agent

model" OR "state model")) WN KY

2.2.5 Source Selection after Evaluation: The three selected sources satisfy the criteria defined in section 2.2.1. Moreover, Scopus, Ei Compendex, and Web of Science (ISI Knowledge) cover the main digital libraries (including conference and journal papers) for research in computer simulation and software engineering, as well as other related areas. Examples of such libraries are ACM, IEEE, Elsevier, Springer, and Wiley.

2.3 Study Selection

2.3.1 Study Definition

Definition of Study Inclusion and Exclusion Criteria

o Inclusion:
- Papers must be available on the web;
- Papers must be written in English;
- Papers must address (computer) simulation-based studies;
- The studies must belong to the Software Engineering domain; and
- The paper must mention one or more simulation models.

o Exclusion:
- Papers not written in English;
- Papers published in venues that do not require peer review;
- Papers addressing non-computer simulation;
- Papers that do not present any study; and
- Prefaces and presentations of conference proceedings.

Definition of Paper Types

- Theoretical papers (with theoretical foundation) describing some simulation model;
- Quantitative primary studies; and
- Secondary studies.


Procedure for Study Selection

Three researchers will apply the search strategy to identify candidate papers. Study selection will be based on the title and abstract of the papers. Researcher 1 applies the search strings on the search engines, retrieves the papers, stores them together with their abstracts in the JabRef³ reference manager, adds a field to indicate the status of each paper (I - Included, E - Excluded, D - Doubt), and removes the duplicates. After that, he performs the first classification. Then, Researcher 2 receives the JabRef file (in BibTeX format) containing the references and additional annotations (for example, the paper status) and double-checks the I and E entries. If any of them needs to be changed, it is marked as D2, and the D entries are reclassified as included or excluded, marked as I2 or E2 so that it is possible to track what changed. Finally, Researcher 3 does the same as Researcher 2, but using the marks I3 and E3. Any remaining D entries are passed on, and at the end a meeting is held for the final decision, with a more detailed analysis of the D entries. If doubts still remain after Researcher 3's pass, those papers are included for later analysis. Even papers included in the selection stage, based on reading title and abstract, may be excluded later in the extraction stage. In the extraction stage the full text is read, which may lead to a better understanding of the paper, clearing up doubts and allowing a better decision on whether to keep it as an included paper or to exclude it for not actually satisfying the established criteria.
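A minimal sketch of how this status bookkeeping could be checked automatically is given below, assuming the marks (I, E, D, I2, E2, D2, I3, E3) are stored in a custom BibTeX field of the JabRef file; the field name `status`, the file name, and the naive regular expression (not a full BibTeX parser) are hypothetical choices for illustration, not part of the review protocol.

```python
import re
from collections import Counter

def count_selection_status(bib_path="selection.bib", field="status"):
    """Tally the selection marks stored in a custom field of a JabRef/BibTeX file.

    Assumes entries carry a field such as `status = {I2}`; both the field name
    and the file are hypothetical examples.
    """
    with open(bib_path, encoding="utf-8") as handle:
        text = handle.read()
    # Match occurrences like: status = {I2}
    pattern = re.compile(field + r"\s*=\s*\{([^}]*)\}", re.IGNORECASE)
    return Counter(mark.strip().upper() for mark in pattern.findall(text))

if __name__ == "__main__":
    counts = count_selection_status()
    included = sum(v for k, v in counts.items() if k.startswith("I"))
    excluded = sum(v for k, v in counts.items() if k.startswith("E"))
    doubts = sum(v for k, v in counts.items() if k.startswith("D"))
    print(counts)
    print(f"included={included} excluded={excluded} doubt={doubts}")
```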

2.3.2 Selection Execution

Initial Study Selection:

Search date: 14/03/2011
Scopus: 906
Web of Science: 85
Ei Compendex: 501
Total: 1492
Duplicates: 546
Papers for selection: 946

Quality Assessment of the Studies: The criteria used to assess the quality of the papers are related to the items in the extraction form, as shown in Table 1.

³ The JabRef reference management tool (http://jabref.sourceforge.net/) allows bibliographic references to be stored in a structured way, as well as notes to be added to these references. For this review, the information extracted from each reference was stored as notes in the tool.


Table 1. Paper Quality Criteria

Criterion | Value
Identifies the simulation approach used? | 1 pt
States the purpose of the simulation model? | 1 pt
States the purpose of the study carried out? | 1 pt
Is it possible to identify the domain (Software Engineering discipline) in which the study was applied? | 1 pt
Mentions the tool support used to run the simulations? | 0.5 pt
Describes the characteristics reported for the simulation model? | 1 pt
Presents a classification of the reported characteristics? | 0.5 pt
Presents the advantages of the simulation model? | 0.5 pt
Presents the disadvantages of the simulation model? | 0.5 pt
Verification and Validation techniques | 1 pt
Describes the analysis methodology for the simulation results? | 1 pt
Identifies the type of study in which the simulation model was used as an instrument, together with its experimental design? | 1 pt (0.5 for the study type + 0.5 for the experimental design)
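To illustrate how the rubric above yields a score per paper, the sketch below simply sums the points of the criteria a paper satisfies (maximum 10 points). The shorthand criterion keys and the example answers are illustrative, not data extracted in the review.

```python
# Weights taken from Table 1; the keys are shorthand labels used only in this sketch.
QUALITY_WEIGHTS = {
    "approach_identified": 1.0,
    "model_purpose": 1.0,
    "study_purpose": 1.0,
    "domain_identifiable": 1.0,
    "tool_support_mentioned": 0.5,
    "characteristics_described": 1.0,
    "characteristics_classified": 0.5,
    "advantages_presented": 0.5,
    "disadvantages_presented": 0.5,
    "vv_techniques": 1.0,
    "analysis_methodology": 1.0,
    "study_type": 0.5,
    "experimental_design": 0.5,
}

def quality_score(answers):
    """Sum the weights of the criteria the paper satisfies (max 10.0 points)."""
    return sum(QUALITY_WEIGHTS[criterion] for criterion, met in answers.items() if met)

if __name__ == "__main__":
    # Hypothetical paper satisfying only part of the rubric.
    example = {criterion: False for criterion in QUALITY_WEIGHTS}
    example.update(approach_identified=True, model_purpose=True,
                   domain_identifiable=True, vv_techniques=True, study_type=True)
    print(quality_score(example))  # 4.5
```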

Review of the Selections: Since one (R1) of the three reviewers performed the first selection of papers based on titles and abstracts, the other two reviewers (R2 and R3) inspected the result of the selection performed by R1, so that a tie-breaking criterion is available and the selection bias is reduced.

Papers for selection: 946
Conflicts: 29
After tie-breaking:
Excluded: 796
For data extraction: 150

2.4 Information Extraction

2.4.1 Definition of Information Inclusion and Exclusion Criteria

The information extracted from the papers must contain descriptions of simulation-based studies, as well as of the models used in these studies.

2.4.2 Information Extraction Form

For each paper selected after the execution of the study selection process, the researchers will extract the data listed in Table 2 and organize them using the JabRef tool.


Table 2. Information Extraction Form

Field | Extracted Information
Paper identification | [title, authors, source, year, paper type]
Name of the simulation approach used |
Purpose of the simulation model used | [goal for which the model was built]
Purpose of the study carried out | [goal for which the study was carried out]
Domain (Software Engineering discipline) in which the study was applied | [application area of the study; if the study belongs to more than one area, list them]
Tool support | [does the model used in the study have any tool support? If so, which one?]
Characteristics reported for the simulation model | [for example, discrete, continuous, deterministic, stochastic, event-based, among others]
Classification of the reported characteristics | [is there any classification of the characteristics? Describe it.]
Advantages of the simulation model |
Disadvantages of the simulation model |
Verification and Validation techniques | [V&V techniques used to evaluate the model used in the study, in terms of internal, external, and construct validity]
Analysis methodology | [methodologies for analyzing the results of the simulation-based study]
Type of study in which the simulation model was used as an instrument | [experiment, observation, ...]
Main results of the paper regarding the approach | [applicability of the approach to the problem, accuracy of the results]

2.4.3 Extraction Execution

Papers for extraction: 150
Excluded: 28
No access: 14
Total extracted: 108

2.4.4 Resolution of Divergences among Reviewers

Not applicable.


2.5 Results Summarization

2.5.1 Results Presentation in Tables

All the papers, as well as the extracted information, were stored in the JabRef tool.

2.5.2 Sensitivity Analysis

Not applicable.

2.5.3 Plotting

Content presented in analysis section.

2.5.4 Final Remarks

Number of papers: After extracting information from each of the 108 research papers, including title, abstract, and body text, it was possible to identify 88 simulation models, distributed among several Software Engineering research sub-areas, here called domains.

Search, Selection and Extraction Bias: Publications are restricted to the sources indexed by the digital libraries and the search engines used.

Publication Bias: As expected, no negative result was found. However, as this is a characterization review, we did not consider just the results of each study, but mainly how the studies were conducted and in which context, in order to understand how each simulation model or approach could be used in SE.

Inter-Reviewer Variation: All divergences were solved by a third reviewer and agreed upon by the others.

Results Application: The results found in this review can be used as a starting point for future research directions that must be addressed by the Software Engineering community when conducting simulation-based studies. Besides, the information can be organized as a body of knowledge to support the decision making regarding simulation in Software Engineering.

Recommendations:


3 Results Analysis

Among the 108 papers selected for extraction, only two secondary studies were identified: a literature survey [Ahmed et al, 2008] and a Systematic Literature Review [Zhang et al, 2008]. Both specifically discuss Software Process Simulation.

The survey by Ahmed et al (2008) relates its results to the current practice in simulation. The results indicate that software process simulation practitioners are, in general, methodical, work with complex problems (resulting in large-scale models), and use a systematic process to develop the simulation models. This work also points out that the simulation modeling process and the evaluation of the models are the main issues needing attention from the community.

The systematic review by Zhang et al (2008) has the goal of tracing the evolution of Software Process Simulation and Modeling research over 10 years, from 1998 to 2008. The authors analyzed about 200 relevant papers in order to answer the research questions defined in their research protocol. Among the main contributions of the systematic review, the authors highlight: "(1) Categories for classifying software process simulation models; (2) Research improving the efficiency of SPSM is gaining importance. (3) Hybrid process simulation models have attracted interest as a possibility to more realistically capture complex real-world software processes".

In the following sub-sections we present the characterization results for the studies and models found in the context of Software Engineering.

3.1 Simulation Approaches

Most papers about simulation-based studies in Software Engineering adopt a discrete-event or continuous simulation approach (the latter mostly represented by System Dynamics). The Systematic Review presented by Zhang et al (2008) also confirms this statement in the context of Software Process and Project Simulation. Some slightly different approaches appear in the technical literature, but they still rely on discrete or continuous behavior. For example, Agent-Based Simulation is often mentioned as a distinct abstraction, but agents and their environment are usually characterized by continuous behavior.

Figure 1 presents the simulation approaches found in this study distributed according to their numbers of models and papers.


Figure 1 shows a clear dominance of the System Dynamics and discrete-event approaches. Even when authors present hybrid models, most of the combinations fall into these two approaches. Discrete-event simulation is a mature approach that has been applied successfully for years in a vast range of research areas. System Dynamics, however, seems to have another explanation for its predominance over the other approaches, specifically in Software Engineering: the influence of the Abdel-Hamid and Madnick (AHM) integrated software project model [Abdel-Hamid and Madnick, 1990]. Many works mention this model as a basis for new ones; for example, Martin and Raffo (2001), Lee and Miller (2004), and Choi et al (2006) use parts of the AHM model to conceive their own models. The AHM model encompasses a great part of what can be observed in software projects from a continuous perspective.

Figure 1. Simulation Approaches Distribution

Some simulation approaches found in this quasi-systematic review were not clearly defined. This happens basically for two reasons: either the papers do not explicitly state that the models were based on a specific approach, or the specification of the proposed/used models does not resemble any known approach. The model presented in [Ormon et al, 2001] seems to be an analytical model rather than a simulation model, but there are not enough details to confirm this, and no simulation approach is mentioned. Another example of a simulation approach that is not clearly identified is [Navarro and Hoek, 2005]: we were not able to find details about their model, and no sentence mentions the simulation approach used to execute the simulations. These approaches were grouped into the "Not Specified" category, since we were not able to understand the underlying approach. Something similar occurs in the "General Discrete-time Simulation" and "General Continuous-time Simulation" categories, where it was possible to perceive in the model description that such models implement discrete or continuous time-advancing mechanisms. Unfortunately, we were not able to recognize the specific approach used to build these models. Examples of discrete-time and continuous-time simulation can be found in [Grillinger et al, 2004], where a simulation approach is applied to design and test embedded systems, and in [Turnu et al, 2006], where a continuous model for open source development using Test-Driven Development is presented, respectively.

The remaining approaches, each appearing in only one (at most two) simulation models, seem to be investigations of their suitability for simulating software engineering systems or processes, since the authors do not submit their models to systematic validation procedures. Most of the papers found present the motivation for the simulation study and model development, a brief literature review, the theoretical foundation, the proposed simulation model, and an example of how the model works. Typical examples can be found in [Choi et al, 2006] and [Al-Emran and Pfahl, 2007].

For the System Dynamics approach, figure 1 shows a greater number of papers (gray bar) than of models (black bar). This occurs because the same model is used as an instrument across different simulation-based studies, or even through several replications of a specific study. Houston et al (2001) use four models in their study, and most of them are captured by this quasi-Systematic Review, i.e., a model used in [Houston et al, 2001] is also present in another study, reported in a different paper. A replication of the original study in [Pfahl et al, 2001] can be found in [Pfahl et al, 2003], meaning the same model was present in both papers.

For the Monte Carlo simulation approach, only simulation models described strictly as a function of pseudo-random variables were considered. Models (based on other approaches, such as discrete-event) that only apply this technique to input parameters for stochastic simulation were not counted here. Thelin et al (2004) present a Monte Carlo model that uses this approach to improve software inspection performance by sampling documents according to systematic criteria, reducing the set of artifacts to inspect and, therefore, the time spent on inspections.
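The sketch below conveys the general flavor of a Monte Carlo model in this sense, i.e., an output defined strictly as a function of pseudo-random variables: it repeatedly samples a subset of documents and estimates how many defects such a sample would expose. The defect counts, sample size, and detection probability are illustrative assumptions and do not reproduce the actual model of Thelin et al (2004).

```python
import random
import statistics

def monte_carlo_inspection_sketch(defects_per_doc, sample_size, p_detect,
                                  runs=5000, seed=7):
    """Estimate defects found when inspecting a random sample of documents.

    Each run samples `sample_size` documents and, for every defect they contain,
    draws a Bernoulli trial with detection probability `p_detect`.
    """
    rng = random.Random(seed)
    found_per_run = []
    for _ in range(runs):
        sampled = rng.sample(range(len(defects_per_doc)), sample_size)
        found = sum(1
                    for doc in sampled
                    for _ in range(defects_per_doc[doc])
                    if rng.random() < p_detect)
        found_per_run.append(found)
    return statistics.mean(found_per_run), statistics.stdev(found_per_run)

if __name__ == "__main__":
    docs = [3, 0, 5, 2, 7, 1, 4, 2]   # assumed defects per candidate document
    mean, sd = monte_carlo_inspection_sketch(docs, sample_size=3, p_detect=0.6)
    print(f"expected defects found: {mean:.2f} (sd {sd:.2f})")
```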

3.2 Software Engineering Domains in Simulation Studies

Although the domains involved in Software Engineering simulation studies can be easily identified in the model descriptions, the same does not occur with the purpose for which the model was built. Frequently, authors use terms like "requirements engineering simulation model", but these simulators seldom encompass the whole domain and its peculiarities. It is important to establish the boundaries and scope of simulation models in order to evaluate their adoption as instruments when conducting a simulation study or when identifying a research opportunity.

Regarding the domains to which simulation studies have been applied, Software Process and Software Project are the most frequent in the technical literature [Stopford and Counsell, 2008]. In this quasi-Systematic Review, we could confirm this statement based on the papers found.

Of course, it is almost always possible to say that simulation models concern the software process or the software product. However, in our classification we use "Software Process" to characterize simulation models concerned with analyzing the whole software development process from the structure and performance points of view. Analyses of process bottlenecks, activity dependencies, and cross-project issues are of interest to Software Process Simulation. For instance, the following quote from [Chen & Liu, 2006] exemplifies what we characterize as a Software Process Simulation concern:


“So carry on software process simulation dynamically can predict defect and bottleneck in advance, help to eliminate the defect and optimize the software development process, and offer theory support for making decision.”

On the other hand, Software Project Simulation is related to (human and material) resource management, allocation policies, and scheduling and cost issues, among others. Examples of such studies can be found in the series of studies conducted by Abdel-Hamid and Madnick in [Abdel-Hamid and Madnick, 1986] [Abdel-Hamid, 1988a] [Abdel-Hamid, 1988b] [Abdel-Hamid, 1989] [Abdel-Hamid, 1990] and [Abdel-Hamid, 1993].

Figure 2 clearly shows the majority of papers and models related to the Software Process and Software Project domains, even when analyzing them separately.

The Software Architecture and Design domain groups simulation models whose purpose encompasses design issues for different classes of systems, for instance fault-tolerant, embedded, and real-time systems, from the perspective of quality attributes such as reliability and performance. This group is also characterized by simulating the product (design specification) rather than the design process. Alvarez and Cristian (1997) presented a simulation tool (and model) for the design and performance evaluation of fault-tolerant systems, and Kang et al (1998) presented ASADAL/SIM, a simulation and analysis tool for real-time software specifications.

Almost all other domains are related to a process-based perspective, for instance the Software Inspections (Madachy and Khoshnevis, 1997), Quality Assurance (Drappa and Ludewig, 1999), and Requirements Engineering (Ferreira et al, 2009) processes. In these cases, simulation models are used to provide an understanding of the impact of the variables of these specific sub-processes on the performance of the whole software development process.


An interesting perspective on this distribution (figure 2) is presented in figure 3; it shows the coverage⁴ of each simulation approach (figure 1) over the different Software Engineering domains.

⁴ By "coverage" we mean the percentage of elements (SE domains, in this case) observed to appear for a given simulation approach.

Figure 2. Software Engineering Domains Distribution


The bar chart presented in figure 3 should be read considering each percentage as a number representing the relative occurrence of a specific approach in the space of software engineering domains. For example, the Discrete-Event approach appears in 47.1% of the Software Engineering domains found in this review, considering the number of papers and not only models.

Among the different simulation approaches, Discrete-Event and System Dynamics present the largest coverage of these domains. In addition, Hybrid Simulation, which is mostly a combination of Discrete-Event and System Dynamics, covers 23.5% of the domains. Therefore, in our sample, these approaches can be considered the main alternatives for building a simulation model in this context.

A detailed view of this coverage can be seen in Table 1. It illustrates the coverage by mapping each SE domain (rows) to the simulation approaches (columns), according to the findings in the analyzed papers. The numbers inside the cells indicate in how many papers such a mapping was found. For instance, the "Hybrid Simulation (Continuous + Discrete)" approach was used to develop models of the following domains: "Agile Methods" (1 paper), "Global Software Development" (3 papers), "Software Process" (6 papers), and "Software Project Management" (2 papers). Thus, these four domains represent 23.53% of the domains observed in this quasi-systematic review; in other words, the "Hybrid Simulation (Continuous + Discrete)" approach has a coverage of 23.53%.
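The coverage figure follows directly from the mapping: the number of distinct domains in which an approach appears, divided by the total number of domains (17 in Table 1). The short sketch below reproduces the worked example from the text for the "Hybrid Simulation (Continuous + Discrete)" approach; the mapping literal contains only the four domains quoted above.

```python
def coverage(domains_for_approach, total_domains=17):
    """Coverage (%) = distinct domains where the approach appears / total domains."""
    return 100.0 * len(set(domains_for_approach)) / total_domains

if __name__ == "__main__":
    # Domains (and paper counts) reported in the text for
    # Hybrid Simulation (Continuous + Discrete).
    hybrid_cont_discrete = {
        "Agile Methods": 1,
        "Global Software Development": 3,
        "Software Process": 6,
        "Software Project Management": 2,
    }
    print(f"{coverage(hybrid_cont_discrete):.2f}%")  # 23.53%
```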

Figure 3. Simulation Approaches Coverage over Software Engineering Domains


Table 1. Mapping between SE Domains and Simulation Approaches

ABS CG DEVS GCS GDS HS (Cont + Discrete)

HS (PN + DEVS) KBS MCS

Not Specified OOS

Proxel-based QS qCS SQS SBS SPA SD TPA COVERAGE (%)

Agile Methods 1 1 2 17,65

CBSE 1 5,88

Global Sw Development 1 1 3 17,65

OSS Dev 1 1 1 1 23,53

QA 1 5,88

Req Eng 1 6 11,76

Software Acquisition 1 5,88

Software Architecture/Design 5 1 2 2 1 29,41

Estimation 1 5,88

Sw Evolution 1 1 4 17,65

Inspections 1 3 11,76

Software Process 1 1 6 1 1 1 6 1 47,06

Product Line 2 5,88

Sw Project Management 4 2 1 2 1 1 1 1 23 52,94

Release Planning 1 3 11,76

Testing 3 1 11,76

Technology Substitution / New Technology Adoption / Innovation 1 5,88

COVERAGE (%) 11,76 5,88 47,06 17,65 5,88 23,53 5,88 5,88 5,88 23,53 5,88 5,88 11,76 11,76 5,88 5,88 5,88 76,47 5,88

ABS – Agent-Based Simulation
CG – Conditional Growth
DEVS – Discrete-Event
GCS – General Continuous Simulation
GDS – General Discrete Simulation
HS (Cont + Discrete) – Hybrid Simulation (Continuous + Discrete)
HS (PN + DEVS) – Hybrid Simulation (Petri Net + Discrete-Event)
KBS – Knowledge-Based Simulation
MCS – Monte Carlo Simulation
OOS – Object-Oriented Simulation
QS – Qualitative Simulation
qCS – quasi-Continuous Simulation
SQS – Semi-Quantitative Simulation
SBS – State-Based Simulation
SPA – Stochastic Process Algebra
SD – System Dynamics
TPA – Temporal Parallel Automata


3.3 Simulation Tools for Software Engineering

Simulation models are abstractions of systems or processes specified in a simulation language, representing the concepts involved in the underlying simulation approach. Therefore, when using these models as instruments for a simulation-based study, simulation tools are needed to make the simulation trials feasible.

Among the papers found in this quasi-Systematic Review, most report an experience using general-purpose simulation tools, such as Vensim⁵, Arena⁶, and iThink⁷. On the other hand, specific simulation tools appear as the authors' own implementations of their models; this is the case of SESAM (Software Engineering Simulation by Animated Models) [Drappa & Ludewig, 2000], where a tool with an interactive graphical user interface is used to simulate a software project organization for training purposes. Specific-purpose tools are often used in only a few studies, possibly because the tool's purpose covers only the objectives of a specific model and of its respective studies. Figure 4 presents the distribution of simulation tools across the papers and models captured in this quasi-systematic review.

As can be seen in figure 4, the most used simulation tools are Vensim, Extend⁸, and iThink. The main reasons for this adoption may be related to their general purpose and also to the simulation approaches supported by these tools, namely System Dynamics and discrete-event simulation.

The tools and their references are presented in table 2, as well as the simulation approach supported by them.

⁵ http://www.vensim.com
⁶ http://www.arenasimulation.com
⁷ http://www.iseesystems.com
⁸ http://www.extendsim.com

Figure 4. Simulation Tools Distribution


Table 2. Simulation Tools References

Simulation Tool | Reference | Simulation Approach
AnyLogic | www.xjtek.com/anylogic/why_anylogic | Discrete-Event
ARENA | www.arenasimulation.com | Discrete-Event
ASADAL/SIM | selab.postech.ac.kr/xe/?mid=selab_link | Discrete-Event
C-Sim | www.atl.lmco.com/projects/csim | Discrete-time Simulation
CESIUM | No website found. | Object-Oriented Simulation
DEVSim++ | sim.kaist.ac.kr/M5_1.htm | Hybrid Simulation (SD + DEVS); Discrete-Event; Not Specified
Extend | www.extendsim.com | Hybrid Simulation (SD + DEVS); Discrete-Event; System Dynamics
GENSIM | No website found. | System Dynamics
iThink | www.iseesystems.com | System Dynamics
Matlab | www.mathworks.com/products/matlab/ | Discrete-Event
NetLogo | ccl.northwestern.edu/netlogo | Agent-Based Simulation
PEPA | www.dcs.ed.ac.uk/pepa/tools | Stochastic Process Algebra
PowerSim | www.powersim.com | System Dynamics
Professional Dynamo Plus (PD+) | No website found. | System Dynamics
Prolog | www.swi-prolog.org | Discrete-Event
QSIM | ii.fmph.uniba.sk/~takac/QMS/qsimHowTo.html | Qualitative Simulation; Semi-Quantitative Simulation
ReliaSim | No website found. | Not Specified
RiskSim | www.treeplan.com/risksim.htm | Hybrid Simulation (PN + DEVS)
SEPS | No website found. | System Dynamics
SES | www.stackpoleengineering.com/software.aspx | Hybrid Simulation (PN + DEVS)
SESAM | www.iste.uni-stuttgart.de/en/se/forschung/schwerpunkte/sesam.html | quasi-Continuous Simulation
SIMNET | No website found. | Discrete-Event
SimSE | www.ics.uci.edu/~emilyo/SimSE | Not Specified
SLAMSYSTEM | research.microsoft.com/en-us/projects/slam | Discrete-Event
Statemate Magnum | No website found. | State-Based Simulation
SystemC | www.systemc.org | Not Specified
Vensim | www.vensim.com | System Dynamics


3.4 Characteristics of Simulation Models

This quasi-systematic review aims at characterizing simulation-based studies along many facets. Therefore, among other data, we looked for characteristics of simulation models and tried to identify any kind of taxonomy or classification scheme for model characterization.

After intensive information extraction from the selected papers, we found a set of characteristics explicitly mentioned by the authors together with their model descriptions (Figure 5). In several papers, we noticed that authors do not characterize their models at all, but only present the model or a representative part of it. There is no concern with describing models according to their essential characteristics; authors tend only to mention the underlying simulation approach (and sometimes to describe this approach) instead. For example, Pfahl and Lebsanft (2000) mention the simulation approach in the quote "The simulation model was implemented in a modular way using the SD (System Dynamics) tool Vensim". Another example is "We present a discrete-time simulator tailored to software projects which…" [Padberg, 2003]. The remaining characterization is just a model specification or a brief description in terms of its variables, rather than of how it works.

All the characteristics presented in Figure 5 are mostly related to the simulation approaches rather than to the simulation models themselves. Thus, we conclude that authors assume simulation model characteristics are known through the underlying approach. When authors use a new or hybrid approach, a brief description of how it works is given. For these reasons, the characteristics distribution is biased towards System Dynamics characteristics (such as "dynamic", "causal relationships", "feedback loops", and others), since this approach has more occurrences than the others. Table 3 presents the description of each characteristic, taken from these papers.

Figure 5. Simulation Model Characteristics


Table 3. Description of Simulation Model Characteristics

Characteristic | Description
Analytic | Expresses high-level quantitative relationships between input and output parameters through mathematical equations.
Asynchronous | Lack of centralized coordination of the simulation model.
Bi-directional Simulation (Forward and Reverse) | Uses both types of models together: forward simulation models the system of interest as it evolves forward through time; reverse simulation models the system as it moves in reverse, or backward, in time.
Causal Relationships | Establishes cause-effect relationships.
Continuous-time | Time advances in constant and small steps, as a continuously differentiable function.
Deterministic | Given a specific input, the output will be the same for every simulation run.
Discrete-time | Timeless steps interleaved with user-defined simulated durations. Timeless execution does not mean that the code in one step takes zero time to execute; it means that the model time used in discrete simulations is frozen during the step. Basically, the state space is built by observing all possible options of what can happen at the next time step.
Dynamic | Model parameters are less reliant on highly precise numerical data and are likely to exceed historic ranges in any case. Cause-effect relationships constantly interact while the model is being executed.
Envelope functions | Given interval bounds on the values of some landmarks and envelopes on the monotonic functions, the QDE defines a constraint-satisfaction problem (CSP).
Extensible | Provides explicit extension points through input and output ports. If another model has a compatible input or output port, the model can be extended.
Feedback loops | A continuous flow of information within the system, with a self-correcting property.
Formal | The structural relations between variables must be explicitly and precisely defined.
Fuzzy variables | Use of fuzzy logic in defining model (input and output) variables.
Hierarchical | Structured in different levels and blocks.
Interactive | Users are responsible for generating control signals or data out of external entities. Users can also change system states interactively to set up special conditions and to debug and locate the cause of unexpected behavior.
Knowledge Representation | Use of knowledge modeling techniques (for instance, cognitive maps).
Nonlinear interactions | Interactions between variables that do not follow a linear function.
Pi-Calculus | Uses a rigorous semantics described in the pi-calculus formalism.
Process-based | A discrete model described as a workflow, as opposed to event-driven.
Qualitative Abstraction | Qualitative abstraction (QA) of the empirical data transforms a sequence of measurements into a pattern.
Qualitative Differential Equations | Parameters of the differential equations do not need to be specified as real numbers; it is sufficient to know whether they are positive or negative, specifying them simply as monotonically increasing or decreasing. Expresses natural types of incomplete knowledge about the real world.
Queue models | Each stage is modeled as a number of servers, where every server has its own (resource) queue.
Rule-based | The rule part of the model determines how the state is changed at every time step. Each rule consists of a condition part and an action part.
Scenario-based | Enables the user to test the effects of several combinations of events on the process. It is a model that allows a user to define several different scenarios for the system or process.
State based on events | The state of the system is changed only when certain events occur and is not changed between these events.
State-based | Processes an input event based on its state and conditions, generating an output event and changing its state (state transition).
Stochastic | Instead of assigning deterministic values to model parameters and variables, these values can be sampled from plausible input distributions (see the sketch after this table).
Synchronous | Opposite of asynchronous.
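As a concrete illustration of the "Deterministic" and "Stochastic" characteristics above, the sketch below wraps a toy deterministic project-duration model in a Monte Carlo loop that samples its parameters from input distributions and summarizes the resulting output distribution. The model, the distributions, and their parameters are arbitrary assumptions for illustration, not data or relationships extracted from the reviewed papers.

import random
import statistics

# Sketch of the "Stochastic" characteristic: each run samples parameter values
# from plausible (here, arbitrarily chosen) input distributions, and the output
# is summarized over many runs.

def project_duration(total_tasks, staff, productivity):
    """Toy deterministic core: weeks needed to finish all tasks."""
    return total_tasks / (staff * productivity)

def monte_carlo(runs=1000, seed=42):
    rng = random.Random(seed)
    durations = []
    for _ in range(runs):
        staff = rng.triangular(4, 8, 6)             # people
        productivity = rng.normalvariate(2.0, 0.3)  # tasks/person/week
        productivity = max(productivity, 0.5)       # guard against non-physical draws
        durations.append(project_duration(400.0, staff, productivity))
    return durations

if __name__ == "__main__":
    d = monte_carlo()
    print(f"mean duration  : {statistics.mean(d):.1f} weeks")
    print(f"std deviation  : {statistics.stdev(d):.1f} weeks")
    print(f"95th percentile: {sorted(d)[int(0.95 * len(d))]:.1f} weeks")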


The coverage of these characteristics by the simulation approaches is presented in Figure 6.

According to Figure 6, Discrete-Event and System Dynamics are the approaches that cover (or at least were described as covering) most of the characteristics found in the set of papers selected for this quasi-systematic review; in addition, the Hybrid Simulation approach covers 24.14%, since it is described in terms of the characteristics of the two previous approaches and of how they are combined. The fact that the majority of papers describe models based on these two approaches may explain this in part: the probability of finding a better characterization of these approaches is substantially larger than for the others. The remaining approaches, in general, cover only their specific characteristics, since that is what is highlighted in the papers containing models based on them.

Table 4 illustrates this coverage by mapping each characteristic (rows) to the simulation approaches (columns), according to the findings in the analyzed papers. The numbers indicate how many times each mapping was found. For instance, the characteristic "Nonlinear Interactions" is mentioned in five research papers describing or using a model based on the System Dynamics approach.

Figure 6. Approaches coverage over characteristics


Table 4. Characteristics × Simulation Approach Mapping

Coverage per characteristic (%): Analytic 5.26; Asynchronous 5.26; Bi-directional Simulation 5.26; Causal Relationships 10.53; Continuous-time 26.32; Deterministic 5.26; Discrete-time 26.32; Dynamic 31.58; Envelope functions 5.26; Extensible 15.79; Feedback loops 15.79; Formal 10.53; Fuzzy variables 5.26; Hierarchical 10.53; Interactive 5.26; Knowledge Representation 10.53; Nonlinear interactions 5.26; Pi-Calculus 5.26; Process-based 10.53; Qualitative Abstraction 5.26; Qualitative Differential Equations 10.53; Queue models 5.26; Rule-based 5.26; Scenario-based 5.26; State based on events 15.79; State-based 10.53; Stochastic 52.63; Synchronous 5.26.

Coverage per simulation approach (%): ABS 7.14; CG 7.14; DEVS 42.86; GCS 10.71; GDS 7.14; HS (Cont + Discrete) 25.00; HS (PN + DEVS) 3.57; KBS 7.14; MCS 3.57; Not Specified 14.29; OOS 0.00; Proxel-based 3.57; QS 10.71; qCS 14.29; SQS 7.14; SBS 7.14; SPA 7.14; SD 39.29; TPA 7.14.


ABS – Agent-Based Simulation CG – Conditional Growth DEVS – Discrete-Event GCS – General Continuous Simulation GDS – General Discrete Simulation HS (Cont + Discrete) – Hybrid Simulation (Continuous + Discrete) HS (PN + DEVS) – Hybrid Simulation (Petri Net + Discrete-Event) KBS – Knowledge-Based Simulation MCS – Monte Carlo Simulation

OOS – Object-Oriented Simulation QS – Qualitative Simulation qCS – quasi-Continuous Simulation SQS – Semi-Quantitative Simulation SBS – State-Based Simulation SPA – Stochastic Process Algebra SD – System Dynamics TPA – Temporal Parallel Automata


Among the selected papers, none gives a classification or taxonomy for simulation models. However, Raffo (2005) states that software process simulation models rely on three main paradigms: discrete-event, state-based, and system dynamics; on that occasion, some characteristics are only mentioned.

In a systematic review about software process simulation, Zhang et al. (2008) also present the simulation "paradigms" for each paper/model selected in their study, but no characteristics are discussed.

3.5 Verification and Validation (V&V) Procedures for Simulation Models

Any simulation model based on the observation of a real-world system or process needs to be validated in order to ensure a minimum degree of confidence in its output results and compliance with the structure and behavior of the observed system or process. In an attempt to reach such validity, the reported simulation studies describe, or at least mention, some of the procedures shown in Figure 7.

Figure 7. Verification and Validation Procedures Distribution

Table 5 presents a brief description of each V&V procedure found in this review.


Table 5. Description of V&V Procedures

Procedure | Description
Comparison against actual (dataset) results | Consists in comparing the simulation output results against the actual output of the same phenomenon. This procedure is typically used for measuring model accuracy.
Comparison against data from the technical literature | Consists in comparing the simulation output results against output (performance) data from other studies reported in the technical literature. The studies should have the same goals. Typically used when no complete dataset is at hand.
Comparison against reference behaviors from the technical literature | Consists in comparing the simulation output results against trends or expected results often reported in the technical literature. Typically used when no comparable data is available.
Comparison against other models' results | Consists in comparing the simulation output results of one simulation model against those of another model. Controlled experiments can be used to arrange such comparisons.
Review with experts | Consists in getting feedback from system or process experts in order to evaluate whether the simulation results seem reasonable. This review may be performed using any method, including inspections. Typically used for model validation purposes.
Interview with experts | Consists in getting feedback from system or process experts through interviews in order to evaluate whether the simulation results seem reasonable. Typically used for model validation purposes.
Survey with experts | Consists in getting feedback from system or process experts through surveys in order to evaluate whether the simulation results seem reasonable. Typically used for model validation purposes.
Testing model structure and behavior | Consists in submitting the simulation model to several test cases and evaluating the responses and traces. Typically used for model verification purposes.
Based on empirical evidence from the technical literature | Consists in collecting evidence from the technical literature (reports of experimental studies) to develop the simulation model.

The first alternative chosen seems to be the comparison of simulation results against actual data, i.e., this is the most common procedure found in the technical literature. It is a reasonable way to verify what we call "model accuracy" in terms of output results, but many other threats to study validity should also be evaluated.
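A minimal sketch of what such an accuracy check could look like is given below: paired simulated and actual values of the same output variable are summarized with the mean magnitude of relative error (MMRE). The effort figures are invented placeholders and the metric choice is an assumption; it only makes sense when, as discussed next, both series come from the same measurement context.

# Sketch of the "comparison against actual (dataset) results" procedure: given
# paired simulated and actual observations of the same output variable, simple
# relative-error summaries give one view of what the text calls model accuracy.
# The numbers below are invented placeholders, not data from any reviewed study.

def mean_magnitude_relative_error(actual, simulated):
    """MMRE: average of |actual - simulated| / actual over all paired points."""
    errors = [abs(a - s) / a for a, s in zip(actual, simulated) if a != 0]
    return sum(errors) / len(errors)

if __name__ == "__main__":
    actual_effort    = [120.0, 95.0, 210.0, 60.0]   # e.g., person-months per project
    simulated_effort = [130.0, 90.0, 180.0, 75.0]
    mmre = mean_magnitude_relative_error(actual_effort, simulated_effort)
    print(f"MMRE = {mmre:.2%}")   # smaller is better; thresholds are study-specific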

In order to make such comparisons, as in the early 1990s when Abdel-Hamid and Madnick applied their model to several different environment configurations (as mentioned in Section 3.2), it is important to check whether the collected data capture the whole context (influence variables and constants) that the model requires or assumes to hold; in other words, model parameters and variables should share the same measurement context. Otherwise, it will be a naive comparison between two contexts distinct enough that they cannot be compared.


The same problem occurs when simulation output results are compared against data from the technical literature; the latter is seldom available in enough detail to allow valid comparisons. Thus, it is very difficult to ensure the same configuration (input and calibration parameters) for both the simulation results and the data collected from the technical literature. Ambrosio et al. (2011) use two procedures for model validation: a comparison against actual data from a software company, and a comparison against summarized data from different sources extracted from the technical literature.

The lack of data is a known problem in Software Engineering simulation [Raffo et al, 1999], including support for experimentation [Garcia et al, 2005]. So, different efforts are needed to increase the validity of simulation studies. An interesting approach is the comparison of simulation results against known reference behaviors in some research areas, such as Brooks' Law for software projects, Lehman's laws of software evolution, or any repeatable process/product behavior in a software organization. In this case, it is possible to analyze whether the simulation model is capable of reproducing consistent results, even though accuracy cannot be measured with this procedure. Setamanit et al. (2007) compare the output results of a global software development simulation model against behaviors of GSD projects as described in the technical literature.

Once we have validated models, or at least know them in terms of performance and accuracy, it is possible to compare them. It would be good to have benchmark results and available datasets to perform controlled experiments aimed at comparing models, establishing distinct treatment and control groups in order to test hypotheses about the influence relationships between independent and dependent variables.

When the data and reference behaviors needed for the previous V&V procedures are unavailable, it is still useful to conduct reviews, interviews, and surveys with experts in simulation and in the system under study. These kinds of procedures help in better understanding the simulation model structure and in getting insights to improve it. They are more like validation procedures, since model validation gets the customer involved. Choi et al. (2006) mention a feedback review with experts for the verification of a UML-based simulation model for mission-critical real-time embedded system development. Setamanit and Raffo (2008) calibrated their model based on information from survey questionnaires as well as in-depth interviews with the technical director, the project manager, and software developers.

Testing model structure and behavior consists in applying several test cases to the simulation model. No paper gives details about how to plan and perform these tests.
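One possible, hypothetical shape for such tests is sketched below: the simulation model is exercised through plain assertion-based test cases that check qualitative properties (conservation of scope, monotonicity in staff size). The toy model and the chosen properties are assumptions for illustration; they are not taken from any reviewed study.

# One possible shape for "testing model structure and behavior", sketched as
# plain unit tests: the model is exercised with inputs whose expected qualitative
# behavior is known, and assertions check invariants.

def simulate_completed(total_tasks, staff, productivity, weeks, dt=0.25):
    remaining, completed, t = total_tasks, 0.0, 0.0
    while t < weeks and remaining > 0:
        rate = min(staff * productivity, remaining / dt)
        remaining -= rate * dt
        completed += rate * dt
        t += dt
    return completed

def test_conservation():
    # Completed work never exceeds the total scope.
    assert simulate_completed(400, 5, 2.0, weeks=200) <= 400 + 1e-9

def test_monotonic_in_staff():
    # More staff should never finish less work in the same time window.
    low  = simulate_completed(400, 3, 2.0, weeks=10)
    high = simulate_completed(400, 6, 2.0, weeks=10)
    assert high >= low

if __name__ == "__main__":
    test_conservation()
    test_monotonic_in_staff()
    print("all behavioral checks passed")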

One of the most important "V&V procedures" should be considered before the model is conceived; it does not work as a test, but it brings confidence to the simulation model: the model-building methodology should be based on empirical evidence from the technical literature, preferably on controlled experiments, which can support conclusions about hypotheses involving the simulation model variables. In any case, using the results of well-conducted studies (such as controlled experiments, surveys, and case studies) in the conception of simulation models is better than observing the system or process to be simulated in an ad-hoc manner. Melis et al. (2006) presented results from a series of experiments and case studies about pair programming and test-driven development; these results were used both to determine variable relationships and the equations of a System Dynamics model.

Figure 8 shows how simulation approaches cover the procedures mentioned before.



Figure 8. Simulation Approaches Coverage over V&V Procedures

All procedures found in the papers reporting simulation studies seem to be applicable to System Dynamics models; this possibly means that the approach has reached a high degree of maturity, covering several ways to verify and validate a model built under it. Or, maybe, it is only a matter of a larger sample of studies using System Dynamics. On the other side, there are approaches with no reported attempt at verification or validation. The evaluation settings, in terms of the studies performed in the analyzed papers, are discussed further below.

From another perspective, Figure 9 presents the coverage of V&V procedures over simulation approaches.


Figure 9. V&V Procedures Coverage over Simulation Approaches

According to Figure 9, the procedure "Comparison against actual results" covers the largest set of simulation approaches, with 31.58%. It is a common procedure, but in many cases it seems to be used inappropriately, since the data come from two distinct contexts and a comparison cannot be established.

Also, "Testing structure and model behavior" and "Based on empirical evidence from the technical literature" are the two other procedures with greater coverage, each reaching 26.32% of the simulation approaches. The former is presented as an evaluation of the simulator (tool), but not of the model itself. On the other hand, evidence from the literature is directly related to the simulation model, i.e., to the relationships among variables as well as to the equations derived from collected data.


Coverage mapping between V&V procedures and simulation approaches:

Coverage per V&V procedure (%): Comparison against actual (dataset) results 31.58; Comparison against data from the technical literature 10.53; Comparison against reference behaviors from the technical literature 15.79; Comparison against other models' results 21.05; Review with experts 15.79; Interview with experts 5.26; Survey with experts 15.79; Testing model structure and behavior 26.32; Based on empirical evidence from the technical literature 26.32.

Coverage per simulation approach (%): ABS 22; CG 0; DEVS 44; GCS 11; GDS 0; HS (Cont + Discrete) 56; HS (PN + DEVS) 22; KBS 0; MCS 11; Not Specified 33; OOS 11; Proxel-based 0; QS 22; qCS 11; SQS 0; SBS 0; SPA 0; SD 100; TPA 11.

ABS – Agent-Based Simulation CG – Conditional Growth DEVS – Discrete-Event GCS – General Continuous Simulation GDS – General Discrete Simulation HS (Cont + Discrete) – Hybrid Simulation (Continuous + Discrete) HS (PN + DEVS) – Hybrid Simulation (Petri Net + Discrete-Event) KBS – Knowledge-Based Simulation MCS – Monte Carlo Simulation

OOS – Object-Oriented Simulation QS – Qualitative Simulation qCS – quasi-Continuous Simulation SQS – Semi-Quantitative Simulation SBS – State-Based Simulation SPA – Stochastic Process Algebra SD – System Dynamics TPA – Temporal Parallel Automata


3.6 Simulation Output Analysis

After executing simulation runs, a large amount of data is generated and needs to be properly analyzed. The analysis of simulation output results is a challenging research area; more about it can be found in [Alexopoulos, 2007] and [Law, 2007]. In this section, we are interested in presenting the statistical instruments (namely charts, tests, and metrics) often used to analyze simulation output results. Output analyses are mostly based on charts; statistical procedures such as hypothesis tests are rarely used. This may be due to the lack of a rigorous and systematic approach for simulation-based studies. Figure 10 shows the instruments found for simulation output analysis.

Figure 10. Simulation Output Results Analysis Instruments

By far the most frequent analysis instrument in Figure 10 is the sequence run chart. This chart can be used for both discrete and continuous data and represents time-sensitive data well, showing increasing time on its x-axis. These characteristics may partly explain its wide adoption. Figure 11 shows almost the same chart as Figure 10, but excluding the sequence run chart in order to better analyze the other occurrences.


Figure 11. Simulation Output Results Analysis Instruments without “Sequence Run Chart”

A curious case happens with the sensitivity analysis technique: the number of models is greater than the number of papers. This is because some papers promote the use of this technique to analyze and understand simulation models, identifying the most relevant model factors and input parameters. In these cases, the authors apply sensitivity analysis to more than one simulation model, making comparisons and presenting different situations that can be found, for example, when models use too many input parameters but only a few really contribute to the results. Examples of studies involving sensitivity analysis are [Houston et al, 2001], using four models, and [Wakeland et al, 2004], using just one model.

The majority (13 out of 22) of these instruments are statistical charts, and they are also the ones most often found in the papers. They are, of course, a relevant way of presenting data, but the significance of results should never be analyzed by looking at charts alone. Other statistical instruments are needed, namely significance tests and systematic analysis procedures (such as sensitivity analysis); these instruments are still underused and misused. Even descriptive statistics have not received the needed attention, despite being good sample summaries.
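As an example of the kind of inferential summary that could complement charts, the sketch below replicates two configurations of a toy stochastic model and reports the difference between mean outputs with an approximate 95% confidence interval (normal approximation). The model, the configurations, and the number of replications are illustrative assumptions only.

import random
import statistics
from statistics import NormalDist

# Sketch of a simple significance-oriented analysis of simulation output: two
# configurations are replicated independently and the difference between mean
# outputs is reported with an approximate 95% confidence interval.

def one_run(rng, staff):
    productivity = max(rng.normalvariate(2.0, 0.3), 0.5)  # tasks/person/week
    return 400.0 / (staff * productivity)                 # project duration (weeks)

def replicate(staff, runs=30, seed=1):
    rng = random.Random(seed)
    return [one_run(rng, staff) for _ in range(runs)]

if __name__ == "__main__":
    a = replicate(staff=5, seed=1)   # configuration A
    b = replicate(staff=6, seed=2)   # configuration B
    diff = statistics.mean(a) - statistics.mean(b)
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    z = NormalDist().inv_cdf(0.975)  # ~1.96, normal approximation for ~30 replications
    print(f"mean difference: {diff:.2f} weeks")
    print(f"95% CI: [{diff - z * se:.2f}, {diff + z * se:.2f}] weeks")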

Figure 12 presents the simulation approaches' coverage over statistical analysis instruments across the selected papers.


Figure 12. Simulation Approaches Coverage over Statistical Analysis Instruments

System Dynamics covers, as in the other perspectives, the largest share of occurrences, with 66.67%, followed by Hybrid Simulation (which also uses System Dynamics) and Discrete-Event approaches, with 42.86% and 38.10%, respectively. Again, it seems that the number of papers and models reporting these approaches biased the results. However, we can still perceive a low use of analysis instruments and methods per paper or model for approaches with a small number of reported papers: a coverage factor of 4.76% or 9.52% means that these papers present only one or two analysis instruments. Thus, only seven simulation approaches have applied more than two analysis instruments and, considering the complexity of simulation output results, this is likely not enough.

3.7 Study Strategies involving Simulation

A common motivation to use and develop simulation models is to provide a basis for experimentation. Simulation allows a researcher to estimate the behavior of an existing system under given conditions and to maintain much better control over experimental conditions [Wu and Yan, 2009]. However, in many cases simulation models have not been used as experimental instruments.

Among the 108 papers selected in this quasi-systematic review, just 57 present primary studies. Moreover, there is a misunderstanding in classifying them into study strategies such as case studies, experiments, and others. Most of them are only examples of use (assertions or informal feasibility studies [Zelkowitz, 2007]), with no systematic methodology for planning, executing, and analyzing the study.

As we could not find any taxonomy or classification schema specific to simulation-based studies, we analyzed the studies found from the perspective of the study strategies known in the experimental software engineering research area. First, we present the terminology used and, then, we apply this terminology to our findings.

For the understanding and classification of primary studies, it is important to be aware of the level of control in experimental studies, and this is also important in the context of simulation. Travassos and Barros (2003) present a four-stage classification of empirical studies in Software Engineering concerning their environment and participants:

In vivo: studies involving subjects in their own environments;

In vitro: studies executed in a controlled environment, such as a laboratory or a controlled population;

In virtuo: studies involving interaction among subjects and a computerized model of reality;

In silico: studies characterized for both subjects and real world being described as computer models.

Figure 13. Software Engineering Primary Studies Classification

As presented in Figure 13, the setting of environment and participants of each study strategy impacts the level of control, risk, and cost. However, for our scope, simulation-based studies, we are interested only in the in virtuo and in silico studies, where the threats to control, the risk, and the cost are claimed to be low, and the need for SE knowledge to be high.

Some confusion can be observed about what control means in simulation studies. The explicit variation of input parameters (often used in techniques such as sensitivity analysis) does not mean that a model user has control over the object under investigation. In doing so, he or she is just establishing a parameter setting or configuration, which is different from controlling factors (independent variables) in a controlled experiment arrangement.

There is a slight difference between understanding a model's behavior and arranging a valid experimental design. When the variation of input parameters is performed in an ad-hoc manner, intending only to understand the model behavior through the corresponding impact on the model output variables, one has no control over the simulation model. In other words, the output values are not yet meaningful in a real context, since one cannot assume that trustworthy or valid output values are being generated without previously observing the actual behavior, for instance in in vivo or in virtuo contexts, as pointed out in Figure 13.

Unless the variation of the model input parameters changes the model behavior, what has been done is just to determine the value of a dependent variable on the model's behavior curve/function/equation, given an independent variable value.


The same concept applies to a study comparing two models: one that has been built and evaluated against actual data collected from the system it abstracts, and a new model based on the first but adding new modules/features without actual data support. If both models are run and compared on the same dataset, this use of a shared dataset can be a threat to the construct and conclusion validity of the study. The explanation is that the output values of the modified model do not come from the same measurement context as those of the former model, so they are not comparable.

Applying this concept of control to the different study strategies, we took as a starting point the glossary (http://lens-ese.cos.ufrj.br/wikiese/) and the ontology (http://lens.cos.ufrj.br/esee/), both proposed by Lopes (2008), in order to classify the simulation studies observed in this review. Below, we present the research strategies taken from the ontology and their respective definitions from the glossary:

Action Research: "Action research is a form of action inquiry that employs recognized research techniques to inform the action taken to improve practice."
o Characteristics: applied in a real-life, non-controlled environment; researcher intervention; collaborative action with subjects.
o In simulation: not applicable, if not a case study.

Survey: "A comprehensive research method for collecting information to describe, compare or explain knowledge, attitudes and behavior. A survey is often an investigation performed in retrospect, when, for example, a tool or technique has been in use for a while."
o Characteristics: information gathering; retrospective; no control.
o In simulation: to run simulations based on quantitative historical data without changing model variables.

Case Study: "A case study is an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between the phenomenon and context are not clearly evident (Yin, 2003). Data is collected for a specific purpose throughout the study. It is normally aimed at tracking a specific attribute or establishing relationships between different attributes."
o Characteristics: real-life context/environment; unclear boundaries and scope; data gathering with a specific purpose; low level of control.
o In simulation: to run simulations based on current, quantitative, real-life context data in order to investigate a specific purpose.

Observational Study: "Observational study collects relevant qualitative, sometimes quantitative, data as a project develops. There is relatively little control over the development process."
o Characteristics: data gathering; low level of control.
o In simulation: not applicable.

Controlled Study: "A controlled experiment is an investigation of a testable hypothesis where one or more independent variables are manipulated to measure their effect on one or more dependent variables. Controlled experiments allow us to determine in precise terms how the variables are related and, specifically, whether a cause-effect relationship exists between them."
o Characteristics: high level of control; hypothesis testing; cause-effect relationships among variables.
o In simulation: comparison of the output variables of two distinct models (simulation or analytical).



So, we tried to apply these concepts from the external technical literature (i.e., not captured by this review) to simulation-based studies, classifying them into: survey (retrospective data), case study (current real data), and controlled experiment.

In addition to this classification scheme, we propose a set of information for characterizing simulation-based studies, according to Table 6.

Table 6. Simulation-Based Studies Characterization

Perspective | Possible values
Number of models | 1, 2, …, N
Number of datasets | 1, 2, …, N
Dataset | Project data (historical or current); artificial data (example or systematically generated)
Input parameters | Determined in an ad-hoc way or in a systematic way; constant or variable
Model calibration | Calibrated; non-calibrated
Study procedure | Comparison against other or modified models' results; variation of input parameters to observe the impact on response variables; just execute simulation runs

We obtained some statistics by characterizing the studies found in our quasi-systematic review according to the terminology already presented. As shown in Figure 14(A), survey studies cover more than half of the studies. It is important to remember that these surveys were not designed as collections of experts' opinions using forms as instruments, but as surveys of past (historical) project data through simulation runs, using a simulation model as the instrument. It is more like "asking" the simulation model (built with retrospective data) for the values of output variables given a configuration of input parameters.

Several authors call their studies an "experiment". In fact, these studies are what the technical literature calls "simulation experiments", which is different from "controlled experiments" as defined before and as meant by the term "Experiment" in Figure 14(A). By "simulation experiment" we mean a "test or a series of tests in which meaningful changes are made to the input variables of a simulation model so that we may observe and identify the reasons for changes in the performance measures" [Maria, 1997], and this definition is closer to what we called "survey". In any case, it is difficult to identify hypotheses and experimental designs in these reports, and also difficult to identify control and treatment groups in the controlled experiments.

Survey studies are proportional to the procedures shown in Figure 14(B), where "variation of input parameters to observe the impact on output variables" is the procedure adopted to survey the simulation model. The same interpretation can be applied to the percentage of controlled experiments and the procedure "comparison against other or modified models' results". These comparisons were, for the most part, made within a particular experimental design, using distinct datasets or the same dataset, in which a treatment and a control group were established to allow a fair comparison among the models involved.



Figure 14. Characterization of simulation-based studies

Another interesting characteristic of these studies is related to how the input parameters were determined for the study (Figure 14(C)). They can be determined in four ways:

Systematic constant: there is a pre-defined procedure to generate or choose these values, which remain the same for the whole simulation study; it can also include multiple runs.

Systematic variable: there is a pre-defined procedure to generate or choose these values, which are varied at different instants of the simulation study; it can also include multiple runs. Sensitivity analysis, as in [Houston et al, 2001], can be included in this category.

Ad-hoc constant: there is no pre-defined procedure to generate or choose these values, which remain the same for the whole simulation study; it can also include multiple runs.

Ad-hoc variable: there is no pre-defined procedure to generate or choose these values, which are varied at different instants of the simulation study; it can also include multiple runs. A common pattern is to set a new parameter value at some time t during the simulation run.

Model calibration is also an important characteristic of simulation studies [Ören, 1981]. Figure 14(D) shows that many studies do not report this kind of information. Garousi et al. (2009) presented an experiment using two distinct calibration parameter sets, and also discussed the calibration, in order to understand the impact of V&V activities on project performance. Many of the simulation studies with no calibrated model are based on artificial data.
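A minimal sketch of what a calibration step can look like is shown below: a free parameter (average productivity, in this toy model) is chosen by grid search so that simulated durations best match a small set of historical observations in the least-squares sense. The data, the model, and the parameter grid are invented for illustration and do not reproduce any study analyzed here.

# Sketch of a simple calibration step: a free parameter is chosen so that
# simulated durations best match historical project data, by minimizing
# squared error over a grid. Data and model are illustrative assumptions.

HISTORICAL = [            # (total_tasks, staff, observed duration in weeks)
    (400.0, 5.0, 44.0),
    (250.0, 4.0, 33.0),
    (600.0, 8.0, 40.0),
]

def simulated_duration(total_tasks, staff, productivity):
    return total_tasks / (staff * productivity)

def calibrate(grid=None):
    grid = grid or [p / 100 for p in range(100, 301, 5)]   # 1.00 .. 3.00
    def sse(p):
        return sum((simulated_duration(t, s, p) - obs) ** 2 for t, s, obs in HISTORICAL)
    return min(grid, key=sse)

if __name__ == "__main__":
    best = calibrate()
    print(f"calibrated productivity: {best:.2f} tasks/person/week")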

The distribution of simulation studies over simulation approaches is presented in Figure 15. The vast majority of simulation-based studies use System Dynamics models. There are also replications of simulation studies, as already mentioned; replications are almost always performed by the same authors. One main contributing factor is that, currently, the reported data are not enough to replicate a study. Some missing information in simulation-based study reports will be commented on later in this section.

Figure 15. Distribution of Simulation Studies over Simulation Approaches

Analogously to the simulation approaches, Figure 16 presents the studies distributed over Software Engineering domains.

Figure 16. Distribution of Simulation Studies over SE Domains


As with the System Dynamics approach, Software Project Management is the most studied domain among the selected simulation studies. As explained for the other perspectives (characteristics, V&V procedures, analysis instruments, and others), System Dynamics models for Software Project Management can be found in most papers selected in this quasi-systematic review. So, we may also conclude that these are the simulation-based studies with the most accumulated experience, and that it is possible to learn from their maturity. Perhaps the desire to explain the whole dynamics of software development led to this focus; however, study purposes and goals differ too much for this to explain it all.

In the analyzed papers, the purpose and goals of the simulation models are not clearly defined, and the same happens with the goals of the performed studies. It is very common to find descriptions mentioning only the problem in which the proposed simulation model is involved, but very hard to find specific or structured research questions, hypotheses, or a GQM approach, for example. At first impression, one might believe that the models were conceived before the problems arose. That is probably not true, but the way simulation-based studies have been reported leads to this kind of conclusion.

Another issue found in simulation study reports is related to the experimental design: in general, it is not reported at all. It is possible, though hard, to identify factors (often just parameters) and response variables for the studies in which the output data is presented at least in charts. The experimental arrangement is rarely reported, i.e., answers to simple questions such as: what are the treatments/levels for each experimental factor? What are the other model input parameters (context variables)? Do they remain constant? What were their initial values for each simulation run? Even simple information such as the number of simulation runs is seldom reported and, when it is, no explanation is given for why such a number was used.

These problems should be considered of major importance because, without addressing them, replicating and auditing these studies (or even verifying their results) becomes unfeasible, as does comparing study results and benchmarking models, since there is no comparable baseline.

There are some exceptions among simulation-based studies, as in [Houston et al, 2001] and [Wakeland et al, 2004], but the vast majority rely on proof-of-concept-like studies, consisting of ad-hoc experimental designs that do not comply with any systematic experimental methodology and often miss relevant information in their reports. Houston et al. used DOE (Design of Experiments) to measure the relative contribution of each factor to the variation in the response variables, in order to characterize the behavior of System Dynamics simulation models. Wakeland et al. proposed the use of DOE and BRSA (Broad Range Sensitivity Analysis) to understand the interactions and nonlinear effects at work in the model, i.e., the model logic and behavior, thereby leading to a better understanding of the underlying system/process.
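To illustrate the flavor of such DOE-based analyses, the sketch below runs a toy stochastic model over a 2^k factorial arrangement (two factors at two levels each) and estimates the main effect of each factor on the mean response. The factors, levels, replication count, and the model itself are illustrative assumptions, not the designs used by Houston et al. or Wakeland et al.

import itertools
import random
import statistics

# Sketch of a 2^k factorial arrangement: two factors of a toy stochastic model
# are run at low/high levels, and the main effect of each factor on the mean
# response is estimated. Factors, levels, and the model are assumptions.

LEVELS = {"staff": (4, 8), "productivity": (1.5, 2.5)}

def response(staff, productivity, rng, runs=20):
    outs = [400.0 / (staff * max(rng.normalvariate(productivity, 0.2), 0.5))
            for _ in range(runs)]
    return statistics.mean(outs)            # mean project duration (weeks)

def main_effects(seed=7):
    rng = random.Random(seed)
    design = list(itertools.product((0, 1), repeat=len(LEVELS)))   # coded levels
    names = list(LEVELS)
    results = {}
    for cell in design:
        setting = {n: LEVELS[n][c] for n, c in zip(names, cell)}
        results[cell] = response(rng=rng, **setting)
    effects = {}
    for i, name in enumerate(names):
        high = statistics.mean(r for c, r in results.items() if c[i] == 1)
        low  = statistics.mean(r for c, r in results.items() if c[i] == 0)
        effects[name] = high - low
    return effects

if __name__ == "__main__":
    for factor, effect in main_effects().items():
        print(f"main effect of {factor}: {effect:+.1f} weeks")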


4 Conclusions

This section presents some concluding remarks taken from the quasi-systematic review performed. First, we discuss the threats to the validity of this review. Then, we present the open questions that emerged after analyzing the characterization results. Finally, we summarize the current state of the art and future perspectives.

4.1 Threats to validity

It is possible to identify threats and risks to the validity of this quasi-systematic review in several of the activities performed. We tried to reduce them as much as possible as they were identified. Here, we discuss each threat to the validity of our work.

Keywords. Terminology problems have been reported in many research areas involving computer science and engineering, including computer simulation and software engineering. To establish the set of keywords used to compose our search string, we relied on a previously performed ad-hoc review. We also submitted the keywords to two experts in computer simulation applied to Software Engineering in order to minimize the absence of any unknown term. We did not use specific terms for each software engineering domain (such as testing, design, requirements, inspections, and others); instead, we used general terms to represent the Software Engineering area. This may be a threat to our study validity, since it is possible that papers mention only domain-specific terms rather than general software engineering terms.

Sources Selection. Scopus, Ei Compendex, and Web of Science (ISI Web of Knowledge) encompass the main publication databases (including journals and conference papers) for computer simulation and software engineering research. Examples of such databases include ACM, IEEE, Elsevier, Springer, and Wiley. In these databases we can find papers from journals such as SIMULATION, Simulation Modelling Practice and Theory, Journal of Systems and Software, Software Process Improvement and Practice (now incorporated into the Journal of Software Maintenance and Evolution: Research and Practice), IEEE Transactions, and LNCS, and from conferences such as ICSSP (and its former versions), ICSE, and ICST, among many others. For characterization purposes, a sample of the most important technical literature seems to be enough, since it is not feasible to review all research papers published on this subject.

Inclusion and Exclusion Criteria. We tried to filter out, as much as possible, papers not concerned with simulation applied to the Software Engineering field. Nine control papers (papers that should compose the answer to the research question) were pre-selected from the ad-hoc literature review and according to the authors' experience. The coverage of the search string with respect to our control papers was about 67%. Of the relevant papers selected after applying the inclusion and exclusion criteria, 14 were unavailable for download.

Personal Understanding and Bias. Although we know that conducting quasi-systematic reviews may impose a lot of extra, manual, error-prone work [Dyba et al, 2007], three reviewers were involved in order to reduce bias in the selection and in the information extraction (guided by the information extraction form).

Classification. Since there is no consensual taxonomy or classification schema, we based our classification on the information presented by each paper, which may cause inaccurate classifications. We tried to group terms (simulation approaches, characteristics, analysis and V&V procedures) with semantically similar definitions or descriptions.

Conclusions. Another limitation of this study is publication selection bias, since publications rarely contain negative results or present their weaknesses. Considering all these threats and the care taken to reduce them, we believe our quasi-systematic review was performed systematically, which brings some confidence to its results.

4.2 Open Questions

Some questions still remain after running this quasi-systematic review. They are listed as follows:

Which kind of method or procedure can reduce the gap between what is observed in Software Engineering systems and processes and what has been modeled?

What could be considered a minimum set of V&V procedures that, when successfully implemented, could bring confidence to a simulation model?

What are the requirements for replicating simulation studies?

4.3 State of the Art and Future Directions

We organize this section based on the results found in this review and on the open questions presented in the previous section (4.2).

Simulation approaches tend to be mentioned and, in many cases, described. It is important to give an overview of the model's underlying approach, since it provides useful background on how the real system or process was abstracted and on the execution mechanism used to drive the simulations. Unfortunately, a few published simulation models use an approach that is not clearly defined (what we called "Not Specified"), and it is difficult to understand a simulation model developed without considering a standard abstraction and behavior.

Domains also appear to be clearly defined, and simulation in software engineering tends to be biased towards the System Dynamics approach and towards Software Process and Project Management models.

It was not possible to capture the motivation for using a specific simulation approach. Perhaps such motivation lies in the perspective from which modelers want to analyze the systems or processes, or in the fact that some kinds of problems are related to a given simulation approach. We tried to relate simulation approaches to the software engineering domains and characteristics found, but the only thing we could conclude is that model characteristics are simply driven by the simulation approaches. Since we were not able to clearly capture and group the problems and purposes of the simulation models, this remains a hypothesis.

Simulation studies comprise the main concern of our results. There is a lack of rigor in planning studies (mainly experimental design issues), in assuring model validity before performing studies, and in the analysis procedures for simulation output data. All these issues are treated, most of the time, in an ad-hoc fashion, except for a few studies that present a systematic way of doing one or another activity, but never all of them together.

Following this reasoning about simulation studies, we believe that methodologies encompassing everything from the planning to the reporting of results of simulation-based studies, passing through model validity assurance and output analysis, can be the next direction and an important problem to be solved in the Software Engineering field. Such a methodology should also consider peculiarities of Software Engineering, such as lack of data and measurement issues.

Improvements derived from such a methodology can highlight the requirements for replicating simulation studies: once a study is done in a systematic way, it should be repeatable.


Another possible direction, which is in fact already underway, is the proposal of concrete methods for developing simulation models in software engineering. Some specific approaches have been proposed in the literature, for instance the IMMoS (Integrated Measurement, Modelling, and Simulation) methodology for the development of System Dynamics models for software process simulation [Pfahl and Ruhe, 2002].


5 References

Abdel-Hamid, T. Understanding the "90% syndrome" in software project management: A simulation-based case study The Journal of Systems and Software, 1988, 8, 319-33.

Abdel-Hamid, T. The economics of software quality assurance: A simulation-based case study MIS Quarterly: Management Information Systems, 1988, 12, 395-410.

Abdel-Hamid, T. K. Dynamics of software project staffing: A system dynamics based simulation approach IEEE Transactions on Software Engineering, 1989, 15, 109 – 119.

Abdel-Hamid, T. K. Investigating the cost/schedule trade-off in software development IEEE Software, 1990, 7, 97-105.

Abdel-Hamid, T. A multiproject perspective of single-project dynamics The Journal of Systems and Software, 1993, 22, 151-165.

Abdel-Hamid, T. K. & Madnick, S. E. Impact of schedule estimation on software project behavior. IEEE Software, 1986, 3, 70 – 75.

Abdel-Hamid, Tarek & Madnick, Stuart. Software Project Dynamics: An Integrated Approach. Facsimile Edition, Prentice-Hall, 1991.

Ahmed, R.; Hall, T.; Wernick, P.; Robinson, S. & Shah, M. Software process simulation modelling: A survey of practice. Journal of Simulation, 2008, 2, 91 – 102.

Al-Emran, A. & Pfahl, D. Operational planning, re-planning and risk analysis for software releases. Lecture Notes in Computer Science, 2007, 4589 LNCS, 315 – 329.

Al-Emran, Ahmed; Pfahl, Dietmar; Ruhe, Günther. A Method for Re-planning of Software Releases Using Discrete-event Simulation. Software Process Improve and Practice, v. 13, p 19–33. 2008.

Alexopoulos, C. Statistical analysis of simulation output: State of the art. Proceedings of the 2007 Winter Simulation Conference, 2007.

Alvarez, Guillermo A., C. F. Applying simulation to the design and performance evaluation of fault-tolerant systems Proceedings of the IEEE Symposium on Reliable Distributed Systems, IEEE Comp Soc, Los Alamitos, CA, United States, 1997, 35-42.

Ambrosio, B. G.; Braga, J. L. & Resende-Filho, M. A. Modeling and scenario simulation for decision support in management of requirements activities in software projects Journal of Software Maintenance and Evolution, 2011, 23, 35 – 50.

Araújo, M. A. P.; Travassos, G.H. Towards a Framework for Experimental Studies on Object-Oriented Software Decay. In: ACM/IEEE ISESE’04-International Symposium on Empirical Software Engineering, Redondo Beach, USA, 2004.

Arief, L. B.; Speirs, N. A. A UML Tool for an Automatic Generation of Simulation Programs. In Proceedings of WOSP 2000

Banks, J. Introduction to Simulation. In: WINTER SIMULATION CONFERENCE (WSC’99). Phoenix, AZ, USA, 1999.

Barros, Márcio O.; Werner, Claudia M. L.; Travassos, Guilherme H. Supporting risks in software project management. The Journal of Systems and Software, v. 70, p. 21–35. 2003.


Biolchini, J., Mian, P.G., Natali, A.C., Travassos, G.H. (2005). “Systematic Review in Software Engineering: Relevance and Utility”. Technical Report. PESC - COPPE/UFRJ. Brazil. http://www.cos.ufrj.br/uploadfiles/es67905.pdf .

Birta, L. G. e Arbez, G. Modelling and Simulation: Exploring Dynamic System Behaviour. Springer. 2007.

Chen, Y.-X. & Liu, Q. Hierarchy-based team software process simulation model Wuhan University Journal of Natural Sciences, 2006, 11, 273 – 277.

Choi, K.; Jung, S.; Kim, H.; Bae, D.H. & Lee, D. UML-based modeling and simulation method for mission-critical real-time embedded system development. Proceedings of the IASTED International Conference on Software Engineering, as part of the 24th IASTED International Multi-Conference on APPLIED INFORMATICS, 2006, 160 – 165.

Drappa, A. & Ludewig, J. Quantitative modeling for the interactive simulation of software projects Journal of Systems and Software, 1999, 46, 113 – 122.

Drappa, A. & Ludewig, J. Simulation in software engineering training Proceedings - International Conference on Software Engineering, 2000, 199 – 208.

Dybå, T.; Dingsøyr, T. & Hanssen, G. K. Applying Systematic Reviews to Diverse Study Types: An Experience Report. First International Symposium on Empirical Software Engineering and Measurement, 2007.

Ferreira, S.; Collofello, J.; Shunk, D. & Mackulak, G. Understanding the effects of requirements volatility in software engineering by using analytical modeling and software process simulation Journal of Systems and Software, 2009, 82, 1568 – 1577.

Forrester, Jay W. (1961). Industrial Dynamics. Pegasus Communications. ISBN 1883823366.

Garcia, R. E.; Oliveira, M. C. F.; Maldonado, J. C. . Genetic Algorithms to Support Software Engineering Experimentation. In: IV International Symposium on Empirical Software Engineering (ISESE), 2005, v. 1. p. 488-497.

Grillinger, P.; Brada, P.; Racek, S. Simulation approach to embedded system programming and testing Proceedings - 11th IEEE International Conference and Workshop on the Engineering of Computer-Based Systems, ECBS 2004, 2004, 248-254.

Höst, M., Regnell, B., Tingström, C. A framework for simulation of requirements engineering processes EUROMICRO 2008 - Proceedings of the 34th EUROMICRO Conference on Software Engineering and Advanced Applications, SEAA 2008, 2008, 183-190

Houston, D. X; Ferreira, S; Collofello, J. S.; Montgomery, D. C.; Mackulak, G. T.; Shunk, D. L. Behavioral characterization: Finding and using the influential factors in software process simulation models Journal of Systems and Software, 2001, 59, 259-270

Kang, K.; Lee, K.; Lee, J. & Kim, G. ASADAL/SIM: An incremental multi-level simulation and analysis tool for real-time software specifications SOFTWARE-PRACTICE & EXPERIENCE, JOHN WILEY & SONS LTD, 1998, 28, 445-462.

Law, Averill M. Statistical Analysis of Simulation Output Data: The Practical State Of The Art. Proceedings of the 2007 Winter Simulation Conference.


Lee, B. & Miller, J. Multi-project management in Software Engineering using simulation modeling. SOFTWARE QUALITY JOURNAL, KLUWER ACADEMIC PUBL, 2004, 12, 59-82.

Lehman, M.M., 1980, “Programs, Life Cycle and the Laws of Software Evolution”, Proc. IEEE Special Issue on Software Engineering, vol. 68, no. 9, pp. 1060 -1076.

Lopes, V. P. Repositório de Conhecimento de um Ambiente de Apoio a Experimentação em Engenharia de Software. M.Sc. Dissertation, Programa de Pós-graduação em Engenharia de Sistemas e Computação, COPPE/UFRJ, 2010.

Luckham, D. C.; Kenney, J. J.; Augustin, L. M.; Vera, J.; Bryan, D.; Mann, W. Specification and Analysis of System Architecture Using Rapide. IEEE Transactions on Software Engineering. Volume 21 , Issue 4. Special issue on software architecture. Pages: 336 – 355. 1995.

Madachy, Raymond J. System dynamics modeling of an inspection-based process. In: 18TH INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, ICSE’1996. Berlin, Germany: IEEE Computer Society, 1996. P. 376 - 386.

Madachy, R. Software Process Dynamics. Wiley-IEEE Press. 2008.

Madachy, R. & Khoshnevis, B. Dynamic simulation modeling of an inspection-based software lifecycle processes Simulation, 1997, 69, 35 - 47

Maria, A. Introduction to Modeling and Simulation. Proceedings of the 1997 Winter Simulation Conference.

Martin, Robert H.; Raffo, David. A Model of the Software Development Process Using Both Continuous and Discrete Models. Software Process: Improvement and Practice, v. 5, n. 2-3, p. 147-157, 2000.

Martin, R. & Raffo, D. Application of a hybrid process simulation model to a software development project. Journal of Systems and Software, 2001, 59, 237-246.

Melis, M.; Turnu, I.; Cau, A. & Concas, G. Evaluating the impact of test-first programming and pair programming through software process simulation. Software Process Improvement and Practice, 2006, 11, 345-360.

Müller, M.; Pfahl, D. Simulation Methods. In: Guide to Advanced Empirical Software Engineering. Springer, 2008, Section I, pp. 117-152. DOI: 10.1007/978-1-84800-044-5_5.

Nance, R. E.; Sargent, R. G. Perspectives on the Evolution of Simulation. Operations Research, Vol. 50, No. 1, January-February 2002, pp. 161-172. DOI: 10.1287/opre.50.1.161.17790.

Navarro, E. O. & van der Hoek, A. Design and evaluation of an educational software process simulation environment and associated model. Proceedings - 18th Conference on Software Engineering Education and Training, CSEE&T 2005, 2005, 25-34.

Ören, T. I. Uses of Simulation. In: Sokolowski, J. A.; Banks, C. M. (Eds.), Principles of Modeling and Simulation: A Multidisciplinary Approach. John Wiley & Sons, 2009.

Ormon, S.; Cassady, C. & Greenwood, A. A simulation-based reliability prediction model for conceptual design. Proceedings of the Annual Reliability and Maintainability Symposium, 2001, 433-436.

Padberg, F. A software process scheduling simulator. Proceedings - International Conference on Software Engineering, 2003, 816-817.

Pai, M.; McCulloch, M.; Gorman, J. D. et al. (2004) "Systematic Reviews and meta-analyses: An illustrated, step-by-step guide". The National Medical Journal of India, vol. 17, n. 2.

Pfahl, D.; Lebsanft, K. Using simulation to analyze the impact of software requirement volatility on project performance. Information and Software Technology, 2000, 42, 1001-1008.

Pfahl, D.; Klemm, M. & Ruhe, G. A CBT module with integrated simulation component for software project management education and training. Journal of Systems and Software, 2001, 59, 283-298.

Pfahl, D.; Ruhe, G. IMMoS: a methodology for integrated measurement, modelling and simulation. Software Process: Improvement and Practice, 2002, Volume 7, Issue 3-4, Wiley, pp. 189-210.

Pfahl, D.; Laitenberger, O.; Dorsch, J. & Ruhe, G. An Externally Replicated Experiment for Evaluating the Learning Effectiveness of Using Simulations in Software Project Management Education. Empirical Software Engineering, 2003, 8, 367-395.

Raffo, D.; Kaltio, T.; Partridge, D.; Phalp, K.; Ramil, J. F. (1999). “Empirical Studies Applied to Software Process Models”. In Empirical Software Engineering, volume 4, issue 4, pages 353-369.

Setamanit, S. O.; Wakeland, W. & Raffo, D. Using simulation to evaluate global software development task allocation strategies. Software Process Improvement and Practice, 2007, 12, 491-503.

Setamanit, S. O. & Raffo, D. Identifying key success factors for globally distributed software development project using simulation: A case study. Lecture Notes in Computer Science, 2008, 5007 LNCS, 320-332.

Stopford, B. & Counsell, S. A Framework for the Simulation of Structural Software Evolution. ACM Transactions on Modeling and Computer Simulation, Association for Computing Machinery, 2008, 18.

Thelin, T.; Petersson, H.; Runeson, P. & Wohlin, C. Applying sampling to improve software inspections. Journal of Systems and Software, 2004, 73, 257-269.

Travassos, G. H.; Santos, P. M.; Mian, P. G.; Dias Neto, A. C.; Biolchini, J. (2008) "An Environment to Support Large Scale Experimentation in Software Engineering". In: 13th IEEE International Conference on Engineering of Complex Computer Systems (ICECCS 2008), pp. 193-202.

Travassos, G. H.; Barros, M. O. "Contributions of In Virtuo and In Silico Experiments for the Future of Empirical Studies in Software Engineering". Proc. 2nd Workshop in Workshop Series on Empirical Software Engineering, The Future of Empirical Studies in Software Engineering, Rome, WSESE03, Fraunhofer IRB Verlag, 2003.

Turnu, I.; Melis, M.; Cau, A.; Setzu, A.; Concas, G. & Mannaro, K. Modeling and simulation of open source development using an agile practice. Journal of Systems Architecture, 2006, 52, 610-618.

Wakeland, W. W.; Martin, R. H. & Raffo, D. Using Design of Experiments, sensitivity analysis, and hybrid simulation to evaluate changes to a software development process: A case study. Software Process Improvement and Practice, 2004, 9, 107-119.

Wohlin, C.; Runeson, P.; Höst, M.; Ohlsson, M. C.; Regnell, B.; Wesslén, A. Experimentation in Software Engineering: An Introduction. Kluwer Academic Publishers, ISBN 0-7923-8682-5, 2000.

Wu, M. & Yan, H. Simulation in software engineering with system dynamics: A case study. Journal of Software, 2009, 4, 1127-1135.

Zelkowitz, M. V. Techniques for Empirical Validation. In: Basili, V. et al. (Eds.), Empirical Software Engineering Issues, LNCS 4336, Springer-Verlag Berlin Heidelberg, pp. 4-9, 2007.

Zhang, H.; Kitchenham, B.; Pfahl, D. Reflections on 10 years of software process simulation modeling: A systematic review. Lecture Notes in Computer Science, 2008, 5007 LNCS, 345-356.