Experience with Software Process Simulation and Modeling

Walt Scacchi, ATRIUM Laboratory, University of Southern California, Los Angeles, CA 90089-1421 Wscacchi@rcf.usc.edu
(Presented at ProSim'98, Silver Falls, OR, 22-24 June 1998)
ABSTRACT

In this paper, I describe the approach taken and experiences encountered in developing and applying simulation and modeling technologies to software processes. Processes for both software development and use have been investigated. As such, this paper addresses four aspects of software process simulation and modeling. First, I describe an approach to, and examples of, software process simulation and modeling as investigated with the Articulator environment developed at USC. Second, I describe how, by focusing on process modeling, analysis, and simulation, we are led to expand the scope of work with software processes toward a more comprehensive software process life cycle engineering. Third, I describe some of the lessons learned from applying modeling and simulation concepts, techniques, and tools to software processes in a variety of organizational settings. Fourth, I provide a brief glimpse of how software process modeling, simulation, and process life cycle engineering technologies can be applied to three relatively new areas: software process reengineering, virtual software acquisition, and software development over a wide-area information infrastructure.

 Keywords: software process, process modeling, process simulation, knowledge-based simulation, process life cycle

Introduction

Over the past 10 years, research efforts in the USC System Factory project and the USC ATRIUM Laboratory have investigated various kinds of modeling and simulation issues. These efforts focus on understanding the organizational processes involved in software system development and use. Many of these efforts have addressed how to use simulation and modeling tools, techniques, and concepts to understand and forecast software process trajectories prior to their actual performance. This is done when analyzing both "as-is" processes (i.e., simulated models of the software processes observed in some organizational setting) and "to-be" processes (simulated models of new or redesigned software processes intended to operate in the same organizational setting). However, we have also used these same technologies to understand, validate, and refine models of software processes in light of empirically grounded observational data from actual process performances that we have collected.

This report highlights the approach, experiences, and lessons learned through these efforts. These highlights are grouped under the section headings that follow.

Simulation and modeling of software development processes

Models of software processes are particularly interesting when formalized as computational descriptions [Curtis, Kellner and Over 1992]. Accordingly, we can model, simulate, symbolically execute, navigationally traverse, or interactively browse them. In simple terms, simulation entails the symbolic performance of process tasks by their assigned agents, using the tools, systems, and resources at hand to produce the designated products. For example, in simulating an instance of a software project management process, the manager's agent would "execute" her management tasks. Tasks are modeled according to the precedence structure of sub-tasks or steps specified in the modeled process instance. Agents and tasks can then use or consume simulated time, budgeted funds, and other resources along the way. Since tasks and other resources can be modeled at arbitrary levels of precision and detail, the simulation makes progress as long as task pre-conditions or post-conditions are satisfied at each step. For example, for a manager to be able to assign staff to a report production task, such staff must be available at that moment; otherwise the simulated process stops, reports the problem, and then waits for new input or a command from the simulation user.
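To make this notion of symbolic task performance more concrete, here is a minimal sketch in Python (not the Articulator's actual rule-based Common Lisp implementation); the class names, the staff-availability pre-condition, and the cost accounting are illustrative assumptions only:

    # Illustrative sketch only -- not the Articulator's representation or rule engine.
    class Agent:
        def __init__(self, name, available=True):
            self.name, self.available = name, available

    class Task:
        def __init__(self, name, duration, preconditions):
            self.name = name
            self.duration = duration            # simulated time units consumed
            self.preconditions = preconditions  # callables tested against the process state

        def ready(self, state):
            return all(p(state) for p in self.preconditions)

    def simulate(tasks, state):
        """Symbolically perform tasks in order, halting and reporting when no
        task's pre-conditions are satisfied (awaiting user input or new resources)."""
        clock, pending = 0, list(tasks)
        while pending:
            runnable = [t for t in pending if t.ready(state)]
            if not runnable:
                print(f"t={clock}: blocked -- simulation waits for the user")
                break
            task = runnable[0]
            clock += task.duration
            state["budget"] -= task.duration * state["cost_rate"]   # consume resources
            pending.remove(task)
            print(f"t={clock}: {task.name} completed; budget now {state['budget']}")

    # Example: report production stops because no staff agent is available.
    state = {"staff": [Agent("analyst", available=False)], "budget": 1000, "cost_rate": 10}
    staff_free = lambda s: any(a.available for a in s["staff"])
    simulate([Task("produce status report", 5, [staff_free])], state)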

In our work at USC, we have employed two kinds of simulation technologies in studies of software development processes: knowledge-based simulation (KBS) and discrete-event simulation (DES) [Mi and Scacchi 1990, Scacchi 1997, Scacchi and Mi 1997]. We have built our own KBS tools, while employing a commercially available DES package. Each is described in turn. We have also developed and used tools that allow users to browse, stepwise simulate (traverse), and execute software process models across distributed network environments, as described elsewhere [Noll and Scacchi 1997, 1998, Scacchi and Mi 1997, Scacchi and Noll 1997].

Knowledge-based simulation of software processes

KBS is one kind of technology used to simulate software processes. Most approaches to modeling and simulating complex processes center on the execution of a state machine that traverses explicitly or implicitly modeled states and/or events in the simulated process instance. For example, the Entity-Process approach developed by Kellner [Humphrey and Kellner 1989] uses the Statecharts tool [Harel et al. 1990] to explicitly model and traverse the states declared in a software process.

Simulations also typically apply to instances of a modeled process: the model is developed and then run through the simulation after its parameter values are instantiated. In a KBS, software process states can be either explicitly declared or implicitly represented. Implicit representation means that process states correspond to the values that populate a snapshot of the underlying knowledge base of inter-linked objects, attributes, and relations used to record and manage the simulation. As such, the set of attainable next states is non-deterministic and not enumerable in advance, but it is bounded by the context of a simulated process execution space [Mi and Scacchi 1990]. A KBS can also symbolically evaluate a class of process models, or a process model without instance values. These features begin to differentiate KBS approaches to software process simulation.
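A rough way to picture implicit state representation is given below, assuming (for illustration only) that the knowledge base can be reduced to a dictionary of objects, attributes, and relations; the Articulator's actual representation is considerably richer.

    import copy

    # Hypothetical miniature "knowledge base": objects with attributes, plus relations.
    kb = {
        "objects": {"task-review": {"status": "pending"},
                    "agent-ann":  {"workload": 2}},
        "relations": [("agent-ann", "assigned-to", "task-review")],
    }

    def snapshot(kb):
        """An implicit process state is simply a copy of the knowledge base
        at some point in the simulated execution."""
        return copy.deepcopy(kb)

    def apply_rule(kb):
        """One illustrative rule: an assigned, pending task may start."""
        for (agent, rel, task) in kb["relations"]:
            if rel == "assigned-to" and kb["objects"][task]["status"] == "pending":
                kb["objects"][task]["status"] = "in-progress"

    history = [snapshot(kb)]   # next states are never enumerated in advance;
    apply_rule(kb)             # they emerge as rules update the knowledge base
    history.append(snapshot(kb))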

KBS is useful for addressing different types of simulated process execution behavior. We have found four types of process behavior of interest. First, understanding software processes requiring fine granularity. Second, analyzing processes with multiple or mixed levels of process granularity. Third, analyzing patterns of interaction and workflow among software developers/agents [E.g., Bendifallah and Scacchi 1989]. Fourth, analyzing processes whose structure and control flow are dynamically changing [Bendifallah and Scacchi 1987, 1989, Mi and Scacchi 1991, 1993]. Granularity here refers to the level of detail that we seek to simulate in order to understand gross or fine-level process dynamics or details. The following two figures may help to convey this. Alternatively, we use DES to simulate processes at a coarser level, when our interest lies in understanding the overall performance envelope that results from the simulated execution of a process that is routinely instantiated 5-100+ times. This is described later.

 

Figure 1. Screen display of a KBS session running in the Articulator environment

Figure 1 shows a workstation display running a KBS session in the Articulator environment developed at USC [Mi and Scacchi 1990]. The Articulator was implemented as a rule-based multi-agent problem-solving system that models, analyzes, and simulates complex organizational processes. The top left frame lists the KBS production rules that have fired during the current process simulation step. Many of these rules pertain to the operation of the KBS engine, rather than to the simulated software process. The details of these rules are not the focus here, but examples can be found elsewhere [Mi and Scacchi 1990]. The top right frame provides an annotated transcription of the software process events that have occurred as a result of the rule firings. The bottom right frame then describes rules that conflict when concurrently fired, and the resulting conflict resolutions.

Figure 2. Paraphrased transcription resulting from a KBS run on a fine-grain software process

Figure 2 provides another screen display, this time focusing on the KBS annotation transcript that is partially occluded in Figure 1. Furthermore, additional post-processing has been performed by the Articulator to paraphrase the transcript into a more readable, though somewhat stilted, form. Classic techniques for paraphrasing the internal semantic form into a sentential form are employed [Simmons and Slocum 1972]. As before, details on the software process being simulated and automatically documented as shown in Figure 2 appear elsewhere [Mi and Scacchi 1990].

Our use of KBS mechanisms also supports features that are uncommon in popular commercial simulation packages. The paraphrasing capability shown above is one example. Other examples include using symbolic execution functions to determine the path and flow of intermediate process state transitions in ways that can be made persistent, queried, dynamically analyzed, and reconfigured into multiple alternative scenarios. Similarly, the use of a multi-agent problem-solving representation allows use of a distributed environment to model and simulate the interactions and behavior of different agents, skill sets, and task assignments. These capabilities for knowledge-based software process simulation are described next.

Persistent storage of simulated events makes it possible to run a simulation forward and backward to any specific event or update to the knowledge base. Persistence in the Articulator environment was implemented using object-oriented data management facilities provided in the underlying knowledge engineering environment [cf. Balzer and Narayanaswamy 1993, Mi and Scacchi 1990]. Queries to this knowledge base provide one mechanism to retrieve or deduce where an event or update of interest occurs. Dynamic analysis can monitor the usage or consumption of resources to help identify possible bottlenecks in simulated process instances. Using these functions, it is possible to run a simulation to some point, back up to some previous context, and then create new instance values for the simulated process model. These changes then spawn a new simulated process instance thread. For example, this could allow adding more time to a schedule, adding more staff to an overloaded workflow, or removing unspent money from a budget. Such a change would then branch off a new simulation trajectory that can subsequently be made persistent, and so forth. Furthermore, we can also employ the paraphrasing and report generation functions to produce narrative-like descriptions or summaries of what transpired during a given simulation run. Thus, knowledge-based simulation enables the creation and incremental evolution of a network of event trajectories. These in turn may be useful in evaluating or systematically forecasting the yield attributable to new, interactively designed process alternatives.
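The following sketch suggests, under simplifying assumptions, how such persistent trajectories and branching might be organized. The Articulator itself relied on the object-oriented data management facilities noted above, so the Trajectory class and its methods here are purely hypothetical.

    import copy

    class Trajectory:
        """Persistent record of one simulated process run: a list of
        (event, state-snapshot) pairs that can be queried or rewound."""
        def __init__(self, initial_state):
            self.log = [("start", copy.deepcopy(initial_state))]

        def record(self, event, state):
            self.log.append((event, copy.deepcopy(state)))

        def query(self, predicate):
            """Retrieve the positions of events/states satisfying a predicate."""
            return [(i, e) for i, (e, s) in enumerate(self.log) if predicate(e, s)]

        def branch(self, index, **new_instance_values):
            """Back up to a prior event, revise instance values (e.g., add staff
            or schedule time), and spawn a new trajectory from that point."""
            _, state = self.log[index]
            new_state = {**copy.deepcopy(state), **new_instance_values}
            return Trajectory(new_state)

    run = Trajectory({"staff": 3, "weeks_remaining": 10})
    run.record("design review done", {"staff": 3, "weeks_remaining": 7})
    # Rewind to the review and explore a "what if we had 5 staff?" alternative thread.
    alternative = run.branch(1, staff=5)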

The Articulator environment was among the earliest to address issues of modeling and simulating organizational processes enacted by autonomous or managed problem-solving agents [Mi and Scacchi 1990]. Though agent technology was still at a conceptual stage when the Articulator was initially developed (1988-1990), we thought it would be interesting to develop a process simulation capability that could eventually be distributed and run across a local-area or wide-area network. Individually modeled agents acted as place holders for people working in real-world processes. Agents were capable of individual or collective problem-solving behavior, depending on what problem-solving skills (i.e., processes they "know" how to perform) they were given, and how they were configured to interact (typically to send and receive messages from one another). We found that developing and specifying software processes from an agent-centered perspective helped us to more thoroughly understand the processes under examination.
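A much-reduced picture of this agent-centered view is sketched below; the agent names, skills, and message format are illustrative, and the Articulator's agents were rule-based problem solvers rather than simple objects like these.

    class ProcessAgent:
        """A placeholder for a person in the modeled process: it knows some
        skills (processes it can perform) and interacts by passing messages."""
        def __init__(self, name, skills):
            self.name, self.skills, self.inbox = name, set(skills), []

        def send(self, other, message):
            other.inbox.append((self.name, message))

        def can_perform(self, task):
            return task in self.skills   # can this agent carry out the task?

    manager = ProcessAgent("manager", {"assign-staff", "review-schedule"})
    developer = ProcessAgent("developer", {"code-module", "fix-defect"})

    # Collective behavior: the manager delegates a task the developer knows how to do.
    if developer.can_perform("code-module"):
        manager.send(developer, "please code the parser module this week")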

Soon after, we developed an approach to import and export agent-centered software process model and instance specifications in order to integrate and control new and legacy software engineering environments with open system interfaces [Mi and Scacchi 1992]. Thus, we could now prototype and later generate process-driven software development or use environments that could be tailored for different users or user roles [Mi and Scacchi 1992, Garg et al. 1994, Scacchi and Mi 1997]. Similarly, we could monitor, capture, replay, and simulate the history of a process instance that was enacted in a process-driven work environment [Scacchi and Mi 1997, Scacchi 1997]. Unfortunately, interest in these matters quickly began to overshadow our interest in simply simulating software processes using rule-based problem-solving computational agents. Subsequently, our interest in developing an agent-based simulation environment waned as others, interested in and exclusively focused on agent technology rather than software processes, moved into the foreground.

Discrete-event simulation of software processes

DES allows us to dynamically analyze different samples of parameter values in software process instances. This enables simulated processes to function like transportation networks whose volumetric flow, traffic density, and congestion bottlenecks can be assessed according to alternative (heuristic or statistical) arrival rates and service intervals. Using DES, process experts often find it easy to observe or discover process bottlenecks and optimization alternatives. Similarly, when repetitive, high frequency processes are being studied, then DES provides a basis for assessing and validating the replicability of the simulated process to actual experience. Likewise, DES can be used when data on events and process step completion times/costs can be empirically measured or captured, then entered as process instance values for simulation. Thus, DES seems well-suited for use when studying the behavioral dynamics of processes with a large number of recurring process instances, and those with relatively short process completion (or cycle) times.
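The flavor of such an analysis can be conveyed with a small queueing sketch. We used a commercially available DES package rather than hand-written code, so the Python below is only an illustrative stand-in, with assumed exponential arrival and service rates:

    import random

    def simulate_step(n_items=1000, arrival_rate=1.0, service_rate=1.2, seed=1):
        """Single-server model of one repetitive process step (e.g., clearing an
        invoice): exponential inter-arrival and service times; reports mean cycle
        time and mean waiting time, so bottlenecks show up as growing waits."""
        random.seed(seed)
        t = free_at = 0.0
        waits, cycles = [], []
        for _ in range(n_items):
            t += random.expovariate(arrival_rate)     # next work item arrives
            start = max(t, free_at)                   # queue if the step is busy
            service = random.expovariate(service_rate)
            free_at = start + service
            waits.append(start - t)
            cycles.append(free_at - t)
        return sum(waits) / n_items, sum(cycles) / n_items

    mean_wait, mean_cycle = simulate_step()
    print(f"mean wait {mean_wait:.2f}, mean cycle time {mean_cycle:.2f}")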

Many commercially available DES packages now support animated visual displays of simulated process executions or results. We use the animated displays so that process experts can further validate as-is and to-be process simulations under different scenarios. These can be viewed as software development or business process "movies". In addition, the animated process instance simulations can be modified, re-executed, and viewed like other simulations. Although we cannot conveniently show such animations in printed form, snapshots captured from such a simulation may suggest what can be observed. In Figure 3, we have modeled an eight-person (or agent) activity for performing an instance of a high frequency end-user process supported by a software system (in this case, for an Accounts Payable process). The figure depicts the structure of the workflow, which agents currently perform what tasks, individual workload pie charts, and work units-in-progress quantities (e.g., the backlog of invoices cleared, problematic invoices, and checks released).

We have employed both KBS and DES to generate and analyze simulated process performance results. With KBS our attention focused on generating persistent records of the process execution context as it evolved through the course of a simulation. By capturing simulation results in this manner, they could be stored in a repository, queried, and even run backwards. Such capability enabled us to observe or measure the dynamics of interactions between the multiple agents (E.g., software development people working in different roles in a process) over some simulation duration or event stream. In contrast, with DES, our interest was often focused on developing answers to questions pertaining to descriptive statistical characterization of simulated process execution events. In this regard, we found that DES could be used to analyze and measure the distribution of time, effort, and cost (as in "activity-based costing" and "process-based costing").

 
Figure 3: Visual display from an animated multi-agent simulation

Figure 4 displays a snapshot of an accompanying pie chart depicting current workload, division of labor, and activity-based cost figures (lower right) for the simulated workflow volume shown in Figure 3.

 
Figure 4: Visual display of dynamic pie chart depicting current workload, division of labor, and aggregate costs

Thus, both KBS and DES play vital roles in helping to understand the behavioral dynamics of software processes that we find in real-world settings.

Simulation and modeling in process life cycle engineering

From the discussion in the preceding section, it becomes increasingly clear that modeling and simulating software processes is not an end unto itself. To the contrary, experience in software process modeling and simulation has led to a better understanding of the range of activities that can be instigated and supported under the heading of process life cycle engineering [Heineman et al. 1994, Madhavji 1991, Scacchi and Mi 1993, 1997]. To help make this clear, consider the set of activities that emerged as a result of our experience with the Articulator environment, together with the additional system capabilities that have been integrated with it [Scacchi 1997]. While many of these process engineering activities have been addressed in this paper, the others are described in more detail in the cited works. Clearly, many other scholars in the software process research community have made important contributions in these areas. But one goal that we sought was to identify and address process life cycle engineering activities as we encountered them [cf. Scacchi and Mi 1993, 1997]. Similarly, our approach and environment for studying software processes was designed and implemented to support a tight integration of modeling, analysis, and simulation [Garg et al. 1994, Mi and Scacchi 1990].

Central to our approach to achieving this integration was the use of a knowledge-based process meta-model [Mi and Scacchi 1990, Mi and Scacchi 1996, Scacchi 1997]. Our research team could easily tailor this meta-model to various processes or process application domains. We believe this was possible because the foundations of the meta-model were grounded in empirical studies of software development and use processes [E.g., Bendifallah and Scacchi 1987,1989, Kling and Scacchi 1982, Mi and Scacchi 1990]. Likewise, the meta-model was based in theoretical conceptualizations of the resource-based ontology that we developed and refined [Garg and Scacchi 1989, Grant 1991, Mi and Scacchi 1990,1996].

Use of process meta-modeling techniques enabled our team to develop a number of associated tools that could be rapidly integrated with our established modeling, analysis, and simulation capabilities. Similarly, it enabled us to rapidly integrate externally developed or commercially developed tools, whose selected inputs and outputs could be interfaced with the modeling and simulation facilities available to us [Garg et al. 1994, Scacchi 1997, Scacchi and Mi 1997, Scacchi and Noll 1997]. Accordingly, this capability enabled us, among other things, to interoperate and automatically transform models of complex organizational processes that we had captured and formally modeled into input formats required by the KBS or DES tools we employed [Mi and Scacchi 1996, Scacchi and Mi 1997]. Thus, development and use of a process meta-model was a key component contributing to the success of our approach and effort.
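A highly reduced illustration of this idea follows, assuming the meta-model can be summarized by tasks with associated agents, resources, and products, and that the target simulator accepts a flat tabular input; neither assumption matches the actual Articulator meta-model or the commercial DES package's input format.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Task:
        name: str
        agents: List[str]          # who performs the task
        resources: List[str]       # tools/systems the task requires
        products: List[str]        # what the task produces
        subtasks: List["Task"] = field(default_factory=list)

    def to_des_rows(task, level=0):
        """Flatten a meta-model process description into rows that a
        discrete-event simulator could ingest as its process-instance input."""
        rows = [(level, task.name, ";".join(task.agents), ";".join(task.products))]
        for sub in task.subtasks:
            rows.extend(to_des_rows(sub, level + 1))
        return rows

    review = Task("design review", ["author", "reviewer"], ["doc tool"], ["review report"])
    build = Task("build release", ["integrator"], ["make"], ["release"], subtasks=[review])
    for row in to_des_rows(build):
        print(row)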
 

Experience with software process simulation and modeling in industrial settings

We used our capability and facilities for software process simulation in a variety of industrial settings, described elsewhere [Scacchi 1997, Scacchi and Mi 1993,1997, Scacchi and Noll 1997]. Our experiences were shaped in a substantial way by the interactions and feedback we encountered from our organizational sponsors and their staff (typically people at work in the settings being studied).

We used the Articulator environment to model, analyze, or simulate one hundred or more organizational processes. In this regard, we have constructed software process models and instances for organizations within different industries and government agencies. We have focused, for example, on activities involving team-based software product design and review processes, as well as department and division-wide information system operations and user support processes that include tens to hundreds of participants. These models typically include dozens of classes of agents, tasks, resources, and products, but a small number of software support tools and systems, while the process instantiation may include 1-10+ instances of each class.

In one extreme situation, we were asked to model, analyze, and simulate a large-scale software development life cycle process that was to be used in building a next-generation telecommunications services environment. As a result of modeling and analysis, we discovered the process had 19 levels of (process) decomposition and management delegation. Our advice to the company was that they needed to seriously rethink what they hoped to accomplish with such an onerous and cumbersome life cycle process, rather than have us spend time and resources trying to simulate it. The company elected not to do this, and our effort ended. However, we were not surprised to read in a trade newspaper some time later that the company had killed the software development effort guided by this process after spending more than $200M while failing to develop the target software system. Our lesson from this is that modeling, analysis, and simulation can help improve software processes, but they cannot overcome management decision-making authority.

Our experience to date suggests the following: Modeling existing processes can take from 1-3 person-days to 2-3 person-months of effort. Process analysis routines can run in real time or acceptable near-real time. Software development process simulations can take seconds to hours (even days!) depending on the complexity of the modeled process, its instance space, and the amount of non-deterministic process activity being modeled. Note, however, that simulation performance is limited by available processing power and processor memory. This suggests that better performance can be achieved with (clustered) high-performance computing platforms.

Overall, we found that our use of KBS could be overwhelming to people unfamiliar with such advanced technology. Our ability to model and simulate individual or small-group interactions, and then summarize the results of these interactions in a (stilted) narrative format, was viewed with some degree of skepticism [Votta 1993]. This may have been due to our inability to sufficiently communicate what was being modeled and simulated (E.g., person-role-tool-task interactions with other person-role-etc.) versus what was not being modeled or simulated (how a person thinks or acts in response to interactions with another person).

The Articulator environment as a platform for software process modeling, analysis, and simulation was designed to address the research interests of a small group of enthusiastic software process researchers. It was not designed to be a tool intended to be provided to end-users in other organizational settings. The complexity of the user interface displayed in Figure 1 should underscore this. Furthermore, its implementation as a rule-based multi-agent problem-solving system with more than 1,000 production rules, most of which are actually small programs implemented in Common Lisp, should make clear that evolving it into a robust, user-friendly system would be impractical. Perhaps with recent advances in agent technology designed to operate over the Internet, some of the KBS concepts pioneered in the Articulator environment can be reinvented and re-implemented in a more robust and intuitive manner.

Alternatively, we found the use and output of DES results to be much more readily accepted by a wider range of people. The functional capabilities and technical features of the DES package we used (WITNESS) were significantly more limited than those of our KBS. However, the DES package did provide facilities for generating interactive graphic animations of simulated processes. These "movies" or "animated cartoons" of simulated processes were informative and sometimes entertaining. More importantly, we repeatedly found that people could observe and understand the dynamics of an animated process quite readily and intuitively. This aspect appears particularly relevant: when people could quickly grasp what was being portrayed in a simulation movie they were watching, they could often recognize process pathologies (E.g., workflow bottlenecks) as well as suggest alternative process redesigns or improvement opportunities [Scacchi and Mi 1997, Scacchi and Noll 1997]. In turn, this feedback could then be used to further engage and "empower" these people to take control over the redesign of their as-is process, or the design of their to-be process. We took this to be a better indicator of "customer satisfaction" with our research approach and results in software process engineering. The apparent value to end-users of intuitive interfaces to software process modeling and simulation environments is therefore a valuable lesson learned.
 

Other emerging topics in software process simulation and modeling

There are three areas of inquiry that we have begun to investigate in the context of software process simulation and modeling. The first seeks to raise awareness of the value of applying modeling, simulation, and other process life cycle engineering activities to the problem of systematically optimizing the practice of software process reengineering. The second pertains to the research and development of applications of simulation and modeling technology to redesign the processes associated with the acquisition of large software systems. The third addresses how software process life cycle engineering can be supported over wide-area networks.

Supporting software process reengineering

Probably all readers of this paper have some awareness of or familiarity with the Capability Maturity Model (CMM) for assessing and improving software development processes. It seems fair to say that the CMM seeks to draw attention to improving the quality of existing software processes, albeit in an incremental manner that neither requires nor assumes a capability for software process modeling and simulation. However, for those organizations that elect to model, simulate, and otherwise engineer their software processes across their life cycle, it may be possible to consider and evaluate the potential for radical process transformation or highly optimized software processes.

Initial experience with reengineering business processes that make extensive use of network-based end-user software has been promising [Scacchi and Noll 1997]. Similarly, techniques have been developed which support the use of automated measures of process flow structure and execution performance as a basis for process redesign and optimization [Nissen 1994, Scacchi 1997]. The opportunity remains to bring these capabilities together, along with a library of process optimization rules/heuristics, an integrated process simulation capability, and a mechanism for automating the optimization of to-be software process models. Automated software process reengineering has yet to be demonstrated, but in any event, such an innovative capability seems likely to depend on software process modeling and simulation technologies.
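One way to picture such a library of optimization heuristics is as a set of model-to-model rewrites. The sketch below is purely speculative, with a single hypothetical rule that drops a redundant review step from a to-be process model:

    # Hypothetical heuristic applied to a simple process model: a list of steps,
    # each a dict with a name, a kind, its inputs, and a duration.
    def remove_redundant_reviews(steps):
        """Drop a review step whose inputs were already reviewed upstream."""
        reviewed, kept = set(), []
        for s in steps:
            if s["kind"] == "review" and set(s["inputs"]) <= reviewed:
                continue                      # heuristic: redundant review
            if s["kind"] == "review":
                reviewed.update(s["inputs"])
            kept.append(s)
        return kept

    def total_duration(steps):
        return sum(s["duration"] for s in steps)

    as_is = [
        {"name": "code",     "kind": "work",   "inputs": [],       "duration": 10},
        {"name": "review-1", "kind": "review", "inputs": ["code"], "duration": 2},
        {"name": "review-2", "kind": "review", "inputs": ["code"], "duration": 2},
    ]
    to_be = remove_redundant_reviews(as_is)
    print(total_duration(as_is), "->", total_duration(to_be))   # 14 -> 12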

A new vision for software system acquisition

Software system acquisition is a particularly vexing problem for government agencies. Billions of dollars are being invested in or targeted to the acquisition of multi-million SLOC software systems by various military and civilian (FAA, NASA) government agencies. For example, the U.S. Navy has begun a new long-term program to acquire and deploy a fleet of next-generation battleships collectively referred to as the SC-21 program [SC21]. The acquisition, development, and deployment of these multi-architecture ships is being supported through the use of modeling, simulation, and other technologies for "virtual prototyping" of the ships [Scacchi and Boehm 1998]. However, the technology for modeling and simulating the architecture(s) of the software systems for these ships is scarcely available, and could be said to exist only in pieces and promises. Nonetheless, R&D in the area of how to apply modeling and simulation of software systems and associated development processes to support software system acquisition appears to be a promising, high-consequence area of study [cf. Boehm and Scacchi 1996, Scacchi and Boehm 1998, SPMN 1997].

Towards a wide-area information infrastructure for supporting software development and use processes

Last, a related area for further attention is the development of a shared or sharable national/global information infrastructure to support collaborative software process life cycle engineering. The WWW and Internet can serve as the foundation for such an infrastructure, but more is needed. For example, our USC research team has investigated how to model software processes so that they can be prototyped, interactively simulated or executed, validated, and redesigned by a group of people who are distributed across multiple dispersed sites connected by the Internet [Noll and Scacchi 1997, 1998, Scacchi 1997]. While more can be said about this, the cited articles provide more detail than is appropriate here.
 

Conclusions

Overall, our experience with these simulation capabilities can be summarized according to the kind of mechanisms employed. We found that knowledge-based simulation was of greatest value in supporting exploratory analysis of software process models. Knowledge-based simulation technology may be "rocket science" to most people, but it is useful when focused on understanding fine-grained or deep causal relationships in low-frequency or loosely structured software processes. This can help facilitate qualitative insights into the operation of software processes. In contrast, discrete-event simulation was of greatest value in validating and assessing "shallow" process model instances of high-frequency, short cycle-time processes using coarse-grain and statistically representative data samples. This helps facilitate quantitative insights into the operation of alternative software process instances. Thus, we find that knowledge-based and conventional discrete-event simulation capabilities are most helpful when used to complement each other's strengths.

The starting thesis of this paper was framed in terms of simulating and modeling software processes. Experience with these technologies gave rise to enhancements, extensions, and the integration of many additional systems. As a result, we found that software process modeling, analysis, and simulation are best understood and practiced as part of an overall engineering of the software process life cycle. Furthermore, the use of a process meta-model demonstrated a strategy for successfully integrating new tools and techniques for software process life cycle engineering.

We have had a variety of experiences and learned much from the use of simulation and modeling tools, techniques, and concepts in understanding software development processes. Knowledge-based simulation capabilities now seem to have a limited future unless they can be reinvented using agent technologies. Discrete-event simulation packages, especially those that can produce animated visualizations of simulated process execution, seem attractive. Either way, the provision of robust and intuitive simulation capabilities seems essential if the goal is to get more people involved in software process modeling and simulation, or to explore some of the emerging topics identified at the end of this paper.

Finally, we found that all of what we have learned, and much of the technical capability we initially developed for studying complex software development processes, can be readily applied to other organizational processes where software systems are used [Scacchi 1997, Scacchi and Mi 1997, Scacchi and Noll 1997]. This is also an important result that researchers investigating how simulation and modeling technology can be used to study software processes may want to further consider and exploit.
 

Acknowledgments

The research results described in this paper benefited from the collaborative contributions of many people. In particular, Peiwei Mi and John Noll played major roles in developing and evolving the software technologies described here. Pankaj K. Garg and Mark Nissen also contributed to the research efforts described in the paper. Their contributions are greatly appreciated. In addition, many corporate and government agencies provided the contracts and grants that support the research described here. Their support is also appreciated.

References

B. Balzer and K. Narayanaswamy. Mechanisms for Generic Process Support. Proc. 1st. ACM SIGSOFT Symp. Foundations for Software Engineering. ACM Press, New York, NY, 7-10, 1993. Reprinted in P.K. Garg and M. Jazayeri (eds.), Process-Centered Software Engineering Environments, IEEE Computer Society, 349-360, 1996.

S. Bendifallah and W. Scacchi. Work Shifts and Structures: An Empirical Study of Software Specification Work, Proc. 11th. Intern. Conf. Software Engineering, IEEE Computer Society, Pittsburgh, PA, 260-270, May 1989.

S. Bendifallah and W. Scacchi. Understanding Software Maintenance Work. IEEE Trans. Software Engineering, 13(3):311-323, 1987. Reprinted in D. Longstreet (ed.), Tutorial on Software Maintenance and Computers, IEEE Computer Society, 1990.

B. Boehm and W. Scacchi. Simulation and Modeling for Software Acquisition (SAMSA), Final Report, Center for Software Engineering, University of Southern California, Los Angeles, CA, http://sunset.usc.edu/SAMSA/samcover.html, March 1996.

B. Curtis, M.I. Kellner, and J.W. Over. Process Modeling. Communications ACM, 35(9):75-90, 1992

P.K. Garg, P. Mi, T. Pham, W. Scacchi, and G. Thunquest. The SMART Approach to Software Process Engineering. Proc. 16th. Intern. Conf. Software Engineering. Sorrento, Italy, 341-350, 1994. Reprinted in P.K. Garg and M. Jazayeri (eds.), Process-Centered Software Engineering Environments,  IEEE Computer Society, 141-164, 1996.

P.K. Garg and W. Scacchi. ISHYS: Designing Intelligent Software Systems. IEEE Expert, 4(3):52-63, 1989.

R.M. Grant. The Resource-Based Theory of Competitive Advantage: Implications for Strategy Formulation. California Management Review, 33(3):114-135, 1991.

D. Harel et al. Statemate: A Working Environment for the Development of Complex Reactive Systems. IEEE Trans. Software Engineering. 16(4): 403-414, 1990.

G. Heineman, J.E. Botsford, G. Caldiera, G. Kaiser, M.I. Kellner, and N.H. Madhavji. Emerging Technologies that support a Software Process Life Cycle. IBM Systems J. 32(3):501-529, 1994.

W.S. Humphrey and M.I. Kellner. Software Process Modeling: Principles of Entity Process Models. Proc. 11th. Intern. Conf. Software Engineering, IEEE Computer Society, Pittsburgh, PA, 331-342, 1989.

R. Kling and W. Scacchi. The Web of Computing: Computer Technology as Social Organization. in M. Yovits (ed.), Advances in Computers, 21:3-90, 1982

J. Laffey. Dynamism in Electronic Performance Support Systems. Performance Improvement Quarterly, 8(1):31-46, 1995

F. Leymann and W. Altenhuber. Managing Business Process as an Information Resource. IBM Systems J., 326-348, 1994.

N.H. Madhavji. The Process Cycle. Software Engineering J. 6(5):234-242, 1991. Reprinted in Process-Centered Software Engineering Environments, P.K. Garg and M. Jazayeri (eds.), IEEE Computer Society, 50-58, 1996.

P. Mi, M-J. Lee and W. Scacchi. A Knowledge-based Software Process Library for Process-Driven Software Development. Proc. 7th. Annual Knowledge-Based Software Engineering Conference, IEEE Computer Society, Washington, DC, 122-131, 1992. http://www.usc.edu/dept/ATRIUM/Papers/Process_Asset_Library.ps

P. Mi and W. Scacchi. A Knowledge-Based Environment for Modeling and Simulating Software Engineering Processes. IEEE Trans. Knowledge and Data Engineering, 2(3):283-294, 1990. Reprinted in Nikkei Artificial Intelligence, 20(1):176-191, 1991 (in Japanese); P.K. Garg and M. Jazayeri (eds.), Process-Centered Software Engineering Environments,  IEEE Computer Society, 119-130, 1996. http://www.usc.edu/dept/ATRIUM/Papers/Articulator.ps

P. Mi and W. Scacchi. Modeling Articulation Work in Software Engineering Processes, Proc. 1st. Intern. Conf. Soft. Processes, IEEE Computer Society, Redondo Beach, CA, 188-201, October 1991.

P. Mi and W. Scacchi. Process Integration in CASE Environments. IEEE Software, 9(2):45-53, March 1992. Reprinted in Computer-Aided Software Engineering, (2nd Edition), E. Chikofski (ed.), IEEE Computer Society, 1993. http://www.usc.edu/dept/ATRIUM/Papers/CASE_Process_Integration.ps

P. Mi and W. Scacchi. Articulation: An Integrated Approach to the Diagnosis, Replanning, and Rescheduling of Software Process Failures. Proc. 8th. Knowledge-Based Software Engineering Conference, Chicago, IL, IEEE Computer Society, 77-85, 1993. http://www.usc.edu/dept/ATRIUM/Papers/Articulation.ps

P. Mi and W. Scacchi. A Meta-Model for Formulating Knowledge-Based Models of Software Development. Decision Support Systems, 17(3):313-330. 1996.  http://www.usc.edu/dept/ATRIUM/Papers/Process_Meta_Model.ps

M. Nissen. Valuing IT through Virtual Process Measurement. Proc. 15th. Intern. Conf. Information Systems, Vancouver, Canada, 309-323, 1994.

J. Noll and W. Scacchi. Supporting Distributed Configuration Management in Virtual Enterprises. in R. Conradi (ed.), Software Configuration Management, Lecture Notes in Computer Science, Vol. 1235, 142-160, May 1997.
http://www.usc.edu/dept/ATRIUM/Papers/DHT-SCM7.ps

J. Noll and W. Scacchi. Supporting Software Development in Virtual Enterprises. Journal of Digital Information, revised version to appear, 1998. http://www.usc.edu/dept/ATRIUM/Papers/DHT-VE97.html

W. Scacchi. The life cycle engineering of software process and capabilities: an experience report. Interactive WWW Presentation Materials. http://www.usc.edu/dept/ATRIUM/Process_Life_Cycle.html 1997.

W. Scacchi, Modeling, Simulating, and Enacting Complex Organizational Processes: A Life Cycle Approach. in M. Prietula, K. Carley, and L. Gasser (eds.), Simulating Organizations: Computational Models of Institutions and Groups, AAAI Press/MIT Press, Menlo Park, CA, to appear, 1998.

W. Scacchi and B. Boehm. Virtual System Acquisition: Approach and Transitions. Acquisition Review Quarterly, to appear, 1998. http://www.usc.edu/dept/ATRIUM/Papers/ARQ/VISTA.html

W. Scacchi and P. Mi. Modeling, Integrating, and Enacting Software Engineering Processes. Proc. 3rd. Irvine Software Symposium, University of California, Irvine, CA, 27-38, 1993.

W. Scacchi and P. Mi. Process Life Cycle Engineering: A Knowledge-Based Approach and Environment. Intern. J. Intelligent Systems in Accounting, Finance, and Management, 6(1):83-107, 1997.
http://www.usc.edu/dept/ATRIUM/Papers/Process_Life_Cycle.html

W. Scacchi and J. Noll. Process-Driven Intranets: Life-Cycle Support for Process Reengineering. IEEE Internet Computing, 1(5):42-49, September-October 1997. http://www.usc.edu/dept/ATRIUM/Papers/Process_Intranets.html

(SC21) SC-21 Information System, http://sc21.crane.navy.mil, 1997.

R.F. Simmons and J. Slocum. Generating English Discourse from Semantic Networks. Communications ACM, 15:891-905, 1972.

(SPMN) Software Program Managers Network. The Condensed Guide to Software Acquisition Best Practices, October 1997. Available from SPMN at http://www.spmn.com.

L. Votta. Comparing One Formal to One Informal Process Description. Proc. 8th. Intern. Software Process Workshop, IEEE Computer Society, Dagstuhl, Germany, February 1993.