Volume 6 / Issue 3

DOI:   10.3217/jucs-006-03-0289

Methodologies and Tools For
Continuous Improvement of Systems

William D. Schindel
(International Centers for Telecommunications Technology, Inc., and System Sciences, LLC
schindel@ictt.com)

Gloria M. Rogers
(Rose-Hulman Institute of Technology, Terre Haute, IN
gloria.rogers@rose-hulman.edu)

Abstract: Continuous improvement of hard technology (software, electronic, mechanical, chemical, biological, etc.) systems and institutional (mixed human and technology based) systems is examined from a system perspective, applying system engineering and assessment methodologies and tools. Class and containment hierarchies are used to simplify the modeling of complex systems and their dynamic processes, particularly system families with both shared standardized content and necessary diversity, resolving a historical tension. The engineering concept of "embedded system" is formalized as modeled patterns of embedding management intelligence in both hard technology systems and human institutions. Embedded intelligence models describe intelligent performance, human learning, technical system life cycle improvement, and institutional improvement of all systems. The resulting models describe situationally aware, conscious systems, whether adaptive man-made systems or continuously improving institutions. Models include system requirements, design, verification, and change management. Assessment of system performance against goals determines priority for continuing system improvement. After treating human and hard technology systems on a unified basis, their significant differences are recognized through knowledge worker educational processes, personal reflection on performance, and use of electronic portfolios exhibiting best work. Tools supporting these methodologies include Intranet infrastructure providing computer support for the collaborative work of specifying institutional and technical system requirements, design, assessment, and improvement change management. This approach originates from integrating the methodologies and tools of a collegiate educational institution and a commercial engineering enterprise, applied to educational and industrial client systems, environments, technologies, and markets. The resulting approach creates a unified framework for continuous improvement of systems.

Keywords: Accreditation; Assessment; Business Reengineering; Collaborative Work; Consciousness Models; Continuous Improvement; Education; Embedded System; Hierarchy Theory; Knowledge Engineering; Life Cycle; Management; Methodology; Operations Support System (OSS); Portfolio; Productivity; Reverse Engineering; Software Engineering; System Engineering; Tools; Use Case

Categories: A.1, H.1, I.2.4, K.3.1, K.6.4

1 Introduction

Ideally, evolution of human-made systems is a process of ongoing improvement based upon explicit plans. Successfully changing hard technology or institutional process systems requires greater shared explicit knowledge of those systems than is required to operate them unchanged. Even sustaining current performance for an existing institutional process places these same demands on organizations whose internal population is growing or turning over membership. In competitive environments, the rate at which we successfully improve a system product or an organization can determine ongoing system survival or prosperity.

In this article, we summarize the combined use of two methodologies and Intranet-based collaborative tools, applied to group efforts to improve and sustain systems. We focus on (1) the life cycle system engineering process for evolving families of configurable systems (whether hard technology or institutional processes) and (2) the assessment of progress against continuous improvement goals.

The combined use of these tools and methodologies originated from collaboration of the business and academic sectors. A commercial system engineering firm affiliated with a school of engineering, science, and mathematics collaborated with its academic partner to integrate their methodologies and tools for internal and external uses. The commercial firm provided Systematica™, both a methodology for system engineering families of intelligent configurable complex systems and an Intranet-based tool supporting that methodology across teams. The academic institution provided its Intranet-based tool, e-Portfolio™, combined with an assessment methodology for institutions and individuals performing in complex goal-oriented environments. The partners collaborated to adapt the assessment support technology into a commercial product.

IT systems used to describe and perform institutional processes are sometimes called Operations Support Systems (OSSs). They are the institutional analogue of Embedded Control Systems used to control machines, equipment, vehicles, and other hard technology systems. The Systematica methodology is a unified approach to engineering systems in general, with special provisions for OSSs and Embedded Control Systems. This methodology focuses on families of specializable systems instead of the traditional system engineering focus on single systems. Further, it unifies modeling of families of intelligent (not necessarily IT) systems, based upon models of conscious intelligence and abstract management of arbitrary systems.

Assessment of knowledge worker performance is more complex than assessment of manual worker performance. Assessment has sometimes used portfolios that collect and illustrate the more complex accomplishments of the knowledge worker, directly involving the individual in assessment. e-Portfolio advances this approach by creating an Internet-based multimedia electronic portfolio environment and coupling it with formal data structures used by expert reviewers to rate the advancement of individuals, groups, and organizations' processes against a taxonomy of locally modeled performance objectives. This approach applies to situations ranging from institutional academic accreditation and individual student performance to commercial professional individual and organizational process improvement and certification. e-Portfolio implements, in tangible tool form, some of the most recent thinking in assessment methodologies, "closing the loop" in the continuous improvement process for human and institutional processes.

In this collaboration, the e-Portfolio tool has been used to support system engineering process improvement. Conversely, the system engineering methodology has been used to model environments for continuous improvement of performance. We believe that this synthesis and collaboration are all the more important because of the emerging dominance of knowledge-based work and the projected movement of the center of gravity of learning into life-long continuing education [Drucker 1999]. The long-term efficacy of these methodologies and tools will be assessed over a period of years. Initial results in both commercial and academic institutional environments have been encouraging. Explicit understanding of hard technology systems and of individual and institutional processes has been enhanced, along with planning and assessment of progress.

2 The System Perspective

So much has been written about system life cycle development and organizational change that we might suspect these are fairly well understood. But the challenges experienced by organizations attempting single, not to mention ongoing, change in systems indicate otherwise.

We suggest that this gap is due in part to the lack of a shared system perspective among the members of organizations developing new generations of hard technology systems or institutional processes. To have such a system perspective on evolving systems, we must first have one on systems, whether they are changing or not.

A system is a collection of components, linked by interaction relationships [Schindel 1996][Schindel 1997]:

Figure 1: The System Perspective

This describes a perspective on, not a property of, a system. It describes how we divide it into components and interaction relationships in a mental model.
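A minimal sketch of this perspective, expressed in Python, records a system as nothing more than named components plus the interaction relationships that link them. The class names and the governor example are illustrative assumptions, not part of any tool described in this article.

from dataclasses import dataclass, field


@dataclass
class Component:
    name: str


@dataclass
class Interaction:
    name: str              # e.g., "regulates speed of"
    participants: tuple    # the components linked by this relationship


@dataclass
class SystemModel:
    name: str
    components: list = field(default_factory=list)
    interactions: list = field(default_factory=list)

    def add_interaction(self, name, *participants):
        # The chosen boundary, not the physical artifact, determines which
        # components and interactions appear in the model.
        for p in participants:
            if p not in self.components:
                self.components.append(p)
        self.interactions.append(Interaction(name, participants))


engine = Component("engine")
governor = Component("governor")
load = Component("external load")
model = SystemModel("steam engine with governor")
model.add_interaction("regulates speed of", governor, engine)
model.add_interaction("drives", engine, load)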

While this simplistic perspective might appear obvious, its consequences are often more profound than expected when dealing with complex systems. Man-made systems are becoming more complex, driven by natural forces of competition and enabled by technologies allowing greater complexity. As this complexity grows, it becomes more difficult to describe, understand, communicate about, predict, deploy and install, design and implement, manage, operate, manufacture, monitor, repair, diagnose, configure, maintain, control, evolve, account for, and maintain the security of these systems at acceptable and predictable cost, time, and risk. Forecast growth in complexity threatens to accelerate these trends.

There is evidence that these challenges are not well understood, much less solved, for increasingly complex families of hard technology systems. Thanks to competitive pressure and enabling technologies, we are increasingly more likely, and able, to create complex systems than to completely understand them. A rising human activity is the reverse engineering of hard technology systems that were created only recently, in an attempt to recover an understanding of the requirements these systems are expected to meet, and of their designs [Rugaber et al 1999].

These challenges are greater in systems that are organizations of people, as such systems are more complex and even less well defined than hard technology systems. A portion of the work described here is based upon progress in understanding families of complex systems of the "simpler" hard technology type, followed by adapting these techniques to the more complex made-of-man type. For readers interested only in hard technology systems, it is worth remembering that many organizations find their ability to engineer hard technology successfully is limited by their institutional processes of system engineering. So, even for the hard-core technologist, there is no escaping the importance of understanding institutional processes.

Emerging understanding of the principles of organization of process-oriented institutions is (at best) recent. No less an authority on organizational change than Michael Hammer, the noted business re-engineering pioneer, writes in 1999 of organizing around the key enterprise processes, the difficult trade-offs between process standardization and diversity, the challenges of organizing for ongoing change, and the evolving role of management in such an organization [Hammer and Stanton 1999]. This suggests the nature of re-engineering is still being explored.

Modeling a human organization and acting on that model is the task of management, whether performed by a traditional line manager or by self-managing teams and self-directed individuals coached by new-age managers.

Peter Drucker, management's pioneering scholar, showed with Management By Objectives [Drucker 1954] that the concept of system requirements applies to institutional systems. The specification of requirements and design of both hard technology and institutional systems is an economic and market-driven activity. Establishing accountability for system performance is not possible without a method of accounting. The work summarized in this article provides a system of "bookkeeping" for families of configurable intelligent systems, based upon system engineering and economic principles.

Many historical efforts at improved organizational performance have included aggressive investment in information technology. Information systems authority Paul Strassmann writes of the lack of correlation between these investments and the returns gained from them, implying a gap in understanding by those investing in, implementing, operating, or managing these systems [Strassmann 1990], [Strassmann 1994]. More recent explosive growth of Internet utilization illustrates Drucker's [Drucker 1999] thesis that we proliferate data more readily than we distill meaningful information. Combating the modern-day "Tower of Babel", the embedded intelligence methodology described here deals with the underlying semantics, or meaning, of information about systems [Schindel 1997].


Drucker reminds us that before 1900, hardly anyone other than soldiers, clergy, and teachers worked for organized institutions; that four fifths of the U.S. population performed manual labor in farming, domestic service, and blue-collar work; and that only today has the largest single occupational group become that of "knowledge workers", another Drucker category [Drucker 1999]. The appearance of commercial enterprise management in the 1800s and its growth in the 1900s followed shifts in patterns of demand, productivity, communication, and ownership. The emergence of professional management shifted coordination of production and distribution from the market to explicit management processes, as chronicled by Alfred Chandler [Chandler 1977]. New patterns involving the emergence of knowledge work, creation of value, and management of institutional performance are again shifting the work and responsibility of management [Drucker 1999]. The structure of the economic system is itself projected by Drucker to shift, and he marks improving the productivity of knowledge workers as the greatest management challenge of the twenty-first century.

A further indication of the inconsistency in prevailing system views of human organizations is that we often demonize "Taylorism" as the historic archetype of wrong-headedness in seeking to standardize the work of individuals and organizations, while simultaneously glorifying "best practices" as valued patterns we must install and vigorously emulate to optimize our performance [Kanigel 1997], [O'Dell and Grayson 1998]. Drucker, on the other hand, views Taylor as the single most influential American, the inventor of the one American philosophy that has most profoundly swept the world.

A unifying theme in the work of Taylor, Drucker, Strassmann, and system engineers is the importance of sound (economic, scientific, or engineering) models and measures of system performance. Today's environment still includes emotion-laden issues, non-scientific technical jargon, and gaps in understanding of both human-made technology systems and complex process-oriented institutions.

3 System Engineering - Methodology and Tools

In this section, we review the use of a system engineering methodology and tools to improve complex system outcomes. These outcomes are essential to continuous improvement of hard technology systems and institutional processes in all types of institutions.

3.1 System Engineering Methodology

The commercial enterprise of one of the authors performs system engineering for industrial clients at three progressive levels:

  1. Level 1 System Engineering Practices: This is the subject of traditional system engineering, practiced since the military systems of World War II. This foundation includes both well-recognized system engineering reference models (e.g., SE-CMM [Bate et al 1995]) and the addition of specific methodology, including formalisms and tools that aid in solutions. Classical topics include defining systems (requirements engineering, design engineering, reverse engineering of existing systems), validation and verification of those systems, and management of configuration, change, and operation of those systems. The emphasis is on single complex systems.
  2. Level 2 System Engineering Practices: These extend traditional (Level 1) practices to multiple systems with common (standardized) content but also diversity for individual applications, markets, or needs. The ideas of product lines, enterprise divisions, patterns, common content, economic leverage, and specialization of process, functionality, and configuration are part of this perspective. The emphasis is on global optimization of families of configurable complex systems that can be specialized for diversity without weakening common content.
  3. Level 3 System Engineering Practices: The subject here is the extension of Level 1-2 practices to the problem of intelligent systems: systems with components explicitly embedded to improve the behavior of the overall system in dynamic environments. These are management (a.k.a. control or embedded intelligence) systems. The meaning of intelligence, management, and the meaning of information are part of Level 3 concepts. The emphasis is on reusable patterns of embedded intelligence in complex systems.

3.1.1 Level 1 SE Methodology: System Engineering of Single Systems

The problems of system engineering have been widely studied for fifty years, stimulated by the generation of complex human-made systems during World War II [Bate et al 1995]. This is not to say that system engineering is mastered in practice by most organizations. Continuous improvement of team-based system engineering processes in large organizations is a subject of great attention, and one of the applications of continuous improvement addressed by the methodologies of this article.

In Level 1 practices, system engineering methodology is structured for later extension in Levels 2 and 3. Supporting the SE-CMM reference model [Bate et al 1995], it particularly deals with the issues of interactions between systems:

  1. Who? - What is the boundary of the Subject System [Figure 2]: with what external systems does it interact, and through what interfaces?
  2. What? - What are the outcomes of the interactions that the Subject System is required to perform with the external systems of its environment?
  3. When? - What are the dynamic or temporal relationships: when should these interactions occur?
  4. How? - What is the internal design of the Subject System to meet these requirements?
  5. Why? - What is the rationale justifying the requirements and the design?

[Figure 2] provides the context for these issues. Requirements statements apply at the external boundary of the Subject System, indicating what it must be or do as seen from its environment. Design statements apply to the system interior, indicating how internal components and relationships are arranged to support the requirements.
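The boundary/interior distinction can be made concrete with a small, hedged sketch: requirement statements are attached to the external boundary, design statements to the interior, and every design element records the requirements it supports. The field names are illustrative assumptions, not the Systematica schema.

from dataclasses import dataclass, field


@dataclass
class Requirement:
    rid: str
    text: str        # what the system must be or do, as seen from its environment
    interface: str   # the external interface at which the interaction occurs


@dataclass
class DesignElement:
    did: str
    text: str                                      # how the interior is arranged
    supports: list = field(default_factory=list)   # ids of requirements supported


def untraced_designs(designs):
    """Design elements that record no requirement they support."""
    return [d for d in designs if not d.supports]


reqs = [Requirement("R1", "Deliver constant shaft speed under varying load",
                    "mechanical power interface")]
designs = [DesignElement("D1", "Centrifugal governor modulates the steam valve",
                         supports=["R1"])]
assert untraced_designs(designs) == []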


Figure 2: Subject System Interacting with Systems in Its Environment

Despite the fact that designs must be traceable to the requirements they support, it is not the case that design steps always follow requirements steps in time, or even that they should. [Figure 3] illustrates the iterative, "no beginning" nature of the Requirements-Design process.

Figure 3: The Requirements-Design Iteration

There are many historic examples in which "requirements" have been driven by "designs", in the sense that new technologies or design ideas have permitted the inclusion or recognition of "requirements" that would otherwise not have been practical to include in system goals. The methodology recognizes this relationship, while supporting the real need that all design aspects be traceable to the requirements that they were intended to support.

The required system-environment interaction functions in this methodology are formally modeled without reliance on natural language as the substrate of specification, using RNA Transaction Models™. These functional requirements are furthermore placed into class hierarchies of types of functions and decomposed into containment hierarchies of sub-functions.


Figure 4: System Interactions

Level 1 practices include establishment of formal interfaces. The systems may be mechanical, electronic, chemical, biological, software, human enterprises, etc. Interactions at interfaces may involve physical variables or symbolic information.

The "when" portion of the Level 1 methodology groups functions into environmental Situations, which are the system engineering analogue of software use cases. [Jacobson, 1992] These are treated formally as states, and modeled in finite state machines or continuous trajectories. The states are similarly placed into class and containment (sub state) hierarchies.

Tool support for Level 1 practices includes the requirements, design, validation, documentation, and change management processes, including integration with other engineering tools. Traditional automated tracing of decomposition and rationale relationships is supplemented by tracing of inheritance of patterns of requirements and designs. This establishes an environment for system engineering that can simultaneously support document-driven and database-driven team processes.

Some aspects of the Level 1 system engineering practices could be viewed as object-oriented analysis of general (non-software) systems. In fact, UML notation [Booch, Rumbaugh, and Jacobson 1999] is utilized, borrowing again from the software world.

3.1.2 Level 2 SE Methodology: System Engineering of Families of Configurable Systems

The Level 2 practices extend the Level 1 practices to the engineering of families of configurable systems with common content. These practices recognize the historical problems encountered by different practitioners:

  • Software designers: software component "re­use"
  • Mechanical engineers: reduced manufacturing parts counts and common parts
  • Organizational process designers: standardized processes and best practices

All of these practitioners are faced with natural pressures that erode standardization through variation, reducing economic leverage. These pressures are powerful, and are not dismissed by force of standardization alone. Gaining and maintaining leverage in an environment of multiple product lines, organizational divisions, or operational systems requires an understanding of these forces and a methodology that allows specialization without sacrificing standardization.

The Level 2 Practices use class hierarchies that allow development and evolution of families of configurable systems, shown in [Figure 5]. Whether these systems are hard technology, institutional process, or both, they consist of multiple specialized systems organized into classes, product lines, or common categories. These higher categories are in turn arranged based on their class similarity in the same way, progressing eventually to the core technologies, processes, or competencies of the organization, at the top of [Figure 5]. Whereas this approach is often invoked informally, this methodology converts it to a quantitative model with system bookkeeping, resulting in a discipline for developing, auditing, and evolving families of systems.

This process does not force the amount of common content in systems, but allows its optimization. Based upon economic judgment, the optimal arrangement in one business process or product line may be a high degree of commonality, with only limited variation allowed in specializing for local needs. In another process or product, a higher degree of local specialization may be optimal, with only limited common content. Level 2 Practices provide the means of having this argument objectively, and of keeping track of decisions and implementation as a quantitative discipline. This approach attacks the problem of standardization versus diversity reported for institutional processes by Hammer [Hammer and Stanton 1999].

Scientific bookkeeping allows calculation of Return on Variation™ when we model variations on common themes, to understand the benefit and cost of specialization for individual markets, customers, organizational units, etc. It also allows the application of Gestalt Rules™ supporting the use of patterns in system engineering.
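Return on Variation™ is a Systematica trademark whose published definition is not reproduced here; the following stand-in calculation merely illustrates the kind of benefit-versus-cost bookkeeping involved when a specialization is proposed against common content.

def return_on_variation(added_benefit, added_cost_of_variation):
    """Illustrative ratio of the benefit gained from a specialization to its added cost."""
    if added_cost_of_variation <= 0:
        raise ValueError("the cost of a variation must be positive")
    return added_benefit / added_cost_of_variation


# A market-specific engine calibration: extra revenue earned versus the extra
# engineering, manufacturing, and support cost of carrying the variant.
print(return_on_variation(added_benefit=250_000, added_cost_of_variation=80_000))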

[Figure 6] illustrates the idea of class hierarchy for Engines as systems. [Figure 7] illustrates those same Engines in containment hierarchy. Class and containment are organizing relationships for systems, which are different from interaction relationships [Figure 4] for the same systems. [Figure 8] illustrates how all of these organizing and interaction relationships are conceptually combined by the Systematica methodology and supporting tools.
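The distinction between the two organizing relationships can be sketched for the Engine example; the particular engine classes and parts below are invented for illustration, and interaction relationships [Figure 4] would be a third, separate kind of relationship over the same model elements.

CLASS_HIERARCHY = {
    "Engine": ["Diesel Engine", "Gasoline Engine"],
    "Diesel Engine": ["Marine Diesel", "Industrial Diesel"],
}

CONTAINMENT_HIERARCHY = {
    "Diesel Engine": ["Fuel System", "Cooling System", "Control Module"],
    "Fuel System": ["Injector", "Fuel Pump"],
}


def descendants(name, hierarchy):
    """All direct and indirect children of a node in either hierarchy."""
    result = []
    for child in hierarchy.get(name, []):
        result.append(child)
        result.extend(descendants(child, hierarchy))
    return result


print(descendants("Engine", CLASS_HIERARCHY))               # specializations of Engine
print(descendants("Diesel Engine", CONTAINMENT_HIERARCHY))  # parts of a Diesel Engine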

Use of patterns to simplify and understand complex systems has a long heritage.

System patterns become most sophisticated when they are patterns of behavior. These reduce complexity through behavioral abstractions that recur across different systems. In this methodology, we classify dynamic behaviors into class and containment hierarchies with the same ease as we classify static objects. The idea of classifying dynamic processes into formal class hierarchies may seem different from classifying static patterns such as shapes, natural objects, people, etc. However, experience shows that we can describe both in the same way. In fact, the study of interacting systems in physics and the cognitive sciences shows that objects owe their origin to processes, and that classification of patterns of objects in hierarchies is in fact classification of patterns of processes. This is related to the ideas of polymorphism and classes of interfaces in object-oriented software.

Figure 5: Families of Systems

Figure 6: Class Hierarchy

Figure 7: Containment Hierarchy


Figure 8: Combined Class and Containment Hierarchy, and Interaction

Figure 9: Class Patterns of Dynamic Behavior

Tool support for Level 2 Practices includes the modeling of class hierarchies of systems, interfaces, interaction functions, and states, measurement of common content and variation, and audit of pattern conformance in support of a discipline-oriented approach.

3.1.3 Level 3 SE Methodology: System Engineering of Intelligent Systems

3.1.3.1 Management as Embedded Intelligence

A uniform approach is used for the modeling of embedded intelligence, whether automated agents (e.g., programs, controllers, regulators, operations systems) or expert human agents. In commercial, military, and institutional systems, this reflects today's reality. In the airplane cockpit, network control center, truck cab, fleet operations center, machine shop, facility control room, construction site, factory, or hazardous environment site, it is common to find intelligent sensing, analysis, and control shared between embedded human and automated agents. [Stiles and Glickstein 1991], [Glickstein 1984]

In some environments, human agents are dominant and only limited automated assisting agents appear. In other environments, automated agents are dominant and human agents intervene only for exceptional tasks requiring high expertise. Many examples exist of fully automated agent populations. Other examples exist in which only human agents are visible. The trend over time is that, in a given environment, human agents' previous tasks are replaced as technologists and process experts learn how to automate tasks. The century-old battleground of Taylorism's origin was the individual machinist position in a machine shop. Today, that individual position is often 100% automated, with no operator in sight: either no human at all, or a human very distant, in a slower-response supervisory control loop. As more sophisticated automated agents take over former tasks of human agents, new tasks demanding higher skill are often developed and assigned to human agents.

In this methodology, we define the "intelligence" of a system as the degree of its functional adaptability to operate effectively within different environmental states. This approach avoids issues of technology, and specifically avoids reference to computers, software, or people. The centrifugal ball governor on Watt's steam engine made that system more capable of delivering constant-speed drive to varying external loads, and therefore more intelligent in this perspective, without the use of human or electronic technology.

Figure 10: Interpretation of Intelligence


The "IQ Test" for Systems 1 and 2 of [Figure 10] is to provide varying environmental states, then decide which performs better using objective functions of performance.

We use a definition of management that is consistent across institutional human management and embedded technological control systems. We define "management" of a system as the modeling, monitoring, or control of the performance, configuration, fault, security, or accounting aspects of that system. This approach borrows and abstracts from the narrower use of the System Management Functional Areas (SMFAs) originating in the ISO model of computer and network system management [ISO 1991].

Our definition of management is consistent with the definitions of human management of institutions found in Drucker [Drucker 1999]. It is related to the definition of management by Strassmann [Strassmann 1990]. Strassmann's definition is concerned with institutional-level management (supported by the use of institution-level financial data to measure Return on Management™). He excludes from his definition of management the embedded systems contained in institutional production and service operations directly serving customers. Our definition treats embedded intelligence at all these levels as management (supporting our use of a universal model of management), but can distinguish institutional-level management to break out a management component consistent with Strassmann's. Our approach and Strassmann's are consistent in viewing all management as information management.

Using the Level 2 pattern discipline, in Level 3 we find that patterns of management system intelligence emerge. These originate from the management model of [Figure 11], sketched in code after the list below. In this model:

  • A Managed System (MDS) is a system that provides services to a System of Users (SOU), and that we plan to manage;
  • A Management System (MTS) is a system that models, monitors, or controls the performance, configuration, fault, security, or accounting aspects of a Managed System, providing management services to the System of Users;
  • A System of Users is a system (whether human or technological) that consumes the services of the MDS and MTS;
  • A System of Access provides the means of interaction of MDS, MTS, and SOU.
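A minimal sketch of this pattern follows; the class and method names are an illustration of the MDS/MTS/SOU roles and the five SMFA aspects, not an interface defined by the methodology.

SMFAS = ("performance", "configuration", "fault", "security", "accounting")


class ManagedSystem:
    """MDS: provides services to the System of Users."""
    def __init__(self):
        self.state = {area: {} for area in SMFAS}

    def serve(self, request):
        return f"service result for {request}"


class ManagementSystem:
    """MTS: models, monitors, and controls the MDS on behalf of the SOU."""
    def __init__(self, mds):
        self.mds = mds

    def monitor(self, area):
        assert area in SMFAS
        return self.mds.state[area]

    def control(self, area, setting, value):
        assert area in SMFAS
        self.mds.state[area][setting] = value


# The System of Access is reduced here to direct calls; in practice it would be
# a network, Intranet, or other interaction medium.
mds = ManagedSystem()
mts = ManagementSystem(mds)
mts.control("configuration", "speed_setpoint", 1800)
print(mds.serve("drive load"), mts.monitor("configuration"))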

Tool support for Level 3 includes support of patterns of intelligence, human agent and automated agent process assignments, standardized and specialized processes in general, and modeling and implementation of system engineering process standards.

Practiced across multiple technical and institutional environments, this has led to a family of reusable patterns of intelligence that apply well across all the intelligent system environments we have encountered.


Figure 11: Embedded Intelligence Summary Model

3.1.3.2 Relation to Modeling Conscious Systems

Modeling of conscious systems in nature has moved in recent years from a speculative or philosophical activity [Dennett 1996] [Churchland and Sejnowski 1992] to a pursuit in the biological and cognitive sciences. [Edelman 1988] [Edelman 1987] [Edelman 1989] [Crick 1994] [Damasio 1999] The Level 3 system engineering practices referenced here borrow perspectives from those endeavors.

Without suggesting that consciousness, if defined and present, would be of the same nature in different kinds of systems, our interest is in systems exhibiting aspects of conscious behavior.

For example, robust institutions exhibit a shared explicit awareness of the goals, status, direction, environmental threats and opportunities, and spirit of the organization. These institutions are better prepared to engage in continuous or ongoing change [Senge 1990]. Likewise, a higher value is placed on technical products that adapt to differing environmental or application missions without being re-engineered. These ideas suggest that situationally aware systems are of interest, and the situational state-based model of performance provided by the Level 1 practices is enhanced by the Level 3 model of intelligent management.

From the cognitive sciences, we have borrowed ideas such as attention modeling to expand the detailed model behind [Figure 11]. The patterns are consistent with Damasio's model of being aware of awareness in biological consciousness [Damasio 1999].

3.2 System Engineering Tools

Methodologies and tools never stand alone, but interact with each other. Frederick Taylor's holistic vision included not just the (rigid) machine operator procedures for which he became widely known. He simultaneously developed equally revolutionary new tools and machines, high-speed machinable steel alloys, employee compensation methods, and even a financial accounting infrastructure.


As shown in [Figure 12], all of these combine in an interacting system fabric in which each is dependent upon the other. They all must evolve together, and the result is only as good as the weakest link.

The resulting combined system is itself subject to system engineering. This approach has been used in the development of the Systematica methodology and tools summarized in this article. The result is a situation-based model of system engineering tasks, appropriate tools, assignment of tasks to system engineers, managers, and automated agents, and the specialization of the model to local process needs.

Figure 12: Methodologies, Tools, and Tasks Move Together

The methodology does not depend upon the use of a specific tool, and can use a variety of commercial or customized tools. Systematica tools have been constructed as a part of the work summarized here. These tools actually constitute an infrastructure technology that automatically generates specialized tools fitted to the local engineering process needs of individual enterprises. The tools are model-driven, meaning that instead of writing and modifying programs, the keepers of these tools adapt them to local needs by entering models of the organization's desired processes and systems. This explicit modeling, instead of encoding in implicit program algorithms, also opens and enhances the process for continuous improvement.
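The model-driven idea can be sketched as follows: the tool reads a stored process model and generates its behavior from it, rather than hard-coding that behavior. The model content, and the plain dictionary standing in for the relational/object database, are illustrative assumptions only.

PROCESS_MODEL = {
    "entity": "Requirement",
    "fields": [
        {"name": "rid", "label": "Requirement ID", "required": True},
        {"name": "text", "label": "Requirement Text", "required": True},
        {"name": "rationale", "label": "Rationale", "required": False},
    ],
    "change_control": {"approval_role": "configuration manager"},
}


def generate_form(model):
    """Render a simple text description of a data-entry form from the stored model."""
    lines = [f"Form for {model['entity']}:"]
    for f in model["fields"]:
        flag = " (required)" if f["required"] else ""
        lines.append(f"  {f['label']}{flag}")
    lines.append(f"Changes approved by: {model['change_control']['approval_role']}")
    return "\n".join(lines)


print(generate_form(PROCESS_MODEL))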

The technology components of the Systematica tool infrastructure fall into several categories:

  1. Database: The tools use a relational database upon which an object model layer has been overlaid to support class hierarchy and other structure. The database contains both updateable process models governing the operation and use of the tool, and the base of information describing modeled systems, including specifications of requirements, designs, tests, rationale, etc. Update of the database is controlled by the modeled rules of the configuration change management process.
  2. Process governing rules, models, and patterns: The hierarchy of reusable patterns is separable from the tool, and describes the generic and specializable patterns that may be inherited.
  3. Interactive user access: User access is moderated by web-based technology that can be embedded in an Intranet. User interfaces are generated by modeled rules, providing an adaptable interface technology that does not require programming for basic specialization. The interface technology includes embedding of technical system graphical models consistent with the methodology.
  4. Views and Systems of Access: Views include interactive web-based human views, automatically generated or input documents, and views interacting with other enterprise systems and tools. Systems of access include Internet browser technologies and system-to-system interfaces, mapped by XML models.
  5. Incorporation of industry standard components and technologies: The use of open interfaces and standards has made possible the inclusion of best-of-breed packages for graphics, word processing, report generation, and other functions, while also enabling others to integrate third party tools, components, and operations systems.

3.3 Applications of the System Engineering Methodology

Systematica methodology has been applied to both hard technology and institutional process system improvement. The following sections summarize domains of application experience with both types of systems.

3.3.1 Application to Hard Technology Systems

Over the last thirty years, the system engineering concepts described have been developed and applied in a number of settings involving real-world hard technology systems, including:

  1. Telecommunications Industry: Applications have been for families of software­defined embedded systems, used to manage rapidly proliferating networks providing voice, data, and video services offered by high­end telecommunications carriers. These settings included management systems for some of the largest scale networks in existence, managing the performance, configuration, fault, accounting and security aspects of diverse service­providing networks. Practice in this area has seen the movement to digital transmission and switching, software­controlled network elements, proliferation of network technologies and services, including most recently wide­band optical transmission and convergence of services in Internet, cable, and telephone environments, and the extension of the network to mobile devices. This period has seen ongoing international efforts to establish standards for the management of these systems. System engineering work has included specifying requirements and design architecture, implementing systems, developing approaches to automated mediation of different generations of management agents, and reverse engineering of existing systems to extract designs and requirements, for some of the world's largest telecom carriers and their major suppliers. The combination of deregulation and new technology has stimulated the entry of many new players and fragmented a formerly integrated network into an exploding diversity of businesses, systems, and services.
  2. Power Systems, Construction, Heavy Equipment, and Automotive Industries: System engineering activity has been in an environment of large-scale fuel-injected diesel engines, high-end industrial electrical induction motors, electrical generators, hydraulic power systems and diverse arrays of implements and material processors, track- and wheel-based traction and steering systems, and machine applications for construction, excavation and underground boring, transportation, mining, agriculture, and power generation. These settings have included some of the largest-scale construction, mining, and power systems on the globe. The period has seen the invasion of these previously mechanical systems by embedded electronic and software control system technologies, creating new levels of complexity of behavior, configuration, engineering, and service. Competition and governmental regulations of air quality and other factors have combined with technology to drive and enable new levels of complexity in these systems. Following patterns of other industries, these systems increasingly embed intelligence managing the performance, configuration, security, accounting, and faults of the systems. Purely mechanical system efforts have included concerns with greater modularity of mechanical assemblies and reduced parts counts with higher common content. These industries are major drivers of trends to embed mobile, distributed, networked systems.

  3. Aerospace Industry: As the earliest generator of complex man-made system engineering processes, the aerospace industry has often set abstract patterns of system evolution which can be seen to have been followed years later by other industries, such as those listed here. Representing the ultimate in "competition", defense systems were once viewed with a more open-checkbook perspective than today's concern with cost allows. This has resulted in movement from engineering "clean sheet" new systems to the engineering of families of systems configured for use in multiple service branches and environments. System engineering applications have included airborne weapon delivery and navigation systems and military communication network control systems.
  4. Medical Systems and Health Care Industry: This industry operates health care systems, develops pharmaceuticals and medical devices, and operates administrative networks concerned with everything from insurance to utilization and outcome data. Biological systems are ultimately the most complex, and while this field is newer to system engineering, the advent of genetically synthesized pharmaceuticals, embedded biomedical devices, and pattern recognition systems could drive it through the maturation patterns seen in other industries more rapidly. System engineering applications have included pharmaceutical manufacturing process modeling and AI applications.

3.3.2 Application to Institutional Systems: Engineering of Freedom and Constraint

The following sections first summarize domains of institutional process experience with the methodology, then elaborate on the challenges special to human processes.

3.3.2.1 Institutional Applications

During the last twenty years, the system engineering concepts described have been applied to a number of institutional system settings, including:

  1. Telecommunications Industry: Embedded telecom Operations Support Systems (OSSs) merge the workflow efforts of people and automated agents. These model, control, and monitor the core institutional processes of managing carrier networks. They have included customer service provisioning, from order entry through service cut-over and billing; network engineering and equipment provisioning processes; customer trouble reporting bureaus, testing, and service dispatch processes; network equipment fault analysis and response processes; and network performance tracking, analysis, and response processes.
  2. Power Systems, Construction, Heavy Equipment, and Automotive Industries: This has included system engineering modeling of the processes of equipment servicing, configuration management and provisioning of field systems, distribution of software releases to mobile systems, operation of trouble bureaus for distributor information networks, and the modeling and operation of the system engineering process for enterprise-wide embedded control systems.
  3. Enterprise Information Systems: Across many industries, the system engineering cycle has been applied to common institutional process patterns found in enterprise computing and network trouble bureau, help desk, and IT equipment asset management processes.
  4. System Engineering Services: This activity has focused on the modeling and operation of the internal system engineering institutional processes used in support of multiple clients.

3.3.2.2 Special Considerations for Engineering Human Processes

The combined methodology addresses the need to be sensitive to the complexity of human processes in comparison to hard technology systems, and to recognize the challenges inherent in modeling, much less changing, work processes. The century-long stormy history of Taylor's Scientific Management offers lessons to which we have paid heed [Kanigel 1997]. Key aspects of this approach address these issues, including:

  1. The use of class hierarchies of process models to allow for variation while enforcing standardization to the degree that it is modeled as intended. This continuum revises the historically bipolar perspective of having to choose either standardization or diversity, reported by Hammer [Hammer and Stanton 1999].
  2. The use of self-reflective electronic portfolios to put more of the management of knowledge workers' improvement into the hands of knowledge workers themselves, as emphasized by Drucker [Drucker 1999].

Improvement involves learning, whether in an academic institution or during ongoing professional development. Continuous institutional improvement includes improvement to the process and content of learning. [Senge 1990] This section summarizes the model of professional development process improvement within a commercial enterprise, and [Section 4] discusses the educational institution case.

The application of embedded management systems in engines, terrestrial and space vehicles, airplanes, factory machines, pipelines, networks, and other man-made systems is often focused on improving the performance or behavior of these engineered systems. Hard numbers and objective criteria are used to measure this improvement: in miles per gallon, reduction of waste, reliability and availability, response time, reduced emission levels, costs, production per day, or other performance attributes.


When the same ideas are applied to embed management systems into organizations of people, equally clear, crisp criteria, traceable from original intentions through implementation, may appear to be lacking.

Before Frederick Taylor developed the Scientific Management of (initially) machine shops, the shop workforce was made up of highly experienced expert workmen, each of whom had his own ideas about the best way to produce parts, apply tools, and interact with the organization. Taylor's approach of detailed, codified production procedures was fought by a resistant workforce that felt invaded by rote directions and productivity pressure. Even while it was being widely adopted, Taylor's "one best way" approach was called into question by various detractors, and a century later "Taylorism" has become a term synonymous with overly rigid management philosophy. However, looking beyond the charges of rigidity and conformism to the underlying scientific basis of this work, Drucker considers Taylor to have made the greatest contribution of any American to the worldwide development of production [Drucker 1999].

Ironically, today's machine tools and shop operations have carried this model much farther than Taylor did, encoding tool selection, tool speeds, machining trajectories, and sequences into fully automated machine tool processes with little or no human intervention. The result has been a new and elevated generation of human tasks: programming the production process and integrating design with production. Was Taylor wrong on all counts, or were critics missing the underlying point of objective system modeling?

The landscape has changed since Taylor's Scientific Management became twentieth-century Industrial Engineering. Today, knowledge workers (another Drucker concept) constitute 40% of the U.S. workforce and are becoming the largest component of the workforce of each developed nation. The capital assets of the corresponding businesses are now these knowledge workers, not the machine tools of Taylor's time [Drucker 1999].

When business America jumped into computerized mechanization of enterprise information processes, the promise of higher corporate productivity provided the rationale for investment of billions of dollars in computer hardware, software, and operations. But Strassmann's studies indicate that this return was poorly correlated with IT investments across many enterprises. The very concept of Return on Management™ taught by Strassmann illustrates that the idea of just what such a return might mean apparently escaped many business leaders or implementers somewhere between intention and implementation [Strassmann 1990].

In [Section 4], we review the use of "portfolios" as a part of assessment of processes for improving the performance of human agents over time (learning). In preparation, we review the system modeling of the human portion of these systems in commercial organizations of knowledge workers. This model is used to support a common perspective of the complex human tasks to be performed, the processes used to improve that performance, and the assessment of those improvement programs to determine their effectiveness.

[Figure 13] summarizes the model in simplified form.

This model has been chosen to parallel the system models used for the human-made hard technology system components that work in parallel with the human agents. It describes aspects of performance, learning, and improvement that can apply within both educational institutions (where the task performers and learners may be students) and commercial enterprises or other institutions (where the task performers and learners may be staff members improving their performance over time).

In this model, the following definitions apply:

Task Performer: A human agent, intended to perform complex tasks to a given level of performance. This agent may be, for example, an engineer, scientist, writer, or other professional. This agent typically interacts with other agents in this performance.

Figure 13: Informal SE Model of Performance, Assessment, Improvement Environment


Managed System: The system upon which the human agent's tasks are performed. For example, a Managed System may be a natural or man­made system that is being analyzed by an engineer or scientist, or designed or synthesized by an engineer, or composed or rendered by a writer or an artist, or operated by an operator. The Managed System provides most of the semantics of (domain knowledge about) the tasks to be performed by the Task Performer.

Performance Situation Type: A situation is a state of the environment of the Task Performer. The Performance Situation Type is the system engineering equivalent of a Use Case in the software world, modeled in advance of the performance of Tasks. The content of these situation models includes a description of performance objectives and indicators of performance. These models are developed from the mission and vision of the organization in which the Task Performer operates, based on inputs from the organization's stakeholders.

Performance Situation Instance: Actual occurrences of Performance Situations, in which tasks are performed by the Task Performer. These are recorded by e­Portfolio when the Task Performer considers them examples of best work.

Mentor, Coach, Teacher, Process Specifier: A human agent responsible for instructing Task Performers in the performance of the tasks, coaching them in that performance, evaluating performance, or specifying performance.

Coaching or Instructional Situation Type: A situation­based state (Use Case equivalent) in which instruction or coaching is to occur. Examples include classroom situations, laboratory situations, on­the­job performance situations, project situations, co­curricular activity situations, etc. These models include overall educational practices and strategies, along with expected learning outcomes.

Instructional Curriculum: Coaching or Instructional Situation Types are aggregated into collections that constitute a curriculum planned and optimized to accomplish an overall educational outcome.

Curriculum Map: A mapping of Instructional Situations to Performance Situations, indicating the plan of coverage of the Performance Situations by the Curriculum. This is valuable for understanding how the Curriculum addresses the intended outcomes.

Curriculum Planner, Assessor: A human agent responsible for establishing, analyzing, and improving the Curriculum over time, including advance planning, retrospective assessment, and improvement planning.

Curriculum Change Situation Type: A situation-based state (Use Case equivalent) representing a possible requested change to improve the Curriculum. These are pre-modeled into Types representing possible future requests.

Curriculum Change Situation Instance: An actual occurrence of a Change Situation (Change Request), provided for resolution. The ongoing improvement process analyzes and prioritizes these instances for disposition as (possible) curricular change.
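These definitions can be sketched as simple data structures; the field names, the example curriculum map, and the coverage check below are illustrative assumptions, not the e-Portfolio schema.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class PerformanceSituationType:
    name: str
    objectives: List[str]     # what good performance looks like in this situation
    indicators: List[str]     # evidence an expert reviewer uses to rate performance


@dataclass
class PerformanceSituationInstance:
    situation_type: str
    performer: str
    artifact: str                   # link to the best-work evidence in the portfolio
    rating: Optional[int] = None    # assigned later by an expert reviewer


CURRICULUM_MAP = {
    "design project course": ["open-ended design", "teamwork"],
    "technical writing course": ["written communication"],
}


def uncovered_situations(situation_types, curriculum_map):
    """Performance Situation Types that no instructional situation claims to cover."""
    covered = {s for situations in curriculum_map.values() for s in situations}
    return [t.name for t in situation_types if t.name not in covered]


types = [
    PerformanceSituationType("open-ended design", ["meets stakeholder needs"],
                             ["design review rubric"]),
    PerformanceSituationType("ethical reasoning", ["recognizes dilemmas"],
                             ["case analysis rubric"]),
]
best_work = PerformanceSituationInstance("open-ended design", "student A",
                                         "portfolio entry 123")
print(uncovered_situations(types, CURRICULUM_MAP))    # -> ['ethical reasoning']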

3.4 System Engineering Lessons Learned

Experience in using this methodology has taught lessons including:


  1. A single methodology and set of abstract patterns has been seen to fit a wide range of industries, technologies, and markets for complex configurable families of systems.
  2. Although different industries and markets are at different stages of the process of maturation of these patterns, the fact that others have preceded can be a valuable guide to probable trends.
  3. This methodology and pattern set work equally well for hard technology systems and for human institutional processes, while recognizing their differences.
  4. The problem of maintaining leverage of families of configurable systems with common content is a widespread challenge to many organizations.
  5. Many organizations need to complete the installation of classical (Level 1) single system engineering practices as a part of putting system engineering of families into place.
  6. System engineering of families of systems requires a different organizational and ownership approach than system engineering of single systems.
  7. Continuous improvement of systems is an ongoing need, as organizations merge new acquisitions and introduce new competitive products, as markets change, as technologies open new possibilities, and as the human component of the enterprise turns over or grows. Constant pressures attack the integrity of existing technical and organizational architectures. Constant effort is required to maintain and evolve these architectures in an improvement direction. This effort is better invested when the structure of organizations and processes recognizes the underlying nature of the problem.
  8. Reverse engineering of existing systems is often needed as a key part of the system engineering process. This is a measure of earlier system engineering process outcomes.
  9. Shared models across teams facilitate the process, providing common terminology and system perspectives unifying project semantics.
  10. Availability of a family architecture can greatly accelerate the generation of successful new systems.
  11. While it is natural to seek the "best" architecture for a complex system, often just having an architecture (whether optimal or not) whose model is understood across the process team is a major improvement.
  12. IT infrastructure support suited to the methodology is essential for modeling, managing, assessing, and improving complex systems. This is true both because of the volume and complexity of the information involved in complex systems, and because of the need for objective bookkeeping in the scientific process.
  13. The history of Scientific Management offers a number of useful lessons for those who would standardize organizational processes, but knowledge worker process standardization is also different in some ways from manual worker process standardization.
  14. The portfolio approach is a valuable way to accumulate and reflect upon individual and programmatic history of use and improvement of the system engineering methodology.
  15. Technical system engineering staff can better see the benefit of process improvement programs (CMM, ISO, TQM, etc.) when these programs are cast in the form of objectively system engineered systems themselves. This can reduce resistance to change and encourage involvement of staff in process engineering of their own work.

4 Assessment of Educational Outcomes - Methodology and Tools

In the previous section, we discussed the use of system engineering methodology and tools to solve complex system problems. In this section, we describe the use of assessment methodology and the application of information technology to assess educational outcomes. The use of quality methods to assess outcomes is essential to the continuous improvement of institutional processes in all types of institutions: commercial, educational, military, governmental, and others.

4.1 Assessment Methodology

Institutions are coming under increased pressure to demonstrate to their stakeholders that they are producing quality products and/or services. This pressure comes from state and/or federal agencies, investors, customers, and regulating bodies. For example, in education, new accreditation standards developed by regional accrediting bodies now require institutions of higher education to conduct self-studies that demonstrate that the institution is achieving its academic objectives and has processes in place to ensure the continuous improvement of the educational institution. This requires institutions to identify their objectives in measurable ways and to develop processes to support the continuous improvement efforts. Few institutions have a well-developed set of specific, measurable student outcome goals that are tied to their institutional mission and that guide the educational delivery system both in and out of the classroom. Because most educational curricula are packaged in a structure of required and elective courses, it is difficult to see how the content and strategies being employed in the classroom contribute to the expected general student learning outcomes (if, in fact, the outcomes have been defined). Generally, there is also no system of individual faculty responsibility for seeing that students acquire knowledge or skills outside the individual faculty member's course content area. At the same time, students are often uninformed about the specific learning outcomes they are expected to achieve during their collegiate experience, but are fairly well versed in which combinations of courses are acceptable to get a college degree. In manufacturing, companies must meet externally developed quality standards in order to be competitive in many markets. This requires them to develop and demonstrate processes and procedures that assure the customer that the products produced meet a minimum acceptable quality standard.

At the same time that institutions are being pressured by external forces to develop a system of accountability, there is also a need to manage the process of change that is the result of the rapid rate at which customer and client needs are evolving. Making decisions about the addition of new services or product lines to meet the needs of society and the specific needs of individual customers/clients, and finding ways to produce the desired outcome or product more effectively and efficiently are often hampered by the lack of a well defined system of continuous improvement to support the decision making process.


To support organizational change and accountability and to create a true learning organization where all constituents are involved in meaningful and collaborative ways, it is essential that a continuous improvement (CI) system be developed that supports the effective and efficient delivery of the primary functions of the institution and the monitoring of institutional progress in achieving its desired outcomes. Models have been developed that describe the processes required to support continuous improvement of the educational system. One such model is summarized in [Figure 14]. This model is designed to guide the development of a system of continuous improvement of general outcomes.

Figure 14. Model for Continuous Improvement

The process begins with a vision and mission statement. A clear sense of what the institution's mission is and its vision of the future is the foundation for developing its objectives and indicators of whether or not those objectives are being met. The development of the objectives and indicators requires the input of many constituents both inside and outside the institution. For example, in an educational institution, these constituents include faculty, staff, students, employers, recruiters, boards of advisors and trustees, and, in the case of public institutions, appropriate state agencies. The objectives and indicators and even the mission of the institution may change over time as the needs of the customer/client change and the technology and knowledge base grow. Once the objectives and indicators have been developed, the institution must make decisions about the strategies and practices that it will employ to achieve them. The outcomes that result from the implementation of the institutional strategies and practices are the focus of study to determine whether or not the intended objectives were achieved. This study takes the form of collecting evidence through the assessment process. The assessment process includes the development of data collection methods, the collection of data and their analysis, and the reporting of findings. Once the findings have been reported, it is necessary to evaluate the findings and, where appropriate, make recommendations for change based on the evidence and an examination of the strategies and practices that have been used to achieve the objectives. These findings and/or change recommendations are then fed back to the appropriate internal and external groups for implementation. This process is continuous and, although the processes in the model appear to be sequential and cumulative, they are, in fact, iterative with multiple feedback loops. As one process in the CI model is employed, it informs previous or planned processes, and improvements are made in the processes and/or objectives before the cycle has been completed.
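
To make the iterative character of this cycle concrete, the following brief sketch (written in Python, with hypothetical names; it is illustrative only and is not drawn from any system described in this article) models one pass through the loop, with assessment findings fed back to revise the objectives before the next iteration:

    from dataclasses import dataclass

    @dataclass
    class Objective:
        """A measurable institutional objective and its indicators."""
        name: str
        indicators: list[str]

    @dataclass
    class Finding:
        """An evidence-based judgment on one objective, with any recommendation."""
        objective: Objective
        achieved: bool
        recommendation: str = ""

    def ci_cycle(objectives, collect_evidence, evaluate, feed_back):
        """One pass of the continuous improvement loop (hypothetical sketch).

        collect_evidence gathers assessment data for an objective, evaluate turns
        that data into a Finding, and feed_back routes recommendations to the
        appropriate groups and returns revised objectives for the next iteration.
        """
        findings = [evaluate(obj, collect_evidence(obj)) for obj in objectives]
        return feed_back(objectives, findings)  # revised objectives; the cycle repeats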

4.2 Assessment Tools

The tools that support the assessment process are many and vary with the institutional context and focus of the outcome being studied [Prus and Johnson 1994]. For example, in an educational institution, tools to measure student learning outcomes may include:

  • Standardized exams
  • Locally developed exams
  • Oral exams
  • Competency­based methods
  • Simulation
  • Performance Appraisal
  • Self & third party reports
  • Surveys and Questionnaires
  • Interviews
  • Third party reports
  • Focus groups
  • Portfolios
  • Archival records
  • Behavioral observations

In a manufacturing facility, tools commonly used to measure processes and outcomes may include [Brassard 1989]:

  • Fishbone diagrams
  • Gantt charts
  • Force field analysis
  • Tree diagrams
  • Control charts
  • Histograms
  • Pareto charts
  • Run diagrams
  • Scatter diagrams
  • Nominal group processes

As commercial institutions adopt more quality improvement processes, the distinction between the types of tools used to measure institutional processes, productivity, and outcomes is becoming less pronounced across different types of institutions. In whatever environment measurement tools are being used, it is important to consider that there will always be more than one way to measure any objective and that no single method is good for measuring a wide variety of different objectives. In evaluating the methods to be used, it has been found that there is a consistently inverse relationship between the quality of measurement methods and their expediency. Whatever assessment method is chosen and implemented, it is important to pilot test it to see if the method is appropriate for the objective being measured. Before developing "in-house" assessment tools, research should be conducted to see if appropriate tools have already been developed and tested.

In addition to the traditional format of assessment tools, electronic forms of the tools are being developed. With the proliferation of computing technologies in all areas of society, electronic assessment tools are becoming very prevalent. The use of Web­based surveys, documentation software, and software­based analytical tools for both quantitative and qualitative assessment are commonplace in both commercial and educational institutions. In this section we discuss the development of an electronic portfolio system as a method for assessing institutional effectiveness related to student outcomes.

In education, a portfolio has been described as a "purposeful collection of student work that exhibits the student's efforts, progress, and achievements. The collection must include student participation in selecting contents, the criteria for selection, the criteria for judging merit, and evidence of student self­reflection." [Paulson et al 1991] While there is agreement on the definition of a portfolio, there is no one correct way to design a portfolio process. The design should be driven by a clear understanding of the desired outcome from using portfolios and the specific skills to be assessed. The desired outcome will determine the design and focus of the portfolio process. Portfolios are not an end in themselves and must be developed with a clear vision of the desired outcome. [Arter et al 1995]

There are multiple benefits and some disadvantages in using portfolios of student work as a means of assessing student outcomes. [Prus and Johnson 1994] Portfolios can:

  • Provide multiple samples of student work over time.
  • Give a broader, more in­depth look at student skills and knowledge.
  • Allow raters to base assessment on more "authentic" student work efforts, progress, and achievements.
  • Provide a view of learning and development.
  • Allow multiple components of a curriculum to be measured at the same time.
  • Provide a process of reviewing and grading portfolios that offers an excellent opportunity for faculty exchange and development, discussion of curriculum goals and objectives, review of grading criteria, and program feedback.
  • Provide results that are more likely to be meaningful at all levels (i.e., the individual student, program, or institution) and can be used for diagnostic/prescriptive purposes as well.
  • Increase the "power" of maximum performance measures over more artificial or restrictive "speed" measures on tests or in-class samples.
  • Increase student participation (e.g., selection, revision, evaluation) in the assessment process.

The use of portfolios also has some disadvantages that need to be considered when choosing an assessment method. They include:

  • Problems with storage and administration.
  • Cost in terms of evaluator time and effort.
  • Challenge of establishing reliable and valid rating criteria.
  • Concern of faculty that a hidden agenda of the process is to validate their grading or evaluate the effectiveness of their teaching.
  • Security concerns that may arise as to whether submitted samples represent the students' own work, or adhere to other measurement criteria.

The e-Portfolio was designed at Rose-Hulman Institute of Technology as a means of collecting rich multimedia portfolios of each student's best work across the whole population of students. Modules were created to provide for the asynchronous assessment of student work by trained raters and the aggregate statistical reporting of results. The e-Portfolio system was designed to measure the effectiveness of the overall institutional educational process as reflected by student learning outcomes.

The e­Portfolio has been developed to include the following modules:

  • A web­based curriculum mapping system linking faculty course delivery to specific desired student outcomes
  • A web­based student electronic portfolio system based on a "showcase" model where students select the work they believe best demonstrates their progress toward specific learning outcomes. (This module may be viewed at http://www.rose-hulman.edu/ira/reps)
  • An on­line advisors' module allowing faculty advisors to view their academic advisees' portfolios
  • An asynchronous portfolio rating system to be used by raters (primarily faculty) that has the capacity to maintain the calibration of inter­rater reliability, the evolution of rating rubrics, and feedback to students
  • A student and faculty electronic feedback process that facilitates student and faculty input into the improvement of the CI system
  • A portfolio rating analysis and reporting system to automate the reporting process and update the aggregated rating results on student portfolios in real time
  • An administration module used to modify the system to meet local needs and applications (e.g., change learning objectives/criteria, adopt a longitudinal model instead of a showcase model of portfolio assessment)

4.3 Applications of the Assessment Methodology

In applying the continuous improvement model to institutional processes, Rose­Hulman has completed the first iteration of defining the student learning objectives. A team of faculty then met to evaluate a number of assessment methods to determine what primary method would be used to collect evidence of student learning outcomes. The criteria established to evaluate the different methods were: 1) the method should provide rich information, 2) the method had to be valid in that it focused on Rose­Hulman specific outcomes, 3) the method needed to be non­intrusive on faculty and students, and 4) the method should be relatively easy to administer. The result of the deliberations was a decision to use portfolios as the primary means of collecting evidence and evaluating student learning outcomes.

Concern about the workload required to administer a portfolio process led to the decision to utilize information technology to facilitate the process. The use of an electronic portfolio system would not only provide efficiency in data storage and acquisition but also promote asynchronous assessment in that students could submit documents and review their portfolios and faculty could rate the portfolios at any time from anywhere.

The e­Portfolio CI system requirements were developed primarily by faculty and were designed to be sensitive to the heavy workloads confronting both faculty and students at Rose­Hulman. System designers from the Office of Institutional Research met with faculty and administrators to determine design requirements, preferences for options, and possible future development of the system. Each of these aspects of the portfolio was important to build into the design. Once the system requirements were defined, institutional resources were used to create the electronic infrastructure needed to support the design. The CI system was designed to minimize the amount of human intervention necessary for routine tasks to support and maintain the process. Special attention was given to provide both faculty and student development opportunities through the system design and implementation process. This was done primarily through the prototyping process and assessing both the software and the student/faculty experience. The e­Portfolio design provides for maximum flexibility to accommodate multiple portfolio uses and institutional contexts. The administration module enables adaptability to local requirements.

Secondary applications have emerged during the process of implementing the primary system. These include supporting the reflection by individual students on their own progress, the accumulation of information in support of individual student resumes, and the awareness by faculty raters of student learning in areas outside their own discipline. In addition, the e-Portfolio system is being adapted to support course-level and department-level portfolios. The institution will soon be piloting the use of the e-Portfolio as a means to archive and review faculty promotion, tenure, and retention documentation.


First year students are introduced to the learning objectives and the portfolio system during their first quarter on campus. Because the maintenance of portfolios is the responsibility of the students, faculty are not asked to select or collect materials that are to be placed in the system. Students are expected to choose from among the many artifacts that they produce during their collegiate experience, selecting those they feel best represent their progress toward achieving the desired student outcomes that are formally displayed as process standards in the system. In the process of submitting evidence, students are asked to include reflective statements in their portfolios explaining why they believe the chosen submission meets the stated criteria. The e­Portfolio system is web­based and enhanced by a powerful database. Students can easily submit any documents they can put in an electronic format (e.g., video and/or audio files, scanned documents, etc.).
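
As an illustration only, a submission record in a system of this kind might pair the artifact with the targeted outcome and the student's reflective statement. The sketch below uses hypothetical field names and values and is not the actual e-Portfolio schema:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class PortfolioSubmission:
        """One student-selected artifact submitted toward a stated learning outcome."""
        student_id: str
        outcome: str        # one of the institution's stated learning outcomes
        artifact_path: str  # any electronic format: PDF, video, audio, scanned document
        reflection: str     # why the student believes the artifact meets the criteria
        submitted_on: date

    # Hypothetical example of a "showcase" submission.
    submission = PortfolioSubmission(
        student_id="s12345",
        outcome="Effective written communication",
        artifact_path="reports/design_project.pdf",
        reflection="This report shows my progress in structuring a technical argument.",
        submitted_on=date(2000, 3, 15),
    )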

Faculty are asked to indicate on a web­based, curriculum­mapping survey which of the nine student learning objectives they cover in the specific courses they teach. This reinforces the overall RHIT learning objectives to the faculty and provides a curriculum map for the institute that identifies where in the curricula the specific learning objectives are being taught, reinforced, and where students are getting formal feedback on their progress toward achieving the objectives.
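
For illustration, a curriculum map of this kind can be derived by inverting the survey responses from course-to-objectives into objective-to-courses. The sketch below uses invented course numbers and objective names rather than the actual RHIT data:

    from collections import defaultdict

    # Hypothetical survey responses: each instructor lists which learning
    # objectives a given course addresses (names invented for this sketch).
    survey_responses = {
        "ME301": {"Teamwork", "Design", "Communication"},
        "EE250": {"Design", "Experimentation"},
        "HU330": {"Communication", "Ethics"},
    }

    def curriculum_map(responses):
        """Invert course -> objectives into objective -> courses, i.e. a map of
        where in the curricula each learning objective is being taught."""
        coverage = defaultdict(list)
        for course, objectives in responses.items():
            for objective in objectives:
                coverage[objective].append(course)
        return dict(coverage)

    print(curriculum_map(survey_responses))
    # e.g. {'Design': ['ME301', 'EE250'], 'Communication': ['ME301', 'HU330'], ...}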

Prior to the rating process, faculty are assigned to work in teams to assess specific learning objectives. Each team rates one of the learning objectives. The first step in the rating process is to establish inter-rater reliability. During the inter-rater reliability process, each rater on a team is assigned the same students' electronic files to assess independently. During the independent rating, each rater keeps a log of the rationale behind his/her assessment decisions for each of the performance criteria that define the learning objective. Logs are used to document the rating process and provide the basis for the development of meaningful rubrics for rating. These logs are electronic and are part of the e-Portfolio rating module. After each faculty member has made his/her ratings, the team meets to compare and discuss their ratings and the rationale behind their decisions. When the team is confident that its members are rating the material using the same criteria, the rubric is formalized for each of the performance criteria and used during the subsequent rating process. The rubrics can be changed by mutual agreement among the raters.
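
One simple way to quantify agreement during such a calibration exercise is the exact agreement rate across rater pairs. The sketch below is illustrative only; it is not necessarily the statistic used in the e-Portfolio rating module:

    from itertools import combinations

    def exact_agreement_rate(ratings):
        """Fraction of pairwise rater comparisons that assign the same score.

        ratings maps each rater to that rater's scores for a shared set of
        portfolios, listed in a common order.
        """
        raters = list(ratings.values())
        n_items = len(raters[0])
        agree = total = 0
        for r1, r2 in combinations(raters, 2):  # every pair of raters
            for i in range(n_items):            # every shared portfolio
                agree += r1[i] == r2[i]
                total += 1
        return agree / total

    # Hypothetical example: three raters score the same four portfolios on a 1-4 rubric.
    print(exact_agreement_rate({
        "rater_a": [3, 2, 4, 1],
        "rater_b": [3, 2, 3, 1],
        "rater_c": [3, 1, 4, 1],
    }))  # 0.666..., i.e. 8 of the 12 pairwise comparisons agree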

Expected measurable learning outcomes are explicit, reinforced, and provide a common language and expectations for teaching and learning for both students and faculty. On most college campuses, this is no small feat. However, the CI system requires more than just the collection of data. It also requires a process to review how the assessment results match the expected outcomes. The database approach allows for the efficient reporting of results using multiple criteria (e.g., major, class, sex, high school size). Closing the loop in the CI system is accomplished by reviewing the assessment data (which is electronically analyzed and reported) and making recommendations to the Institute. At the same time that the Institute is considering recommendations, individual departments will receive and review the results for the students in their curricular programs.
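
As a sketch of the multi-criteria reporting described above (with hypothetical field names; this is not the actual reporting module), aggregated ratings can be grouped by any stored attribute:

    from collections import defaultdict
    from statistics import mean

    def aggregate_ratings(records, group_by):
        """Mean portfolio rating grouped by one reporting criterion.

        Each record is a dict with a numeric "score" plus descriptive fields
        such as "major" or "class" (field names invented for this sketch).
        """
        groups = defaultdict(list)
        for record in records:
            groups[record[group_by]].append(record["score"])
        return {key: mean(scores) for key, scores in groups.items()}

    # Hypothetical example: mean scores by major for one learning objective.
    records = [
        {"score": 3, "major": "ME", "class": "sophomore"},
        {"score": 4, "major": "ME", "class": "junior"},
        {"score": 2, "major": "EE", "class": "sophomore"},
    ]
    print(aggregate_ratings(records, "major"))  # {'ME': 3.5, 'EE': 2}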


4.4 Assessment Lessons Learned

During the development and implementation of the CI system, several lessons were learned. The following represent lessons learned in the development and implementation of the general CI system when applied within an educational institution environment. We believe these lessons have counterparts in most other environments.

Institutional Activity ­ Involve Key Stakeholders: The development and implementation of a campus CI system is an institution­wide process. It is critical to involve key stakeholders throughout the process, but particularly in the development of objectives and in the feedback process. This increases buy­in and the likelihood that recommendations will receive broad acceptance. As in the case of the learning objectives, some processes require extensive involvement of faculty and students. Other institutional objectives may require heavy involvement from other institutional constituents. For example, in the case of institutional objectives that deal with resource allocations, those constituents that are responsible for establishing budget guidelines and priorities should be involved in the development of measurable objectives and establishing strategies for achieving those objectives. In all instances, key constituencies need to be identified and included in the CI process and effective and efficient methods of data collection and analysis must be implemented.

Allocate Resources: When information technologies are utilized in the CI processes, there is a cost. This may take the form of human and/or capital resources. It requires effort for in­house experts to develop software according to specifications, to perform testing, and to carry out ongoing improvement of the software. Servers and databases need to be purchased and maintained to support the software and growing needs for data storage and retrieval. However, the initial development process is a small fraction of the total cost of software, the larger costs being specification, testing, maintaining and evolving, documenting, etc. In addition to the development of the electronic infrastructure, there also needs to be training of students and faculty on the use of the system and coordination of rating sessions and feedback of evaluation results. The value of the electronic portfolio process needs to be clear and accepted so that students and faculty both recognize that the benefits of the process outweigh the costs. Support must begin with high levels of leadership, and a reward system that reinforces faculty participation in the process needs to be in place.

Ask Them, Ask Them, and Ask Them Again: The importance of the support of faculty cannot be overstated. It is important that faculty are involved early in the process and participate in identifying learning objectives and developing performance indicators. To make the portfolio process work, it is necessary that faculty champion the system with their peers and that they encourage students to participate in the process. It is unlikely that the process can be successful, long term, without active faculty support.

Tell Them, Tell Them, and Tell Them Again: Faculty have many demands on their time and attention. It is important to reinforce the CI processes at every opportunity. Getting them involved in some part of the process and providing some mechanism to reward their participation is the best way of keeping them engaged.

Dynamic Process: It is important to recognize that the CI process is a living process. That is, it is continually changing and must be nurtured. Institutional resources must be allocated at a level that ensures its maturation and optimizes its potential.

Ambiguous: In developing a CI process, it is important to recognize that using simple models of a complex system results in ambiguities. Many different models can be used, and there may not be one best model. Nevertheless, in the spirit of Frederick Taylor we are seeking "best practices" with the idea that one practice may produce an outcome that is significantly superior to another.

Iterative and Integrated: The CI process is made up of multiple steps and processes. Each step needs to fit together with the others to form a cohesive whole. In addition, the process is iterative in that findings from one step of the process serve to inform the other parts of the process. Multiple iterations may take place within the process cycle before an entire cycle has been completed. For example, during the portfolio rating process, faculty made several changes to the performance indicators to increase the clarity of the learning objective. This happened before the results of the rating process were analyzed.

Start Early: It is important that institutions and programs allow enough time to develop a process in a way that is responsive to the demands for information and accountability. Engaging faculty and other stakeholders in meaningful ways, integrating information technology in the CI process, and evaluating and improving processes as they are implemented all take time. For the purposes of accreditation, it is desirable to have completed the continuous improvement cycle at least once before an accreditation visit. This is estimated to take at least two years for an inclusive process with initial demonstrated results. The same is true of commercial continuous improvement frameworks such as ISO 9000, TQM, and CMM, all of which advise that significant improvement programs should be expected to take years.

Decouple From Faculty Evaluation: Faculty participation is critical to the success of an institutional CI process. If faculty believe that they, personally, are going to be evaluated by this process they will be resistant to its use. Most institutions have faculty evaluation processes in place that provide individual feedback in areas that are valued by the institution. If not, they should be developed, but the institutional CI process should not be the vehicle.

Review Existing Technology: Much is currently taking place in the area of integrating technology into the assessment process in higher education. In some cases the technology is being developed "in­house" and in others by commercial enterprises. Investigating what others are doing in this area provided valuable information and insights into where the gaps were between what we wanted to do and what was available from other sources. It also helped to inform us on areas where we could improve our own view of how technology could be used.

Allocate Resources: Development of technology-based assessment processes is very resource intensive. Even if there is local expertise to manage and develop the technology, time and funding must be allocated for the tools needed for the development effort. Depending on the tool being developed, this is a serious consideration when making the decision to integrate technology into the CI process.

Test the Technology: After the development of the alpha version of the student module of the e-Portfolio, a pilot project was conducted using upperclass students to test the module. This turned out to be a critical step prior to implementing the system with all the students. The students made many good suggestions about both the user interface and the implementation process. As a result of this experience, the faculty rating module was also tested and significant improvements and enhancements were added. Whether the technology is developed internally or purchased from a vendor, experience indicates the value of pilot testing before ramping up to full implementation.

Narrow the Scope: Decisions about the use of technology and how to integrate it are very similar to the assessment process itself. Once there are multiple users and/or beneficiaries of the technology, there are many suggestions about enhancements and additional features that could be added. It is important to clearly define the purpose of the technology and stay focused on the task at hand. In the design of the system, it is helpful to make it as versatile as possible for a large number of applications; however, this should not be done at the expense of producing a timely and efficient local process.

Compatibility Issues: Issues of the compatibility of various database platforms and computing systems need to be considered throughout the design process. These also evolve constantly, and updates are required to maintain compatibility. When multiple raters are involved in rating student portfolios, the issue of being able to access student material can become a problem. For example, in the e­Portfolio system discussed here, students are encouraged to submit documents in .pdf format for ease of multiple user access via the web. However, it is not a requirement and most students, at this point, do not convert their files. It is important that compatibility issues are addressed early on in the process.

Importance of Support Services: Even though the technology may be locally developed by a non-IT department, it is important to coordinate the use of the system with local information technology professionals and institutional IT resource managers. It is unlikely that responsibility for the assessment process will reside in the computing center. Therefore, appropriate access to servers and integrated networks to support the e-Portfolio process is critical. The smooth implementation of an integrated technology-enhanced assessment system will, in part, depend on the ability of those responsible for the assessment process to coordinate efforts with those responsible for computing resources.

Confusion of the Tool With the Process: It is important not to confuse the technology tools with the assessment process. No matter how well the technology is designed, it serves as a conduit for the assessment process; it is not the assessment process. The technology tool must support the assessment process in ways that provide efficient, accessible, and user­friendly interfaces. It does not substitute for well­developed, clearly stated, and measurable institutional objectives or evaluation processes that utilize the assessment results and are locally valid.

5 Conclusions and Future Plans

Sections 3 and 4 list individual lessons we have learned. More global conclusions and plans are as follows:

  1. As Drucker has also reported, the center of gravity of professional education is moving from collegiate years to continuing education. [Drucker 1999] Coupled with new enabling technologies, this will result in new models of educational delivery. Methods and tools such as those described here will be needed to assess the relative merits of competing approaches as they are piloted, deployed, and evolved.

  2. The scientific approach to improvement is difficult, but it is the best approach known: Proposing a model of a (hard technology or human institutional) system, gathering data on the behavior of that system, and testing the model with the data are in the realm of science. Synthesizing systems and adjusting their configuration to achieve ends are in the realm of engineering. Frederick Taylor's "Scientific Management" emphasized the collection of objective data as the basis of the shop productivity methods he engineered, and Taylor went to great effort to conduct experiments collecting information about materials, tools, people, and processes over many years. In real world institutional systems, the laboratory may well involve people's lives and shareholders' institutions: As it was for Taylor's contemporaries, the process may not be ballet. Soft information may also masquerade as quantitative scientific fact. These are not criticisms of the scientific approach, but risks we take in attempting to place our arguments about improvement on an objective basis. No substitute for the scientific method has been found for discovering more about these complex systems.
  3. Longer term observation: On the time scale of continuous improvement of high complexity systems, the models and observations reported here are still recent. Longer term observation will continue over years to determine whether improvement continues and to support further conclusions.

6 Appendix: About The Organizations

Two organizations collaborated in the work described here and in the authorship of this article.

Rose­Hulman Institute of Technology (RHIT) is a leading undergraduate collegiate institution, providing baccalaureate and masters level degree programs in engineering, science, and mathematics. RHIT was recently voted the number one school offering these programs at the B.S./M.S. level by representatives of its peer institutions nationally. RHIT has been a leader in the educational assessment movement, and has served as the host location and sponsor for the Best Assessment Processes Symposia held in 1997, 1998, and scheduled for April, 2000. Dr. Gloria Rogers, Vice President of Institutional Research and Assessment, is author of the assessment guidebook, "Stepping Ahead: An Assessment Plan Development Guide." The school is 125 years old, and offers degree programs in Applied Optics, Civil Engineering, Chemistry, Chemical Engineering, Computer Science, Economics, Electrical Engineering, Computer Engineering, Mathematics, Mechanical Engineering, and Physics. As a part of its interaction with business and industry, RHIT recently established the Center for an Innovation Economy (CIE), located on a 180 acre business campus near its academic campus in Terre Haute, Indiana. RHIT was an early pioneer in the development and application of electronic portfolios for use in educational program assessment, resulting in the e­Portfolio system.

International Centers for Telecommunication Technology, Inc. (ICTT) is a specialist in system engineering services for its complex systems clients in telecommunications, mobile equipment and power systems, and other markets. The company specializes in providing system engineering across multiple technologies (electronic, mechanical, software, chemical, and human enterprise organizations), for which the focal issue is the complexity of families of configurable systems. A commercial enterprise, ICTT is partly owned by Rose-Hulman Institute of Technology, with one of its offices located at Aleph Park, RHIT's business campus. Through its research, publishing, and licensing affiliate, System Sciences, LLC, the company supplies Systematica™, a system engineering methodology and supporting tool set for use by organizations in which high-complexity systems and organizations are a central issue. ICTT is also the commercial distributor for the e-Portfolio™ assessment tools that originated within Rose-Hulman Institute of Technology. William D. Schindel is President of ICTT and System Sciences, LLC.

References

[Alexander et al 1977] Alexander, C., Ishikawa, S., Silverstein, M., Jacobson, M., Fiksdahl-King, I., and Angel, S.: A Pattern Language, Oxford University Press, New York, 1977.

[Arter et al 1995] Arter, Judith A.: "Portfolios for Assessment and Instruction," ERIC Digest, No. ED388890, 1995.

[Bate et al 1995] Bate, Roger, et al: A Systems Engineering Capability Maturity Model, Version 1.1, Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, November, 1995.

[Bonner 1988] Bonner, John Tyler: The Evolution of Complexity by Means of Natural Selection, Princeton University Press, 1988.

[Booch, Rumbaugh, and Jacobson, 1999] Booch, G., Rumbaugh, J., and Jacobson, I.: The Unified Modeling Language User Guide, Reading, Mass.: Addison­Wesley Publishers, 1999.

[Brassard 1989] Brassard, Michael: "The Memory Jogger Plus+™," Goal/QPC, Methuen, Massachusetts, 1989.

[Chandler 1977] Chandler, Alfred D.: The Visible Hand: Managerial Revolution in American Business, Cambridge, MA: Harvard University Press, 1977.

[Churchland and Sejnowski, 1992] Churchland, Patricia, and Sejnowski, Terrence: The Computational Brain, Cambridge: MIT Press, 1992.

[Crick 1994] Crick, F.: The Astonishing Hypothesis, New York: Charles Scribner's Sons, 1994.

[Damasio 1999] Damasio, A.: The Feeling of What Happens: Body and Emotion in the Making of Consciousness, New York: Harcourt, Brace & Company, 1999.

[Dennett 1996] Dennett, Daniel C.: Kinds of Minds: Toward an Understanding of Consciousness, New York: Basic Books, 1996.

[Drucker 1954] Drucker, Peter F.: The Practice of Management, New York: Harper & Row, 1954.

[Drucker 1999] Drucker, Peter F.: Management Challenges for the 21st Century, New York: Harper Collins Publishers, 1999.

[Edelman 1987] Edelman, G.: Neural Darwinism: The Theory of Neuronal Group Selection, Basic Books, 1987.

[Edelman 1988] Edelman, G.: Topobiology: An Introduction to Molecular Embryology, Basic Books, 1988.

[Edelman 1989] Edelman, G.: The Remembered Present: A Biological Theory of Consciousness, New York: Basic Books, 1989.


[Gamma et al 1995] Gamma, E., Helm, R., Johnson, R. , and Vlissides, J.: Design Patterns: Elements of Reusable Object­Oriented Software, Addison­Wesley, New York, NY, 1995.

[Glickstein 1984] Glickstein, I.: "The Rational Cockpit and Advanced Automation", Technical Directions, IBM Federal Systems Division, Volume 10, No. 4, 1984.

[Hammer and Stanton 1999] Michael Hammer and Steven Stanton: "How Process Enterprises Really Work", Harvard Business Review, November­December, 1999, pp. 108­118.

[Intelligent Enterprise 1999] Intelligent Enterprise magazine, Miller Freeman, Inc., 411 Borel Avenue, Suite 100, San Mateo, CA 94402.

[ISO 1999] ISO/IEC IS 9595, CCITT Recommendation X.710, Information Technology - Open Systems Interconnection - Common Management Information Service Definition, 1991.

[Jacobson 1992] Jacobson, Ivar: Object­Oriented Software Engineering: A Use Case Driven Approach, Reading, Mass.: Addison­Wesley Publishers, 1992.

[Kanigel 1997] Kanigel, Robert: The One Best Way: Frederick Winslow Taylor and the Enigma of Efficiency, New York: Penguin Books, 1997.

[Knowledge Management 1999] Knowledge Management Magazine, 156 W. Fifty­Sixth Street, 3rd Floor, New York, NY, 10019, 212­333­7600.

[O'Dell and Grayson 1998] O'Dell, Carla, and Grayson, Jr., C. Jackson: If Only We Knew What We Know: The Transfer of Internal Knowledge and Best Practice, New York: The Free Press, 1998.

[Paulson et al 1991] Paulson, F. L., Paulson, P. R., and Meyer, C. A.: "What makes a portfolio a portfolio?", Educational Leadership, 48(5), 60-63, 1991.

[Prus and Johnson 1994] Prus, J. and Johnson, R.: "Assessment and Testing Myths and Realities." New Directions for Community Colleges, No. 88, Winter, 1994.

[Rugaber et al 1999] Rugaber, Spencer, et al: Georgia Tech Reverse Engineering Group Web Site: http://www.cc.gatech.edu/reverse/

[Schindel 1996] Schindel, William D.: "System Engineering: An Overview of Complexity's Impact", SAE International, SAE Technical Paper 962177, October, 1996.

[Schindel 1997] Schindel, William D.: "The Tower of Babel: Language and Meaning in System Engineering", SAE International, SAE Technical Paper 973217, November, 1997.

[Senge 1990] Senge, P.: The Fifth Discipline: The Art & Practice of The Learning Organization, New York: Doubleday, 1990.

[Strassmann 1990] Strassmann, Paul A.: The Business Value of Computers, New Canaan, CT: Information Economics Press, 1990.

[Strassmann 1994] Strassmann, Paul A.: The Politics of Information Management, New Canaan, CT: Information Economics Press, 1994.

[Stiles and Glickstein 1991] Stiles, P., and Glickstein, I.: "Route Planning", Proceedings of the IEEE/AIAA 10th Digital Avionics Systems Conference, October, 1991, pp. 420­425.

[Thompson 1961] Thompson, D'Arcy: On Growth and Form, Cambridge University Press, 1961.

Systematica, Gestalt Rules, Return on Variation, and RNA Transaction Model are trademarks of System Sciences, LLC. e­Portfolio is a trademark of International Centers for Telecommunication Technology, Inc. Return on Management is a trademark of Strassmann, Inc.
