THE USE OF ADA FOR THE IMPLEMENTATION
OF AUTOMATED INFORMATION SYSTEMS
WITHIN THE DEPARTMENT OF DEFENSE
James C. Emery
Naval Postgraduate School
Monterey, CA 93943
28 December 1993
Table of Contents
I. Management Summary
II. Software Development for AIS and Embedded Systems
    Choosing a Programming Language
    Automated Information Systems Versus Embedded Systems
        Automated Information Systems
        Embedded Systems
    Problems with the Conventional Software Development Process
    An Emerging Software Development Paradigm
    The Payoff from High Productivity
    Control in the New Development Paradigm
III. Ada
    The Ada Market
    The Ada Mandate
IV. Recommendations
V. Appendices
    1. Software Reuse and Higher-Level Languages
    2. A Brief Background on Ada
    3. Characteristics of Integrated CASE Products
VI. References
VII. Biographies of Authors
I. MANAGEMENT SUMMARY
The Department of Defense (DoD) is the world's largest consumer of information system resources. Given the leading position of the United States in most areas of information technology, DoD has an avowed policy of exploiting this advantage to the maximum extent in its weapon systems, command & control systems, and supporting administrative systems. The development of computer software stands as one of the principal bottlenecks to achieving more effective information systems, and so the process for developing application software is a vital issue for DoD.
In 1990, Congress mandated the use of the programming language Ada for all DoD software development -- including automated information systems (AIS) -- "where cost-effective." This report examines the issues concerning the use of Ada for developing AIS, and recommends changes where deemed appropriate.
Ada is widely used for recently-developed embedded weapon systems within DoD; it is also used for some non-DoD applications having similar characteristics (notably, avionic and air traffic control systems). Ada has proven to be well suited for large real-time applications with exceptional reliability requirements. The use of Ada for AIS applications, however, remains quite limited within DoD and almost negligible in the commercial world of management information systems.
We conclude that this pattern of usage is a result of rational decision-making on the part of project managers and software developers. Ada is the language of choice for embedded systems -- the domain for which it was developed -- but generally does not provide the most cost-effective approach for developing automated information systems.
We do not view this as a startling or radical conclusion. There is no reason to suppose that a programming language designed for the embedded domain will necessarily be the best choice across the full spectrum of applications within DoD. A principal motivation of the creators of Ada more than a decade ago was the elimination of DoD's rampant proliferation of languages and dialects -- estimated at one time to exceed 300 in number. But the fact that 300 languages is bad does not imply that a single standard language is good. High-level languages and development tools designed specifically for the AIS domain generally offer substantial advantages over a single general-purpose procedural language such as Ada.
Ada language features and the structured software development methodology that it supports and enforces are appropriate for applications having a requirement for high reliability and stability -- both particularly important for embedded systems. Applications of this sort are expensive to develop and maintain, however, due to the extraordinary measures required to achieve near perfection. In contrast, the best AIS development environments, tailored to meet the particular requirements of the AIS domain, can increase productivity by a factor of ten or more compared to the conventional development process for embedded systems. Although an improvement of this magnitude is difficult to achieve, it is a reasonable long-term aspiration if DoD commits itself to a reinvention of its software development process for AIS.
A tenfold productivity gain does not merely reduce the cost of software development and maintenance; even greater benefits are likely to come from increased responsiveness in adapting software to the changing needs of DoD information systems. Without a development methodology that permits relatively inexpensive and quick changes to applications, information systems will continue to be plagued by their inflexibility in responding to organizational learning, changing environmental conditions, and advances in technology.
Advanced AIS development environments take advantage of such tools as integrated computer aided software engineering (I-CASE) products, powerful graphically-oriented application generators, and commercial off-the-shelf (COTS) software. These tools enable developers to minimize their use of extremely labor-intensive programming in a third-generation language (3GL) such as Ada, COBOL, or C/C++. There is a thriving commercial market for non-Ada development tools, and vendors continue to invest millions of dollars to expand their offerings.
Ada-based development environments designed primarily for embedded software generally cannot compete with the more specialized environments focused on the development of AIS applications. Would-be Ada developers of such applications face a relative paucity of Ada products aimed at this market. The software industry has not invested heavily in such products because the market for them is so small, and the market is so small because potential customers see little incentive to abandon the powerful development environments already provided for the non-Ada world.
There is little reason to expect that this cycle can be broken. Past experience in the software industry strongly suggests that users will stick with their existing environment unless they expect to gain a substantial advantage from shifting to a new one. Massive expenditures would have to be made in Ada products and staff training to overcome the existing lead of the more popular non-Ada environments. It appears unlikely that commercial software firms will invest their own funds in such an endeavor; it appears equally unlikely that DoD will be willing or able to provide funding comparable to the level of investment currently being made by the private sector in non-Ada tools for AIS applications. Under these circumstances, the gap between the Ada and non-Ada worlds can only widen.
Gaining a major improvement in software productivity does not come simply from introducing a better language or set of tools (as important as they are); it also requires substantial changes in such things as the organization and composition of development teams and mechanisms for interacting with users over the entire development cycle. In some important respects, the typical Ada software engineering approach is incompatible with an emerging development paradigm in the AIS world that emphasizes adaptation in preference to stability and control. Furthermore, undue attention to the relatively low-level issue of selecting a programming language diverts attention from larger issues that have a far greater impact on the effectiveness of information systems.
Most large organizations in the private sector recognize the need to change from their traditional, problem-ridden development and maintenance process for software. Leading companies have already aggressively adopted a new development paradigm, and are now reaping the benefits. Almost all major companies have at least begun to explore new methods of software development. There are no easy paths to success, but it is clear to almost every informed observer that organizations must do something to extricate themselves from the chronic software problems that engulf them. A fundamental part of any such strategy is a shift away from 3GL programming.
It is critical to note that not a single large enterprise in the private sector looks to Ada as a broad source of relief for its software problems. Although some limited use of Ada exists for special-purpose applications such as process control, no major private-sector organization contemplates adopting Ada as the language of choice for its administrative systems, in the manner mandated for DoD.
This is a message hard to ignore. Unless one can argue that DoD administrative systems have unique characteristics, or that DoD management possesses unique insights about AIS development -- not easy arguments to defend -- there seems little justification for DoD to force the use of Ada for large-scale automated information systems in the face of such universal rejection of this approach by the private sector. If the Ada mandate is vigorously enforced for AIS, DoD must realistically expect to stand largely on its own in a critical area of information technology. The checkered history of developing special-purpose technology for DoD does not give one much confidence that this is the right way to go.
In view of these considerations, we make the following recommendations:
II. SOFTWARE DEVELOPMENT FOR AIS AND EMBEDDED SYSTEMS
The effective development of automated information systems (AIS) is a critical success factor for the Department of Defense. The aggregate cost to develop and maintain these systems exceeds $5 billion per year. By almost everyone's reckoning, a considerable fraction of this cost could be saved -- or spent more productively -- through the use of a more efficient and effective software development process.
The potential benefits from improving the process dwarf the annual expenditures figure. The savings will not come primarily from reductions in software development costs -- indeed, these costs are likely to increase, given the growing demands placed on AIS -- but rather from improved information systems that yield benefits in such form as reduced inventory costs, fewer operating personnel, and improved decision making. DoD's dependency on AIS for all manner of administrative and logistics functions is already very high, and is certain to increase. It is difficult to see how the Department can meet its commitment to reduce its budget by billions of dollars over the next several years without making substantial improvements in its large administrative systems. In an era of tight resource constraints, improvements in the administrative tail translate directly into greater military tooth.
The bottleneck to dramatic improvements in administrative processes is very often the development and maintenance of computer software. Over the past three decades, computer hardware has shown improvements well in excess of a factor of a thousand. During this period, the rate of improvement in software development and maintenance has fallen far behind the rate of advance in hardware, continually widening the gap between the power of the hardware and our ability to use it effectively.
Software has become the most problem-ridden aspect of applying information technology. No vision or strategic plan for exploiting information technology will have its full impact until DoD develops improved processes for developing and maintaining software. That is true of the entire spectrum of application domains, from AIS to embedded weapon systems.
This report focuses on policy issues concerning automated information systems, with special attention given to the role of Ada in developing these systems. We have concluded that Ada generally does not provide the most cost-effective environment for AIS application. This conclusion is based on the following line of reasoning:
Choosing a Programming Language
The design and use of the programming language Ada has been a controversial issue ever since its specification in the early 1980s. Much of the heat generated by both the pro- and anti-Ada camps stems from a misplaced emphasis on a fairly detailed implementation issue. The choice of programming language is important, but it is no longer the critical issue it seemed when the Ada efforts were begun.
Far more important than choice of language is having a realistic strategic plan for exploiting information technology (IT). This includes decisions about the aggregate level of IT resources, organization of IT activities, target areas for IT applications, the extent of sharing and integration across services, the degree of centralization of information system development, mechanisms for allocating IT resources, and general enterprise architectural issues.
Second in importance to an enterprise-wide plan is the IT infrastructure. It is becoming increasingly recognized that an effective infrastructure is a key ingredient in exploiting information technology. The infrastructure consists of generic resources that facilitate the development of specific applications. These include communications networks that link computers and databases, mechanisms for sharing common data, hardware platforms, educational and training programs, user support services, and the software development environment. With an effective infrastructure in place, the development of individual applications should become a relatively straightforward process.
The choice of a programming language falls at a level distinctly below the strategic plan and IT infrastructure. A language is part of the definition of the software development environment -- but only a part. Arguably the most important attribute of a programming language is its compatibility with an advanced support environment. The environment includes such things as computer-assisted software engineering (CASE) tools; application development tools; visual programming aids; graphical user interface (GUI) tools; debugging tools; utility programs; various "middleware" products for interfacing new applications with the huge body of existing legacy systems; and various products for managing software testing, program releases, software distribution, and configuration control. Application developers denied the aid of a strong development environment find themselves at a serious disadvantage compared to those having a well supported environment.
An important criterion in judging the suitability of a language is the extent to which it includes features that make it easy for an application developer to define common tasks. Consider, for example, the calculation of the net present value (NPV) of a stream of future cash flows, which arises relatively frequently in financial and cost-benefit analyses. Languages aimed at such applications -- e.g., spreadsheets and specialized decision support languages -- almost always include an NPV function as a built-in "primitive" of the language. A single line of code, such as Value = NPV(Discount_Rate, Cash_Flows), can then be used to perform this calculation. In the absence of a built-in function -- or, alternatively, the availability of a "subroutine" or reusable component that performs the NPV calculation -- the application developer has to write a small program using the language's more elementary features, which always involves more work on the part of the programmer. The provision of powerful built-in functions of this sort is one of the keys to improving software quality and productivity.
Much of the gain in productivity from a built-in function such as NPV comes from its nonprocedural form that allows a programmer merely to define the desired end result (an NPV value) rather than specify a step-by-step procedure for computing the result -- i.e., to state what is desired instead of how to do it. A nonprocedural specification generally requires considerably fewer language statements than an equivalent procedural specification, and is therefore more productive from a programming standpoint. This requires, of course, that the translator for a nonprocedural language contain the appropriate "knowledge" to execute the correct computer instructions to achieve the stated end result (such as translating "NPV" into the series of steps required to compute net present value).
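To make the contrast concrete, the sketch below (in Python, used here purely for illustration; the `npv` function name and the discounting convention are assumptions, not taken from the report) shows the step-by-step procedure a 3GL programmer must write, alongside the one-line nonprocedural call that a built-in primitive makes possible:

```python
# Procedural version: the programmer must spell out, step by step,
# HOW net present value is computed from a discount rate and a
# stream of future cash flows (assumed here to arrive at the end of
# periods 1, 2, 3, ...).
def npv(discount_rate, cash_flows):
    total = 0.0
    for period, cash_flow in enumerate(cash_flows, start=1):
        total += cash_flow / (1.0 + discount_rate) ** period
    return total

# Nonprocedural version: with NPV available as a primitive, the
# programmer merely states WHAT is desired, in a single line.
value = npv(0.10, [1000, 1000, 1000])
```

Either way the result is the same; the difference lies entirely in how much of the "how" the programmer must supply rather than the language translator.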
The choice of functions to include in a high-level language depends on the application domain for which the language is targeted. Languages designed for the AIS domain generally include features that facilitate the programming of such common tasks as editing input transactions, managing the database, formatting display screens, creating reports, performing relatively simple arithmetic calculations, and providing backup and recovery in case of a system failure.
A 3GL such as Ada has the virtue of providing a general capability for defining any specifiable task, but it incorporates almost no specific knowledge about AIS applications. For example, Ada does not know about calculating NPV or creating reports and screen formats; to the extent that an application calls for these functions, the knowledge has to be provided by the programmer (or a reusable component). The burden that this added task imposes on the programmer generally carries a stiff price in reduced productivity and greater opportunity for error.
Given the practical limitations on the complexity of a language -- for both developers of the language and those who must learn to use it -- no language can contain application-specific knowledge across a wide spectrum of application domains. An inevitable tradeoff thus exists between the generality of a language and the application-specific knowledge that it can incorporate. Two important consequences stem from this tradeoff:
As the field of software engineering matures, a larger share of the burden for software development shifts away from traditional third-generation programming languages to higher-level languages supported by an extensive set of complementary development tools. In a growing number of applications, use of a 3GL is avoided altogether. It is important to keep this trend in perspective when assessing the role of Ada in the development of automated information systems. Undue attention to the choice of programming language runs the risk of emphasizing the wrong problem.
Automated Information Systems Versus Embedded Systems
The suitability of using a common language for both embedded systems and AIS depends on the extent to which the two domains have common functional characteristics. Although considerable overlap may exist, it is nevertheless true that the typical automated information system has a number of distinguishing characteristics that differentiate it from the typical embedded system. These differences should be taken into account in establishing the most cost-effective development methodology for each domain.
Automated Information Systems
Hardware is rarely much of an issue in the design and implementation of AIS applications. Such applications almost always operate on general-purpose hardware, and any capacity limitations can generally be relaxed by adding capacity (subject only to budgetary constraints).
The human interface for an effective AIS must be adaptive and cater to the idiosyncratic needs of its users. These systems reflect the changing business processes underlying the organization's missions. AIS applications must accommodate a wide range of users, from data entry operators to senior decision makers. Each type of user has different -- often unique -- needs. A well-designed AIS generally requires graphical user interfaces (GUIs) to make the application more user-friendly and productive. As needs change in response to organizational learning and new external factors, the human interface needs to adapt correspondingly.
An AIS must often serve a geographically dispersed set of users. Contemporary design increasingly relies on a client-server architecture consisting of scattered processors linked through a communications network. Despite their physical distribution, automated information systems must satisfy growing demands for seamless cross-functional integration.
One of the principal hallmarks of an AIS is its heavy dependence on the database that supports it. A database for a large AIS typically consists of billions or even trillions of bytes of frequently-changing data distributed over a multi-level (and increasingly geographically distributed) storage hierarchy. The stored data deal with such matters as employees, inventory items, requisitions, vendors, and physical facilities. The AIS needs to manage rich associations among data items -- among inventory items, requisitions, and vendors, for example. An effective contemporary design generally calls for the sharing of data across applications to reduce redundancy and increase consistency. So important is the database that developers increasingly treat it as the central focus of the design (instead of the more traditional focus on individual applications). This approach relies heavily on the use of a generalized relational database management system.
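The database-centric design style described above can be sketched as follows. The tables and names are invented for illustration, and an in-memory SQLite database stands in for the generalized relational database management system the report mentions:

```python
# A minimal sketch (not drawn from the report) of database-centric AIS
# design: inventory items, vendors, and requisitions live in a shared
# relational store, with the rich associations among them expressed as
# keys rather than duplicated inside each application.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE vendor      (vendor_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE item        (item_id   INTEGER PRIMARY KEY, description TEXT);
    CREATE TABLE requisition (req_id    INTEGER PRIMARY KEY,
                              item_id   INTEGER REFERENCES item,
                              vendor_id INTEGER REFERENCES vendor,
                              quantity  INTEGER);
""")
db.execute("INSERT INTO vendor VALUES (1, 'Acme Supply')")
db.execute("INSERT INTO item VALUES (10, 'hydraulic pump')")
db.execute("INSERT INTO requisition VALUES (100, 10, 1, 25)")

# Any application sharing the database recovers the associations by
# joining on keys, instead of keeping its own redundant copy of the data.
row = db.execute("""
    SELECT v.name, i.description, r.quantity
    FROM requisition r
    JOIN vendor v ON v.vendor_id = r.vendor_id
    JOIN item   i ON i.item_id   = r.item_id
""").fetchone()
```

Because every application reads the same stored facts, redundancy goes down and consistency goes up -- the sharing the report identifies as central to contemporary AIS design.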
AIS vary greatly in size. At the high end of the spectrum, an AIS may consist of several million lines of code. With the inexorable pressures to add functionality and integration to meet growing mission needs, there has been a continual upward creep in application size -- constrained mainly by the technical problems of developing applications of increased scale.
At the low end of the size spectrum, small applications have proliferated to meet the special needs of departments, workgroups, and individuals. User-friendly development tools, such as spreadsheets and database languages, have been applied extensively by end users to develop systems on their own. When appropriately controlled, end-user computing can provide an exceedingly cost-effective solution to local, modest-sized problems.
Automated information systems vary widely in their time criticality. Some AIS provide batch processing outputs for which response time is not a critical issue. Other applications may support a variety of such time-dependent functions as interactive queries, on-line updating of massive databases, and interactive decision support systems. Even interactive systems, however, generally require response times on the order of a few seconds, with excessive delays generally carrying only modest penalties.
Availability ("uptime") and accuracy requirements for an AIS tend to be less stringent than for embedded systems. Availability is hardly a matter of indifference to its users, but the reliability of modern hardware and systems software is high enough that an availability in excess of 99 percent can be expected with good management without having to take extraordinary measures (e.g., special fault-tolerant hardware or redundant databases). An AIS must meet high standards for accuracy in order to avoid such errors as a lost replenishment order or an incorrect payment to a vendor. This places a high premium on well-designed data entry procedures to deal with a high volume of manual (i.e., error-prone) data inputs, as well as careful software testing to determine that the system generates the correct outputs for all possible data inputs. Setting availability and accuracy specifications for an AIS generally requires an economic tradeoff between the cost of errors or downtime versus the cost of avoiding them; the design rarely must consider life- or mission-threatening risks.
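The economic tradeoff described here can be sketched numerically. All cost figures below are invented assumptions, chosen only to show the form of the calculation, not to describe any actual DoD system:

```python
# Illustrative only: compare the expected annual cost of downtime at a
# given availability level against the (assumed) annual cost of the
# measures needed to reach that level.
HOURS_PER_YEAR = 8760
COST_PER_DOWN_HOUR = 500.0          # assumed cost of one hour of downtime

def downtime_cost(availability):
    """Expected annual downtime cost at the given availability level."""
    return (1.0 - availability) * HOURS_PER_YEAR * COST_PER_DOWN_HOUR

# Two design options: ordinary good management vs. fault-tolerant measures.
# The annual cost of each set of measures is an assumed figure.
options = {
    0.99:  20_000.0,     # normal good management
    0.999: 250_000.0,    # fault-tolerant hardware, redundant databases
}
total_cost = {a: downtime_cost(a) + measures for a, measures in options.items()}
best = min(total_cost, key=total_cost.get)
```

Under these assumed figures, the extraordinary measures cost far more than the downtime they would avoid, which is why an AIS -- facing economic rather than life- or mission-threatening risks -- rarely justifies them.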
Embedded Systems
Embedded programs differ significantly from AIS in many of their important characteristics. Embedded systems often operate on special-purpose hardware that offers relatively little capacity flexibility. Timing and capacity considerations typically force careful attention to hardware performance.
An embedded system tends to place less emphasis on its database than does an AIS. Data structures are generally specialized to the particular application rather than employing a generalized database management system.
As in the case of AIS, embedded systems are subject to continual pressures for increased functionality and therefore larger program size. The F-14, for example, requires about 32 thousand lines of code, while the advanced tactical fighter of today is estimated to require seven million lines of code.
An embedded program cannot tolerate the adaptability and idiosyncrasies required of an effective AIS. Embedded software interacts strongly with the hardware components of the system of which it is a part. In an avionics system, for example, programs may share memory and other computer resources, and deal with tight coupling with such components as the propulsion system, smart weapons, and electronic countermeasures. A change that significantly affects an interface with another component may entail a tremendous cost or performance penalty. Because of this, compulsive attention must be paid to getting a sound set of up-front requirements and then keeping the specifications as stable as possible.
Embedded systems typically function in a real-time world of high-speed data streams and dynamic physical control processes. Delayed control actions, or loss of data due to overloads or timing problems, may threaten life or mission.
A system availability of 99 percent might be tolerable for a batch processing system, but would generally be disastrous for a mission-critical embedded system. As a result, highly disciplined techniques for software engineering and testing must be employed to ensure an extremely high rate of availability.
The table on the next page summarizes the differences between automated information systems and embedded systems.
Comparison Between AIS and Embedded Systems

Hardware
    AIS: General-purpose, generally imposing little or no capacity
    constraints; hardware efficiency seldom a major design consideration.
    Embedded: Special-purpose, often with stringent capacity constraints;
    hardware performance often a major design issue.

Human Interface
    AIS: Variety of users having widely varying (and changing)
    requirements; systems efficiency depends heavily on the quality of
    the human interface.
    Embedded: Limited variation in the interfaces for different users;
    systems effectiveness depends heavily on the cognitive efficiency of
    the interface.

Interface with Other Components
    AIS: Interface typically handled through the sharing of data stored
    in a common database.
    Embedded: Tight coupling with other components; management of the
    interfaces a major design issue.

Database
    AIS: Very large and dynamic, maintained in a multilevel storage
    hierarchy; rich links among data elements; heavy design emphasis on
    the database; heavy reliance on a generalized database management
    system.
    Embedded: Generally of relatively modest size, with a simple
    specialized structure; generally not the major focus of the design.

Architecture
    AIS: Increasingly moving toward client-server architectures with
    geographically distributed components linked through a network.
    Embedded: Generally self-contained within a weapon system, with
    relatively limited communication links to external components.

Program Size
    AIS: At the high end, up to several million lines of code; at the low
    end, up to a few thousand lines of code, often developed by end users.
    Embedded: Generally in excess of 100K lines of code -- and growing.

Time Criticality
    AIS: Generally modest, involving economic tradeoffs between cost and
    response time.
    Embedded: Generally time critical, with life- and mission-threatening
    consequences.

Reliability and Accuracy
    AIS: Requirements generally achievable with normal good design
    practice.
    Embedded: Often involves ultra-high reliability/accuracy requirements
    that call for extraordinary design measures.
Differences between automated information systems and embedded systems should not be viewed as merely a matter of degree. Large differences in degree matter greatly in the way systems should be designed and implemented. A methodology yielding acceptable downtime for an AIS may be totally inappropriate for a real-time control system. The reverse is also true: a methodology appropriate for embedded systems might carry an unjustified price tag and rigidities for an AIS. This is one of the important reasons why the choice of programming language and development tools for AIS should give special attention to the domain characteristics and organizational culture within which these applications are developed.
Problems with the Conventional Software Development Process
In assessing Ada's use for automated information systems, it is necessary to consider the development process under which Ada applications are typically developed. The characteristics of Ada -- not to mention DoD acquisition regulations and documentation standards -- tend to favor rather close adherence to the conventional life cycle development process. This process has not changed fundamentally in over two decades, notwithstanding important improvements in the theory and practice of software engineering. The classical approach can generally be characterized as follows:
There is a vicious circle at work here. Low productivity results in a very large team size, extended project duration, and lack of continuity among development staff members. These necessitate large overhead expenditures for control and documentation, which further reduce productivity. An important aspect of project control is the inhibition of all but the most critical changes in specifications, even at the risk of delivering a product that does not meet real needs.
The characteristics of the conventional development process contribute to a history of software problems in many organizations: implementation projects vastly exceed budgets and schedules, maintenance of old "legacy" systems soaks up most available resources, inflexible applications fail to adapt to changing needs, and development projects get abandoned after millions of dollars have been squandered on them. Large automated information systems within DoD have had their share of these problems.
Notwithstanding its dubious track record, the conventional development process still largely governs much of current practice within DoD. Most of the Department's efforts to improve software development focus on fine tuning and imposing greater discipline over the process, rather than fundamentally altering it. Improvement efforts are directed toward establishing well-defined standard procedures and metrics that allow a software organization to reduce variability and increase predictability of its development process. A "mature" organization, according to the model developed by the Software Engineering Institute (SEI) at Carnegie-Mellon University, is one that can reliably deliver a product approximately on time and on budget, and in close agreement with specifications. Few organizations have been judged to have reached a high level of maturity according to the SEI criteria.
A mature software engineering process may suffer from the vices of its virtues. The delivered product may match the original specifications but may not satisfy real user needs. Strict adherence to the process may allow developers to meet project plans, but not necessarily at the lowest cost or with the shortest development cycle. Process discipline is achieved only with great difficulty and at considerable expense in establishing and operating the necessary control mechanisms. Even when applied well, the conventional process has some serious limitations; when applied badly (which is not uncommon), it can produce expensive failures.
An Emerging Software Development Paradigm
The widespread difficulties experienced in software development suggest that the problems stem from intrinsic flaws in the conventional implementation process that are not likely to be eliminated by mere fine tuning. There are encouraging signs that an improved development paradigm is emerging that differs from the old one in a number of significant ways.
The new process can improve productivity by a factor of ten or more. To be sure, productivity gains of this magnitude are difficult to achieve and require getting a lot of things right. Nevertheless, forward-thinking organizations are increasingly adopting elements of the new paradigm and thereby gaining substantial payoffs.
It is difficult to characterize the new paradigm, because considerable variation exists in the way it is applied by different organizations. Nevertheless, these variations often share some important common themes:
The table below summarizes the differences between the new and old development paradigms.
Although some of the emerging changes in development practices apply to the embedded and C3I system domains, they are particularly suited for AIS applications, for which adaptability is a crucial need. Some of the features of the new paradigm are perfectly compatible with an Ada development environment (the use of reusable components and object-oriented technology, for example), while in other cases the Ada environment suffers from some distinct disadvantages (lack of a rich and cost-effective set of development tools, for example, and an inability to make quick and inexpensive changes to a large prototype program).
Comparison Between the Conventional and New Development Paradigms
Conventional methodology: The implementation is broken down into independent stages, with relatively sharp boundaries between them.
New paradigm: The various stages of the implementation blend into each other in a relatively seamless fashion.

Conventional methodology: A third-generation procedural language is used to define the application in line-by-line fashion.
New paradigm: Various methods -- such as I-CASE tools or COTS -- are used to eliminate or reduce line-by-line procedural coding.

Conventional methodology: Due to the low productivity of 3GLs, a large team is required to develop a large AIS application.
New paradigm: High productivity permits substantial applications to be developed by a small team (thus adding further to productivity); the relatively rapid delivery of an application often allows very large applications to be implemented in separate chunks.

Conventional methodology: Each separate stage of the development is handled by a separate team having specialized skills.
New paradigm: Team continuity is provided throughout the implementation process, reducing the problem of inter-stage communication and improving incentives for success.

Conventional methodology: Requirements are defined up front, and thereafter held as stable as possible to reduce disruption.
New paradigm: Continual adaptation takes place during the development process to take advantage of organizational learning and change.

Conventional methodology: Large development projects have a long delivery cycle, during which external requirements may change significantly; users tend to "gold plate" their requirements to avoid overlooking possible long-term needs.
New paradigm: A relatively short development cycle better assures that external requirements will not change significantly before an application is delivered; any missing functions can be added in the next cycle.

Conventional methodology: Any changes in the system -- even minor cosmetic changes in the interface -- must be performed by the technical staff through a formal task definition.
New paradigm: End-user tools permit non-specialist users to modify the interface of an application (without tampering with its inner core).

Conventional methodology: Maintenance consumes most of the available technical resources, inhibiting new development projects.
New paradigm: A productive, adaptive development methodology is continued throughout the maintenance phase.
The Payoff from High Productivity
The emerging software development paradigm provides a multi-faceted approach to improving software development and maintenance. The approach recognizes that productivity for a given project depends on a number of factors:
Improving any one of these factors, without changing the others, will almost certainly result in disappointment. In particular, a focus solely on development tools or programming languages will have little effect, and could conceivably hurt productivity if the quest for a magic silver bullet diverts attention from all of the other productivity factors. Productivity improvements from the use of integrated CASE tools, for example, have often been disappointing because developers were unable or unwilling to accompany the introduction of the tool with all of the other changes needed to sustain a high-productivity environment.
From the standpoint of productivity, the ideal project deals with a high-profile, well-understood, stand-alone application designed from scratch by a small team of highly qualified personnel who receive first-rate support from management and users at all levels in the organization. Under such ideal conditions, productivity can increase by much more than tenfold. Within a large organization, however, most applications must be developed under considerably less than ideal conditions. Nevertheless, the most productive large firms are able consistently to sustain productivity rates nearly four times the average figure for commercial information systems and ten times the average figure for DoD.
There is no lack of evidence to support the proposition that major improvements in software productivity can be achieved under the right conditions; the important issue from DoD's perspective, however, is whether it is possible to make significant improvements in productivity under the special conditions that prevail within DoD. Even though the ideal development conditions cannot be realized for most projects, DoD still has considerable discretion in how it manages software development. Substantial room for improvement exists through improved training of IT managers and staff members, proper organization and management of development teams, provision of a high-productivity development environment, reduced unproductive documentation and red tape, and the selection of high-payoff applications that offer good prospects for success. DoD would forfeit a great opportunity if it failed to take advantage of the flexibility that it has to improve development productivity.
A discussion of software productivity within DoD would be incomplete without mention of documentation requirements. DoD pays an enormous price for documentation -- up to three times the cost of coding. Many experienced project managers regard the DoD-STD-2167A and DoD-STD-7935A documentation standards as excessive, often resulting in the ritualistic creation of a flood of documentation costing far more than it is worth. In the current DoD efforts to revise documentation standards, it will be critical to question each requirement's contribution to improved software quality, project control, contract management, and software maintenance. In the spirit of functional process improvement, a documentation requirement should be eliminated if it costs more than its likely value in terms of more effective delivered software.
These policy issues connected with software development deserve high-level attention from DoD management. Achieving high productivity is not just a matter of lowering the cost of software development; more importantly, it would permit DoD to make a major qualitative shift in the way administrative systems are planned and deployed.
The annual cost of maintaining a large legacy system generally runs about 10 to 20 percent of its initial development cost. With a tenfold increase in development productivity, developing an entirely new system therefore costs no more than keeping an old one for an additional year: a system that originally cost, say, $10 million to build would cost about $1 million to redevelop -- roughly one year's maintenance bill. As a practical matter it is not possible to scrap all legacy systems, but there will at least be less incentive to keep them well past their useful lives. This holds even if the productivity gain falls well below the tenfold target.
Business reengineering and functional process improvement (FPI) have gotten a great deal of attention recently within both the public and private sectors. The hallmark of FPI is a complete re-thinking of how the organization wants to conduct its business. Existing legacy systems impose serious constraints on making needed changes. A productive software development environment makes it feasible to develop new systems to support changed business processes. DoD has committed to making substantial reductions in its operating costs, which will be very difficult to achieve without an effective means to develop the AIS applications that support FPI.
Perhaps most important of all, increased software productivity would permit DoD to adapt to changing needs and take advantage of organizational learning. DoD faces great uncertainty over the rest of the decade and into the next century. As automated information systems become more pervasive throughout all activities within the Department, it will become increasingly essential to avoid the rigidities of existing legacy systems. The reduced application development cycles made possible by a high-productivity environment make it far more feasible to deliver to users the functional capabilities they need as their world changes.
Even modest improvements in DoD's overall software development environment will not come quickly or easily. Organizations within DoD will differ widely in their willingness and ability to adopt elements of the new development paradigm. It would clearly be a mistake to mandate the universal adoption of a substantially changed and immature paradigm; but it would be equally a mistake to set up constraints that restrict such adoption for those organizations eager and able to move to a more productive environment.
Control in the New Development Paradigm
The emerging software development paradigm tends to cause considerable concern to those who value standardization, control, and the avoidance of ambiguity. There is no doubt that the more adaptive approach inherent in the emerging paradigm introduces new or different problems of control. The absence of firm up-front specifications for an evolutionary process requires developers to set out on a path without knowing exactly where they will wind up. Some developers and managers find such explicit ambiguity difficult to accept.
Evolutionary development also raises problems under DoD acquisition practices. The conventional life cycle process with supposedly well-defined requirements seldom actually ends up meeting the initial specifications, but the illusion of aiming at a well-defined target is relatively acceptable under conventional contracting practices. Although existing acquisition laws and regulations do not appear to preclude evolutionary software development, common practices would certainly have to be modified.
The continuing chaotic introduction of new development languages and tools adds to concerns about lack of discipline and control. This chaos is likely to continue over the foreseeable future. There are dozens of companies vying to provide software development products. It is almost impossible to predict with any precision who the winners and losers will be.
It should be noted that the Ada environment is not immune from this chaos. Although the language itself has been kept rigidly stable over the past ten years (pending the release of Ada 9X), Ada development aids have changed continuously over this period. An increasing fraction of a developer's time is likely to be spent using development tools that are not part of Ada's formal language specification, and these tools stand little chance of being standardized or stabilized over the next several years. Paradoxically, then, the sooner developers rely more heavily on these tools, the less standardization and stability they can enjoy. Thus, two of the principal goals that motivated the creation of Ada -- the portability of people and a reduction of learning time -- are threatened by technological change.
Leading-edge companies in the private sector are not unduly inhibited by the ambiguity and uncertainty inherent in today's hectic development environment. They are not willing to wait until the clear winners emerge, because the penalty of waiting is likely to exceed by far the benefit of eliminating the risk of a bad choice.
Well-managed organizations will attempt to make informed choices among competing approaches to software development. By so doing, they can acquire experience with the new paradigm, gradually moving up the learning curve to gain substantial benefits from improved quality and productivity. If a selected product becomes obsolete or superseded by a better one -- which is always a possibility with today's rate of change in technology -- then the cost of conversion to an improved approach is likely to be quite tolerable, and less than the penalty incurred if the organization had stuck with the old development methodology. In the meantime, the organization will have acquired skills and benefits that will persist. It is a fallacy to consider the status quo as risk-free; indeed, it may be the riskiest alternative of all.
A shift to a more flexible development paradigm in no way eliminates the need for good management. Development projects should still be well planned and controlled. Mechanisms should be built to make continual process improvements through the collection and analysis of software metrics dealing with such matters as defect removal, costs, the extent of component reuse, frequency and severity of failures, response time, and user feedback. Standards must be defined and enforced for the documentation necessary to support the development process, continuing maintenance, and user operations. The number of supported products should be severely limited by subjecting each candidate to a careful assessment of the value it adds to the organization's portfolio of tools and the organization's ability to support it. Project risks should be contained through such means as the selection of projects that offer the best conditions for success, use of low-budget pilot projects to gain experience with new technology, frequent design and program reviews, "time boxes" to limit the amount of time allowed to complete an iterative design modification, careful control of changes in a prototype specification to guard against an upward creep in requirements of doubtful value, phased conversions, and periodic checkpoints to assess whether a project should be continued.
The Ada Market
Many software engineers believe that Ada is the best programming language for its target domain. Ada provides features, and enforces or encourages disciplined design practices, well suited to the development of embedded systems and systems having similar characteristics. The evidence for this lies in the language's widespread acceptance in such applications, even outside the DoD mandate. Ada has also been accepted for ultra-reliable non-defense applications, as demonstrated by its almost universal use for advanced air traffic control systems currently under development.
Ada does not enjoy similar success in the AIS domain. This is true even within DoD, where many AIS developers manage to evade the Ada mandate by one means or another. In the non-mandated arena, the use of Ada for AIS development is quite limited; such use as does occur is chiefly confined to large control systems that share many of the reliability and time-critical characteristics of embedded weapon systems. In conferences, exhibitions, and publications aimed at the conventional commercial market, one searches in vain for almost any shred of interest in Ada.
The skimpy use of Ada in the commercial world sends a signal that DoD should not ignore. Markets, when free to operate, generally do a good job of allocating resources and identifying attractive alternatives. A non-market command approach to such judgments has a decidedly blemished record in both government and private firms. The market in this case is sending the unequivocal message that Ada generally does not provide the most cost-effective approach to AIS development.
The shift away from procedural language coding toward higher-level development approaches puts Ada at a distinct disadvantage. Commercially available products and services supporting Ada are relatively few and expensive, and oriented toward embedded systems. This lack of interest causes Ada to be rejected by the vast majority of AIS developers. Compared to the more popular COBOL environment, Ada and its related products have a small share of the total language market.
The small Ada market for AIS applications inhibits software vendors from investing their own funds in Ada products. The high cost of product development spread over a relatively few users results in high unit costs. If an Ada product does not already exist in the market, a customer demanding it must often foot most of the development bill.
The sparse use of Ada for AIS applications has a number of other undesirable consequences. By almost any indicator of popularity, Ada falls behind alternative languages. For example, C and C++ lead Ada by a large factor in such indices as the number of computer science graduates that know the language, employment openings, and books on the subject.
In contrast to the Ada market, the more popular markets offer a plethora of development tools, generally at a small fraction of the cost of the corresponding Ada products (if they exist at all). This is especially true of the IBM-compatible PC market, where increasingly heavy-duty development tools, available for a few hundred dollars, compete with Ada development tools priced at ten thousand dollars and more. Many of these same advantages extend to the powerful UNIX workstation market (and its offspring, the rapidly growing client-server market), and even to mainframes. Although important progress has been made in increasing the variety and reducing the cost of Ada products, the fact remains that Ada developers are placed at a significant disadvantage relative to those who employ the more popular development environments. With the growing practice of constructing applications from a variety of separate commercial products and linking them through de facto industry interface standards, Ada is put at a further disadvantage.
Nothing on the horizon suggests that this situation will change; indeed, there are reasons to fear that Ada's disadvantage is likely to widen. Users are likely to stick with their existing environment unless they expect a substantial advantage from shifting to a new one. The huge expenditures that would have to be made to bring Ada to parity with competing environments do not appear to be forthcoming from either industry or DoD sources.
There is, of course, a chicken-or-egg phenomenon at work here. Ada's lack of popularity in the commercial world results in a low level of investment in the Ada environment, which in turn leads to a further reduction in Ada's attractiveness. Whether this phenomenon is rational or desirable is moot; the fact that a given technology is largely rejected by the market produces real effects that potential adopters have to take into account. The world is replete with instances of supposedly superior technologies losing out to their competitors.
The Ada Mandate
The existing Congressional mandate requires the use of Ada for all DoD software development, "where cost-effective." Exemption from this general mandate requires special approval, with the burden of proof placed on anyone seeking to deviate from the use of Ada.
In the AIS domain, the most striking result of the mandate has been its lack of effect. Despite the uncompromising formal support for Ada at the highest DoD levels, lower-level project managers have often exerted their ingenuity to avoid the use of Ada. The behavior of project managers evidently reflects their judgment that Ada is not the most cost-effective language for most AIS applications.
One might argue that project managers, if unconstrained, might choose a language inferior to Ada, or make their decisions from a narrow, short-term perspective rather than from the broader perspective seen by higher levels of management. Even if one concedes that Ada may not be the best choice for a given AIS application, one could still argue that there is a long-term benefit in requiring all AIS developers to use Ada because it builds the Ada market and thus lowers entry barriers across the full spectrum of application domains.
This centralized viewpoint conflicts with contemporary organizational trends. Most successful private firms -- as well as the current administration -- are striving to move decisions nearer to the point where detailed knowledge resides; this is, in fact, one of the central tenets of functional process improvement and the reinvention of government. The trend toward greater decentralization is based on the assumption that the best decisions are likely to come when informed and motivated managers on the scene of action are given considerable discretion as to how they accomplish their mission. It is entirely consistent with this point of view to give project managers greater discretion in choosing a programming language, within a general set of standards and guidelines.
Almost everyone with knowledge of the current software development scene agrees that eventually a fundamentally new paradigm must replace the existing one. Serious disagreements may arise concerning the timing of the change and nature of the new paradigm, but few would question its inevitability. Formidable barriers to change exist, and so the widespread adoption of a new paradigm will undoubtedly take many years. Nevertheless, well-managed organizations increasingly recognize that they must move aggressively to extricate themselves from the software quagmire that limits their ability to exploit information technology. DoD software policies should encourage a cautious but deliberate path in this direction, rather than impose unnecessary constraints to inhibit it.
Based on the above discussion, we make the following recommendations:
1. Considerable discretion should be granted to AIS project managers in judging whether the use of Ada is cost-effective in a particular case. An AIS developer should be required to justify a detailed development plan, including the choice of programming languages and development environment. A coherent plan that proposes a high-productivity approach without programming in Ada -- such as the use of COTS, integrated CASE tools, small development teams, and an adaptive design methodology -- should generally be accepted as cost-effective, thus satisfying the Ada mandate.
2. DoD should establish a process for reviewing software development languages and tools, selecting a limited set of approved products, and building services to support the selected products. Clearly, the process of selecting an approved product must fully satisfy the requirements for competitive procurements (similar to the process used to select a DoD integrated CASE product). A developer should have to justify the use of any product not on the approved list.
3. DoD should assess current policies and standards with the objective of eliminating those that tend to force AIS developers to use unproductive methodologies. Particular attention should be given to contracting mechanisms for acquiring software through an adaptive development process, in-house versus outsourcing practices, reducing the current documentation burden, increasing the skill level of DoD software developers, and building more effective partnership arrangements between developers and users. New guidelines and exemplary management practices should be prepared for improving the efficiency and effectiveness of AIS software development.
4. DoD should conduct well-controlled pilot studies to develop improved methods of software development. Critical to such studies is the collection of comprehensive metrics to provide objective data on which sound software development policies can be based. (The pilot studies connected with the I-CASE procurement are a definite step in the right direction.)
Appendix 1: Software Reuse and Higher-Level Languages
A strong case can be made that DoD, like all other large organizations, must extricate itself from the conventional development process that requires laborious, line-by-line coding. Ada supporters generally agree with this proposition, and look to reusable components as an important contributor to this end. Studies of software development productivity generally conclude that reusability potentially offers one of the most fruitful avenues of improvement.
Software is an almost unique economic good: it costs a great deal to create but almost nothing to reproduce. This creates a powerful incentive to employ previously-written code instead of new custom code. In the AIS domain, systems generally have a number of similar generic functions, creating numerous potential opportunities for reuse.
A fundamental requirement for successful reuse is a domain analysis that identifies generic functions commonly required to develop applications within the target domain. In the AIS domain, for example, the common functions would include those for specifying screen and report formats, handling input transactions, managing the database, and performing relatively simple logical transformations of input data into output data. The identified functions can then be written (in Ada) as reusable components. In effect, these components become "primitives" in Ada. An Ada application developer can then assemble the set of components to be used in a given application, and write an Ada program that "calls" each component at the appropriate point in the program.
Barriers to Software Reuse
Despite the attractiveness of software reuse, the barriers to achieving it are formidable. In practice, reuse falls far short of its potential, especially across organizational boundaries. Any realistic appraisal of efforts to increase Ada's productivity through component reuse must take these difficulties into account:
Software Reuse in Practice
1. Informal reuse of code fragments. The most common form of reuse is the informal reuse of portions of a program. This almost invariably involves reuse of code that is already familiar to the developer -- typically belonging to a program one has helped write or maintain. For example, a programmer might create a new report by copying another report and making minor data and format changes. Informal reuse reduces the programming and testing burden somewhat, but does not lower other life-cycle costs.
2. Component reuse. A larger fraction of a project's lifecycle costs can be reduced if entire components can be reused, ideally with no modification.
Successful component libraries have been made available as commercial products targeted to a narrow and well-defined domain. The developer of the components accepts responsibility for their correctness and maintenance (as long as they are treated as "black boxes" and not changed in any way). The components provide the programmer and analyst with an extended set of high-order primitives, expanding the range of functions that can be defined without having to resort to multiple lower-level primitives.
The criteria for the success of a component library are 1) a narrow and well-understood domain, 2) a broad enough market to justify developing and supporting the library, and 3) a set of components that are generally useful and complex enough to make it worthwhile to avoid re-creating their functionality. Successes with this approach include FORTRAN-based mathematics libraries and C++ -based GUI utilities.
Reuse can also occur through the creation of repositories consisting of contributed components that permit the sharing of software resources among a population of potential users. The early hope was that if enough programmers put copies of their work into a generally-accessible repository, the likelihood of new developers finding the raw materials they needed would increase as the repository grew in size. The portability of Ada raised hopes of being able to create DoD-wide repositories, with little or no concern about language or hardware incompatibilities. In practice, the barriers to reuse have limited the success of this approach. Experience has shown that not much component reuse happens fortuitously, no matter how large the repository.
3. High-level reuse. The design of an application should take about 40 percent of the total implementation effort, whereas coding may take only about 20 percent. One can therefore achieve higher leverage from the reuse of design components than from lines of code. This requires the developer of a new application to accept a number of previously-made design decisions. In order for this to work, careful domain analysis must be performed to create reusable high-level components that meet an acceptable fraction of the needs of future developers. Designing for reuse is a laudable aim, but it requires considerable management effort to change the old habits of designers and programmers accustomed to developing new applications largely from scratch without constraints on their "creativity."
4. Packages and COTS. In a sufficiently well-understood and widely-used application domain (e.g., inventory control, payroll, accounts receivable), it has become feasible to produce application packages that meet the needs of multiple organizations. Users have the option of making their business procedures compatible with the features of a package, or spending far greater sums to modify the package or produce an entirely new custom application.
5. Software factories. A number of software vendors, notably in Japan, have achieved extremely high levels of reuse within specialized application domains (e.g., factory control systems). The target applications tend to be insufficiently standardized for any single package to be widely applicable, but sufficiently standardized that most of a custom system can be produced from a carefully designed menu of previously developed components.
The Prospects for Ada Reuse
Ada has technical advantages in supporting component-level software reuse, by virtue of its portability and language features (e.g., packages, generics, and -- soon -- object-orientation). Component-level reuse has considerable potential for reducing the cost of custom software, but most of this potential is wasted if software is simply made available for fortuitous reuse. Real success requires design for reuse, organizational commitment to a program of reuse, and non-trivial funding.
Other approaches to reducing or eliminating line-by-line coding are conceptually similar to component reuse. The designers of a higher-level language face the same problem of domain analysis as those seeking to identify reusable components or objects. If a required task is not included as a primitive in a higher-level language, it can generally be defined by combining lower-level primitives (at some additional effort).
Designers of a higher-level language have some important advantages over those trying to assemble a comprehensive set of reusable components. The language designers can provide a consistent user interface and seamless integration with the built-in primitives. They have control over the tradeoff between including a large number of powerful primitives (thus adding to the cost and complexity of the language) versus forcing the user to create equivalent functionality using a combination of lower-level primitives to define composite tasks (thus making the language simpler to develop and learn, but requiring greater programming effort to apply).
A large variety of special-purpose languages has been developed to meet the needs of targeted application domains. Decision support systems (DSS) have been an especially fertile area for such development. The set of special-purpose DSS languages includes spreadsheet products, financial modelling languages, statistical analysis languages, simulation systems, and end-user database/analysis/reporting languages. One of the important attractions of these languages is their "user-friendliness" that permits users to learn the language relatively easily and then develop their own applications. The result can be an exceedingly productive software development environment.
This same principle of using a special-purpose language to develop applications within a given domain applies to AIS as well as DSS. For the AIS domain, the language must focus on such common tasks as managing a large database, designing the human interface, creating queries and reports, managing the stream of incoming transactions, and providing security. Fourth-generation languages, application generators, and integrated CASE products are increasingly able to provide a powerful set of functions for creating applications without having to resort to line-by-line 3GL programming.
APPENDIX 2: A BRIEF BACKGROUND ON ADA

Ada 83 was designed and implemented in the early 1980s with two primary goals: 1) eliminate the rampant proliferation of programming languages within DoD, and 2) make available a procedural language that incorporates the best set of procedural language features.
At that time, DoD suffered from severe language proliferation. Over three hundred computer languages and dialects were in use, with costly consequences. The multiplicity of languages meant that software and personnel were not portable. Two sites with identical software needs might not be able to use the same software if their hardware supported different languages. Programmers could not easily be moved from one application to another if the two were developed in different languages. The maintenance burden was particularly heavy in cases where the language was obsolete or tied to obsolete hardware.
The solution chosen was to standardize on a single computer language for the development of large-scale real-time systems. Great effort was spent to identify the best structured development methodologies that came out of lessons learned in the 1970s. An analysis of existing languages identified no generally acceptable candidate that provided all of the constructs needed to support sound software engineering. As a result, DoD sponsored the development of a new language, Ada, which embodied the best available practices for the target application domain.
Although Ada is a general-purpose language, the representative characteristics of the embedded systems domain -- large, time-critical programs having extremely stringent reliability requirements -- clearly dominated design decisions. Ada 9X, soon to be released after a ten-year revision cycle, continues this same design emphasis.
APPENDIX 3: CHARACTERISTICS OF INTEGRATED CASE PRODUCTS
One of the difficulties in discussing emerging software development tools is that no consistent understanding exists as to what functions are included; in fact, we do not even have a generally accepted term for them. The following features are among those that characterize this class of product (not all of which are available in any given product):
- Create screen formats
- Define automatic error checks for input data
- Generate reports or responses to user queries
- Design "user-friendly" graphical interfaces
- Keep track of transactions in their various stages of processing
- Maintain a journal of all events within the system
- Recover from a system failure
- Interface with other existing (legacy) applications
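The distinguishing feature of these products is that such capabilities are declared rather than programmed line by line. The following minimal sketch, using hypothetical field names and a made-up rule format, suggests the declarative style: input-validation rules are stated as data, and a generic engine applies them, so no per-field 3GL code is written.

```python
import re

# Hypothetical declarative validation rules, stated as data rather
# than as line-by-line procedural code for each input field.
FIELD_RULES = {
    "ssn":    {"required": True, "pattern": r"^\d{3}-\d{2}-\d{4}$"},
    "amount": {"required": True, "type": float, "min": 0.0},
    "remark": {"required": False},
}

def validate(record, rules=FIELD_RULES):
    """Apply the declared rules to one input record; return error messages."""
    errors = []
    for field, rule in rules.items():
        value = record.get(field)
        if value is None or value == "":
            if rule.get("required"):
                errors.append(f"{field}: required field missing")
            continue
        if "pattern" in rule and not re.match(rule["pattern"], str(value)):
            errors.append(f"{field}: does not match expected format")
        if "type" in rule:
            try:
                value = rule["type"](value)
            except (TypeError, ValueError):
                errors.append(f"{field}: wrong type")
                continue
            if "min" in rule and value < rule["min"]:
                errors.append(f"{field}: below minimum")
    return errors

# A well-formed record passes; a malformed one yields specific errors.
ok = validate({"ssn": "123-45-6789", "amount": "19.95"})
bad = validate({"ssn": "12345", "amount": "-3"})
```

Adding a new field to the application then means adding one rule entry, not writing and testing new procedural code, which is the essential source of the productivity gains these tools claim.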
Despite the power of the new higher-level development tools, they still have some serious shortcomings, particularly for DoD use.
Vendors of I-CASE products continually introduce enhanced features that increase their products' effectiveness and reduce some of their current technical limitations. There will still remain, however, some formidable barriers to the widespread adoption of these tools within DoD, as pointed out in a recent GAO report. Notwithstanding the difficulties, I-CASE tools offer such substantial potential benefits that they should not be dismissed as too immature for consideration at this time. The recent DoD acquisition of an I-CASE product is certainly an important step in giving visibility and support to the concept of high-level development tools. It provides an opportunity to gain early experience with an exceedingly important area of information technology.
VI. REFERENCES

Abdel-Hamid, Tarek, and Stuart E. Madnick, Software Project Management, Prentice-Hall, 1991.
Banker, Rajiv D., Robert J. Kauffman, and Dani Zweig, "Repository Evaluation of Software Reuse," IEEE Transactions on Software Engineering, 19, 4 (April 1993), pp. 379-389.
Banker, Rajiv D., and Robert J. Kauffman, "Reuse and Productivity in Integrated Computer-Aided Software Engineering," MIS Quarterly, 15, 3 (September 1991), pp. 375-401.
Bannick, Kathy A., "Breakdown of Software Expenditures in the Department of Defense, United States, and in the World," Masters Thesis, Naval Postgraduate School, 1991.
Boehm, Barry W., "Software Engineering," IEEE Transactions on Computers, vol. C-25 (December 1976).
Boehm, Barry W., Software Engineering Economics, Prentice-Hall, 1981.
Boehm, Barry W., "A Spiral Model of Software Development and Enhancement," Software Engineering Notes, Vol 11, 4 (August 1986).
Boehm, Barry W., "Improving Software Productivity," IEEE Computer, 20, 9 (September 1987), pp. 43-57.
Bui, Tung, Cheryl D. Blake, and James C. Emery, "Prototyping with Application Generators: Lessons Learned from the Naval Aviation Logistics Command Management Information System Case," Naval Postgraduate School, October 1992, 56 pp.
Bui, Tung, James C. Emery, Gerald Harmes, Myung Suh, and Tina VanHook, "A Clearinghouse for Software Reuse: Lessons Learned from the RAPID/DSRS Initiatives," Naval Postgraduate School, October 1992, 39 pp.
Cusumano, Michael A., Japan's Software Factories: A Challenge to U.S. Management, Oxford University Press, 1991.
Defense Science Board Task Force Report, FY 1994-99 Future Years Defense Plan (Odeen Report), May 1993.
Department of Defense, "Fiscal Years 1992 and 1993 Report on Information Technology Resources," Exhibit 43A, 1992.
Department of Defense, "Technical Architecture Framework for Information Management," Volume 4, Standards-Based Architecture Planning Guide, Version 2.0, 25 October 1993.
Emery, James C., and Martin J. McCaffrey, Ada and Management Information Systems: Policy Issues Concerning Programming Language Options for the Department of Defense, Naval Postgraduate School, Monterey, CA, June 1991, 102 pp.
Emery, James C., Management Information Systems -- The Critical Strategic Resource, Oxford University Press, 1987.
Emery, James C., "The Strategic Implications of a Productive Software Development Process," Proceedings of the Workshop on Information Technologies and Systems, M.I.T., 14-15 December 1991, 15 pp.
Farley, Dennis, and Tony Scott, "King COBOL -- Deposed at Last," System Builder, June/July 1990, pp. 45-47.
Federal Computer Week, "Rational, Verdix Merge, Aspire to Be Ada Power," November 1, 1993, p. 28.
GAO, House Report 101-382, "DoD Automated Information Systems Experience Runaway Costs and Years of Schedule Delays While Providing Little Capability." 1992.
GAO/IMTEC, "Automated Information Systems -- Schedule Delays and Cost Overruns Plague DoD Systems," 1989.
GAO/IMTEC, "Defense's I-CASE Implementation," June 1993.
Hanna, Mary, "Can CASE Bridge to Object World," Software Magazine, July 1993, pp. 41-5.
ITAA, Enterprise Integration in the Department of Defense, July 1993, 42 pp.
Jones, Capers. Applied Software Measurement, McGraw-Hill, 1991.
Kemerer, Chris F., and Eric Brynjolfsson, "Network Externalities in Microcomputer Software: an Econometric Analysis of the Spreadsheet Market." CISC Working Paper, M.I.T., November 1993.
Levitan, Karen B., John Salasin, Thomas P. Frazier, and Bruce N. Angier, "Final Report on the Status of Software Obsolescence in the DoD," Institute for Defense Analyses, Paper P-2136, August 1988.
PRISM, "Research Summary," Refractions -- The New World of Systems Development, Index Systems, December 1988.
Schonberg, Edmond, "Contrasts: Ada 9X and C++," Cross Talk, September 1992, pp. 12-16.
Weill, Peter, "The Role and Value of IT Infrastructure: Some Empirical Observations," in Banker, Kauffman, and Mahmood (eds.), Strategic Information Technology Management, Idea Group Publishing, 1993, pp. 547-72.
VII. BIOGRAPHIES OF AUTHORS
Dr. James C. Emery was recently appointed as Professor in the Administrative Sciences Department at the Naval Postgraduate School. Prior to this, he had over 25 years' experience as a faculty member at M.I.T. and the University of Pennsylvania's Wharton School, as well as ten years' full-time experience in industry, government, and the non-profit sector. While on leave from the M.I.T. faculty in 1961-62, he worked as Staff Analyst (GS16) in the office of the Assistant Secretary of Defense (Comptroller). He was a faculty member at Penn from 1965 to 1993; he served two years of this period as the University's Director of Computing Activities, and three years as the Chairman of the Decision Sciences Department. In 1974 he joined EDUCOM, a membership organization of colleges and universities dealing with networking and other computer-related issues in higher education; he was president for three years before returning to Wharton in 1980. He is a Founding Member and past president of The Society for Information Management, and served a three-year term as Senior Editor of the MIS Quarterly from 1989 to 1991. His research interests include application software development methodologies, information systems planning, information systems economics, and organizational coordination. He has a B.S. in chemistry from the University of Arkansas and an S.M. and Ph.D. from the Sloan School of Management at M.I.T.; he was a Fulbright Scholar at the University of London in 1954-55.
Dr. Dani Zweig is a Research Assistant Professor at the Naval Postgraduate School. Prior to his appointment at NPS he served as a consultant for Peat Marwick. His research interests include software reuse, the cost implications of software complexity, and the analysis of DoD's software inventory, its rate of obsolescence, and expected replacement costs. He has an M.Sc. in computer science from the University of Toronto and a Ph.D. in industrial administration from Carnegie-Mellon University.