Technidigm-2000

On-the-Level

Common Sense, Technically Speaking


Chopping down the cherry tree...


Chapter 2

INTEGRATING INTEGRITY

Section 2.3:

Advanced Integrity: Always Being On-the-Level


We are likely to associate qualities such as honor and integrity with a youthful George Washington admitting that he cut down the cherry tree. The other "Founding Fathers," especially those who signed the U. S. Declaration of Independence and Constitution, are presumed by most of us to have been on-the-level and, thus, to have had integrity. They risked their fortunes and lives for relatively abstract concepts. They are also viewed as being well educated and experienced leaders. They are models of integrity even today, perhaps more honored than they deserve in some respects, but still highly esteemed.

While these character and ability presumptions regarding the Founding Fathers may be arguable, democratic governments are founded upon and dependent on the collective good intentions of the governed, intentions that are supposed to be reflected in elected officials. At the end of the 1990s, it is evident to many people that these perceived standards for political and government leaders are no longer as high as they were when the Founding Fathers established the government of the United States.
 
 

What Is Integrity in a Technological Era?

One reason that integrity is a lower priority in the 1990s is that we have lost track of what it is, who has it, how it is used, and what it means to a technological society and democracy. Apart from the example set by a few very distinguished religious leaders, there is no general understanding of integrity even in a non-technological framework. We are not confident in our ability to determine who has basic integrity, so many of us simply assume one way or the other until proof emerges. Thus, the question of basic integrity relative to any issue or situation can elicit a spectrum of responses. Moreover, in a technically oriented society, integrity should not be limited to basic personal traits. This means that layers of opposing responses can pile up on top of our basic responses as to the nature of integrity and who has it.

Integrity refers to completeness as well as firmness: being fully prepared to deal well with a question or issue. Since people can seldom perfect their abilities and deal instantly and perfectly with every question or issue, we often choose our leaders relative to each other. When we fail to understand the layers of experience and ability related to integrity in a technological society, it is easy to mis-select leaders. The evidence of this mis-selection may not emerge for years, and even then we may not recognize why we failed to select properly. We may then too easily justify our failure by claiming that we were simply voting against the other candidate, avoiding the issue of positive selection of leaders.

In the nuclear energy field, our popular notions of personal integrity are fundamental but not sufficient for proper operation of a nuclear power plant. This is because our popular notions of integrity are limited to good intentions and fail to consider the increasingly important technology-related contributions to the integrity environment. It takes a lot more than a "nice guy" who has been to college to run a nuclear plant in conformance with society's needs. The same can be said of most industries.
 
 

The Importance of Knowledge, Experience, and Industry-Specific Cultures: The Incomplete Manager

It is increasingly difficult to have not only the basic human honor against which integrity is measured, but also knowledge and experience sufficient to apply that integrity across the complete set of functions needed. Yes, there are additional and important pieces of integrity that involve knowledge, experience, and even an appropriate industrial or interpersonal culture. A good nuclear safety culture is fundamental for nuclear power plants. Yet, there are some nuclear plant managers and nuclear oversight organization managers who have never worked in that culture. They may have integrity in terms of personal honor and good intentions, but they fall short in the technical knowledge, the experience, and the understanding of the unique culture required for success.

When a manager having an incomplete portfolio of integrity is too high up in the organization, it can become very difficult for others to keep the organization on track and to ensure success. These incomplete managers can have a highly developed sense of their own personal integrity, but they seldom provide evidence of any technical integrity because they do not have the knowledge and experience to form its basis. They are simply managers who have only peripheral technical knowledge and experience.

Incomplete managers spawn whistle blowers. Most whistle blowers not only have the basic personal integrity needed in their industry; they also have the technical and cultural integrity that most directly supports that industry's objectives. It is up to senior managers and leaders to establish and maintain a climate of integrity throughout the organization in which all members are comfortable raising questions and getting issues resolved responsibly. When someone is forced to play the role of whistle blower, it should immediately be evident that incomplete managers exist in the organization. This alone should be enough to cause an intensive reflection on technological common sense within the organization.

Sometimes leadership is so powerful that whistle blowers are intimidated into silence. Many commercial nuclear plants and even nuclear weapon plants were supposed to be "saved" by retired Navy nuclear admirals, but most fell on their faces because they did not have the technical insights needed for technical integrity. They relied too much on leadership by intimidation, mimicking the highly esteemed Admiral Hyman Rickover, but bringing less than a full measure of technical ability and, indeed, less than a full understanding of the model, no-fault nuclear safety culture instituted within the Navy nuclear propulsion program.

Thus, the absence of whistle blowers in an organization does not imply an adequate culture or adequate levels of technological integrity. You may still have incomplete managers running the organization. It takes only one of them in the organizational chain of command to set up conditions that could lead to disaster. This is why the organization's culture is so important. Most nuclear plant managers would have a difficult time passing a nuclear safety culture test, but almost none of the industry's critics could even recognize the importance of the individual questions.
 
 

Perfect Integrity

In the following discussion, integrity is defined in even broader terms, and those broad terms help us to understand the narrower concept of personal integrity. Developing an understanding of integrity in its broadest sense is one area in which it is very useful to borrow directly from technology. Integrity in technology is less ambiguous than it is in human relations. If a computer chip has a single flaw, even one that shows up only under extreme circumstances, the chip's integrity is largely destroyed.

If we held people to the same standards as we hold computer chips, how many would meet the standard? We hope that more people would meet the standard than is indicated by the popular media, but there is a reason that we can expect things to be more perfect than people. The integrity of things is largely established when they are made. People have to establish their integrity continuously!

Pretend for a moment that you do not understand this abstract thing we call integrity. Such ignorance occurs all the time in engineering circles. The designers and operators of the failed nuclear power reactor at Three Mile Island (TMI Unit 2) believed that they had done their best, right up to and including the melting of their nuclear fuel. The TMI plant was designed with effective containment buildings, so while a series of unexpected problems led to the meltdown of one of the plant's reactor cores, most of the radioactivity was contained. The TMI plant design had a kind of integrity that we seldom consider. It was not perfect, but it was adequate relative to the 1986 event at Chernobyl.

TMI was the best we could do given the information available at the time of design and construction. The feedback (one of the 12 Technidigm-2000 elements) from TMI's core meltdown was used to improve reactor plant design safety and operating procedures. Feedback allows technology to approach perfection -- design integrity. Feedback can help managers and their organizations to approach perfection and to become more complete.
 
 

Compensating for Incomplete Integrity

At TMI, design integrity success compensated for failures in plant operational and maintenance integrity, even though the technical records and rationale behind the plant's "design basis" had already been discarded or lost. The nuclear industry refers to these compensating qualities as defense-in-depth. Similarly, the design success of the U.S. Constitution has often compensated for our failure to operate the resulting government as intended. We have been slow to understand and maintain the "design intent" of the Constitution. Technidigm-2000 provides a vehicle for turning this situation around.

The designers and operators of the failed nuclear power reactor at Chernobyl also believed they were doing the right thing. Yet, the Chernobyl plant was designed without effective containment buildings, leaving little room for human error. The Chernobyl design integrity was faulty. The defense-in-depth was not as strong as at TMI. Then a Chernobyl supervisor, in the absence of an adequate nuclear safety culture as well, deliberately overrode a safety interlock while testing one of the Chernobyl reactors. The reactor core melted, and large amounts of fission-product radioactivity were released. Interestingly, the integrity of the Soviet reactor plant design conformed to the integrity of the Soviet governmental system, just as the integrity of the American reactor design conformed to that of the American government.
 
 

Integrity for a Lifetime

The radioactivity released from the core fission products was largely contained at Three Mile Island, while the radioactivity released from Chernobyl contaminated much of the world. The difference between the two results was the amount of integrity associated with the technical design (relatively fixed) and the operation (people) of each of the plants. The integrity of the original design and the integrity of construction, operation, testing, and maintenance all play a part in the definition of engineering integrity. Failing to maintain any of these factors at any point in a nuclear plant's life cycle can open the door to failure. Likewise, constitutional government requires careful design, testing, operation, and maintenance. We are its operators and maintainers. Our job is continuous.

The fact that the Chernobyl accident and its release of great amounts of radioactivity to the atmosphere were kept secret for a few days reflects on Soviet governmental integrity. It was not until scientists in other countries detected the radioactive cloud and stated the obvious that the source of the radioactivity was acknowledged. The Soviets were forced to admit that they had released a large quantity of radioactivity into the atmosphere. When you are forced to be honest and forthright, your integrity is questionable. Such questions remain for a lifetime, including the lifetime of a country.

Although it falls within the most common understanding of integrity, forthrightness is only one aspect of integrity. The problem at TMI was acknowledged immediately. Nevertheless, the argument over the severity of the TMI radioactivity release continues as a result of the polarized interests involved -- pro-nuke and anti-nuke. Polarization leads to extremism, which almost always takes us off-the-level.
 
 

Problematic Integrity Has Root Causes

In engineering projects, engineers look for "root causes" of problems. When they find something that was less than what was needed, they have also found a weakness in the overall project integrity. The problem might be with the original design principles used many years ago, or it might be something that happened today for the first time. Nevertheless, the fact that we did not anticipate a new problem is as much a failure of a program's integrity as our failure to plan for what we knew would happen. We must allow margins for planning error and lay out programs and solution systems that are conservative enough that we can maintain control of our enterprises. The root cause of many problems is that we failed to plan ahead. Planning ahead is part of integrity.

Time (one of the 12 Technidigm-2000 elements) must be considered in most decisions. This is a difficult task because no one can fully anticipate what might happen in the future. Yet, this is what we expect of nuclear plant designers. They meet our expectations by providing design safety margins. Their responsibilities for creating a design of high integrity are broad and circumspect.

Likewise, societal institutions require a broad and circumspect definition and implementation of integrity. One difference between TMI and Chernobyl was the difference between a society that can demand integrity of its leaders and one that cannot. In a democracy, the people expect more from government oversight organizations such as the U.S. Nuclear Regulatory Commission (NRC), and they generally get something for these expectations. Indeed, the NRC is almost as interested in trends that could lead to problems as it is in problems that have already occurred. As in most situations, it is better to prevent a problem than to have to deal with it.
 
 

Planning Through Trending: Relativistic Integrity

The trends in society and in politics are also important in heading off major problems. The problem is that trends are not as easily recognized when the standards of performance are allowed to deteriorate. Negative trends become acceptable when the standards of performance slide. At some point, the problems become so severe that the lowered standards are no longer acceptable, and the trend may then be reversed before a major problem occurs.

In a constitutional democracy, the people can cause such a reversal if they recognize the problem in time and take action. This can become more difficult when technology accelerates social changes. Technidigm-2000 helps us to recognize problems over time by recognizing time as a key element in dealing with issues. Nevertheless, the trending or planning issue is only one aspect of our modern, technology-based, societal integrity.
 
 

Integrity Requires Facing Facts

Even in a democracy, success depends on understanding the problem and facing the facts of the situation. Following the TMI accident, the NRC required all of the U. S. nuclear plants to reconstitute and validate their design integrity documentation. Many nuclear utility companies simply no longer knew how their nuclear plants had been designed. They had thrown out or lost their basic design information, so they could not maintain the plant in conformance with the designers' intentions. The NRC required them to face the facts of the situation. It was a very expensive requirement, but it reestablished the design integrity of the U. S. nuclear power industry. Nuclear plant operational and maintenance issues will continue and also must be addressed with continued high levels of integrity.

Importantly, nuclear plant integrity was reestablished and improved by identifying and elevating previously neglected but fundamental safety principles. Engineering principles have evolved over thousands of years, but implementing them with integrity is now more difficult and expensive due to increases in project size and complexity, not to mention the significant increase in the hazards that result from failure. Not surprisingly, the identification and continuing support of principles are closely related to integrity, whether that integrity is in the design of things or in the character of individuals.

What we learn from all of this is the purpose of integrity. The purpose of integrity is to implement and maintain principles even when it is difficult to do so. Unless we understand the principles being pursued, the effects of time, and the importance of feedback within margins of safety, our notion of integrity can wander significantly. This is true in any situation where integrity is an issue. Principles and integrity are inseparable in good technical projects, and they are inseparable in good people and good social projects.

 

Understanding Personal Integrity

If you think the need for personal integrity is obvious, no one will disagree with you. The hard part is determining whether personal integrity is present. The managers who decided to launch the Challenger space shuttle the day it exploded were full of integrity, but they did not understand how to apply their integrity in a technologically complex situation. They had to learn the hard way, at the expense of the lives of others.

When the Challenger exploded, we all learned that being a manager does not confer more technical wisdom than that of the cognizant engineer. It was a single engineer near the bottom of the organization who understood the specific limitations of the equipment, in this case the temperature limits of a critical seal, an O-ring. There was little time and no effective means of expressing those limitations to the decision makers.

Technidigm-2000 provides tools with which such complex and time-critical situations can be addressed more effectively in the future. The decision to launch the Challenger was probably made on-the-level, but at level one (a manager's opinion) rather than level four (a fully informed and experienced decision maker). To the extent that the manager's decision to launch was made due to inappropriate considerations, the decision was off-the-level. Assuming that we are on-the-level, levels constitute four more of the 12 Technidigm-2000 elements.

Many people believe that the engineering design integrity of the complex space shuttle launch system became a secondary consideration to the politics of the moment. Reversed priorities occur all the time, but they seldom lead to so dramatic a result as an explosion and loss of life. Much more often, the politics of the moment merely suppress the thin threads of integrity such that adverse trends relative to established standards develop. Major adverse consequences that can be deferred are likely to be deferred. When long-term trends are involved, accountability for those adverse consequences becomes obscured by the passage of time and by changes in context. Thus, a good solution system for an issue must consider a range of integrity-related concepts for the label "good" to be true.

If we are to achieve good solutions, we first need good personal integrity in those who develop and control those solutions. With personal integrity in place, principles can be applied in the proper solution system context over the appropriate period of time. When these elements are properly coordinated, adverse consequences are more easily avoided.

