We are likely to associate qualities such as honor and integrity with a youthful George Washington admitting that he cut down the cherry tree. The other "Founding Fathers," especially those who signed the U.S. Declaration of Independence and Constitution, are presumed by most of us to have been on-the-level and, thus, to have had integrity. They risked their fortunes and lives for relatively abstract concepts. They are also viewed as having been well-educated and experienced leaders. They are models of integrity even today, perhaps more honored than they deserve in some respects, but still highly esteemed.
While these character and ability presumptions regarding the Founding
Fathers may be arguable, democratic governments are founded upon and dependent
on the collective good intentions of the governed, intentions that are
supposed to be reflected in elected officials. At the end of the 1990s,
it is evident to many people that these perceived standards for political
and government leaders are no longer as high as they were when the Founding
Fathers established the government of the United States.
Integrity refers to notions of completeness as well as firmness relative to being prepared to deal well with a question or issue. Since people seldom can perfect their abilities and instantly deal perfectly with every question or issue, we often choose our leaders relative to each other. When we fail to understand the layers of experience and ability related to integrity in a technological society, it is easy to mis-select leaders. The evidence of this mis-selection may not emerge for years, and even then we may not recognize why we failed to select properly. We then may too easily justify our failure to select properly by claiming that we were simply voting against the other candidate, avoiding the issue of positive selection of leaders.
In the nuclear energy field, our popular notions of personal integrity
are fundamental but not sufficient for proper operation of a nuclear power
plant. This is because our popular notions of integrity are limited to
good intentions and fail to consider the increasingly important technology-related
contributions to the integrity environment. It takes a lot more than a
"nice guy" who has been to college to run a nuclear plant in conformance
with society's needs. The same can be said of most industries.
When a manager having an incomplete portfolio of integrity is too high up in the organization, it can become very difficult for others to keep the organization on track and to ensure success. These incomplete managers can have a highly developed sense of their own personal integrity, but they seldom provide evidence of any technical integrity because they do not have the knowledge and experience to form the basis for technical integrity. They are simply managers who have peripheral technical knowledge and experience.
Incomplete managers spawn whistle blowers. Most whistle blowers not only have the basic personal integrity needed in their industry, they also have the technical and cultural integrity that most directly supports that industry's objectives. It is up to senior managers and leaders to establish and maintain a climate of integrity throughout the organization in which all members are comfortable raising questions and getting issues resolved responsibly. When someone is forced to play the role of whistle blower, it should immediately be evident that incomplete managers exist in the organization. This should be enough to cause an intensive reflection on technological common sense within the organization.
Sometimes leadership is so powerful that whistle blowers are intimidated into silence. Many commercial nuclear plants and even nuclear weapon plants were supposed to be "saved" by retired Navy nuclear admirals, but most fell on their faces because they did not have the technical insights needed for technical integrity. They relied too much on leadership by intimidation, mimicking the highly esteemed Admiral Hyman Rickover, but bringing less than a full measure of technical ability and, indeed, less than a full understanding of the model, no-fault nuclear safety culture instituted within the Navy nuclear propulsion program.
Thus, the absence of whistle blowers in an organization does not
imply an adequate culture or adequate levels of technological integrity.
You may still have incomplete managers running the organization. It only
takes one of them in the organizational chain of command to set up conditions
that could lead to disaster. This is why the organization's culture is
so important. Most nuclear plant managers would have a difficult time passing
a nuclear safety culture test, but almost none of the industry's critics
could even recognize the importance of such a test.
If we held people to the same standards as we hold computer chips, how many would meet the standard? We hope that more people would meet the standard than is indicated by the popular media, but there is a reason that we can expect things to be more perfect than people. The integrity of things is largely established when they are made. People have to establish their integrity continuously!
Pretend for a moment that you do not understand this abstract thing we call integrity. Such ignorance occurs all the time in engineering circles. The designers and operators of the failed nuclear power reactor at Three Mile Island (TMI Unit 2) believed that they had done their best, right up to and including the melting of their nuclear fuel. The TMI plant was designed with effective containment buildings, so while a series of unexpected problems led to the meltdown of one of the plant's reactor cores, most of the radioactivity was contained. The TMI plant design had a kind of integrity that we seldom consider. It was not perfect, but it was adequate relative to the 1986 event at Chernobyl.
TMI was the best we could do given the information available at the
time of design and construction. The feedback (one of the 12 Technidigm-2000
elements) from TMI's core meltdown was used to improve reactor plant design
safety and operating procedures. Feedback allows technology to approach
perfection -- design integrity. Feedback can help managers and their organizations
to approach perfection and to become more complete.
The designers and operators of the failed nuclear power reactor at
Chernobyl also believed they were doing the right thing. Yet, the Chernobyl
plant was designed without effective containment buildings, leaving little
room for human error. The Chernobyl design integrity was faulty. The defense-in-depth
was not as strong as at TMI. Then a Chernobyl supervisor, in the absence
of an adequate nuclear safety culture as well, deliberately overrode a
safety interlock while testing one of the Chernobyl reactors. The reactor
core melted, and large amounts of fission-product radioactivity were released.
Interestingly, the integrity of the Soviet reactor plant design conformed
with the integrity of the Soviet governmental system, much like the integrity
of the American reactor design and government conformed with each other.
The fact that the Chernobyl accident and its release of great amounts of radioactivity to the atmosphere was kept secret for a few days reflects on Soviet governmental integrity. It was not until scientists in other countries detected the radioactive cloud and stated the obvious that the source of the radioactivity was acknowledged. The Soviets were forced to admit that they had released a large quantity of radioactivity into the atmosphere. When you are forced to be honest and forthright, your integrity is questionable. Such questions remain for a lifetime, including the lifetime of a country.
Although it falls within the most common understanding of integrity,
forthrightness is only one aspect of integrity. The problem at TMI was
acknowledged immediately. Nevertheless, the argument over the severity
of the TMI radioactivity release continues as a result of the polarized
interests involved -- pro-nuke and anti-nuke. Polarization leads to extremism,
which almost always takes us off-the-level.
Time (one of the 12 Technidigm-2000 elements) must be considered in most decisions. This is a difficult task because no one can fully anticipate what might happen in the future. Yet, this is what we expect of nuclear plant designers. They meet our expectations by providing design safety margins. Their responsibilities for creating a design of high integrity are broad and circumspect.
Likewise, societal institutions require a broad and circumspect definition
and implementation of integrity. One difference between TMI and Chernobyl
was the difference between a society that is able to demand integrity of
its leaders and one that is not. In a democracy, the people expect more from government
oversight organizations such as the U.S. Nuclear Regulatory Commission
(NRC), and they generally get what they expect. Indeed,
the NRC is almost as interested in trends that could lead to problems
as it is in problems that have already occurred. As in most situations,
it is better to prevent a problem than to have to deal with it.
In a constitutional democracy, the people can reverse such adverse trends
if they recognize the problem in time and take action. This can become
more difficult when technology accelerates social changes. Technidigm-2000
helps us to recognize problems over time by recognizing time as a key element
in dealing with issues. Nevertheless, the trending or planning issue is
only one aspect of our modern, technology-based, societal integrity.
Importantly, nuclear plant integrity was reestablished and improved by identifying and elevating previously neglected but fundamental safety principles. Engineering principles have evolved over thousands of years, but the integrity of their implementation is now more difficult and expensive due to increases in project size and complexity, not to mention the significant increase in the hazards that result from failure. Not surprisingly, the identification and continuing support of principles are closely related to integrity, whether that integrity is in design of things or in the character of individuals.
What we learn from all of this is the purpose of integrity. The purpose of integrity is to implement and maintain principles even when it is difficult to do so. Unless we understand the principles being pursued, the effects of time, and the importance of feedback within margins of safety, our notion of integrity can wander significantly. This is true in any situation where integrity is an issue. Principles and integrity are inseparable in good technical projects, and they are inseparable in good people and good social projects.
When the Challenger exploded, we all learned that being a manager does not endow a manager with more technical wisdom than that of the cognizant engineer. It was a single engineer near the bottom of the organization who understood the specific limitations of the equipment, in this case the temperature limits of a critical seal, or O-ring. There was little time and no effective means of expressing those limitations to the decision makers.
Technidigm-2000 provides tools with which such complex and time-critical situations can be addressed more effectively in the future. The decision to launch the Challenger was probably made on-the-level, but at level one (a manager's opinion) rather than level four (a fully informed and experienced decision maker). To the extent that the manager's decision to launch was made due to inappropriate considerations, the decision was off-the-level. Assuming that we are on-the-level, levels constitute four more of the 12 Technidigm-2000 elements.
Many people believe that the engineering design integrity of the complex space shuttle launch system became a secondary consideration to the politics of the moment. Reversed priorities occur all the time, but they seldom lead to such a dramatic result as an explosion and loss of life. Much more often the politics of the moment merely suppress the thin threads of integrity such that adverse trends relative to established standards develop. Major adverse consequences that can be deferred are likely to be deferred. When long-term trends are involved, accountability for those adverse consequences becomes obscured by the passage of time and by changes in context. Thus, a good solution system for an issue must consider a range of integrity-related concepts for the label of good to be true.
If we are to achieve good solutions, we first need good personal integrity in those who develop and control those solutions. With personal integrity in place, principles can be applied in the proper solution system context over the appropriate period of time. When these elements are properly coordinated, adverse consequences are more easily avoided.