Short Series about Science: #4 Scientific Responsibility

I am sorry to say that there is too much point to the wisecrack that life is extinct on other planets because their scientists were more advanced than ours.
John F. Kennedy, speech, 11 December 1959

Science is value-neutral, as perhaps best put by Richard Dawkins:

Scientific and technological progress themselves are value-neutral. They are just very good at doing what they do. If you want to do selfish, greedy, intolerant and violent things, scientific technology will provide you with by far the most efficient way of doing so. But if you want to do good, to solve the world’s problems, to progress in the best value-laden sense, once again, there is no better means to those ends than the scientific way.
Richard Dawkins

And this applies to any science. While the A-bomb is the poster child for the negative side of science in general (actually, not necessarily, but that's another debate), you can easily find negative uses of other areas of science. Take psychology: you can use systematic desensitization to help someone overcome a phobia, or you can use it to turn someone into a torturer (the inhibition 'normal' people feel about torturing others is, in a way, also just a phobia of hurting other people, one that can be treated the same way as any other phobia). Likewise, you can use theories of work motivation to motivate people, or to cheaply downsize a department. That's the beauty and the terror of a good theory: it can be applied to many different purposes.

Which means that every scientist carries personal responsibility, no matter which discipline they work in. Personal responsibility means using the best methods to answer research questions and ensuring the integrity of data and publications (no plagiarism, no falsification, no fabrication). The three whistle-blowers who made their supervisor's fraud public are a good example of personal responsibility and integrity.

It also applies to making sure others can understand what the findings mean, including the limits of these findings. This demands perspective-taking and clear communication: What does the person know? What does the person not know? What does the person not know that they do not know? What can the person understand?

There are many examples where communication went wrong, e.g., the miscommunication about the reaction of the O-rings to low temperatures before the Challenger disaster. It not only cost NASA a shuttle and a mission; seven astronauts died in that shuttle that day:

Whether the astronauts remained conscious long after the breakup is unknown, and largely depends on whether the detached crew cabin maintained pressure integrity. If it did not, the time of useful consciousness at that altitude is just a few seconds; the PEAPs supplied only unpressurized air, and hence would not have helped the crew to retain consciousness. The cabin hit the ocean surface at roughly 207 mph (333 km/h), with an estimated deceleration at impact of well over 200 g, far beyond the structural limits of the crew compartment or crew survivability levels.
http://history.nasa.gov/kerwin.html

I highly recommend having a look at Edward R. Tufte’s “Visual and Statistical Thinking: Displays of Evidence for Making Decisions” to get an impression of the degree of miscommunication involved.

No wonder Feynman wrote in his appendix to the accident report:

“For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”
Richard Feynman

Sure, this is an extreme example, but the point stands: as a scientist, one has a responsibility, with consequences that reach far beyond a single paper.


