"Yes, yes, quite so, but you fellows are evading the philosophical point! Would you, or would you not, impose your beliefs on others, if you believed that the good results of doing so would outweigh the bad?"

The problem with philosophy is that its practitioners invariably begin thinking its results are equivalent to knowledge gained through science. We innately believe that what we value has some significance external to us - that it's fixed, and understandable, and that our formulaic principles must always stay intact. This is because we like to know. We don't like to think or believe or suspect. We like to KNOW. And when we don't know, we make stuff up.

People want to have ready-made "knowledge" and ready-made ethics that could hypothetically be encoded in an expert system. "What should I do in this situation?" Then the expert system asks you a whole bunch of questions and spits out a defensible answer.
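To make the point concrete, here is a minimal sketch of that kind of rule-based "ethics expert system" - every question, rule, and verdict here is a hypothetical illustration, not a real system:

```python
# A toy "ethics expert system": it asks a few yes/no questions
# (here passed in as a dict) and spits out a defensible-sounding answer.
# The questions and rules are invented for illustration only.

def ethics_expert_system(answers):
    """Return a verdict from hard-coded rules.

    The rules are ready-made heuristics: change the situation slightly,
    or ask a question the rules never anticipated, and the system
    has nothing to say.
    """
    if answers.get("causes_harm"):
        if answers.get("prevents_greater_harm"):
            return "permissible"
        return "impermissible"
    return "permissible"

verdict = ethics_expert_system(
    {"causes_harm": True, "prevents_greater_harm": False}
)
print(verdict)  # prints "impermissible"
```

The answer looks authoritative, but it is only as good as the fixed rules someone encoded in advance - which is exactly the trouble with treating ethics as ready-made knowledge.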

But I don't believe human ethics is that simple. We might say, "Always this" or "Always that," but we should realize - consciously - that those are heuristics.