Reproduction is a Moral Good

Death is the biological fulfillment of the promise of equality; equality with the non-biological, physical world. Materialism beyond individualism leads to the equality of dirt and humans; consistent treatment between nonhuman and human worlds. The end logic of equality is the overcoming of all life boundaries, all distinctions, and all separations until Singularity or death. – Mitchell Heisman

Return to the unconscious flux where differences are no longer perceived.
A death wish masked as a metaphysical ideal.


Emancipation from multiplicity or not, we are liberated only from the awareness of divergence, not from its presence. The hope for a return, after death, to an imagined singularity expresses a deep and unacknowledged hatred for oneself and for all that made one possible.

Can you say more? A statement like “whatever else the good is, it cannot be X” both leaves ‘the good’ undefined and narrows down the set of things that it could be. I see my argument as making that sort of claim.

I have made many arguments for the positive version (i.e. arguments like “X is good”) in this thread, and my claim here is that the negative version (i.e. arguments like “Y is incompatible with the good”) is better justified by those arguments. This is my fullest exposition of the positive version; here’s a tl;dr of the negative version:

  1. “The good” can only be coherently defined by reference to its function
  2. As an evolved trait, its function was instrumental towards the survival and reproduction of our ancestors
  3. Therefore antinatalism, the position that we should not reproduce, rejects the very function that gives “the good” its content, and so cannot be compatible with any coherent definition of “the good”.

Again, note that this claim constrains “the good”; it does not define it.


Restated this way, I see a response to your contention that functional morality makes morality tantamount to increasing evolutionary fitness, in the form of a counterargument to my claim that reproduction is a moral good:

Morality’s function was not to increase survival and reproduction; it was merely instrumental towards that end. Previously I’d described its function in terms of cooperation and group cohesion; why shouldn’t we describe ‘the good’ solely in those terms, and leave other evolved traits to deal with reproduction? While it may be that evolutionary fitness is maximized when the sum of the instincts motivating an individual points in the direction of reproduction, that doesn’t entail that every instinct must point in that direction. If reproduction hurts cooperation or group cohesion, morality should motivate the individual away from reproduction – that is its function.

I’m not sure I’m convinced by that, but I think it’s a strong counterargument to my claim in this thread. In any case, I think it shows how functional morality is distinct from evolutionary fitness.


Sasquatch got your tongue, Satyr? Click the down arrow:

No reply. Click down or up arrow.

Final answer.

I am not sure how to answer this. Is George Clooney an “in silico model”?

Nothing you’ve posted merits a response.

Carleas:

Your thread says moral instinct springs from reproductive fitness.

Then you say that in silico AI without a moral instinct can “get it right” comparably to those with a moral instinct.

Does “get it right” mean that in silico AI are reproducing?

Because you only respond when you disagree? Thanks for the affirmation, buddy.

Instincts are morally neutral. A moral choice/decision that “gets it right” orders the instincts according to self=other. Instincts are our default programming: an unconditioned response (so how’d it get there?). DNA has error correction.

Ah, I see what you mean. In that post, by “get it right” I only mean that they could label behaviors the way that the human moral instinct labels them, and therefore they could convincingly ‘pretend’ to be moral even though they don’t have a moral instinct in the way I think you mean.
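To make that concrete, here’s a minimal sketch of what “labeling behaviors the way the human moral instinct labels them” could look like. Everything in it is an invented assumption (toy data, an off-the-shelf scikit-learn classifier); it’s an illustration, not anything anyone in this thread has built:

```python
# Toy sketch: a model that reproduces human moral labels without having
# anything like a moral instinct. Data and labels are invented for
# illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Behaviors paired with the labels a human moral instinct assigns them.
behaviors = [
    "shared food with a hungry stranger",
    "returned a lost wallet intact",
    "kept a promise at personal cost",
    "stole medicine from a neighbor",
    "lied to a friend for personal gain",
    "abandoned a teammate in danger",
]
labels = ["good", "good", "good", "bad", "bad", "bad"]

# The model learns only the statistical mapping from words to labels.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(behaviors, labels)

# It can now label a novel behavior the way the instinct would,
# despite having no instinct, motive, or stake in the outcome.
print(model.predict(["lied to a stranger for personal gain"]))
```

On the toy’s own terms, the model “gets it right” in exactly the sense above: the output labels match the human ones, and nothing more is going on underneath.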

I don’t think functional morality has a straightforward answer to whether artificial intelligences can be moral agents or moral patients. For moral patiency, I think this goes to the counterargument to my position that I offered in reply to @PZR: if morality is about the group and only the group, then an artificial intelligence can be a moral patient to the extent that it can be a part of the group; if instead morality’s grounding in evolutionary fitness constrains it, then maybe what happens to artificial intelligences would always be amoral, and at most instrumentally good or bad.

For agency, it seems to test the way “morality” is defined under functional morality, but it’s colorable that an AI trained to do the things humans call moral is behaving morally in the same sense that humans are. Said differently, if morality is-what-it-does, and an AI has a module that does what human morality does, then that module is morality.
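In programming terms, that last move is duck typing: judge a faculty by what it does, not by what it’s made of. A toy sketch, again entirely my own invented example with made-up stand-in functions:

```python
# Functionalist test: a faculty counts as a moral faculty iff it does
# what human morality does, regardless of substrate.
def is_moral_faculty(judge, cases):
    return all(judge(behavior) == label for behavior, label in cases)

# Reference labels a human moral instinct would assign (invented).
CASES = [
    ("kept a promise at personal cost", "good"),
    ("lied to a friend for personal gain", "bad"),
]

def human_instinct(behavior):
    # Stand-in for the evolved faculty.
    return "bad" if "lied" in behavior else "good"

def ai_module(behavior):
    # Stand-in for a trained model's predict(); different substrate,
    # same function.
    return "bad" if "lied" in behavior else "good"

# The test looks only at what each faculty does:
print(is_moral_faculty(human_instinct, CASES))  # True
print(is_moral_faculty(ai_module, CASES))       # True
```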