I’ve referenced the post because the argument I make there has plugged into later parts of the conversation, and pointing back to it is efficient.
To your response: I asked for the math. It’s insufficient to say that everyone’s value is infinite. Any fraction of infinity is also infinity, so should we spend exactly as much to prevent a 10% chance of death as we spend to prevent certain death? I’m open to seeing an explicit formulation of your position that avoids absurd outcomes, but just throwing in more infinities doesn’t seem to do it.
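To make the objection concrete, here’s a minimal sketch of the expected-value framing I have in mind (my framing, not yours; the $10m figure is purely hypothetical). With a finite value, justified spending scales with the probability of death; with an infinite value, it doesn’t.

```python
import math

def justified_spend(value_of_life, p_death):
    """Expected loss = value * probability of death.
    A finite value gives spending proportional to risk;
    an infinite value collapses every risk level to infinity."""
    return value_of_life * p_death

finite = 10_000_000  # hypothetical finite value of a life, $10m
print(justified_spend(finite, 1.0))  # certain death -> 10000000.0
print(justified_spend(finite, 0.1))  # 10% chance   -> 1000000.0

infinite = math.inf
print(justified_spend(infinite, 1.0))  # -> inf
print(justified_spend(infinite, 0.1))  # -> inf, same as certain death
```

This is the absurdity in miniature: under infinite values, a 10% risk "justifies" exactly as much spending as certain death.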
I doubt that you are correct, and I don’t think it’s a coherent position. But this seems like another way of expressing our disagreement here.
Ah, so I did! My intent was to present the idea that it’s possible, and I guess I used the direct question because I recognize that the prospect is uncomfortable. But it does undermine the view-from-nowhere line I’ve been claiming for myself.
Still, I’d like to keep the conversation in the view from nowhere. I don’t think any engagement with this question, whether to accept or reject it, reflects on the morality of those engaging. Even if we could show my position to be definitively correct (which I think we agree has not been shown), I’m not sure it follows that we must have explicit dollar values assigned to everything we care about.
In any case, I don’t think making this about us is useful to exploring the ideas. I don’t think you’re evil for disagreeing with me, and I hope you don’t think I’m evil for making this case. If anything, I think people who engage in moral debate are behaving morally by necessity, because what could be more moral than spending time figuring out what one should do?
But surely there was a difference in value between t_1 (sometime before you met her) and t_2 (sometime after you met her)? Do you treat all rapes equally, or do you place particular value on your wife not being raped?
Or maybe you mean that there is some abstract concept ‘my wife’ and that the value you place on things that happen to whoever stands in that place is constant, even if people move in and out of it. But then, we can do that with other things too: I value ‘my new car’ a certain amount, and as it ages it is no longer ‘my new car’, it becomes ‘my old car’ which I value less.
I also, again, would caution self-skepticism. There is a rich history of e.g. parents selling their children into slavery during lean times. I have a strong belief that I would die before doing that, but I wonder if those parents also felt that way before they were faced with the choice of all their children starving or selling one to feed the others.
I am begging the question a bit here, because I’m trying to show that those values are just like dollar values, and then just assuming they behave like dollar values to show it. But assume for a moment that ‘selling your children’ has a very very high but finite dollar value. Would you notice if it changed by a few dollars based on your mood? Probably not. So introspection here could be a poor way to gauge how dollar-like those values are. Ultimately, though, we can’t introspect each other directly, so I don’t know if this line will be fruitful in resolving our disagreement.
I don’t think that’s so, at least not where “dangerous work” can be quantified as “work where there is an X% chance you will die”. If we know the different odds of dying doing some work A and some other work B, and we also know the pay premium you demand for work A, then we know what additional pay you require for that additional risk of death.
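The inference I’m describing can be sketched in a few lines (the jobs and figures below are hypothetical, chosen only to show the arithmetic): divide the extra pay demanded by the extra risk accepted, and you get an implied value of life.

```python
def implied_value_of_life(pay_a, pay_b, p_death_a, p_death_b):
    """Extra pay demanded per unit of extra annual death risk."""
    return (pay_a - pay_b) / (p_death_a - p_death_b)

# Hypothetical: job A pays $5,000/yr more than job B and carries
# an extra 1-in-2,000 annual chance of death.
print(implied_value_of_life(55_000, 50_000, 1 / 2000, 0))  # -> 10000000.0
```

Nobody doing the riskier job need ever have thought “my life is worth $10m”, but the choice implies the number all the same.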
Here, it’s a thought experiment, but it’s also a common real-life scenario. Many people really do make that exact choice. And for others, the choice is implicit in many other choices. When you choose to buy a cheaper but less-safe car, you make that choice implicitly. The value of various people’s lives is necessarily implicit in choices about health insurance, life insurance, safety precautions, occupations, and hobbies. Those choices don’t need to be contained or pure in order for the value of lives to be implicitly included in them.
Neither would I! But there’s no inconsistency in saying that people do in fact value murdering someone else at $X, and also that we should make it illegal for people to accept money for murder.
I think I understand what you mean here (similar to what I said above about fluctuations of a few dollars on top of some very very large finite amount). But there is another sense in which $100m and $1b are perfectly meaningful amounts, right? We can’t intuitively grasp the difference, because our brains aren’t built to, but that’s part of why we have math: to make sense of numbers we can’t grasp intuitively.
I have not and am not advocating any such thing.
See, I don’t see anything rhetorical in this question. You can literally buy canned oxygen. You can pay someone to fill your scuba tanks.
Air isn’t usually excludable, which makes buying or selling your average breath of air impossible, but that doesn’t make the dollar value non-existent.
This is another part of my question in this thread. One question is what compensation one needs for the violation of one’s own morals; another is what compensation one needs for the violation of someone else’s morals. The latter seems like a lot less, right? If I see someone who keeps halal about to unknowingly eat ham, how much do I need to be paid not to say anything? It seems like roughly zero, although I would take on non-zero costs to avoid serving ham to someone who keeps halal, even without their knowledge.
I think we need to be clear about what we mean by “putting a price on things”. I think the price is there, whether or not we think about it. My understanding is that what tends to devalue things is making the price explicit, and that makes sense. For coalition building, a strong signal that someone is irrationally committed to the coalition above all else is quite valuable. If someone does the hard introspection and determines that their commitment to the coalition actually stops at $10m, the value to the coalition decreases. That $10m limit was there anyway, but making it common knowledge changes the social dynamics.