I’ve solved this one elsewhere on the forum. But the solution sits, rather problematically, within a significant strength of the current economic model: voluntary trade.
When we think of voluntary trade, we tend to think of our own side alone - and forget that trades are two-way agreements. We all want our own part to be voluntary, but what sets the bounds of what one is prepared to consent to pay? The party who has less to lose.
So long as we have voluntary trade, the party with less to lose will make the price, and the party with more to lose will take the price.
This reaches a natural equilibrium in our current economic model, one that varies - extremely usefully - with how many people are willing to settle for the offered price, and with the compensation agreed for providing whatever is to be sold.
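The price-making principle above can be sketched as a toy model. All names and numbers here are hypothetical illustrations of the claim, not anything from economic literature: the idea is simply that the surplus from a trade is conceded in proportion to how much each side stands to lose if the trade falls through.

```python
def bargain_price(seller_cost, buyer_value, seller_loss, buyer_loss):
    """Toy sketch: split the surplus of a trade in proportion to how
    much each party stands to lose if no deal is struck.  The party
    with MORE to lose concedes more; the party with less to lose
    'makes the price'."""
    surplus = buyer_value - seller_cost
    if surplus <= 0:
        return None  # no mutually acceptable price exists
    # Seller captures a share of the surplus proportional to how
    # badly the buyer needs the deal, and vice versa.
    seller_share = buyer_loss / (buyer_loss + seller_loss)
    return seller_cost + surplus * seller_share

# A seller who can easily walk away (little to lose) captures most
# of the surplus from a buyer who urgently needs the good:
print(bargain_price(seller_cost=10, buyer_value=20,
                    seller_loss=1, buyer_loss=9))   # → 19.0
# Reverse the desperation and the price collapses toward cost:
print(bargain_price(seller_cost=10, buyer_value=20,
                    seller_loss=9, buyer_loss=1))   # → 11.0
```

Adding more willing sellers or buyers would shift these loss figures - more competition on your side of the trade means you have more to lose from refusing any one deal - which is the supply-and-demand responsiveness described above.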
The chain of transactions, priced by the above principle, goes all the way back to the initial extraction of the primary ingredients from the earth - and the earth charges nothing.
And so the arbitrary nature of our current economy is revealed - yet it has huge strengths in responding to supply and demand, and in allowing the choice to take or leave what is provided, with the incentive to provide what is likely to be taken rather than left.
Altering this model not only requires that voluntary trade be compromised.
It also runs into the problem of determining the “needs” of a needs-based economy.
At least some people are going to have to sacrifice consent in trade, and we are still at a point where we cannot truthfully determine what each individual’s real needs are. And beyond the technological and pragmatic difficulties of discerning them, do people even want their needs to be known? Is there not a value in privacy, and an intrusion in forcing it out? Further, is there not psychological value in forbidden things?
This is the strength of the emerging “AI computer algorithm” model. Nobody knows quite how an AI comes to learn what it does - humans, for their part, only program the initial structure for how it starts doing so, and then the AI takes over at a rate humans cannot match. The whole process of learning what people want and need can, it seems, itself be learned, and the fact that humans cannot keep up with it is potentially a good thing. The issues here are the ability to track the wants and needs of specific parties, which would be a breach of privacy, and the question of who programs the AI. Under the voluntary trade model, there is the further issue of who “owns” the process, and what they use the results for. As I pointed out above, the earth charges nothing, so there is no fundamental basis for anyone to own anything. But as I also pointed out above, there is value in pretending that there is.
But aside from these circumstantial issues, the breach of privacy may be opaque enough not to pose a threat - the same cannot be said, perhaps, of the issue of who programs the AI. It is obviously no easy task to program such advanced systems, so very few are capable of doing so, and the policing of these few is not self-regulating… except, perhaps, without voluntary trade. Without voluntary trade, the inequalities in each trade that produce pricing may be eliminated, and without pricing, those who program the AI - and those who would seek to abuse them - have no monetary leverage.
So to answer the question “how would a needs based economy affect… western efforts at civilisation?”: those who benefit from having less to lose from turning down a trade - i.e. the powerful and the rich - would lose their relative advantage, as has already been pointed out in this thread. We would all lose the ideal of voluntary trade; with AI technology we are halfway there, but the moderation of voluntary trade is required to complete the journey. And who moderates voluntary trade?