Rupert Sheldrake at TEDx

I recently heard about some ‘controversy’ regarding TED being displeased with a presentation from a TEDx event: a talk given at TEDxWhitechapel by a man named Rupert Sheldrake. The talk was posted on the TEDxWhitechapel YouTube channel; TED then requested it be taken down from that channel, and it was later hosted on TED’s blog with a disclaimer above the video stating that skepticism about its claims is warranted.

There’s an ongoing debate about whether TED is justified in its actions, or whether those actions constitute dogmatic censorship.

In his talk, from what I understand, he makes the claim that science itself is dogmatic, and that his list of 10 dogmas needs to be confronted head-on by the scientific community and everyone else. Here are the 10 dogmas:

  1. Everything is essentially mechanical.
  2. All matter is unconscious.
  3. The total amount of energy and matter is always the same.
  4. The laws of nature are fixed.
  5. Nature is purposeless.
  6. All biological inheritance is material.
  7. Minds are inside heads and are nothing but the activity of brains.
  8. Memory is stored in material traces of the brain.
  9. Unexplained phenomena such as telepathy are illusory.
  10. Mechanistic medicine is the only one that really works.

From what I know of Sheldrake he’s a pretty fringe thinker, and some of his stuff is way out there. But he’s not fundamentally against the scientific method, and I think his accusation of dogmas is reasonable enough to at least merit rebuttal. Nagel’s just been roasted for raising similar objections to materialism. Neither of them is a fundamentally weak or religiously-biased thinker. But they call for a philosophical justification of science (as the scientific method can’t justify itself), and a lot of people aren’t up to that. Or perhaps they feel that there are points they’d rather not concede, or that any appraisal of a reductionist view that isn’t entirely supportive will lead the world into a superstitious theocratic dictatorship.

TED started off well, but as far as I can tell it’s a media circus now. The money and the networking have taken over from the original aim. That’s not me being a hipster about it; it’s based on a lot of things I’ve read from people involved in it, presenters, and so on. Meh, sic transit gloria mundi and all that. It’s weird that they’d invite someone like him and then shut him down afterwards.

Well, TED didn’t invite him; he was invited by an independent TED-affiliated group in Whitechapel. TEDx events are locally organized, TED-licensed conferences that receive guidance and branding from TED, but ultimately the TEDx hosts are responsible for inviting their own speakers, and TED can only give broad guidelines about whom to invite.

[quote=“Only_Humean”]
From what I know of Sheldrake he’s a pretty fringe thinker, and some of his stuff is way out there. But he’s not fundamentally against the scientific method, and I think his accusation of dogmas is reasonable enough to at least merit rebuttal.[/quote]
He’s a scientist and he uses the scientific method in his research. So he’s not even superficially against the scientific method.

I believe you could be a materialist and agree with Sheldrake on most of his (implicit) positions in that list of dogmas. Dogmas held by much of the scientific community, but not all of it.

What I found interesting (in a bad way) was that, when Sheldrake was investigating the speed of light, various people responded that it’s constant by definition.
Now, I’m not a great scientist or anything, but that sounds absurd to me, as well as patently false.

I mean, the speed of light may be constant; I’m not saying that that’s false.
But I’m just saying that it’s not true by definition.

Let’s say we were measuring the speed of light relative to, say, the length of something specific, maybe some titanium rod. We’ll call the length of the titanium rod 1 teter. And let’s say we’re in a world in which we know that there is such a thing as ‘light’, we know that it obviously travels at some finite speed, but we don’t know what that speed is. So, one day we’re able to see exactly how far light goes in 1 second, and we find it goes 1 million teters exactly. And then the next day we measure how far the light goes in 1 second, and it goes 1.2 million teters. And the next day, it goes 0.8 million teters in 1 second.
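Purely to illustrate the thought experiment, here’s a minimal sketch in Python. The numbers and the ‘teter’ unit are the hypotheticals from the paragraph above, not real data:

```python
# Hypothetical measurement log from the thought experiment above.
# A "teter" is the length of one specific titanium rod; none of this is real data.
measurements = [
    {"day": 1, "distance_teters": 1_000_000, "elapsed_s": 1.0},
    {"day": 2, "distance_teters": 1_200_000, "elapsed_s": 1.0},
    {"day": 3, "distance_teters":   800_000, "elapsed_s": 1.0},
]

for m in measurements:
    # The *definition* of the speed: distance travelled divided by travel time.
    speed = m["distance_teters"] / m["elapsed_s"]
    print(f"day {m['day']}: speed of light = {speed:,.0f} teters/s")

# The definition is satisfied on every day; it is the *value* that varies.
# Nothing in the definition itself rules these results out.
```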

The results described above may be false, may not actually describe what’s happening in the world, but they’re not false because of some definition. Definitions are flimsy little things. The definition of the speed of light, to me, is only ‘the distance that a photon travels divided by the time it took to travel that distance’. The value of the speed of light is different and separate, imo, from the definition of the speed of light. The question of whether the speed is constant is also a different and separate question. The term ‘the speed of light’ need not be defined any differently from the definition I gave above. We may discover a value that matches that definition, and we may discover that that value is constant, but the value and the question of its constancy are not, need not be, and should not be part of the definition.

Imo.

There has, in fact, been some research that supports the idea that the speed of light has changed over time.

newscientist.com/article/dn6 … ently.html

I also think that RS questions the idea that constants and laws must be eternal, and specifically calls for scientific research to test these assumptions (though he has not stated that light speed has changed). There has been other research in recent years, by scientists other than RS, that supports his questioning of those assumptions. All very much within the tradition of science.

Moreno, thanks for that link.

But what I’m getting at is different from that. It’s not even a question, for me, of whether the speed changes or not. The big mistake is to say that it doesn’t change by definition. I think that’s a huge mistake that a lot of philosophers, and clearly a lot of scientists, make as well. It’s a way of conceptualizing definitions as being more than they are.

There is an exact and unalterable reason why EM waves propagate at the rate they do. But that speed is still environment dependent. And it isn’t a case of being true by definition.
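For what it’s worth, classical electromagnetism makes that environment dependence explicit. The standard relations (general physics, not a claim from this thread) are:

$$c = \frac{1}{\sqrt{\varepsilon_0 \mu_0}}, \qquad v = \frac{1}{\sqrt{\varepsilon \mu}} = \frac{c}{n}$$

where ε and μ are the permittivity and permeability of the medium and n is its refractive index; in anything other than a perfect vacuum, v < c.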

On the other hand, usually when someone is talking about the speed of light being constant, they are talking about relativity, and saying that the speed of light is “constant for all observers”. That is a different issue. In order to form modern relativity, such a constancy was made “true by definition”. That is a major part of religious (dogmatic) Scientism, not valid Science, and is very similar to the Catholic Church declaring that the Earth is the center of the solar system “by definition”.

Although I agree with the notion that real Science should be separated from dogmatic Scientism, I think that list of dogmas to question seems a little silly and perhaps even intentionally misdirecting. There are more concerning and valid things to question. But I haven’t watched that talk yet.

He is indeed; I meant that he’s not in the group of people fundamentally against science, not that he’s less-than-completely against it. He’s an FRS, IIRC; which is not necessarily to say his theories are all deserving of equal merit.

Perhaps in an unusual philosophical take on it, but not in the modern scientific reductionist sense of materialism, I don’t think. Although 1 and 7 are open for debate on the grounds of terminology, stated in that form.

It depends on context. If you’re making an argument using the relativistic paradigm, and not, say, a Newtonian one, then the speed of light (in a vacuum) is by definition constant. It’s simply not accurate to say that the speed of light is the distance it travels divided by the time it takes, as distance and time (or rather, spacetime) can be distorted by mass. That constant velocity is the base “given” from which we can derive distance or time, in that sense. We measure spacetime curvature by the velocity of light, not light’s velocity by the dimensions of spacetime. That would be like checking that a ruler is roughly 12" long by holding it against a carrot of roughly 12" :stuck_out_tongue:
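As a concrete illustration of deriving distance from the velocity of light (a factual aside, not something claimed in this thread): since 1983 the SI metre has been defined via light’s travel time, and radar/laser ranging works exactly this way:

$$c \equiv 299{,}792{,}458\ \text{m/s (exact, by definition)}, \qquad d = \frac{c\,\Delta t}{2}$$

where Δt is the round-trip travel time of the light pulse. The length is derived from the fixed c and a clock, not the other way around.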

Of course, what this means is that (e.g.) mass is also by definition different in Newtonian and relativistic mechanics. You could express mass in terms of the amount of spacetime distortion caused. It’s not that it affects things differently, it’s mathematically a different thing. So if he was using relativistic assumptions to make his test set-ups, it could be valid to point out that he’s doing something nonsensical, working against the theory he’s using to prove his case.

I guess I don’t know enough about the subject to really get it. ‘True by definition’ arguments generally don’t sit well with me, but maybe if I studied relativity I’d get it. I’ll treat it as an open question for now, and not take a hard-line stance against the ‘true by definition’ point until I understand what it means.

But I did think your analogy was interesting, as that’s kind of what I thought the ‘true by definition’ argument meant. If we define 1 foot as “the length of this carrot,” then “1 foot” is no longer constant – it varies as the length of the carrot varies. So if we define the meter, likewise, as some proportion of the speed of light, and if the speed of light varies relative to some other standard that’s not defined based on it, then the meter would also vary to the same degree. So, the speed of light may be ‘by definition’ constant relative to the meter, but then in a world where the speed of light varies relative to something else, the meter itself isn’t constant. Much like a carrot.
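Here’s a minimal Python sketch of that carrot logic. The 299,792,458 figure is the real SI convention; the ‘teter’ standard and the speed-up/slow-down factors are invented for illustration:

```python
# Hypothetical: suppose light's "true" speed, measured against an independent
# standard (teters), changes. The 299,792,458 figure is the real SI convention;
# everything else here is invented for illustration.
C_SI = 299_792_458            # m/s, fixed by definition

def metre_in_teters(true_speed_teters_per_s):
    # The metre is *defined* as the distance light travels in 1/299,792,458 s,
    # so the metre's length (in teters) tracks light's true speed.
    return true_speed_teters_per_s / C_SI

for factor in (1.0, 1.2, 0.8):          # light "really" speeds up / slows down
    true_speed = 1_000_000 * factor     # teters per second (hypothetical)
    metre = metre_in_teters(true_speed)
    measured_c = true_speed / metre     # what we would measure, in metres/second
    print(f"true speed {true_speed:>12,.0f} teters/s -> "
          f"metre = {metre:.6f} teters, measured c = {measured_c:,.0f} m/s")

# measured_c comes out as exactly 299,792,458 m/s every time:
# the metre stretches along with light, just like the carrot.
```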

IF the speed of light is to be defined as constant for all observers, then there is no option but for distance and time to vary between observers. That is what the Lorentz transformations are all about: time dilation and length contraction.
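For reference, the textbook form of those relations (standard special relativity, with v the relative velocity between the observers):

$$\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad \Delta t' = \gamma\,\Delta t, \qquad L' = \frac{L}{\gamma}$$

Moving clocks run slow by the factor γ and moving rods contract by the same factor, which is exactly what is needed for every observer to measure the same c.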

But in reality, it is an incoherent ontology, as was brought out in the Stopped Clock Paradox.

The speed of light used to be measured by comparing it to a standard distance and time. Now it’s been turned backwards: we take the speed of light as the standard and derive distances from it. I think that is RS’s point: our standard speed is changing, so our measurements are wrong. But it’s been written in the book of dogma, so it can’t be challenged.

Einstein wrote a book on Relativity; there are a couple of dense bits, but it’s pretty approachable and quite compact.

The only reason it can be spoken of as true by definition is that it was mathematically derived, and the predictions it makes turn out well (at appropriate scales). And metres are indeed not constant, in the theory. Kindasorta.

While I’m recommending books, Kuhn’s “Structure of Scientific Revolutions” is a good read that’s passingly related to the subject :slight_smile:

The postulate that the speed of light is constant for all observers was not mathematically derived. Einstein specifically stated, “IF what you have told me [concerning the speed of light] is true, THEN… relativity… simultaneity… issues…”, which he admitted he could never fully work out.

You cannot arbitrarily choose for something to be definitionally true in your ontology if you want it to exactly fit reality, the same as with the Earth being the center of the solar system. You can define it that way, but after enough measurements are made, it becomes a useless ontology.

It’s not a question of relativity. :-"

The “constancy of the speed of light for all observers” certainly is.
That is usually what they mean when they are talking about the constancy of the speed of light.
Everyone knows that the speed of light “in a vacuum” is defined as a particular speed.
The only problem with that is defining a vacuum, since a truly total vacuum cannot exist.

The “absolute speed of light in an otherwise total vacuum”, is most definitely a fixed speed that determines all other speeds, sizes, and pretty much everything in the universe. That one can be referenced by definition without ever running into ontological problems.

There is a reason that light travels at that one speed.

It got measured. Years later it was measured again and they got a different value. Again years later, they got another value.

The variation between those measured values is not explained by experimental error.

However, the scientific community has decided that there is one true value and that it is not changing naturally.

If they got different calculations for the speed of light in a total vacuum (I would be very interested in how they did that), they could still merely declare a set value. They could make it “1.3 mph” and then have literally all other measurements restructured around it.
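A minimal sketch of that restructuring in Python. The “1.3 mph” figure comes from the paragraph above; everything here is hypothetical unit bookkeeping, not a real proposal:

```python
# Hypothetical bookkeeping: declare the speed of light to be exactly
# 1.3 "new miles" per hour and rescale every length to match.
C_METRES_PER_S = 299_792_458.0     # current SI defined value
C_NEW = 1.3 / 3600.0               # new defined value, in new-miles per second

# One "new mile" must be the distance light covers in (1 / C_NEW) seconds:
NEW_MILE_IN_METRES = C_METRES_PER_S / C_NEW     # ~8.3e11 m

def metres_to_new_miles(d_metres):
    # Every length is restated in the rescaled unit; all ratios are preserved,
    # so no physical prediction actually changes.
    return d_metres / NEW_MILE_IN_METRES

print(f"1 new mile = {NEW_MILE_IN_METRES:.3e} m")
print(f"Earth-Sun distance = {metres_to_new_miles(1.496e11):.6f} new miles")
```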

It is unquestionable that EM propagation has one very exact speed that determines all other measurements. So there isn’t a problem with them doing that as long as they don’t try to change too many common measurements too quickly into new standards.

This is his claim:

sheldrake.org/experiments/constants/

This does seem to get into the relative sloppiness of modern physics compared to RM. There are several things going on concerning their definitions that are disturbing; the actual SI chain is sketched for reference after the list.

  1. They have defined a second in terms of oscillations, which inherently involves the speed of light and length.
  2. They have, since 1983, defined the speed of light in terms of seconds and length.
  3. The above sets length to make up for any actual variations, which makes length a variable.
  4. They have defined the permittivity of space in terms of charge force and length.
  5. They have defined force in terms of mass, length, and seconds (acceleration).
  6. They have defined mass in terms of a chosen sample force of weight.
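For reference, the actual SI chain (standard definitions current as of this discussion, stated here for context rather than taken from the list above):

$$1\ \text{s} \equiv 9{,}192{,}631{,}770\ \text{periods of the Cs-133 hyperfine transition radiation}$$

$$c \equiv 299{,}792{,}458\ \text{m/s (exact, since 1983)}, \qquad 1\ \text{m} \equiv \text{the distance light travels in}\ \tfrac{1}{299{,}792{,}458}\ \text{s}$$

$$\varepsilon_0 = \frac{1}{\mu_0 c^2}, \qquad \mu_0 \equiv 4\pi \times 10^{-7}\ \text{N/A}^2$$

and the kilogram is (at the time of writing) still defined by the platinum-iridium prototype artifact, which is roughly what item 6 is pointing at.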

The permittivity of space will vary with the region of space, which directly affects the actual speed of light. But it would take me a while to figure out whether they have violated coherency or actually hidden true measurement in all of that defining. I suspect they have. Lately they seem to love making circular definitions.

In RM:AO, the maximum propagation of affect (the speed of light in an absolute vacuum) is exactly 1 toe/tic, without any other dependencies. If I could know the actual permittivity of this region of space (how much the speed of affect is slowed in this non-absolute vacuum region) and the defined structure of Cesium-133, I could translate that into meters per second depending on which definitions they stick to.