
Evolution, climate change, and the multiverse sold together now in one convenient package

It all works. Everything is true and everybody is right, except the people who say everything can’t be true and everyone can’t be right.

In a news roundup at Not Even Wrong, Peter Woit mentions “Controversy in Science: When Scientists Disagree, What’s the Journalist to Do?”, featuring Paula Apsell, WGBH/NOVA & KITP Science Journalist in Residence (here), saying,

A couple weeks ago the KITP hosted a talk by Nova’s Paula Apsell, their Journalist in Residence, entitled Controversy in Science. She covered the topics of Evolution, Climate Change, and the Multiverse. Go to about 43 minutes into the program for the segment on the multiverse, which dealt with Brian Greene’s hour-long program on the subject. David Gross objected strenuously to the program and how it was made, criticizing it for not distinguishing solid science from speculation, being manipulative and not seriously presenting the arguments of opponents. Gross explained that he had been interviewed for four hours for the program, but what went on the air was virtually all Brian’s point of view, with only a short bit from him which he felt didn’t represent his arguments. Joe Polchinski however thought it was just fine…

What is a journalist to do?

Well, more relevant here is what a journalist is not to do: It’s okay to be admittedly partisan. What’s not okay is claiming to be non-partisan while acting as an agent for one side. That’s why the legacy media are tanking faster than a horseshoe in the swimming pool.

See also: How to confront Darwinism when spun through mainstream media


4 Responses to Evolution, climate change, and the multiverse sold together now in one convenient package

  1. Hence the relevance of the IOSE. (Though this does not take on the climate change issues; my best advice here is to realise that computer simulations and temperature proxies are one step removed from actual observation, and that even arriving at a valid, credibly correct average value of atmospheric temperature is quite hard to do. My best policy advice is to not undertake costly actions on simulation results or the like alone.)

  2. KF,

    Speaking of your IOSE . . . I was looking it over again and I was wondering . . . your definition of S:

    S = 1 or 0 according as the observed configuration, E, is on objective analysis specific to a narrow and independently describable zone of interest, T:

    I shall admit to not being the sharpest knife in the drawer by any means, but could you spell this out for a particular example?

  3. Jerad,

    You have again chosen to raise something tangential and off-topic for this thread, a continuation of side-tracking tactics warned against in the previous thread.

    In addition, the tangential comment is loaded.

    For, you write as though the meaning of the dummy variable S and how its value is set is not explicitly explained where it is raised in the IOSE. (You do not give the relevant link.)

    {nb: In brief, for onlookers, S DEFAULTS to 0, the value that in effect forces the determination, chance. S only comes up in a highly contingent situation, so the first default relative to the design explanatory filter, is already eliminated for the expression to be on the table:

    Chi_500 = I*S – 500, bits beyond the solar system threshold.

    At no point above do you acknowledge that there are nine bullet points immediately following the deduction of the expression, relevant to the interpretation of S and its significance. S is about specificity, and only when there is objective warrant relevant to the case, will it move from the default, 0, to 1, denoting specificity. Where also, if something is specific but not complex enough — here, on the gamut of the solar system’s 10^57 atoms and 10^17 s — the value of Chi_500 will be in the zone: plausibly reachable by chance, per a needle in the haystack threshold that is discussed in the context preceding, on why 500 bits is a good threshold. Notice in particular the difference between a case where 501 coins are in no particular order and their being arranged per the ASCII code for a message in English. (Observe the specification in the latter case.)}
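The nb remarks above can be illustrated with a minimal sketch. This is not KF’s own code — it is a hypothetical worked-through version of the comment’s 501-coin example, assuming I is counted as one bit per coin and S is assigned by hand, staying at its default of 0 unless an independent specification (such as ASCII English text) has been objectively identified:

```python
# Sketch (not from the IOSE itself): the comment's metric
#   Chi_500 = I*S - 500, bits beyond the solar-system threshold,
# applied to its own 501-coin illustration. Assumptions: I is one
# bit per coin; S is set manually per the described default rule.

def chi_500(i_bits, s):
    """Return I*S - 500; positive values pass the 500-bit threshold."""
    return i_bits * s - 500

# Case 1: 501 coins in no particular order -> no specification, S = 0.
# The metric stays at the default, i.e. the 'chance' determination.
print(chi_500(501, 0))  # -500

# Case 2: the same 501 coins arranged per the ASCII code for an
# English message -> specified, so S moves to 1.
print(chi_500(501, 1))  # 1, just past the 500-bit threshold
```

On this reading, the two cases differ only in the hand-assigned S, which is exactly the point Jerad presses below: the work of the metric is done by how S is justified case by case.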

    Given what has already been happening, this looks like a rhetorical tactic of distraction, rather than a responsible on-topic comment for this thread.

    Please do better than this.

    On the topic of this thread, there is a serious problem of manipulative popular presentations of scientific topics including especially on origins. Notice, what News draws attention to in the OP by clipping:

    A couple weeks ago the KITP hosted a talk by Nova’s Paula Apsell, their Journalist in Residence, entitled Controversy in Science. She covered the topics of Evolution, Climate Change, and the Multiverse. Go to about 43 minutes into the program for the segment on the multiverse, which dealt with Brian Greene’s hour-long program on the subject. David Gross objected strenuously to the program and how it was made, criticizing it for not distinguishing solid science from speculation, being manipulative and not seriously presenting the arguments of opponents. Gross explained that he had been interviewed for four hours for the program, but what went on the air was virtually all Brian’s point of view, with only a short bit from him which he felt didn’t represent his arguments. Joe Polchinski however thought it was just fine…

    In short, having sacrificed half a workday plus overhead time to support the interview, he was strawmannised and reduced to a caricatured snippet. (Multiverse speculations are not even properly science, as there is an absence of observational data. They are little more than philosophy done in a lab coat. In such a context, it is VITAL that serious alternatives sit at the table of comparative difficulties so that the onlooker can see the balance on the merits for himself. Otherwise the Newtonian rule of sensible scientific reasoning, that empirical data should control conclusions, not speculations, is stood on its head.)

    That sort of thing is why an independent, balancing initiative that covers a relevant range of such topics is relevant.

    KF

  4. KF,

    I’ve read your material and I just can’t make sense of things like:

    S DEFAULTS to 0, the value that in effect forces the determination, chance. S only comes up in a highly contingent situation, so the first default relative to the design explanatory filter, is already eliminated for the expression to be on the table

    Which is why I’m asking for some examples. As I said, I’m not the brightest person, so I think it’s fair to ask for some clarification since you put weight on this material. Sometimes, it almost sounds like you’re assuming your conclusion:

    S goes to 1 when we have objective grounds — to be explained case by case — to assign that value.

    Which kind of defeats the purpose of having a metric, doesn’t it?

    And sometimes what you say just doesn’t make sense to me:

    A string at random is a list with one member, but if we pick it as a password, it is now a zone with one member.

    Where also, if something is specific but not complex enough — here, on the gamut of the solar system’s 10^57 atoms and 10^17 s — the value of Chi_500 will be in the zone: plausibly reachable by chance, per a needle in the haystack threshold that is discussed in the context preceding, on why 500 bits is a good threshold.

    Which is why I’d like to see a worked-out example where you show how all the values are determined and the final conclusion is arrived at. An applied case, as it were. That would really help me to understand what you’re getting at.

    In short, having sacrificed half a workday plus overhead time to support the interview, he was strawmannised and reduced to a caricatured snippet.

    It happens to the just and the unjust. What can you do, eh?
