Wyman wrote:Let me take another crack at this. There are two, at least, components to my argument that physicists do more than calculate.
I agree. There is no point in a physicist's education at which they stop being human beings.
Wyman wrote:First, there's the more obvious psychological point that coming up with new conjectures, interpretations and proofs requires more than mere calculation.
Thomas Kuhn is the usual reference point here. He argued that there is 'normal' science, which is how most professionals spend their careers and involves working with the concepts and mathematical tools of the current paradigm. A large part of the job is calculating. But, said Kuhn, there is also revolutionary science, which becomes necessary when inconsistent data accumulates to the point where the consensus about the paradigm breaks down and scientists start looking for a new model. Kuhn has been criticised for being too conservative: he appears to believe that scientists all accept the current paradigm, which is more or less the same belief that WanderingLands has, except they have twisted it slightly so that the scientific 'elite', for some reason, are defending a paradigm they know to be untrue. I see that Greylorn Ell has posted a similar thesis. The truth is a bit more complicated: if you want results, you use the mathematical tools that work best. You can believe anything you like about what 'reality' is like, but if the sums don't add up, it is useless.
For example, the Copernican revolution changed our view of the world as much as Darwin or Einstein did, and it very nearly happened 1,500 years earlier. Aristarchus of Samos proposed the same heliocentric model as Copernicus. We only really know about it because Archimedes, who in anyone's book is a proper scientist, mentioned it in a book called 'The Sand Reckoner'. That book demonstrates a way, basically exponentiation, to express numbers so large they have no practical application: for instance, how many grains of sand it would take to fill the universe. Archimedes explains to King Gelon of Syracuse, to whom the book is dedicated, that most people are familiar with the geocentric model described by Aristotle, but that there is another model, described in a book by Aristarchus of Samos, which depicts the cosmos as considerably larger. Aristarchus had worked out that the sun is a lot larger than the Earth and argued that it is improbable that the larger body should orbit the smaller. We know this is true, and clearly Archimedes was sympathetic, but the Aristotelian model had been developed mathematically by Eudoxus and Callippus, two brilliant mathematicians, and the predictive power of their models wasn't matched by Aristarchus. With a few tweaks the Aristotelian model became the Ptolemaic one, which is perfectly adequate for naked-eye observations. The metaphysics, the philosophical model on which the mathematical model is based, can be complete bollocks; if the maths tells you what you want to know, use it. Shut up and calculate.
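Archimedes's trick, for anyone curious, was to build numbers in 'orders': everything below a myriad-myriad (10^8) is first order, and bigger numbers are counted in powers of that base. A toy sketch in Python (mine, not anything from the thread; the function name and the structure are just illustrative):

```python
# Illustrative sketch of the 'orders' scheme from The Sand Reckoner.
# Numbers up to a myriad-myriad (10**8) are "first order"; beyond that,
# each further power of 10**8 starts a new order.

MYRIAD = 10**4          # Greek 'myrias', ten thousand
UNIT = MYRIAD * MYRIAD  # 10**8, the base of Archimedes's orders

def order_of(n: int) -> int:
    """Return the 1-based 'order' of n: 1..10**8-1 is first order,
    10**8..10**16-1 is second order, and so on."""
    order = 1
    while n >= UNIT:
        n //= UNIT
        order += 1
    return order

# Archimedes's famous upper bound: fewer than 10**63 grains of sand
# would fill the (much larger, Aristarchan) universe.
grains = 10**63
print(order_of(grains))  # prints 8: 10**63 lies in the eighth order
```

The point of the scheme is that a handful of small integers (an order and a count within it) can name a quantity far beyond anything anyone would ever need to count.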
The point about scientists being human is that they will have all sorts of crackpot ideas about what actually makes the universe tick. I don't know if, for instance, Ed Witten genuinely believes that fundamental particles really are strings or whether they can simply (Ha!) be described that way; it's a lot of effort for something you don't believe to be the case, but hey, some people just like really difficult maths.
Wyman wrote:It requires imagination and association, among other things - that goes to my illustration of Feynman's 'Babylonian' method of mathematics.
Which is much more like Feyerabend's methodological anarchism. Yes; people use the tools at hand and sometimes they invent new ones. Some people are labourers, some are craftsmen and some are inventors.
Wyman wrote:Think of doing a proof in geometry, for instance. One does not just calculate, as one would perform long division or multiplication. One 'racks' one's brain for similar 'avenues' of thought from similar problems, similarities of aspects of the models - it quite fits Plato's 'knowledge as recollection' conjecture - much of it is associative in nature.
This is a reference to the slave boy in Plato's Meno (http://classics.mit.edu/Plato/meno.html). The main problem I have with this is that there are no examples, that I am aware of, of people suddenly just 'remembering' things; they generally have to have it demonstrated to them, which corrupts the evidence and makes the conclusion unsound.
Wyman wrote:The second component I'd propose is something like Kant's noumena or Wittgenstein's 'that of which we cannot speak...'
Kant realised that there are a priori concepts, notably space and time, that we invent to structure and order our experience: separate things happen sequentially, and we imagine this happening 'in' something.
I had to read Ayer's Language, Truth and Logic as preparation for my first degree; it nearly put me off. Inspired by Wittgenstein, the Logical Positivists formulated the verification principle: basically, that talk of anything for which there is no empirical evidence is meaningless. But as people on the forum have noted, science makes hypotheses for which there is currently no empirical evidence, and then tries to develop the technology that provides the evidence.
Wyman wrote:Arguing whether gravity is fundamental to Einstein's physics or physics in general is not arguing over whether gravity is 'really' fundamental, but whether gravity is a fundamental (elemental, undefined, simple) term in the system involved - that is, in the collection of rules (axioms, theorems, hypotheses) and the corresponding model or interpretation of those rules.
There is a force; there are different explanations. Whatever your preferred explanation, it makes no difference to the strength of the force.
Wyman wrote:It is the model that is placed up against reality for comparison - agreement or disagreement with observation. But the model may only 'touch' reality at certain very limited points for that comparison. So the model is not coextensive with the reality it is created to describe.
It doesn't have to be for the purposes of science, it only has to agree with the phenomena.
Wyman wrote:Thus, to say that gravity, or anything else is fundamental is like saying that points, lines and planes are fundamental to plane geometry. Yes, they are undefined terms, elemental to the system of axioms, with a one to one correspondence with their model (dots and lines on a piece of paper being one such model).
The model describes (along with elemental relations such as incidence, congruence, etc.), say, most of what you could draw with a pencil on a piece of paper (in a sense, which 'sense' is a huge subject in itself). But it does not make sense, quite, to say that dots and dashes -pencil marks - on paper (rather than points and lines in theory) are fundamental elements of a piece of paper.
Well, since you mention Plato, it is more plausible (but not very realistic) to argue for a realm in which mathematical 'truths' exist, rather than our 'memories'. The axioms of geometry are independent of any actual paper or pencils, whereas the 'truth' of gravity is contingent on there being a universe. Gravity is not the way it is for reasons of logic, it is for some 'physical' reason. We don't know what that reason is, but that hasn't stopped us sending probes to planets millions of miles away.
In one sense, Kuhn was right: a lot of professional science is a dreary slog. Which is why it is better to be a philosopher.