Point: You have failed to answer the question of whether you think you can get to ontology without epistemology.
No matter. I will answer it for you by the end of this post.
uwot wrote: ↑Thu Mar 21, 2019 7:51 am
Well, twice I have suggested that 'Yer might wanna check that'. If you really need a link to a wikipedia page on the Planck scale, here ya go (....)
OK, and then? I understand the contents and the implications of the Wiki page - it affirms my position.
and physicists no longer have any scientific model whatsoever to suggest how the physical universe behaves
Without an epistemic model of reality, I would like you to explain to me how you could arrive at ANY "ontology", interpret anything or say anything about reality beyond phenomenology.
So you really need to be a bit more specific in pointing out the thing that you want me to "check".
uwot wrote: ↑Thu Mar 21, 2019 7:51 am
Logik wrote: ↑Wed Mar 20, 2019 8:19 amIf you make that claim then you have necessarily measured something. That which you are measuring is called "evidence" a.k.a information.
IF you have somehow found a way/mechanism to obtain evidence for your epistemic claim that A is more probable than B then you have, in practice, falsified the "limit" Planck imposes on you.
So what is your evidence for this?
1. Probability Theory: The Logic of Science (by E. T. Jaynes)
https://www.amazon.com/Probability-Theo ... 0521592712
2. Principle of Maximum Entropy (follows from #1):
https://en.wikipedia.org/wiki/Principle ... um_entropy
The principle was first expounded by E. T. Jaynes in two papers in 1957[1][2] where he emphasized a natural correspondence between statistical mechanics and information theory. In particular, Jaynes offered a new and very general rationale why the Gibbsian method of statistical mechanics works. He argued that the entropy of statistical mechanics and the information entropy of information theory are basically the same thing. Consequently, statistical mechanics should be seen just as a particular application of a general tool of logical inference and information theory.
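Jaynes's point can be illustrated with a few lines of arithmetic. A minimal sketch (the distributions below are my own toy numbers, not from Jaynes's papers) showing that, absent any constraint beyond normalisation, the uniform "agnostic" distribution maximises entropy:

```python
import math

def entropy_bits(probs):
    """Shannon/Gibbs entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [1/3, 1/3, 1/3]  # maximally agnostic assignment
skewed = [0.7, 0.2, 0.1]   # hypothetical biased assignment

# With no constraints beyond "the probabilities sum to 1", the
# uniform distribution has the highest entropy -- this is the
# assignment the Principle of Maximum Entropy tells you to adopt.
print(entropy_bits(uniform) > entropy_bits(skewed))  # True
```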
3.Principle of Maximum entropy (agnosticism!) applied in practice:
https://en.wikipedia.org/wiki/Bayesian_inference
Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data
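One round of Bayesian updating is short enough to write out in full. A sketch, where the likelihood numbers are invented purely for illustration:

```python
# Prior: two hypotheses A and B, maximally agnostic at 0.5 each.
prior_a, prior_b = 0.5, 0.5

# Hypothetical likelihoods: the probability of the observed
# evidence under each hypothesis (invented numbers).
likelihood_a = 0.8  # P(evidence | A)
likelihood_b = 0.2  # P(evidence | B)

# Bayes' theorem: posterior is proportional to prior * likelihood,
# then normalised so the posteriors sum to 1.
unnorm_a = prior_a * likelihood_a
unnorm_b = prior_b * likelihood_b
total = unnorm_a + unnorm_b
posterior_a = unnorm_a / total
posterior_b = unnorm_b / total

print(posterior_a, posterior_b)  # 0.8 0.2
```

Each new piece of evidence repeats the same step, with the old posterior becoming the new prior.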
Back-referencing myself for context (my quoted post below describes the process of Bayesian inference, i.e. how scientists think):
Logik wrote: ↑Wed Mar 20, 2019 8:19 am
You have two hypotheses:
A: There is an ontology beyond Planck scale
B: There is no ontology beyond Planck scale
Probability(A) = Probability(B) = 0.5
A/B = 50/50
A:B = 1:1
MeasureDecibel(A, B) = 0 dB (https://en.wikipedia.org/wiki/Decibel)
"There are ontological events beyond Planck scale" is an epistemic claim that A is more probable than B.
The takeaway from the above is that plausible reasoning is quantifiable (i.e. MEASURABLE) on a decibel scale.
Plausibility is a function of MEASURED information. Information is evidence and evidence is information.
This is how empiricism works.
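A minimal sketch of that decibel measure (the function name is mine; the 10·log10-of-odds form is Jaynes's evidence measure):

```python
import math

def evidence_db(p_a: float, p_b: float) -> float:
    """Jaynes-style evidence for A over B, in decibels: 10 * log10(odds)."""
    return 10 * math.log10(p_a / p_b)

# At total agnosticism the odds are 1:1, so the evidence is 0 dB --
# exactly the MeasureDecibel(A, B) = 0 dB case quoted above.
print(evidence_db(0.5, 0.5))  # 0.0

# If measurement shifts the probabilities to 0.8 vs 0.2 (4:1 odds),
# the evidence is about +6 dB in favour of A.
print(round(evidence_db(0.8, 0.2), 2))  # 6.02
```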
4. https://en.wikipedia.org/wiki/Information_theory
Information theory studies the quantification, storage, and communication of information. It was originally proposed by Claude E. Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper entitled "A Mathematical Theory of Communication".
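Shannon's "quantification of information" is concrete arithmetic, not a metaphor. A sketch (my own toy example, not from the paper):

```python
import math

def shannon_entropy(probs):
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin yields exactly 1 bit per flip; a biased coin yields
# less, because its outcomes are partly predictable in advance.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.469
```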
5. The actual paper for #4 above:
http://math.harvard.edu/~ctm/home/text/ ... ntropy.pdf
6. Digital Physics is a new paradigm:
https://en.wikipedia.org/wiki/Digital_physics
7. Physical Information is a unification of ontology AND epistemology - it produces a new language:
https://en.wikipedia.org/wiki/Physical_information
8. Measurement/epistemic/ontological limits:
https://en.wikipedia.org/wiki/Nyquist%E ... ng_theorem
If a function x(t) contains no frequencies higher than B hertz, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart.
A sufficient sample rate is therefore anything larger than 2B samples per second. Equivalently, for a given sample rate F, perfect reconstruction is guaranteed possible for a bandlimit B < F/2.
IF "The Universe" is a wave function, then the Planck length imposes a lower limit on wavelength, and thus an upper limit on frequency: on the order of c/ℓP hertz (since frequency = c/wavelength).
IF The Universe is a wave function with a frequency of c/ℓP hertz, then in order to describe The Universe perfectly you need to sample it at 2c/ℓP, i.e. your measuring instrument needs to have a wavelength of HALF a Planck length. Oops!
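The sampling arithmetic above can be sketched numerically. The Planck length value is the CODATA figure; treating "The Universe" as a bandlimited signal is this argument's own assumption, not established physics:

```python
PLANCK_LENGTH = 1.616255e-35  # metres (CODATA recommended value)

# Nyquist criterion, restated in wavelength terms: to perfectly
# reconstruct a signal, the sampling "wavelength" must be at most
# half the shortest wavelength present in the signal.
def min_sampling_wavelength(shortest_wavelength_m: float) -> float:
    return shortest_wavelength_m / 2

# If the shortest wavelength in "the signal" is one Planck length
# (the argument's assumption), the instrument would need to resolve
# half a Planck length -- which the Planck scale itself forbids.
required = min_sampling_wavelength(PLANCK_LENGTH)
print(required < PLANCK_LENGTH)  # True
```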
And so, at the deepest depths of your own mind, you are left with uncertainty. You are left with this guy rolling around in your head: a qubit (https://en.wikipedia.org/wiki/Qubit) whose precise value you can't quite determine.
Worse yet - what you can't determine is whether information is epistemic or ontological. The conundrum is that in order to DECIDE whether information is epistemic or ontological you need ... evidence to disambiguate the two cases.
Do you see it yet? If not - ask questions.