Heh, of course, that would be too easy.

Scott Mayers wrote: ↑Sun Feb 24, 2019 6:28 pm
Okay, it still might be helpful to understand what you've done in the program, because you could have simply written:
print("A is B: true")
print("A is not B: true")
print("A is A: true")
print("A is not A: true")
What I did was redefine the "==" operator for any object of type "Human" so that it always returns "False".
It's useful to think of operators as mathematical functions: when you see A == B, think f(A, B).
And since it's a function... I am in control of defining its behaviour.
In type-system terms, this f(A, B) is polymorphic: the types of the parameters influence the behaviour of the function at run time. (Strictly speaking, overloading "==" per type is ad-hoc rather than parametric polymorphism, but the principle is the same.)
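A minimal sketch of the kind of program being described (the class name and printed strings are illustrative; in Python the hook behind "==" is the __eq__ method):

```python
class Human:
    """Every comparison involving a Human gives the 'wrong' answer:
    "==" always fails and "!=" always succeeds, even against itself."""

    def __eq__(self, other):
        # Python rewrites A == B as A.__eq__(B): the f(A, B) above.
        return False

    def __ne__(self, other):
        # "!=" has its own hook, so we force it to True as well.
        return True

A = Human()
B = Human()
print("A is B:", A == B)      # False
print("A is not B:", A != B)  # True
print("A is A:", A == A)      # False: A is not even equal to itself
print("A is not A:", A != A)  # True
```

Note that A == A being False is the "paradoxical" bit: once the operator is just a function you control, reflexivity of equality is no longer guaranteed.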
Yeah, no fault of your own there. Higher-order logics are abstractions upon abstractions upon abstractions, but somewhere deep down in the belly of the beast it is all boolean logic.

Scott Mayers wrote: ↑Sun Feb 24, 2019 6:28 pm
At least given I lack knowing the language, you may as well have just printed the above. I can't interpret it as is without knowing some basic functioning definitions of Python, like "__init__", "pass", or rules they define in its 'logic'. [I can create my own 'objects' this way using structures but can't be sure if that has other factors added I'm not aware of.]
Python often gets discussed alongside Hindley-Milner type systems (not exactly: Hindley-Milner describes the static type inference of the ML/Haskell family, while Python is dynamically typed, but whatever, there are religions there on whether the object-oriented or functional paradigm is better). https://en.wikipedia.org/wiki/Hindley%E ... ype_system
The key property that makes all these shenanigans possible is polymorphism: the meaning of an operator depends on the types of its operands.
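To make that type-dependence concrete, here is a hypothetical Fuzzy class (not from the program above, just an illustration) whose "==" behaves differently depending on what kind of value sits on the other side:

```python
from numbers import Number

class Fuzzy:
    """Equality whose behaviour depends on the type of the other operand."""

    def __init__(self, value):
        self.value = value

    def __eq__(self, other):
        if isinstance(other, Fuzzy):
            # Fuzzy vs Fuzzy: equal if the values are close enough.
            return abs(self.value - other.value) < 0.5
        if isinstance(other, Number):
            # Fuzzy vs plain number: round before comparing.
            return round(self.value) == other
        # Anything else: let Python try other.__eq__(self) instead.
        return NotImplemented

print(Fuzzy(1.2) == Fuzzy(1.4))  # True: within tolerance
print(Fuzzy(1.2) == 1)           # True: rounds to 1
print(Fuzzy(1.2) == "1.2")       # False: falls back to identity comparison
```

The same f(A, B) runs three different comparison rules purely because the type of B differs at run time.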
Oh yeah! There are hundreds if not thousands of ways to fool yourself with these beasts.

Scott Mayers wrote: ↑Sun Feb 24, 2019 6:28 pm
Now, I am familiar with timing problems on the level of the components, and it is something I argued a few years ago in an intense debate on "The Monty Hall Problem". Because higher-order languages can 'trick' even the programmer into thinking they haven't cheated, there resides a 'cheat': some processes take twice as long as others on the level of the gates in the chips when minimized to an operation, even though people think all operations take the same amount of time. As such the outcome inappropriately comes out as what they expect, without realizing that the design to "test" the theorem is fixed to the expected result.
Yep. In theory there is no difference between theory and practice, but in practice there is; so in inventing boolean logic we are actually creating an illusion that we have paused time. It's kinda hard to maintain that illusion at large scale.

Scott Mayers wrote: ↑Sun Feb 24, 2019 6:28 pm
Is this something you are referring to? For those initial logic laws though, those comparisons are 'without time' when on paper.
Without turning this into a nit-picking session, that's pretty much it. The concept of atomicity (or transactionality) in computer systems is the ASSUMPTION that 17 different operations take place at the exact same time, and that NONE of them failed or produced unexpected results.

Scott Mayers wrote: ↑Sun Feb 24, 2019 6:28 pm
That was a reason for specifically calling implication a "conditional", for instance: to avoid the interpretation as a 'cause', just as the 'assignment operator' for computers is "=". Similarly for "and": it doesn't mean "and then" as we use it in regular language, which begs time between the events/variables. You might want to test something logical about time itself, and so you don't want the logical material of the variables to use time to prove something about itself, which creates a problem.
Sometimes that's possible, where the work is parallelizable; but if it is not, you have to serialize the events, in which case the whole thing doesn't take time X, it takes 17*X.
Those problems that you have identified at the component level turn into very interesting sets of problems at the software level.
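For instance, the textbook "lost update": two logical threads each perform a read-modify-write on a shared counter, but their steps interleave. The interleaving is simulated deterministically here (explicit steps rather than real threads) so the failure is reproducible:

```python
# Deterministic simulation of a lost update: two "threads" each do
# read-modify-write on a shared counter, with their steps interleaved.
counter = 0

# Thread A reads the counter...
a_local = counter          # A sees 0
# ...then thread B runs its entire increment before A writes back.
b_local = counter          # B also sees 0
counter = b_local + 1      # counter is now 1
# A resumes with its stale read:
counter = a_local + 1      # counter is 1 again: B's update is lost

print(counter)  # 1, not the 2 you'd expect from two increments
```

The increment looked like one atomic operation in the source code, but it was really three (read, add, write), and the illusion broke the moment somebody else's three steps interleaved with ours.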
In order to maintain the illusion of consistency we have to use a LOT of duct tape. And most importantly - we have to pretend that we have control over time. Which is why we have the concept of a system clock. But of course that is just duct tape.
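Python happens to expose that duct tape directly: the wall clock (time.time) can be yanked around by NTP or a manual adjustment, so elapsed time is measured with the monotonic clock, whose only promise is that it never runs backwards:

```python
import time

# time.time() reports wall-clock time and may jump in either direction
# when the OS adjusts the clock; time.monotonic() only ever moves forward,
# so it is the right tool for measuring durations.
start = time.monotonic()
time.sleep(0.01)            # stand-in for some real work
elapsed = time.monotonic() - start

print(elapsed > 0)          # True: monotonic differences are never negative
```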
When you start working with distributed systems you then have to start worrying about clock synchronization, and there is just no getting away from geographic latency (that pesky speed of light!), so maintaining the illusion of consistency in distributed systems is an interesting problem-space...