## Logical struggles regarding vagueness and partial knowledge

Some of my recent struggles concern logic. In the logic we know, we deal with a binary assessment of a logical value based on the environment of claims surrounding the sentence in question. What does that mean? We could say that a certain sentence is true given certain knowledge about the universe surrounding it. However, gaining more knowledge about that universe could make the same sentence false. In this context we could mention set-theoretic axioms that faced numerous contradictions in the past. An example would be the enumeration of all the integers (a countable set). The enumeration takes place in an environment where we can afford to enumerate them all, and where the structure of the universe does not contradict that capability. If we think of this enumeration as enumeration in naive set theory, i.e. the enumeration of a set of integers, it might be the case that not all the integers are in that set, and so on.
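To make the "enumeration as a process" reading concrete, here is a sketch (my own illustration, not from any particular theory) in which the enumeration never exists as a completed set: at any moment only finitely many integers have actually been produced.

```python
from itertools import islice

# A "finitary" view of enumerating the integers: the enumeration is a
# process that, at any point in time, has produced only finitely many
# values, rather than a completed infinite set sitting in a "bag".

def enumerate_integers():
    """Yield the integers in the order 0, 1, -1, 2, -2, ..."""
    yield 0
    n = 1
    while True:
        yield n
        yield -n
        n += 1

# Take the first few values of the (never-completed) enumeration.
first = list(islice(enumerate_integers(), 7))
print(first)  # [0, 1, -1, 2, -2, 3, -3]
```

The generator makes the "environment" explicit: every integer is reachable, but no state of the computation ever contains them all.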

Dr. Tao refers to “infinitary” and “more infinitary” sets, as well as their corresponding computational requirements, which clearly indicates the problem behind the word “naive” that should perhaps be applied to all the theories we know. Under very specific conditions a certain theory might hold, but at the same time the larger picture might allow us to find a leak within the very theorem. Yet another issue is that the level of formality used to build the model behind a theorem's proof might involve low-level problems that we fail to see. For instance, in the standard axiom system of set theory (ZFC) we currently don't know of any paradoxes; at the same time, it is possible that some exist. Enumerating them is also a matter of tree traversal and partitioning of the problem space, and therefore has its own corresponding computational requirements.
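To make the tree-traversal cost of such a paradox search concrete, here is a toy sketch, entirely my own illustration: the alphabet and the contradiction detector are hypothetical placeholders, not real ZFC machinery. The point is only that the search space branches at every symbol, so the cost grows exponentially with depth.

```python
from itertools import product

# Hypothetical sketch: hunting for a contradiction in a formal system by
# enumerating candidate strings breadth-first (shorter strings first).
# The branching factor of the alphabet makes the search space grow
# exponentially with depth, which is the "computational requirement"
# of the enumeration.

ALPHABET = ["A", "B", "(", ")", "&", "~"]  # toy symbol set, not ZFC syntax

def derives_contradiction(s):
    # Placeholder predicate: a real checker would verify a derivation of
    # "phi and not phi". Here we simply flag one toy string.
    return s == "A&~A"

def search(max_depth):
    """Breadth-first search over all strings up to max_depth symbols.
    Returns the first 'contradiction' found and how many strings were checked."""
    examined = 0
    for depth in range(1, max_depth + 1):
        for symbols in product(ALPHABET, repeat=depth):
            examined += 1
            s = "".join(symbols)
            if derives_contradiction(s):
                return s, examined
    return None, examined

found, cost = search(4)
print(found, cost)  # "A&~A" found after checking 433 candidate strings
```

Even with six symbols and depth four, the search touches hundreds of strings; a realistic proof language would make this astronomically worse.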

Based on the above, it is often the case that objects we create cannot really exist and appear to do so only due to vagueness. Detecting that vagueness requires an in-depth verification of each object used in a proof. Each object is constructed using a language that is very likely to leak more as we learn more about the universe. And it is the set, i.e. the construction that carries “multiple” primitives in a bag existing next to the primitives themselves (numbers), and that is widely used in the language of mathematics, where we have found so many problems. The same applies to understanding the structure of numbers.

All that, combined with Tarski's undefinability theorem and impredicativity, allows more space for rethinking how numbers as primitives and sets as aggregates of primitives should describe the universe as we know it now, i.e. one we know so little about, so that we don't have to rewrite too much in case of fundamental contradictions in our understanding of the universe. An interesting approach to thinking about sets has been presented by Dr. Tao, who recast the Banach–Tarski paradox and Cantor's theorem in a “finitary” manner using the notion of an oracle. It might be the case that we should think about a mathematical tool for finding contradictions within the mathematics we currently use, i.e. an analyser of the formal language used for constructing mathematical objects and proofs.
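A sketch of the finitary, oracle-based reading of Cantor's diagonal argument (my own rendering, not Dr. Tao's exact formulation): any claimed enumeration of 0/1-sequences, treated as an oracle we may query, can be diagonalized against query by query, without ever invoking a completed infinite set.

```python
# Finitary diagonalization: the "enumeration" is just an oracle we query,
# and we construct, one query at a time, a sequence the oracle misses.

def diagonal(oracle):
    """Given oracle(i, n) -> n-th bit of the i-th enumerated sequence,
    return a function computing a sequence absent from the enumeration:
    it differs from sequence i at position i, for every i."""
    return lambda n: 1 - oracle(n, n)  # flip the diagonal bit

# Example "enumeration": sequence i is the binary expansion of i.
def oracle(i, n):
    return (i >> n) & 1

missed = diagonal(oracle)

# Verify finitely many instances of the claim: for each i we check,
# the diagonal sequence disagrees with sequence i at position i.
for i in range(10):
    assert missed(i) != oracle(i, i)
print("diagonal sequence evades the first 10 rows")
```

Nothing infinitary is ever materialized: each verification is a finite computation against the oracle, which is the flavor of the finitary reformulation.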

Question list:

1. Is there a way to describe the motion of planets, assuming there are k >= 2 planets in the example, using the general theory of relativity (GTR)?

2. From Mr. Lipton's blog:

“GLL: How can the same system be complete and incomplete? How can making something stronger make it incomplete?

Gödel: Ach—your words in English are too short. We have longer words so we think more before finishing them. Less confusion.”

The question is: could we redefine all the words and use only very precise wording for whatever we want to state?

3. What are good examples of proofs that take advantage of induction while using an ordering other than the standard integer ordering (n -> n+1)? What about p(n) -> p(n+1)?
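One classical answer here, offered as my own illustration, is the Ackermann function: its termination argument is an induction on the lexicographic ordering of pairs (m, n) rather than a single n -> n+1 step, since every recursive call strictly decreases (m, n) in that ordering.

```python
import sys
from functools import lru_cache

sys.setrecursionlimit(100_000)

@lru_cache(maxsize=None)
def ackermann(m, n):
    """Termination is proved by induction on the lexicographic order of
    (m, n): every recursive call below strictly decreases that pair."""
    if m == 0:
        return n + 1                      # base case of the induction
    if n == 0:
        return ackermann(m - 1, 1)        # (m-1, 1) < (m, 0) lexicographically
    # Inner call decreases n with m fixed; outer call decreases m.
    return ackermann(m - 1, ackermann(m, n - 1))

print(ackermann(2, 3))  # 9
print(ackermann(3, 3))  # 61
```

No single integer measure of the pair decreases at every call, which is exactly why the induction must run over a non-trivial well-ordering.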

4. Could we perform induction using a Turing machine built out of logical statements, with the induction ordering defined by how deep we are in the tree (i.e. using a tree traversal rather than the usual sequential traversal)?
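A minimal sketch of induction ordered by tree depth, i.e. structural induction on the subtree relation (my own illustration): a property is established at every node by assuming it for the children, which sit one level deeper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    value: int
    children: List["Node"] = field(default_factory=list)

def subtree_sum(node):
    """Inductive definition: the sum at a node is its value plus the sums
    of its children. The induction hypothesis is assumed for each child,
    i.e. for nodes one level deeper in the tree."""
    return node.value + sum(subtree_sum(c) for c in node.children)

tree = Node(1, [Node(2), Node(3, [Node(4)])])
print(subtree_sum(tree))  # 10
```

The "ordering" used by the recursion is the subtree relation, not an enumeration of the nodes in a sequence, which is the tree-traversal flavor the question asks about.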

5. In the case of a complex game where it is difficult to define the goal and there is a certain number of constraints that keep us alive in the game (e.g. life), should we always look for implied odds rather than the odds of a particular decision?

6. How is it now possible for a single person to conduct real astronomical research that would connect mathematical modelling with large amounts of data?
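The distinction in question 5 between the odds of a single decision and implied odds can be sketched with hypothetical numbers (every figure below is an assumption for illustration): a call can be losing on the immediate pot alone, yet winning once expected future gains are counted.

```python
# Toy expected-value comparison: pot odds vs. implied odds.
# All numbers are made up for illustration.

def ev_call(p_win, pot_now, call_cost, future_winnings=0.0):
    """EV of calling: with probability p_win we collect the current pot
    plus expected future winnings; otherwise we lose the call cost."""
    return p_win * (pot_now + future_winnings) - (1 - p_win) * call_cost

p = 0.18            # assumed chance of hitting the draw
pot, cost = 100, 30

print(ev_call(p, pot, cost))        # negative: bad by immediate pot odds
print(ev_call(p, pot, cost, 120))   # positive once implied odds are counted
```

The same decision flips sign depending on whether the "environment" of future payoffs is included, which mirrors the question's point about constraints and long-horizon goals.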

Now the second part of this post, as I am a little tired of logic.

Whenever I think of the theorems regarding inequalities (Muirhead, Jensen, weighted power mean, Hölder, rearrangement, Chebyshev, Schur, Maclaurin, majorization, Bernoulli, etc.), I think about three different things:

– how a certain function behaves given a specific “extensive” argument,

– what the types of interesting “extensive” arguments are (this “extensive” argument could be either $A = x_1 + x_2 + \dots + x_n$ or $B = w_1 x_1 + w_2 x_2 + \dots + w_n x_n$); then we could think about whether we can generalize the relation between $f(A^B)$ and $f(B^A)$,

– ways to settle generalizations about the function $f(C) = {{f(A)}\over{f(B)}}$ (or similar) using (human) analytic capabilities, automated (computational) analytic capabilities, or heuristics.
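As a small numeric sanity check tying the first two points together, one can test Jensen's inequality for a convex $f$ against the weighted argument $B$ above (my own sketch, assuming standard convexity of the test functions):

```python
import math
import random

# Numeric check of Jensen's inequality for convex f: with weights w_i
# summing to 1, f(sum w_i x_i) <= sum w_i f(x_i). The weighted sum is
# exactly the "extensive" argument B = w_1 x_1 + ... + w_n x_n.

def jensen_holds(f, xs, ws):
    mean = sum(w * x for w, x in zip(ws, xs))
    return f(mean) <= sum(w * f(x) for w, x in zip(ws, xs)) + 1e-12

random.seed(0)
xs = [random.uniform(0.1, 10) for _ in range(5)]
ws = [random.random() for _ in range(5)]
total = sum(ws)
ws = [w / total for w in ws]  # normalize weights to sum to 1

print(jensen_holds(math.exp, xs, ws))          # exp is convex
print(jensen_holds(lambda x: x * x, xs, ws))   # x^2 is convex
```

Such a check settles nothing, of course, but it is the cheapest "automated analytic capability" on the list: random instances falsify a wrong generalization quickly.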

What comes to mind is that to create such theorems we use:

– analytic (non-combinatorial) analysis of $f(C)$; from there we can proceed towards a more automated analytic approach using problem solvers, and here we would also use some sort of theory-creating tool that would describe a feature of the function and then investigate it for $f(C)$,

– permutations: our brain will only succeed in finding the easiest-to-spot connections, whereas what we need is a tool that will permute and test.

When it comes to the analytic approach, we have manifold notions, including smoothing (and unsmoothing), convexity (and concavity), extrema, constrained extrema, the derivative test, the Hessian test, etc. When it comes to permutations, we have majorization, symmetric sums, etc. It might also be possible to reduce the dimensionality of the problem, i.e. decrease the number of variables involved.
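The "permute and test" idea above can be made literal with a brute-force check of the rearrangement inequality (a minimal sketch of my own): among all pairings of two sequences, the sum of products is maximized when both are sorted the same way.

```python
from itertools import permutations

# Brute-force test of the rearrangement inequality: for positive sequences
# a and b, sum(a_i * b_sigma(i)) over all permutations sigma is maximized
# by the identity pairing once both sequences are sorted.

def max_pairing_sum(a, b):
    return max(sum(x * y for x, y in zip(a, perm))
               for perm in permutations(b))

a = [1, 3, 5, 7]
b = [2, 4, 6, 8]
sorted_sum = sum(x * y for x, y in zip(sorted(a), sorted(b)))

print(max_pairing_sum(a, b) == sorted_sum)  # True
```

This is exactly the tool the bullet asks for: it permutes exhaustively and tests, finding the extremal pairing without any cleverness, though only for small n, since the cost grows as n!.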