## inherent weaknesses of low-parameter-space proofs used for beautiful problems

Will be updated a couple of times

– Euler’s sum-of-powers conjecture (no solutions to $a^4+b^4+c^4=d^4$) was proved false; similarly, the claim that $313(x^3+y^3)=z^3$ has no solutions is false, but its smallest counterexample has more than 1000 digits. The learning: either use the experimental approach to disprove conjectures, or discover new models and new ways of proof, i.e. re-discover learning in cold-start conditions. An example is Goldbach’s conjecture, which holds up to $10^{16}$, yet we still cannot know whether it is true; observe that for many use cases we could use the current result, but mathematics will still need the proofs to develop new ways of building knowledge,
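The experimental approach mentioned above can be sketched in a few lines; a minimal Python check of Goldbach's conjecture over a small range (the bound `LIMIT` is an arbitrary illustration, far below the ranges cited above):

```python
def sieve(n):
    """Return the set of all primes up to n via the Sieve of Eratosthenes."""
    is_p = [True] * (n + 1)
    is_p[0] = is_p[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if is_p[i]:
            for j in range(i * i, n + 1, i):
                is_p[j] = False
    return {i for i, p in enumerate(is_p) if p}

def goldbach_holds(n, primes):
    """Check whether even n > 2 is a sum of two primes."""
    return any((n - p) in primes for p in primes if p <= n // 2)

LIMIT = 10_000  # illustrative bound only; the conjecture is verified far beyond this
primes = sieve(LIMIT)
counterexamples = [n for n in range(4, LIMIT + 1, 2)
                   if not goldbach_holds(n, primes)]
print(counterexamples)  # an empty list: no counterexample below LIMIT
```

The catch described above applies verbatim: an empty list for any finite `LIMIT` disproves nothing, which is exactly why the proofs are still needed.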

– axioms are used for modelling in cold-start conditions and, as such, are just preliminary observations, and therefore should be re-modelled relatively fast; logic allows building on explicit feedback, but claims built on top of current models should also be further investigated, in iteration. A first element of learning could exploit multiplication with, e.g., a re-modelled zeta function (due to the distribution of its zeros), and then we could project the outcome hyperplane into a clustering method that would allow efficient handling of sparsity,

– proving P implies Q is done either directly (“assume P, show that Q follows”) or via the contrapositive (“assume not Q, show that not P follows”); how can this process be made more effective? Observe that proving P iff Q is done either by proving both directions (P implies Q and Q implies P) or with a chain of iff clauses, and such a chain could also be used for implications. The key is knowing how to decompose a cold-start (difficult) problem so that the chains cover the search space effectively; choosing such a chain can easily get stuck in a local minimum,
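The chaining idea above can be made concrete in a proof assistant; a minimal Lean 4 sketch, where `P`, `Q`, `R` are placeholder propositions and `R` stands for the intermediate link of the chain:

```lean
-- Proving P ↔ Q through an intermediate statement R,
-- composing the chain P ↔ R ↔ Q by transitivity.
example (P Q R : Prop) (h₁ : P ↔ R) (h₂ : R ↔ Q) : P ↔ Q :=
  h₁.trans h₂

-- The same chain technique for implications: P → R and R → Q give P → Q.
example (P Q R : Prop) (h₁ : P → R) (h₂ : R → Q) : P → Q :=
  fun hp => h₂ (h₁ hp)
```

The hard part, as noted above, is not composing the chain but choosing the intermediate statements so that the chain covers the search space.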

– to prove a claim, we sometimes decompose it into cases (when solving those cases seems easier than directly finding a generic solution). Such attempts can be used in a Gauss-style fashion, i.e. via mathematical reverse engineering: we may be able to find an interesting result, disclose an observation, conjecture a claim, detect potential constants, or exploit learnings from the structure of the solution, in order to further build knowledge and seek an analytic solution,
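The Gauss-style reverse engineering above can be sketched as a three-step loop: generate data, conjecture a closed form from its structure, then stress-test the conjecture before attempting a proof (the classic partial-sum example is used here purely as an illustration):

```python
# Gauss-style reverse engineering: compute values first, inspect their
# structure, then conjecture an analytic solution and test it.
def partial_sum(n):
    """Brute-force value of 1 + 2 + ... + n."""
    return sum(range(1, n + 1))

# Step 1: collect experimental data and look for structure.
data = [(n, partial_sum(n)) for n in range(1, 11)]

# Step 2: conjecture a closed form suggested by the structure, n(n+1)/2.
def conjectured(n):
    return n * (n + 1) // 2

# Step 3: test the conjecture on fresh cases; a survivor is a proof target,
# not a proof -- induction would finish the job analytically.
ok = all(partial_sum(n) == conjectured(n) for n in range(1, 1000))
print(ok)  # True
```

The point is the direction of travel: the solved cases are not the goal, they are raw material for conjecturing the analytic solution.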

– given that easy problems will be solved by machines with logical chains, and more complex solutions can to some extent be enumerated (though their proofs are still needed to learn more about knowledge building), we should exploit more effective ways of decomposing ifs and iffs. A good start is to understand for which problems we use the “multiple cases” method: those with an easy-to-spot informative feature. For problems that seem uninformative and irregular we can search the parameter space using more sophisticated techniques, whose most common failure mode is getting stuck in local minima due to the assumption that we will find solutions “in similar places”; for truly beautiful problems this may be far from the truth,
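The “in similar places” failure mode can be illustrated with a small sketch: a greedy local search stuck near its starting point versus spread-out restarts (the objective function, step sizes, and grid are invented for illustration):

```python
import random

def objective(x):
    """Multimodal toy function: local minimum near x = 2, global near x = -2."""
    return (x * x - 4) ** 2 + x

def hill_climb(x, step=0.05, iters=4000):
    """Greedy local search: only moves to nearby ('similar') improving points."""
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        if objective(cand) < objective(x):
            x = cand
    return x

random.seed(0)
# Searching only near x = 2 stays trapped in the local minimum.
stuck = hill_climb(2.0)
# Spread-out restarts drop the 'similar places' assumption and find the basin
# of the global minimum as well.
starts = [-3 + 0.6 * i for i in range(11)]  # grid over [-3, 3]
best = min((hill_climb(s) for s in starts), key=objective)
print(round(stuck, 1), round(best, 1))  # stuck near 2, best near -2
```

For beautiful problems the analogy is the restart schedule itself: nothing guarantees the grid covers the right basin, which is the author's point about solutions not living “in similar places”.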

– proof by contradiction, i.e. assuming the negation of the claim and showing it cannot be the case, could, given the parameter space of the problems we tackle, be used in a more direct approach: when re-building a new model based on iteratively changing axioms (we believe that axioms are not obvious), we re-evaluate our statements, and if we find a contradiction, we further investigate that part of the model to disclose the informative feature leading to the error, and we eliminateate it from the model,
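The loop above, detect a contradiction, locate the offending axioms, eliminate them, can be sketched over a toy propositional model; the axiom set and variable names below are invented purely for illustration:

```python
from itertools import product

# A toy 'model': axioms as boolean predicates over two variables (a, b).
axioms = {
    "a_holds": lambda a, b: a,
    "a_fails": lambda a, b: not a,   # directly contradicts a_holds
    "b_holds": lambda a, b: b,       # innocent: not part of the contradiction
}

def consistent(axiom_set):
    """True iff some assignment of the variables satisfies every axiom."""
    return any(all(f(a, b) for f in axiom_set.values())
               for a, b in product([False, True], repeat=2))

# Step 1: re-evaluate the model and detect the contradiction.
print(consistent(axioms))  # False: no assignment satisfies all three

# Step 2: investigate which axioms, once eliminated, restore consistency;
# the innocent axiom b_holds is correctly left off the list.
removable = [name for name in axioms
             if consistent({k: v for k, v in axioms.items() if k != name})]
print(removable)  # ['a_holds', 'a_fails']
```

Brute force over assignments is only viable for toy models, but the shape of the iteration, re-evaluate, localize the informative feature, eliminate, is the same one described above.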