Thursday, July 27, 2023

Tail Risk Hedging using Options



Moreover, since options are path dependent (in both first-order and higher-order path dynamics), any tail-hedging strategy will carry residual hedging risk along unfavorable tail paths. There is also the opposite risk: if favorable tail events simply fail to occur for years, the hedger is exposed to excessive carrying cost. But yes, such hedges are quite helpful anyway!

Wednesday, July 26, 2023

Science Behind Spirituality


1) Science & Spirituality as Sets.
In one way, modern science tries to look at the world through the causality of space-time. It treats the human being as a subset of the Universe, whereas spirituality typically emphasizes the view that the external Universe exists as a subset of the human mind and lies within it. In fact these two interpretations are interlinked, two sides of the same coin that need to be understood conceptually. There is a famous paradox in logic, named after Bertrand Russell and known as Russell's paradox, which limits the scope of formal logic and even the axiomatic foundations of mathematics, paving the way for metamathematics.
Russell's paradox concerns the set of all sets that are not members of themselves: that set is a member of itself if and only if it is not a member of itself. It is related to the Liar paradox in logic, where the sentence "I am lying" is true if and only if it is false.
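For reference, a minimal formal statement of the paradox (standard set-builder notation, nothing specific to this post):

$$R = \{\, x \mid x \notin x \,\} \quad\Longrightarrow\quad \big(\, R \in R \iff R \notin R \,\big)$$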
In the spirit of Russell's paradox, take the two sets A ("I") and B (the Universe). Observing A as a subset of B is the stance of science, and observing B as a subset of A is the stance of spirituality; each containment undermines the other. In metaphysical geometry they are just two sides of the same coin, geometrically interlinked.
2) Determinism Vs Uncertainty: Scientific Role of God
If we go deeper and deeper into science, we find fundamental uncertainty, especially in the microscopic quantum world. The laws of science at that scale are random in nature rather than deterministic. This is a fundamental characteristic of the natural world we live in. Modern science still has little clue as to why the laws seem so probabilistic, or what drives the world at the microscopic quantum scale. Albert Einstein, throughout his life, could not accept that the scientific world is non-deterministic, that God plays dice with us. But his belief has been refuted over time as quantum science has developed and has admitted the role of fundamental uncertainty in the Universe.
The big question is: is God the real source of that inherent uncertainty which lies beyond human understanding? Is uncertainty the pipeline through which God dictates the scientific world? It has been seen as a pathway for the existence of God.
3) Importance of Human Brain & Consciousness
In modern science, in contrast to Albert Einstein, Niels Bohr believed in the principle of complementarity, which implies that it is impossible to understand the scientific functioning of the Universe completely until we understand how the equipment (our own brains) perceives the Universe. Science is ultimately limited by this understanding of our own mysterious human brains, through which the external world is encountered. Ultimately, modern science is trying to understand the human brain and consciousness as the way to understand the Universe.
4) Interconnectedness of the Universe & the Scientific Role of God: One of the greatest mysteries in science is nonlocality, often called "spooky action at a distance," formalized by John Bell in 1964 in what is known as Bell's theorem. It has puzzled scientists for decades that two entangled particles, no matter how far apart, show correlated measurement outcomes, and this has been experimentally verified. Einstein favored hidden-variable theories, the idea that some mechanism beyond present human understanding governs this interconnectedness in the Universe, though Bell's theorem rules out any such mechanism that is purely local.
5) Paradox of Scientific Consciousness & Connection with the Foundation of Spirituality
The most important challenge for science, which limits its growth so far, is the lack of understanding of consciousness. What is consciousness? How do we sense the existence of the Universe? What is "I"?
Science has so far found no single, precise place in the human brain that senses "I" (the self). This is the paradox of self-consciousness.
There are, technically, two sets: "I" and "Universe".
"I" exists as a subset of the set "Universe", and yet the set "Universe" exists inside "I" when it is experienced. This logical and mathematical paradox, of the same form as Russell's paradox, can only be resolved technically if "I" = "Universe". This is the ultimate core of all spiritual principles: one should not view the "self" as an entity different from the Universe/Nature; rather, the "self" itself is the Universe, which is the answer to the famous spiritual question "Who am I?" The sense of being a separate entity is a delusion of consciousness. This is what spirituality means by "God is in everything. God is in every being." In fact God is not an entity but that state of consciousness in which all these spatial differences of physical separation as entities vanish and converge into one wholeness. All are One!

Monday, July 24, 2023

Gödel's Incompleteness Theorems in Financial Markets

Gödel incompleteness: there will always be some risk, call it self-risk, that cannot be hedged. Bring in some new derivative, and there will still be some risk it cannot hedge. This is in the spirit of the Cantor diagonalization argument underlying Gödel's proof.
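As a toy illustration of the diagonal construction invoked above (purely schematic, my own sketch and not a market model; the coverage table and the reading of the diagonal profile are illustrative assumptions): encode each hedging strategy as a 0/1 row saying which scenarios it covers, and flip the diagonal, so the resulting scenario profile disagrees with every listed strategy somewhere.

```python
def diagonal_scenario(strategies):
    """Return a 0/1 scenario profile that differs from strategy k
    exactly at scenario k, so no listed strategy matches it."""
    return [1 - strategies[k][k] for k in range(len(strategies))]

# Hypothetical coverage table: strategies[k][j] == 1 means strategy k
# covers scenario j.
strategies = [
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [1, 1, 1, 0],
]
print(diagonal_scenario(strategies))  # -> [0, 0, 0, 1]
```

However long the list grows, the same construction yields a profile outside it, which is the sense in which diagonalization suggests an unhedged residue always remains.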

Sunday, July 23, 2023

Gödel's Incompleteness & Relevance for Law & Constitution

Disclaimer: This article is strictly for educational and research purposes only. It considers all parties equally, without bias or favor in any form, so as to remain free from controversy of any sort. The author has due respect for all parties equally! Thank you.


Is our Constitution dynamically in sync with the laws of Nature over time? The foundational problem of self-reference in the constitution of a democracy!
(A random, unorganized article?)
Gödel's incompleteness theorems are among the most important discoveries in the history of mathematical logic; they blew up Hilbert's utopian dream of formalizing the entire axiomatic system of arithmetic.
This issue is so fundamental, philosophically and logically, that I deeply believe it is important for other areas too, including law, language, and social life.
In this brief article (I am a bit lazy to write often, my drawback!), I would like, for the benefit of our system at large, to explore the possible implications for the law and Constitution of India in particular, though the argument could be applied to any democracy around the world.
Before I start, I should state what Gödel's incompleteness theorems say: any formal axiomatic system of arithmetic is either incomplete or inconsistent. If we go deeper into the proofs of these theorems (say, through Cantor's diagonalization argument), they rest upon self-referential statements.
Self-referential statements are of extreme importance in other fields as well. Here we shall look into their relevance for law and the Constitution.
Any constitution is, mathematically, a set of articles and clauses, comprising various words and phrases that state the rules. Now, the same constitution also contains amendment articles, with the power to amend the constitution from time to time as and when dynamically required.
The US Constitution has Article V as its amendment clause; the Indian Constitution has Article 368 in Part XX for the same purpose.
Talking specifically about the Indian context: say Parliament is given the power to amend the constitution as and when required, based on a special majority of, for example, two-thirds. If so, a particular party holding a sufficient majority in Parliament can amend the constitution in its own favour! Imagine it amends the amendment article itself, Article 368 of the Indian Constitution. In that case it could degrade the constitution to favour itself and even try to convert the democratic system into a dictatorship, legally. I am talking about an actual possibility, because this is essentially what happened when Hitler legally turned Germany into a dictatorship by subverting its constitution in 1933!
And even in India, during the 1975 Emergency, our democracy arguably moved towards a similar form of dictatorship through the 42nd Amendment, which curtailed basic democratic freedoms and even the power of the judiciary, tilting the balance towards the Prime Minister's office. All such possibilities are legally available within democratic systems.
This self-amendment is a logical paradox in itself, called the paradox of self-amendment, a contradiction in itself!
This problem of entrenchment and dis-entrenchment of the self-amendment section will always arise and is fundamentally unsolvable in any democratic system. It is a foundational constraint on the legal system of any constitution.
The paradox of self-amendment asks: who will amend the amendment clause itself, i.e. Article 368 of the Indian Constitution or Article V of the US Constitution, the very clauses that deal with amendment?
Now there are two possibilities:
1) If there is provision for such self-amendment, Parliament can change the amendment clause itself with the required majority and could possibly turn a democracy into a dictatorship, legally!! It has happened historically!
A): Here I would also like to mention categorically that, the way democratic election laws are structured in India, a single party can attain a majority. The law puts no constraint on how many seats any party can win in a particular election. Also, in India, a party with, say, around 30% of the total vote might still win a majority of seats and form the government. So a party winning 30% of the vote rules over the 70% who did not vote for it!! Is this really a true democracy in principle, or does the basic structure itself need fundamental amendment?
(Let's take this as an example of a basic amendment that may be required in future.)
Coming back to the earlier discussion:
Now let's look at the case of immutability: under the basic structure doctrine, the basic fundamental structure of the Constitution cannot be amended even by a parliamentary majority.
If there is no provision for amending the basics by Parliament in any case, the entire constitution will become outdated over time. Hence this is a serious point of concern, and there must be ways to amend even the basic provisions in future, if required, to stand the test of time! This is immutability!
We must understand that, scientifically, we live in a world of complexity and chaos; that is how the Universe/Nature works.
So the fundamental question is: who will change the basic structure itself, as required over time, as society changes according to the laws of Nature, which are supreme?
So we are left either with paradox or with immutability.
This conflict will always remain, and in fact it has played out between the Supreme Court and Parliament over the 42nd Amendment.
So this self-referential issue is like an unsolvability and undecidability problem in the constitution, analogous to the unsolvability and undecidability problems in mathematical logic!
This issue fundamentally says that certain problems can’t be solved or decided!
In a nutshell, then: how should we frame and design a constitution to deal with self-referential statements, so as to resolve the foundational dilemma of paradox versus immutability in any democratic system?
Society's structure and laws change with time as human behaviour changes along with Nature's system; hence the constitution must also change to stay in sync with them over time, or else it will be proven defunct!
Consider the laws of our ancestors' generation: how relevant are they for the current generation?
Few Practical Future Possibilities :
Imagine that over time democracy is misused by a few groups along lines of caste and religion, and the rules need to be changed to maintain balance; this touches the basic structure of democracy. (Let's understand the sense and the context here, and not get too technical word by word.)
So if democratic laws are misused over time, say a particular community increases its population enough to dominate elections, or Parliament indirectly influences the courts through the selection of judges, then the basics of the constitution might need amendment in future; but that is not allowed, since the basic structure of democracy cannot be changed!! So if the basic laws of democracy themselves create problems through loopholes, and are misused over time by some group, and yet are so immutable that nothing can be done to amend them, would that not become chaotic?
It's a paradox.
The definition of "basic" itself is not absolute but dynamic; it needs to change over time along with society and Nature.
If legal rules that authorize change can be used to change themselves, then we have paradox and contradiction; but if they cannot be used to change themselves (and if there is no higher rule that could authorize their change), then we have immutable rules. Paradox and immutability should create an uncomfortable dilemma for jurists and citizens in legal systems. It appears that we must give up either a central element of legal rationality or a central element of democratic theory.
It is an actual possibility that people could be misguided into voting in ways that are a misuse of democratic principles, so that a single party attains an overwhelming majority and the opposition is somehow driven to the verge of extinction.
Then the situation would be like :
Praying for democratically elected people to save democracy! A singularity point of democracy.
The structure of democracy needs to change over time.
Note that under the present structure, even a party with 30% of the total vote can form the central government with a majority of seats while 70% voted otherwise. Assume this structure needs to be changed because it is a fundamental drawback; in that case, how do we change this basic structure to make democracy strong enough to pass the test of time?
There is a real possibility that in future we will need to make some amendments to the basic structure of democracy in order to protect democracy itself, so that it passes the test of time and of structural social change.
We have to acknowledge scientifically that the world we live in keeps changing over time as per the laws of Nature. Chaos and order, these physical aspects, are evident in the social and political arena as well. A constitution built seventy years ago on the scenarios existing then has to be modified as social and political scenarios change, to make the democratic system robust and anti-fragile. Otherwise there is a systemic tail risk that one can imagine materializing in the future. Our learned honourable politicians worry, short-sightedly, about the 5 to 10 years of their careers; but the point I am making here pertains to a 50- to 100-year horizon, though it could have relevance even in the short term, who knows! We have to build a dynamic system with proper provisions to keep the Constitution dynamically in sync with the laws of Nature over time.
Laws and constitutions will always have fundamental limitations. This can be argued from the incompleteness and inconsistency principles of logical systems in mathematical logic. There will always exist loopholes and limitations as long as laws and constitutions are written in words. We can see in practice how political parties and groups along lines of caste and religion tend to misuse them to gain a majority by hook or by crook.
Some community might focus on increasing its population so as to win under majority rules and stay in power over time. In fact this voting-and-majority system at the foundation of our democracy seems to be flawed.
Let's take an example: in one's own family, would pure democracy be beneficial in the long run? If the children, being larger in number, could overrule the parents, it might not be beneficial for the family; there has to be intervention from the expert and the experienced. Sometimes it might be the need of the hour that, as in competitive exams, voters' weights be decided by their scores on a test of knowledge and of the relevant basic issues, to make sure they are wise enough to choose the right candidate. Mathematically/statistically, the weightage would not be equal for all voters but based on their scores. The reason I say this is that the equal-weight system in our democracy can be misused, entirely within the ambit of the law, by different parties, and according to various allegations it has been.
Let's take another example: if one majority party becomes so large that all the opposition merges into it and the smaller parties go extinct, then, though legal, it would be existential for democracy itself!
Or say our great MLAs and MPs win elections as competitors before the voters and later unite. Maybe that should be revised to protect democracy? But who will amend these flawed rules?
Will political representatives, MPs and MLAs, pass laws against their own comfort and career security, in the long-term interests of the country? This too is a self-referential issue.
We see that a party with around 30% of the overall vote in the country can technically form the government with a majority of seats even though 70% voted against it. This is a drawback of the current multi-party democratic system. Maybe this needs to be amended in future to make the democratic system robust over time.
I have mentioned a few examples to show how law and the constitution can legally lead to danger for the existence of democracy itself! It is like a self-referential statement: in future we may have to save democracy from a democratically elected party itself.
So the point is how to amend the basic structure of democracy positively, in good sense, to make democracy stronger over time rather than static; otherwise the existing loopholes in the basic structure run the dangerous risk of being misused legally!
Will our honourable politicians from all parties, and also our honourable judges (it applies to everyone), frame constitutions and laws against their own comforts? A self-referential issue!
I would also add that no matter how large a majority a party commands, over time public sentiment and expectations will change and it will be thrown out of power. This is according to the law of Nature, which seems beyond many politicians' understanding. Hence they should be polite and humble and do justice to their opponents while in power, or else be ready for Newton's third law in time!!
In summary, any democratic constitution needs to solve the self-referential problem arising from the conflict between paradox/contradiction and immutability, and remain prudent by staying in sync with the laws of Nature over time, or else it faces chaos and existential risk.

Friday, July 21, 2023

Paradox of Fraud !!

If you want to commit fraud, always pay your taxes...

So, if you don't want to commit fraud, should you then not pay your taxes? But not paying taxes is itself fraud!!

Thursday, July 20, 2023

Tail Risk Hedging :

Say a tail event in financial-market trading and investing can unfold along N paths. There will always be some tail-event paths (including higher-order dynamics such as speed and acceleration) that are unfavorable for options (which are time- and path-dependent), even when the options are cheap; hence there is a risk of large shoulder losses!

No matter which options strategy is followed, there will exist some tail path on which it fails, precisely because it is path-dependent. If no such path existed, the strategy would be path-independent. Contradiction!!
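A minimal toy simulation of this path dependence (my own sketch, assuming NumPy; the flat 1% premium stands in for a real options-pricing model): two paths lose the same 30% overall, but a monthly-rolled 90%-strike put pays off on the gap path and only bleeds premium on the slow grind.

```python
import numpy as np

def rolled_put_pnl(path, strike_pct=0.90, tenor=21):
    """P&L of buying a 90%-strike put every `tenor` days, held to expiry.
    The premium is a crude assumed constant (1% of spot), not a priced value."""
    pnl = 0.0
    for t in range(0, len(path) - tenor, tenor):
        s0, s_expiry = path[t], path[t + tenor]
        pnl += max(strike_pct * s0 - s_expiry, 0.0) - 0.01 * s0
    return pnl

n = 252
path_gap = np.full(n, 100.0)
path_gap[120:] = 70.0                      # sudden crash: the live put pays off
path_grind = np.linspace(100.0, 70.0, n)   # slow grind: strikes reset lower at
                                           # each roll, puts expire worthless

print("gap path hedge P&L:  ", round(rolled_put_pnl(path_gap), 2))
print("grind path hedge P&L:", round(rolled_put_pnl(path_grind), 2))
```

Same terminal loss, opposite hedge outcomes: the payoff depends on the path, not just the endpoint, which is the contradiction argument above in miniature.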


There will always be some risk, whether shoulder, body, tail, or some other form, if one is to generate enough return; otherwise the return won't be there! There is no free lunch in the market!! If there existed a guaranteed way to hedge all risks and still generate enough return, it would be unnatural for a market!!


Details are in my paper...

So one has to change strategy dynamically depending on the path; here enters the role of timing, idiosyncratic risk, and other forms of risk.

Monday, July 17, 2023

Quantum Entanglement, Spooky Action at a Distance, the Illusion of Classical Motion, Zeno's Paradox, the Consciousness Geometry of the Observer's Quantum Brain and the Universe

Quantum entanglement, the "spooky action at a distance," arises, I believe, from a super-geometry between the observer's brain and the Universe, an interconnectedness that does not require any classical-style flow of information at the speed of light. Hence the geometrical role of the observer's consciousness.

Two classically separated entities can be attached in the quantum world!

What I have deeply believed for a long time (and have been meaning to write here) is that the observer's brain, through which an observer experiments with the external Universe, is such that the external classical Universe exists within the brain in its quantum aspect, and vice versa.

It is like a Russell-type paradox, where set A is internal to set B if and only if set A is external to set B.

The traditional, inherently psychological, day-to-day assumption of scientists that the classical Universe is primary, and that human brains are a tiny subset of it or of the quantum world, could be the fundamental reason why quantum entanglement surprises us.

Quantum entanglement is nothing but the outcome of this super-geometry: the external Universe exists internally to the observer's brain even though it appears external in day-to-day experience. This is fundamentally why two classically separated entangled entities remain connected in the quantum world. In fact the whole classical Universe is interconnected against the backdrop of the quantum world; fundamentally, the classical world is a quantum world too.

This peculiar, deep super-geometry between the human (scientist) observer's brain and the Universe, in which each lies within the other, is, I suggest, the ultimate cause of universal interconnectedness and hence of quantum entanglement, nonlocality at a distance, and so on. It is deeply the outcome of consciousness: at the deeper level the human observer is itself the Universe, not different from it!!

"I" = "Universe" Both the sets are equal to resolve the Russell type paradox which lies inside what . This is also the core Science  of Spirituality which considers All are the same , interconnected, God everywhere and . There is no distinct "I". The Classical Separation in Space Time is illusion !! 

So these two classically separated entities can be well informed about each other through their alignment in the quantum world, and hence no flow of information faster than the speed of light, which Einstein's special theory of relativity forbids, is required.

I would also say that Einstein's maximum, the speed of light, is valid for the classical world of motion; at a deeper level the very character of motion is itself an illusion, as Zeno's paradox suggests. There, Einstein's maximum-speed-of-light statement would probably not even make sense!



After all, I do imagine, and realistically find, that classical motion is fundamentally an illusion, as in Zeno's paradox. It is just a static superposition of various quantum states that lets the observer feel, virtually, that classical motion is going on. This classical world, 3D space-time, all of it, is a virtual reality emerging from special features within quantum waves. So the classical world is fundamentally just a virtual aspect caused by quantum waves.

Friday, July 7, 2023

Scientific Perspective: Black Swan: Causality-Adjusted Randomized Factor Investing: Duality of Determinism & Randomness in Nature, Life & Financial Markets.

 

Scientific Perspective: Black Swan: Causality-Adjusted Randomized Factor Investing: Duality of Determinism & Randomness in Nature, Life & Financial Markets.

Vol 1.

 

By

Pankaj Mani

(manipankaj9@gmail.com)

 

May 2023

1. Introduction: In this randomly written yet deterministic thought paper, I will discuss scientific perspectives on financial investment, especially factor investing. First I will talk about philosophy and developments in science and their connection with financial markets, and how investing/factor investing can be made scientific. I will discuss some fundamental misconceptions about popular statistical tools, e.g. regression analysis and backtesting, and also the role of the time dimension in statistics. I will then discuss the scientific origin of Black Swans and convexity, and finally a new causal approach to factor-based investment.

I will also argue that the ultimate intellectual maturity is the acknowledgement of duality, of two extremes in Nature, in the form of Randomness (Uncertainty) and Determinism (Predictability), like wave-particle duality: the two are not competitors but complements. This paper attempts to unify the two extremes of Black Swan randomness and causality in that dual form.

 

Note: I am not going to make this mathematically technical, so that it can be understood by all. Also, there exist fundamental issues with the conventional statistical and mathematical tools that are blindly applied in finance without understanding the concepts. So my focus is on the concepts behind the mathematical/statistical tools, which are often misused and misunderstood in finance. There are, moreover, fundamental issues in applying existing classical-world mathematical/statistical tools to finance, which is driven by quantum-like human behavior; applying the same faulty tools makes no sense. I mean that the traditional mathematical tools used so far are not well capable of describing these natural aspects. In future they will likely have to be developed afresh, as Riemannian geometry was developed for Einstein's general theory of relativity.

 

Also, there is a self-referential problem in this paper, as in the famous Gödel incompleteness results in mathematical logic: disclosing things beyond a limit could influence the minds of observers who are themselves part of the markets. In this way the causal dynamics of the quantum market could be influenced. Hence there remains a limit on how much to disclose.

 

P.S. Kindly ignore typos in grammar, spelling, repetition etc., as even they are random, to a varying extent, for me relatively!

 

P.S. I know some would agree and some would not on certain points; no need to worry. Scientifically we live in many worlds, in the quantum perspective, so no single perspective can exist absolutely and fit all.

 

 

 

Philosophy of Causality in Science

2. Background of Science:

 

Science has evolved from Newton's classical laws to quantum mechanics over time. Einstein's theory of relativity, Schrödinger's wave equation, the Heisenberg uncertainty principle, the principle of least action, and the Feynman path-integral approach have all been developed in the course of this study. Science is still trying to unify the two extremes of classical and quantum law: on one hand things appear deterministic in the day-to-day classical world, while in the quantum world they appear highly uncertain, random, probabilistic. Science is trying to understand the quantum world of uncertainty and unify it with the classical world.

Albert Einstein, throughout his life, could not digest that God plays dice with us; he believed inherently in a deterministic world, a belief that has been refuted over time. On the contrary, Niels Bohr's principle of complementarity emphasizes the role of the observer (the human brain) through which the external world is experienced. Even Stephen Hawking, in the context of a theory of everything, spoke of the role of the human brain in any ultimate theory of the universe. It is indeed true that science cannot find the ultimate theory of everything without understanding human brains, consciousness, and so on. There is also the paradox of consciousness, and I wish I could take it up here to show how science needs a radical approach to unify the ultimate reality of the Universe and the brain mutually, and how a peculiar super-geometry could unify the Copenhagen/quantum and classical interpretations, in which the external world exists dependently or independently of the observer. But that is extremely deep, requires outstanding imagination, and is beyond the scope of this paper, so I leave it here for now.


Feynman's path-integral approach has been one of the most powerful results in science; it grew out of Paul Dirac's work on the Lagrangian in quantum mechanics and the principle of least action. I would add that even classical-world theories, such as Newton's laws of motion, can be seen as originating in the quantum energy laws.

Those who have studied physics know that one can derive Newton's laws of motion from the principle of least action, which is also at the core of the quantum-world laws. It states that a system traces the path of stationary (least) action, where the action is a functional of the kinetic and potential energy: loosely speaking, least energy in least time.

Feynman's path-integral approach, which is also the principle behind the Feynman–Kac method for solving differential equations, originated when Richard Feynman showed that the Schrödinger wave equation of quantum physics can be derived from the path-integral formulation.

The path integral sums over all possible trajectories to find the resultant amplitude, and it came about when Feynman tried to explain wave-particle duality in the context of the famous double-slit interference experiment.

Though this approach needs, I believe, further improvement and clarification, which I have devised especially in the human quantum context, that is beyond the scope of this paper. Hence I will keep to what is already well known rather than proposing my own improvements to physics here.

Feynman's path-integral formula is so far among the most powerful tools in physics for describing the laws of Nature. Starting from Dirac's idea that a body traces the path of least action (roughly speaking, least energy and time), Feynman derived the path-integral approach to discover the path of an object. It essentially extends the principle of least action from the classical world to the quantum world.

Schematically, the chain of equations runs from the principle of least action to Feynman's path integral. The Lagrangian is $L = T - V$, where $T$ is the kinetic energy and $V$ is the potential energy. The action of a path $x(t)$ is

$$S[x] = \int L(x, \dot{x}, t)\, dt,$$

and the principle of least action states that the classically realized path makes the action stationary: $\delta S = 0$. Feynman's path integral extends this to the quantum world by summing over all paths from $a$ to $b$, each weighted by a phase:

$$K(b, a) = \int \mathcal{D}[x(t)]\, e^{iS[x]/\hbar},$$

so that paths near the least-action path interfere constructively.

(Figures omitted: illustrations of the path-integral formulation and of the principle of least action.)
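As a minimal numerical illustration of the least-action statement above (my own sketch, assuming NumPy, for a free particle with V = 0 so the action is purely kinetic): the straight-line path between fixed endpoints has a smaller discretized action than any wiggled path with the same endpoints.

```python
import numpy as np

def action(x, dt, m=1.0):
    """Discretized free-particle action: S = sum of (m/2) * v^2 * dt (V = 0)."""
    v = np.diff(x) / dt
    return np.sum(0.5 * m * v**2) * dt

n, dt = 100, 0.01
t = np.linspace(0.0, (n - 1) * dt, n)
straight = np.linspace(0.0, 1.0, n)                        # classical path
wiggled = straight + 0.05 * np.sin(2 * np.pi * t / t[-1])  # same endpoints

print("straight-path action:", action(straight, dt))
print("wiggled-path action: ", action(wiggled, dt))  # strictly larger
```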

 

Science still has much to look into deeply: nonlocality and Bell's theorem (John Bell), quantum entanglement, superposition, interconnectedness, the EPR paradox; the 2022 Nobel Prize in Physics was awarded for wonderful experiments on exactly this Bell-theorem-related work.

I could humbly try to take the discussion to a much deeper level to explain my own thoughts on the mystery of nonlocality at a distance, which modern science has probably not imagined so far; modern science needs to push its boundary beyond, and obviously the crucial role of human consciousness comes into play. But I will deal with that separately.

 

Controlling myself here, what is relevant in the finance context is that the market, and hence the dual game of uncertainty and predictability, will keep going endlessly for investors. The market is essentially the sum of the quantum human behaviors of the buyers, sellers, and other stakeholders in it. Even Feynman's path-integral approach, and Dirac's principle of least action, hold for quantum human behavior in markets, though in a variable form rather than the constant form of non-living-world physics.

 

Hence, while markets and finance should be studied scientifically, in a causal way, one has to acknowledge that science will not work magic by predicting away the fundamental uncertainty inherent everywhere, including in human behavior. But there is good news about how to deal with uncertainty, which I cover in the section on randomness. Beyond that, the statistical tools traditionally applied in finance, e.g. backtesting, are mostly backward-looking and static; by treating the problem scientifically, as the dynamical motion of a stock in quantum human-behavioral space-time, trajectories become forward-looking, like hard-science physics applied to the human context. This would help us understand and manage Black Swan type events scientifically and be better prepared to manage portfolios for people at large, who are affected through pensions, sovereign funds, mutual funds, hedge funds, and so on.

 

 

Background of Mathematics in Finance & Human Quantum Behavior: Classical Mathematics vs Quantum Mathematics

3. Randomness (Unpredictability, Uncertainty) vs Determinism (Predictability): Duality in Nature.

For long there has been a serious conflict across science, philosophy, social thought, and finance over whether the world is random or deterministic. Science is still far from settling this. But let me state that there exists a fundamental duality in the Universe between randomness and determinism, just as light behaves as wave or particle, a duality inherent in Nature. The same duality is reflected everywhere, including in financial markets. And it exists relative to the observer: the same thing can be random to one observer and deterministic to another. The availability of information is one of the causes.

In fact randomness and determinism are NOT absolute phenomena. They can co-exist for the same observer from two different perspectives, and relatively across different observers; it depends on perspective and on the level of information. The paradox of randomness is resolved by understanding this fundamental duality in Nature, and hence in markets and life: both co-exist, perspective-wise.

It is generally assumed that absolute randomness exists; that perspective needs to be fundamentally changed! Randomness is relative, not just across observers but even for the same observer!!

One has to understand this duality inherent in Nature and everywhere in life and markets. And I must say it is fundamentally linked to wave-particle duality in Nature! That is deep and beyond the scope of this paper.

At the same time, I must categorically mention that causation and randomness are not opposites; even randomness has causes. But knowing that something is causal does not mean it is completely deterministic; it all depends on the level of information. We know the causal laws of motion of a car, but that does not mean one can completely predict an accident!! Randomness is deterministic relative to itself; the reference frame matters.

So randomness is NOT randomly random but "deterministically" random! Things are locally random but globally deterministic in Nature, in the Universe, and even in markets. One can relate this to quantum physics, where a particle behaves randomly at the individual level but highly deterministically at the ensemble level.

Hence, contrary to the understanding that randomness is the opposite of causality and that causality is overestimated by humans, I would say that randomness has causality, and even inherent determinism, waiting to be discovered; but that does not make it completely predictable. It is like the quantum world explained earlier: fundamental duality exists, and things can be random and predictable relatively. As in the quantum world, a wave can behave randomly or uncertainly at one level and predictably at another.

 

So randomness has to be studied deeply as part of causality, while acknowledging the existence of uncertainty inherent in Nature, relatively: things can be neither completely deterministic nor completely random. Randomness does not mean anything can happen; it is always driven by causal forces, which we may not completely know.

Hence randomness is deterministically random, not truly random, in Nature, and hence in financial markets as well.

Contrary to the usual understanding, randomness is not an absolute phenomenon: the same thing can be random or deterministic from different perspectives and levels of information, for the same observer and across observers. It is like a duality.

Randomness does converge to its equilibrium state! This is governed by the laws of energy and Nature.

So, first, we need to understand the causality and determinism behind randomness. Randomness does follow deterministic aspects, though that does not make it completely predictable. That is the key to success in the real world, and it is linked to attaining more and more convexity, dealt with in a later section.

4) Randomness, Part 2

 

Understanding Randomness in Real World

 

We often talk about randomness in Nature, markets, life, etc., but it seems we need to understand it more deeply. Before we explore randomness in finance, let us first turn to science.

As is often said, life is unpredictable, and random things happen to us all the time; yet they are also predictable from some perspectives. You might say the universe itself is random. Yet somehow large numbers of random events can generate large-scale patterns that science can predict accurately; heat diffusion and Brownian motion are just two examples.

Recently, randomness has even made the news: apparently there is hidden order in random surfaces, and we may be close to seeing a quantum computer generate ultimate randomness. This latter quest for perfect randomness matters because randomness brings unpredictability, and all non-quantum attempts to achieve it have the hidden flaw of being generated by algorithmic methods which can, theoretically, be deciphered.

 

Is nature inherently random? According to some interpretations of quantum mechanics, it is, explaining why we can’t precisely predict the motions of single particles. In the famous double-slit experiment (which, as Richard Feynman declared, “has in it the heart of quantum mechanics”), we cannot predict where exactly an individual photon passing through two slits will land on the photo-sensitive wall on the other side. But we can make extremely precise predictions of the distribution of multiple particles, suggesting that nature may be deterministic after all. Indeed, we can predict to many decimal places what the distribution of billions of photons shot at the double slit will look like.

 

This dichotomy between unpredictable individual behavior and precise group behavior is not unique to quantum mechanics. There are many novel and strange aspects of quantum physics — particle-wave duality, quantum entanglement and the uncertainty principle, for instance — but probabilistic equations that give precise predictions of ensemble behavior are not among them. We see this phenomenon wherever very large numbers of like elements interact, such as in thermodynamics, where we can predict collective measures like heat and pressure with precision, though we may be completely ignorant about the paths taken by individual molecules.
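A minimal simulation of this dichotomy (my own sketch, assuming NumPy; not part of the quoted discussion): any single random walker's endpoint is unpredictable, while the ensemble statistics land on the central-limit-theorem prediction.

```python
import numpy as np

rng = np.random.default_rng(42)

# 20,000 independent walkers, each taking 1,000 fair +/-1 steps.
steps = rng.integers(0, 2, size=(20_000, 1_000), dtype=np.int8) * 2 - 1
finals = steps.sum(axis=1, dtype=np.int64)

print("one walker's endpoint:", finals[0])             # unpredictable
print("ensemble mean:", round(finals.mean(), 3))       # near 0
print("ensemble std:", round(finals.std(), 2),
      "vs CLT prediction:", round(np.sqrt(1_000), 2))  # ~31.62
```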

 

There is a debate over whether randomness or determinism lies at the heart of quantum mechanics, which may be characterized as team B (Niels Bohr) versus team E (Albert Einstein). Team B sees the unpredictability of particle behavior as evidence that at the fundamental level of the universe, determinism is replaced by intrinsic, objective randomness. Team E contends that this randomness is merely a sign of our ignorance of a deeper level of deterministic causation.

It is a philosophically open question whether randomness truly exists in Nature or is just our ignorance!

 

The great mathematician and one of the founders of algorithms and computer science, John von Neumann, remarked that any arithmetical method of producing truly random numbers is self-deception, for its outputs are the result of a deterministic algorithm.

Let me put forward my scientific point of view, because that is where Fooled by Randomness, Black Swan, and the like originate in finance and markets.

Taleb's Fooled by Randomness essentially seems to lean towards the Bohr school of pure randomness when it discusses the role of randomness in life and markets. Although he is right in many respects, randomness does exist in Nature, life, and markets, it is not randomly random but deterministically random.

I would categorically state that randomness is deterministically random, not randomly random: even randomness has causality and follows deterministic laws of Nature. Things appear random locally but are highly deterministic globally. One can even see this in mathematics. The Riemann Hypothesis, regarded as the most important problem in mathematics, concerns the distribution of the prime numbers: the primes appear locally random, and yet the non-trivial zeros of the Riemann zeta function, which encode that distribution, are conjectured to lie exactly on the critical line, a claim verified numerically for vast numbers of zeros, and one I believe to be true.

(Figure omitted: non-trivial zeros of the Riemann zeta function on the critical line.)

 

Similarly one can see this in many places in mathematics. One has to understand, fundamentally, that Nature is a work in progress: randomness is part of Nature's local evolution, like the wave, while globally things are highly deterministic, like the particle.

The summation of randomness leads to determinism; that is how Nature fundamentally evolves. That is also why Monte Carlo simulation type algorithms might work, as well as RCTs and algo-wheel type randomized approaches.
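A minimal instance of "summation of randomness leads to determinism" (my own sketch, assuming NumPy): a Monte Carlo estimate of pi from uniformly random darts, whose individual positions are unpredictable while the aggregate converges, with error shrinking roughly like 1/sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(7)

# The fraction of random points in the unit square that fall inside the
# quarter circle approaches pi/4 as n grows.
for n in (100, 10_000, 1_000_000):
    pts = rng.random((n, 2))
    inside = (pts ** 2).sum(axis=1) <= 1.0
    print(f"n = {n:>9}: pi estimate = {4 * inside.mean():.4f}")
```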

 

There exists a global causality of local randomness! Randomness does follow the laws of Nature and energy deterministically in the Universe, and hence in human behavior and markets.

That does not mean things are completely predictable: we know the laws of motion of a car but cannot exactly predict an accident!

The dynamics of randomness descend from the principle of least action, through Dirac, to the Feynman path integral: things follow such paths to reach the most stable equilibrium, and randomness essentially follows that law! That is why even portfolio construction can be based on randomized approaches such as the algo wheel, where factors are randomized and the best least-action path is selected. This is essentially what leads to the convexity approach.

Hence Black Swans and randomness do exist, but they essentially follow the law of causality, reverting towards their equilibrium state. The key to risk management is to survive the whole cycle before the convergence!!

So first let me clarify that it is not "just random": randomness comes in two types, randomly random and deterministically random. Randomness in Nature is not randomly random; it is deterministically random.

Deterministic randomness has a well-defined causal mechanism, but that does not mean we can completely predict it!

It means there is a duality: things are basically random, yet even inside randomness they have causes and are governed by causal laws! "Random" is a relative term depending on the level of information available to the observer. We may not know exactly what will happen, but that does not mean anything can happen at any time.

Whatever happens is governed by causal laws; yet even then we cannot predict it completely, owing to incomplete availability of information.

 

Hence we often come across randomness versus determinism, but in the real world we should use the terms "deterministically random" and "randomly random". Even randomness follows the laws of Nature and has proper causality (which does not mean it is completely deterministic!), and there will likely always exist incompleteness of information, producing randomness relative to a human observer at any moment in time. But I humbly believe we need to understand in depth the causality, the deterministic aspects, behind randomness and its laws, to grasp it more scientifically. That would help in achieving higher convexity.

 

"We have to understand, technically, that randomness in Nature, and hence in markets and human behavior, is NOT 'randomly' random; it is 'deterministically' random! Randomness does not mean anything can happen at random. Whatever happens (which we may not predict completely, lacking complete information at any time) follows the fundamental principles inherent in Nature and inherited by markets; it has a certain causal determinism fundamentally! Randomness has hidden synchronicity!" --Pankaj Mani

 

Things have inherent randomness locally, but globally they are bound to be quite deterministic! Yet determinism does not mean we can predict completely: there will always be incomplete information about the forces and their measurements, so complete prediction within the causal system is out of reach. Despite knowing the causes, we may not predict completely, and randomness remains. In a nutshell, even randomness has a deep causal mechanism in Nature, and hence in markets; and even the causal mechanism has randomness, and vice versa. It is a deep paradox in itself, scientifically! But that is how Nature works, and hence the markets as a subset of it. Randomness exists because Nature is itself a work in progress; it is not complete in its own construction. Nature itself might not know where it will end up, since it is still under construction; yet it does have causality, depending on how the various forces work out. Further, there is relativity of randomness and determinism: what is random for observer X may be deterministic for observer Y. It is relative! Hence proponents of the opposing schools of thought should understand this relativity rather than claiming to prove their side absolutely.

Nature, human behavior, and markets are not randomly random but deterministically random!! Randomness does have a hidden causal mechanism, but we cannot completely predict, because we do not know the various forces and their measurements, as we would in Newton's laws of motion. We may know the cause, yet lack complete information about the measurement of the different forces in the system at any moment, leading to unpredictability. So we have to understand the scientific reason behind unpredictability and randomness: they exist not absolutely but relatively.

5) The Philosophy behind Prediction: Trying to Make the World Deterministic and Banishing Fundamental Uncertainty from Nature, Life & Markets??

 

Almost all financial models try to predict the market using various statistical and mathematical tools. In life, too, we want to predict, as in many other social domains; everyone talks about prediction, more or less. But have we thought deeply about what we inherently do when we try to predict? What does prediction mean scientifically? Prediction means the observer is trying to make the world deterministic, inferring the state of the world at time T = t from the present (T = 0). Even Nature does not know what it will be at time T = t, because it is itself a work in progress driven by causal forces, and the states of the world at T = 0 and T = t can differ. So, by trying to predict, the observer may influence the system, by influencing itself. All this is scientific, in the context of quantum reality in physics.

But is the Universe really completely deterministic? Is that not against the quantum law of Nature? Imagine a world that is completely deterministic, or completely random: would it run? No, it would eventually cease to run. The only way the Universe, markets, Nature, and life can function is through the existence of duality and opposites.

So by trying to predict more and more, we try to make the system more and more deterministic, which it fundamentally is not. In fact there is a paradoxical underlying truth.

So, because of its origin in quantum-world science, an observer should not focus too much on prediction: beyond a point, his own predictions influence his actions, and that in turn changes what was predicted. In scientific terms, the more one tries to make the world deterministic through prediction, the more uncertain things become; and the more one focuses on managing uncertainty rather than prediction beyond that point, the more deterministically the result tends to occur. This paradoxical situation originates in quantum behavior; even the market is a quantum wave arising from human behavior, and so is life.
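A toy sketch of this feedback loop (entirely hypothetical: the reaction coefficient and the naive +2% forecast rule are my illustrative assumptions, not a market model): once a forecast is published, the "market" here moves away from it, so the act of predicting degrades the prediction.

```python
def market_next(price, published_forecast, reaction=0.5):
    """Hypothetical market: traders front-run a published forecast,
    pushing the price away from it by `reaction` times the gap."""
    return price + reaction * (price - published_forecast)

price = 100.0
for step in range(5):
    forecast = price * 1.02                # naive +2% forecast, then published
    price = market_next(price, forecast)   # the publication moves the market
    print(f"step {step}: forecast {forecast:.2f} -> price {price:.2f}")
```

Each published up-forecast pushes the price down, so the forecast is persistently self-defeating, a crude cartoon of the observer influencing the outcome.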

Hence, to get a better result, one must admit and respect the duality in Nature, and not try to make the system more and more deterministic by predicting more and more, endlessly.

So a focus on process and on present uncertainty over time automatically leads to a great future, even without knowing the future deterministically in advance. On the contrary, by focusing more than enough on future prediction, the present is causally affected, which eventually influences the prediction and the future themselves. This is not unscientific; it comes straight from the quantum-world laws of uncertainty and the observer's influence on the outcome.

 

In other words, one has to understand that, technically, there are many causal forces at work between now and the time for which a prediction is made, and they can change the outcome causally. The observer is trying to predict the motion of a car without knowing the causal forces thoroughly, as one would in Newton's laws of motion. A market being a dynamical causal system, like a physical system, the causal forces applied to it have to be understood; how can one predict scientifically beyond a limit without knowing the causal forces and their strengths? It would be mere guesswork.

I must repeat that quant models should focus on managing this uncertainty inherent in Nature, which is quite scientific: it propagates from quantum uncertainty to humans, markets, life, and so on.

Although it may be hard for some to imagine, it is scientifically like a quantum effect in markets and in life!! The more an observer tries to predict, the more quantum uncertainty he encounters; the less he tries to predict, the more certain things become. By predicting more and more beyond a point, the observer tends to change his own predictions through this quantum effect. As in quantum physics, the observer's role is critical and can influence the outcome; recall the Heisenberg uncertainty principle, Schrödinger's cat, and quantum entanglement. So observers have to find the critical point and maintain the balance between predictability and unpredictability (uncertainty) inherent everywhere!

Put simply: to predict, as with Newton's laws of motion in physics, one needs information about all the causal forces. Does one have this? No.

In fact there is a self-referential issue, as in the famous Gödel incompleteness theorems.

In science, as in markets, the fundamental issue is that the observer tries to predict a system of which he is himself a part; the system is not independent of him. Scientifically, predicting the market/Nature/Universe/life is causally predicting the self, and that is contradictory! Can we predict ourselves, and how far? The market is essentially the summation of human behavioral forces: if an observer cannot completely predict even himself, how can he completely predict the resultant sum of all humans? This is in fact a fundamental challenge for science too: without understanding the human observer, consciousness and all, can there ever be an ultimate theory of everything for the Universe? No.

Caution: I do not mean that one should not predict at all, or that the world is entirely unpredictable. If everything became unpredictable and random, that too would stop the system from running! The point is that the observer needs to appreciate both sides and act optimally.

 

So the key is not to push prediction further and further beyond a point, but to act in a balanced way, respecting the principles and laws of Nature in the form of uncertainty.

These opposing forces, this duality, create the potential that runs the system. We often see people from opposite schools of thought slamming and criticizing each other to prove their own point.

But the highest level of intellectual maturity, consciousness, and knowledge is the acceptance of this duality, the existence of opposites at the same time. This opposition is the source of the energy potential that induces the current to flow; otherwise everything would stop. The duality is inherently part of Nature, scientifically. The opposing schools of thought that end up fighting to prove their side true and the other false must acknowledge this inherent duality in Nature, life, and markets!! It is a many-worlds situation, as in physics: relative and complementary to each other.

 

The beauty of Nature/life/markets/the Universe is duality, which is essential for creating the potential energy to function. This has to be understood well conceptually, and it is not easy for everyone to digest psychologically.

6) Connection between Science, Human Behavior & Financial Markets

Markets are driven by human behavior, which essentially follows the scientific laws of Nature. The human brain runs on neurons, electrical signals, and so on; the human mind is basically a quantum piece of equipment. The market is essentially a quantum wave, the resultant of the quantum brain-waves of the set of traders and investors in it. In fact, at a deeper level one can argue that a human being itself behaves like a quantum wave.

Physics does differ from humans in some fundamental ways: unlike non-living physical objects, human behavior has distinctive features, yet the two are linked. Human quantum behavior also follows the laws of energy and Nature, as non-living physical objects do. Technically, human quantum behavior is more advanced, with variable components, unlike the constant ones of non-living physical objects, but both follow the same laws of Nature and energy fundamentally.

As in physics, non-living bodies follow Newton's laws of motion and forces, the quantum laws, the principle of least action, and the Feynman path-integral approach. Similarly, the living human object also follows these laws, but in a variable way, unlike the constant way of the former.

The human brain runs on neurons, electrical signals, energy. Let me state that, like physics, the human quantum world also has inertia, mass, weights, momentum, force, and so on; broadly speaking, the quantum world has equivalents of the classical concepts. Indeed Newton's laws of motion, the laws of energy, and the principle of least action (though "least" means something different in the human quantum world) are all valid in the human world. As a result, even markets have momentum, mass, energy, gravity, acceleration; what we call stocks also have mass, acceleration, and so on, and they too follow the laws of science and energy. Essentially, one must understand that the market is a quantum energy system.

It has traditionally been said that humans are different from ordinary physics. That is true, yet humans also follow the laws of Nature, and hence so do markets. So I must state that quantum human behavior also follows the principle of least action and the laws of forces and energy, as in physics, but at a deeper level than non-living physics, where these usually remain constant!!

And , for that reason markets have to be studied from that scientific perspective of energy, mass, momentum, acceleration etc. Infact I would explain later  the scientific background of concepts and how far economic rationales are correct scientifically especially for factor investimg and all. For example, I’ll explain what is meant scientifically for economic terms like Large,Mid & Small Caps in terms of Quantum weights. How that affect their causal dynamics in context of factors. Scientific factors Not Associational.

7) A Century-old Problem: The Deeper Connection between Financial Brownian Motion (Louis Bachelier) & Physical Brownian Motion (Albert Einstein)

Boltzmann equation and Langevin equation: Newton's Laws

 

Technically, Markets are run by Human particle behavioral forces. Human behavioral forces are run by the neurons and particles in human brains. Since these brain particles are themselves quantum-type particles, they also follow the Laws of Nature, like Newton's Laws, Quantum Laws etc.

Human particle motions, driven by those microscopic quantum brain particles like neurons, are the same as the motions of microscopic particles in the physical world. The cells of the brain include neurons and supportive glial cells. There are more than 86 billion neurons in the brain, and a roughly equal number of other cells. Brain activity is made possible by the interconnections of neurons and their release of neurotransmitters in response to nerve impulses. Neurons connect to form neural pathways, neural circuits, and elaborate network systems. The whole circuitry is driven by the process of neurotransmission. The Human Brain system is basically an organic electric system run by a neural signal network. Even Consciousness has been postulated by a few scientists to be a state of matter!

Both are natural particles essentially following the Laws of Nature; hence there is no fundamental point of difference. Human crowd behavior is scientifically the same as molecular dynamics in physics. Hence Financial Brownian motion and Physical Brownian motion are fundamentally, though not exactly, the same scientifically. As I explained earlier, Human quantum behavior is run by the same causal laws, only in a variable way compared to the constant way of physical non-living objects. Fundamentally, Human quantum behavior is also caused by quantum forces and also follows the Path of Least Action, but "Least" here is variable. Recently some scientists in Japan have shown similarities between Financial & Physical Brownian motion by studying the behavior of HFT trade orders. This is no surprise, because I have already explained and stated that Human Behavior follows the Laws of Nature just like physical bodies do, as both originate from particles/waves only. One is non-living and the other is living, but both follow the Laws of Nature in their respective ways. Hence Human traders' behavior in the Market (Financial Brownian Motion) is essentially similar to Physical Brownian Motion! As more experimental studies of human traders' behavior are done, they will come to this conclusion over time.
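To make the analogy concrete, here is a minimal Python sketch (my own illustrative simulation with assumed parameters, not the Japanese HFT study itself): it integrates a Langevin equation for a physical particle alongside a random-walk log-price, and both produce the same diffusive, Brownian-type spreading of increments.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt = 10_000, 0.01

# Physical Brownian motion: Langevin equation  dv = -gamma*v*dt + sigma*dW
gamma, sigma_p = 1.0, 1.0
v = np.zeros(n)
for t in range(1, n):
    v[t] = v[t-1] - gamma * v[t-1] * dt + sigma_p * np.sqrt(dt) * rng.normal()
x_physical = np.cumsum(v) * dt          # particle position

# "Financial Brownian motion": log-price random walk driven by order-flow kicks
sigma_f = 0.02
log_price = np.cumsum(sigma_f * np.sqrt(dt) * rng.normal(size=n))

# Both trajectories diffuse: variance of increments grows with the lag
for lag in (10, 100, 1000):
    vp = np.var(x_physical[lag:] - x_physical[:-lag])
    vf = np.var(log_price[lag:] - log_price[:-lag])
    print(f"lag={lag:5d}  physical var={vp:.4e}  financial var={vf:.4e}")
```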

Just like classical motion under Newton's laws, quantum motion is also governed by analogous laws. Newton's Laws are universal, originating from the Laws of Energy, which are valid everywhere. In fact Newton's Laws can be derived from the Laws of Energy and the Principle of Least Action, which hold everywhere in the Universe.

 

“Like the Classical world's physical laws (which essentially originate in the Quantum world), mass, energy, weight, forces, inertia, momentum, entropy etc. all exist in the Quantum world as well, leading to quantum motion, of which Human behavioral forces are a type. But in the quantum Human behavioral, living world, the forces behave more like variables, unlike the constants of physical non-living objects. Hence Humans are essentially an energy system, and this is the scientific reason they have similarities with non-living mechanisms.

 

I hereby humbly state that Financial Brownian motion shares a lot of similarity with Physical Brownian motion, with slight variabilities. It is indeed science, and one can confirm this through experiments in real markets. Recently it has been shown experimentally at a preliminary stage, and as these things are explored in more detail, they will find experimental proof of my statement that Financial Brownian motion, originating in the behavior of human particles, is scientifically similar to Physical Brownian motion.

 

In a nutshell, the laws of living objects are similar to those of non-living ones, but with extra variability rather than constancy, making the former more complex in this context! Hence the market should essentially be studied as an energy system.”

 

 

I have already found this so far in my own real-world experimental studies!

 

Hence, imagine the Market as an Energy System driven by small Human particles (themselves a form of Energy)!

8) Science behind the Black Swan & How to Deal with It in the Real World

As we have seen earlier, the scientific Universe/Nature has Randomness & Predictability as an inherent duality, like Wave-Particle Duality. Despite the fact that everything is governed by causal forces, many of the causal forces governing the dynamics lie beyond an observer's level of information, relatively speaking. So, linked to Gödel's Incompleteness results in Mathematics, in any system there will always exist a zone of Unpredictability as well as of Predictability, and they exist together. In any day-to-day system, some information will always lie beyond the system. This external information beyond the system is the actual cause of the Unpredictability zone.

 

There will practically always be some information, external to the system, that is not available to an observer at any given time!

The Black Swan emerges out of this zone of Uncertainty & Unpredictability which is scientifically inherent in the Universe, Nature, Life and hence in Financial Markets. It exists in a relative sense, not an absolute sense, depending on the level of information available at a given time. Hence, technically, the Black Swan lives in the domain of "Unpredictability, Uncertainty". The statement "Predicting the Black Swan" is therefore contradictory, as it means predicting the unpredictable, predicting the uncertainty. I again emphasize that it depends on the level of information available to an observer.

So, what needs to be done in the real world about Black-Swan-type events? First of all, one has to understand that they scientifically originate in the Unpredictability/Uncertainty zone. So predicting the Black Swan means trying to predict unpredictability, which is contradictory! If it were predictable, it would not technically be termed a Black Swan.

So one has to understand the science of Uncertainty/Randomness. As explained in the Randomness section, even Randomness has a deterministic hidden pattern, and things do converge to their stable equilibrium state over time, according to the Laws of Nature/Energy.

I must mention that the Black Swan is technically nothing but a large, sudden energy shock/force under which human observers panic, leading to jumps or crashes. So fundamentally it is energy dynamics as in Physics!! Hence one has to technically understand the Laws of Energy in Nature to handle this sudden jerk!!

 

So, rather than wasting time trying to predict Black Swan events, one needs to focus on managing risk so as to survive, or benefit, whenever they come. This can be done by attaining Convexity! This is because, post Black Swan, the system converges back to its equilibrium state. Those who blow up during Black Swan events are not risk-prudent; for example, they are exposed through leverage or similar fragility. The key is to survive the Black Swan. Hence one needs to be convex enough to survive. Part of the strategy is how we can become more and more convex.
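As a toy illustration of the convexity point, here is a hedged Python sketch with purely hypothetical numbers (3x leverage, a 2% put premium): across a range of one-period market shocks, the leveraged linear book is wiped out by the large negative jump, while the convex, put-protected book survives it.

```python
import numpy as np

shocks = np.array([-0.40, -0.20, -0.05, 0.0, 0.05, 0.20])  # one-period market returns

capital = 100.0
leverage = 3.0                          # fragile: concave to large down-moves
put_strike_ret, put_cost = -0.10, 2.0   # hypothetical protective put, 2% premium

for r in shocks:
    # Leveraged linear position: equity cannot go below zero (blow-up)
    lev_equity = max(capital * (1 + leverage * r), 0.0)
    # Convex position: unlevered, plus a put paying off below the strike return
    put_payoff = capital * max(put_strike_ret - r, 0.0)
    convex_equity = capital * (1 + r) - put_cost + put_payoff
    print(f"shock {r:+.0%}:  leveraged={lev_equity:7.2f}   convex={convex_equity:7.2f}")
```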

9) Background of Factor Investing

 

Factor Investing has traditionally been nothing but the application of Linear Regression tools to historical return data. It is typically associational rather than causal: a linear regression is conducted to calculate some superficial, not necessarily reliable relationships like Beta and so-called Alpha, and the error term is often ignored. Factors such as Value, Momentum, Size etc. are quite prominent, based on economic opinions that are not necessarily scientific.

One of the most talked-about factors has been Value, which has underperformed in recent years, causing a global fuss over whether Value really matters or has died. So, how is Value calculated?

So, traditional Factor Investing relies on some economic rationale, or not even that, and on least-squares (LS) Linear Regression statistical tools to deduce Alpha, Beta etc. But there are a lot of crucial issues. The most prominent is that, while performing the regression analysis, the different factors are treated as independent variables with no mutual relationship. In other words, Value, Momentum etc. are assumed to be independent factors, like independent dimensions, without verifying whether they really are independent! This is itself a blunder!! The so-called factors like Value, Momentum, Size etc. are not necessarily independent but can be highly causally dependent on each other. We will see this later in scientific terminology rather than superficial economic rationale; the small simulation below already makes the point.
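A minimal simulated sketch of that blunder (hypothetical factor names and loadings): two "factors" are generated with strong correlation, and regressing returns on one of them alone, as if the factors were independent dimensions, produces a badly biased beta through omitted-variable bias.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000

# Two correlated "factors" (say value and momentum), correlation ~0.7
value = rng.normal(size=n)
momentum = 0.7 * value + np.sqrt(1 - 0.7**2) * rng.normal(size=n)

# True return model: loads on BOTH factors
ret = 0.5 * value + 0.5 * momentum + 0.1 * rng.normal(size=n)

# Regress on value alone, ignoring momentum (the "independence" blunder)
X = np.column_stack([np.ones(n), value])
alpha_hat, beta_hat = np.linalg.lstsq(X, ret, rcond=None)[0]
print(f"true beta on value = 0.5, estimated beta = {beta_hat:.2f}")  # ~0.85, biased
```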

Now, one can imagine that, based on backward-looking statistical tools like LS regression (which is technically in-sample testing in the backward direction of time!!), one claims to calculate the flawed concepts of so-called Alpha, Beta etc., and that too while often ignoring the most vital error component, which is the real source of information, especially about the Black Swan.

What I mean to say is that mathematical and statistical tools are not always wrong, but their blind application to real-world scenarios is indeed a blunder. One can imagine how serious this risk is when billions and trillions of dollars are invested based on such misconceptions.

 

We will look into more details scientifically, but before that let's first look at the foundational infrastructure of Linear Regression and conventional statistical tools.

10) Application of Statistical Tools in Finance: Misunderstandings, and the Role of Time in Mathematics/Statistics in the Real World

Statistics needs to be scientific: it is always backward in time; Causal Science can make it look forward in time.

Fundamental issues with conventional mathematical tools in Finance: from Classical to Quantum Mathematics!!

 

Statistics is heavily applied in the world of finance. Almost all financial models so far use statistics, whether for risk management, prediction or otherwise. But the fundamental issue with all conventional statistical models is that they are all backward-looking in the direction of time. This ignorance of the role of the Time dimension has arguably made many financial models a scam in themselves. Mathematics has always been studied independently of Time, assumed absolute for all. This Platonist view is the cause of the misunderstanding and misapplication of statistical tools in Finance, which suffer from hindsight bias.

Financial statistical models are always built on past data in the direction of Time. At time T=0, a modeller curve-fits data over T<0.

But the modeller does not understand, out of ignorance or inertia, that T>0 and T<0 are not the same in the world of Statistics/Finance.

By looking backward in the direction of time, the modeller rules out the randomness component which was present when looking forward back then. The future direction of time has many possible random paths, but the backward direction of time has just the one deterministic path which actually occurred and on which the data fitting is done. This foundational blunder of ignoring randomness in the backward direction of time is at the core of all the issues in financial modeling based on statistical tools. The entire estimation of risk, prediction etc. based on such tools ignores the vital role of the randomness which a trader actually undergoes while taking decisions in the real world. Unfortunately, out of the many possible unobserved future paths, the statistical modeller takes only the one observed path in the backward direction of time.
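A minimal Python sketch of this point, under an assumed random-walk model: the realized history is just one draw from a wide fan of possible paths, and statistics fitted to that single path say nothing about the dispersion a forward-looking decision-maker actually faces.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, sigma = 5_000, 252, 0.01

# Many possible forward paths of a log-price random walk...
paths = np.cumsum(sigma * rng.normal(size=(n_paths, n_steps)), axis=1)

# ...but history hands the modeller only ONE realized path to fit on
realized = paths[0]

print(f"realized 1y move:            {realized[-1]:+.3f}")
print(f"cross-path 1y std:           {paths[:, -1].std():.3f}")
print(f"5%/95% of possible outcomes: {np.percentile(paths[:, -1], [5, 95])}")
```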

The fundamental issue with statistics is that it is always done in the backward direction of time, based on historical data. That is where the role of Causality & Science comes in: causality tries to look forward using scientific mechanisms. Imagine, in physics, using a statistical analysis of a past trajectory to trace out the future trajectory. Does it sound awkward? So why do we do that in Finance? Finance has to be studied like scientific Physics, on that principle, in the forward direction of time. This ingrained psychology of statistical analysis in finance should be replaced by scientific causal analysis of the future, where we study Randomness, Determinism, Human Behavior etc. And yes, some investment and risk strategies are causal, based on scientific principles. We will talk about them later.

Hence the psychologically ingrained, backward-time-looking statistical analysis must be discarded, and forward-looking scientific causal analysis, as we do in Physics, must be adopted. This would also be the key to scientific risk-taking & management and to dealing scientifically with Black-Swan-type events in the real world.

The fundamental issue with financial models is that these statistical tools and methods are all backward-looking in time, not forward-looking. It is a paradox/contradiction in itself that backward-looking analysis tools are applied to analysis of the future in the dimension of time.

 

Tools like backtesting are a subset of that unscientific understanding of the role of the arrow of time in statistics in the real world.

 

That is the reason why finance has to be developed in the forward direction of time, as we do in Physics. Do we backtest in physics to predict the trajectory of a vehicle/car, or do we study the equations of the causal forces? This fundamental psychological change has to be brought to the world of finance, where the trajectory of a stock price is studied scientifically by analysing causal forces rather than by backtesting.

Even those who simulate randomly into the future, as in MC simulations etc., must understand that there can always be more scenarios in the real world than one can simulate using computers or otherwise. This is fundamentally related to Gödel's Incompleteness Theorems.

 

The point is that finance has to be studied in a scientific way in the forward direction of time, as we study physics. That way, we can have a better understanding of risk and return in real-world finance. For that, we will have to understand the causal concept of Uncertainty/Randomness in Life/Nature/Markets, because by understanding this concept one can take scientific decisions to build a portfolio or otherwise. Otherwise, all those statistics-based tools like Sharpe Ratio, Drawdown, Correlation etc., computed out of historical or observed data, are just a tiny subset of all the possibilities and hence hugely misleading for understanding risk.

As in the Feynman Path Integral approach to Quantum Mechanics and Paul Dirac's Principle of Least Action: explore all the paths, not just the observed one.

 

Now let's look at the fundamental compatibility of mathematical/statistical tools with Finance.

 

 

Traditionally, the tools that are applied are Probability Theory, statistical expectation operators, Euclidean (stochastic) calculus, Linear Algebra etc. Let's look at their origins. These tools originated with mathematicians contemporary with the classical developments in physics by Newton, Leibniz, Gauss etc. Over time, physics travelled from the classical Newtonian world to Einstein's relativity theories to Quantum Mechanics, but the mathematical tools that originated to support physics remained trapped in the Platonist philosophy, constant and independent in their classical forms. As long as they are applied to classical worlds, they are fine; but when they are forcibly applied to quantum worlds, say the quantum human behavioral aspects of Markets, Finance, Life etc., fundamental issues start arising as to how compatible those classical-world, Euclidean-space mathematical/statistical tools are with quantum worlds, particularly with human behavior!

Let's take an example, first with statistical operators like the expectation E[·], which is used to calculate moments, and with the theorems of probability.

Is the expectation operator formula the same in real-world quantum human behavioral space-time as it is for the motion of non-living natural celestial objects? Recall that LS regression was first formulated for application to celestial mechanics. In real life, do this traditional probability theory and these statistical operators hold true? Do we calculate expectations in our day-to-day life by summing the products of probabilities and their outcomes? No! If we look into the logic behind this, it comes from the understanding of Logic & Physics of those legendary mathematicians. At that time, Quantum Mechanics, Relativity, the Principle of Least Action etc. were not known.

So, probability theory, Euclidean calculus etc. are not likely adequate to make the mathematical/statistical tools compatible with quantum human behavior. Quantum Mathematics, Quantum Operators, Quantum Expectations etc. have to be developed, matching how the human quantum world works. For example, to explain the difference: in the classical world, if an observer conducts an experiment 10 times, he gets the same outcome; but in the quantum world, and in quantum human behavior, one cannot expect the same deterministic outcome in 10 quantum trials. One can relate this to Markets, which are also fundamentally a quantum system. There would be uncertainty in the expectation/expected result. So the foundational point is that quantum human mathematical tools have to be developed, just as Riemannian Geometry was developed for Einstein's General Theory of Relativity. We mostly see financial people using the classical conventional tools to prove and disprove any number of results in finance and the real world without understanding the fundamental discrepancy and incompatibility, blindly applying those statistical/mathematical operators and tools, and blindly applying them to make investment decisions.

So, the fundamental requirement is to develop new mathematical & statistical tools and operators for quantum human behavior and hence for financial markets. This is because classical Euclidean-type mathematical/statistical tools might not be fundamentally compatible with quantum human behavior, and hence do not make true sense for modeling financial markets, risk etc. in the real world.

11) Regression Analysis & Paradoxes

Let's assume the Linear Regression equation

Y = α + β·X + ε

where Y is the dependent variable, X is the independent variable, α is the intercept, β is the slope, and ε is the error term of the linear regression. More generally, with several regressors, Y = α + β1·X1 + β2·X2 + … + ε. Reversing the roles of the variables gives the second regression

X = α′ + β′·Y + ε′

 

So, we can see that association ignores the direction of time here: it can go both ways. But causation has a direction, with cause and effect occurring at different, subsequent times! To determine causality, we will have to block all other paths/factors and test the direct effect; for example, stocks in different economic scenarios, or other company-specific factors, to test the causality.

For example: the Sun rises in the morning and the bird sings in the morning. They are obviously correlated, but to test whether the bird's singing causes the Sun to rise, we can check whether the Sun still rises when the bird doesn't sing, or whether the bird still sings on cloudy days when the Sun isn't visible. We would find that the relationship doesn't survive such tests, which means it is association! Likewise, two cars running on a road are associational, not causal: they could suddenly change direction if the road diverges, so mistaking the two cars' relationship for a causal one could suddenly mislead! Factor Investing is similar: if certain factors work, that doesn't mean they are causal; they could be misleading and could be working by chance unless causality is established!

If there is causality, the absence of one event would definitely affect the occurrence of the other. If A causes B, then whenever A does not occur, B should be affected. That needs to be tested.

But we do have to accept that there could be many more unknown causal factors that we might not know of, appearing in the form of random errors!! That is why randomized approaches (e.g. RCTs, the Algo Wheel etc.) are required at a later stage to deal with them, as in the sketch below!!
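Here is a minimal sketch of that randomization idea, with simulated data and a hypothetical hidden confounder Z: the naive observational comparison of Y across X is badly biased by Z, while random assignment of X (RCT-style) recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(3)
n, true_effect = 100_000, 1.0

# Hidden confounder Z drives both who "takes" X and the outcome Y
z = rng.normal(size=n)
x_observed = (z + rng.normal(size=n) > 0).astype(float)   # self-selected treatment
y_observed = true_effect * x_observed + 2.0 * z + rng.normal(size=n)
naive = y_observed[x_observed == 1].mean() - y_observed[x_observed == 0].mean()

# RCT: assign X by coin flip, independent of Z
x_rct = rng.integers(0, 2, size=n).astype(float)
y_rct = true_effect * x_rct + 2.0 * z + rng.normal(size=n)
rct = y_rct[x_rct == 1].mean() - y_rct[x_rct == 0].mean()

print(f"true effect = {true_effect:.2f}, naive observational = {naive:.2f}, RCT = {rct:.2f}")
```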

 

 

 

Point 1) In the one-variable Linear Regression Model (LRM), the most important term is the error term (the residual ε). This represents the randomness/uncertainty component. The error term is the TRUE origin of Black Swan events!

 

Note that in the multi-variable regression formula above, X1, X2 etc. are independent variables, some of which are usually ignored in factor-based investment models.

For example, while doing linear regression analysis on factors, why is it assumed that the different factors like Value (HML), Momentum, Size (SMB) etc. are independent? Are they really independent? No, they are not necessarily fundamentally independent! So how accurate is this regression application for calculating Alpha? The entire calculation of Alpha can itself violate the fundamental assumptions/requirements of Linear Regression.

 

1) The reverse regressions, Y on X and X on Y, are different; their slopes are not reciprocals of each other.

2) Beta is relative: when more independent variables X1, X2 etc. are included, Beta changes. (Both points are demonstrated in the sketch below.)
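A minimal numpy sketch of both points on simulated data: the product of the Y-on-X and X-on-Y slopes equals r², not 1, so the two regressions are not inverses of each other; and the beta on X1 changes once a correlated X2 enters the regression.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5_000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)        # correlated second regressor
y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=n)

# Point 1: reverse regressions are not inverses of each other
b_yx = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)  # slope of Y on X1
b_xy = np.cov(x1, y)[0, 1] / np.var(y, ddof=1)   # slope of X1 on Y
r2 = np.corrcoef(x1, y)[0, 1] ** 2
print(f"b_yx*b_xy = {b_yx*b_xy:.3f} = r^2 = {r2:.3f} (not 1)")

# Point 2: beta on X1 changes once X2 enters the regression
X_single = np.column_stack([np.ones(n), x1])
X_multi = np.column_stack([np.ones(n), x1, x2])
beta_single = np.linalg.lstsq(X_single, y, rcond=None)[0][1]
beta_multi = np.linalg.lstsq(X_multi, y, rcond=None)[0][1]
print(f"beta on x1 alone = {beta_single:.2f}, with x2 included = {beta_multi:.2f}")
```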

 

Statistics is NOT wrong, but the statistician needs to understand the statistical terms well in real-world applications. The error term, which is so often ignored, is the most vital component in real-world financial applications. This is where the mystery lies!!

 

 

 

The general structure of machine learning models, including regression and the others, is the following:

Y = f(X) + ε

It is all about finding a suitable function f(). Here too one can see that, though f could take many forms, the error term ε still exists, and it is the source of Black Swan tail risk.

 

The key issue: while most approaches focus on finding f() by fitting historical data, the focus should be on managing the future error terms (the random, uncertain components). Little effort goes into the randomness (the error); most effort goes into modeling the deterministic aspects while assuming the error/random term will be zero in expectation or the like. This is the core blunder in real-world applications.

That is where real-world model development is needed. Managing the unknowns and random components, in the form of the error, is equally or even more important from the perspective of risk and Black-Swan-type events.

 

Can you predict your own life using regression analysis? How far? The Market is essentially the resultant of all human behaviors! Do not forget that.

In the context of the Black Swan, it is said that the real world is more random than usually understood. Indeed that's right! But then Randomness is also not truly random: it follows the deterministic Laws of Nature. It is basically paradoxical, in the form of the Duality in Nature, Life and Markets.

One more fundamentally important point: stock prices do not really move on the Euclidean space of a sheet of paper; that is a virtual trajectory. In the real world, the stock price moves in a different, non-Euclidean/Riemannian or other quantum economic space-time which captures the quantum human minds involved, and we then superficially/virtually draw its trajectory on two-dimensional paper. A linear regression should be independent of the intrinsic characteristics of the underlying space: whether the regression is done on a Euclidean or a different geometric space (as in machine learning etc.), the true physical relationship must be invariant. Hence parameters like Beta, which depend on the slope of a line in Euclidean space-time, could be fundamentally misleading!! In a different space-time, a Beta calculated using the Euclidean least-squares metric could well be different, whereas the relationship ought to be independent of the underlying space. This is an extremely fundamental issue with conventional regression analysis. In a nutshell, this Beta relationship may live not in the real economic space-time but in the characteristics of the space of the paper on which we have superficially assumed to draw the trajectories of the stocks!

12) Regression, Part 2

 

A) Causation lives in the dimension of time, which LS linear regression does not take into account. It treats the variables statically, in a timeless dimension of space only.

We have to understand, from a real-world perspective, that causation is established in the space-time dimension, not just space. As in the physical world, the cause occurs at, say, T=0 and the effect occurs at some future time T=t. If we ignore the dimension of time and take only spatial locations into consideration, it becomes mere association, not causation. For causation we have to test cause and effect in space-time, not just space. Unfortunately, LS regression is conducted, particularly in factor-based investing and elsewhere in finance, ignoring the time dimension. This omission of time reduces the entire relationship to an association of patterns that could be mere coincidence, not scientifically causal.

By running the following LS regression

Y_t = α + β·X_t + ε_t

we inherently assume that X and Y transmit information to each other causally and "simultaneously" at time T=t, faster than the speed of light. But how is that possible? If they are really causal, then first X should occur at time T=t, and this should cause an effect on Y at some future time T=t+n with n>0. But in the conventional LS regression it is inherently assumed that X and Y are cause and effect at the same time, which is a scientific contradiction in the real world of Nature.

It otherwise proves that the X-Y relationship is associational, based on superficial patterns either caused by coincidence (like two independent cars moving along the road in the same direction, misleading an external observer into seeing them as causally linked) or caused by the hidden causal mechanism of some other variable, known as a confounder. The above LS regression can hold "simultaneously" only if there is a hidden confounder (assuming it is not coincidence!), since information cannot travel faster than the speed of light (and of course it is not "locality at a distance" logically!).

 

So, for a causal proof, the equation has to be in the form of do-calculus. This is a causal intervention: if X is causally set to the value x at time T=0, for example, how does Y behave at a future time T=t, where t can take different values on a case-by-case basis? Symbolically, one studies E[Y_t | do(X_0 = x)].
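A minimal sketch with simulated data in which X truly causes Y one step later: the contemporaneous, timeless regression sees almost nothing, while the lagged, time-respecting regression recovers the causal coefficient.

```python
import numpy as np

rng = np.random.default_rng(5)
n, true_beta = 10_000, 0.8

x = rng.normal(size=n)
y = np.zeros(n)
y[1:] = true_beta * x[:-1] + 0.5 * rng.normal(size=n - 1)  # X causes Y one step LATER

# Contemporaneous (timeless) regression: Y_t on X_t
b_same = np.cov(x, y)[0, 1] / np.var(x)
# Time-respecting regression: Y_{t+1} on X_t
b_lag = np.cov(x[:-1], y[1:])[0, 1] / np.var(x[:-1])

print(f"contemporaneous beta = {b_same:+.3f} (misses the causal link)")
print(f"lagged beta          = {b_lag:+.3f} (true causal beta = {true_beta})")
```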

I have been advocating for a long time that the Time dimension is often ignored in the statistical and mathematical world (the timeless Platonic world), unlike the physical world (space-time). The role of time often appears nonsensical, but at a deeper level it makes a huge difference! Many paradoxes in mathematics also originate, fundamentally, from this omission of Time. For example, the Theory of Relativity treats "simultaneity" as relative, while conventional set theory in mathematics rests on simultaneity as an absolute phenomenon. This is beyond the scope of this paper, but I mention it to show how this foundational misunderstanding percolates down into statistics, and how association in LS-based regression is so often misunderstood as causation!

B) Let's go back to the history of the LS regression method. It was first formulated by mathematicians like Legendre and Gauss around 1800 for the estimation of the trajectories of celestial physical bodies, like our Earth, in Euclidean space.

That is mathematically a good estimation there, as that physical space-time behaves like a Euclidean space-time. But a foundational issue arises when that tool for physical bodies in space is applied to the financial & economic space-time. The financial and economic space-time, driven by quantum human minds, is not essentially a Euclidean space-time; in the real world it may be non-Euclidean, Riemannian, or have some other metric. Statisticians import the observations from that space onto the Euclidean space of a computer screen or paper, draw them, and apply LS regression! But this raises the fundamental question: does the Euclidean space of the paper, and the Euclidean metric calculation, not bias the estimation of financial & economic variables that come from a different, non-Euclidean space-time? I say this because the least-squares method, as developed, is inherently dependent on the Euclidean metric relationship. What if these data live in some other, non-Euclidean space? The relationship should be independent of the underlying geometry of the space on which the data are drawn. The real-world economic & financial space-time is not Euclidean!! This is a point for exploration, since even in ML methods Euclidean metric tools are often applied, whereas non-Euclidean and Riemannian metric spaces might better uncover the relationships.

 

But anyway, let's confine ourselves for a while to the Euclidean setting in which the least-squares regression method is generally derived.

The least-squares regression tool owes its form to the algebraic structure of the least-squares formula, which minimizes the sum of squared differences (we can call them Euclidean errors!). If we change this objective of minimizing the squares to something different, the entire calculation of Beta is different and the value of Beta changes! Beta actually depends on that Euclidean method. And why we minimize the sum of squared errors is itself open to question! How effective is this procedure at accounting for outliers in the real world? Suppose we minimize not the sum of squares but some other function better suited to outliers, for example from a tail-risk perspective. The same data set can yield different values of Beta depending on how the error-minimization function is defined. Beta is not absolute! It is relative! The important concern is that in factor investing and in finance & risk management, the LS Beta is blindly applied to allocation and risk estimation, which has proven disastrous in the real world in extraordinary (outlier) scenarios.
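A minimal sketch of that relativity of Beta, on simulated data with one planted outlier: the least-squares slope versus the least-absolute-deviations slope; changing the minimization objective changes Beta, exactly as argued above.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n = 200
x = rng.normal(size=n)
y = 1.0 * x + 0.3 * rng.normal(size=n)
x[0], y[0] = 5.0, -20.0           # one planted leverage-point outlier

beta_ls = np.polyfit(x, y, 1)[0]  # least SQUARES slope, dragged by the outlier

# Least absolute deviations: minimize sum of |errors| instead of squares
lad_loss = lambda b: np.abs(y - b[0] - b[1] * x).sum()
beta_lad = minimize(lad_loss, x0=[0.0, 1.0], method="Nelder-Mead").x[1]

print(f"true slope = 1.00, LS slope = {beta_ls:.2f}, LAD slope = {beta_lad:.2f}")
```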

 

Further, one more important aspect of LS-based regression is the exogeneity condition that

E[ε] = 0, and more strongly E[ε | X] = 0.

If this exogeneity condition on the error is not satisfied, the whole estimation of Beta is unreliable and biased! This is the fundamental reason why, in real-world finance, Beta is often biased and variable in financial time-series data.

Exogeneity means the conditional independence of the error term, E[ε | X] = 0. If the exogeneity condition is not satisfied, the error is itself a function of some hidden relationship: it may depend on Y, on X, or on some other hidden variables which govern its dynamics, and this in turn makes Beta biased and unreliable, as the two are mathematically related! Hence the internal dynamics of the error have to be established, otherwise they affect the other aspects of the LS regression. The error can even have dynamics of its own, say if the error is itself the outcome of some other variable's regression.

If such is the case, the value of β will be highly misestimated! This actually happens in real-world finance & economics time-series data!

 

It is like the algebraic equation y = m·x + c in which c is secretly a function of x: in that case the fitted m would be highly misleading, biased and mathematically incorrect!

In fact such conceptual mistakes are highly misleading in real-world finance and economics, where beta (technically a slope) becomes too variable, dependent on the data time frame etc., and exposed to Black Swan tail risk, affecting trillions of dollars of investment globally and hence the lives of common people and investors.

The error basically means the random component: all deterministic components of Y that depend on independent variables like X have to be removed from the error term.

So, in the financial/economic world, unless the error satisfies the exogeneity condition, the LS estimate of β will not be unbiased.

If the expected value of the error term above is not 0 (a violation of the exogeneity condition), it means there is still some hidden deterministic relationship to be discovered inside the error term. The "constant" error term in the equation is then not constant at all, but a function of some hidden variables, or even of X itself.

The correct meaning of ε is the uncertain portion of Y: in real-world terms, the error term represents the component that remains unpredictable when we try to explain Y in terms of X causally.

It is as if we intervene with the cause X at, say, time T=0 and measure its effect on Y at time T=t. In the true causal sense these occur at different points in time, not simultaneously, because information takes time to travel. We then ask what uncertain random component of Y is not explained by the deterministic components of the equation; that is what we causally call the error. But the omission of the time dimension makes the whole thing a superficial associational relationship, a mere artifact of the Euclidean metric space!! The way the error is defined, as Y − E[Y|X], shows that the error term is not independently defined.
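A minimal simulation of such an exogeneity failure (a deliberately constructed toy setup): the "error" secretly contains a hidden variable that also drives X, so E[ε|X] is not zero and the LS beta is biased away from the true causal coefficient.

```python
import numpy as np

rng = np.random.default_rng(7)
n, true_beta = 50_000, 1.0

hidden = rng.normal(size=n)                 # hidden variable living inside the "error"
x = 0.8 * hidden + rng.normal(size=n)       # ...which also drives X (endogeneity)
eps = 1.5 * hidden + 0.5 * rng.normal(size=n)
y = true_beta * x + eps

beta_ls = np.cov(x, y)[0, 1] / np.var(x)
# E[eps | X] is not zero here, so LS is biased:
print(f"true beta = {true_beta:.2f}, LS beta = {beta_ls:.2f}")
print(f"corr(x, eps) = {np.corrcoef(x, eps)[0, 1]:.2f}  (exogeneity violated)")
```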

 

This is like the spurious case B1 explained by Prof. Marcos López de Prado in his interesting paper.

“This is most likely valid in the field for which LS regression was formulated, celestial calculation, but not often in the financial and economic space. Real-world data hardly ever satisfy the exogeneity condition.”

Econometricians cannot define the error by deriving it from the algebraic identity Y − β·X; rather, the error term should independently satisfy the exogeneity condition E[ε] = 0, with ε independent of X.

The exogeneity condition effectively means the regression assumes a deterministic set-up, in which the expectation of the error, the randomness, is 0.

The correct causal meaning is: if we do(X) at time t=0, then at time t>0, Y should equal β·X; the error is the part of Y left unexplained in the causal relationship of Y due to X.

Most machine-learning tools focus on the deterministic part, but the need is to focus on the randomness (error) part, its underlying mechanism, and how to manage those unexplained, uncertain components.

A regression in which cause and effect sit at the same time stamp implicitly assumes the role of a hidden confounder.

It is also important to observe that in LS regression the order of the variables matters.

The two regression lines below are not derivable from each other via their respective coefficients. In the general LS regression method,

Y = a + b·X,  with b = Cov(X,Y)/Var(X)

X = c + d·Y,  with d = Cov(X,Y)/Var(Y)

and the slopes satisfy b·d = r² ≤ 1, so d is not 1/b unless the correlation is perfect.

This reveals the internal dynamics of LS regression: the method of minimizing the sum of squared errors is not symmetric in this sense. The order matters, and the values of the slope and error terms depend on the error and are not inter-convertible. This implies that in real-world financial and economic applications, before applying this associational LS regression, one has to be sure about the order; otherwise the derived slopes and the random errors change, leading to different estimations of risk and hence of allocations etc. It also shows that the error itself is relative and order-dependent in an LS-based system, whereas in the real world the random components should be linked to each other if the variables are the same and only their order changes.

Coming back to the commentary: the entire system is being superficially misled by associational, LS-type regression relationships, affecting trillions of dollars globally.

The way the sum-of-squared-errors minimization is defined in the LS system makes it dependent on a Euclidean-type system.

Econometricians assume they are doing causal work, but the mathematical tools used are associational.

If the LS system is changed, Beta also changes.

So question why the LS system at all?

Rather than minimizing the sum of the squares, there could be other, better approaches from a tail-risk perspective. (That is a detailed research topic in itself!)

There is no fixed Beta. The value of Beta depends on how the error terms are dealt with, as in the LS equation, because Beta is derived by minimizing the sum of squared error terms.

 

Let's say there are n data points (x_i, y_i). We are trying to find the regression equation

y_i = α + β·x_i + ε_i,  where i = 1, …, n.

Here the values of α and β are those which minimize the function

S(α, β) = Σ ε_i² = Σ (y_i − α − β·x_i)².

It shows that at time T=0 the error (randomness) terms are figured out, and then at time T=t>0 they are minimized mathematically to derive the (deterministic) beta term! This is an in-sample derivation of the deterministic Beta from the mathematical minimization of a function of the random terms!

Please imagine the role of the Time dimension here.

 

 

Let's say the regression equation is

y_i = β·x_i + ε_i,  so that  ε_i = y_i − β·x_i.

Then β is chosen to minimize S(β) = Σ (y_i − β·x_i)², by setting dS/dβ = 0, which gives β = Σ x_i·y_i / Σ x_i².

Here it shows that the randomness term is treated as a function of the deterministic term, and that is why we take derivatives to find the minimum.

That means at time T=0 we estimate the errors first and then calculate a suitable Beta at time T=t>0. Please note the role of the direction of Time as well, because for causation the Time dimension is the fundamental requirement in the real world. By the way, this is like in-sample estimation, where Beta is found by deriving it from the error terms.

Ideally, what should happen in the real world: we should pre-estimate Beta (the deterministic, expected term) at T=0, then measure the error terms INDEPENDENTLY at T=t>0 and check whether the exogeneity condition E[ε|X]=0 is satisfied. In the existing approach, that INDEPENDENCE is completely compromised, as Beta is derived from the error itself!!

This is foundationally incorrect in principle, leading to a biased, in-sample estimation of Beta.

So the entire traditional Beta estimation in the LS approach is in-sample. It says nothing about the out-of-sample Beta in the real world, in the forward direction of time! Especially for financial data, which is often unstable, this can lead to huge misestimation in real-world, out-of-sample results.

As we saw earlier, the LS approach was formulated for celestial physical bodies, which give classical-world, stable data, unlike financial data driven by the quantum human minds of traders & investors.
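A minimal sketch of that instability (simulated returns with a slowly drifting true loading, a deliberate assumption): the full-period, in-sample beta looks like a single tidy number, while the rolling, window-by-window betas swing widely.

```python
import numpy as np

rng = np.random.default_rng(8)
n, window = 2_500, 250

market = 0.01 * rng.normal(size=n)
true_beta = 1.0 + 0.5 * np.sin(np.linspace(0, 6 * np.pi, n))  # drifting true loading
stock = true_beta * market + 0.01 * rng.normal(size=n)

ols_beta = lambda x, y: np.cov(x, y)[0, 1] / np.var(x)

full_sample = ols_beta(market, stock)
rolling = [ols_beta(market[i:i + window], stock[i:i + window])
           for i in range(0, n - window, window)]

print(f"full-sample (in-sample) beta: {full_sample:.2f}")
print("rolling betas:", [f"{b:.2f}" for b in rolling])
```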

14) The Exogeneity Condition:

The LS sum of squared errors is minimized when the expected error is 0. The LS approach relies on the expected (average) figure, and the average is a hugely misleading statistical tool in a financial world with fat-tailed data.

The issue is that even the exogeneity condition is not sufficient in real-world finance. Even if the expectation, i.e. the average, of the error is 0, there can be a large negative movement followed by a large positive movement, and the system can be driven into tail-risk bankruptcy by the large negative error term without surviving long enough to see the next positive upside movement. So, from that tail-risk perspective, even the exogeneity condition is not sufficient. How these conditions should be reframed is a detailed discussion of its own!
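A minimal sketch with purely hypothetical numbers: the error sequence averages exactly zero, satisfying the average condition, yet a leveraged book is ruined by the negative leg before the offsetting positive leg ever arrives.

```python
import numpy as np

# Errors (period returns) that average exactly zero
errors = np.array([0.01, -0.40, 0.39, 0.0])
print(f"mean error = {errors.mean():.4f}")     # 0.0000: the average condition holds

capital, leverage = 100.0, 3.0
for r in errors:
    capital *= 1 + leverage * r
    capital = max(capital, 0.0)                # equity cannot go below zero
    print(f"shock {r:+.2f} -> equity {capital:7.2f}")
# The -40% leg at 3x leverage wipes the account out; the +39% leg never helps.
```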

15) Beta Is Not Absolute but Relative

Beta depends upon how the error is dealt with, and also on what the other variables in the regression are.

The exogeneity condition on the error is extremely important.

So one probable test is: refine Beta until the exogeneity condition on the error/randomness is satisfied independently!! Only then would Beta be unbiased. Otherwise, Beta is biased, as the error term is itself some function rather than independent! As long as the error does not satisfy exogeneity independently, there are more factors (contributing to different betas, i.e. deterministic components) still to be discovered.

Moreover, the value of Beta depends on how the error-term formula is tweaked, as in the LS form; there can be other forms as well, and Beta differs in each case. It all depends on the method followed.

Beta should be chosen so as to maintain a balance between tail outliers and normal observations, depending on the requirements.

If the error has an intrinsic, hidden further pattern, it makes the existing Beta biased; different types of structure within the error affect the dynamics of Beta and, further, whether there is causality or not.

16) Hidden Dynamics of the Error Terms:

Now, further, there are different possible scenarios: the error term has some hidden deterministic components which affect X or Y, or X affects the error term, or even Y affects the error term. In all these different cases the validity of Beta differs.

 

Say for example

Y = β·X + ε.

But suppose the error term is biased and not random, say

ε = γ·Z + η   (for some hidden variable Z),  or even  ε = δ·X + η.

In that case, the entire estimation of β will be highly biased, since β is derived from ε, which is itself implicitly a function of different variables rather than independent.

 

Biasedness of Beta

It is like a three-variable algebraic equation in x, y and z: if you assume z is constant and treat it as a quadratic in x and y alone, the result can be technically wrong!! Similarly, until the error term is exogenous and independent, the regression will lead to wrong Beta estimation!! That is why independence is of huge importance in regression.

If the error is a function of some hidden variables, then Beta will not be correct. So how do you make sure the error is independent? Simply running a regression without independence of the error is technically wrong and can lead to bias and misestimation, just as treating a function as a constant in functional algebra gives wrong solutions.

In Y = m·X + c, the term c needs to be strictly constant; it cannot be a variable related to X or Y, or else m will be incorrect. One cannot treat a function as a constant and do functional algebra; it is technically wrong.

So when doing such statistical regressions, make sure the error term is not a function of some related variables, or else Beta will be biased and unstable over time, since Beta depends on the error.

That is the reason Beta becomes unstable. It can be tested in the real world, where we see that the measured value of Beta keeps changing rather than staying a fixed constant! One can verify this experimentally on many financial time series and check how Beta has its own inherent dynamics: Beta keeps changing dynamically over the time dimension, or relatively whenever new variables are included!

The error will change over time if it is itself a function, i.e. if the exogeneity conditions are not satisfied. And if the error is itself such a function, not satisfying exogeneity, it can be the source of the Black Swan.

17) Error Terms & Simpson's Paradox: Simpson's Paradox Reveals a Lot about Reality!

 

A) It reveals that the DETERMINISTIC parameters like Alpha and Beta themselves have randomness over time!

B) Further, the existence of these parameters and error terms is RELATIVE!! The LINE OF BEST FIT itself behaves geometrically randomly over time! What was the direction of the error terms eventually becomes the line of best fit, and what was the line of best fit becomes the error! This reveals a fundamental aspect of relativity and randomness over time for an observer.

It reveals the fundamental duality of Randomness & Determinism existing in the Universe, in Life, and hence in the Markets.

The Universe, and hence its subset the Markets, has both components: Random & Deterministic. One cannot absolutely separate the two; they exist relatively and simultaneously. The Deterministic (the line of best fit) becomes Random (the error term), and Randomness (the error terms) becomes Deterministic (the line of best fit).
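A minimal sketch of Simpson's paradox on simulated subgroup data (hypothetical numbers): each subgroup has a clearly negative slope, yet the pooled line of best fit is positive, so the "deterministic" Beta flips sign purely with how the data are grouped.

```python
import numpy as np

rng = np.random.default_rng(9)
slope = lambda x, y: np.cov(x, y)[0, 1] / np.var(x)

xs, ys = [], []
for center in (0.0, 2.0, 4.0):                  # three subgroups along a rising axis
    x = center + rng.normal(scale=0.3, size=300)
    y = 2.0 * center - 1.0 * (x - center) + rng.normal(scale=0.2, size=300)
    print(f"subgroup at {center}: slope = {slope(x, y):+.2f}")   # ~ -1 in each group
    xs.append(x); ys.append(y)

x_all, y_all = np.concatenate(xs), np.concatenate(ys)
print(f"pooled slope = {slope(x_all, y_all):+.2f}")              # positive overall
```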

One has to be very careful while making decisions based on regression analysis, particularly while ignoring the error terms! Blindly following a fixed line of best fit to make investment decisions can be highly misleading and disastrous in the real world!!

So this points towards an extremely fundamental issue with linear regression analysis: parameters like Beta, Alpha etc. can keep changing depending on the data set, and there are no fixed values.

First, two things tracing the same trajectory might not be correlated in the true sense. For example, two cars moving along a road appear correlated to an external observer in the sky. But later the road may diverge, both cars may change direction, and they will fool that external observer who had assumed, from watching their past trajectories for a long time, that they were correlated!! Similarly in markets!!

The point is to establish scientific causation first; then one can rely on regression scientifically, to some extent, as long as the error is managed well.

For that causation, one has to block the non-causal paths, observing Y & X while controlling the other variables, or checking how Y behaves when X is present and when it is NOT. If X causes Y, then when X stops, Y must be affected or stop!

###

 

We have to understand that regression is fine and can help with prediction at times, but given the occasionally large, unexpected error terms, it can also be highly misleading and disastrous. So what one should optimally do is to expect along the regression line of best fit while still being prepared to manage the random error term, which can be huge in a world full of complexity.

18) LS Regression, Beta, Error, Simpson's Paradox, P-Hacking, Reproducibility Crisis: Part 3

 

I think Black Swan theory itself could be exposed to a Black Swan without some causality, a self-referential problem!! Causality has to be mixed in to diversify the Black Swan concepts.

In this physical Universe there is an ongoing duality between Predictability and Randomness, consequently inherited by the Markets. Human traders/investors are like energy particles driving the markets. There has to be a balance between the causality approach and the randomness approach.

The LS square-of-error objective is like variance minimization. It could instead be skew minimization, kurtosis minimization, or some other approach aimed at tail risk. More robust approaches can be explored in detail, but they are beyond this paper for now.

 

 

Y = a + b·X + ε   and   X = c + d·Y + ε′

In the LS method, with the two equations above, the randomness (error) term is quite different depending on the order of X and Y, because one fit minimizes the vertical squared errors and the other minimizes the horizontal squared errors!

 

The reason the coefficients in the two equations above are not interrelated is that the randomness (error) term differs between the two equations, affecting the Beta of each.

If the randomness (error) term were really a constant, like c in y = m·x + c, the two equations would be reversible into x = (y − c)/m, with the coefficients exactly inter-related. But that is not the case. The fundamental reason is the way the LS approach works, by minimizing squared errors, together with the fact that the error term is not a constant but some hidden function of variables; hence the coefficients are not inter-related!

If the LS regression really were of the form y = m·x + c with c constant, then swapping the order of y and x would not change m and c, i.e. the beta and error terms.

But in the LS case the error is not constant; it is itself a function. That makes the fit irreversible, and both beta and error change when Y-on-X is reversed to X-on-Y. This is because the LS equation is not deterministic: it has randomness components as well, and the error determines beta through the minimization of the squared error terms. So both are functionally related; in the LS framework, the Randomness (error) derives the Deterministic (beta) component!

This indicates a very fundamental issue in real-world application. One has to be careful, when applying LS regression, about whether X is to be regressed on Y or Y on X, as the Beta and error terms will differ and are not inter-convertible!

 

“In the LS system, the Deterministic term (Beta) is derived mathematically from the Random term (the Error)!”

This also shows that the error is not independent of Beta, and Beta is not independent of the error.

 

The deterministic beta term is a function of the randomness (error) term in LS, found mathematically by sum-of-squares minimization. This inherently means the deterministic Beta is derived from the randomness (the error term)! And since, in real-world and especially financial data, the LS error terms keep changing, being hidden functions of variables, the (deterministic) Beta term consequently keeps changing too, driven by the error. I explained this mathematically in the previous section.

 

 

The foundational issue is the direction of Time. In LS regression analysis based on historical data, in the backward direction of time, we already know the errors (the residuals, calculated as y − β·x), which makes the error a dependent function of beta via the residual formula; so the error is no longer truly independent.

 

But in the real world, in forward time, we do not know the error first: we estimate y based on x, and the uncertain component of y given x is then termed the error. So in physical causal terms we do not know the error in advance, and its expected value should then be 0 (the exogeneity condition), independent of X and Y.

 

One needs to minutely understand the role of the arrow of time and the independence of the error from the deterministic term. It requires a higher level of imagination, even for learned, traditionally expert minds, to see how the time dimension is ingrained in mathematical and statistical developments; it is often ignored and causes fundamental conflicts.

So, in the real-world forward direction, the expectation (beta) and the error (randomness) occur in a different order: first the deterministic term is expected, then the independent error term arrives.

But here, in backward-time LS, we already know the error (the random component) and then derive the beta (the deterministic part) from it by sum-of-squares minimization.

The true, unbiased beta would be one for which the independently calculated error term, in the forward direction of time, satisfies the exogeneity conditions.

Not one where we define the system backward, using LS, so that the exogeneity condition is forcefully satisfied and beta is modified accordingly.

This is the crucial difference between a causal structure and an associational structure. As in the physical world, causation lives in the dimension of Time, and when we ignore the Time dimension it becomes associational!

This is like randomly assigning

do(X = x),  as in an RCT,

and then asking what causal component of Y is not explained by X; that is the error, and it should satisfy the exogeneity condition in the forward direction of time.

This randomness in the forward direction of time is wiped out when doing LS regression in the backward direction of time.

So, in the true sense, Beta is not a slope in the forward direction. Beta is made a slope in the backward analysis, where the error is already known; in the correct causal structure, Beta is related to the interventional effect

β ≈ E[Y | do(X = x + 1)] − E[Y | do(X = x)].

 

In the LS calculation, the Error (random) term is made a function of the Beta (deterministic) term; or, equivalently, the Beta (deterministic) term is derived by minimizing the Error (random) term.

 

In fact this is a generic issue with statistical models: they are usually backward in the direction of time and static, whereas they need to be dynamic and forward-looking, since the market is like a physical energy system, driven of course by human behavior, which is also an energy system!!

 

 

 

 

 

 

19) A Paradox in the LS Regression Methodology:

 

It assumes that Beta (the deterministic term) originates from the Error (the randomness term)! Hence the deterministic term is also random if the error is random; or, if it claims Beta to be really deterministic, then the error term (the random one) is also deterministic!

 

It is a paradox in itself!

This paradox points to fundamental loopholes in the LS method of regression: how Beta is derived mathematically, in a biased way that ignores the time dimension, and whether the error terms are truly independent.

In the real-world forward direction, the causal world, Randomness and Determinism exist independently and are not deterministically derived from each other!

 

In the causal real world, in the forward direction of time, first the expectation (the deterministic component) is fixed, then the error (the random component) is calculated, and then it must satisfy the exogeneity conditions independently if the LS regression is correct and unbiased.

In associational, backward time LS, the error term is figured out first, and then, by minimizing the sum function, the expected beta term (assumed earlier) is discovered by trial in the backward direction of time and adjusted accordingly!

This traditional LS regression is fundamentally like in-sample testing.

In fact the correct beta should be figured out by out-of-sample testing, checking which method should be chosen relative to the LS approach, and how the error terms satisfy the exogeneity condition.

 

The two approaches are technically and fundamentally different.

The role of the Time dimension is extremely important, and it is what leads to this difference!!

In backward LS regression, you derive beta (the deterministic term) from the error (the random term)! In the forward, real causal world, you fix Beta (the expected deterministic term) and then derive the error (random) term independently.

 

I am repeating this time and again to bring to the conscious state of readers' minds what they have been doing for around the last 50 years.

To use regressions for prediction or to infer causal relationships, respectively, a researcher must carefully justify why existing relationships have predictive power for a new context, or why a relationship between two variables has a causal interpretation. The latter is especially important when researchers hope to estimate causal relationships using observational data.

The result of this in-sample testing is clearly visible when testing on financial data: the measured values of Beta keep changing unstably over time in the real world, influenced by the error terms in the forward direction. Beta, which is supposed to be deterministic, also behaves randomly over time!

Beta changes over time, and the data show that things are actually more random, not as deterministic as the LS regression expects them to be.

How can physical things in 3 dimensions be fully explained in 2 dimensions?

How can the Time dimension of the real, physical, causal world be explained in a 2-dimensional, timeless (mathematical) space? A big incompatibility, visible in LS regression: Time (the causal) becomes timeless (the associational) for exactly this reason!

That is why Beta keeps changing over time, showing randomness rather than determinism (cf. Simpson's paradox).

This means LS regression inherently assumes that Randomness and Determinism are derived from each other algebraically, using fixed deterministic rules, via the LS method. But is that the case in the real causal world? No! These things are extremely fundamental and not to be overlooked, for this is the reason linear regression sometimes misleads dangerously in the real world. This misconception affects investment allocations throughout portfolio management, and miscalculating the risk factor (beta) can lead to disaster!

In the real causal world, the random and deterministic components come from different origins of information!

Key Point :

That is why, if one uses LS-type regression in the real causal world, one should simultaneously apply RCT (Algo-Wheel-type) methods to handle the random (error) term alongside the causal deterministic terms. The model error and the bias in beta estimation under LS regression have to be hedged by strategizing the randomness of the error term.

Without a proper strategy to manage the random error term, it is highly dangerous to trust Beta, which is nevertheless what is routinely done in day-to-day financial investment decisions.

Skill vs Luck, or is Managing Luck also a Skill?

It is often framed as Skill vs Luck when measuring Alpha, but I state that managing luck (by strategizing randomness) is itself an exceptional skill!!

That is where randomized tools like RCTs, the Algo Wheel etc. come in.

So either assume that Randomness and Determinism are linked deterministically by a mathematical formula, or accept that LS regression is technically incorrect!! It rules out any independent mechanism for Randomness and Determinism.

This means LS regression treats the world as having both random and deterministic components.

But in Real World Random component has its own  intrinsic  determinism also but that’s not derived from the  external deterministic components like in LS Regression.

18) Simpson's Paradox:

 

Error and Beta in LS are relative: direction-wise, Errors become Beta and Beta becomes Error in different reference frames. This also shows that LS regression tries to describe a two-dimensional system of points by a one-dimensional Beta measure, while ignoring the direction of the Error, which is complementary to Beta. Either that, or one needs to introduce a two-dimensional Beta in place of the one-dimensional one.

In fact, this shows that the LS regression approach is not suitable for deriving an absolute relationship. There is a fundamental instability in a Beta derived deterministically, by mathematical formula, from the Errors.

This also questions the validity of such regression in the real world. From another perspective, Simpson's paradox most fundamentally shows the duality between the Random and Deterministic terms, which are interchangeable, both being used to figure out the relationship between the variables.

 

The measurement of Beta in LS Regression is data-dependent and time-dependent, and it is unstable, especially in financial data.

There is no fixed absolute relationship, and Beta (the deterministic term) behaves quite randomly, influenced by the error terms.

In the LS method, Beta is derived from the Error, so in the real world one needs to manage the error as well, along with Beta. Usually only Beta is taken, not the error terms, but the origin is the error only in this flawed real-world LS approach.

Simpson's Paradox reveals the importance of the causal relationship: one should reject non-causal betas and accept causal betas, just as one rejects extraneous roots in algebra.

One has to be extremely careful applying this in social decision-making as well, where beta-type relationships can mislead us.

Rather than keeping Beta one-dimensional, perhaps it can be made two-dimensional, along the x and y coordinate axes…

*Two dimensional concept of Beta and Resultant rather than One dimensional*

There would be a dynamic equation for the Beta and Error components as well.

Two-dimensional picture of LS Regression: Simpson's paradox.

One-directional analysis doesn't fit the two-dimensional movement of data.

Beta and Error are relative: time- and data-frame-dependent, not absolute.

So, it should be stated like this: the Beta of the stock over the last one month is 0.6, the Beta of the stock over the last 6 months is 1.2, and so on.


19) Beta as the Time-Dependent Figure

 

Beta should be expressed in the dimension of time, like β(t, t+n), denoting its value over the period t to t+n for financial time series data.

For example: the Beta of a stock over 1 month, 6 months, 1 year, 10 years, and so on. This is because the values of Beta behave far more randomly than expected, influenced by the dynamics of the errors in LS regression, especially for financial time series data. Beta keeps changing unstably over time. Hence, simply representing Beta as a constant term, independent of the time dimension, is highly misleading in the real-world causal financial world!

The traditional definition and representation, say "the Beta of a fund is 1.2", is fundamentally misleading and incorrect in the real world, given the way LS Regression is derived.

 

The fundamental role of the Time dimension is extremely important, and it is traditionally ignored in the investment industry.

Since the physical world is causal in the dimension of time, the statistical and mathematical tools used in finance have to essentially incorporate the dimension of time. LS Regression parameters that ignore the time dimension rule out the causational and make it associational. Incorporating time should be mandatory, unlike the way the financial investment and risk industry has traditionally operated.

 

Beta is not static, as it is usually treated… Simpson's paradox arises from ignoring the time dimension of the data frame. Even Beta is a time-dependent function, not a constant as in LS; it is not to be treated as static, as usually done, but as dynamic.

So, as it's a two-dimensional thing, one has to study both dimensions (Error and Beta), the error as well.

One needs to study the dynamics over time, not just the statics.

The most fundamental reason is the ignoring of the Time dimension when applying mathematics and statistics to real-world physical problems, where Time is crucial for an event to occur.

Psychology and social applications of Simpson's paradox.

Error is also a direction….

Errors and Beta are like the X and Y axes… they should be independent.

*The dynamics of statistics over the Time dimension can't be captured by a static snapshot of statistics.*

Beta and Error are functions of time; ignoring that leads to many paradoxes and conflicts.

Simpson's Paradox also shows the fundamental issues with the LS regression method.

Taking investment and risk decisions in the real world based on a static view could be highly misleading.


20) Simpson's Correlation Paradox

Say, for example: X and Y show positive correlation over 1 year, statically, but they are in fact negatively correlated in their daily dynamics.

Or they show negative correlation over 1 year but are positively correlated in their daily dynamics.
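Here is a small hedged Python illustration (entirely synthetic: the monthly-regime construction and every parameter are assumptions chosen to make the effect visible) in which daily co-movements are strongly negative while the monthly aggregates correlate positively:

```python
import numpy as np

rng = np.random.default_rng(1)
months, days = 24, 21

x_parts, y_parts = [], []
for _ in range(months):
    mu = rng.normal(0, 0.008)        # monthly "regime" mean shared by both series
    z = rng.normal(0, 0.02, days)    # daily deviations, negatively shared
    x_parts.append(mu + z)
    y_parts.append(mu - z + rng.normal(0, 0.002, days))

x, y = np.concatenate(x_parts), np.concatenate(y_parts)
daily_corr = np.corrcoef(x, y)[0, 1]
monthly_corr = np.corrcoef(x.reshape(months, days).sum(axis=1),
                           y.reshape(months, days).sum(axis=1))[0, 1]
print(f"daily correlation:   {daily_corr:+.2f}")    # strongly negative
print(f"monthly correlation: {monthly_corr:+.2f}")  # positive
```

The sign flips purely because of the time scale of aggregation, which is exactly the Simpson-type reversal described above.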

Simpson-type contradictory statements: the role of the causal Time dimension.

And correlations can take many different values over different times and data sets.

Real-World Judgement can have huge variation

There is no value of Beta without the Error terms.

*Simply taking out Beta (the deterministic component, the direction) is hugely misleading. Hence handling the Error (the random component) is crucial and complementary.*

It's a two-dimensional thing, so taking one and ignoring the other could be dangerous.*

 P-Value Hacking

Exogeneity condition: E(ε) = 0. But in the real world, if the errors are very large both positive and negative, we can still have E(ε) = 0, and that doesn't mean things would work out in the real world, since any single large error can bankrupt the system even if the average is 0.*

 

The Least Squares Method is basically a Minimum Variance Principle, but variance may not always be the true statistical representation, especially in real-world finance.

Hence the exogeneity condition alone wouldn't be helpful in the Real World!!

 

Even if E(ε) = 0, it could bankrupt you in the Real World. What matters is not the expected value but the dynamics of the error term. Later on we will also see that this definition of the Expectation Operator could itself be fundamentally flawed in the real world.
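A small hedged simulation of this point (the leverage level, tail thickness, and ruin threshold are all assumed for illustration): the shocks average to zero, yet a leveraged book frequently hits ruin along the way.

```python
import numpy as np

rng = np.random.default_rng(2)
trials, steps, leverage = 10_000, 252, 5

# Fat-tailed daily shocks (Student-t, df=3) scaled to ~1% size; E[shock] = 0.
shocks = rng.standard_t(3, (trials, steps)) * 0.01
print(f"average shock: {shocks.mean():+.6f}")        # ~ 0, as exogeneity wants

# A leveraged book compounding those zero-mean shocks.
wealth = np.cumprod(1 + leverage * shocks, axis=1)

# "Ruin" here = losing half the capital at any point (an assumed threshold,
# standing in for a margin call or forced liquidation).
ruin_prob = (wealth.min(axis=1) <= 0.5).mean()
print(f"P(ruin) despite E(error) = 0: {ruin_prob:.1%}")
```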

 Beta is meaningless unless Error is managed well !!

 

Deriving Beta from the Error (residual) and then calculating R² using both of them is a totally in-sample exercise; even a high R² can't be relied upon, since Beta is derived from the residuals only…

P-value Hacking: backtesting randomness and selecting the best cases based on testing random parameters…

In the context of the Reproducibility Crisis described by Prof. Prado, that's not independence. It should be that you take random trials and then see how you perform, not that you try 1000 trials and select the best. The "best" varies over time: the best at time T = 0 might not equal the best at time T = 1.

It's like using the in-sample result to select the best and leave the rest.
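A minimal sketch of that selection bias (pure-noise strategies, all parameters assumed): the best of 1000 skill-free strategies looks brilliant in-sample and evaporates out-of-sample.

```python
import numpy as np

rng = np.random.default_rng(3)
n_strategies, n_days = 1000, 252

# 1000 strategies that are pure noise: zero skill by construction.
in_sample = rng.normal(0, 0.01, (n_strategies, n_days))
out_sample = rng.normal(0, 0.01, (n_strategies, n_days))

def sharpe(r):
    return r.mean() / r.std() * np.sqrt(252)   # annualized Sharpe ratio

best = max(range(n_strategies), key=lambda i: sharpe(in_sample[i]))
print(f"best in-sample Sharpe:       {sharpe(in_sample[best]):+.2f}")   # looks great
print(f"same strategy out-of-sample: {sharpe(out_sample[best]):+.2f}")  # ~ 0
```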

 

We will explain later in this article, in the section on Convexity, how each random trial should have some cost involved and how that relates to the Reproducibility Crisis!!


21) Equity Risk Premium Puzzle: It is indeed a bug in the statistical structure of Factor Investing based on the Linear Regression Analysis we discussed earlier. Capturing the dynamics of human behavior deterministically through linear regression, while ignoring the randomness component, is deeply flawed! The puzzle arises because we misunderstand the concept and foundation of Regression Analysis. Any regression-based model used to determine this risk premium will have this fundamental concern. It is related, technically, to the fundamental issues of randomness and structure in the regressions on which CAPM and factor-based models are built: the foundational issue lies in the statistical tools themselves, on which such model "puzzles" have been developed!!

So, the ultimate solution to one of the most important historical puzzles in finance is to look into the flawed conceptual understanding of the statistical foundation which gave birth to the so-called puzzle. The origin of the puzzle lies in the loopholes and misunderstandings in the statistical infrastructure. Once that is understood, the puzzle ceases to exist automatically! So, this puzzle in fact hints that the foundation must be revised!

I must again categorically mention: when we get an extraneous solution to a mathematical equation, it is not that the mathematics is incorrect; rather, the assumptions made while applying the mathematics are incorrect. Similarly, here we need to revisit the flawed assumptions of factor investing and of the statistical infrastructure like Regression. Coming to Factor Investing Analysis, as the byproduct of Regression Analysis:

 

Equity Risk Premium Puzzle :

 

The main reasons behind this Equity Premium Puzzle are the fundamental structure of the regression-based CAPM model, the misestimation of risk, and the ignoring of the Time dimension.

Risk is often measured in terms of Beta, based on historical data in the backward direction of time, using the same flawed LS regression estimation techniques. Even traditional tools like Max Drawdown don't capture the true risk in the real world. The fundamental reason is that the Time dimension is ignored. If we look in the forward direction of time, there are many possible scenarios an investor is willing to bear when investing in equities, but not all such possibilities actually occur. Say, for example, that in a time of uncertainty an investor is willing to take the risk of the market falling even -50%, due to some war or viral outbreak. But somehow the market didn't fall that much, only -25%. So, when a regression is run on historical data, or even when max drawdown is calculated on historical data in the backward direction of time, this possible forward scenario of -50% (which didn't occur) is not included, even though the investor actually took that risk in forward-time uncertainty.

Existing conventional tools based on Regression Analysis, Max Drawdown, Sharpe Ratio etc. often don't take into account the possible real future risk that didn't occur but was a genuine possibility in the forward direction of time! In fact, that was the real-time risk an investor took in the forward direction, but it is omitted in the backward analysis of time. This can hugely underestimate (or overestimate) the risks.

So, the fundamental reasons for the Equity Risk Premium puzzle are:

·      Misestimation of risk in the real world by ignoring the Time dimension and relying on traditional backward-looking tools, e.g., Beta, Sharpe Ratio, Max Drawdown etc.

·      Possible future randomness gets omitted in backward analysis!

·      Fundamental issues with the LS-Regression-based approach (the backbone of traditional Factor Investing).

·      Also, the omission of fat-tailed outlier data, which can drastically change the estimated risk parameters like Beta.

Hence, unless Risk is truly measured in the forward direction of time (not with historical tools), the estimation of the Equity Risk Premium will remain inaccurate.

You are measuring the estimate based on backward-looking data; in the forward real world, risk is far greater than it appears in backward-looking data, due to uncertainty.

 

*Because equity risk premiums require the use of historical returns, they aren’t an exact science and, therefore, aren’t completely accurate.*

 

The estimates also vary largely with the time frame and method of calculation, e.g., the removal of fat-tailed negative outlier data.

Relying on a CAPM-type regression model for the historical analysis of the equity risk premium, based on Beta etc., is not reliable enough.

Risk should truly be seen in the forward direction of time, not via backward analysis of historical data using regression models.

In the real world we see how much risk we actually take on in equity investing over the risk-free rate etc.

Measuring Risk in the form of Beta, using regression on historical data, is not correct in the Real World!!

Real-world risk in the forward direction of time is far greater than the backward analysis of risk using regression tools suggests.

A lot of the randomness that appears in the forward direction of time in the real world never gets captured in backward analysis.

For example: risky scenarios that could have materialized while investment decisions were being taken in the forward direction of time, but which didn't actually happen, will not get captured, leading to an understated risk calculation in historical-data regression analysis.

Therefore: misestimation of the actual risk in the forward direction.

·      Risk is hugely misunderstood in backward analysis: in the forward direction, many disastrous possibilities had a chance of occurring but didn't, and hence never get truly captured in the risk measurement.

An equity risk premium is based on the idea of the risk-reward tradeoff. It is a forward-looking figure and, as such, the premium is theoretical. There is no real way to tell just how much an investor will make, since no one can actually say how well equities or the equity market will perform in the future. Instead, an equity risk premium is estimated as a backward-looking metric: it observes stock market and government bond performance over a defined period and uses that historical performance to estimate the potential for future returns. The estimates vary wildly depending on the time frame and method of calculation.
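To illustrate just how window-dependent the estimate is, here is a tiny Python sketch on simulated annual returns (the distributional assumptions for equities and bills are mine, purely for demonstration):

```python
import numpy as np

rng = np.random.default_rng(4)
years = 100

# Assumed data-generating process: equities ~ N(8%, 18%), bills ~ N(3%, 2%).
equity = rng.normal(0.08, 0.18, years)
bills = rng.normal(0.03, 0.02, years)

# The "historical ERP" depends heavily on the lookback window chosen.
for window in (10, 20, 50, 100):
    erp = equity[-window:].mean() - bills[-window:].mean()
    print(f"ERP estimated over last {window:3d} years: {erp:+.1%}")
```

Even though the true premium is fixed at 5% by construction, the windowed estimates scatter widely around it.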

Future Beta vs. Historical Beta

An equity risk premium is an excess return earned by an investor when they invest in the stock market over a risk-free rate.

This return compensates investors for taking on the higher risk of equity investing.

Determining an equity risk premium is theoretical because there’s no way to tell how well equities or the equity market will perform in the future.

Calculating an equity risk premium requires using historical rates of return.

The CAPM model itself, being based on historical regression, needs fundamental change.

How exactly to calculate this premium is disputed.

Equity Risk Premium = R_a − R_f = β_a (R_m − R_f)

But how do you calculate future beta in the real world? Not based on historical analysis.

By the way this is associational not causational …

The expected equity risk premium likely depends on hidden causal factors, not just on this associational, regression-based beta, where the role of confounders also comes in.

The future-return Equity Risk Premium actually depends on the causal mechanism, causal factors, and randomness, not on associational models like factor models and CAPM.

Future expected fair valuation drives the return causally and scientifically, not this CAPM-type associational, superficial statistical theory.

One needs to understand scientifically why an investor who takes systematic risk should be compensated with return. Why? What is the science behind this? This will be explored later.

A missing causal confounder variable actually affects the equity risk premium. It is hidden, but it could bias the entire risk calculation!


22) Newton's Laws of Motion in Markets & Quantum Laws for Markets

Newton's Laws of Motion for Markets: the Principle of Least Action (Paul Dirac) and Feynman's Path Integral approach in the Quantum & Classical worlds.

Laws of Motion of Stock Prices in Quantum World:

 

Net Resultant Valuation (Energy) generates Forces (Demand & Supply, i.e., Buy and Sell Orders, i.e., Order Flow), which finally lead to Motion, i.e., a Change in Momentum.


Newton’s Laws of Motion in Microscopic world :

Consider a system of N classical particles. The particles are confined to a particular region of space by a "container" of volume V. The particles have finite kinetic energy and are therefore in constant motion, driven by the forces they exert on each other (and any external forces which may be present). At a given instant in time t, the Cartesian positions of the particles are r_1(t), …, r_N(t). The time evolution of the positions is then given by Newton's second law of motion:

m_i · r̈_i = F_i(r_1, …, r_N)

where F_1, …, F_N are the forces on each of the N particles due to all the other particles in the system, and r̈_i = d²r_i/dt², i.e., the second time derivative of the position.

Newton's equations of motion constitute a set of 3N coupled second-order differential equations. In order to solve these, it is necessary to specify a set of appropriate initial conditions on the coordinates and their first time derivatives, {r_1(0), …, r_N(0), ṙ_1(0), …, ṙ_N(0)}. Then the solution of Newton's equations gives the complete set of coordinates and velocities for all time t.
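As a toy illustration of integrating exactly these equations, here is a velocity-Verlet sketch in Python for two particles with an assumed Hooke-type pairwise force (the masses, spring constant, and initial conditions are arbitrary choices):

```python
import numpy as np

def forces(r, k=1.0):
    """Hooke-type attraction between two particles: F_i = F_i(r_1, r_2)."""
    d = r[1] - r[0]                      # displacement between the particles
    return np.array([k * d, -k * d])     # equal and opposite forces

m = np.array([[1.0], [2.0]])             # masses m_i
r = np.array([[0.0], [1.5]])             # initial positions r_i(0)
v = np.array([[0.0], [0.0]])             # initial velocities r'_i(0)
dt = 0.01                                # integration time step

# Velocity-Verlet integration of m_i * r''_i = F_i.
for _ in range(500):
    a = forces(r) / m
    r = r + v * dt + 0.5 * a * dt**2         # position update
    v = v + 0.5 * (a + forces(r) / m) * dt   # velocity update with new forces
print("positions after 5 time units:", r.ravel())
```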

For an investor, based on the valuation, the causal force of demand for a stock is created by the trader/investor. The Valuation is like the total Potential (in Physics) that generates the force on the stock.

Valuation is relative, depending upon the investor's perspective. Valuation (the Potential) generates the market forces. Yes, this is as per the physical laws of Nature.

So, there would be different valuations for different investors/traders (who are like human-brain quantum particles!), causing different forces.

Let F (say) denote the equivalent force caused by the respective potential (Valuation).

 

 

So, the causal process: the market is a system of different investors.

Each investor has a relative valuation. Every non-zero valuation applies either an upward or a downward force.

 

Causal Process due to Forces in the Markets.

The different valuation-generated forces attract and cause different investors/traders to place Buy/Sell orders

(i.e., demand and supply forces). This is scientifically like the gravitational force on energy, or the law of attraction: a higher value tends to attract smaller values towards itself.

The resultant Order Flow Imbalance is caused by the sum of all these buying and selling orders (caused by the valuation-generated demand-supply forces).

The Order Flow Imbalance is like Electric Flux, i.e., the number of order lines passing through, due to the valuation-caused field.

(Note: I refer here only to genuine orders intended to be executed, not false orders; for those there will be more causal analysis.)

 

 

The resultant Order Flow (due to the valuation-generated forces) causes the momentum change over the next time interval. [Newton's Laws of Motion]


Now, as the stock price gains momentum due to these forces, the price changes via an F = m·a mechanism (a Newton's-Laws-of-Motion-type mechanism).

The price at time t+1 is caused by the momentum generated at time t, and then the updated valuations of the different investors change relatively, due to the change in price and momentum along the dimension of time, price and momentum being two of the many factors affecting real-world valuation at any time. (This has been explained in detail.)

 


Explanation :

Valuation at any time is a function of many multiples and factors, and of their dynamics over time, for every investor. There is a resultant of all the valuations. It is an open-ended process.

 We have explained earlier

 

So, to be specific about the Momentum-Value relationship:

a) Valuation at time t is also dependent on momentum at time t-1, i.e., the previous momentum. The mechanism for valuation is mentioned below.

b) Valuation at time t drives the momentum (velocity) at time t for t+1, through the force it generates.

c) As the stock price goes into motion (momentum), it adds value for different investors, leading to a change in the Order Flow Imbalance, which is caused by more and more forces in the markets as the valuation changes over time.

The number of field lines crossing is akin to the number of trading orders of traders/investors, caused by valuation changes acting as the potential. It depends upon the current strength caused by the potential/force/valuation changing over time.

Order Flow is essentially, and scientifically, the market's quantum force. One needs to study the Order Flow (the Force) to be able to understand the motion of the resultant stock price, just as we use Newton's Laws of Motion to figure out the trajectory of a body. One can apply ML to study Order Flow dynamics, and thereby the quantum forces driving the market.

So, based on the laws of quantum motion affecting human traders' behavior, and hence the stock price, one can study in the ORDER FLOW dynamics how the forces evolve. This is where ML could be useful. I state and claim, based on my experiments, that it is similar to an energy system of particles. This is linked with the century-old problem: essentially, financial Brownian motion would be similar to geometrical Brownian motion, but in a different sense! This connects to what I have explained in other parts: the human brain follows the laws of quantum motion, which are the advanced version of the classical-world laws of motion. So, from microscopic Newtonian dynamics, the Boltzmann and Langevin equations are derived for the mesoscopic and macroscopic dynamics.

Hence I state and claim that one can experimentally verify that financial Brownian motion (the quantum living version) is a more advanced form of physical Brownian motion (the quantum non-living version), although both are fundamentally energy systems, in different forms. This originates from the fact that human-brain quantum behavior is the advanced version of the classical laws of motion, where the former is variable while the latter is constant. But yes, all these classical laws of motion are valid and originate from the quantum world only. Moreover, the quantum laws of living beings have more variable components than those of non-living things, but all of them are highly causal. This is beyond the scope of this paper.

But for this paper, all that can be well studied and verified in the behavior of Order Flow dynamics! It is fundamentally energy dynamics.

The higher the valuations for the different investors, the more order-flow imbalance a stock would attract, assuming the investors have funds available.

 

To note: Momentum is just one factor among many. What matters is the resultant of all of them, as with physical forces. These are similar physical forces attracting investors through their quantum force on the investors' brains (neurons).

 

Step 2: At any moment in time, V(upward/buy orders) tries to push the price up and V(downward/sell orders) tries to push the price down.

This is the tussle between the two that drives the market, eventually converging to the equilibrium state.

Step 3: The momentum drives prices further up due to the net upward force. Gradually, based on the combined effects, the price moves. The distance travelled by the stock price can be calculated from the change in momentum caused by the forces.

Step 4: As the price moves up and the momentum increases, it affects the valuation dynamically. Valuation (upward) and Valuation (downward) keep changing from time to time, and the upward and downward forces keep changing with them. Gradually, as the price rises significantly, the upward force weakens due to weaker upward valuation, and the downward force strengthens due to stronger downward valuation. That leads to the price coming down again. This cycle goes on, reaching the state of equilibrium and fair value over time.


[Newton's Laws of Motion] This is a universal law of Nature, not just for the physics of non-living objects! But it appears in different forms, because the nature of motion differs between the classical and quantum worlds. And it is also applicable in the quantum world of the brain.


Here the mass of a stock can be taken as its market capitalization, which typically means a small-cap stock has less mass than a mid-cap, and a mid-cap less than a large-cap. So the reason small caps can be more volatile than large caps in general is also their lower mass, and hence the smaller forces required to move them. This would depend on the resultant forces on a case-to-case basis. But yes, it is indeed governed by the scientific equations of forces!

Like the classical world, the quantum world also has gravity! Larger stocks have more mass, and hence more gravity, than smaller caps in general. Yes, it's like physics: a greater force is required to move a large-cap's price the same distance (i.e., the same return) compared to a small-cap (low-mass) stock, all other things being constant. Hence large caps also show lower risk, as more force would be required to pull their prices down. And these forces come from demand-supply forces, which are actually quantum forces in the brains of buyers and sellers (traders), or in computer code…
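Here is a toy Python sketch of this mass analogy (my own illustrative model, not the author's proprietary formulation: the order-flow distribution, the momentum decay, and the impact coefficient are all assumptions): the same random order-flow "force" moves a low-mass (small-cap) price much more than a high-mass (large-cap) one.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(mass, steps=250):
    """Price path driven by order-flow imbalance acting as a force: a = F / m."""
    price, momentum = 100.0, 0.0
    path = []
    for _ in range(steps):
        ofi = rng.normal(0, 1.0)                 # net buy-minus-sell flow (assumed)
        momentum = 0.9 * momentum + ofi / mass   # force changes momentum; decay assumed
        price *= 1 + 0.01 * momentum             # momentum moves the price
        path.append(price)
    return np.array(path)

small_cap = simulate(mass=1.0)    # low "mass" = small market cap
large_cap = simulate(mass=10.0)   # high "mass" = large market cap
print(f"small-cap daily vol: {np.diff(np.log(small_cap)).std():.4f}")
print(f"large-cap daily vol: {np.diff(np.log(large_cap)).std():.4f}")
```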

 

Factors have to be discovered scientifically, according to how they contribute to the quantum laws of motion of stock prices, not via the associational, regression-based approaches traditionally followed. One has to discover how a factor affects the acceleration/motion of the stock through Force and Mass. For that, one has to study the law of quantum motion of a stock price, originating from the human (investors') forces of demand, supply and order flow. So, one has to study how Momentum, Size, Quality etc. contribute to the scientific laws of the equation.

It is not simply that investors should be compensated for their risk due to a factor and hence earn the return.

This economic rationale, that an investor should be compensated just because he/she has taken some specific type of risk, has to be scientifically explained: one must determine how those risks contribute to the laws of motion of a stock price; otherwise it is just superficial and non-scientific. In fact, it is explained later that an economic/financial space-time itself exists scientifically, with its own laws of motion, as we have in physics!!


24) The Issue with the Conventional Value Method

 

Connection with Regression Analysis: Price-to-Book Value with respect to ROE (Aswath Damodaran).

This regression analysis is again applied with misleading results about what is undervalued and overvalued. Also, something can be undervalued using one multiple while overvalued using another. This reveals the flaw and incompleteness of a single-multiple approach to valuation. If one uses it (despite the flaws), one should take different multiples into account to figure out the same.

 

Traditionally we have seen factor-based investing, e.g., Value, Size, Quality, Momentum etc., being very popular, and at the same time it has shown dismal performance over the last few years.

Two important aspects :

a) Value as a term has been defined incorrectly in this context: that a low Price-to-Book Value means undervalued and a high one means overvalued, or similar analyses. In fact, this decision on Low vs. High is itself based on an LS-based regression line.

Given the fundamental issue with the regression line, the basic definitions of Low and High could be mis-estimated.

Moreover, even if one accepts the LS-based regression approach, valuing something on just one specific ratio could be misleading, as different ratios reflect different components of the valuation (the relative valuation approach).

Low Price-to-Book Value vs. ROE regression (reference: Damodaran). This is a misuse of LS-based regression to classify Book Value as low or high. The determination of Low and High based on the line of best fit is itself faulty: the same typical regression issue again. The line of best fit could itself be misleading in real-world applications. So what one classifies as Low could become High if the line of best fit changes!

 

There could indeed be better methodologies to incorporate the different ratio dimensions of valuation, but that is not the discussion of this paper.

b) Value in the real world is not the same as the academic, theoretical definition conventionally calculated in the finance and investment literature. The definition of Value needs to be redefined and recalculated, even in the literature. This is the fundamental reason why Value has been underperforming recently. We shall discuss this in detail below.

 

 

This theoretical definition of Value is itself flawed and incomplete when we look at how Value works out for investors in real-world markets.

How investors traditionally figure out the value in a stock is itself flawed, being based on conventional approaches.

There will always be some patterns in this world, enough to mislead us, and they could be due to chance.


25) VALUE IN THE REAL SCIENTIFIC WORLD: REDEFINITION

So, how does this valuation formulate in the real world? There is a difference between the old academic definition of Value and real-world scientific Value. In fact, the academic definition of Value should be redefined.

 

Here is the brief formulation, taking the idea of relative valuation multiples: the valuation of a stock for an observer (investor) can differ (relative valuation). Every relative multiple shows a different component of the valuation. Selecting any one particular multiple can be biased at times in real-world finance.

The key drawback is that a firm can be overvalued and undervalued at the same time using different relative multiples. This reveals the key flaw of the process! The way something is traditionally concluded to be overvalued or undervalued using any one particular multiple, e.g., Price-to-Book Value, in value-based investment is both conceptually and statistically flawed!

In fact, Value depends on the optimal combination of different multiples with the right weights and a randomness component. And not just that: the higher-order dynamics of those value ratios, their trajectories etc. matter as well. The different ratios have their own dynamics. Higher-order analysis, not just first-order analysis, also affects Value.

That's why I categorically repeat that the way Value has been traditionally and academically defined, as low Price-to-Book Value etc., is both foundationally and statistically flawed in the real world.

Value is also relative from observer to observer. It is NOT absolute!


This is just a simple mathematical framework shown for illustration; the actual one used to decide Value is proprietary and would depend, relatively, on the investor.

Further, it is not just a first-order thing: how these multiples behave over time also determines Value over time.

 

So, Value depends upon a combined approach for each investor. The conventional academic notion of Value itself needs redefinition in the real world. Momentum, Quality etc. are components of the true consolidated Value, not entities separate from it. All these aspects/components determine Value, not just cheapness on a low Price-to-Book multiple. It is an open-ended rather than a closed-ended formula in every scenario.

Value is scientifically determined by the causal forces affecting humans' (investors') relative perceptions in real life, and cheapness of price is just one of the many causal factors:

a) Price-to-Book Value Multiple
b) Earnings Multiples
c) Debt Multiples
d) Cash Flow Multiple
e) Macroeconomic Multiple
f) Profitability & Growth Multiple
g) Risk Multiple
h) Cost Multiple
i) Utility Multiple
j) Momentum Multiple
k) Sentiment Multiple
l) Randomness
m) Other factors

This can be written as a statistical framework, which remains proprietary here!

Further, I repeat time and again: it is not just the first order; how they have behaved dynamically over time, i.e., the higher orders, gives better insight.

So, Value in the real world depends upon many causal multiples, which actually affect the business of a firm and human (investor) behavior scientifically, in a causal way, as well as upon randomness.

It is not just the Price-to-Book multiple. In fact, this traditional academic definition of Value is incomplete and needs to be redefined. Also, by this traditional definition, if something is cheap it doesn't mean all investors will buy it; cheapness is just one of the factors in buying. The IMPORTANT point is that we need to understand how Value should be defined and how it actually works in the real world.

One thing to note: Momentum is also one of the causes of Value. The reason is that Momentum reflects the strength of the demand-supply forces, expressed through the Order Flow Imbalance. This force of demand and supply also determines the value of something! Hence, contrary to the traditional static, absolute understanding of Value as distinct from Momentum, we have to understand from a dynamic, relative perspective that Momentum is itself one of the causes of Value, and vice versa! In fact, it goes two ways: Value causes Momentum and Momentum causes Value. It is essentially like the laws of Nature/Physics, where Value is like the total energy (including potential) and Momentum is like its kinetic form.

So, all the causal factors, including momentum and randomness, resultantly decide the Value.

In fact, Value is a function of many different causal factors, including Momentum, as well as unknown randomness! This has its origin in the human behaviors driving investors' behavior at large, coming from the science and laws of Nature and human behavior.

 

By the way, scientifically, using the laws of stability and equilibrium in Nature, there also exists a proprietary approach to calculate fair valuation relatively, using geometrical methods! We shall possibly discuss this later, as much as we can (given its proprietary nature), after talking about how markets are linked to the laws of Nature and physics!


26) Causal Equation of the Market

Let's understand this later in detail via a simple Newton's-Laws-of-Motion-like relationship, but before that I write the causal mechanism in the form of an Ornstein-Uhlenbeck (OU) mean-reversion process, dx_t = a(b_t − x_t)dt + σ dW_t, in which the stock price tends to revert over time to its true mean value, the state of equilibrium.

This is purely scientific.

 

So, in this equation, the Net Effective Valuation at time t−1 causes the momentum at time t to move.

 

 

Valuation at time t is caused by the historical momentum as well.


But unlike interest-rate dynamics, here the mean-reversion level b_t keeps changing over time for a stock or market, depending upon the various valuation constituents.


Here v_t is the velocity at time t, causing the momentum, and b_t is the expected stable-equilibrium-state true-value equivalent calculated at time t. The latter is calculated as discussed above and results from the scientific dynamics of the forces.

What is extremely important to understand is that even the causal equations have error and randomness components, which are the origin of Black-Swan-type events/risks!


The typical parameters a, b and σ, together with the initial condition x_0, completely characterize the dynamics, and can be quickly described as follows, assuming a to be non-negative:

·        b: the "long-term mean level". All future trajectories of x will evolve around the mean level b in the long run;

·        a: the "speed of reversion". It characterizes the velocity at which such trajectories will regroup around b in time;

·        σ: the "instantaneous volatility". It measures, instant by instant, the amplitude of the randomness entering the system. Higher σ implies more randomness.

The following derived quantity is also of interest:

·        σ²/(2a): the "long-term variance". All future trajectories of x will regroup around the long-term mean with this variance after a long time.
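Here is a minimal Euler-Maruyama sketch of this process in Python, with the mean-reversion level b_t itself drifting over time as argued above (all parameter values are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

# Euler-Maruyama simulation of dx_t = a*(b_t - x_t)*dt + sigma*dW_t,
# with a slowly moving mean-reversion level b_t ("fair value").
a, sigma, dt = 2.0, 0.15, 1 / 252
steps = 5 * 252
t = np.arange(steps) * dt
b = 4.0 + 0.5 * np.sin(0.8 * t)      # time-varying equilibrium level (assumed)

x = np.empty(steps)
x[0] = 3.0                            # initial condition x_0 (assumed)
for i in range(1, steps):
    dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
    x[i] = x[i - 1] + a * (b[i - 1] - x[i - 1]) * dt + sigma * dW

print(f"theoretical long-term variance sigma^2/(2a): {sigma**2 / (2 * a):.5f}")
print(f"sample variance of (x - b), second half:     "
      f"{np.var(x[steps // 2:] - b[steps // 2:]):.5f}")
```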


27) Scientific Background of Convexity & the Randomized Approach in Finance: the Quantum Connection

 

Convexity :

More gain than pain from a random event: the performance curves outward, hence looks "convex". Anywhere such asymmetry prevails, we can call the position convex; otherwise we are in a concave position. The implication is that you are harmed much less by an error (or a variation) than you can benefit from it; you would welcome uncertainty in the long run.

[Figure: convex vs. concave payoff curves. Picture taken from the Internet.]
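A one-screen Python check of this asymmetry (the payoff shapes are assumed, option-like stylizations): under symmetric zero-mean shocks, the convex payoff has positive expectation and the concave one negative, which is Jensen's inequality at work.

```python
import numpy as np

rng = np.random.default_rng(7)

# Symmetric, zero-mean random shocks.
shocks = rng.normal(0.0, 1.0, 1_000_000)

convex = np.maximum(shocks, 0)    # option-like: limited pain, open-ended gain
concave = np.minimum(shocks, 0)   # short-option-like: limited gain, open-ended pain

print(f"E[shock]          = {shocks.mean():+.3f}")   # ~ 0
print(f"E[convex payoff]  = {convex.mean():+.3f}")   # > 0: welcomes variation
print(f"E[concave payoff] = {concave.mean():+.3f}")  # < 0: harmed by variation
```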

 

Let's look at the science once again, and then at how Convexity and randomized tools like RCT, Algo Wheel etc. are scientifically placed for finance, life etc.

As we know, Feynman's Path Integral approach has been among the most important and powerful results in science, and it originated from Paul Dirac's Principle of Least Action. Since the market is basically a quantum wave driven by the sum of many human brains, it follows the laws of the quantum world. The market has uncertainty and randomness, which make it unpredictable from some perspectives, while it follows a deterministic pattern from others. These two aspects co-exist simultaneously. We have to discover that determinism inside randomness/uncertainty at some scale. So, rather than trying to predict the uncertainty directly, one has to look at uncertainty from indirect perspectives: one has to find the hidden causal mechanism inside randomness, all while acknowledging the existence of uncertainty at some scale. We may not directly predict uncertainty at the local scale, or for an individual, but we can definitely do so in a global, group context. This is directly how the quantum world works: one might not predict the behavior of a quantum entity like an electron individually, due to uncertainty, but one definitely can as a system. That is where the randomized approach comes in. The most predictive approaches try to predict at the individual level, but that is not how the quantum world works; they do not acknowledge the uncertainty inherent in Nature. So, scientifically, one should find global certainty within uncertainty/randomness rather than trying to predict local uncertainty! That is what I stated earlier: randomness is NOT randomly random but deterministically random. Randomness also has a cause. The discovery of that determinism inside randomness is the key to success in markets, in the real world, and in life.

Hence, one should conduct random trials in a causal way and discover the determinism inside. It is like the path of least action in Feynman's Path Integral, or Paul Dirac's Principle of Least Action: that path inside randomness is the key. And the random trials should be based on some causal aspects, not always randomly random trials, because every random trial has some cost involved, and one has to minimize that cost too. So, by following causally adjusted random trials, one can maximize convexity. This is also linked to the law of energy: if the cost is not minimized, then, due to the herd behavior of energy in markets, it could worsen, leading to huge tail risk.

 

This is also related to the Reproducibility Crisis in some way, where one tries so many computer simulations and random trials to select the best, ignoring that the best is not always the best: the best could also be random!!


 

So, it is not Knowledge vs. Convexity but Knowledge + Convexity. Through causal knowledge one can increase convexity by minimizing the cost of the random trials. And this is also linked to the Reproducibility Crisis, where blindly taking random trials and selecting the best could be biased, as is done in backtesting etc.! This is because even the best can behave randomly over time.

 

That is the clue to success: discover the determinism inside randomness and follow the path of least action. That is based on scientific quantum laws.

 


28) Causal-Adjusted Randomized Factor Approach

 

Now, having understood the scientific concepts, let us deduce the approach for Causal Factor Investing. We see that market participants keep finding ever more deterministic factors and try to use them for prediction with LS-regression-type tools. They try to predict individual factors and to find the ultimate, complete set of deterministic factors to predict the markets. That holy grail is utopian and could not possibly exist scientifically, due to the quantum nature of the market. The effort to make the real world deterministic, when it is fundamentally quantum and carries uncertainty, is against the laws of Nature. Hence the utopian dream of discovering the ultimate set of deterministic factors is against the laws of Nature. It is analogous to Gödel's groundbreaking incompleteness results: the effort to make the market deterministic through a complete set of factors is like trying to make arithmetic a complete axiomatic system.

In other words, chasing the ultimate holy grail amounts to assuming the world is deterministic, which is scientifically impossible and against the law of the quantum world.

 

So, the key is to follow a Randomized Factor Approach. In this approach, like Feynman's Path Integral and Paul Dirac's Principle of Least Action, we apply the same idea to the human quantum world: we try to discover the hidden determinism inside random causal factors. This way we explore the certainty inside the inherent uncertainty. That is the key, and it is analogous to the law of the quantum world (physics being just a subset of that). RCT and the Algo Wheel are based on those principles, but there are many other tools with which we can study the determinism inside random factors. So, as with an electron, we may not predict every single factor deterministically, but we obviously can in a collective sense. It is like interference in the quantum world: we may not predict the trajectory of each electron, but as a whole we can definitely find some determinism in where they will land.

Hence, unlike the traditional approach of selecting factors one by one, the scientific approach is to randomize the factors based on some causal understanding and create a convex system.

For that, we can follow a rough algorithm as below (a minimal code sketch follows the list):

 

a) Randomly select the set of stocks based on the causal deterministic and random factors, according to their real-world valuations as discussed above.

b) Allocation weights have to be arranged dynamically, starting from some random component, based on historical performance and expected future scenarios, in a well-diversified manner, putting various constraints on the weights while also looking at the tail-risk perspective idiosyncratically.

c) The key is to discover the determinism out of the randomized approach over time and follow the path of least action as a group, on the principle by which Nature works.

d) Allocate random weights to these selected causal stocks at the start.

e) Some predictive views based on causal factor analysis can add to the weightages. For example, if one stock has a better expected risk-return profile than another, its weight can be tilted up even though the weights were randomly assigned initially.

f) Put constraints on the weights to keep the portfolio well diversified, for example from a risk-management point of view.

g) Based on the determinism discovered inside their random trajectories, keep adjusting the weights of each holding dynamically over time. There can be more sophisticated equations to update the weightage.

h) The weights keep adjusting dynamically, based on the determinism inside their randomness in the real world.

i) Effectively, the portfolio becomes relatively convex, able to handle Black Swan risk effectively and to revert to its true equilibrium state over time.
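Below is a minimal Python sketch of this loop (my own illustrative stand-ins: the universe size, the "causal tilt" scores, the multiplicative update, and the diversification cap are all assumed placeholders, since the actual functions are stated to be proprietary):

```python
import numpy as np

rng = np.random.default_rng(8)
n_stocks, periods, cap = 20, 12, 0.15   # cap = diversification constraint (assumed)

# Steps a)/d): random starting weights over the causally selected universe.
w = rng.dirichlet(np.ones(n_stocks))

for _ in range(periods):
    realized = rng.normal(0.01, 0.05, n_stocks)  # stand-in for realized returns
    tilt = rng.normal(0.0, 0.01, n_stocks)       # stand-in for causal views (step e))
    # Steps g)/h): multiplicative update toward what is working, plus the tilt...
    w = w * np.exp(realized + tilt)
    # Step f): ...then clip to the cap and renormalize (an approximate projection).
    w = np.minimum(w / w.sum(), cap)
    w = w / w.sum()

print(f"final weights: min {w.min():.3f}, max {w.max():.3f}, sum {w.sum():.2f}")
```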

 

The exact equations and functions for updating the weights would change in real-world scenarios, depending on the situation and one's own risk-return objectives; here they are given for demonstration only. The key aspect of the Randomized Factor Approach is to randomize the factor selection in a causal way and then discover the determinism hidden inside over time. This is unlike the traditional approach, where selected traditional associational factors are bet on in a deterministic way. One must acknowledge that the holy grail of completely deterministic factors will never be found, due to the quantum behavior of markets. The key is to discover causal determinism inside uncertainty/randomness!

RCT and the Algo Wheel are basic tools based on this approach; more advanced proprietary scientific tools based on it can be applied. It is also based on the convexity principle of building the portfolio dynamically over time.

 

So, here the true risk is Black Swan tail risk; normal volatility should not really be taken as risk, for it is part of the essential mechanism of markets! As long as one is convex and prudent towards Black Swan/tail risks, intermediate volatility should not be of much concern, since, according to the laws of Nature and hence of markets, prices revert to their true equilibrium value over time.

Beyond that, it would influence the causal behavior itself; hence that part remains proprietary and private.

As explained at the beginning of the paper, there is a self-referential issue here: publicly disclosing everything could influence the causal mechanism in some way, and even the information in this paper could be discounted by traders/investors in the markets, directly or indirectly.


29) RISK

Understanding Risk Mechanism in Real World:

Issues with conventional risk-management tools, e.g., Max Drawdown, Sharpe Ratio etc.: how risk is often miscalculated!

 

It is often found that managers use conventional tools like Max Drawdown and Sharpe Ratio, via backtesting etc., but let's ask: do they really measure the true risk in the real world? In the fat-tailed financial world, the true aspects of risk lie on the tail side, and they arrive unpredictably.

 

The fundamental problem with all these traditional tools is that they are calculated on historical data, in the backward direction of time. As explained above in various other contexts, the direction of time is crucial in real-world applications. Tools like Max Drawdown and Sharpe Ratio are all measured on data that have actually occurred in the past. Say, for example, a stock had a max drawdown of 25%.

 

How is this drawdown calculated? At time T = t, looking at the previous actual data (data that actually happened!), one calculates the Max Drawdown. Or someone backtests a strategy at time T = t based on the actual trading data for T < t.

 

But do these actual data reflect the true risk in the forward direction of time in the real world? The key difference is that the trader/investor/observer stands at T = 0, looking forward to T = t, live, in the forward direction of time. In that case, there are multiple possible risky paths the stock price could take, as per the trader's estimates, which could have been realized, in full uncertainty, but which didn't actually happen! This was also explained in the context of the historical equity-risk-premium puzzle. In fact, risk is often calculated wrongly in the real world, and that miscalculation is one of the prime reasons for the puzzle, along with the fundamental issues of the statistical regression framework and fat-tailed data.

The actual data path is just a subset of the many possible risky paths that could have been travelled but weren't!

Hence, in the forward direction of time, the actual risk the traders took spanned far more possible risky paths than the one actually travelled. But in historical analyses in the backward direction of time, like backtesting or calculating max drawdown or Sharpe ratio, those real, live, possible risks taken over many possible forward paths at T = 0 are not considered. Hence, analysis in the backward direction of time can grossly misunderstand the possible risks!

These fundamental issues arise from the omission of the time direction! In the forward direction of time, there could have been many possible paths for the trader (as per his expectations, or unexpectedly), with different possible max drawdowns, Sharpe ratios, skews, tail risks etc. But those didn't actually occur, and hence are missed in the backward direction of time.

Hence these conventional tools (realized max drawdown, Sharpe ratio, or even tail measures on historical trading data) might not truly reflect the risk when looking in the backward direction of time at actual data! The true measure of risk lies in the forward direction of time, over the many possible paths that could have happened but didn't, and those are not reflected in the actual trading data!
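A compact Monte Carlo illustration of this point in Python (the return distribution and all parameters are assumed): the single "historical" path yields one max drawdown, while the ensemble of equally plausible forward paths yields a whole distribution of far worse ones.

```python
import numpy as np

rng = np.random.default_rng(9)

def max_drawdown(path):
    """Largest peak-to-trough fall along a price path (a negative number)."""
    peaks = np.maximum.accumulate(path)
    return ((path - peaks) / peaks).min()

# Many possible forward paths from the same (assumed) fat-tailed return law;
# treat path 0 as the single path that "actually happened".
steps, n_paths = 252, 5000
returns = rng.standard_t(4, (n_paths, steps)) * 0.01
paths = 100 * np.cumprod(1 + returns, axis=1)

historical_mdd = max_drawdown(paths[0])
all_mdd = np.array([max_drawdown(p) for p in paths])

print(f"max drawdown of the realized path: {historical_mdd:.1%}")
print(f"median across possible paths:      {np.median(all_mdd):.1%}")
print(f"5th-percentile (bad-luck) path:    {np.percentile(all_mdd, 5):.1%}")
```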

 

At the same time, if one does scenario analysis in the forward direction of time, there will always remain more possible scenarios than human minds or computer algorithms can imagine or simulate! This follows, philosophically and practically, from Gödel's incompleteness results, indirectly: there will always remain more unimaginable scenarios than simulated ones in the forward direction of time. Hence the key is how to measure, or be prudent towards, those uncertain scenarios which one cannot imagine at that point in time, in the forward direction of time.

 

So, the best way of measuring risk prudently is to figure out how the strategy would have performed (in the forward direction of time, back then) or will perform in the forward direction of time, assuming things behave randomly! If the strategy is prudent enough to sail through the randomness inherent in markets/Nature, then one can say the strategy is risk-prudent in the real, forward direction of time. And if one can sail through randomness, one can be prudent towards Black Swan risks as well, for they originate out of randomness and incompleteness!

 

Hence, to understand risk truly, one should measure it in the forward direction of time, not with backward analyses like backtesting, max drawdown or Sharpe ratio on historical data! And beyond future scenario analysis, the underlying principle should be to stay as convex as possible, to take advantage of the randomness arising from an observer's (investor's) relative lack of information at any moment in time.

But yes, that has to be causally adjusted, as causality minimizes the cost of achieving convexity for randomness management. In a nutshell, there has to be a balance between the two.

 

As explained earlier, it is scientific that things revert to their equilibrium-state value over time. Hence, one need not worry about intermediate moves or about predicting the Black Swan (which is in principle unpredictable). But it is extremely important to understand that we must be able to bear the downside phase during Black Swan events, and that ability comes from prudent risk management. Often institutions blow up because they cannot handle the downside: before the market reverts to its stable equilibrium level, they go bankrupt.

This is mostly because of fragile strategies like leverage etc. It is quite scientific.

Nature does have many ups and downs, catastrophes even, but what matters is being able to survive the downside effectively, to garner the upside when it reverts, scientifically, as per the laws of Nature.

 

 

Uncertainty Management is the Key :

 

Most financial models are developed to predict the deterministic aspects, but very few focus on strategizing randomness (uncertainty). As explained many times earlier, uncertainty is an inherent scientific characteristic of life, markets and Nature everywhere, so the key to success, in finance or otherwise, is handling uncertainty and benefiting from it, alongside whatever prediction work is required. How to do one's best under uncertainty: that is the key, and it should also be the prime focus of quant approaches, and of approaches in general, in life, finance and nature.


30) Commentary on Markowitz Portfolio Theory from the Scientific Perspective of the Laws of Motion & Energy

Legendary Nobel Laureate Markowitz built a portfolio construction theory in which investors should be compensated with higher returns for bearing higher risk. The Markowitz framework measured risk as the portfolio standard deviation, its measure of dispersion, or total risk. The Sharpe (1964), Lintner (1965), and Mossin (1966) development of the Capital Asset Pricing Model (CAPM) held that investors are compensated for bearing not total risk, but rather market risk, or systematic risk, as measured by a stock's beta. Investors are not compensated for bearing stock-specific risk, which can be diversified away in a portfolio context. A stock's beta is the slope of the stock's return regressed against the market's return. Modern capital theory has evolved from one beta, representing market risk, to multi-factor risk models (MFMs) with 4 or more betas. Investment managers seeking the maximum return for a given level of risk create portfolios using many sets of models, based both on historical and expectation data.

(https://link.springer.com/chapter/10.1007/978-0-387-77439-8_2)

 

We need to revisit foundationally whether the Markowitz theory of risk and return, or any such historical economic-rationale theory, is in sync with the scientific rationale; otherwise it would be just non-scientific.

 

There are two crucial points for examining the Markowitz framework through the lens of the scientific principles discussed:

1) An investor should be compensated for market risk, not the idiosyncratic risk of a stock.

2) Risk is measured by an LS-regression-based Beta, which is a fundamentally backward-looking approach, as explained earlier.

 

Scientifically, if an investor takes specific risk in a stock which has valuation (potential), it can generate return over time relative to other stocks, or even to the market. Any risk taken on something that has Valuation (Potential Energy) can generate return.

Scientifically, return is actually the distance travelled due to the force on the stock, while risk is the possible uncertainty in the forward direction of time.

 

So, the return generated by a stock would depend on the causal force that pushes it to travel a distance in financial space-time. It can come from both components: systemic forces in the system and idiosyncratic forces. Note that not all idiosyncratic risks will lead to the possibility of return, but some of them can.

The conventional economics-based rationale, that an investor should be compensated based on systematic risk, has to be examined scientifically, and it may not always be true. The economic rationale originated from the backward-looking CAPM method, where expected return depends on beta risk. CAPM is a contradiction in itself: on one hand it is technically a backward-looking regression approach for calculating beta, and on the other it talks about expected return in the future.

But when the entire mechanism is seen in the causal, scientific, forward direction of time, this logic and theory may not always hold. Ultimately it depends on the dynamics of the forces whether a risk will generate return or not.

For example: a stock price will be moved by the systematic forces in the market, say the economic prospects of the country. It will also depend on favorable idiosyncratic forces, say the quality of management, or the appointment of a renowned CEO. So even some idiosyncratic risks can cause positive forces in the trajectory of the stock price through its space-time.

To be scientific, one has to understand which risks to take and which not, and which could contribute positively and which negatively, or not at all.

Just because something carries more systematic risk as per the economic rationale doesn't mean it is bound to generate more return. It may be expected to, but not necessarily, from the scientific point of view.

Risk has to be analyzed in the forward direction of time, asking how it could contribute to the causal forces generating return through the stock's trajectory in financial space-time.

The point I am making is that the conventional CAPM-based economic rationale and a scientific, causal analysis can differ, if not always. One has to understand the concepts technically. By the way, CAPM is a static, equilibrium-based model, but the market in the real world is not always in equilibrium; it remains dynamic, driven by causal forces in financial/economic space-time.


31) Benchmark and the Concept of the Market

Often in the financial investment world we see benchmarks. A benchmark is applied almost everywhere to express the performance (risk-return) of a portfolio. But how robust is the benchmark itself?

In a discussion with one of the legendary Nobel Laureates, who agreed with my view and also had his own thoughts on it, the conclusion was that the benchmark is itself exposed to changing risk. Hence, rather than focusing on a benchmark to gauge one's performance relatively, one should focus on compounding returns absolutely while managing tail risk.

The typical problem with a Benchmark is that the Benchmark itself is a set of stocks assigned some weightage, but given the dynamic scenario in the market due to economic conditions, their internal structure of conventional correlation and risk-return profiles keeps on changing. This exposes the Benchmark itself to changing risk dynamics in the economy. Moreover, a Benchmark having certain weightage dynamics could itself be exposed to some tail risk. And if the Benchmark is itself exposed to, say, tail risk due to its own Idiosyncratic structure, it won't be the right reference point. So, the point is: the reference point must be robust enough scientifically so that we can apply it to gauge others' performance. But if the Benchmark itself is exposed to some risks which a portfolio may not have, then the Benchmark won't be the right tool to gauge the portfolio.

In fact, a Benchmark should be able to manage its own intrinsic changing risk across economic scenarios. One should also remember that a Benchmark is just one of many possible Portfolios. We can see that the internal composition of a Benchmark itself keeps on changing over time. So, the Benchmark itself is exposed to changing risk in various economic scenarios. To become a robust reference point, it has to be robust itself first to various risks.
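As a rough illustration of this point, here is a minimal sketch with a toy cap-weighted "benchmark" of four synthetic stocks (all parameters are assumed numbers), showing that the benchmark's own weights and volatility drift over time:

```python
# A toy benchmark: cap-weighted basket whose internal weights and risk
# profile are themselves moving targets, as argued above.
import numpy as np

rng = np.random.default_rng(1)
n_days, n_stocks = 1250, 4
drifts = np.array([0.0008, 0.0004, 0.0002, -0.0001])   # assumed daily drifts
vols = np.array([0.025, 0.015, 0.012, 0.020])          # assumed daily vols
rets = rng.normal(drifts, vols, (n_days, n_stocks))
prices = 100 * np.cumprod(1 + rets, axis=0)

weights = prices / prices.sum(axis=1, keepdims=True)   # crude cap-weight proxy
bench_ret = (weights[:-1] * rets[1:]).sum(axis=1)      # day-t weights, day-t+1 returns

# Rolling 1-year volatility of the benchmark itself: its risk is not static.
window = 250
roll_vol = np.array([bench_ret[i - window:i].std() * np.sqrt(250)
                     for i in range(window, len(bench_ret))])
print("weights, day 1:", np.round(weights[0], 2))
print("weights, last :", np.round(weights[-1], 2))
print(f"rolling annual vol: min {roll_vol.min():.1%}, max {roll_vol.max():.1%}")
```

Even in this tiny example the "reference frame" drifts: the weights migrate toward the winners and the benchmark's own volatility regime changes.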

So, the main objective should be to compound the Portfolio on an Absolute basis while managing the tail risk. Ultimately this is what matters: whether the portfolio is Compounding and how it manages its tail risk.

It's like a reference frame in physics. For choosing one as the reference basis, it has to be robust and reliable rather than getting exposed to risk itself. Conventional Benchmarks in the system are themselves exposed to changing risk in the economy. Hence relying on them doesn't make sense. Ultimately what matters is absolute compounding over time while managing tail risk.
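A minimal sketch of gauging a portfolio on that absolute basis, compounding plus tail measures rather than benchmark-relative figures (the return series is synthetic; drawdown and a simple 1% quantile VaR are used here only as example tail measures):

```python
# Absolute gauges: compounding rate plus tail-risk measures, no benchmark.
import numpy as np

def absolute_report(daily_returns, days_per_year=250):
    wealth = np.cumprod(1 + daily_returns)
    years = len(daily_returns) / days_per_year
    cagr = wealth[-1] ** (1 / years) - 1
    peak = np.maximum.accumulate(wealth)
    max_dd = ((wealth - peak) / peak).min()          # worst drawdown
    var_99 = np.quantile(daily_returns, 0.01)        # 1% daily VaR (left tail)
    return cagr, max_dd, var_99

rng = np.random.default_rng(2)
r = rng.normal(0.0004, 0.01, 1250)                   # toy portfolio returns
cagr, max_dd, var_99 = absolute_report(r)
print(f"CAGR {cagr:.1%} | max drawdown {max_dd:.1%} | 1% daily VaR {var_99:.2%}")
```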

32)Tail Risk Hedging & Law of Energy/Nature: Law of Conservation of Risk

If one looks at Mother Nature, one would find that Nature itself goes through ups and downs, downs like natural catastrophes etc., and then it reverts back to a normal state of Equilibrium over time as a dynamic process. So is Human Life. In fact, like the Law of Conservation of Energy, there is also a Law of Conservation of Risk linked with it. Downside is important to go upside. It's like a Quantum wave: if it doesn't go downside, it won't store the potential energy to push it back to kinetic form and go upside. That's in fact the Natural Process. We can see that in life as well. Hence, one has to see the essential downside. It's the natural process of energy dynamics.

Tail is the essential part of the Law of Energy. If we try to cut the Tail beyond a point, we will possibly not get the energy to move upward. Let's technically imagine: can there be a natural physical wave with no trough and only crest? If so, it would have to be created artificially by some external intervention by the observer, which would in some form incorporate the role of timing. That's where it gets exposed to some form of timing risk. And if there is really no trough risk, then it runs the risk of shoulder risk, which can go on endlessly... which itself could be a disastrous horizontal tail risk!!

Hence, in the natural view, one has to understand the essential downside risk. I don't mean to say all downside risks are essential, but some part of downside risk is essential and must be taken to naturally travel upside. Or else, some other hidden risks would come up.

In finance, tail hedging is often talked about, but the path of really Compounding wealth will have to go through the phase of the essential components of the Tail. Rather than hedging and cutting out the Tail entirely, try to be Convex and manage the Tail risk. Get exposed to the essential component of the Tail and be prudent enough to survive the Tail.

If one really tries to hedge the Tail Risk completely and also reap more upside, it's against the Laws of Nature and Energy. So, somewhere it's exposed to some other hidden risks, maybe shoulder risk!! There are some essential risks that must be taken to reap the upside; otherwise the risk is hiding somewhere where it could blow up, or the system won't generate enough Compounding return. If some system is indeed Natural, this is technically not possible, or else that system is not Natural, be it an artificial finance system or anything else.

It's often quoted by some players in the industry that they have hedged all the risks and are generating good return. It's against the Laws of Nature; the risks are hiding somewhere. Law of Conservation of Risk.
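A minimal Monte Carlo sketch of this point, with all premiums and shock sizes assumed toy numbers rather than market prices: flooring away all downside (paying roughly its fair cost plus a margin) drags on compounding, while a cheaper tail-only floor preserves most of the upside while protecting the worst outcomes:

```python
# "Conservation of Risk" toy: removing ALL downside costs you the upside;
# hedging only the tail trades a little median wealth for tail protection.
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_months = 10_000, 120
r = rng.normal(0.008, 0.05, (n_paths, n_months))              # ordinary monthly returns
r += rng.choice([0.0, -0.25], p=[0.99, 0.01], size=r.shape)   # rare tail shocks

unhedged   = np.prod(1 + r, axis=1)
full_floor = np.prod(1 + np.maximum(r, 0.0) - 0.022, axis=1)   # no downside at all; heavy assumed premium
tail_hedge = np.prod(1 + np.maximum(r, -0.15) - 0.002, axis=1) # tail-only floor; small assumed premium

for name, w in [("unhedged", unhedged), ("full floor", full_floor),
                ("tail-only hedge", tail_hedge)]:
    print(f"{name:16s} median wealth {np.median(w):6.2f}   5th pct {np.quantile(w, 0.05):6.2f}")
```

In this toy setup the fully floored strategy compounds the least, while the tail-only hedge gives up a little of the median outcome in exchange for a much better worst case, which is the Convexity being argued for above.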

 

Even Mother Nature goes through the Downside, like Tail events in the form of Catastrophes, Earthquakes etc., but yes, she reverts to normalcy in a very prudent manner and goes quite far up. So, Mother Nature doesn't cut all her downside risk, as it would hamper her upward trajectory through the law of Energy dynamics.

So, Tail Risk Hedging is definitely important and one must do it, but in doing so, one must not cut all of the downside risk, including the essential components needed to gain upside. It's unnatural to go upside without going downside (the essential components). This is based on the Scientific Laws of Mother Nature, also evident in life.

This needs to be conceptually understood well by many players who misunderstand the scientific dynamics of risk, always minimize the so-called downside risk, and some of whom even claim to be riskless with great return!! This is against the Law of Conservation of Risk & Energy & Nature!!

33)Why Drawdown might not be Risk Always?

Based on the Scientific Mechanism, the trajectory of a Stock will Naturally have a Wave Structure of Ups and Downs due to the Laws of Energy. Now if a Stock has a Drawdown, that has two components: Natural + Unnatural. The Natural Component cannot be termed as Risk in the True Sense; rather it is the essential process of the Scientific Trajectory of a Stock due to the Laws of Energy and Motion. Hence, All Drawdowns and their Components are NOT necessarily True Risk. This has to be scientifically studied by the investor or portfolio manager.

 

Even the Market is a Set of Stocks with certain weightage. There can be different such Markets by changing the constituent stocks and their weightage. In fact, the Market constituents keep on changing dynamically!

 

Let's analyze this scientifically.
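A minimal sketch of the decomposition idea (my own illustration, not a formal model): fit a trend to a synthetic log price, treat the typical oscillation band around that trend as the "natural" wave component, and flag only the excursions beyond it as the "unnatural" part of a drawdown. The 2-sigma band is an assumed threshold:

```python
# Splitting drawdowns into a "natural" wave around trend and an
# "unnatural" residual beyond typical oscillation.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(1000)
log_p = 0.0004 * t + 0.03 * np.sin(t / 40) + rng.normal(0, 0.01, 1000).cumsum() * 0.1
price = 100 * np.exp(log_p)                        # synthetic trending, wavy price

slope, intercept = np.polyfit(t, np.log(price), 1) # linear trend in log price
resid = np.log(price) - (intercept + slope * t)    # oscillation around trend

band = 2 * resid.std()          # assumed "natural" 2-sigma oscillation band
unnatural = resid < -band       # downside excursions beyond it
print(f"natural band: ±{band:.3f} in log price; "
      f"{unnatural.mean():.1%} of days beyond it on the downside")
```

On this reading, only the flagged excursions deserve the label of True Risk; the in-band dips are the wave doing its natural work.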

Didier Sornette's Super Exponential Crash Theory…

 

Let me also explain the theory in which there exists causality such that super exponential growth can carry downside burst risk. This is supported by the Laws of Energy & Nature. As explained earlier, like the Law of Gravity in the Classical world, a similar Law of Gravity exists in Human Behavior Dynamics as well, in the Quantum Economic space-time. Hence, as an example, some Physicist-turned-Economists have claimed that Super Exponential Growth leads to Crash. This has a scientific base. But it must NOT be misunderstood: it doesn't mean one can Deterministically predict the Bubble Crash always. In my view, even the term Super Exponential Growth is not Absolute, rather Relative!!

It is scientifically based on the Law of Energy, but indeed Uncertainty exists as to when it would crash!! There can't always be Time Certainty over this in the Real World. Complete determinism and Predictability of the Crash would itself be against the Law of Nature!!
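A minimal sketch of what "super exponential" means here (my own illustration of the growth shape, not Sornette's full log-periodic fitting procedure; tc and the exponent are assumed numbers): ordinary exponential growth has a constant log-price growth rate, while super exponential growth has a rising one as the critical time tc approaches:

```python
# Exponential vs super-exponential growth: the local growth rate of
# log price is constant for the former and accelerates for the latter.
import numpy as np

t = np.linspace(0.0, 9.5, 500)
tc, m = 10.0, 0.5                                # assumed critical time and exponent
exponential = 100 * np.exp(0.1 * t)              # straight line in log space
super_exp = 100 * (tc - t) ** (-m)               # power-law blow-up toward tc

g_exp = np.gradient(np.log(exponential), t)      # d(log p)/dt
g_sup = np.gradient(np.log(super_exp), t)
print(f"exponential growth rate: {g_exp[10]:.3f} -> {g_exp[-10]:.3f}")
print(f"super-exp   growth rate: {g_sup[10]:.3f} -> {g_sup[-10]:.3f}")
# In real data tc is identified only with large uncertainty, consistent with
# the point above that the crash *time* is not deterministic.
```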

34)Financial/Economic Space-Time like General Theory of Relativity in Physics.

At a deeper level there exists an Economic Space-Time where a firm's value traces a geodesic. The Trajectory of a Stock Value or of Economic Variables traces Geodesics in that Economic Space-Time Curvature. So, the traditional SDE Brownian motion equations need to be expanded into an N-dimensional Riemannian Economic Space-Time. This approach can also be used to prepare the Bankruptcy model of a firm etc., as a Singularity, e.g. a Black Hole in Physical Space-Time. But this is not the focus of this paper. The analogy is similar to Einstein's General Theory of Relativity in Riemannian metric space. Similarly there exists a Financial/Economic Space-Time. One of the key fundamental issues is that the Economic Space-Time is not the Euclidean Space of the Paper on which these trajectories are virtually drawn. The actual Trajectory occurs in the Non-Euclidean Economic/Financial Space-Time; hence, the SDE Brownian Motion in Non-Euclidean (Riemannian metric) Space-Time would be more meaningful.

In fact, consider the standard SDE Brownian equation below, which is fundamentally associational in nature and is what needs the Causal Extension:

 

$$ dS_t = \mu S_t \, dt + \sigma S_t \, dW_t $$

 

The fundamental issue is that the Stock Price doesn't travel on the Euclidean Paper or Computer Screen; those are just the shadow of its Non-Euclidean Trajectory in the Economic/Financial Space-Time. So, in reality, there is a need to expand these generic SDE equations to the N-dimensional Space-time in a Causal perspective. The SDE above is more associational in nature, where we just try to find the relationship between Stock Price and Time on the Euclidean Paper. This omits the actual causal dynamics happening in the Real Economic Space-Time. This is also the foundation of Quantitative Finance, where the SDE is the foundational structure!

It needs to be expanded into that space-time, like the Schwarzschild metric etc.: an Economic/Financial Space-Time Metric where a financial/economic variable traces a geodesic. This is an N-Dimensional Causal Economic/Financial Manifold where the trajectory of a financial variable, here the stock price, travels a Geodesic, as in Einstein's General Theory of Relativity in the Riemannian manifold. Indeed, like the Physical Space-time, there exists an Economic Manifold more complex than the physical one. This metric space is Non-Euclidean. In fact, this also has quite some significance for Machine Learning, where the distance metric could be Non-Euclidean rather than the traditional Euclidean metric.
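For reference, a minimal sketch of the standard associational SDE above, simulated by Euler-Maruyama on the "flat" Euclidean chart (all parameters assumed), which is exactly the representation the text argues is incomplete:

```python
# Geometric Brownian motion, dS = mu*S*dt + sigma*S*dW, on the flat chart.
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, S0 = 0.08, 0.2, 100.0        # assumed drift, vol, start price
T, n_steps, n_paths = 1.0, 252, 5
dt = T / n_steps

S = np.full(n_paths, S0)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)   # Brownian increments
    S = S + mu * S * dt + sigma * S * dW         # Euler-Maruyama step
print("terminal prices:", np.round(S, 2))
```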

 

Financial/Economic Riemannian Manifold:

This is the causal N-dimensional space, with a randomness dimension as well.

The Equation of a Geodesic is calculated using the following equation, as in the General Theory of Relativity. Similarly, the Financial/Economic Trajectory of an economic entity traces a Geodesic in an N-Dimensional Riemannian (Non-Euclidean) Causal Metric Space.

$$ \frac{d^{2}x^{\mu}}{ds^{2}} + \Gamma^{\mu}{}_{\alpha\beta}\,\frac{dx^{\alpha}}{ds}\,\frac{dx^{\beta}}{ds} = 0 $$

where s is a scalar parameter of motion and the coefficients Γ^μ_{αβ} are Christoffel symbols, symmetric in the two lower indices. Greek indices may take the values 0, 1, 2, 3, and the summation convention is used for the repeated indices α and β. The quantity on the left-hand side of this equation is the acceleration of a particle, so this equation is analogous to Newton's Laws of Motion, which likewise provide formulae for the acceleration of a particle. The Christoffel symbols are functions of the four spacetime coordinates and so are independent of the velocity or acceleration or other characteristics of a test particle whose motion is described by the geodesic equation.
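To show the equation in action, here is a minimal sketch integrating it numerically. Since the economic metric itself is not specified in this paper, the unit 2-sphere stands in as the curved space, with its standard Christoffel symbols; the point is only that a geodesic in a curved space bends away from the straight line of the flat chart:

```python
# Numerical geodesic on the unit sphere (metric ds^2 = dtheta^2 + sin^2(theta) dphi^2).
# Nonzero Christoffels: Gamma^theta_{phi,phi} = -sin(theta)cos(theta),
#                       Gamma^phi_{theta,phi} = cot(theta).
import numpy as np

def geodesic_step(state, ds):
    theta, phi, dtheta, dphi = state
    # d2x/ds2 = -Gamma * dx * dx, component by component:
    d2theta = np.sin(theta) * np.cos(theta) * dphi ** 2
    d2phi = -2.0 * (np.cos(theta) / np.sin(theta)) * dtheta * dphi
    return state + np.array([dtheta, dphi, d2theta, d2phi]) * ds

state = np.array([np.pi / 4, 0.0, 0.0, 1.0])   # start at 45° latitude, heading "east"
for _ in range(2000):
    state = geodesic_step(state, 0.001)
print(f"theta = {state[0]:.3f}, phi = {state[1]:.3f}")  # the path curves: no straight chart line
```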

In fact, this approach can also be used to predict the bankruptcy of a firm, where Bankruptcy is analogous to the Black-Hole Singularity in the Economic/Financial Riemannian Manifold, as in General Relativity in Physics! I had talked about it in a paper around a decade ago, which I had not published publicly. This is not the discussion of this paper as of now, but I mention it to show how deep scientific causality exists in finance and economics as well.

35)Fundamental Issues with Conventional Probability Theory in the Real World: Scientific Perspectives & Origin

Mathematics in Human Quantum Space-Time vs Mathematics in Classical Space-Time: some deep thoughts to explore.

 

We often study Conventional Probability Theory to describe the Real World, here Finance, driven by Human Behaviors. But have we looked at some of the most foundational questions: is that really correct, and how far? How far do the rules of probability theory (mathematics/statistics) hold true scientifically, and are they in sync with the Laws of Nature?

Let's take the most common and basic equation of the Expected Value E[·] and check its validity scientifically in the Real World, especially the Human Space-Time/Finance.

Going back to the history and the foundation of Probability theory,

In Probability Theory, the expected value (also called expectation, expectancy, mathematical expectation, mean, average, or first moment) is a generalization of the weighted average. Informally, the expected value is the arithmetic mean of a large number of independently selected outcomes of a random variable.

The idea of the expected value originated in the middle of the 17th century from the study of the so-called problem of points, which seeks to divide the stakes in a fair way between two players who have to end their game before it is properly finished. This problem had been debated for centuries. Many conflicting proposals and solutions had been suggested over the years when it was posed to Blaise Pascal by the French writer and amateur mathematician Chevalier de Méré in 1654. Méré claimed that this problem couldn't be solved and that it showed just how flawed mathematics was when it came to its application to the real world. Pascal, being a mathematician, was provoked and determined to solve the problem once and for all.

He began to discuss the problem in the famous series of letters to Pierre de Fermat .  Soon enough, they both independently came up with a solution. They solved the problem in different computational ways, but their results were identical because their computations were based on the same fundamental principle. The principle is that the value of a future gain should be directly proportional to the chance of getting it. This principle seemed to have come naturally to both of them. They were very pleased by the fact that they had found essentially the same solution, and this in turn made them absolutely convinced that they had solved the problem conclusively; however, they did not publish their findings. They only informed a small circle of mutual scientific friends in Paris about it.

In his book, the Dutch mathematician Christiaan Huygens considered the problem of points and presented a solution based on the same principle as the solutions of Pascal and Fermat. Huygens published his treatise "De ratiociniis in ludo aleæ" on probability theory in 1657, just after visiting Paris. The book extended the concept of expectation by adding rules for how to calculate expectations in more complicated situations than the original problem (e.g., for three or more players), and can be seen as the first successful attempt at laying down the foundations of the theory of probability.

 

Now, let's look at this from the scientific perspective developed so far. This definition of Expectation originates from the 17th-century Theory of Probability, by the contemporary great mathematicians. We need to revisit how far it makes sense based on our updated scientific understanding of the universe and the Laws of Nature, especially Randomness.

 

Consider a random variable X with a finite list x1, ..., xk of possible outcomes, each of which (respectively) has probability p1, ..., pk of occurring. The expectation of X is defined as

$$ E[X] = x_{1}p_{1} + x_{2}p_{2} + \cdots + x_{k}p_{k} $$

Since the probabilities must satisfy p1 + ⋯ + pk = 1, it is natural to interpret E[X] as a weighted average of the xi values, with weights given by their probabilities pi.

Now consider a random variable X which has a probability density function given by a function f on the real number line. This means that the probability of X taking on a value in any given open interval is given by the integral of f over that interval. The expectation of X is then given by the integral

$$ E[X] = \int_{-\infty}^{\infty} x f(x)\, dx $$
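For completeness, a minimal sketch computing both textbook definitions numerically (the discrete distribution and the standard normal density are arbitrary examples chosen here):

```python
# Discrete weighted-average expectation and the continuous integral form.
import numpy as np
from scipy.integrate import quad

xs = np.array([1.0, 2.0, 6.0])
ps = np.array([0.5, 0.3, 0.2])
discrete_E = np.sum(xs * ps)                            # E[X] = sum x_i p_i  -> 2.3

pdf = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)  # standard normal density
continuous_E, _ = quad(lambda x: x * pdf(x), -np.inf, np.inf)
print(discrete_E, round(continuous_E, 10))              # 2.3 and ~0.0
```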

 

Is this formula, to sum and take the average, in line with the Principle of Least Action, the Laws of Nature? As explained earlier, the path followed is the most stable equilibrium, the path of least action, not the average of all paths. Hence, the Expected Value of a variable in the Human Mathematical Quantum Space-Time is not obtained by taking the Sum and Average, but by discovering the most Stable, Least Action Path, which would be the Expected Value/Expectation E[X].

For example: Suppose A has the probability of travelling to New York or New Delhi with 50% probability each. The Expected Value, conventionally calculated as the average under Probability Theory, would fall at some place midway, say London, between NY and ND. So the Expected Place of Travel turns out to be London using probability theory. But in Reality, A will land up in either NY or ND. And whether NY or ND would depend on the dynamics of forces, the Path of Least Action, in the Quantum Human Brain of the Traveller A. What is meant here is that in Reality, when the event occurs, it will be completely one of them, not half-half of both events. So, the Expected Value would be either NY or ND based on the action of both paths, not London being the mid-way!
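A minimal sketch of this example (coordinates are rough latitude/longitude pairs, purely illustrative): the conventional expectation averages the two cities into a point that is neither city, while a realized-outcome view picks one whole city:

```python
# Conventional weighted-average expectation vs a single realized outcome.
import numpy as np

cities = {"New York": np.array([40.7, -74.0]), "New Delhi": np.array([28.6, 77.2])}
probs = {"New York": 0.5, "New Delhi": 0.5}

conventional_E = sum(p * cities[c] for c, p in probs.items())  # a mid-point, neither city
most_likely = max(probs, key=probs.get)  # one whole outcome; with equal probs the
                                         # tie-break is arbitrary, mirroring the text's
                                         # point that which path realizes depends on
                                         # the underlying force dynamics
print("conventional E[location]:", conventional_E)
print("realized-style outcome  :", most_likely)
```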

The principle behind the existing conventional Calculation is that the value of a future gain should be directly proportional to the chance of getting it. But this fundamental principle itself has to be reviewed and brought in sync with the laws of nature. The Value of a future gain is not always directly proportional to the chance (probability) of getting it; rather it has to be based on the path of maximum stability, of least action, which would be the expected value in the Real World. This path could be different in different scenarios. The principles and rules of the mathematics of probability theory have to be brought in line with the Real World to resolve the foundational issues. They have to be based on the Principle of Least Action, the Laws of Nature.

Hence the Expected Value Calculation in the Real World Human Space-Time could be different from this Conventional Mathematical Space. In the Real World, Randomness also has hidden Deterministic aspects!

In fact, even Mathematics could be "Relative" in different Space-Times. Like the Physical Space-Time, the Human Space-Time, Biological Space-Time, and Social Space-Time can have different Mathematical rules.

Let's take an example: Newtonian Calculus was developed by Newton by taking clues from the Classical World. Will that be valid in the Quantum World? Will the Pythagoras theorem apply in the Quantum world? Would the Calculus for the Quantum World be different from that of the Classical world, or not? I think yes. The Rules of Mathematics, the way they are developed or discovered, originate from Space-Time experiences/Geometry; could they be relative or not? For instance: the Schrodinger Quantum Wave Equation satisfying Newtonian (Classical World) Differential Equations. Is that correct? How far?

The Point is that Mathematics in the Human Quantum Space-Time would be different from Conventional Probability Theory in the way Expected Value Operations are calculated. E[·] would be the Path of Least Action, the Most Stable path, not just the Average of all Possibilities. This is a very Foundational concern about the way Conventional Probability Theories are applied in the Real World Financial Human Space-Time!

To calculate the Expected Value, one has to calculate the most stable path, the path of least action. That favorable path should be the true Expected Value. The favorable path would depend on a case-to-case basis.

The Concept of Probability is very often deeply  misunderstood.

37)Summary & Conclusion

 

So, finally it's time for the summary, where I briefly summarize the paper. I know some readers would like many of the views; others may disagree on certain points. It's an open research paper. There is relativity in thoughts: different people can have different perspectives, and all can be true.

We have to acknowledge, as a part of the highest intellectual maturity, that duality in the form of extreme opposites co-exists in Nature, whose subsets are finance, life etc. So, I have tried to unify the two extremes of Randomness (Black Swan) & Causality, of Predictability & Uncertainty, in a scientific way traversing the Quantum & Classical worlds. I've explained how both co-exist together and why we need to appreciate this beauty.

Essentially the Market, driven by Human Quantum Behavior, is a Quantum wave governed by certain laws of Energy and Nature. A Black Swan, which basically falls under the Unpredictability zone inherent in Nature, and hence in finance and life as well, is essentially caused by a sudden Energy shock, which is causal but can't be predicted, since by definition a Black Swan is Unpredictable (if it's predictable, it's not a Black Swan for that observer, Relatively). A Black Swan (Caused by Randomness) also follows the Law of Energy and Nature, and converges to its stable state level over time.

So, given that the market is a Quantum human wave, it follows certain laws of Nature & Energy scientifically. One needs to understand that and hence manage Quantum-type Uncertainty & Predictability both, as a Duality.

The clue to Financial Investment success is accepting the Certain, Predictable components while at the same time being convex to the Uncertainty. But to attain the maximum Convexity, one has to discover the Determinism inside Randomness. This causal approach would increase the Convexity by minimizing the cost.

For that, rather than finding a Deterministic set of Factors, one should follow a Randomized Factor Approach. This is based on the Feynman & Paul Dirac Principle of Least Action, where one should follow Causal-adjusted Randomized factors and try to find the Path of Least Action... Eventually AlgoWheel, RCT etc. are based on similar principles, but there can be more advanced approaches. Ultimately one has to discover the Determinism inside Randomness, the Certainty inside Uncertainty, which exist in the form of Duality in Nature, Life & Financial Markets.
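As a rough sketch of such a randomized approach (an epsilon-greedy bandit stands in for an algo wheel here; the factor "edges" are synthetic and the whole thing is only illustrative of the principle, not any production algo wheel):

```python
# Randomized factor allocation: explore factors at random, then tilt toward
# the paths that prove most stable -- a simple epsilon-greedy bandit.
import numpy as np

rng = np.random.default_rng(6)
true_edge = np.array([0.02, -0.01, 0.05, 0.00])   # unknown to the allocator
n_factors = len(true_edge)
counts = np.zeros(n_factors)
means = np.zeros(n_factors)
eps = 0.2                                          # assumed exploration rate

for trial in range(5000):
    if rng.random() < eps:
        k = int(rng.integers(n_factors))           # randomized exploration
    else:
        k = int(np.argmax(means))                  # exploit most stable path so far
    reward = rng.normal(true_edge[k], 0.1)         # noisy realized outcome
    counts[k] += 1
    means[k] += (reward - means[k]) / counts[k]    # running-mean update

print("pulls per factor:", counts.astype(int))
print("estimated edges :", np.round(means, 3))
```

The randomization is what lets the determinism inside the randomness reveal itself: the allocator converges toward the genuinely favorable factor without ever assuming a fixed deterministic set up front.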

Finance is essentially a form of Quantum Science involving the Human world, which must be studied dynamically in the forward direction of time, in a Scientific & Causal Perspective (Randomness being part of that), rather than using backward looking conventional Statistical tools in a static way. The Role of Time Direction is quite important in Finance. One must understand this scientifically rather than study finance as a purely statistical subject.

References:

Prof. Nassim Taleb's Papers

Prof. Marcos López de Prado's Papers

Prof. Didier Sornette's Papers