Scientific Perspective: Black Swan: Causality-Adjusted Randomized Factor Investing: Determinism Inside Randomness
By Pankaj Mani
May 2023
1. Introduction: In this paper, I'll discuss the scientific perspective of financial investments, especially Factor Investing. First, I will talk about philosophy and development in science and its connection with financial markets, and how investment/factor investing can be made scientific. I will discuss some fundamental misconceptions about popular statistical tools, e.g. Regression Analysis, Backtesting etc., and also the role of the Time dimension in Statistics. I will discuss the scientific origin of the Black Swan, Convexity, and finally a new causal approach to factor-based investment. I will also argue that ultimate intellectual maturity is the acknowledgement of duality, the two extremes in Nature in the form of Randomness (Uncertainty) and Determinism (Predictability), like wave-particle duality: rather than being competitors, the opposites are complementary to each other. The paper attempts to unify the two extremes of Black Swan Randomness and Causality in this dual form.
Note: I'm not going to make this mathematically very technical, so that it can be understood by all. Also, there exist fundamental issues with the conventional statistical and mathematical tools, which are blindly applied in finance without understanding the concepts. So, my focus is on the concepts behind the mathematical/statistical tools, which are so often misused and misunderstood in finance.
2. Background of Science:
Science has evolved from Newton's classical laws to Quantum Mechanics over time. Einstein's Theory of Relativity, Schrodinger's wave equation, the Heisenberg Uncertainty Principle, the Principle of Least Action and Feynman's Path Integral approach have all been developed in the study of science over time. Science is still trying to unify the two extremes of the Classical and Quantum worlds' laws. On one hand, things appear Deterministic in the day-to-day Classical world; on the other, they appear highly Uncertain/Random/Probabilistic in the Quantum world. Science is trying to understand the Quantum world of Uncertainty and unify it with the Classical world.
Albert Einstein, throughout his life, couldn't digest the idea that God plays dice with us. He inherently believed in a Deterministic world, a belief that has been refuted over time. On the contrary, Niels Bohr's Principle of Complementarity emphasizes the role of the Observer (the Human Brain) through which the external world is experienced. Even Stephen Hawking talked in the context of a Theory of Everything in Science. It is in fact true that Science can't find the ultimate Theory of Everything without knowing about Human Brains, Consciousness etc. There is also the Paradox of Consciousness, and I wish I could take this up here to show how Science needs a radical approach to unify the ultimate reality of the Universe-Brain mutuality. But that is beyond the scope of this paper, so I will confine myself here.
Feynman's Path Integral approach has been among the most important results in Science, and it originated from Paul Dirac's work on the Principle of Least Action. I would state that even Classical-world theories like Newton's Laws of Motion originate from the Quantum energy laws.
Those who have studied Physics know that one can derive Newton's Laws of Motion from the Principle of Least Action, which is also supposed to be at the core of the Quantum world's laws. It states that any system tries to trace the path of least action (the Action is basically a functional of Kinetic and Potential energy); loosely speaking, least energy in least time.
So, Feynman's Path Integral approach, which is also the principle behind the Feynman-Kac approach for solving differential equations, originated when Richard Feynman tried to show that the Schrodinger wave equation in Quantum Physics can be derived from the Path Integral approach.
Feynman's Path Integral tries to sum over all possible trajectories to find the resultant path, and it came up when he tried to explain wave-particle duality in the context of the famous double-slit interference experiment.
Though this approach by Feynman needs further improvements and clarifications, which I have devised, that is beyond the scope of this paper. Hence, I will keep myself confined to what is already well known rather than proposing my own improvements in physics here.
Feynman's path integral formulation is possibly the most powerful theory in physics for describing the laws of Nature. Starting from Paul Dirac's idea that a body traces the path of least action (roughly speaking, least energy and time), Richard Feynman derived the Path Integral approach to discover the path of an object. It basically extends the Principle of Least Action from the Classical world to the Quantum world.
Below are the equations linking Feynman's Path Integral theory to the Principle of Least Action, which minimizes the action functional of a system and from which the laws of motion follow.

Principle of Least Action:
$$S[x(t)] = \int_{t_1}^{t_2} L\,dt = \int_{t_1}^{t_2} (T - V)\,dt, \qquad \delta S = 0,$$
where $T$ is the kinetic energy and $V$ is the potential energy. Feynman's Path Integral sums over all paths weighted by this action:
$$K(b,a) = \int \mathcal{D}x(t)\; e^{\,iS[x(t)]/\hbar}.$$
Still, Science has a lot to resolve, like action-at-a-distance (non-locality), Bell's theorem and the EPR Paradox (the 2022 Nobel Prize was awarded for some wonderful experiments on exactly Bell's theorem).
I could humbly try to take Science to a much deeper level to explain my own findings here, but I will deal separately with the mystery of action-at-a-distance, which Modern Science has perhaps not imagined so far...
But what is relevant in the Finance context is that the Market, and hence the dual game of Uncertainty and Predictability, will keep on going endlessly as long as investors exist. The Market is essentially the sum of the quantum human behaviors of the buyers, sellers and other stakeholders in the Market.
Hence, while the Market/Finance has to be studied scientifically in a Causal way, one has to acknowledge that Science won't do magic by being able to predict the fundamental Uncertainty inherent everywhere. But there is good news about how to deal with Uncertainty, which I cover in the section on Randomness.
3. Randomness (Unpredictability, Uncertainty) vs Determinism (Predictability)
For long there has been a serious conflict in the scientific/philosophical/social/financial domains over whether the world is Random or Deterministic. Science is still far from settling this. But let me state that there exists a fundamental duality in the Universe in terms of Randomness and Determinism, just as Light behaves as Wave or Particle, a fundamental duality inherent in the Universe. Similarly, there exists a fundamental duality of Uncertainty and Determinism in Nature, and that is well reflected everywhere, including Financial Markets. The Market essentially also has this duality. And it exists relative to the Observer: the same thing can be Random or Deterministic to different Observers, relatively. Information availability is also one of the causes.
At the same time, I must categorically mention that Causation and Randomness are not two opposites; rather, even Randomness has a Cause. But knowing that something is Causal doesn't mean it's completely deterministic. It all depends on the level of information. We know the causal laws of motion of a car, but that doesn't mean one can completely predict an accident!
So, Randomness is NOT randomly Random but "Deterministically" Random! Things are locally Random but globally Deterministic in Nature, the Universe and even Markets. One can relate this to Quantum Physics, where a particle behaves randomly at the individual level but highly deterministically at the group level.
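To make this "locally random, globally deterministic" point concrete, here is a minimal numerical sketch (Python/numpy; a plain coin-flip simulation, nothing market-specific):

```python
# A minimal sketch: individual steps are random, but the aggregate
# converges deterministically (law of large numbers).
import numpy as np

rng = np.random.default_rng(0)
flips = rng.choice([-1, 1], size=100_000)  # each flip: unpredictable

running_mean = np.cumsum(flips) / np.arange(1, flips.size + 1)
print(running_mean[9], running_mean[999], running_mean[-1])
# The early values swing wildly; the last is close to the deterministic
# limit 0 -- "locally random, globally deterministic".
```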
Hence, contrary to the understanding that Randomness is the opposite of, or different from, Causality and that Causality is overestimated by humans, I would rather say that Randomness does have Causality, and even Randomness has inherent Determinism that needs to be discovered; but that doesn't mean things become completely predictable. It's like the Quantum world explained earlier: a fundamental duality exists. Things can be random and predictable relatively. As in the Quantum world, a wave can behave Randomly/Uncertainly at some level but Predictably at some other level.
So, Randomness has to be studied deeply as part of Causality, but we have to acknowledge its existence and the uncertainty inherent in Nature, relatively; things can't be completely Deterministic or completely Random. Randomness doesn't mean anything can happen; it is always driven by Causal forces, forces that we may not completely know. Hence, Randomness is Deterministically Random, not truly Random, in Nature and hence in Financial Markets as well.
Contrary to how it is usually understood, Randomness is not an absolute phenomenon; the same thing can be random and deterministic from different perspectives and levels of information availability, relatively, for the same observer and for different observers too.
Randomness does converge to its equilibrium state! This is governed by the laws of Energy and Nature.
So, first we need to understand the Causality/Determinism behind the Randomness. Randomness does follow some deterministic aspects, but that doesn't mean it is completely predictable. That is the key to success in the real world, and it is linked to attaining more and more Convexity. This will be dealt with in a later section.
The Philosophy behind Prediction: Trying to make the world Deterministic
Almost all financial models try to predict the market based on various statistical and mathematical tools. Even in life we want to predict, and likewise in various social and other domains. Everyone talks about Prediction, more or less. But have we ever thought about what we inherently do when we try to predict? What does Prediction mean scientifically? Prediction means the observer is trying to make the world deterministic by knowing, at present (Time T=0), the state of the world at a future Time T=t. Even Nature doesn't know what that state will be at Time T=t, because Nature is itself a work-in-progress driven by Causal forces. The state of the world at Time T=0 and at Time T=t can be different, and even Nature doesn't know what it will be. So, by trying to predict, the observer might influence the system by influencing itself. All this is scientific in the context of Quantum Reality in Physics.
But is the Universe really completely Deterministic? Is that not against the Quantum laws of Nature? Let's imagine the world were completely Deterministic, or completely Random. Would the world run? No, it would eventually cease to run. The only way the Universe, Markets, Nature and Life can function is through the existence of Duality and Opposites.
So, by trying to predict more and more, we are trying to make the system more and more Deterministic, which it fundamentally is not. In fact, there is a paradoxical underlying truth.
The more an observer tries to predict, the more uncertainty it gets. The less it tries to predict, the more certain things become. By predicting more and more, the observer tends to change its own predictions through a quantum-like effect. As in Quantum physics, the Observer's role is critical and can influence the outcome. Recall the Heisenberg Uncertainty Principle, Schrodinger's Cat and Quantum Entanglement in Quantum Physics.
Put another way: to predict, as with Newton's Laws of Motion in Physics, one needs to have information about all the causal forces. But does one have this? No.
In fact, there is a self-referential issue, as in the famous Godel Incompleteness Theorems.
In Science as in Markets, the fundamental issue is that the Observer tries to predict a system of which the Observer is itself a part. The System is not independent of him/her. So, scientifically, predicting the Market/Nature/Universe/Life is causally predicting the Self! Now that's contradictory! Can we predict ourselves? How far? The Market is essentially the summation of human behavioral forces. If an observer can't completely predict itself, how can it completely predict the resultant sum of all humans?
Caution: I don't mean one should not predict at all, or that the world is not at all predictable. If everything became Unpredictable/Random, that too would cause the system to cease running! The point is that the observer needs to appreciate and admire the existence of both sides and act optimally.
So, the key is not to try to predict more and more, but rather to act in a balanced way, appreciating the principles and laws of Nature.
These opposite forces, this duality, create the potential that runs the system. We often see people from opposite schools of thought slamming and criticizing others to prove their own point.
But the highest level of intellectual maturity, consciousness and knowledge is the acceptance of this duality, the existence of opposites at the same time. Yes, this opposition is the source of the energy potential that induces the current to flow; otherwise everything would stop. This duality is inherently part of Nature, scientifically.
The beauty of Nature/Life/Markets/Universe is Duality, and that is essential to create the potential energy needed to function.
4. Connection between Science, Human Behavior & Financial Markets
Markets are driven by Human Behavior, which essentially follows the scientific laws of Nature. The Human Brain is run by neurons, electric signals etc. The Market is essentially like a quantum wave; in fact, a human itself behaves like a quantum wave, scientifically.
Science, which often means Physics, has some fundamental differences from the human domain. Unlike physical objects, human behavior is somewhat different, but the two are indeed linked: the Human Brain is also run by neurons, electric signals, energy etc. Let me state that, like Physics, the human quantum world also has inertia, mass, weight, momentum, force etc. Broadly speaking, just like the Classical world, the Quantum world has the equivalent concepts. Indeed, Newton's Laws of Motion, the laws of Energy and the Principle of Least Action (though "Least" has a different meaning in the human quantum world) are all valid in the human world. And as a result, even Markets have Momentum, Mass, Energy, Gravity, Acceleration etc. What we call Stocks also have mass, acceleration etc. They too follow the laws of Science/Energy.
It has traditionally been said that humans are different from ordinary physics. Indeed that's true, but humans also follow the laws of Nature, and hence so do the Markets.
And for that reason, markets have to be studied from that scientific perspective of energy, mass, momentum, acceleration etc. In fact, I will explain later the scientific background of concepts like Large Cap, Small Cap, Mid Cap etc.
5. Science behind the Black Swan & How to Deal with It in the Real World
As we have seen earlier, the scientific Universe/Nature has Randomness and Predictability as an inherent duality, like wave-particle duality. Despite the fact that everything is governed by Causal forces, many of the Causal forces governing the dynamics lie beyond an observer's level of information, relatively. So, linked to Godel's Incompleteness results in Mathematics, in any system there will always exist a zone of Unpredictability as well as Predictability, and they exist together. Practically, there will always be some information not available to an observer at any given time.
The Black Swan emerges out of this zone of Uncertainty and Unpredictability, which is scientifically inherent in the Universe, Nature, Life, and hence in Financial Markets. It exists in a relative, not absolute, sense, depending on the level of information available at a time.
Hence, technically, the Black Swan exists precisely in the domain of "Unpredictability, Uncertainty." The statement "Predicting the Black Swan" is therefore contradictory, as it means predicting the Unpredictable, "predicting the Uncertainty." I again emphasize that it depends on the level of information available to an observer.
So, what needs to be done in the real world about Black Swan type events? First of all, one has to understand that they scientifically originate in the Unpredictability/Uncertainty zone. So, predicting the Black Swan means trying to predict Unpredictability: contradictory! If it were predictable, it wouldn't have been termed a Black Swan, technically.
So, one has to understand the science of Uncertainty/Randomness. As explained in the Randomness section, even Randomness has hidden deterministic patterns, and things do converge to their stable equilibrium state over time, according to the laws of Nature/Energy. So, rather than wasting time trying to predict Black Swan events, one needs to focus on managing risk so as to survive, or benefit, whenever they come. This can be done by attaining Convexity! This is because, post Black Swan, the system converges back to its equilibrium state. Those who blow up during Black Swan events are not risk-prudent; for example, they are exposed to risk through leverage or similar fragility. The key is to survive the Black Swan, and hence one needs to be convex enough to survive. Part of the strategy is how we can become more and more convex.
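A toy sketch of why convexity helps survival (Python/numpy; the payoff shapes, shock sizes and premium are assumptions for illustration, not a strategy):

```python
# A convex payoff caps the downside and so survives a rare crash,
# while a plain linear exposure takes the full hit.
import numpy as np

rng = np.random.default_rng(1)
shocks = rng.normal(0, 0.01, 10_000)
shocks[rng.random(10_000) < 0.001] = -0.5   # rare "Black Swan" crash mixed in

linear = shocks                              # plain exposure
convex = np.maximum(shocks, 0) - 0.002       # option-like payoff minus premium

print("linear mean:", linear.mean(), "worst:", linear.min())
print("convex mean:", convex.mean(), "worst:", convex.min())
# The convex payoff's worst outcome is just the premium paid (-0.002),
# so it survives the crash that ruins the linear exposure.
```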
6. Background of Factor Investing
Factor Investing has traditionally been little more than an application of Linear Regression Analysis to historical return data. It is typically associational rather than Causal: a regression is run to calculate some superficial, not-necessarily-reliable relationships like beta and the so-called Alpha. The Error term, especially, is often ignored. Factors such as Value, Momentum and Size are quite prominent, based on economic opinions that are not necessarily scientific.
One of the most talked-about factors has been Value, which has underperformed in recent years, causing a global fuss over whether Value really matters or has died. So, how is Value calculated?
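For reference, this is roughly what the traditional regression-based factor exercise looks like, sketched on synthetic data (Python/numpy; the factor names and coefficients here are illustrative assumptions, not estimates):

```python
# Traditional factor investing in miniature: regress asset returns on
# factor returns and read off alpha and betas (all data synthetic).
import numpy as np

rng = np.random.default_rng(2)
n = 1000
value, momentum, size = rng.normal(0, 0.01, (3, n))  # stand-ins for HML, MOM, SMB
noise = rng.normal(0, 0.02, n)                       # the often-ignored error term
asset = 0.0001 + 0.5 * value + 0.3 * momentum + noise

X = np.column_stack([np.ones(n), value, momentum, size])
coef, *_ = np.linalg.lstsq(X, asset, rcond=None)
alpha, betas = coef[0], coef[1:]
resid = asset - X @ coef
print("alpha:", alpha, "betas:", betas)
print("residual std (the 'ignored' randomness):", resid.std())
```

Note how the residual dispersion, the component this paper argues actually matters, dwarfs the fitted alpha.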
7. Application of Statistical Tools in Finance: Misunderstandings: The Role of Time in Mathematics/Statistics in the Real World
Statistics needs to be scientific: statistical models are always backward-looking.
Statistics is heavily applied in the world of finance. Almost all financial models so far use statistics, whether for risk management, prediction etc. But the fundamental issue with all conventional statistical models is that they are backward-looking in the direction of time. This ignorance of the role of the Time dimension has arguably made financial modeling a scam in itself. Mathematics has always been studied independently of Time, assumed absolute for all. This Platonist view is the cause of the misunderstanding and misapplication of statistical tools in Finance, which suffer from hindsight bias.
Financial statistical models are always built on past data in the direction of Time. At Time T=0, a modeller does curve fitting on data over T<0.
But the modeller does not understand, whether out of ignorance or inertia, that T>0 and T<0 are not the same in the world of Statistics/Finance.
By looking backward in the direction of Time, the modeller rules out the Randomness component that was present when looking forward back then. The future direction of time has many possible random paths, but the backward direction of time has just one deterministic path: the one that actually occurred, and on which the data fitting is done. This foundational blunder of ignoring the Randomness in the backward direction of time is at the core of all the issues in financial modeling based on statistical tools. The entire estimation of risk, prediction etc. based on such tools ignores the vital role of the Randomness that a trader actually undergoes while taking decisions in the real world. Unfortunately, out of the many possible unobserved future paths, the statistical modeller takes only the single observed path in the backward direction of time.
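A minimal simulation of this point (Python/numpy; the return parameters are assumed, not calibrated): the single realized path can look much safer than the ensemble of forward paths that were possible.

```python
# Backtesting sees one realized path; the forward view had many.
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_days = 10_000, 252
daily = rng.normal(0.0003, 0.02, (n_paths, n_days))
paths = np.cumprod(1 + daily, axis=1)

worst_per_path = paths.min(axis=1) - 1      # worst point of each path
realized = worst_per_path[0]                # treat path 0 as "the history"
print("realized worst:", realized)
print("5th percentile across possible paths:", np.percentile(worst_per_path, 5))
# The one observed history can look far milder than the risk that was
# actually present across the unobserved alternative paths.
```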
The fundamental issue with Statistics is that it is always done in the backward direction of time, based on historical data. That is where the roles of Causality and Science come in: Causality tries to look forward using a scientific mechanism. Imagine that, in physics, we used statistical analysis of a past trajectory to trace out the future trajectory. Does that sound awkward? Then why do we do exactly that in Finance? Finance has to be studied like scientific Physics, on that principle, in the forward direction of time. This ingrained psychology of statistical analysis in finance should be replaced by scientific causal analysis of the future, where we study Randomness, Determinism, Human Behavior etc. And yes, some investment and risk strategies are causal, based on scientific principles. We will talk about them later.
Hence, the psychologically ingrained backward-time-looking statistical analysis must be discarded, and forward-looking scientific causal analysis, as we do in Physics, must be adopted. This is also the key to scientific risk taking and management and to dealing scientifically with Black Swan type events in the real world.
Tools like backtesting are a subset of that unscientific understanding of the role of the arrow of time in real-world Statistics.
That is why Finance has to be developed in the forward direction of time, as we do in Physics. Do we backtest in Physics to predict the trajectory of a vehicle, or do we study the equations of the causal forces? This fundamental psychological change has to be brought to the world of finance, where the trajectory of a stock price should be studied scientifically by analysing causal forces rather than by backtesting.
Even those who simulate randomly into the future, e.g. with Monte Carlo simulations, must understand that there can always be more scenarios in the real world than one can simulate using computers or otherwise. This is fundamentally related to the Godel Incompleteness Theorems.
The point is that Finance has to be studied in a scientific way in the forward direction of time, as we study physics. That is how we can gain a better understanding of risk and return in real-world finance. For that, we will have to understand the causal concept of Uncertainty/Randomness in Life/Nature/Markets, because by understanding this concept one can take scientific decisions in building a portfolio or otherwise. Otherwise, all those statistics-based tools like the Sharpe Ratio, Drawdown, Correlation etc., computed from historical or observed data, cover just a tiny subset of all the possibilities and are hence hugely misleading for understanding risk.
As in the Feynman Path Integral approach to Quantum Mechanics and Paul Dirac's Principle of Least Action: explore all the paths, not just the observed one.
8. Regression Analysis & Paradoxes
Let's assume the linear regression equation
$$Y = \alpha + \beta X + \varepsilon,$$
where $Y$ is the dependent variable, $X$ is the independent variable, and $\varepsilon$ is the error term of the linear regression. Reversing the roles gives a different equation:
$$X = \alpha' + \beta' Y + \varepsilon'.$$
So, we can see that association ignores the direction of Time here; it can go both ways. But Causation has a direction: cause and effect occur at different, subsequent times! To determine Causality, we have to block all other paths/factors and test the direct effect; for example, stocks in different economic scenarios, or other company-specific factors, to test the causality.
For example: the Sun rises in the morning and a bird sings in the morning. They are obviously correlated, but to test whether the bird's singing causes the Sun to rise, we can check what happens when the Sun doesn't appear on cloudy days, or whether the Sun still rises when the bird doesn't sing. We would find the dependence doesn't hold: it is mere association! Likewise, two cars running along a road are associational, not causal: they may suddenly change direction if the roads diverge, so mistaking the two cars' relationship for a causal one can suddenly mislead!
Factor Investing is similar. If certain factors work, that doesn't mean they are causal; they can be misleading, and can work by chance, unless Causality is established!
If the relationship is causal, the absence of one event would definitely affect the occurrence of the other. If A causes B, then whenever A doesn't occur, B should be affected, every time. That needs to be tested.
But we do have to accept that there can be many more unknown causal factors that we might not know of, appearing in the form of randomness errors! That is why an RCT-type approach is required at a later stage to deal with them!
Point 1) In the one-variable Linear Regression Model (LRM) above, the most important term is the Error term (the residual). It represents the Randomness/Uncertainty component. The error term is the TRUE origin of Black Swan events!
Note that in the multivariate version of the regression formula, $Y = \alpha + \beta_1 X_1 + \beta_2 X_2 + \dots + \varepsilon$, the $X_1, X_2$ etc. are treated as independent variables, and their interdependence is usually ignored in factor-based investment models.
For example, while doing regression analysis on factors, why is it assumed that factors like Value (HML), Momentum and Size (SMB) are independent? Are they really independent? No. So how accurate is this regression-based calculation of Alpha?
1) Reverse regression: regressing Y on X and X on Y give different results.
2) Relativity of Beta: when more independent variables X1, X2 etc. are included, Beta changes.
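A sketch of point 2, the relativity of Beta (Python/numpy, synthetic data): the beta of the same variable changes once a correlated regressor enters the model.

```python
# The "same" beta of x1 changes when a correlated second regressor is
# included -- beta is relative to the model, not absolute.
import numpy as np

rng = np.random.default_rng(4)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)     # correlated with x1
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

b_single = np.linalg.lstsq(np.column_stack([np.ones(n), x1]), y, rcond=None)[0][1]
b_joint = np.linalg.lstsq(np.column_stack([np.ones(n), x1, x2]), y, rcond=None)[0][1]
print("beta of x1 alone:", b_single)          # ~1.8, absorbs x2's effect
print("beta of x1 with x2 included:", b_joint)  # ~1.0
```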
Statistics is NOT wrong, but the statistician needs to understand the statistical terms well in real-world applications. The error term, which is so often ignored, is the most vital component in real-world financial applications. That is where the mystery lies!
Structure of Machine Learning models, like regression and others:
It is all about finding suitable functions $Y = f(X) + \varepsilon$. Here too, one can see that although $f$ can take many forms, the error term still exists, and it is the source of Black Swan tail risk.
The key issue is that while most approaches focus on finding $f(\cdot)$ by fitting historical data, the focus should be on managing the future error terms (the random, uncertain components).
That is where real-world model development is needed: managing the unknowns and the random components.
Can you predict your own life using Regression Analysis? How far? The Market is essentially the resultant of all human behaviors; we must not forget that.
As Nassim Taleb says, the real world is more random than our models of it. Indeed, that's right!
One more fundamentally important point is that stock prices don't really move in the Euclidean space of paper; that is a virtual trajectory. In the real world, the stock price moves in a different, non-Euclidean/Riemannian economic space-time which captures quantum human minds etc. We then superficially/virtually draw the trajectory on two-dimensional paper. So, when doing linear regression analysis, the result should be independent of the intrinsic characteristics of the underlying space: whether the regression is done in Euclidean space or some other geometric space (as in machine learning etc.), the true physical relationship must be invariant. Hence, parameters like Beta, which depend on the slope of a line in Euclidean space-time, can be fundamentally misleading!! In a different space-time, Beta (calculated using the Euclidean least-squares metric) could well be different, whereas the true relationship has to be independent of the underlying space. This is an extremely fundamental issue with conventional regression analysis. In a nutshell, this Beta relationship may reflect not the real economic space-time but the characteristics of the space of the paper on which we have superficially chosen to draw the trajectory of the stocks!
Regression, Part 2:
A) Causation lives in the dimension of time, which least-squares (LS) linear regression does not take into account. It treats the variables statically, in a timeless dimension: only space.
We have to understand, from a real-world perspective, that Causation is established in the space-time dimension, not just in space. As in the physical world, a cause occurs at, say, T=0 and its effect occurs at some future time T=t. If we ignore the dimension of Time and take only spatial locations into consideration, we get mere association, not causation. For causation we have to test cause and effect in space-time, not just space. Unfortunately, LS regression is conducted, particularly in factor-based investing and elsewhere in finance, ignoring the Time dimension. This omission of time makes the entire relationship an association of patterns that could be mere coincidence, not scientifically causal.
By running the usual LS regression
$$Y_t = \alpha + \beta X_t + \varepsilon_t,$$
we inherently assume that X and Y can transmit information causally and "simultaneously" at time T=t, faster than the speed of light. But how is this possible? If they are really causally related, then X should first occur at time T=t, and this should cause an effect on Y at some future time T=t+n with n>0. But conventional LS regression inherently assumes that X and Y are cause and effect at the same time, which is a scientific contradiction in the real world of Nature.
Otherwise, it proves that the X-Y relationship is associational, based on superficial patterns: either caused by coincidence (like two independent cars moving along the road in the same direction, misleading an external observer into thinking they are causally linked), or caused by the hidden causal mechanism of some other variable, known as a Confounder. The LS regression above can hold at the same time, "simultaneously," only if there is a hidden Confounder (assuming it's not coincidence!), since information can't travel faster than the speed of light! (And of course it isn't action-at-a-distance, logically!)
So, for causal proof, the equation has to be written in the form of the do-calculus. The causal intervention asks: if X is causally set to the value x at time T=0, for example $E[Y_t \mid do(X_0 = x)]$, how does the effect on Y behave at a future time t? And that time t can take different values on a case-by-case basis.
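A minimal sketch of the do() idea on a simulated structural model (Python/numpy; all coefficients are assumptions): association and intervention give different slopes when a confounder is present.

```python
# With a hidden confounder Z, the observational regression slope of Y on
# X differs from the true interventional (causal) effect.
import numpy as np

rng = np.random.default_rng(5)
n = 50_000
z = rng.normal(size=n)                        # hidden confounder
x = z + rng.normal(size=n)
y = 0.5 * x + 2.0 * z + rng.normal(size=n)    # true causal effect of X is 0.5

obs_slope = np.cov(x, y)[0, 1] / np.var(x)
print("observational slope:", obs_slope)      # ~1.5, inflated by Z

# Intervention: set X by fiat, do(X=x), cutting the Z -> X arrow.
x_do = rng.normal(size=n)                     # X no longer depends on Z
y_do = 0.5 * x_do + 2.0 * z + rng.normal(size=n)
do_slope = np.cov(x_do, y_do)[0, 1] / np.var(x_do)
print("interventional slope:", do_slope)      # ~0.5, the causal effect
```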
I have been advocating for long that the Time dimension is often ignored in the statistical and mathematical world (a timeless Platonic world), unlike the physical world (space-time). The role of time often appears nonsensical, but at a deeper level it makes a huge difference! Many of the paradoxes in Mathematics fundamentally originate from this omission of Time as well. For example, the Theory of Relativity treats "simultaneity" as relative, while conventional Set Theory in Mathematics rests on simultaneity as an absolute phenomenon. That is beyond the scope of this paper, but I mention it to show how this foundational misunderstanding percolates down to statistics, and how association in LS-based regression is so often misunderstood as causation!
B) Let's go back to the history of the LS regression method. It was first formulated by mathematicians, Legendre (1805) and Gauss (who claimed to have used it from around 1795), for estimating the trajectories of celestial physical bodies, like our Earth, in Euclidean space.
That is a mathematically good estimation, because that physical space-time behaves like Euclidean space-time. But a foundational issue arises when that tool for physical celestial bodies is applied to financial and economic space-time. Financial and economic space-time, driven by quantum human minds, is not essentially Euclidean; in the real world it is non-Euclidean/Riemannian or some other space-time. Statisticians import observations from that space into the Euclidean space of the computer screen or paper, draw them, and apply LS regression! But this raises the fundamental question: does the Euclidean space of the paper, and its metric calculation, not bias the estimation of financial and economic variables that come from a different, non-Euclidean space-time? I am stating this because, the way the least-squares method is constructed, is it not inherently dependent on the Euclidean metric relationship? What if these data live in some other, non-Euclidean space? The relationship should be independent of the underlying geometry of the space on which the data happen to be drawn. Real-world economic and financial space-time is not Euclidean!! This is a point worth exploring, as even ML methods often apply Euclidean metric tools, where non-Euclidean and other Riemannian metric spaces might better capture the relationship.
But anyway, for a while let's confine ourselves to the Euclidean setting generally used to derive the least-squares regression method.
The least-squares regression tool rests on the algebraic structure of the least-squares formula, which minimizes the sum of squared differences (call them Euclidean errors!). If we change this objective of minimizing the squares to something different, the entire calculation of Beta changes, and the value of Beta changes! Beta is actually dependent on that Euclidean method. And why we minimize the sum of squared errors is itself in question! How effective is this procedure at taking real-world outliers into account? Suppose we don't minimize that sum but rather some other function better suited to outliers, for example from a tail-risk perspective. There can be different values of Beta on the same data set, depending on how we define the error-minimization function. It's not absolute, it's relative! The important concern is that in factor investing and in finance and risk management, the LS Beta is blindly applied for allocation and risk estimation, which has proven disastrous in the real world in extraordinary (outlier) scenarios.
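A sketch of this relativity of Beta to the error objective (Python with numpy and scipy; synthetic data with planted outliers): least squares and least absolute deviations give different betas on the same data.

```python
# The "same" data give different betas under different error objectives:
# least squares vs least absolute deviations (outlier-robust).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n = 500
x = rng.normal(size=n)
y = 1.0 * x + 0.5 * rng.normal(size=n)
y[x > 2] -= 30                      # fat-tailed outliers at large x

ols_beta = np.polyfit(x, y, 1)[0]   # minimizes sum of squared errors

def lad_loss(b):                    # sum of absolute errors instead
    return np.abs(y - b[0] * x - b[1]).sum()

lad_beta = minimize(lad_loss, x0=[0.0, 0.0], method="Nelder-Mead").x[0]
print("OLS beta:", ols_beta, "LAD beta:", lad_beta)
# The squared-error beta is dragged down by the outliers; the
# absolute-error beta is not. Neither is "the" beta -- each is relative
# to its minimization objective.
```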
Further, one more important aspect of LS-based regression is the exogeneity condition
$$E[\varepsilon \mid X] = 0.$$
If this exogeneity condition on the error is not satisfied, the whole estimation of Beta is unreliable and biased! This is the fundamental reason why, in real-world finance, Beta is so often biased and unstable in financial time-series data.
Exogeneity means the conditional independence of the error term: $E[\varepsilon \mid X] = 0$. If the exogeneity condition is not satisfied, the error itself is a function of some hidden relationship. It could be that the error depends on Y, or on X, or on some other hidden variables that govern the dynamics of the error, and this in turn makes Beta biased and unreliable, as the two are mathematically related! Hence the internal dynamics of the error have to be established, otherwise they contaminate the other components of the LS regression. The error can even have its own dynamics, say if the error is itself a regression on some other variable, e.g. $\varepsilon = \gamma Z + u$. If such is the case, the estimated $\beta$ will be highly misestimated! It is like an algebraic equation in which the "constant" term is actually a variable: the solution is then misleading, biased and mathematically incorrect. This actually happens with real-world financial and economic time-series data!
In fact, such conceptual mistakes are highly misleading in real-world finance and economics, where beta (technically, the slope) becomes too variable, dependent on the data time frame etc., and exposed to Black Swan tail risk, affecting trillions of dollars of investment globally and, with it, the lives of ordinary people and investors.
The error basically means the random component: all the deterministic components of Y that depend on independent variables like X have to be removed from the error term.
So, in the financial/economic world, unless the error satisfies the exogeneity condition, the LS estimate of $\beta$ will not be unbiased. If the expected value of the error term is not 0 (a violation of the exogeneity condition), it means there are still some hidden deterministic variable relationships to be discovered inside the error term. The error term, like the constant in the equation, is then not a constant but a function of some hidden variables, or even of X or Y themselves. The correct meaning of $\varepsilon$ is the uncertain portion of $Y$: in a real-world example, the error term represents the unpredictable component when we try to explain Y in terms of X causally.
It's as if we intervene with the cause $X$ at, say, time T=0 and measure its effect on $Y$ at time T=t. In a true causal sense these two occur at different points in time, not simultaneously, because information takes time to travel. We then try to understand what uncertain random component of Y is not explained by the deterministic components of the equation: that is what we causally call the error. But the omission of the Time dimension turns the whole thing into a superficial associational relationship, a mere artifact of the Euclidean metric space!! The way the error is defined, as $\varepsilon = Y - E[Y \mid X]$, shows that the error term is not independently defined. This is like the spurious case B1 explained by Prof. Marcos López de Prado in his interesting paper.
"This is most likely valid in the field for which LS regression was formulated, celestial calculation, but not often in the financial and economic space. Real-world data hardly ever satisfy the exogeneity condition."
Econometricians can't define the error by deriving it from the algebraic identity $Y - \beta X$; rather, the error term should independently satisfy the exogeneity condition $E[\varepsilon \mid X] = 0$: the error should be independent of X. The exogeneity condition means the regression assumes a deterministic setup, with the expectation of the error (the randomness) equal to 0.
The correct meaning is: if we perform $do(X)$ at time t=0, then at time t>0, Y should equal $\beta X$. The error means the part of Y left unexplained by the causal relationship of Y due to X.
Most machine learning tools focus on the deterministic part, but there is a need to focus on the randomness (error) part, its underlying mechanism, and how to manage those unexplained, uncertain components.
Running the regression contemporaneously, at the same time, implicitly assumes the role of a hidden confounder.
It is also important to observe that in LS regression, the order of the variables matters. The two regression lines, Y on X and X on Y, are not derivable from each other by inverting the coefficients. That means that in the LS regression method, in general,
$$\beta_{Y|X} = \frac{\mathrm{Cov}(X,Y)}{\mathrm{Var}(X)}, \qquad \beta_{X|Y} = \frac{\mathrm{Cov}(X,Y)}{\mathrm{Var}(Y)}, \qquad \beta_{Y|X} \neq \frac{1}{\beta_{X|Y}}.$$
This reveals the internal dynamics of LS regression: the minimize-the-sum-of-squared-errors method is not symmetric in this sense. The order matters, and the values of the slope and the error terms depend on which variable plays which role. This implies that in real-world financial and economic applications, before applying this associational LS regression, one has to be sure about the order; otherwise the derived slopes and randomness errors change, leading to different estimations of risk, affecting allocations etc. It also shows that the error itself is relative and order-dependent in an LS-based system, whereas in the real world the random components should be linked to each other if the variables are the same and only their order changes.
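A quick numerical check of this asymmetry (Python/numpy, synthetic data): the two slopes multiply to r-squared, not to 1.

```python
# Regressing Y on X and X on Y are not inverses of each other.
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=10_000)
y = 0.8 * x + 0.6 * rng.normal(size=10_000)

b_yx = np.cov(x, y)[0, 1] / np.var(x)   # slope of Y on X
b_xy = np.cov(x, y)[0, 1] / np.var(y)   # slope of X on Y
r2 = np.corrcoef(x, y)[0, 1] ** 2

print("b_yx:", b_yx, "1/b_xy:", 1 / b_xy)   # not equal
print("b_yx * b_xy:", b_yx * b_xy, "= r^2:", r2)
# The product of the two slopes is r-squared, so the two fitted lines
# coincide only in the limit of perfect correlation.
```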
Coming back to the commentary: the entire system is being superficially misled by associational, LS-type regression relationships, affecting trillions of dollars globally.
The way the sum-of-squared-errors minimization is defined in the LS system makes it dependent on a Euclidean-type geometry.
Econometricians assume they are doing causal work, but the mathematical tools used are associational. If the LS system is changed, Beta changes too. So the question is: why the LS system at all?
Rather than minimizing the sum of the squares, there can be other, better approaches, for example from a tail-risk perspective. (That is a detailed research topic in itself!)
There is no fixed Beta. The value of Beta depends on how the error terms are dealt with, as in the LS equation, because Beta is derived by minimizing the sum of squared error terms.
Say there are $n$ data points $(x_i, y_i)$. We are trying to find the regression equation
$$\hat{y} = \alpha + \beta x,$$
where the values of $\alpha$ and $\beta$ are those which minimize the function
$$S(\alpha, \beta) = \sum_{i=1}^{n} (y_i - \alpha - \beta x_i)^2.$$
Note what happens here: at time T=0 the error (randomness) terms are figured out, and then at time T=t>0 they are minimized mathematically to derive the (deterministic) beta term! This is an in-sample derivation of the deterministic Beta term from the mathematical minimization of a function of the random terms! Please note the role of the Time dimension here.
Writing the regression equation as
$$\varepsilon_i = y_i - \alpha - \beta x_i,$$
we see that the randomness term is a function of the deterministic terms, and that is why we take derivatives, $\partial S/\partial \alpha = 0$ and $\partial S/\partial \beta = 0$, to find the minimum.
That means at time T=0 we first form the errors, and then calculate a suitable Beta at time T=t>0. Please note the role of the direction of Time as well: for Causation, the Time dimension is a fundamental real-world requirement. By the way, this is in-sample estimation, where Beta is found by deriving it from the error terms.
Ideally, mirroring what happens in the real world, we should pre-estimate Beta (the deterministic expected term) at T=0, then measure the error terms INDEPENDENTLY at T=t>0, and test whether the exogeneity condition $E[\varepsilon \mid X] = 0$ is satisfied or not! In the existing approach, INDEPENDENCE is completely compromised, as Beta is derived from the errors themselves!! This is foundationally incorrect in principle, leading to a biased, in-sample estimation of Beta.
So, the entire traditional LS Beta estimation is in-sample. It says nothing about the out-of-sample Beta in the real world, in the forward direction of time! Especially for financial data, which are often unstable, this can lead to huge misestimation in real-world, out-of-sample results.
As we saw earlier, the LS approach was formulated for celestial physical bodies, i.e. stable Classical-world data, unlike financial data driven by the quantum minds of human traders and investors.
19. Exogenous Condition:
The LS sum of errors is minimal when Expected(error) = 0. The LS approach relies on the expected (average) figure, and the average is a hugely misleading statistical tool in a financial world with fat-tailed data.
The issue is that even the exogeneity condition is not sufficient in real-world finance: even if the expectation (average) of the error is 0, a large negative move followed by a large positive move can expose the system to tail-risk bankruptcy on the large negative error, leaving it unable to wait for the next positive upside move. So, from the tail-risk perspective, even the exogeneity condition is not sufficient. How these conditions should be reframed is another detailed discussion of its own!
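A toy illustration of why a zero-mean error is not enough (Python; the -30%/+30% sequence and the 4x leverage are purely assumed numbers):

```python
# E[error] = 0 on average does not protect a leveraged book from ruin
# along the way -- the order of the errors matters.
import numpy as np

returns = np.array([-0.30, +0.30])   # mean is exactly 0
leverage = 4.0

wealth = 1.0
for r in returns:
    wealth *= 1 + leverage * r
    if wealth <= 0:
        print("ruined before the offsetting move arrived")
        break
print("final wealth:", max(wealth, 0.0))
# The -30% move at 4x leverage wipes the book out, so the later +30%
# never helps: a zero average hides a fatal path.
```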
20. Beta is not Absolute but Relative
Beta depends on how the error is dealt with, and also on which other variables are included.
The exogeneity condition on the error is extremely important. So, probably one test is: refine Beta until the exogeneity condition on the error randomness is satisfied independently!! Only then will Beta be unbiased. Otherwise, Beta is biased, as the error term is itself some function rather than independent! As long as the error doesn't satisfy the exogeneity condition independently, it shows there are more factors (deterministic components contributing different betas) still to be discovered.
Moreover, the value of Beta depends on how the error-term formula is tweaked; besides the LS form there can be other forms as well, and Beta differs in each case. It is all dependent on the method followed. Beta should be chosen to maintain a balance between tail outliers and normal observations, depending on the requirements.
If the error has an intrinsic hidden pattern, it makes the existing beta biased, and different types of structure within the error affect the dynamics of beta, and with them possibly the causality, or the lack of it.
21. Hidden Dynamics of Error Terms:
Further, there are different possible scenarios: the error terms can have hidden deterministic components which affect X or Y, or X can affect the error term, or even Y can affect the error term; in each case the validity of Beta is different.
Say, for example, we fit
$$Y = \alpha + \beta X + \varepsilon,$$
but the error term is biased and not random, say
$$\varepsilon = \gamma Z + u, \qquad \mathrm{Cov}(X, Z) \neq 0.$$
In that case, the entire estimation of $\beta$ is highly biased, because it is derived from an error term which is itself implicitly a function of other variables, rather than independent.
Biasedness of Beta:
It's like a three-variable algebraic equation in x, y and z: if you assume z is constant and treat the equation as quadratic in x and y, the result can be technically wrong!! Similarly, until the error term is exogenous and independent, the regression leads to wrong beta estimation!! That's why independence is of such huge importance in regression.
If the error is a function of some hidden variables, then Beta won't be correct. So how do you make sure the error is independent? Simply running a regression without independence of the error is technically wrong and can lead to bias and misestimation, just as treating a variable as a constant in a quadratic or functional algebraic equation gives a wrong solution.
In Y = mX + c, the c needs to be strictly constant; it can't be a variable related to X and Y, or else m will be incorrect. One can't treat a function as a constant and still do functional algebra; it is technically wrong.
So, when doing such statistical regressions, make sure the error term is not a function of some related variables, or else beta will be biased and unstable over time, since beta depends on the error.
That is the reason Beta becomes unstable. It can be tested in the real world, where we see that the measured value of beta keeps changing rather than staying a fixed constant! One can verify this experimentally on many financial time series and check how beta has its own inherent dynamics: beta keeps changing dynamically over the time dimension, or relatively, when new variables are included!
The error will change over time if it is itself a function and the exogeneity conditions are not satisfied. And if the error is itself a function that does not satisfy the exogeneity conditions, it can be the source of the Black Swan.
Error Terms & Simpson's Paradox: Simpson's paradox reveals a lot about reality!
A) It reveals that the DETERMINISTIC parameters like Alpha and Beta themselves have Randomness over time!
B) Further, the existence of these parameters and error terms is RELATIVE!! The LINE OF BEST FIT itself behaves geometrically RANDOMLY over time! What was the direction of the error terms eventually becomes the line of best fit, and what was the line of best fit becomes the error! This reveals a fundamental aspect of relativity and randomness over time for an observer.
It reveals the fundamental duality of Randomness and Determinism existing in the Universe, and hence in Markets. The Universe, and hence its subset, the Markets, have both components, Random and Deterministic. One can't absolutely differentiate between the two: they exist relatively and simultaneously. The Deterministic (line of best fit) becomes the Random (error term), and the Random (error terms) becomes the Deterministic (line of best fit).
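A minimal construction of Simpson's paradox (Python/numpy; the group offsets and slopes are assumed for illustration): every sub-population has a positive slope while the pooled line of best fit slopes the other way.

```python
# Within each group the slope is positive, yet the pooled "line of best
# fit" is negative -- the fitted parameters are relative to the data cut.
import numpy as np

rng = np.random.default_rng(8)
slopes, xs, ys = [], [], []
for offset in [0.0, 4.0, 8.0]:               # three sub-populations
    x = offset + rng.normal(size=300)
    y = 1.0 * (x - offset) - 2.0 * offset + 0.3 * rng.normal(size=300)
    slopes.append(np.polyfit(x, y, 1)[0])
    xs.append(x)
    ys.append(y)

pooled = np.polyfit(np.concatenate(xs), np.concatenate(ys), 1)[0]
print("within-group slopes:", slopes)        # all ~ +1
print("pooled slope:", pooled)               # negative
```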
One has to be very careful when making decisions based on regression analysis, particularly while ignoring the error terms! Blindly following a fixed line of best fit to make investment decisions can be highly misleading and disastrous in the real world!!
So, this points to an extremely fundamental caveat when doing linear regression analysis: parameters like Beta, Alpha etc. can keep changing depending on the data set, and there are no fixed values.
First, two things tracing the same trajectory might not be correlated in the true sense. For example, two cars moving along a road appear correlated to an external observer in the sky. But that is not really so: the road may diverge after a long time, and both cars may change direction, fooling the external observer who assumed them to be correlated by watching their past trajectories for so long! Similarly in markets!!
The point is to establish the scientific Causation first; then one can rely on regression scientifically, to some extent, as long as the error is managed well. For that causation, one has to block the non-causal paths, observing Y and X while blocking other variables, or checking how Y behaves when X is present and when it is NOT. If X causes Y, then when X stops, Y must be affected or stop!
We have to understand that regression is fine, and can help in prediction at times; but given the occasionally large, unexpected error terms, it can also be highly misleading and disastrous. So, what one should optimally do is form expectations along the regression line of best fit while remaining prepared to manage the random error term, which can be huge in a world full of complexity.
24. Equity Risk Premium Puzzle: This is indeed a bug in the statistical structure of factor investing based on the linear regression analysis we discussed earlier. Capturing Human Behavior dynamics deterministically with linear regression, while ignoring the randomness component, is deeply flawed! The puzzle arises because we misunderstand the concept and foundation of Regression Analysis. Any regression-based model for determining this risk premium will have this fundamental concern. It is technically related to the fundamental issues of randomness and structure in the regressions on which CAPM and factor-based models are built: the foundational issue lies in the statistical tools themselves, on which such model puzzles have been developed!!
So, the ultimate solution to one of the most important historical puzzles in finance is to look into the flawed conceptual understanding of the statistical foundation that gave birth to the so-called puzzle. The origin of the puzzle lies in the loopholes and misunderstandings in the statistical infrastructure that produced this superficial puzzle. Once that is understood, the puzzle ceases to exist automatically! So, this puzzle in fact hints that the foundation needs revising!
I must again categorically mention: when we get an extraneous solution to a mathematical equation, it is not that the mathematics is incorrect; rather, the assumptions made while applying the mathematics are incorrect. Similarly, here we need to revisit the flawed assumptions at the foundation of factor investing and its statistical infrastructure, like regression.
Coming to the Factor Investing analysis, as the byproduct of regression analysis:
Equity Risk Premium Puzzle:
The main reasons behind this Equity Premium puzzle are the fundamental structure of the regression-based CAPM model, the misestimation of risk, and the ignoring of the Time dimension.
Risk is often measured in terms of Beta based on historical data, in the backward direction of time, using the same flawed LS regression estimation techniques. Even traditional tools like Max Drawdown don't capture the true risk in the real world. The fundamental reason is that the Time dimension is ignored. Looking in the forward time direction, there are many possible scenarios whose risk an investor is willing to take when investing in equity, but not all such possibilities actually occur. Say, for example, in a time of uncertainty an investor is willing to take the risk of the market falling even -50%, due to some war, a viral outbreak etc. But somehow the market didn't fall that much, only -25%. So, when regression is done on historical data, or even when max drawdown is calculated on historical data in the backward direction of time, this possible forward scenario of -50% (which didn't occur) is not included, even though the investor actually bore it in forward-time uncertainty.
Existing conventional tools based on regression analysis, Max Drawdown, the Sharpe Ratio etc. often don't take into account the possible future real risk which didn't materialize but was a genuine possibility in the forward direction of time! That was the real-time risk an investor took in the forward time direction, yet it is omitted in the backward analysis of time! This can hugely under- or over-estimate the risks.
So, the fundamental reasons for the Equity Risk Premium puzzle are:
· Misestimation of risk in the real world by ignoring the Time dimension and using traditional backward-looking tools, e.g. Beta, Sharpe Ratio, Max Drawdown etc.
· Possible future randomness gets omitted in backward analysis!
· Fundamental issues with the LS-regression-based approach (the backbone of traditional factor investing).
· Also, the omission of fat-tailed outlier data, which can drastically change estimated risk parameters like Beta.
Hence, unless risk is truly measured in the forward direction of time (not with historical tools), the estimation of the Equity Risk Premium will be inaccurate.
You are measuring the estimate based on backward-looking data; in the forward direction, real-world risk is far greater than it looks backward, due to Uncertainty.
*Because equity risk premiums require the use of historical returns, they aren't an exact science and, therefore, aren't completely accurate.* The estimates also vary largely with the time frame and method of calculation, e.g. the removal of fat-tailed negative outlier data.
Relying on CAPM-type regression models to do historical analysis for the equity risk premium based on Beta etc. is not reliable enough. Risk should truly be seen in the forward direction of time, not through backward analysis of historical data using regression models. In the real world, we see how much risk we take for equity investing over the risk-free rate.
Measuring risk in the form of Beta, using regression on historical data, is not correct in the real world!! Real-world risk in the forward direction of time is far greater than the backward regression analysis suggests. A lot of the randomness that appears in the forward direction of time in the real world is simply not captured in backward analysis. For example, risky scenarios that could have occurred when the investment decision was taken in the forward direction of time, but which didn't actually happen, will not be captured, leading to an understated risk calculation in historical-data regression analysis.
The result: misestimation of the actual risk in the forward direction.
· Risk is hugely misunderstood in backward analysis; in the forward direction, many disastrous possibilities had a real chance of occurring but didn't, and hence never get truly captured in the risk measurement.
An equity risk premium is based on the idea of the risk-reward tradeoff. It is a forward-looking figure and, as such, the premium is theoretical. But there's no real way to tell just how much an investor will make, since no one can actually say how well equities or the equity market will perform in the future. Instead, an equity risk premium is estimated as a backward-looking metric: it observes stock market and government bond performance over a defined period of time and uses that historical performance to estimate the potential for future returns. The estimates vary wildly depending on the time frame and method of calculation.
Future Beta & Historical Beta
An equity risk premium is the excess return earned by an investor when they invest in the stock market over the risk-free rate. This return compensates investors for taking on the higher risk of equity investing. Determining an equity risk premium is theoretical because there's no way to tell how well equities or the equity market will perform in the future. Calculating an equity risk premium requires using historical rates of return.
The CAPM model itself needs fundamental change, based as it is on historical regression. How exactly to calculate this premium is disputed.
Equity Risk Premium $= R_a - R_f = \beta_a (R_m - R_f)$
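For concreteness, a purely illustrative plug-in (the numbers are assumptions, not estimates):
$$\beta_a = 1.2,\quad R_m = 8\%,\quad R_f = 3\% \;\Rightarrow\; \text{Equity Risk Premium} = 1.2 \times (8\% - 3\%) = 6\%.$$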
But how do you calculate the future beta in the real world, rather than from historical analysis? And by the way, this relationship is associational, not causal.
The expected equity risk premium likely depends on hidden causal factors, not just on this associational, regression-based beta, where the role of confounders also comes in. The future-return equity risk premium actually depends on the causal mechanism, the causal factors and Randomness, not on associational models like factor models and CAPM.
Future expected fair valuation drives the return causally and scientifically, not this CAPM-type associational, superficial statistical theory. One further needs to understand scientifically why an investor who takes systematic risk should be compensated with return. Why? What is the science behind this? That will be explored later.
A missing causal confounder variable actually affects the equity risk premium: it is hidden, but it can make the entire risk calculation biased!
9. Newton's Laws of Motion in Markets & Quantum Laws for Markets
Newton's Laws of Motion for Markets: the Principle of Least Action of Paul Dirac and Feynman's Path Integral approach in the Quantum and Classical worlds.
Laws of motion of stock prices in the quantum world: the net resultant Valuation (Energy) generates Forces (Demand and Supply, i.e. buy and sell orders, i.e. Order Flow), which finally leads to Motion, i.e. a change in Momentum.
Newton's Laws of Motion in the microscopic world:
Consider a system of $N$ classical particles. The particles are confined to a particular region of space by a "container" of volume $V$. The particles have finite kinetic energy and are therefore in constant motion, driven by the forces they exert on each other (and any external forces which may be present). At a given instant in time $t$, the Cartesian positions of the particles are $r_1(t), \dots, r_N(t)$. The time evolution of the positions is then given by Newton's second law of motion:
$$m_i \ddot{r}_i = F_i(r_1, \dots, r_N),$$
where $F_1, \dots, F_N$ are the forces on each of the $N$ particles due to all the other particles in the system, and $\ddot{r}_i = d^2 r_i / dt^2$ denotes the second time derivative.
Newton's equations of motion constitute a set of 3N coupled second-order differential equations. In order to solve these, it is necessary to specify a set of appropriate initial conditions on the coordinates and their first time derivatives, {r_1(0), …, r_N(0), ṙ_1(0), …, ṙ_N(0)}. Then the solution of Newton's equations gives the complete set of coordinates and velocities for all time t.
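As a concrete illustration of what "solving" these coupled equations means, here is a minimal sketch of a velocity-Verlet integrator for N particles. The pairwise force law is a hypothetical placeholder (a simple spring attraction); the point is only that, given initial positions and velocities, the trajectory is fully determined.

```python
import numpy as np

def pairwise_forces(positions, k=1.0):
    """Placeholder force law: each particle pulls every other with a
    spring-like force proportional to separation (assumed, for illustration)."""
    diffs = positions[None, :, :] - positions[:, None, :]  # r_j - r_i
    return k * diffs.sum(axis=1)                           # F_i = k * sum_j (r_j - r_i)

def velocity_verlet(positions, velocities, masses, dt=0.01, steps=1000):
    """Integrate m_i * d2r_i/dt2 = F_i(r_1, ..., r_N) forward in time."""
    forces = pairwise_forces(positions)
    for _ in range(steps):
        positions += velocities * dt + 0.5 * (forces / masses[:, None]) * dt**2
        new_forces = pairwise_forces(positions)
        velocities += 0.5 * (forces + new_forces) / masses[:, None] * dt
        forces = new_forces
    return positions, velocities

rng = np.random.default_rng(1)
r0 = rng.normal(size=(5, 3))            # 5 particles in 3D: initial positions
v0 = rng.normal(scale=0.1, size=(5, 3)) # initial velocities
m = np.ones(5)
r_t, v_t = velocity_verlet(r0, v0, m)
print(r_t)  # complete set of coordinates at the final time t
```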
For an investor, the causal force of demand for a stock is created based on valuation. Valuation is like total potential (in physics) that generates the force on the stock. Valuation is relative, depending upon the investor's perspective, and this valuation (potential) generates the market forces, as per the physical laws of Nature. So there would be different valuations for different investors/traders (who are like human-brain quantum particles!) causing different forces.
Each respective potential (valuation) causes an equivalent force.
So, the causal process: the market is a system of different investors. Each investor has a relative valuation, and every non-zero valuation applies either an upward or a downward force.
Causal process due to forces in the markets:
The different valuation-generated forces attract and cause different investors/traders to place buy/sell orders (i.e. demand and supply forces). Scientifically, this is like the gravitational force on energy, or a law of attraction: higher value tends to attract smaller values towards itself. The resultant Order Flow Imbalance is caused by the sum of all these buying and selling orders (caused by the valuation-generated demand-supply forces).
The order flow imbalance is like electric flux, i.e. the number of order lines passing through the valuation-caused field. (Note: I refer here only to genuine orders intended to be executed, not false orders; for those there will be more causal analysis.)
The resultant order flow (due to valuation-generated forces) causes the momentum change for the next time interval [Newton's Laws of Motion]. As the stock price gains momentum due to these forces, the price changes through an F = m·a type mechanism (Newton's Laws of Motion). The price at time t+1 is then caused by the momentum generated at time t, and the updated valuations for different investors change relatively due to the change in price and momentum in the dimension of time, price and momentum being two of the many factors affecting real-world valuation at any time. (This is explained in detail below.)
Explanation:
Valuation at any time is a function of many multiples and factors and their dynamics over time, for every investor. There is a resultant of all these valuations; it is an open-ended process, as explained earlier. To be specific about the Momentum and Value relationship:
a) Valuation at time t also depends on momentum at time t−1, i.e. the previous momentum. The mechanism for valuation is described below.
b) Valuation at time t drives the momentum (velocity) from t to t+1 through the force it generates.
c) As the stock price goes into motion (momentum), it adds value for different investors, leading to a change in the Order Flow Imbalance, which is caused by more and more forces in the markets as valuations change over time.
The number of field lines crossing is akin to the number of trading orders of traders/investors caused by valuation changes acting as the potential, which depends upon the current strength caused by the potential/force/valuation changing over time.
Order flow is essentially and scientifically the market's quantum force. One needs to study the order flow (force) to understand the motion of the resultant stock price, just as we study Newton's Laws of Motion to figure out the trajectory of a body. One can apply ML to study order-flow dynamics and thereby the quantum forces driving the market.
So, based on the laws of quantum motion affecting human traders' behavior, the stock price can be studied through ORDER FLOW dynamics and how the forces evolve. This is where ML could be useful. I state and claim, based on my experiments, that it is similar to an energy system of particles. This is linked to a century-old problem: financial Brownian motion would be similar to geometric Brownian motion, but in a different sense! This connects to what I have explained in other parts, that the human brain follows laws of quantum motion, the advanced version of classical-world laws of motion. So, from microscopic Newtonian dynamics, the Boltzmann and Langevin equations are derived for the mesoscopic and macroscopic dynamics.
Hence I state and claim that one can experimentally verify that financial Brownian motion (a quantum living version) is a more advanced form of physical Brownian motion (a quantum non-living version), although both are fundamentally energy systems in different forms. This originates from the fact that human-brain quantum behavior is the advanced version of classical laws of motion, where the former is variable while the latter is constant. All these classical laws of motion are valid and originate from the quantum world; living beings' quantum laws have more variable components than non-living quantum laws, but all of them are highly causal. That is beyond the scope of this paper. For this paper, it can be studied and verified in the behavior of order-flow dynamics, which is fundamentally energy dynamics.
The higher the valuations for different investors, the more order-flow imbalance is attracted, assuming funds are available to the investors.
To note: momentum is just one factor among many; what matters is the resultant of all of them, as with physical forces. These are similar physical forces attracting investors through their quantum force on their brains (neurons).
Step 2: At any moment in time, V(upward/buy orders) tries to push the price up and V(downward/sell orders) tries to push it down. This tussle between the two drives the market, eventually converging to the equilibrium state.
Step 3: Momentum drives the price further up due to the net upward force. Gradually, based on the combined effects, the price moves. The distance travelled by the stock price can be calculated from the change in momentum caused by the forces.
Step 4: As the price moves up and momentum increases, it affects valuation dynamically. Valuation (upward) and valuation (downward) keep changing over time, and so do the upward and downward forces. Gradually, as the price rises significantly, the upward force weakens due to weaker upward valuation, and the downward force strengthens due to stronger downward valuation. That leads to the price coming down again. This cycle goes on, reaching the state of equilibrium and fair value over time.
[Newton's Laws of Motion]
This is a universal law of Nature, not just physics for non-living objects, though it takes different forms, because the nature of motion differs between the classical and quantum worlds. It is also applicable in the quantum world of the brain.
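A minimal sketch of the tussle described in Steps 2 to 4, under assumed functional forms: valuation pressure is modeled as a simple linear pull toward an (assumed) fair value, force changes momentum, and momentum changes price. None of the parameter values come from the paper; they are placeholders to show the oscillation-and-convergence cycle.

```python
fair_value = 100.0                 # assumed equilibrium "true value"
price, momentum = 80.0, 0.0
k, damping, dt = 0.5, 0.3, 0.1     # placeholder force and friction constants

history = []
for _ in range(300):
    # Net force: upward valuation pull when price < fair value,
    # downward when price > fair value, minus a damping term.
    force = k * (fair_value - price) - damping * momentum
    momentum += force * dt         # force changes momentum (F = dp/dt)
    price += momentum * dt         # momentum changes price
    history.append(price)

print(f"final price: {history[-1]:.2f} (converges toward {fair_value})")
```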
Here the mass of a stock can be taken as its market capitalization, which typically means a small-cap stock has less mass than a mid cap, which in turn has less than a large cap. So why small caps can be more volatile than large caps in general is also explained by their lower mass and hence the smaller forces required; this depends on the resultant forces on a case-by-case basis, but it is indeed caused by the scientific equations of forces.
Like the classical world, the quantum world also has gravity. Larger stocks have more mass and hence more gravity than smaller caps in general. So more force is required to move a large cap's price the same distance (i.e. the same return) compared to a small-cap (low-mass) stock, all other things being equal. Hence large caps also show lower risk, since more force would be required to pull their prices down. And these forces come from demand-supply forces, which are actually quantum forces in the brains of buyers and sellers (traders), or in computer code.
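A toy sketch of the mass analogy, with made-up numbers: treating market cap as mass, the same net order-flow "force" produces a much larger acceleration (return) for the small cap. The force units and magnitudes are arbitrary assumptions, used only to show the a = F/m asymmetry.

```python
# Market cap as "mass": same force, different acceleration (a = F / m)
large_cap_mass = 500e9      # $500B market cap (assumed)
small_cap_mass = 2e9        # $2B market cap (assumed)
net_order_flow_force = 1e9  # identical net buying pressure (arbitrary units)

for name, mass in [("large cap", large_cap_mass), ("small cap", small_cap_mass)]:
    acceleration = net_order_flow_force / mass
    print(f"{name}: price 'acceleration' proportional to {acceleration:.4f}")

# The small cap moves 250x more for the same force, mirroring why
# lower-"mass" stocks tend to be more volatile in this analogy.
```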
Factors have to be discovered scientifically, by how they contribute to the quantum laws of motion of stock prices, not by the associational regression-based approaches traditionally followed. One has to discover how a factor affects the acceleration/motion of the stock through force and mass. For that, one has to study the law of quantum motion of a stock price originating from human (investors') forces of demand, supply, and order flow. So one has to study how Momentum, Size, Quality, etc. contribute to the scientific laws of the equation, not simply assert that investors should be compensated for their risk due to a factor and hence earn the return.
The economic rationale that an investor should be compensated just because he/she has taken some specific type of risk has to be scientifically explained: one must determine how these risks contribute to the laws of motion of a stock price; otherwise it is superficial and non-scientific.
In fact, it is explained later that economic/financial space-time itself exists scientifically, with its own laws of motion as we have in physics!
The Issue with the Conventional Value Method
A commentary on Prado's calculation of the Goldman Sachs factor index performance.
Connection with regression analysis: Price to Book Value vs. ROE (Aswath Damodaran).
This regression analysis is again applied with misleading results about undervalued and overvalued. Also, something can be undervalued using one multiple while overvalued using another; this reveals the flaw and incompleteness of a single-multiple approach to valuation. If one uses it (despite its flaws), one should take different multiples into account.
Traditionally, factor-based investing, e.g. on Value, Size, Quality, Momentum, etc., has been very popular, and at the same time it has shown dismal performance over the last few years.
Two important aspects:
a) Value as a term has been defined incorrectly in this context, i.e. that low Price to Book Value means undervalued and high means overvalued (or similar analyses). In fact, the decision on low or high is itself based on a least-squares regression line. Given the fundamental issue with the regression line, the basic definition of low and high could be mis-estimated. Moreover, even if one takes the LS-based regression approach, valuing something on just one specific ratio could be misleading, as different ratios reflect different components of valuation (the relative-valuation approach).
Consider the low Price to Book Value vs. ROE regression (reference: Damodaran). This is a misuse of LS-based regression to determine low vs. high book value: the determination of low and high based on the line of best fit is itself faulty, the same typical regression issue. The line of best fit could itself be misleading in real-world applications, so what one classifies as low could become high if the line of best fit changes.
There could indeed be better methodologies to incorporate the different ratio dimensions of valuation, but that is not the discussion of this paper.
b) Value in the real world is not the same as the academic theoretical definition conventionally calculated in the finance and investment literature. The definition of Value needs to be redefined and recalculated even in the literature. This is the fundamental reason why Value has started underperforming recently. We discuss this in detail below.
This theoretical definition of Value is itself flawed and incomplete when we look at how value works out in real-world markets for investors. How investors traditionally figure out value in a stock is itself flawed, being based on conventional approaches. There will always be some patterns in this world, enough to mislead us, and they could be due to chance.
VALUE IN THE REAL SCIENTIFIC WORLD: A REDEFINITION
So, how does this valuation formulate in the real world? There is a difference between the old academic definition of Value and real-world scientific Value. In fact, the academic definition of Value should be redefined.
Here is the brief formulation, taking the idea of relative valuation multiples: the valuation of a stock can differ from observer (investor) to observer (relative valuation). Every relative multiple shows a different component of valuation, so selecting any one particular multiple can be biased in real-world finance.
The key drawback is that a firm can be overvalued and undervalued at the same time using different relative multiples. This reveals the key flaw of the process: concluding that something is overvalued or undervalued using any one particular multiple, e.g. Price to Book Value, as traditionally done in value-based investment, is both conceptually and statistically flawed!
In fact, Value depends on an optimal combination of different multiples with the right weighting and a randomness component, and not just that: also on the higher-order dynamics of those value ratios and their trajectories. The different ratios have their own dynamics; higher-order analysis, not just first-order analysis, also affects Value.
That is why I categorically repeat that the way Value has been traditionally and academically defined, as low Price to Book Value etc., is both foundationally and statistically flawed in the real world. Value is also relative from observer to observer; it is NOT absolute!
(This is just a basic, simple mathematical framework for illustration; the actual method of deciding Value is proprietary and would depend on the investor, relatively.)
Further, it is not just a first-order matter: how these multiples behave over time also determines Value over time. So Value depends upon the combined approach for an investor; the conventional academic Value itself needs redefinition in the real world. Momentum, Quality, etc. are components of the true consolidated Value, not entities separate from it, in the real world. All these aspects/components determine Value, not just cheapness on a low Price to Book Value multiple. It is an open-ended rather than a closed-ended formula in every scenario. Value is scientifically determined by the causal forces affecting humans' (investors') relative perceptions in real life, and cheapness of price is just one of the many causal factors:
a) Price to Book Value multiple
b) Earnings multiples
c) Debt multiples
d) Cash flow multiple
e) Macroeconomic multiple
f) Profitability & growth multiple
g) Risk multiple
h) Cost multiple
i) Utility multiple
j) Momentum multiple
k) Sentiment multiple
l) Randomness
m) Other factors
This can be written as a statistical framework, which remains proprietary here! Further, I repeat: it is not just the first order; how the multiples have behaved dynamically over time, i.e. the higher orders, gives better insight.
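Since the actual framework is proprietary, here is only a generic, hypothetical sketch of the idea of combining multiples into one relative value score: standardize each multiple cross-sectionally, weight them, and add a noise term for the randomness component. Every weight, field name, and number below is an assumption for illustration, not the author's method.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical multiples for 4 stocks (rows), standardized so that
# higher = "more value" on that dimension. Purely made-up numbers.
multiples = {
    "price_to_book": np.array([ 0.8, -0.2,  1.1, -1.5]),
    "earnings":      np.array([ 0.1,  0.9, -0.4,  0.3]),
    "momentum":      np.array([-0.5,  1.2,  0.7, -0.9]),
    "sentiment":     np.array([ 0.3, -0.1,  0.5, -0.2]),
}
weights = {"price_to_book": 0.3, "earnings": 0.3, "momentum": 0.25, "sentiment": 0.15}

# Composite relative value = weighted sum of components + randomness term
score = sum(w * multiples[k] for k, w in weights.items())
score += rng.normal(0, 0.1, size=4)   # the explicit randomness component

print("relative value scores:", np.round(score, 2))
```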
So, Value in the real world depends upon many causal multiples, which actually affect the business of a firm and human (investor) behavior scientifically and causally, as well as on randomness. It is not just the Price to Book multiple; the traditional academic definition of Value is incomplete and needs to be redefined. Also, by the traditional definition, if something is cheap it does not mean all investors will buy it: cheapness is just one of the reasons to buy. The IMPORTANT point is that we need to understand how Value should be defined and how it actually works in the real world.
One thing to note is that momentum is also one of the causes of Value, because momentum reflects the strength of the demand-supply forces acting through the order flow imbalance. This force of demand and supply also determines the value of something! Hence, contrary to the traditional static, absolute understanding of Value as distinct from Momentum, we have to understand it from the dynamic, relative perspective: Momentum is itself one of the causes of Value, and vice versa. In fact it runs both ways: Value causes Momentum and Momentum causes Value. It is essentially like the laws of Nature/physics, where Value is like total energy (including potential) and Momentum is like the kinetic form.
So, all the other causal factors, including momentum and randomness, resultantly decide Value. In fact, Value is a function of many different causal factors, including momentum, as well as unknown randomness. This has its origin in the human behaviors driving investors' behavior at large, coming from science and the laws of Nature and human behavior.
By the way, scientifically, using the laws of stability and equilibrium in Nature, there also exists a proprietary approach to calculate fair valuation relatively, using geometrical methods! We shall discuss this later as much as we can (given its proprietary nature), after talking about how markets are linked to the laws of Nature and physics.
10. Causal Equation of the Market
Let us understand this through a simple Newton's-laws-of-motion-like relationship in detail later, but before that I write the causal mechanism in the form of an Ornstein-Uhlenbeck (OU) mean-reversion process, where the stock price tends to revert to its true mean value over time, which is the state of equilibrium.
This is purely scientific.
In the standard OU form,

dx_t = a (b − x_t) dt + σ dW_t

where x_t is the velocity at time t causing the momentum, and b is the expected stable-equilibrium true-value equivalent calculated at time t, which is computed as discussed above and results from the scientific dynamics of the forces. So in this equation, the net effective valuation at time t−1 causes the momentum at time t to move, and the valuation at time t is caused by the historical momentum as well. But unlike interest-rate dynamics, here the mean-reversion level keeps changing over time for a stock or market, depending upon the various valuation constituents.
What is extremely important to understand is that even causal equations have error and randomness components, which are the origin of Black Swan type events and risks!
The typical parameters a, b, and σ, together with the initial condition x_0, completely characterize the dynamics, and can be quickly characterized as follows, assuming a to be non-negative:
· b: "long-term mean level". All future trajectories of x will evolve around the mean level b in the long run.
· a: "speed of reversion". It characterizes the velocity at which such trajectories regroup around b in time.
· σ: "instantaneous volatility". It measures, instant by instant, the amplitude of randomness entering the system. Higher σ implies more randomness.
The following derived quantity is also of interest:
· σ²/(2a): "long-term variance". All future trajectories of x will regroup around the long-term mean with this variance after a long time.
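A minimal Euler-Maruyama simulation of this OU process, with placeholder parameter values. In the market version described above, the mean level b would itself drift with the valuation constituents, which is hinted at here by letting b change halfway through.

```python
import numpy as np

rng = np.random.default_rng(7)

a, sigma, dt, steps = 2.0, 0.3, 0.01, 2000   # assumed parameters
x = 0.0                                      # initial condition x_0
b_levels = [1.0, 1.5]                        # mean level shifts mid-way: a toy
                                             # stand-in for a valuation-driven target

path = []
for i in range(steps):
    b = b_levels[0] if i < steps // 2 else b_levels[1]
    # Euler-Maruyama step for dx = a (b - x) dt + sigma dW
    x += a * (b - x) * dt + sigma * np.sqrt(dt) * rng.normal()
    path.append(x)

print(f"end of first regime:  {path[steps // 2 - 1]:.2f} (near {b_levels[0]})")
print(f"end of second regime: {path[-1]:.2f} (near {b_levels[1]})")
print(f"theoretical long-term std: {sigma / np.sqrt(2 * a):.2f}")
```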
11. Scientific Background of Convexity & the Randomized Approach in Finance: The Quantum Connection
Convexity:
More gain than pain from a random event: the performance curves outward, hence looks "convex". Anywhere such asymmetry prevails we can call the position convex; otherwise we are in a concave position. The implication is that you are harmed much less by an error (or a variation) than you can benefit from it; you would welcome uncertainty in the long run.
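A small numeric sketch of that asymmetry, with an assumed convex payoff f(x) = x + x²: for symmetric shocks of ±1 around zero, the gain on the upside exceeds the loss on the downside, so added randomness raises the average outcome. The payoff functions are illustrative assumptions, not from the paper.

```python
import numpy as np

def convex_payoff(x):
    return x + x**2          # assumed convex payoff: curves outward

def concave_payoff(x):
    return x - x**2          # its concave mirror image

shocks = np.array([-1.0, 1.0])       # symmetric random event
for f in (convex_payoff, concave_payoff):
    print(f"{f.__name__}: up {f(shocks[1]):+.1f}, down {f(shocks[0]):+.1f}, "
          f"average under randomness {f(shocks).mean():+.1f} vs f(0) = {f(0.0):+.1f}")

# Convex: the average (+1.0) beats the certain outcome f(0) = 0 -> welcome volatility.
# Concave: the average (-1.0) is worse -> volatility hurts.
```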
Let us look at the science once again, and then at how convexity and randomized tools like RCT, the algo wheel, etc. are scientifically placed for finance, life, and more.
As we know, Feynman's Path Integral approach, which originated from Paul Dirac's principle of least action, has been one of the most important results in science. Since the market is basically a quantum wave driven by the sum of many human brains, it follows the laws of the quantum world. The market has uncertainty and randomness, which can make it unpredictable from some perspectives, while it follows a deterministic approach from others; these two aspects co-exist simultaneously. We have to discover the determinism inside that randomness/uncertainty at some scale. So, rather than directly trying to predict the uncertainty, one has to look at uncertainty from indirect perspectives: one has to find the hidden causal mechanism inside randomness, all while acknowledging the existence of uncertainty at some scale. We may not directly predict uncertainty at a lower time scale, or for an individual, but we can definitely do so in a global context. This is exactly how the quantum world works: one might not predict the behavior of a quantum entity like an electron individually, due to uncertainty, but one definitely can as a system. That is where randomized approaches come in. Most predictive approaches try to predict at the individual level, but that is not how the quantum world works; they do not acknowledge the uncertainty inherent in Nature. Scientifically, one should find global certainty in uncertainty/randomness rather than trying to predict local uncertainty. That is what I stated earlier: randomness is not randomly random but deterministically random; randomness also has a cause. The discovery of that determinism inside randomness is the key to success in markets, the real world, and life.
Hence, one should conduct random trials in a causal way and discover the determinism inside. It is like the path of least action in Feynman's path integral or Paul Dirac's principle of least action: that path inside randomness is the key. And the random trials should be based on some causal aspects, not randomly random trials, because every random trial has a cost, and one has to minimize that cost too. By following causally adjusted random trials, one can maximize convexity. This is also linked to the law of energy: if the cost is not minimized, then due to the herd behavior of energy in markets, it could worsen, leading to huge tail risk.
So it is not Knowledge vs. Convexity but Knowledge + Convexity. Through causal knowledge one can increase convexity by minimizing the cost of random trials. This is also linked to the reproducibility crisis, where just taking random trials and blindly selecting the best could be biased, as is done in backtesting; even the best can behave randomly over time.
That is the clue to success: discover the determinism inside randomness and follow the path of least action. That is based on scientific quantum laws.
12. Causal-Adjusted Randomized Factor Approach
Having understood the scientific concepts, we now deduce the approach for causal factor investing. We see that market participants keep finding new factors and try to predict them using LS-regression-type tools. They try to predict individual factors and to find the ultimate, complete set of deterministic factors that predicts the markets. That holy grail is utopian and scientifically cannot exist, due to the quantum nature of the market. The effort to make the real world deterministic, when it is fundamentally quantum and carries uncertainty, is against the laws of Nature. Hence, the utopian dream of discovering the ultimate set of deterministic factors is against the laws of Nature. It is analogous to the ground-breaking Gödel incompleteness results: trying to make the market deterministic through a complete set of factors is like trying to make arithmetic a complete axiomatic system.
In other words, searching for the ultimate holy grail amounts to assuming the world is deterministic, which is scientifically impossible and against the laws of the quantum world.
So the key is to follow a randomized factor approach. In this approach, like Feynman's path integral and Paul Dirac's principle of least action, we apply the same idea to the human quantum world: we try to discover the hidden determinism inside random causal factors, thereby exploring certainty inside inherent uncertainty. That is the key, and it is analogous to the laws of the quantum world (physics being just a subset of that). RCT and the algo wheel are based on these principles, but there are many other tools with which we can study the determinism inside random factors. So, as with an electron, we may not predict every single factor deterministically, but we can in a collective sense. It is like quantum-world interference: we may not predict the trajectory of each electron, but as a whole we can definitely find some determinism about where they will land.
Hence, unlike the traditional approach of selecting factors one by one, the scientific approach is to randomize the factors based on some causal understanding and create a convex system.
For that, we can follow a rough algorithm as below; a toy sketch of the loop follows the list.
a) Randomly select the set of stocks based on the causal deterministic and random factors, according to their real-world valuations as discussed above.
b) Allocation weights have to be arranged dynamically, starting from some random component, based on historic performance and expected future scenarios, in a well-diversified manner, putting various constraints on the weights while also looking at tail risk idiosyncratically.
c) The key is to discover the determinism out of the randomized approach over time and follow the path of least action as a group, on the principle by which Nature works.
d) Allocate random weights to these selected causal stocks at the start.
e) Some predictive views based on causality factor analysis can add to the weightings. Say, for example, one stock has a better expected future risk-return profile than another; then its weight can be tilted upward even though the initial assignment was random.
f) Put constraints on the weights to keep the portfolio well diversified, for example from a risk-management point of view.
g) Based on the determinism discovered inside their random trajectories over time, keep adjusting the weights of each stock dynamically. There can be more sophisticated equations to update the weights.
h) The weights keep adjusting dynamically based on the determinism inside their real-world randomness.
i) Effectively, the portfolio becomes relatively convex, able to handle Black Swan risk effectively and revert to its true equilibrium state over time.
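Here is the toy sketch of steps (a) through (h), under loudly flagged assumptions: the "causal score" is a random placeholder standing in for the proprietary real-world valuation, the update rule is a simple exponential tilt toward realized performance, and the diversification constraint is a hard cap per name. It shows the shape of the loop, not the author's method.

```python
import numpy as np

rng = np.random.default_rng(3)
n_stocks, periods, cap = 8, 60, 0.25   # cap = max weight per stock (assumed)

# (a, d) Random initial weights over a causally pre-screened universe;
# the causal screen itself is proprietary, so a random score stands in.
causal_score = rng.uniform(0.5, 1.5, n_stocks)
weights = rng.dirichlet(np.ones(n_stocks))

for t in range(periods):
    # Placeholder returns: the causal score tilts the drift, plus randomness
    returns = 0.002 * causal_score + rng.normal(0, 0.03, n_stocks)
    # (e, g, h) tilt weights toward what is working, slowly (determinism
    # discovered inside random trajectories, not a one-shot bet)
    weights *= np.exp(2.0 * returns)
    # (f) diversification constraint: cap each name, then renormalize
    weights = np.minimum(weights / weights.sum(), cap)
    weights /= weights.sum()

print("final weights:", np.round(weights, 3))
```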
The exact equations and functions to update the weights would change in real-world scenarios, depending on the scenario and one's own risk-return objectives; the above is just for demonstration. The key aspect of the randomized factor approach is to randomize the factor selection in a causal way and then discover the determinism hidden inside over time. This is unlike the traditional approach, where selected traditional associational factors are bet on in a deterministic way. One must acknowledge that the holy grail of completely deterministic factors will never be found, due to the quantum behavior of markets; the key is to discover causal determinism inside uncertainty/randomness.
RCT and the algo wheel are basic tools based on this approach; more advanced proprietary scientific tools based on it can be applied. It is also based on the convexity principle, to build the portfolio dynamically over time.
So here the true risk is Black Swan tail risk; normal volatility should not be treated as risk so much as an essential mechanism of the markets. As long as one is convex and prudent toward Black Swan/tail risks, intermediate volatility should not be of much concern, since, according to the laws of Nature and hence of markets, prices revert to the true equilibrium value state over time.
13. Summary & Conclusion
So, finally, it is time for a brief summary of the paper. I know some readers will like many of the views and others may disagree on certain points. It is an open research paper; there is relativity in thoughts, and different people can have different perspectives, all true in their own way.
We have to acknowledge, as part of the highest intellectual maturity, that duality in the form of extreme opposites co-exists in Nature, of which finance, life, etc. are subsets. So I have tried to unify the two extremes of Randomness (Black Swan) and Causality, of Predictability and Uncertainty, in a scientific way, traversing the quantum and classical worlds. I have explained how both co-exist together, and we need to appreciate their beauty.
Essentially, the market, driven by human quantum behavior, is a quantum wave governed by certain laws of energy and Nature. Black Swans, which fall in the unpredictability zone inherent in Nature, and hence in finance and life as well, are essentially caused by sudden energy shocks; they are causal but cannot be predicted, since by definition a Black Swan is unpredictable (if it is predictable, it is not a Black Swan for that observer, relatively). A Black Swan (caused by randomness) also follows the laws of energy and Nature, converging to its stable state level over time.
So, the market being a quantum human wave, it follows certain laws of Nature and energy scientifically. One needs to understand that, and hence manage quantum-type uncertainty and predictability both, as a duality.
The clue to financial investment success is accepting the certain, predictable components while at the same time being convex to the uncertainty. To attain maximum convexity, one has to discover the determinism inside randomness; this causal approach increases convexity by minimizing cost.
For that, rather than finding a deterministic set of factors, one should follow the randomized factor approach, based on Feynman's and Paul Dirac's principle of least action, where one follows causally adjusted randomized factors and tries to find the path of least action. The algo wheel, RCT, etc. are based on similar principles, but there can be more advanced approaches. Ultimately, one has to discover the determinism inside randomness, the certainty inside uncertainty, that exists in the form of duality in Nature, life, and financial markets.
Finance is essentially a form of quantum science involving the human world, which must be studied dynamically, in the forward direction of time, from a scientific and causal perspective (randomness being part of that), rather than with backward-looking conventional statistical tools used in a static way. The role of the direction of time is very important in finance. One must understand this scientifically rather than study finance as a merely statistical subject.