How Old Logic And New Math Help Us Discover The Nature of Things.
The nature of things is baffling and complex, and that leaves us a choice: deal with it through reason, or through emotion. An example of reason applied to a complex problem is cancer research. A network of engineers, oncologists, scientists, patients, and data has been unleashed. Mammoth silos of data are being shared and mined by new technological infrastructure and artificial intelligence. This will give front-line doctors the information they need to develop targeted, personalized therapies: ones that kill the cancer while minimizing damage to healthy tissue.
The Nature of Cancer
Just as a golfer carries fourteen clubs to customize each shot to the lie of the ball, the wind, the distance, and the hazards, a cancer care team needs many tools to craft the most targeted therapy possible. Those tools now include immunotherapy, proton therapy, and genomic sequencing. According to Dr. William Cance of the University of Arizona Cancer Center:
One of the challenges is to devise simple tests that will predict whether a person’s tumor will be sensitive to immunotherapy or resistant. Scientists are also paying attention to something called the tumor microenvironment, which is the area immediately around the tumor.
In these resistant tumors, “normal immune cells that live in this microenvironment act as a wall, preventing the immune system’s T-cells from reaching the tumor.” The challenge of immunotherapy is to find a way to tear down this wall and safely eradicate the cells that form it. Think of the malignant tumor and its protective wall in the terms Secretary Clinton used last year: “Our body politic immune system has been impaired.”
With regard to sequencing the human genome, mathematics and artificial intelligence have proven to be essential tools for decision support. Scientists have uncovered hundreds of thousands of “variants of uncertain significance” (VUS), and many are harmless. But which ones are harmful? According to Dr. Sean Tavtigian of the Huntsman Cancer Institute, his team answers that question with a “computational tool that comes up with a probability for the final result with astonishing accuracy.” The basis of this gene sequencing breakthrough was found in a centuries-old formula: Bayes’ Theorem.
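In modern notation, and reading it through the variant-classification problem above purely as an illustration rather than as the published Huntsman model, the formula relates old information (a prior) to new evidence:

```latex
\[
P(\text{pathogenic} \mid \text{evidence}) =
  \frac{P(\text{evidence} \mid \text{pathogenic}) \, P(\text{pathogenic})}
       {P(\text{evidence})}
\]
```

The prior, P(pathogenic), is what was believed before; the likelihood term blends in the new evidence; the result is a revised, posterior probability that the variant is harmful.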
The Nature of Knowledge
According to economics consultant Peter L. Bernstein in his book Against the Gods, “A dissident minister named Thomas Bayes made a striking advance in statistics by demonstrating how to make better informed decisions by mathematically blending new information into old information.” His mathematical achievements were not published until after his death in 1761. Bernstein continues:
An Essay Towards Solving a Problem in the Doctrine of Chances was a strikingly original piece of work that immortalized Bayes among statisticians, economists, and other social scientists. This paper laid the foundation for the modern method of statistical inference, the great issue first posed by Jacob Bernoulli.
“The Bayesian system of inference is too complex to recite here in detail,” Bernstein writes, but in summary “there is no single answer under conditions of uncertainty.” Inferences drawn from old information must be revised as new information arrives. As for Bernoulli, just prior to his death in 1705 he had invented the “Law of Large Numbers and methods of statistical sampling that drive modern activities as varied as opinion polling, wine tasting, stock picking, and the testing of new drugs.”
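A minimal sketch of that revision process, using invented numbers rather than anything from Bernstein’s book, shows how a prior belief gets blended with each new observation:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Blend one new piece of evidence into an existing belief via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Old information: a 1% prior that the hypothesis is true.
belief = 0.01

# Revise the belief as each new, independent observation arrives.
# Each pair is (probability of this evidence if the hypothesis is true,
#               probability of this evidence if the hypothesis is false).
for p_true, p_false in [(0.90, 0.10), (0.80, 0.30), (0.95, 0.05)]:
    belief = bayes_update(belief, p_true, p_false)
    print(f"updated belief: {belief:.3f}")
```

Each pass through the loop treats yesterday’s posterior as today’s prior, which is exactly the “blending new information into old information” Bernstein describes.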
And now their work has been blended with genetic sequencing data. Says Tavtigian, researchers can establish genetic predispositions for a disease, and “these efforts can add years to patients’ lives and improve their quality of life.” And with the growing volume of new data, there are new platforms for crowdsourcing such as patientslikeme.com, where people are able to share treatment options with each other and physicians.
The Nature of Science
In 1875 another amateur mathematician, Francis Galton, discovered regression to the mean, which is “the expectation that matters will return to normal.” Bernstein adds, “Galton’s line of analysis led immediately to the concept of correlation, which is a measurement of how closely any two series vary relative to one another.” In the natural world it could be the relative size of parent and child, or rainfall and crop yields. In the social world it could be inflation and interest rates. These are conditional dependencies.
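As a small illustration with made-up figures, not Galton’s data, the correlation between two series can be computed in a few lines:

```python
import numpy as np

# Hypothetical paired observations: parent height and child height (inches).
parents  = np.array([64, 66, 68, 70, 72, 74])
children = np.array([66, 67, 68, 69, 71, 72])

# Pearson correlation: how closely the two series vary relative to one another,
# from -1 (move in opposite directions) to +1 (move in lockstep).
r = np.corrcoef(parents, children)[0, 1]
print(f"correlation between parent and child height: {r:.2f}")
```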
In a modern Bayesian biology network, conditional dependencies could represent the relationship between diseases and symptoms. Given symptoms, the network can be used to compute the probabilities of various diseases. And in the social science of microeconomics, these variables could be the relationship between future wealth and investment strategy. Given a set of capital market assumptions, a Bayesian-like network can be used to compute the probability of various financial outcomes.
In both cases, Bayesian logic updates the probability of a hypothesis as more scientifically relevant evidence becomes available, and it makes mathematics, for the first time, a viable tool for the social sciences. This is necessary because people adapt as circumstances change.
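A toy sketch of the disease-and-symptom case, with invented probabilities and a single disease node, shows that update in code:

```python
# Toy Bayesian network with invented probabilities:
# Disease -> Symptom A and Disease -> Symptom B
# (the symptoms are conditionally independent given the disease).
p_disease = 0.02
p_a = {True: 0.80, False: 0.10}   # P(symptom A present | disease status)
p_b = {True: 0.70, False: 0.05}   # P(symptom B present | disease status)

def joint(disease, a, b):
    """Joint probability of one complete assignment of the network."""
    pd = p_disease if disease else 1 - p_disease
    pa = p_a[disease] if a else 1 - p_a[disease]
    pb = p_b[disease] if b else 1 - p_b[disease]
    return pd * pa * pb

# Evidence: both symptoms observed. Compute P(disease | A, B) by enumeration.
numerator = joint(True, True, True)
denominator = sum(joint(d, True, True) for d in (True, False))
print(f"P(disease | both symptoms) = {numerator / denominator:.3f}")
```

The same arithmetic, with wealth outcomes in place of diseases and capital market assumptions in place of symptoms, gives the financial version described above.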
The Nature of Things
Just as independent trials are critical to statistical reliability, the relationships between dependent variables are critical to fully understanding the nature of things. In 1952 Harry Markowitz expanded on Galton’s work and “touched off the intellectual movement that revolutionized business decisions around the world.” We call it Modern Portfolio Theory.
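A minimal sketch, with invented weights, volatilities, and correlation, shows the two-asset arithmetic at the heart of Markowitz’s idea: the correlation between the dependent variables is what diversification exploits.

```python
import math

# Hypothetical assumptions for a two-asset portfolio.
w1, w2 = 0.60, 0.40          # portfolio weights
sigma1, sigma2 = 0.15, 0.25  # volatilities (standard deviation of returns)
rho = 0.20                   # correlation between the two assets' returns

# Portfolio variance in the Markowitz framework.
variance = (w1**2 * sigma1**2
            + w2**2 * sigma2**2
            + 2 * w1 * w2 * rho * sigma1 * sigma2)

print(f"portfolio volatility:                 {math.sqrt(variance):.3f}")
print(f"weighted average of the volatilities: {w1 * sigma1 + w2 * sigma2:.3f}")
```

With these made-up numbers the portfolio’s volatility comes out lower than the weighted average of its parts, which is the diversification benefit the correlation term captures.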
Throughout human existence, people have studied nature and its math to understand possibilities and manage risk. People have also studied ways to manipulate data for political power. The currency of the scientist, the economist, and the entrepreneur is reality. Their findings must be repeatable, and being wrong has consequences. The currency of the demagogue is emotion. Being wrong is of no consequence to the demagogue, whose news cycles are saturated with stories of environmental and economic doom. As Weather Channel founder John Coleman has claimed, “scientists with environmental and political motives manipulated long term scientific data to create an illusion of rapid global warming.” Economist Murray Rothbard describes the fallacy of total utility as “absurd microeconomics textbook discussions of non-existent entities subject to mathematical manipulation.”
Statistician Adrian Smith sums up Bayesian statistics best when he says:
Any approach to scientific inference which seeks to legitimize an answer in response to complex uncertainty is, for me, a totalitarian parody of a would-be rational learning process.
The natural world is our universe of entities, all acting and interacting in accordance with their identities. Understanding those identities and their interactions is the job of curious and connected individuals. And they’re now connected with information and technology in decentralized networks. Bernoulli, Bayes, and Galton left behind an enormous gift, perhaps the Rosetta Stone of complex systems. Breakthroughs in cancer and economic science are just two of the latest beneficiaries.