The Future of Everything

July 17, 2017

On straw men

Filed under: Uncategorized — David @ 2:59 pm

From the preface to Economyths: 11 Ways Economics Gets It Wrong

As anticipated in the 2010 version of Economyths, many economists have argued that the economyths are an unfair caricature of their field – a ‘straw man’ I am setting up to easily defeat. Four things to add. First, this argument is a little over-used. ‘Read any review of a heterodox book by an economist’, noted Cahal Moran in 2011, and ‘you will find the exact same rhetoric’: the author is ‘attacking straw men, he doesn’t understand economics, etc.’ An external investigation into the economics department at the University of Manitoba in 2015 found that ‘the insistence by the mainstreamers that the heterodox are attacking a straw man could be labelled “gaslighting” [i.e. psychologically manipulating someone into doubting their own sanity]. Even as some heterodox are subject to unfriendly discrimination, ridicule, hostility, and censure, some mainstreamers simply deny it and insist the others are making it all up.’ Call me crazy, but I think they have a point.

Secondly, economists have long deflected criticism by claiming that key assumptions such as the rational behaviour of ‘economic man’, as Lionel Robbins put it in 1932, are ‘only an expository device – a first approximation used very cautiously at one stage in the development of arguments’. (As seen in the Appendix, economists repeat the identical argument today.) But that same ‘economic man’ – which as a view of human behaviour is less a first approximation than a severe distortion – reached perhaps its most gloriously exaggerated form in the Arrow-Debreu model (Chapter 5) well after Robbins dismissed it as a ‘bogey’ (the expression ‘straw man’ was not yet in vogue), and remains at the heart of much economic modelling, which is why eight decades later we could name a book after its impending twilight with no fear of redundancy.

Thirdly, there is also a longstanding tradition in which, as Moran and his co-authors Joe Earle and Zach Ward-Perkins put it in The Econocracy: ‘The concerns of critics are said to be addressed when economists find some way of incorporating their critiques into existing frameworks. The result is often a highly stylised version of what the critic had in mind, and may drop the things that are most important while conforming to certain assumptions that the critic may reject.’ When economists consider small departures from something like equilibrium – they would have to, wouldn’t they? – or arrange patches for the more egregious examples of ‘market failure’ – such as the environmental crisis – they are like the ancient astronomers who added extra epicycles to their geocentric models of the cosmos to better fit observations, while still assuming that the universe was based on circles and the sun went around the earth. In fact it is economists who have set up a highly simplified version of the real world – but instead of destroying it, they hold it up as an ideal to which real economies can only aspire. (And if that is a ‘caricature’ or a ‘straw man’, we will stop attacking it when it stops threatening to blow up the world.)

Finally, I take pains in the book to show that the arguments apply not just to this pure textbook version of the theory, but to anything near it, epicycles and all. And as we’ll see, supposedly sophisticated models may deviate from these foundational assumptions, but they can never stray too far without losing internal consistency – which is exactly why the field finds itself in a state of crisis.


July 16, 2017

Time for critics of economics critics to move on!

Filed under: Uncategorized — David @ 3:28 pm

There is a growing trend for economists to write articles criticising the critics of economics. These articles follow a similar pattern. They start by saying that the criticisms are “both repetitive and increasingly misdirected” as economist Diane Coyle wrote, and might complain that they don’t want to hear one more time Queen Elizabeth’s question, on a 2008 visit to the London School of Economics: “Why did nobody see it coming?”

Economist Noah Smith agrees that “blanket critiques of the economics discipline have been standardized to the point where it’s pretty easy to predict how they’ll proceed.” Unlike the crisis then! “Economists will be castigated for their failure to foresee the Great Recession. Some unrealistic assumptions in mainstream macroeconomic models will be mentioned. Economists will be cast as priests of free-market ideology, whose shortcomings will be vigorously asserted.” And so on.

The articles criticising critics then tell critics it is time to adopt a “more constructive tone” and “focus on what is going right in the economics discipline” (Smith) because “only if today’s critics of economics pay more attention to what economists are actually doing will they be able to make a meaningful contribution to assessing the state of the discipline” (Coyle). If the critics being criticised are not economists, the articles often point out or imply that they don’t know what they are talking about, are attacking a straw man, etc., or even (not these authors) compare them to climate change deniers.

Speaking as an early adopter of the Queen Elizabeth story (in my 2010 book Economyths, recently re-released in extended form), allow me to say that I agree completely with these critic critics. Yes, economists failed to predict the most significant economic event of their lifetimes. Yes, their models couldn’t have predicted it, even in principle, based as they were on the idea that markets are inherently self-stabilising. And yes, economists didn’t just fail to predict the crisis, they helped cause it, through their use of flawed risk models which gave a false sense of security.

But it is time for us critics to move on, and accentuate the positive. Only by doing so can we make a meaningful contribution. And as Smith points out, calls for “humility on the part of economists” are getting old (Tomáš Sedláček, Roman Chlupatý and I wrote Bescheidenheit – für eine neue Ökonomie five years ago). It’s like asking Donald Trump to admit that he once lost at something.

Of course, some people might say that it isn’t up to economists to tell everyone else when they should stop talking about economists’ role in the crisis, or bring up what the former head of the UK Treasury memorably called in 2016 their “monumental collective intellectual error.”

Some stick-in-the-muds note that “No one took any responsibility or blame for a forecasting failure that led to a policy disaster” and have called for a public inquiry into their role in the crisis. Instead of telling everyone else to move on, they argue, it is time for economists to own their mistakes. Well guess what, people – it’s not going to happen! And stop asking for a public apology. Let’s focus on what is going right and hand out some gold stars.

For example, there is the “data revolution” heralded by Smith. As he notes, “econ is paying a lot more attention to data these days.” Sure, economists are literally the last group of researchers on earth to have realised the usefulness of data. In physics the “data revolution” happened back when astronomers like Tycho Brahe made meticulous observations of the heavens and began to question the theories of Aristotle. But better late than never!

Oh, here’s a data point – all the orthodox theories failed during the crisis! But you knew that.

Or there is behavioral economics, which Coyle notes is “one of the most popular areas of the discipline now, among academics and students alike.” Critics again might note that progress in this area has been painfully slow and has had little real impact. Tweaks such as “hyperbolic discounting” are equivalent to ancient astronomers appending epicycles to their models to make them look slightly more realistic. But that rational economic man thing is so over – straw man walking.

Admittedly, there has been less progress on a few things. The equilibrium models used by policy makers, for example, still rely on the concept of equilibrium – and so have nothing to say on the cause or nature of financial crises. Risk models used by banks and other financial institutions still view markets as governed by the independent actions of rational economic man investors, and are more useful for hiding risk than for estimating it, as quant Paul Wilmott and I have argued.

As Paul Krugman noted in 2016, “we really don’t know how to model personal income distribution,” even though social inequality – along with financial instability – is one of the biggest economic issues of our time. Some insiders such as World Bank chief economist Paul Romer – who compared a chain of reasoning in the field of macroeconomics to “blah blah blah” – describe the area as “pseudo-science”. And economics education still concentrates almost solely on the discredited neoclassical approach, complete with rational economic man, according to the student authors of The Econocracy.

But these are details. As Coyle notes, some economists are finally getting to grips with ideas from areas such as “complexity theory, network theory, and agent-based modeling”, which of course are exactly the areas that critics have long been suggesting they learn from.

Or the UK’s Economic and Social Research Council recently let it be known that it is setting up a network of experts from different disciplines including “psychology, anthropology, sociology, neuroscience, economic history, political science, biology and physics,” whose task it will be to “revolutionise” the field of economics. Again, that is nice, since Economyths called in its final chapter for just such an intervention by non-economists back in 2010.

So, yes, it is time to celebrate the new dawn of economics! But critics of critics – do try to move on from the same criticisms. We’ve heard them all before, in fact for decades now.

April 13, 2017

Review of The Evolution of Money

Filed under: Books, Economics, Reviews — David @ 8:56 pm

The Evolution of Money is reviewed in News Weekly by Colin Teese, former deputy secretary of the Australian Department of Trade:

“Who would have thought of linking money and quantum physics? Well, Orrell and Chlupaty have done just that in The Evolution of Money, perhaps the best book on money I have ever read …

The authors have set themselves the dauntingly difficult task of explaining money, as it were, from the ground up, cutting the cant that has surrounded the subject for centuries. Blending a happy combination of skills and experience, they have recorded a satisfying and entertaining account of how money has impacted, of course, on economics, but no less on politics and society. But that is not the end of it. They make a persuasive case, at least to this reader’s satisfaction, on how the evolution of money has tracked that of science …

A reasonable and benign dictator might demand that those engaged in activities relating to economic management should, as a condition of employment, be compelled to read The Evolution of Money and pass a written examination based on an understanding of its contents.”

Read the full review at News Weekly.

April 4, 2017

The Money Formula – New Book By Paul Wilmott And David Orrell

Filed under: Books, Economics — Tags: , — David @ 3:09 pm

The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took Over the Markets

OUT NOW!!!

BUY ON AMAZON.COM | BUY ON AMAZON.CO.UK

Explore the deadly elegance of finance’s hidden powerhouse

The Money Formula takes you inside the engine room of the global economy to explore the little-understood world of quantitative finance, and show how the future of our economy rests on the backs of this all-but-impenetrable industry. Written not from a post-crisis perspective – but from a preventative point of view – this book traces the development of financial derivatives from bonds to credit default swaps, and shows how mathematical formulas went beyond pricing to expand their use to the point where they dwarfed the real economy. You’ll learn how the deadly allure of their ice-cold beauty has misled generations of economists and investors, and how continued reliance on these formulas can either assist future economic development, or send the global economy into the financial equivalent of a cardiac arrest.

Rather than rehash tales of post-crisis fallout, this book focuses on preventing the next one. By exploring the heart of the shadow economy, you’ll be better prepared to ride the rough waves of finance into the turbulent future.

  • Delve into one of the world’s least-understood but highest-impact industries
  • Understand the key principles of quantitative finance and the evolution of the field
  • Learn what quantitative finance has become, and how it affects us all
  • Discover how the industry’s next steps dictate the economy’s future

How do you create a quadrillion dollars out of nothing, blow it away and leave a hole so large that even years of “quantitative easing” can’t fill it – and then go back to doing the same thing? Even amidst global recovery, the financial system still has the potential to seize up at any moment. The Money Formula explores the how and why of financial disaster, what must happen to prevent the next one.

PRAISE FOR THE MONEY FORMULA

“This book has humor, attitude, clarity, science and common sense; it pulls no punches and takes no prisoners.”
Nassim Nicholas Taleb, Scholar and former trader

“There are lots of people who′d prefer you didn′t read this book: financial advisors, pension fund managers, regulators and more than a few politicians. That′s because it makes plain their complicity in a trillion dollar scam that nearly destroyed the global financial system. Insiders Wilmott and Orrell explain how it was done, how to stop it happening again and why those with the power to act are so reluctant to wield it.”
Robert Matthews, Author of Chancing It: The Laws of Chance and How They Can Work for You

“Few contemporary developments are more important and more terrifying than the increasing power of the financial system in the global economy. This book makes it clear that this system is operated either by people who don′t know what they are doing or who are so greed–stricken that they don′t care. Risk is at dangerous levels. Can this be fixed? It can and this book full of healthy skepticism and high expertise shows how.”
Bryan Appleyard, Author and Sunday Times writer

“In a financial world that relies more and more on models that fewer and fewer people understand, this is an essential, deeply insightful as well as entertaining read.”
Joris Luyendijk, Author of Swimming with Sharks: My Journey into the World of the Bankers

“A fresh and lively explanation of modern quantitative finance, its perils and what we might do to protect against a repeat of disasters like 2008–09. This insightful, important and original critique of the financial system is also fun to read.”
Edward O. Thorp, Author of A Man for All Markets and New York Times bestseller Beat the Dealer

April 2, 2017

Why Toronto house prices keep going up

Filed under: Economics — Tags: , — David @ 7:08 pm

Ever wonder why prices in cities such as Toronto keep going up? The reasons given are many – foreign buyers, low interest rates, lack of supply, and so on – but while these are all contributing factors, the real reason is much simpler.

It’s because there is more money.

[Chart: Toronto house prices vs money supply]

The solid line shows the Teranet 6-city index which goes back to 1999, the dashed line is a broad measure of money supply (M2++).

And why is there more money? It’s because house prices have gone up. Most of the money in our economy is generated by bank loans, usually against real estate – and when prices go up, they can make larger loans.

Thus house prices and money supply increase in tandem. Of course, at some point they can also go down in tandem …
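The feedback loop described above – banks lend against real estate, new loans create new money, and more money bids up prices, which supports still larger loans – can be sketched as a toy simulation. All parameter values and the functional form below are invented for illustration; this is not a calibrated model of the Toronto market:

```python
def simulate(years=10, price=100.0, money=100.0,
             loan_to_value=0.8, lending_rate=0.1, price_sensitivity=0.5):
    """Toy feedback loop: each year banks extend new loans in proportion to
    collateral (house) value; the loans create deposits (new money), and the
    extra money pushes prices up, raising collateral value for next year."""
    history = []
    for _ in range(years):
        new_loans = lending_rate * loan_to_value * price  # higher prices -> bigger loans
        money += new_loans                                # loans create deposits
        price *= 1 + price_sensitivity * new_loans / money  # more money bids up prices
        history.append((round(price, 1), round(money, 1)))
    return history

for p, m in simulate(5):
    print(f"price index {p:6.1f}   money supply {m:6.1f}")
```

Under these made-up parameters both series rise in tandem, which is the qualitative point of the chart above; change the sign of the feedback and they fall in tandem instead.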

February 7, 2017

Big data versus big theory

Filed under: Forecasting — Tags: — David @ 4:05 pm

The Winter 2017 edition of Foresight magazine includes my commentary on the article Changing the Paradigm for Business Forecasting by Michael Gilliland from SAS. A longer version of Michael’s argument can be read on his SAS blog, and my response is below.

Michael Gilliland argues convincingly that we need a paradigm shift in forecasting, away from an “offensive” approach that is characterized by a reliance on complicated models, and towards a more “defensive” approach which uses simple but robust models. As he points out, we have been too focussed on developing highly sophisticated models, as opposed to finding something that actually works in an efficient way.

Gilliland notes that part of this comes down to a fondness for complexity. While I agree completely with his conclusion that simple models are usually preferable to complicated models, I would add that the problem is less an obsession with complexity per se, than with building detailed mechanistic models of complexity. And the problem is less big data, than big theory.

The archetype for the model-centric approach is the complex computer models of the atmosphere used in weather forecasting, which were pioneered around 1950 by the mathematician John von Neumann. These weather models divide the atmosphere (and sometimes the oceans) into a three-dimensional grid, and use equations based on principles of fluid flow to compute the flow of air and water. However, many key processes, such as the formation and dissipation of clouds, cannot be derived from first principles, so need to be approximated. The result is highly complex models that are prone to model error (the “butterfly effect” is a secondary concern) but still do a reasonable job of predicting the weather a few days ahead. Their success inspired a similar approach in other areas such as economics and biology.

The problem comes when these models are pushed to make forecasts beyond their zone of validity, as in climate forecasts. And here, simple models may actually do better. For example, a 2011 study by Fildes and Kourentzes showed that, for a limited set of historical data, a neural network model out-performed the conventional climate model approach; and a combination of a Holt linear trend model with a conventional model led to an improvement of 18 percent in forecast accuracy over a ten-year period.[1]
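The Holt linear trend method mentioned above is simple enough to sketch in a few lines: it tracks a smoothed level and a smoothed trend, then extrapolates linearly. The smoothing parameters and the toy series below are invented for illustration; the original study fitted its models to historical climate records:

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=10):
    """Holt's linear trend (double exponential smoothing): maintain a level
    and a trend estimate, then extrapolate linearly for `horizon` steps."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        last_level = level
        level = alpha * y + (1 - alpha) * (level + trend)   # smooth the level
        trend = beta * (level - last_level) + (1 - beta) * trend  # smooth the trend
    return [level + (h + 1) * trend for h in range(horizon)]

# Noisy upward-drifting series standing in for a temperature anomaly record
series = [0.1, 0.15, 0.12, 0.2, 0.22, 0.25, 0.24, 0.3]
print(holt_forecast(series, horizon=3))
```

The entire model is two update rules and a straight-line extrapolation – which is exactly the kind of robust simplicity the “defensive” paradigm favours.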

As the authors noted, while there have been many studies of climate models, “few, if any, studies have made a formal examination of their comparative forecasting accuracy records, which is at the heart of forecasting research.” This is consistent with the idea that complex models are favored, not because they are necessarily better, but for institutional reasons.

Another point shown by this example, though, is that models associated with big data, complexity theory, etc., can actually be simpler than the models associated with the reductionist, mechanistic approach. So for example a neural network model might run happily on a laptop, while a full climate model needs a supercomputer. We therefore need to distinguish between model complexity, and complexity science. A key lesson of complexity science is that many phenomena (e.g. clouds) are emergent properties which are not amenable to a reductionist approach, so simple models may be more appropriate.

Complexity science also changes the way we think about uncertainty. Under the mechanistic paradigm, uncertainty estimates can be determined by making random perturbations to parameters or initial conditions. In weather forecasting, for example, ensemble forecasting ups the complexity level by making multiple forecasts and analysing the spread. A similar approach is taken in economic forecasts. However if error is due to the model being incapable of capturing the complexity of the system, then there is no reason to think that perturbing model inputs will tell you much about the real error (because the model structure is wrong). So again, it may be more appropriate to simply estimate error bounds based on past experience and update them as more information becomes available.
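The last suggestion – estimating error bounds from past experience rather than from perturbations of a possibly mis-specified model – can be as simple as taking quantiles of historical forecast errors. The numbers below are invented for illustration:

```python
def empirical_interval(past_errors, coverage=0.8):
    """Return (lo, hi) empirical quantiles of past forecast errors,
    covering the central `coverage` fraction of them."""
    errs = sorted(past_errors)
    tail = (1 - coverage) / 2
    lo_i = round(tail * (len(errs) - 1))        # index near the lower quantile
    hi_i = round((1 - tail) * (len(errs) - 1))  # index near the upper quantile
    return errs[lo_i], errs[hi_i]

# Hypothetical record of past forecast errors (actual minus forecast)
past_errors = [-3.1, -1.2, -0.5, 0.2, 0.4, 0.9, 1.5, 2.2, 2.8, 4.0]
lo, hi = empirical_interval(past_errors)
point_forecast = 20.0
print(f"forecast {point_forecast} with ~80% band [{point_forecast + lo:.1f}, {point_forecast + hi:.1f}]")
```

No assumption about the model structure is needed: the band simply reflects how wrong forecasts like this have been in the past, and it can be updated as new errors come in.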

Complexity versus simplicity

An example from a different area is the question of predicting heart toxicity for new drug compounds. Drug makers screen their compounds early in the development cycle by testing to see whether they interfere with several cellular ion channels. One way to predict heart toxicity based on these test results is to employ teams of researchers to build an incredibly complicated mechanistic model of the heart, consisting of hundreds of differential equations, and use the ion channel readings as inputs. Or you can use a machine learning model. Or, most complicated, you can combine these in a multi-model approach. However, my colleague Hitesh Mistry at Systems Forecasting found that a simple model, which just adds or subtracts the ion channel readings – the only parameters are +1 and -1 – performs just as well as the multi-model approach using three large-scale models plus a machine learning model (see Complexity v Simplicity, the winner is?).
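A fixed-coefficient model of this kind fits in a few lines. The channel names, the sign pattern, and the decision threshold below are hypothetical stand-ins for illustration, not Mistry’s actual fitted model:

```python
def toxicity_score(herg_block, cav_block, nav_block):
    """Additive model with coefficients fixed at +1 or -1 (hypothetical signs:
    hERG block raises risk; calcium and sodium block offset it)."""
    return (+1) * herg_block + (-1) * cav_block + (-1) * nav_block

def is_flagged(score, threshold=0.3):
    """Flag a compound when its score crosses an (illustrative) threshold."""
    return score > threshold

score = toxicity_score(herg_block=0.7, cav_block=0.2, nav_block=0.1)
print(round(score, 2), is_flagged(score))
```

The whole “model” is one weighted sum with no free parameters to fit, which is what makes its parity with the multi-model approach so striking.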

Now, to obtain the simple model Mistry used some fairly sophisticated data analysis tools. But what counts is not the complexity of the methods, but the complexity of the final model. And in general, complexity-based models are often simpler than their reductionist counterparts. Clustering algorithms employ some fancy mathematics, but the end result is clusters, which isn’t a very complicated concept. Even agent-based models, which simulate a system using individual software agents that interact with one another, can involve a relatively small number of parameters if designed carefully.
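The point about clustering can be made concrete with a minimal one-dimensional k-means: the iteration involves some mathematics, but the end result is just group centres and assignments. The data below are invented:

```python
def kmeans_1d(points, centers, iters=10):
    """Minimal 1-D k-means: alternately assign points to the nearest centre
    and move each centre to the mean of its assigned points."""
    for _ in range(iters):
        groups = {c: [] for c in centers}
        for p in points:
            nearest = min(centers, key=lambda c: abs(p - c))
            groups[nearest].append(p)
        centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centers)

points = [1.0, 1.2, 0.8, 9.8, 10.1, 10.4]
print(kmeans_1d(points, centers=[0.0, 5.0]))  # two clear clusters emerge
```

The output – two cluster centres, one near each group of points – is not a complicated concept, whatever machinery produced it.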

People who work with big data, meanwhile, are keenly aware of the problem of overfitting – more so, it would appear, than the designers of reductionist models, which often have hundreds of parameters. Perhaps the ultimate example of such models is the dynamic stochastic general equilibrium models used in macroeconomics. Studies show that these models have effectively no predictive value (which is why they are not used by e.g. hedge funds), and one reason is that key parameters cannot be determined from data so have to be made up (see The Trouble With Macroeconomics by Paul Romer, chief economist at the World Bank).

One reason we have tended to prefer mechanistic-looking models is that they tell a rational cause-and-effect story. When making a forecast it is common to ask whether a certain effect has been taken into account, and if not, to add it to the model. Business forecasting models may not be as explicitly reductionist as their counterparts in weather forecasting, biology, or economics, but they are still often inspired by the need to tell a consistent story. A disadvantage of models that come out of the complexity approach is that they often appear to be black boxes. For example the equations in a neural network model of the climate system might not tell you much about how the climate works, and sometimes that is what people are really looking for.

When it comes to prediction, as opposed to description, I therefore again agree with Michael Gilliland that a ‘defensive’ approach makes more sense. But I think the paradigm shift he describes is part of, or related to, a move away from reductionist models, which we are realising don’t work very well for complex systems. With this new paradigm, models will be simpler, but they can also draw on a range of techniques that have developed for the analysis of complex systems.

[1] Fildes, R., and N. Kourentzes. “Validation and forecasting accuracy in models of climate change.” International Journal of Forecasting 27 (2011): 968–995.

 

October 10, 2016

More quantum money

Filed under: Economics — Tags: , , — David @ 2:37 pm

New discussion paper at Economic Thought is called A Quantum Theory of Money and Value, Part 2: The Uncertainty Principle.

Here is the abstract:

Economic forecasting is famously unreliable. While this problem has traditionally been blamed on theories such as the efficient market hypothesis or even the butterfly effect, an alternative explanation is the role of money – something which is typically downplayed or excluded altogether from economic models. Instead, models tend to treat the economy as a kind of barter system in which money’s only role is as an inert medium of exchange. Prices are assumed to almost perfectly reflect the ‘intrinsic value’ of an asset. This paper argues, however, that money is better seen as an inherently dualistic phenomenon, which merges precise number with the fuzzy concept of value. Prices are not the optimal result of a mechanical, Newtonian process, but are an emergent property of the money system. And just as quantum physics has its uncertainty principle, so the economy is an uncertain process which can only be approximated by mathematical models. Acknowledging the dynamic and paradoxical qualities of money changes our ontological framework for economic modelling, and for making decisions under uncertainty. Applications to areas of risk analysis and economic forecasting are discussed, and it is proposed that a greater appreciation of the fundamental causes of uncertainty will help to make the economy a less uncertain place.

Download the paper here.

October 8, 2016

Notes on the quantum theory of money and value

Filed under: Economics — Tags: , — David @ 12:38 am

Following the publication in Economic Thought of my paper “A Quantum Theory of Money and Value” I have received a number of interesting comments and questions from readers, and this post is an attempt to clarify some of the points which came up. For a description of the theory, please see the paper, or (for the book version) The Evolution of Money.

What is a money object?

These are objects – either real or virtual – which have a fixed numerical value in currency units. Just as quantum objects have dual real/virtual properties, so do money objects (bitcoins don’t seem like objects, until you lose the hard drive they are located on). Money objects are unique in that they have a fixed numerical price. Other objects or services attain their price by being traded for money objects in markets.

Is money an emergent phenomenon?

Money objects are designed (e.g. by the state) to have a set price. The prices of other things emerge as the by-product of money-based markets, which themselves emerge into being as money objects become commonly used. Therefore prices and markets can be viewed as emergent phenomena, but money itself is better seen as a carefully designed technology. (Of course the way that e.g. cybercurrencies emerge into actual use, as markets develop around them, can also be described as an emergent phenomenon.)

What does money measure?

Nothing. Because prices emerge from the use of money objects, one consequence is that price should not be viewed as an accurate measure of “labor”, “utility”, “economic value”, or any other quantity. Money is better viewed as a fundamental quantity, like electrical charge. Money objects, as used in markets, are a way of attaching numbers to things, but that is not the same as measuring them in some way. Of course market forces tend to align prices with some vague idea of value, but the process is far from exact, and money has its own dynamics (which is one reason CEOs in the US earn over 300 times the median wage of their employees). Note this contradicts the Aristotelian idea, later expressed by Aquinas, that money was “the one thing by which everything should be measured.”

Why quantum?

The comparison with quantum theory comes about because money is treated as a fundamental quantity (from the Latin quantum); and money objects are a way of combining the notions of number and value, which are as different from one another as the dual wave/particle properties of matter. For example, number is stable, while value varies with time. Money objects are therefore fundamentally dualistic.

As mentioned in The Evolution of Money, other authors – economists and many others – have used the term “quantum” in different ways. One example is Charles Eisenstein’s Sacred Economics, where in an appendix called “Quantum Money and the Reserve Question” he notes “the similarity between fractional-reserve money and the superposition of states of a quantum particle,” in the sense that money can seem to exist in more than one place at the same time. The quantum macroeconomics school, also known as the theory of money emissions, which dates to the 1950s, gained its name from the idea that production is an instantaneous event that quantizes time into discrete units. A completely different concept is quantum money, which exploits quantum physics in an encryption technique.

What inspired the approach?

One thing is the history of money. The most concrete example of a money object is a coin, which consists of a number pressed into a piece of metal. These date to the time when Greek philosophers were developing the first theories of mathematics. Pythagoras believed that the universe was based on number, and money can be seen as a way of making that true by impressing numbers onto the real world. However mixing the properties of number and things produces a strange kind of alchemy. See this presentation for the 2015 Marshall McLuhan lecture at transmediale in Berlin for a discussion.

How does this differ from the usual understanding of the role of money?

One consequence of the theory is that it inverts the usual narrative of mainstream economics. Since the time at least of Adam Smith, economists have downplayed the importance of money, seeing it as a kind of neutral chip that emerged as a way of facilitating barter. But instead of money emerging from markets, it is more accurate to say that the use of money (jumpstarted by the state) prompted the emergence of markets. And far from being an inert chip, money is an active, dualistic substance with powerful and contradictory properties. Putting numbers on things changes the way they behave.

What is the mathematical map or connection between price and value?

In general there is no such map. Price is an emergent property, which means it need not be computable at all. Of course it is possible to come up with some rules of thumb, but there are no fundamental laws as in physics.

 

Update

The quantum properties of money and the economy as a whole are explored in the new book Quantum Economics: The New Science of Money. For a discussion of the mathematical background, please see: Introduction to the mathematics of quantum economics.

June 28, 2016

Book extract: The Evolution of Money

Filed under: Books, Economics — Tags: — David @ 8:48 pm

Read an excerpt from The Evolution of Money here.

June 27, 2016

Evolution of Money featured at CUP

Filed under: Books, Economics — Tags: — David @ 3:32 pm

This week (June 27-July 1) the Columbia University Press blog will be featuring content from or about The Evolution of Money, starting with a book giveaway – you can enter the competition for a free copy here.

