New Papers

 

 

 

 

Sheri M. Markose
Systemic Risk Analytics: A Data Driven Multi-Agent Financial Network (MAFN) Approach
Forthcoming in the Journal of Banking Regulation, August 2013. PDF

 

New Scientist

The New Scientist covers the Markose-Giansante-Shaghaghi publications on the new eigen-pair systemic risk analytics for too-interconnected-to-fail financial networks.

“The financial meltdown forecasters”, by Andy Coghlan and Michael Marshall, New Scientist, Magazine issue 2877, August 2012. http://www.newscientist.com/article/mg21528773.400-the-financial-meltdown-forecasters.html

 

http://www.essex.ac.uk/economics/news_and_seminars/newsEvent.aspx?e_id=4380. You can view it in full in the .pdf document.

 

Celia Hampton on Agent-Based Computational Economics

Celia Hampton on the Agent-Based Computational Economics research done by Sheri Markose and her team.


www.financialworld.co.uk December/January 2011-2012

link
PDF

 

IMF Working Paper

Systemic Risk from Global Financial Derivatives: A Network Analysis of Contagion and Its Mitigation with Super-Spreader Tax, Prepared by Sheri M. Markose

Abstract:

Financial network analysis is used to provide firm level bottom up holistic visualizations of interconnections of financial obligations in global OTC derivatives markets. This helps to identify systemically important financial intermediaries (SIFIs), analyse the nature of contagion propagation and also to monitor and design ways of increasing robustness of the network. Based on 2009 FDIC and individually collected firm level data covering gross notional, gross positive (negative) fair value and the netted derivatives assets and liabilities for 202 financial firms, which include 20 SIFIs, the bilateral flows are empirically calibrated to reflect data based constraints. This produces a tiered network with a distinct highly clustered central core of 12 SIFIs that account for 78 percent of all bilateral exposures and a large number of financial intermediaries (FIs) on the periphery. The topology of the network results in the “too-interconnected-to-fail” (TITF) phenomenon in that the failure of any member of the central tier will bring down other members, with the contagion coming to an abrupt end when the ‘super-spreaders’ have demised. As these SIFIs account for the bulk of capital in the system, ipso facto no bank among the top tier can be allowed to fail, highlighting the untenable implicit socialized guarantees needed for these markets to operate at their current levels. Systemic risk costs of highly connected SIFI nodes are not priced into their holding of capital or collateral. An eigenvector centrality based ‘super-spreader’ tax has been designed and tested for its capacity to reduce the potential socialized losses from failure of SIFIs.
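
The closing idea of the abstract, a capital surcharge that scales with a bank's eigenvector centrality in the netted exposures network, can be illustrated with a minimal Python sketch. The exposure matrix, capital figures and the scaling factor alpha below are hypothetical stand-ins, not the paper's calibrated data or tax schedule.

import numpy as np

# Hypothetical matrix of bilaterally netted derivatives exposures:
# entry X[i, j] is bank i's exposure to bank j (loss to i if j fails).
X = np.array([[0., 40., 10., 5.],
              [35., 0., 15., 5.],
              [8., 12., 0., 2.],
              [3., 4., 1., 0.]])
capital = np.array([50., 45., 20., 10.])  # hypothetical Tier 1 capital buffers

# Exposures scaled by the capital of the exposed (creditor) bank, so that large
# exposures relative to thin capital buffers carry more weight.
theta = X / capital[:, None]

# Eigenvector centrality from the dominant eigen-pair of theta.
eigvals, eigvecs = np.linalg.eig(theta)
k = np.argmax(eigvals.real)
centrality = np.abs(eigvecs[:, k].real)
centrality /= centrality.sum()

# 'Super-spreader' surcharge: a capital add-on proportional to each bank's
# centrality, i.e. to its potential contribution to system-wide losses.
alpha = 0.05  # hypothetical scaling of the tax
surcharge = alpha * centrality * capital.sum()
for i in range(len(capital)):
    print(f"bank {i}: centrality {centrality[i]:.2f}, surcharge {surcharge[i]:.2f}")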

 

 

 

 

World Economy Summer School

       Professor Sheri Markose

Kiel Institute for the World Economy: 6th Kiel Institute Summer School on Economic Policy, June 24–30, 2012

 

Multi-Agent Financial Network Models (MAFN) For Systemic Risk Management: A New Complexity Perspective


Lecture 1: Eigen-Pair Analysis of Financial Systemic Risk and Design of Super-Spreader Tax To Mitigate Moral Hazard (Slides)

 

Lecture 2: Design of Robust Macro-prudential Policy: Why a new complexity approach and MAFNs? (Slides)

 

Some new drafts of papers

 

Materials uploaded 27-4-2012

Forthcoming, under review and most recent drafts

 

Multi-agent Financial Network Models and Systemic Risk Management: A New Complexity Perspective, by Sheri Markose

 

The above paper sets out the critique of how deep doctrinal errors and inadequate operationally relevant macro-monetary tools led to the crisis. Sheri Markose is no armchair critic. Over the last decade she has pioneered a framework based on markets as complex adaptive systems. (i) Such systems need new modelling tools to produce holistic visualization of macro-monetary systems, for which large scale Multi-agent Financial Network Models were developed. This is joint work with Simone Giansante and Ali Rais Shaghaghi.

(ii) Arms races and competitive co-evolution are overlooked in most models of policy design. The lethal error was to have regulators paint themselves into a corner and follow a fixed rule on the grounds of inflation credibility, rather than co-evolve.

(iii) A lack of understanding of non-Gaussian statistical models with extreme events dominated mainstream financial economics. In 2005 Sheri Markose and Amadeo Alentorn derived a closed form solution for option pricing with the non-Gaussian Generalized Extreme Value distribution. This has now been successfully developed to obtain traded option implied extreme volatility indexes and extreme economic value at risk (a minimal illustrative sketch of the extreme value approach follows point (iv) below).

(iv) One of the biggest elephants in the room is how inflation was conquered in advanced cashless economies. Sheri's detailed data-based work with Ying Jia Loke in 2002/3 on how EFTPOS irrevocably reduced the transactions demand for state-supplied money leads her to conclude that inflation in the CPI index was conquered by widespread technology-led change in payment habits. In contrast, in heavily cash-based economies like India, inflation in the CPI index remains intransigent. Urgent investigation into the transmission channel for inflation in advanced cashless economies is needed.
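
As flagged under point (iii), the extreme value machinery can be illustrated with a minimal Python sketch: a Generalized Extreme Value distribution is fitted to block maxima of daily losses and an extreme quantile is read off. This is only an illustration of the GEV approach, not the Markose-Alentorn closed form option pricing formula; the simulated return series and the 21-day block size are hypothetical.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Hypothetical fat-tailed daily returns (Student-t), standing in for market data.
returns = 0.01 * rng.standard_t(df=3, size=5000)
losses = -returns

# Block maxima: the largest loss in each (hypothetical) 21-day trading month.
block = 21
n_blocks = len(losses) // block
maxima = losses[:n_blocks * block].reshape(n_blocks, block).max(axis=1)

# Fit the GEV distribution to the block maxima.
shape, loc, scale = genextreme.fit(maxima)

# Extreme quantile of the monthly maximum loss, an 'extreme value at risk' style figure.
evar_99 = genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"fitted GEV shape (scipy sign convention): {shape:.3f}")
print(f"99th percentile of the monthly maximum loss: {evar_99:.4f}")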

 

 

- The following two papers set out the eigen-pair (the dominant eigenvalue and corresponding eigenvector centrality) analysis to quantify systemic risk or instability from large financial networks and determine the contribution to systemic risk by financial intermediaries; a minimal numerical sketch of the eigen-pair computation follows the two titles below.

 

Systemic Risk From Global Financial Derivatives Markets: A Network Analysis and Mitigation of Contagion Effects With Superspreader Tax

Report on Project for the IMF (March - December 2011)

Sheri Markose

 

"Too Interconnected To Fail Financial Network of US CDS Market: Topological Fragility and Systemic Risk"

Sheri Markose, Ali Rais Shaghaghi and Simone Giansante
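
As noted above, a minimal sketch of the eigen-pair idea is given below, assuming a small hypothetical matrix of interbank exposures scaled by capital. The dominant eigenvalue serves as a stability index for the network and the corresponding eigenvector ranks each financial intermediary's contribution; power iteration is used so that both members of the eigen-pair come out of the same loop. The matrix values are made up for illustration.

import numpy as np

# Hypothetical matrix: theta[i, j] = exposure of bank i to bank j,
# divided by bank i's capital buffer (potential loss relative to capital).
theta = np.array([[0.0, 0.6, 0.2, 0.1],
                  [0.5, 0.0, 0.3, 0.1],
                  [0.2, 0.3, 0.0, 0.1],
                  [0.1, 0.1, 0.1, 0.0]])

def eigen_pair(M, iters=1000, tol=1e-10):
    """Dominant eigenvalue and eigenvector of a non-negative matrix by power iteration."""
    v = np.ones(M.shape[0]) / M.shape[0]
    lam = 0.0
    for _ in range(iters):
        w = M @ v
        lam_new = np.linalg.norm(w)
        v = w / lam_new
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new
    return lam, v

lam_max, centrality = eigen_pair(theta)
# A larger dominant eigenvalue signals a more contagion-prone network;
# the eigenvector centrality ranks each bank's contribution to that instability.
print(f"dominant eigenvalue (stability index): {lam_max:.3f}")
for i, c in enumerate(centrality / centrality.sum()):
    print(f"bank {i}: centrality share {c:.2f}")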

 


- Agent Based Modelling To Monitor Perverse Incentives from Policy

 

"Agent Based Financial Network (MAFN) Model of US Collateralized Debt Obligations (CDO): Regulatory Capital Arbitrage, Negative CDS Carry Trade and Systemic Risk Analysis"

Sheri Markose, Bewaji Olewasegun and Simone Giansante

Forthcoming chapter in Simulation in Computational Finance and Economics: Tools and Emerging Applications, edited by Serafin Martinez et al.

 

 

- New Results On a Closed Form Solution for Option Pricing Using the Generalized Extreme Value (GEV) Distribution

 

“The Generalized Extreme Value (GEV) Distribution, Implied Tail Index and Option Pricing”,

Sheri Markose and Amadeo Alentorn, Spring 2011, Journal of Derivatives

(http://www.acefinmod.com/docs/GEVpaperMarkoseAlentornFinal_16Dec2010.pdf)

 

Forecasting Extreme Volatility Using GEV based Implied Volatility

For the Review of Financial Studies, Markose, April 2012

 

 

ICT based Integrated Network Modelling of the Indian Financial System

 

Materials uploaded 2-7-2011

Narayana Murthy (Ex-CEO and Founder, Infosys): "In God we trust, everyone else must come with data."

 

April 24 – May 5, 2011: Sheri Markose and Simone Giansante visited the Reserve Bank of India to commence work on an ICT based integrated computational model of the Indian financial system in collaboration with the Financial Stability Division. The facility being developed is a fine-grained, database-driven multi-agent financial network (MAFN) model that differs from traditional macro-econometric models. The main objective of this framework is to enable the regulator to monitor and track the activities of financial firms and their resultant interconnections for their systemic risk consequences. The strategy has been to build up the ICT based platform in a modular fashion, starting with the network modelling of the bilateral financial obligations of the Indian interbank market. June 2010 and December 2010 data for interbank exposures was analysed. A bespoke financial network analysis and contagion stress testing platform for the Indian interbank market has been developed. A summary of this work has been published:

 

Excerpt from Reserve Bank of India Financial Stability Report June 2011 on Financial Network Modelling for the Indian Financial System (pdf report)

 

Integration of bank exposures to other banks and non-bank financial firms in the CBLO and repo markets will be undertaken in future work. This is also the case with the integration of the flow and use of funds that is essential for linkages between the financial system, the real side of the economy and global markets. Also, the interlinkages between the regulated and non-regulated sectors of credit creation will be modelled to avoid oversights such as the innovation-driven shadow banking sector or the activity of more traditional sectors, as in the Spanish cajas.

 

So what was gleaned about the structural aspects of the Indian interbank system that was not already known? What will be the payoffs in monitoring the topology of the financial network?

 

As reported in the June 2011 Financial Stability Report of the Reserve Bank, the failure of highly connected banks within the financial network constructed from banks’ bilateral exposures for December 2010 “shaves off 14 per cent of the banking system’s core capital”. Distress and insolvency, from which secondary failures can follow, is assumed to occur when the ratio of Tier 1 capital after loss to risk-weighted assets falls below 6%. As has been noted in the Markose research on too-interconnected-to-fail networks (see Section below), a number of banks that constitute the central core of the system in terms of their high connectivity to one another impact on the same subset of banks and bring about a similar magnitude of system-wide losses. The system showed greater propensity for contagion in December 2010 than in June 2010. As the stress tests are not based on Monte Carlo simulations but represent actual exposures on the day to counterparties, it is important to understand why the interbank system is structurally prone to contagion effects. The contagion propagation from failure of a 'trigger' bank (centre-most black node in Figure 1) is displayed in terms of direct failures (black nodes) placed on the first concentric circle, the second order failures on the circle beyond, and so on. The contagion halts when no further bank failures follow. It is found that the Indian interbank network becomes somewhat denser in December 2010 and the average number of links in the system rises by 2. However, the potential for contagion caused by banks with high eigenvector centrality increases, in terms of losses of total Tier 1 capital, from around 7%-9% to over 14% in the period from June 2010 to December 2010.
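
The round-by-round contagion exercise described above can be sketched in a few lines of Python. The exposure matrix, Tier 1 capital, risk-weighted assets and the 6% threshold below are hypothetical stand-ins, not RBI data: a 'trigger' bank is failed, its counterparties write off their exposures to it, any bank whose Tier 1 to risk-weighted assets ratio falls below the threshold fails in the next round, and the cascade stops when no further failures follow.

import numpy as np

# Hypothetical interbank exposures: E[i, j] = amount bank i has lent to bank j,
# i.e. bank i's loss if bank j fails. Units are arbitrary.
E = np.array([[0., 20., 5., 0.],
              [10., 0., 15., 5.],
              [2., 10., 0., 8.],
              [0., 5., 10., 0.]])
tier1 = np.array([30., 25., 12., 8.])     # hypothetical Tier 1 capital
rwa = np.array([300., 280., 150., 100.])  # hypothetical risk-weighted assets
THRESHOLD = 0.06                          # failure if Tier 1 / RWA falls below 6%

def contagion(trigger):
    """Concentric 'circles' of failures (sets of bank indices) from failing `trigger`."""
    capital = tier1.astype(float).copy()
    failed = {trigger}
    rounds = [{trigger}]
    newly_failed = {trigger}
    while newly_failed:
        # Surviving banks write off their exposures to the banks that just failed.
        capital = capital - E[:, list(newly_failed)].sum(axis=1)
        newly_failed = {i for i in range(len(capital))
                        if i not in failed and capital[i] / rwa[i] < THRESHOLD}
        if newly_failed:
            rounds.append(set(newly_failed))
            failed |= newly_failed
    return rounds

print(contagion(trigger=1))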

 

Figure 1: Contagion from failure of a 'trigger' bank (2010 Q4). Source: RBI, June 2011

 

Two structural aspects of the Indian interbank system drive this. Firstly, there is a group of banks which for historical reasons have dominance in country-wide branch banking and a very large deposit base. Such banks are net lenders, while those with a limited deposit base are net borrowers of funds. The latter include the foreign banks and new private sector banks. The reliance on unfunded collateralized liabilities in these banks is a source for concern, as is the use of funds by large net borrowers. The direction of credit risk and contagion always follows from the inability of borrowers to repay.

 

Secondly, the Indian interbank market is a dynamic one and reflects the highly competitive nature of commercial banking in the Indian economy. In the two periods that were closely studied, the share of activity of a certain new commercial bank outpaced others in a matter of four months. This showed up as increased eigenvector centrality statistics for one or two banks in the Indian financial network and also, as noted above, in the higher average connectivity or interconnectedness of the system as a whole. At a local level, i.e. at the level of a firm, as in the cases of Northern Rock and HBOS, increased market share and network centrality may appear as a matter worthy of awards and accolades. However, holistic modelling tools like network modelling can reveal how this can endanger system stability. Overall financial network structures are persistent and the build-up of stresses occurs over time; hence close scrutiny of network changes can help in a way that market price based early warning signals cannot.

 

 

2. Six Steps to Paradigm Shift and Bridging the Skills Gap for Financial Systemic Risk Modelling and Robust Regulatory Design: Excerpts from Markose talks and tutorials at the Reserve Bank of India (August 2010) and the IMF (May 2010)

"Why did they not see it coming ?"

 

The financial crisis of 2007, which continues to engulf Eurozone countries, reflects a crisis in economics as much as it does the severe costs for workers, pensioners and taxpayers. Deep-seated doctrinal flaws and an erosion of the science base in economics and finance have been cited by many as being instrumental in the regulatory and market failures in the Anglo-American economies. Further, as noted by Larry Elliott, why are the only economists (Hayek, Keynes and Minsky) who have anything relevant to say about the 2007 financial crisis dead ones?

‘Group think’ was a major ingredient in the recent crisis and is evident in top economic journals, departments and key policy institutions. (The 2010 Report of the IMF Independent Evaluation Office (IEO) documented that “IMF’s ability to identify the mounting risks was hindered by ... a high degree of groupthink ... and an institutional culture that discourages contrarian views”. Among the changes being recommended is the ability “to speak truth to power”.) The 20-odd years of an intellectual Gosplan of having to meet quantitative targets that UK Economics Departments had to suffer under the variants of the Research Assessment Exercises have taken their toll. When research excellence is judged by whether papers are published in the North American economic journals which dominate the ratings league tables, it is little wonder that the LSE and other leading UK and European Economics Departments became 'clones' of the worst excesses of North American economics and, in particular, began to be led by the nose in the areas of macro and monetary economics. In the last three decades, the UK failed to produce a Hayek or a Keynes, following Larry Elliott's rhetoric, or for that matter anything of any distinction in terms of new ideas or tools to deal with the challenges posed by a modern macro, financial and monetary environment. The recent Parliamentary Select Committee inquiry, which queried why Mervyn King ordered cuts to the financial stability research department in the run-up to the crisis, makes it clear that these institutions were reduced to 'report'-writing ones. They were not repositories for fine-grained knowledge and modelling on how private sector securitized/collateralized forms of credit creation were transforming banking and finance, or on what implications the advances in payments technology, in rendering the economy cashless, have for inflation in the CPI index.

 

Thanks to the liberal and radical intellectual inputs from Sir Ivor Crewe, the then Vice Chancellor of the University of Essex, the chance to set up and run a multi-disciplinary Centre (2002-2008) in economics and finance came about. This intellectual autonomy to innovate an economics curriculum is unlikely to have been possible in many a parallel universe.

The Centre for Computational Finance and Economic Agents provided a milieu where solutions to problems are crafted irrespective of economic doctrines, using a complexity perspective and new ICT based artificial life techniques. The curricula at CCFEA produced operational tools for network-based digital maps of financial interconnections to monitor and analyze systemic risk, new non-Gaussian models that can deal with extreme market events, and a complexity perspective on policy design.

 

The insufficient scientific and empirical basis of a number of pronouncements in mainstream economics to do with rational calculations, equilibrium outcomes, Gaussian models and optimal policy design stems from attempts to maintain economics as a closed and complete inferential system, which is least suited to its subject matter that is best viewed as a complex adaptive system, Markose (2005).

 

Most of all, the crisis reveals a lack of quantitative modelling tools to monitor and manage systemic risk from complex interconnectedness of markets and regulatory incentives for financial intermediaries. We take a distinct view that systemic risk from excessive leverage is a negative externality analogous to environmental pollution. Solutions to negative externality economic problems require holistic visualization of the system to avoid fallacy of composition type design flaws that characterize Basel II and III. Despite the shortcomings noted by Robert Lucas on the use of econometric models for policy analysis, little was done to address the problem. In contrast, multi-agent models are flexible in terms of agents’ response and are not hampered, as equation led macro econometric models are, by the need for long time series data. The latter ipso facto prevents the inclusion of innovative sectors. Key to the complexity perspective, absent in the mainstream and stochastic control policy models, is reflexivity and competitive co-evolution with the Red Queen type arms race in strategic innovation, especially by regulatees. This view on the interaction between über-intelligent agents, underscored in Markose (2004, 2005), is different from that of econophysicists, who often confine analysis to random interactions. Fixed rules that cannot co-evolve nor respond to regulatory arbitrage will lead to policy failure. I advocate the need to stress test policy prior to implementation and to monitor it on an on-going basis for perverse incentives. I bring insights from new research on how cashless e-payments may have brought down inflation, a major oversight in the management of macrostability. Finally, by endorsing that we avoid “official definitions of systemic risk that have left out the role of government officials in generating it”, Kane (2010), there is an urgent need to counter self-serving behavior in regulatory circles.

 

Six short overviews will be provided on paradigm shift and skills upgrades needed:

 

- Statistical Models of Systemic Risk and Financial Contagion: Early Warning Signals based on Market Price Data of Limited Use


- Complexity Perspective: Reflexivity, Non-Computability of Rational Expectations and Red Queen Type Arms Race

 

- Lucas Critique and Traditional Policy Design Framework Devoid of Complexity Perspective

 

- Holistic Visualization to Avoid Fallacy of Composition Type Errors

 

- Conundrum on inflation reduction in advanced cashless economies: Imbalancing of Anglo-American economies away from real investment

 

- Too Interconnected to Fail financial networks and system-wide fragility

 

- Large Scale database driven Multi-agent Financial Network Models: Platform for Monitoring for Systemic Risk Factors and for Perverse Incentives from Policy

 

 

3. Statistical Models of Financial Contagion: Early Warning Signals Based on Market Indexes of Limited Use


The statistical models made popular by Kaminsky and Reinhart (2000) view financial contagion as the downward co-movement of asset prices across different markets for different asset classes or for different financial intermediaries.  This is based on statistical or econometric methods which operate on cross sectional and time series (panel) data of market prices of the set of relevant assets or financial intermediaries to estimate (amongst other ways) the increased correlations or co-movements of prices across markets.
In the area of systemic measures of market risk, the introduction of Conditional Value at Risk (CoVaR) (Adrian and Brunnermeier, 2008), the principal components approach to measure systemic risk (Kritzman et al. 2010), the distress dependence approach, Chan-Lau et al. (2009), and the distress intensity matrix approach, Giesecke and Kim (2009), are also noteworthy as important complementary means of monitoring the direction in which a financial contagion is likely to spread. We will briefly outline the CoVaR model to give a flavour of the statistical detection of the costs of contagion in terms of conditional value at risk. The CoVaR of institution i is the Value at Risk (VaR) of the entire financial sector conditional on the event that the i-th financial institution is in difficulty. The theoretical and empirical analysis by Adrian and Brunnermeier (2008) shows that ΔCoVaR, the change in CoVaR, has several desirable properties: in particular, it helps to explain the effects of spillover risk throughout the financial system, it can easily be extended to risk measures such as the Co-Expected Shortfall (Co-ES), and it can be calculated both in conditional and unconditional terms. The main statistical technique used by Adrian and Brunnermeier (2008) for CoVaR estimation is quantile regression, estimated from weekly changes in the market prices of assets of financial institutions. Clearly, CoVaR overcomes the criticism levelled at VaR for an individual financial unit, namely that its own market price is not a sufficient proxy for the losses sustained at a given confidence level if other financial units failed, in particular those to which it has large exposures. But as CoVaR is also based on historical price data, if recent history did not record ‘bad’ states, it is unlikely to indicate extreme contagion.
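
A minimal sketch of the quantile regression estimation of CoVaR is given below, assuming simulated weekly return series for one institution and for the system. The data, the chosen quantile and the variable names are illustrative only and omit the state variables of the Adrian-Brunnermeier specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
# Hypothetical weekly returns: institution i, and a system return that loads on it.
x_i = rng.normal(0, 0.03, n)
x_sys = 0.5 * x_i + rng.normal(0, 0.02, n)
df = pd.DataFrame({"x_i": x_i, "x_sys": x_sys})

q = 0.05  # 5% quantile, i.e. the 95% VaR tail (the original paper uses 1%)

# The institution's own VaR (q-th quantile) and its median state.
var_i_q = np.quantile(df["x_i"], q)
var_i_med = np.quantile(df["x_i"], 0.5)

# Quantile regression of the system return on the institution's return.
fit = smf.quantreg("x_sys ~ x_i", df).fit(q=q)

# CoVaR: q-th quantile of the system conditional on the institution being at its VaR.
covar_distress = fit.params["Intercept"] + fit.params["x_i"] * var_i_q
covar_median = fit.params["Intercept"] + fit.params["x_i"] * var_i_med

# Delta-CoVaR: the institution's marginal contribution to system tail risk.
delta_covar = covar_distress - covar_median
print(f"CoVaR (distress): {covar_distress:.4f}, DeltaCoVaR: {delta_covar:.4f}")
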
An important advance in the area of statistical models for financial contagion is the use of CDS market premia. The CDS market premia integrate market expectations on the solvency conditions of reference entities, and hence the study of correlations and/or comovements of CDS premia across different classes of firms, such as non-financial, financial and also sovereign debt, can give an indication of the extent to which the economic contagion has spread and also the direction of future defaults (see Castellano and D’Ecclesia, 2011). CDS spreads and derived metrics are by far the most closely monitored early warning signals for changes in the credit risk of a reference entity, whether private or sovereign.
On the technology front, correlations have been replaced by more sophisticated copula-based measures, as co-movements of market variables under extreme conditions are not linear as is assumed by Gaussian models. Here the cross sectional statistical assessment of systemic risk within the distress dependence matrix of Segoviano (2009), based on the CDS spreads of large complex financial intermediaries, is important. Indeed, it has been put to use in the only quantitative assessment to date of FIs’ overall derivatives exposure and systemic risk impact, in Segoviano and Singh (2010). The paper is based on the FDIC/OCC data on fair value derivatives liabilities for FIs and the CDS spreads for these FIs as reference entities. The latter determine the conditional default probabilities, and the so-called distress dependence between FIs determines which FIs will fail conditional on the failure of others. Segoviano and Singh find that the expected cumulative derivatives losses, when cascaded in a series of insolvencies of top broker dealers, are even beyond the capabilities of the Federal Reserve to provide backstops. This adds urgency to the need to conduct an in-depth structural analysis of financial derivatives and the role of the large complex financial intermediaries which are broker dealers in these markets.
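
To give a flavour of distress dependence measures, the sketch below turns two hypothetical CDS-implied default probabilities and an assumed asset correlation into the probability that FI A defaults conditional on FI B defaulting. This is a generic Gaussian-copula illustration, not the CIMDO methodology behind the Segoviano (2009) distress dependence matrix, and all the numbers are made up.

import numpy as np
from scipy.stats import norm, multivariate_normal

# Hypothetical one-year default probabilities implied from CDS spreads.
p_a, p_b = 0.04, 0.07
rho = 0.6  # hypothetical latent asset correlation between the two FIs

# Default thresholds for the latent Gaussian factors.
t_a, t_b = norm.ppf(p_a), norm.ppf(p_b)

# Joint default probability under a bivariate Gaussian copula.
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
p_joint = joint.cdf([t_a, t_b])

# Distress dependence: probability that A defaults given that B has defaulted.
p_a_given_b = p_joint / p_b
print(f"P(A and B default) = {p_joint:.4f}")
print(f"P(A defaults | B defaults) = {p_a_given_b:.4f}")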

 

 



Figure 2: Banking Stability Index (Segoviano and Goodhart, 09/04) vs Market VIX and V-FTSE Indexes. Note: market data based indices spike contemporaneously with the crisis and are devoid of requisite information for an Early Warning System. (Source: Markose (2010), Presentation at IMF Workshop on Operationalizing Systemic Risk Monitoring)


However, one is faced with a major problem with the use of historical market price data to predict future financial crises. The above approach clearly has merits in determining co-movements or statistical dependencies at a point in time. However, when the cross sectional data is integrated to produce a single Banking Stability Index in Segoviano and Goodhart (2009), there was little predictive power. At the 2010 IMF Workshop on Operationalizing Systemic Risk, Rama Cont and Sheri Markose strongly objected to the use of a statistical index based on market price information (on CDS spreads) to provide an early warning signal for banking fragility. In the above graph, Markose showed that the Segoviano-Goodhart (2009) Banking Stability Index spiked contemporaneously with the crisis, the same as any other market index such as the VIX or VFTSE, and it also subsided far too soon after the Lehman Brothers crisis, making it devoid of requisite information as an early warning signal for an impending and prolonged financial market crisis.


This variable involves derivatives payable by each FI with legally binding bilateral obligations netted over all OTC derivatives products and adjusted for collateral posted.

 

4. Complexity Perspective: Reflexivity, Non-Computability of Rational Expectations and Red Queen Type Arms Race

 


Remarkably, the FP7-ICT-2011(5.6) EU funding call requires that societal trends be analysed using non-classical economic modelling and reflexivity. Reflexivity refers to the capacity to make self-referential mappings, and it is understood that social systems pose problems of epistemic undecidability or problems of predictability of collective outcomes.
Brian Arthur (1994) used the notion of reflexivity in the context of stock prices to challenge the foundations of homogenous rational expectations equilibria as being a logical impossibility. Reflexivity refers to the fact that prices and other social phenomena depend on beliefs or meta-representations on outcomes, and has been popularized by George Soros. He elaborates: “Markets don't reflect the facts very well, partly because they create the facts themselves.” In what Soros calls "reflexivity", trends in the real world reinforce a bias in market participants' minds, which in turn reinforces those trends in "a double feedback, reflexive connection". Realities create expectations, but expectations also create realities, and so on. Spear (1989) showed that the fixed points representing rational expectations have no computable solutions. Arthur (1994) also noted that in asset markets, rewards tend to accrue to those agents who are contrarian or in the minority. That is, if it is most profitable to buy when the majority are selling and sell when the majority are buying, then if all agents act in an identical homogenous fashion, they will fail in their objective to be profitable. Hence, heterogeneity in strategies becomes a rational response, and endogenous uncertainty and non-predictability of prices follow not because the system is being buffeted by white noise or ‘news’ but because of the activities of agents who can form meta-representations of the world. Despite the significance of Brian Arthur’s challenge to orthodoxy, which is often held up as the motivation behind a very large class of ACE models called artificial stock market models, few economists acknowledge the significance of the role of reflexivity and contrarian structures in terms of the mathematical logic that leads to epistemic problems of undecidability or to full blown non-computability (Smullyan, 1961). The contrarian is often referred to as the Liar in logic, as in the self-referential statement “this is false”, which leads to epistemic uncertainty as it is neither true nor false. Likewise, fixed points involving those who can negate cannot be computed à la the Gödel-Turing-Post result.
It is only with recent advances in the neurophysiology of the brain on mirror neurons, hailed as one of the great discoveries in science, that there is an explanation for why the capacity of meta-representation with self as an actor in the mapping (leading to the recognition of hostile oppositional intentions) is essential for social and strategic behaviour. Ramachandran (2006) views Machiavellian behaviour, über-intelligence, deceit and creativity as being part and parcel of this capacity of meta-representation. The significance of the mathematical logic of Gödel-Turing-Post is that any system incorporating such elements of self-referential calculation (i.e. a Universal Turing Machine, UTM) will imply incompleteness or the capacity to produce new objects outside of a set that can be enumerated by another negating or oppositional UTM (i.e. one who negates what is computable/predictable). Smullyan (1961) and Cutland (1980) show how an arms race of such new objects, viz. novelty or surprises, can be generated from Emil Post’s creative and productive sets. Intuitively, this says: expect surprises from the opposition, and if innovation is not adopted then expect to be ‘negated’ by the hostile agent. Following these epochal results, the so-called Wolfram-Chomsky schema of dynamical systems exclusively categorizes complex adaptive systems (see Wolfram, 1984, Casti, 1994) in terms of their capacity to produce novelty or ‘surprises’ and Type 4 undecidable structure changing dynamics.
Sadly, even major exponents of agent based economics (see Tesfatsion and Judd, 2006, or a recent manifesto on the complexity perspective for economics, Delli Gatti et al. 2010) make no reference to reflexivity or the Red Queen type arms race involved in CAS. The main exceptions here are Robert Axelrod (2003), Arthur Robson (2005) and Markose (2004, 2005). Econophysicists shy away from strategic interactions and favour simple rules or even random interactions, and their categorization of complex systems does not include the Wolfram-Chomsky CAS that requires UTMs to produce new objects not previously there. There has indeed been a long legacy of the Theil-Tinbergen theory of macro-policy design based on the so-called Linear-Quadratic-Gaussian model of optimal control, where it is assumed that the policy maker’s targets are only buffeted by random noise rather than by regulatees who game the system. While clearly there are important emergent patterns that cannot be deduced from the interaction of agents using simple rules, as in the Schelling model, we argue that oversight of strategic innovation by regulatees in policy design is detrimental to system stability.
In this we will draw on a remarkable study by Axelrod (2003) in which he provides a check list of all manner of threats to a networked system. Axelrod (2003) cites system failure in networks as arising from a situation in which “coevolution is not anticipated”. He states: “A networked information system not only evolves, its parts coevolve in response to each other’s changes. The evolution of a networked information system is driven by a constant process of change in response to new opportunities (especially technical advances in hardware and software), and new lessons learned (both from experience within the system and from the experiences of other networked systems).” A regulator-regulatee arms race (cf. parasite-host dynamic) requires monitoring and the production of countervailing new measures (cf. production of anti-bodies) by authorities in response to regulatee deviations from rules due to perverse incentives or loopholes in place. When competitive co-evolution is present, the system retains the status quo in terms of some relative performance measure, and structures will manifest persistence, especially in market shares. Run-away growth in some sectors or of some agents is an indication of the removal of constraints or countervailing forces which, if they are not reintroduced, could result in systemic collapse. The reflexive property of über-intelligence means that the strategic behaviour of regulatees or market participants involving innovations can not only render policy ineffective but can potentially make policy itself the trigger behind perverse maladaptive outcomes and system failure. Such system failure is endogenous rather than brought about by exogenous shocks.
In the run up to the financial crisis some had warned of the threats from the Basel II regulatory framework for engendering perverse incentives (Borio and White, 2004; Jones, 2000; Hellwig, 2003). Eichengreen (2010) concludes: “fundamentally, the (2007) crisis is the result of flawed regulations and perverse incentives in financial markets”. A complexity perspective will place monitoring for perverse incentives and regulatory arbitrage as important criteria for the mitigation of systemic risk. Needless to say, conventional thinking and extant reforms do not highlight this. Hellwig (2010) and Triana (2010) have pointed out that the proposed capital adequacy reforms do not go far enough, as all the ‘race to the bottom’ incentives are retained in Basel III. After two decades of concerted effort under Basel I and Basel II to maintain capital adequacy in the G6 banking system, we are now confronted by one that is close to insolvency and by the emergence of risk sharing institutions that gave the semblance of removing local risk only to bring about extreme tail risks for the system as a whole.

               

See Ramachandran (2006) and Oberman et al. (2005).

The Red Queen, the character in Lewis Carroll’s Alice Through the Looking Glass who signifies the need ‘to run faster and faster to stay in the same square’, has become emblematic of the outcome of competitive co-evolution for evolutionary biologists in that no competitor gains absolute ground, see Markose (2005). A local Red Queen arms race leads to the Chuck Prince type observation on “having to dance when the music is on”, and constant innovations are needed to retain the status quo in market share.

There is evidence from the FDIC data set that in 2009 Q4 JP Morgan and Citibank availed of $53.57 bn of the total $65.09 bn of bank balance sheet items for which CDS protection gave regulatory capital relief.

 

 

5. Lucas Critique and Traditional Policy Design Framework Devoid of Complexity Perspective

While, after the event, Andrew Haldane and others (see also Sheng, 2010) have upheld the significance of having a network perspective on financial stability, it is important to understand why economists in the last two decades did not pursue a systemic perspective nor develop any integrative quantitative tool that includes interconnections among component entities for macro-economic modelling and policy design. Reductionism in mainstream economics, with the conflation of the so-called representative agent with a sector or a system as a whole, has rendered it useless for analysis of the stability of systems that arise from interactions between a multiplicity of heterogeneous agents (see Kirman, 1992, 1997, for a longstanding critique of this). In traditional macro-economics, private claims are netted out and hence interconnections of FIs in terms of their obligations that can trigger cascades of insolvencies do not feature in any models of macro-stabilization policy design. However, the main problem with policy design to date is the limited incorporation of strategic behaviour in terms of concrete institutional developments.
Though popular with academic economists, the Theil-Tinbergen theory of policy design, based on the so-called Linear-Quadratic-Gaussian model of optimal control where the policy maker’s targets are only buffeted by random noise rather than by regulatees who game the system, is, due to these inadequacies, of little practical use for policy implementation. The Lucas thesis on policy design, which implied that policy analysis must not be conducted as if it is a game against nature, effectively overturned the Theil-Tinbergen approach to policy analysis as it was traditionally done. A new dawn was promised along the lines of the Axelrod (2003) dictum on system failure due to oversight of co-evolution. The Lucas view on policy is associated with the three following well known postulates (Lucas, 1972, 1976). The first Lucas postulate says that policy objectives may be rendered ineffective by the strategic behaviour of regulatees if they can anticipate (viz. have rational expectations of) or know the outcomes of policy when policy is effectively transparent. I have called this contrarian, rule breaking behaviour or the Liar strategy, in that a contra position cannot be implemented from what cannot be computed or known without ambiguity. Lucas’s second postulate said that when faced by a private sector with rational expectations, it is deemed necessary for authorities to use ‘surprise’ strategies to achieve policy objectives. Third, the computation of equilibrium outcomes or the econometric estimation of models to evaluate policy may be difficult or impossible, as behavioural changes to anticipated policy lead to a lack of structural invariance of the models concerned. Strictly speaking, it is the third postulate above in Lucas (1976) that is referred to as the Lucas Critique. Subsequently, the Lucas Critique correctly put a nail in the coffin of equation based econometric models, which cannot model the capacity of a rule breaking private sector which can anticipate policy and negate policy or jeopardize the system by a process of regulatory arbitrage. Such strategic behaviour results in a lack of structural invariance of the equations being estimated, highlighting the restrictiveness of econometric modelling for policy analysis.
However, a longstanding misunderstanding by macro and monetary economists of the notion of a ‘surprise’ policy strategy in the Lucas thesis on policy design resulted in the dominant view that good monetary policy is one where authorities are engaged in a pre-commitment strategy of fulfilling a fixed quantitative rule (see Markose, 2005, Sections 3 and 4), rather than setting up a macro-prudential framework that will enable them to co-evolve with regulatees and produce countervailing measures to keep regulatory arbitrage in check. Though Binmore (1987) had indicated that any strategist who upholds deterministic strategies as being optimal must answer the question “what of the Liar?”, or what of the agent who is contrarian and can negate or falsify a rule, few if any recognized that the Lucas postulates are analogues of the formal conditions in mathematical logic of complex adaptive systems, Markose (2004a). ‘Surprises’ or novelty production that brings about strategic indeterminism is the logical outcome of agents placed in oppositional or hostile positions, as predictable outcomes of any player will bring about its demise. Quite simply, deterministic strategies cannot be played unless the rule breaker qua Liar can be kept in check, and if not, co-evolution is the name of the game to avoid system collapse. An arms race of surprises or innovations can also be proven to arise from this logic of opposition. As for the failure of econometric models to identify undecidable structure changing dynamics from strategic innovations, mathematical logic indicates that though the meta-model will fully deduce the necessity to surprise or innovate in an arms race structure, there is no ex ante way of identifying these emergent outcomes. In other words, the trio of Lucas postulates characterizes the famous incompleteness of mathematical logic which underpins the Gödel-Turing-Post framework of complexity, Markose (2005).
In the 1990s there was a bandwagon effect of a class of models called monetary game theory models that set aside the postulates of the Lucas Critique and advocate its exact opposite for the conduct of monetary policy. The dichotomous application of the Lucas Critique to policy objectives pertaining to the real and nominal sides of the economy is the prominent feature of the monetary game theory models that dominated discussions on policy design, see Goodhart (1994). For real side objectives the famous Lucasian categories of ‘dust, ambiguity and uncertainty’ (ibid. p.110) are deemed necessary to achieve policy outcomes. For nominal variables such as the price level and the rate of inflation, these models hold that commitment to transparent monetary rules, such as currency pegs or preannounced inflation targets involving interest rate adjustment, will lead to greater credibility and success in inflation control.


The notion of a surprise strategy in the macro-economics literature appears in the so-called Lucas surprise supply function, often defined as follows: y = y* + b(p − p^e) + ε. This says that output, y, will not increase beyond the natural rate, y*, unless there is ‘surprise’ inflation, (p − p^e), which is the prediction error from expected inflation, p^e. The idea here is that the private sector contravenes the effects of anticipated inflation, viz. the neutrality result. Hence, it is intuitively asserted that authorities who seek to expand output beyond the natural rate need to use surprise inflation. As surprise inflation sounds like a ‘bad’ thing to do, the objective of mainstream monetary policy became one of pre-committing authorities to a fixed rule for inflation control. See Box 2 on how surprises or novelty production of objects not previously there is not permissible in extant game theory and hence the role of a co-evolutionary arms race between regulator and regulatee is not part of the mainstream macro-economic policy setting framework.

Smullyan (1961), in his monograph on Formal Systems and the characterization of the limits of deduction, provides a proof of an ever extendable set of such ‘surprises’ using productive sets and functions that underpin the mathematics of incompleteness. See also Cutland (1980). Box 2 uses this framework.

The number of papers espousing the main tenets of this class of models is so large that it is best to refer to Fischer (1994) for a balanced survey of the macro-policy framework that has dominated in the last two decades. See also Cukierman (1994).

Goodhart (1994), in the format of an open letter to the Governor of the Bank of England, reviews Cukierman (1992). Though Goodhart suggests that it may be “silly” (italics in original, ibid. p.144) that these models have diametrically opposite policy recommendations for policy objectives on real and nominal variables, he is unable to explain in strategic terms why people behave differently to real and nominal policy variables.

Box 2: Complexity Perspective For Policy Design
The epithet complex adaptive system (CAS), following in the lineage of Gödel-Turing-Post, is attributed only to the so-called Type 4 dynamics of the Wolfram-Chomsky schema, where agents with the highest level of computational intelligence interact and produce new objects not previously there and also bring about undecidable structure changing dynamics, Markose (2004a, 2005). The mathematical logic that underpins CAS has three key components: (i) meta-representational systems and self-referential or reflexive mappings that necessitate the 'über' computational intelligence of a Universal Turing Machine; (ii) contrarian or self-negating structures like the Liar (as in ‘this is false’); and (iii) the consequences of (i) and (ii), which can be represented by so-called creative and productive sets, with the latter depicting an arms race in novelty production or 'surprises'.
It has been argued that policy design needs to take on board strategic interaction involving hostile or contrarian rule breaking agents and how this implies an arms race with new objects and ‘surprises’. However, as partly indicated by Binmore (1987), the extant mathematics of game theory is closed and complete. It can only provide strategy mappings to a fixed action set, and indeterminism extends only to randomizations between given actions. Regarding contrarian behaviour, the Liar and strategic innovations or surprises to escape from hostile agents, as pointed out by Crawford (2003), to date economic “theory lags behind the public’s intuition” and “we are left with no systematic way to think about such ubiquitous phenomena”. The Gödel-Turing-Post theory of computation provides the best known formalism of meta-representation of an underlying system in terms of encoding using integers, n ∈ ℕ (also known as Gödel numbers), to represent the instructions utilizing strings of symbols to achieve encoded outputs from inputs in a finite number of steps in terms of an algorithm or program. The execution of this encoded information, which one can regard as a simulation, can be done on ‘mechanisms’ involving any substrata ranging from intra-cellular biology to silicon chips. This capacity of meta-representation, without which CAS properties do not emerge, yields the notion of a universal Turing machine (UTM), which can take the encoded information of other machines and replicate them. Remarkably, UTMs can run codes involving themselves, which is the basis of self-reference. If codes of functions are not already given, then successful simulations require discovering fixed points of executable functions. The typical notation for mappings involving encoded information is given as f(x) ≃ φ_a(x) = q. That is, function f(.) on input x when computed using the program a is denoted as φ_a(x). If φ_a(x) is defined, or halts, it yields output q; if the function f(x) is undefined (~) then φ_a(x) does not halt.
The functions that always yield outputs on any encoded input are called total computable functions and can be regarded as all potential technologies. This set, denoted here by T, cannot be effectively enumerated and there is no systematic way of ‘searching’ or listing it. Some finite subset of this set entails known technologies and can represent a given action set A of traditional game theory. A novelty or a surprise is an encoded object in the set (T − A), i.e. outside of the set A that is already known to exist. The remarkable achievement of Gödel-Turing-Post mathematical logic is that there is only one way, viz. incorporation of the Liar or contrarian function, which we denote by f¬, by which fully deducible meta-computations on the fixed point of f¬ determine the logical necessity for surprise mappings into the set (T − A). This functional mapping is called the productive function in logic and, as it involves novelty and surprise, we will denote it as f!. We say that a total computable function g(m) has a fixed point m such that φ_{g(m)} = φ_m. Note, g(m) ≠ m, but they identify the same function f, and if the programs g(m) and m on both sides of the equation halt they must yield an identical output q; then m is the computable fixed point of g.
In a simple two person oppositional game involving, for example, the private sector and government authorities (or a parasite-host relationship, indexed by p and a), the generic statement of the Liar or contrarian strategy is the following:

f_p¬(s) = q~ if the policy with code b_a is applied and produces output q in state s;
f_p¬(s) is undefined (the Liar does nothing) otherwise.

The first line states that output q will be negated to q~ by the contrarian f_p¬ strategy if and only if the policy with code b_a is applied and output q is produced in state s. If not, as noted in the second line above, the Liar does nothing. The fixed point of f_p¬, denoted as s(b_a¬, b_a¬), is not computable: were a fixed-point equation for it to hold, the two sides of the equation would produce contradictory outputs. Remarkably, the two-place encoding s(b_a¬, b_a¬) (analogous to the Gödel substitution function) says that p knows that a knows that p is the Liar. From here on, total computable strategy functions, starting with that of the authorities, f_a!, can only map into a set such as (T − A) and will mark an arms race in surprises. If this is not feasible, the preannounced policy rule a has to be abandoned or the Liar eliminated to avoid policy failure. This framework signifies that recognition of hostile agents requires the highest level of computational intelligence (which Steven Wolfram claims is already ubiquitous even in the humble virus) and, further, that in the absence of contrarian or oppositional structures there is no logical need to innovate or surprise.
The best known example of reflexivity, often written about in the popular press, is that of stock market prices:
P_{t+1} = g(Σ_{i=1,..,N} b_it(P^e_{i,t+1})). That is, the price at t+1 is determined by the strategies b_it (to buy or sell) of investors i = 1, 2, .., N, based on their respective beliefs P^e_{i,t+1} of the price at t+1, and the market price determination function g(.) is increasing in excess demand (aggregate buy orders less sell orders) at t. Spear (1989) was the first to show that rational expectations involving the belief or forecast function correspond to inductive identification, by trial and error, of the fixed point for the market price function g(.), as in φ_{g(m)}(s) = φ_m(s), where s is an encoding of past historical data. Further, pointing out the inherent contrarian or minority nature of the stock market game, where payoffs to pure speculative investors are at their maximum if they sell when the majority are buying and vice versa, Arthur (1994) overturned traditional ideas of rationality and showed that it is logically impossible for all investors to have identical/homogenous rational expectations. The role of contrarians in bringing down financial systems should not be underestimated. The prominent contrarian strategies that have netted vast profits in the context of the institutionalized free lunches of the ERM currency peg and the CDS carry trade have been, respectively, that of George Soros in 1992 and of Paolo Pellegrini and John Paulson in the 2007 crisis. Good institution design should vitiate such opportunities.
The contents of this Box are meant to underscore how failure of policy can arise from an insufficient understanding of the logic behind the co-evolutionary pressures that arise from strategic interaction between intelligent and potentially hostile agents. This is to be contrasted with the science behind yet another complexity perspective, which is important in describing a large class of spectacular phenomena that can only emerge or self-organize, such as pattern formation in shoals of fish or flocks of birds and even racial segregation as in the Schelling model. It is important to understand the tipping points and sudden phase transitions that such models can throw light on. These are brought about by simple local interactions or rule following by agents, but lack the strategic elements arising from reflexivity or fixed point mappings that lead to arms races in innovation in CAS.

In the Wolfram-Chomsky schema of dynamical systems, CAS is shown to be different from chaotic dynamics. The popular view is to conflate either chaos or all manner of complicated situations with CAS. Further, it is only with the discovery of so-called mirror neurons in the neuro-physiology of the brain and the work of Ramachandran (2006) and Oberman et al. (2005) that it is being understood why the capacity of meta-representation with self as an actor in the mapping (leading to the recognition of hostile oppositional structures) is a key ingredient of strategic behaviour. The significance of the mathematical logic of Gödel-Turing-Post is that any system incorporating such elements will imply incompleteness or the capacity to produce new objects with algorithms or encoded information as inputs.

Assume that there is a unique homogenous forecast function such that for all i, φ_a(s) = P_{t+1}↑, i.e. a price rise is predicted. Then the contrarian strategy b_it¬ kicks in for all investors, leading them to sell, and hence results in the market price function outputting a price fall, φ_{g(a)}(s) = P_{t+1}↓. Rationality, in the presence of minority payoff structures, generates endemic heterogeneity in strategies.
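
The way minority payoffs punish homogeneity can be seen in a toy simulation in the spirit of Arthur's El Farol / minority game. This is an illustrative sketch only, not a model from the papers above: agents who all follow the same rule are always in the majority and never win, while a mixed population of trend-following and contrarian rules wins roughly half the time.

import numpy as np

rng = np.random.default_rng(2)
N, T = 101, 200  # an odd number of agents and the number of rounds

def play(strategies):
    """Each round agents choose buy (+1) or sell (-1); the minority side wins.
    Each strategy maps the last winning side to the agent's next action."""
    last = 1
    wins = np.zeros(N)
    for _ in range(T):
        actions = np.array([s(last) for s in strategies])
        minority = -np.sign(actions.sum())  # the side chosen by fewer agents
        wins += (actions == minority)
        last = minority
    return wins.mean() / T  # average per-round win rate across agents

# Homogeneous rule: every agent trend-follows the last winning side.
homogeneous = [lambda last: last for _ in range(N)]

# Heterogeneous population: each agent mixes trend-following and contrarian play.
mix = rng.uniform(size=N)
heterogeneous = [(lambda p: (lambda last: last if rng.random() < p else -last))(p) for p in mix]

print(f"homogeneous rules, average win rate: {play(homogeneous):.2f}")      # exactly 0
print(f"heterogeneous rules, average win rate: {play(heterogeneous):.2f}")  # close to 0.5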

 

 

 

6. Fallacy of Composition and the Need for Holistic Visualization

Fallacy of composition (see Brunnermeier et al. 2009) is what has been noted to be one of the fatal flaws of the Basel I and II regulatory framework. This fallacy arises from the mistaken view that what is efficient or rational at the level of an individual firm also produces systemically stable outcomes. As will be shown below, only a holistic visualization of the topological structure of the system can demonstrate the concentration of risk bearing activity and increased interconnectedness in the system, as more and more firms outsource insurance or diversification activities to a few. Risk sharing is only as good as those involved as risk guarantors, in terms of numbers and quality of capital. Adverse selection can arise inadvertently (due to problems of mispricing of risk) if risk guarantors in unfunded contingent claims markets undertake obligations which they cannot fulfil when the crisis occurs. The generation of tail risk or extreme outcomes becomes more probable with ‘excessive’ outsourcing of risk from the balance sheets of firms.
Tiering and concentration of banking activity, with some banks assuming specialist broker dealer roles, often arises from the objective to economize on liquidity and to minimize final settlement. For example, bilateral offsetting is undertaken in OTC derivatives to maximize returns from spreads and to minimize final settlement to end users. It must be noted that of the $700 trillion gross notional value of global derivatives only 5% is for purposes of hedging.
Figure 1 gives the bipartite graph which shows the participants of the global OTC derivatives markets (trading only) in the five markets: interest rate, forex, credit, equity and commodities. The graph plots which of the participants operate in only one, two, three, four or all five markets. What is significant is that the 16 universal banks in a circular grouping in Figure 1 are present in all 5 markets. These are the broker dealers in all these markets, while the majority of participants are mostly in interest rate derivatives only and a smaller number in both interest rate and forex.
Figure 1: Structure of Financial Derivatives Market (2009 Q4): Green (Interest Rate), Blue (Forex), Maroon (Equity), Red (Credit/CDS), Yellow (Commodity); Circle layout: Broker Dealers in all markets (Bipartite Graph). (Source: Markose, 2011, Report for IMF)
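
The bipartite structure described above can be reproduced with a small networkx sketch; the participant names and market memberships below are made up for illustration, not taken from the 2009 Q4 data.

import networkx as nx

markets = ["interest_rate", "forex", "credit", "equity", "commodity"]

# Hypothetical membership lists: which FIs trade in which OTC derivatives market.
membership = {
    "interest_rate": ["BankA", "BankB", "BankC", "BankD", "FundX"],
    "forex":         ["BankA", "BankB", "BankC", "FundY"],
    "credit":        ["BankA", "BankB", "BankC"],
    "equity":        ["BankA", "BankB", "BankC"],
    "commodity":     ["BankA", "BankB", "BankC"],
}

# Build the bipartite graph: one node set for FIs, one for markets.
G = nx.Graph()
G.add_nodes_from(markets, kind="market")
for m, firms in membership.items():
    for f in firms:
        G.add_node(f, kind="fi")
        G.add_edge(f, m)

fis = [n for n, d in G.nodes(data=True) if d["kind"] == "fi"]

# FIs present in all five markets are the candidate universal broker-dealers.
universal = [f for f in fis if G.degree(f) == len(markets)]
print("present in all markets:", universal)

# FIs active only in the interest rate market, the largest peripheral group.
only_ir = [f for f in fis if set(G.neighbors(f)) == {"interest_rate"}]
print("interest rate only:", only_ir)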

Figures 2(a, b) below, from Blake et al. (2010a), give another example of how what appears to be a rational strategy to follow in pension fund management at the level of the individual fund sponsor can contribute to growing systemic risk from concentration. They study key shifts in the structure of the UK pension fund industry from 1984 to 2004. From the 1980s to the mid 1990s, pension funds were primarily managed in house and, if they were outsourced, fund sponsors used balanced fund managers. Increasingly, pension fund management is both outsourced and the balanced fund manager has been superseded by specialist fund managers that are often specialized according to asset class. At the individual fund level, outsourcing to a number of specialist fund managers rather than putting the bulk of funds in balanced fund management appears to fulfil objectives of diversification. However, as more and more pension fund sponsors follow this route, at a system wide level there is increasing concentration of fund management in very large specialist funds. In other words, at the system level there is growing evidence of suboptimal diversification. Note that the individual red nodes at the top of the figures, depicting individual pension funds following in-house portfolio management, have fallen off by 2004 when compared to 1984.
Figure 2a UK Pension Fund Management Network in 1984:

 

 

Figure 2b UK Pension Fund Management Network 2004

 

By 2004, not only are there fewer consultants (black diamonds) channeling funds; the blue lines showing multiple mandates from the same pension funds to specialist managers show greater density, implying that more and more individual fund sponsors are routing their funds to the same set of specialist fund managers. Growing concentration of funds in specialist fund managers in asset classes can precipitate herding in these markets when compared to funds managed in balanced funds.

No doubt, overcoming the fallacy of composition and designing systemically stable systems, while strategic interactions among economic agents dynamically change the stability of network connections, is no mean task. As has become prominent with environmental pollution and congestion issues for society, striking an appropriate trade-off between growth and stability or sustainability requires integrative modelling tools capable of yielding visualizations of systemic instability emanating from individual behaviour. Individual activity is directly observable, while collective outcomes from unobservable interactions are hard to ‘see’. Hence, it is easy to succumb to the fallacy of composition in policy design. Composite scenario analysis is needed to help design satisfactory institutional solutions for sustainable growth where endemic problems of individual behaviour with local incentives may not coincide with stable system wide outcomes. Thus, systemic risk from financial activity, and the pricing of it, is no different from the overuse and degradation of resources as in environmental negative externalities (e.g. CO2 emissions). These arise from economic activities where the ‘clean up’ costs are not fully priced at the point of use by the individual, and hence intra- and intergenerational problems of fairness follow when costs are passed on. At least since Pigou (1950) it has been known that regulatory intervention is needed to ‘cap’ the aggregate quantity of the negative externality/economic bad, and hence of the original economic activity, at sustainable levels. Outright prohibition, taxing the activity or pricing the negative externality by novel ‘cap and trade’ methods are ways of controlling these potentially unsustainable trends. In financial and monetary instability, whatever the finer details, there is always an inherent ‘cap’ to the quantity of credit or fiat money that the system can absorb safely, and traditionally it is the role of the head of the central bank 'to take the punch bowl away as the party gets going’.

Hence, compared with controls on carbon emissions and other measures aimed at aligning individual actions with environmental sustainability, which are relatively new, institutional controls on privately issued credit or leverage and on the fiat money supplied by governments have a rich history. With regard to the first, there has only ever been one way of determining the sustainability of financial intermediation: privately issued commercial paper and securitized bank assets have to rely on fiat money or government securities as reserves when convertibility is at stake, that is, when the underlying assets suffer a loss of value due to an increased threat of default by the primary debtor or counterparty, or a collapse of asset markets. The restraints placed over the years on the fractional system of private credit creation are too numerous to list here. As for the second, state-engineered inflation in the consumer price index (CPI), arising from the government monopoly in the supply of the fiat money on which payments for goods and services rely, has been the main threat to monetary and economic stability.

 

 

7. Conundrum on Inflation Reduction in Advanced Cashless Economies

Up until 2007, the lack of incentives for academics or economists in central banks to keep up with recent advances in the monetary and financial sectors, or to build simulation platforms for the assessment of policy in a strategic and system-wide setting, is tied up with the major conundrum surrounding the run up to the recent crisis. Since 1994, traditional overheating of economies with inflation in the CPI index, despite long-standing consumer-credit-fuelled spending sprees, has been virtually non-existent in developed countries. Threats to financial and economic stability from inflation have been in abeyance. Following the double digit inflation of the 1970s in some advanced OECD economies, concerted efforts, especially in the UK, to restrict central banks by statute to fulfilling a fixed rule on inflation led many to conclude that, even though interest rate policy was unabashedly loose, as in the US in the early part of this decade, the so-called 'great moderation' could be attributed to good helmsmanship. Hence, few threats were perceived either in the spectacular growth of the US shadow banking sector or, as in the case of the UK, in bank assets growing by as much as 200% of GDP in only two years to reach about 600% of GDP by 2006 (Alessandri and Haldane, 2009). To my mind, two key structural developments that occurred in the Anglo-Saxon monetary and financial system were largely ignored. Both of these have a bearing on future developments in the BRICs.
Firstly, state-supplied notes and coins (also known as M0 in the US) are increasingly being phased out of monetary transactions in so-called advanced cashless economies, owing to the increased use of electronic funds transfer at point of sale (EFTPOS) in physical markets and to electronic payments being the only option in e-markets. The latter have de facto undermined the legal tender status of state monies. With EFTPOS effecting fund transfers between payee and payer through direct electronic debits and credits on their respective bank balances, M0 circulating outside the banking system is drastically reduced (Markose and Loke, 2003a,b). In Finland, which is in the vanguard of cashlessness, M0 in this decade is under 1% of GDP; in the UK it is about 2% and in the USA about 5%. Economies where e-cashlessness has not yet taken off typically have about 16%-25% of GDP as cash in circulation. There are studies that acknowledge that technology-led changes in payment habits in the direction of cashlessness have eroded seigniorage. However, few have taken up the theory first propounded by Hayek (1974) that the only permanent brake on the capacity of governments to engineer inflation via the monopoly on the supply of payments media is to find private substitutes that economize on it. Marimon et al. (1997) admirably stated the key issues: “Most developed countries have experienced a drastic reduction of inflation rates in the last quarter of this century, from the double digit numbers of the mid seventies to the very low – say, below 2.5% – numbers at the end of the nineties. High inflation episodes seem to be problems of the past, as if society had become immune to the disease. This success in curbing inflation has been usually attributed to a better monetary policy management to achieve price stability. But, maybe the right incentives have been created by the widespread development and use of cash substitutes. Who deserves most credit? An implication of the paper will be that the role of electronic money in curbing inflation has been undervalued.” To the best of my knowledge, the only governor of a central bank of a developing country in recent times who has sought to eliminate cash from transactions through electronic payment methods as a means of controlling inflation is that of Ghana.

Theoretical papers that acknowledge the feature of diminishing or zero transactions demand for currency are unable to throw light on what happens to inflation because of what is called indeterminacy of the price level. This is, of course, an artefact of their models rather than a fact of the real world. In contrast, Woodford (1998, p.217) reinstated price level determinacy but concluded that “… the project of modelling the fine details of the payments system and the sources of money demand is not essential… to the analysis of the effects of alternative monetary policy”. This effectively marked an end to any serious academic investigation into one of the most important developments in monetary history, viz. the erosion of governments' role in the supply of the means of payment and its implications for state-engineered inflation.

In a highly cash-based payments system for goods and services, as prevailed in the periods when double digit and hyperinflation crises occurred, the quantity of cash mediating transactions influences the nominal prices of goods. In many variants of monetary theory, this is called the monetary veil. As noted in Humphrey et al. (1996), in fourteen developed countries (barring the US) between 1987 and 1993, 34% of retail expenditures involved electronic payments. With increased cashless payments, the role of fiat money in transactions is increasingly being reduced to that of a numeraire or unit of account. With the monetary veil being lifted from transactions in goods and services, inflation in the CPI index has come down permanently in OECD countries. The bulk of fiat money is mostly in the form of bank reserves, which underpin the large edifice of 'inside money' or broad money: bank deposit creation, securitized forms of the loan book of banks (for example, asset backed commercial paper) and other private credit instruments held as bank liabilities. As private credit is self-liquidating once loans are repaid, inside money (via the proportion channelled to consumer credit) is not capable of bringing about permanent inflation in the CPI index. But unrestricted growth of private credit channelled to assets can produce asset (real estate, equity and commodity) price bubbles.
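A stylized way to make the 'monetary veil' point explicit (my own notation, not a formula from the cited papers) is to let θ denote the share of nominal expenditure still settled in cash within a quantity-equation setting:

\[
M_0 V_0 = \theta \, P Y, \qquad 0 \le \theta \le 1,
\]

where M0 is state-supplied cash, V0 its transactions velocity and PY nominal expenditure on goods and services. As EFTPOS and other cashless instruments push θ towards zero, the remaining (1 - θ)PY of transactions is settled by inside money against bank balances, and the link running from the stock of fiat cash to the consumer price level P is correspondingly weakened; this is the sense in which the veil is being lifted.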

In view of the recent increase in the US monetary base of roughly 142%, as the Federal Reserve expanded bank reserves to about $1.2 trillion with quantitative easing, many have forecast a threat of inflation as the economy picks up. There is, in fact, a unique social experiment in the making from which we may be able to learn what the consequences for inflation are of the technologically instilled habit of using debit cards in payments, which leads consumers to desist from withdrawing cash for purchases. It is an intriguing thought that, despite high inflationary expectations among consumers and the large monetary base being supplied through quantitative easing, a Weimar Republic scenario of inflation growth could be curtailed by non-cash based payment habits. In 1923, cash withdrawals accelerated to keep up with the upward repricing of goods as real output decreased; cash in circulation grew 15-20 times as prices rose 40-50 times in the Weimar Republic. Despite some resurgence of cash use due to low interest rates (see Markose and Loke, 2003), there appears to be an absolute brake placed on the CPI index by cashless payments, which warrants the construction of a new cashless consumer price index.

8. Managing Fractional Private Credit Creation Systems and Leverage from Derivatives

The consequence of a lack of understanding of the role of interest rate management in an advanced cashless economy, where interest rates can influence credit generation (subject to the Keynesian liquidity trap) but not inflation in the CPI, has not only led to a series of major oversights but has also allowed the US and UK authorities to apply quantitative easing via injecting bank reserves with impunity. The process that Minsky (1982) called Ponzi finance, whereby existing financial obligations can only be met by issuing new liabilities, became endemic in the banking sector, which increased leverage through securitization and is now dubbed the shadow banking system. Securitized bank assets in forms like asset backed commercial paper (ABCP) are bank liabilities in that they enable the bank to leverage its loan book via the short term repo markets. Reserves and the leverage multiplier in the creation of private sector claims ultimately govern issues of convertibility at times of crisis. As presaged by Hayek (1936), and as has been a well known idea since at least Henry Thornton, private monies will ultimately balloon into problems of systemic collapse when convertibility into more liquid forms of regulated funds or fiat money is at stake. This is the fundamental problem of non-fiat monies and fractional banking with less than 100% reserves. The run on the repo market (where funds are borrowed and lent against collateral that included private debt instruments such as MBS) in the recent crisis is such a manifestation of the convertibility problem of private monies.
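As a back-of-the-envelope illustration of why reserves and the leverage multiplier govern convertibility (a minimal sketch with hypothetical parameters, not drawn from the paper): under a fractional reserve or haircut ratio r, each round of credit creation re-lends or re-securitizes a fraction (1 - r) of the previous round, so total private claims converge to base/r while only the fraction r of those claims can be converted into reserves on demand.

# Hypothetical sketch: private claims built on a fixed reserve base under a
# fractional reserve/haircut ratio r (geometric credit creation).

def total_claims(base, r, rounds=500):
    claims, layer = 0.0, base
    for _ in range(rounds):
        claims += layer
        layer *= (1.0 - r)   # fraction of each layer that is re-lent / re-securitized
    return claims

base_reserves = 100.0        # hypothetical units of fiat reserves
for r in (0.10, 0.05, 0.02):
    claims = total_claims(base_reserves, r)
    print(f"r = {r:.2f}: total claims ≈ {claims:7.1f}; convertible on demand ≈ {100 * r:.0f}%")

Halving the ratio doubles the stock of claims resting on the same reserves, which is why convertibility crises of private monies are, in the end, always about the adequacy of the reserve base.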

In this context, the proposal in Basel II and III that enables banks to replace bank capital with CDS derivatives must be viewed within a fractional monetary system. The regulatory boundary can be extended so that reserve and leverage ratios apply to all participants involved in credit creation, irrespective of whether they are depository institutions or not. But on the question of whether credit derivatives can be included as a substitute for capital on bank assets, the answer is that they violate the principle behind capital and reserve requirements. These aim to restrain the multiplier effect of further securitization of debt, whereas guarantors of credit derivatives will naturally seek to offset the risk by finding another guarantor, and so on. While it is not even clear that banks would adopt CDS for risk mitigation without the capital reduction incentives, there appears to be little understanding that the use of unfunded CDS insurance cover to replace bank equity capital can only produce a potentially explosive Ponzi-type fractional system in credit derivatives, as 'derivatives beget derivatives'. That is, obligations from one level of credit derivatives will be dynamically 'hedged' by further unfunded CDS protection. Note that, unlike vanilla derivatives where dynamic hedging is vis-à-vis an underlying variable whose returns are not (by and large) influenced by the derivatives written on it, credit derivatives on securitized bank loans, which themselves become underlying securities, can leverage the system further. What appears to be an individually rational activity can be systemically lethal. Further bundling of these credit derivatives can itself become an underlying, all of which can magnify the risk that the original guarantee will not be met when a major macro event occurs that reduces the value of the pro-cyclically correlated underlying. It was the failure to meet CDS obligations by key CDS protection sellers on subprime related MBS and CDOs that led to implicit or explicit taxpayer bail-outs, on the premise that these financial entities were too interconnected to fail. Further, repo markets that use these assets as collateral impose higher haircuts, all of which can trigger a run on private debt creation.
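The 'derivatives beget derivatives' dynamic can be put as a simple geometric chain (a stylized sketch of my own, with a hypothetical lay-off fraction f, not a calibration of actual CDS books): if each unfunded protection seller hedges a fraction f of the exposure it has written by buying protection from the next seller, the gross guarantees written against a securitized loan pool of size L sum to L(1 + f + f^2 + …) = L/(1 - f). For example, with L = 100 and f = 0.9 the chain carries gross guarantees of about 1,000, ten times the original pool, while the system's capacity to pay out when the pro-cyclically correlated underlying collapses has not increased at all.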

The final report is to be submitted on 30 July 2011.

In a recent project on a market design for pricing road congestion externalities in which I was involved (Markose et al., 2007), the build-up of congestion in Central Gateshead during peak times was artificially reproduced in a computer environment. For this, following the principles of model verité, the entire road system of an urban congestion hot spot was digitally mapped and commuter agents were incrementally added to the road network based on actual origin and destination data for habitual commuters who traverse that city centre area. The 'cap' was then determined at the point where the total distance travelled function peaked and started to fall.
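A minimal sketch of the cap-finding procedure just described (the throughput function below is a hypothetical stand-in for the digitally mapped Gateshead road network and the actual origin-destination data): commuter agents are added in increments and the cap is read off where total distance travelled peaks and starts to fall.

# Hypothetical sketch: determine the congestion 'cap' by adding commuter
# agents until total distance travelled (network throughput) peaks.

def total_distance(n_agents, free_flow_km=10.0, capacity=5000):
    """Stylized throughput: distance completed per trip falls as congestion builds."""
    congestion_factor = max(0.0, 1.0 - (n_agents / capacity) ** 2)
    return n_agents * free_flow_km * congestion_factor

best_n, best_d, n = 0, 0.0, 0
while True:
    n += 100                       # add agents in increments, as with habitual commuter O-D data
    d = total_distance(n)
    if d <= best_d:                # total distance travelled has peaked and started to fall
        break
    best_n, best_d = n, d

print(f"Cap ≈ {best_n} commuters; peak total distance travelled ≈ {best_d:.0f} km")

In the actual exercise the throughput curve comes from the simulated road network rather than a closed-form function, but the stopping rule is the same: the cap is set at the agent load at which total distance travelled is maximized.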

It is well known that productivity growth can be a direct cause of a fall in the price level. However, as shown in Figure 1 of Pilat (2002) on multifactor productivity (MFP) for OECD countries, the Netherlands, Austria, Belgium, Italy, Japan, France, Germany, the UK and Spain had lower MFP growth in 1990-2000 than in the previous decade. In contrast, 1994 has been noted in Markose and Loke (2002) as the watershed when inflation fell to 2% on average in selected G10 countries and remained low thereafter. Cheap goods from China (see Bean, 2006) have been given as another explanation for the drastic fall in core inflation in developed countries. The lack of upward pressure on factor prices such as wages is another reason why inflation remains low once it has fallen.

However, though Hayek (1974) conceived of the evolution of private currencies circulating as a means of substituting away from fiat money for payments, he did not presage electronic developments such as the debit card of the late 20th century. Debit cards are successful because they permit direct verification of bank balances and therefore obviate the need for the reputational inputs for payment guarantees that had held up other cashless payment methods. While Hayek (1976) decried the state monopoly of the mint, owing to the forced reliance of citizens on national currencies for transactions and the capacity of governments to engineer inflation via this channel, he was clear that “neither higher wages nor higher prices of oil or perhaps inputs can drive up prices of all goods unless purchasers are given more money to buy them” (ibid p. 95).

Speaking on the theme "Banking in the next millennium, expectations, opportunities and challenges" at the 28th anniversary of Ghana's Chartered Institute of Bankers in Accra, the Governor, Dr Kwabena Duffour, noted that Ghana's payment system is highly cash-based and underdeveloped, saying: "there is still over-reliance on cash as a means of payment despite the few electronic modes of payment currently adopted by the banks". He said the situation allows cash in circulation outside the banking system to increase, with resulting increases in inflation, foreign exchange and interest rates and severe constraints on commercial activities. Referring to the drive that brought inflation down from 20.8 per cent in December 1997 to 17.4 per cent in May 1999 by reducing money in the economy, and to the aim of reducing inflation further to a single digit, Dr Duffour said that "our current cash-based payment systems are impeding the Bank's effort." http://www.ghanaweb.com/GhanaHomePage/NewsArchive/artikel.php?ID=4351

US cash in circulation was about $900 billion in 2009.

I am grateful to Steve Spear of Carnegie Mellon University for the discussion on whether the growth of new financial derivatives markets (with derivatives on derivatives) can succeed in completing markets and providing risk-free hedges. As in the well known mathematical logic of incomplete systems, with derivatives on assets that reflect system-wide information (and pro-cyclicality), these instruments are inherently incapable of completing markets. Ironically, as these instruments endogenously generate extreme volatility, using them to hedge volatility may seem indispensable at the individual level!