Business Cycle Theory
It has frequently been observed that interest in business or trade cycle theory is itself cyclical (e.g. Zarnowitz 1985, p.524). In periods of sustained prosperity interest wanes, as it did in the 1960s and early 1970s when research into macroeconomic dynamics concentrated on growth theory. At the end of the 1960s, the continued existence of business cycles was questioned. The experiences of the 1970s and early 1980s, especially following the 1973 and 1979 oil price shocks, brought a resurgence of interest in business cycles.
In this introductory section, the main themes in business cycle research in the 1980s will be reviewed. In section: Equilibrium Business Cycle (EBC) Modelling, the equilibrium approach to business cycle modeling, which has been dominant in the 1980s, will be discussed in more detail.
In section: Nonlinear Cycle Theory, recent contributions by economists who do not accept that the business cycle can be adequately modeled using the linear Frisch-Slutsky approach, and who argue that nonlinearities must be introduced, will be surveyed in order to update the survey of nonlinear business cycle models in Mullineux (1984). This chapter does not seek to provide a comprehensive survey of the vast literature on business cycles; Zarnowitz (1985) has recently made a Herculean attempt at this.
Mullineux (1984, Ch. 3) traces the renewed interest in business cycle theory to the contributions of Nordhaus and Lucas in the mid-1970s. Nordhaus (1975) revived interest in the idea of a political business cycle (PBC) and Lucas (1975) utilized rational expectations to revitalize interest in the equilibrium business cycle (EBC).
Nordhaus’s contribution differed from previous PBC literature1 in stressing the influence of the electoral period on the cycle in economic activity and drew on the ideas being developed by ‘modern political economists’ (e.g. Tullock 1976 and Frey 1978), who argue that governments manipulate the economy to maximize votes.2 Lucas viewed Hayek (1933) as an antecedent of his work.
Lucas’s EBC model marked a major change from the Keynesian approach to business cycle modeling, which regarded the cycle as an essential disequilibrium phenomenon. ‘Rigidities’ or ‘frictions’ in the economy, such as sticky nominal wages and prices, were the proximate cause of disequilibrium.
Lucas, and other members of the ‘New Classical School’,3 aimed to derive the dynamic behavior of the macroeconomy from the basic microeconomic principles of rational, maximizing firms and individuals and in so doing made use of advances in microeconomic theory relating to intertemporal labor supply.
In the latter connection, Lucas (1972) had developed the ‘Lucas Supply Hypothesis’ (LSH) which showed that, under rational expectations with restricted information, deriving from the ‘islands market hypothesis’ (Phelps 1972), a ‘signal extraction problem’ arises and labour and firms will tend to respond to a rise in price by increasing supply, even if they are uncertain if the price rise is a relative or an absolute one, in order not to miss out on profit maximizing opportunities - see Lucas 1977 for an informal discussion. The LSH formed the basis of the Lucas (1975) EBC model.
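The logic of the 'signal extraction problem' can be made concrete with a small numerical sketch. Under normality, a producer observing only its own island price, the sum of an unobserved aggregate component and a relative component, optimally attributes a fixed fraction of any price movement to the relative component. The variances below are illustrative assumptions, not parameters drawn from Lucas's model.

```python
# Variances of the two unobserved shock components (illustrative assumptions).
sigma_p2 = 1.0   # aggregate (nominal) price shock variance
sigma_z2 = 4.0   # relative (real) price shock variance

# Signal-extraction weight: under normality, the least-squares share of an
# observed price movement attributed to the relative component.
theta = sigma_z2 / (sigma_z2 + sigma_p2)

def perceived_relative_price(p_i):
    """Optimal linear estimate of the relative component from the
    observed island price p_i = p + z_i."""
    return theta * p_i

# A purely nominal shock (aggregate p = 2, relative z_i = 0) is partly
# misread as a relative price rise, so supply responds even though
# nothing real has changed.
nominal_shock_response = perceived_relative_price(2.0)
```

Because the weight lies strictly between zero and one, even fully rational agents respond partially to nominal disturbances, which is the mechanism the LSH exploits.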
One might have expected the ‘New Classical School’ to view the economy as essentially stable but to display erratic movements by virtue of being hit by a series of random shocks, as postulated by the Monte Carlo hypothesis. Instead Lucas (1975) utilised a modified version of the acceleration principle, much favoured in Keynesian cycle models, to explain the persistence of the effects of the shocks and the findings of the NBER - which he then distilled into a number of ‘stylised facts’ discussed in Lucas (1977).
Although an equilibrium theory of the cycle, in the sense that all markets were assumed to ‘clear’ continuously, the Lucas (1975) model remained in the Frisch-Slutsky tradition that had dominated Keynesian econometric model building. The cycle was not endogenous. It was instead driven by external random shocks. In the Lucasian EBC and related models,4 these were a combination of real shocks, entering via random error terms, and nominal shocks, caused by unanticipated changes in the growth of money supply.
(For further discussion, see Mullineux 1984.) These shocks were transformed or ‘propagated’ by the Lucasian EBC models to generate a cyclical output, the cycle being assumed to occur around a log-linear trend. In the absence of the modified accelerator, the Lucas (1975) model would have generated a Monte Carlo cycle. Barro (1981, Ch. 2) considers alternative sources of the persistent effects of shocks which introduce the rigidity necessary to lock economic agents into incorrect decisions.
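The Frisch-Slutsky distinction between impulse and propagation can be illustrated with a short simulation: a damped second-order linear difference equation, standing in loosely for a modified accelerator, converts serially uncorrelated random shocks into a sustained, irregular cycle. The coefficients and the second-order form are illustrative assumptions, not Lucas's specification.

```python
import random

def simulate(a=1.4, b=-0.8, periods=200, shock_sd=1.0, seed=1):
    """Second-order linear propagation: y_t = a*y_{t-1} + b*y_{t-2} + e_t,
    where y is a deviation from a log-linear trend."""
    random.seed(seed)
    y = [0.0, 1.0]                       # arbitrary initial deviations
    for _ in range(periods):
        e = random.gauss(0.0, shock_sd)  # serially uncorrelated impulse
        y.append(a * y[-1] + b * y[-2] + e)
    return y

# With these coefficients the roots lie inside the unit circle, so the
# deterministic cycle is damped and dies away without fresh impulses.
damped = simulate(shock_sd=0.0)

# A continuing stream of shocks keeps the oscillation alive indefinitely.
driven = simulate(shock_sd=1.0)
```

This is precisely why models in the Frisch-Slutsky tradition are not endogenous cycle theories: remove the shocks and the cycle vanishes.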
As a result of the major recession, or depression, experienced in many Western economies in the early 1980s, interest in business cycles was sustained and the EBC approach remained dominant. The practical applicability of the modern PBC and Lucasian EBC models was, however, increasingly questioned. Despite the fact that journalistic economic commentary continues to imply that government economic policy-making is heavily influenced by the stage of the electoral period, mainstream academic interest in the PBC has not been sustained in the 1980s.
This is perhaps surprising in light of the widespread belief that the large US budget deficit was a major cause of the worldwide stock market crashes in October 1987 and that no significant action on the deficit was likely prior to the November 1988 US Presidential election.
There were also suggestions that the US Federal Open Market Committee (FOMC) had been unwilling to raise interest rates in the run-up to the election, despite a buildup of inflationary pressures. In the United Kingdom, there has been the widespread acknowledgment of the contribution of Chancellor Nigel Lawson’s tax-cutting budget to the 1987 Conservative Party election victory. Not only has the modern PBC approach been largely dismissed, but so too has the Lucasian EBC model. It received little attention at the NBER conference reported in Gordon (1986) and was rejected in Gordon’s overview of the conference (Gordon (ed.) 1986, p.9).
Mullineux (1984, Ch. 5) concluded that, although apparently contradictory, the modern PBC and Lucasian EBC models could be usefully developed and merged using game theory. The major weakness of the PBC theory was the incredible naivety of the electorate, who were assumed to allow themselves to be repeatedly conned by the government in a way that seemed to be inconsistent with rationality.
A weakness of the EBC was the assumption that the government was completely benign and would never be tempted to manipulate the economy in its own interests. Backus and Driffill (1985a, b) have shown that, under rational expectations, the PBC can indeed be rationalized using game theory. Building on the work of Barro and Gordon (1983a, b), they show that a government can exploit the time inconsistency identified by Kydland and Prescott (1977).
By building up credibility for its anti-inflationary policies, or an anti-inflationary reputation, the government can create an opportunity to manipulate the economy to improve its chances of winning the next election. The electorate, which Backus and Driffill assume to be atomistic, has effectively only one strategic choice in this game. That is to form expectations of inflation on the basis of the rational expectations hypothesis (REH). It turns out that they can indeed be duped as long as the government’s anti-inflationary reputation or credibility is not dissipated.
This point is apparently illustrated by recent UK experience. Prior to the 1983 election, UK Chancellor Geoffrey Howe can be regarded as being in the process of building an anti-inflationary reputation. He declined to engineer a pre-election boom in 1983, paving the way for the subsequent Chancellor, Nigel Lawson, to boost the economy by relaxing monetary policy and cutting taxes.
This is widely acknowledged to have contributed to the electoral success of the Thatcher government in the summer of 1987. By the summer of 1988, however, it appeared that anti-inflationary credibility was evaporating rapidly. Broad money targets had been abandoned in the 1987 tax-cutting budget and it appeared that a policy of shadowing the EMS had been adopted instead.
Uncertainty over the status of the exchange rate policy was increased significantly by a public disagreement between the Chancellor and the Prime Minister prior to the 1988 tax-cutting budget. As a result, the Chancellor was forced to encourage a rise in bank base rates from 7 percent in March to 12 percent in August.
Further progress along the lines of Backus and Driffill could perhaps be made if the assumption of an atomistic electorate were to be dropped, the existence of trade unions recognized and wage demands regarded as a strategic variable.5 The economic game would then become richer and the two-player framework would need to be extended to allow for additional players.
In the 1980s there has been renewed interest in the analysis of economic decision-making under uncertainty, especially by ‘New Keynesians’. Zarnowitz’s survey article on business cycles (1985, section: The Unfinished Research Agenda), draws attention to the fact that in models utilizing the REH, economic behavior is guided by subjective probabilities which agree, on average, with the true frequencies of the events in question.
The REH therefore deals with risk, rather than uncertainty, in the sense of Knight (1921) and Keynes (1936). In particular, there is no uncertainty as to what the objective probabilities are. Lucas (1977, p. 15) goes so far as to argue that in cases of uncertainty, economic reasoning will be of no value. Shackle (1938) argued long ago that the view that we cannot theorize rationally about conduct that is not completely rational has inhibited the development of economic thought and caused a preference for the Walrasian equilibrium framework.
It may also have generated widespread acceptance of the REH but as a result of this renewed interest in the implications of uncertainty, the applicability of the REH is being increasingly questioned.
Meltzer (1982) distinguishes formally between uncertainty, which is associated with variations in nonstationary means resulting from permanent changes in economic variables, and risk, which is associated with transitory, random deviations from stable trends. He argues that uncertainty should form an essential part of an explanation of the persistence of business-cycle contractions by allowing permanent changes to occur without being identified immediately.
Stochastic shocks, he argues, have permanent and transitory components which cannot be reliably separated and new information cannot completely remove the confusion. The rational response to such shocks may well be adaptive, taking the form of gradual adjustments of prior beliefs about the permanent values of endogenous variables.
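Meltzer's argument can be sketched with a standard 'local level' filtering example, in which the observed series is a random-walk permanent component plus transitory noise. The Kalman filter used here is a conventional device for such models, not Meltzer's own formulation, and all parameter values are assumed.

```python
import random

def kalman_level(observations, var_perm=0.1, var_trans=1.0):
    """Track the unobserved permanent level of a series that is a random
    walk plus transitory noise; returns the sequence of beliefs."""
    belief, p = 0.0, 1.0               # prior mean and variance
    beliefs = []
    for y in observations:
        p = p + var_perm               # predict: the random walk adds uncertainty
        gain = p / (p + var_trans)     # weight placed on the new observation
        belief = belief + gain * (y - belief)  # partial, gradual revision
        p = (1.0 - gain) * p
        beliefs.append(belief)
    return beliefs

# Simulate a permanent (random walk) component observed with transitory noise.
random.seed(0)
level, data = 0.0, []
for _ in range(300):
    level += random.gauss(0.0, 0.1 ** 0.5)       # permanent shock
    data.append(level + random.gauss(0.0, 1.0))  # plus transitory noise

beliefs = kalman_level(data)
```

Because the gain lies strictly between zero and one, beliefs about the permanent level adjust only gradually to each surprise, which is exactly the adaptive response to confounded permanent and transitory shocks that Meltzer describes.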
If uncertainty prevails, then economic agents cannot behave atomistically6 - as they are implicitly, in models employing the REH, and explicitly, by Backus and Driffill (1985a, b), assumed to do. Under uncertainty, instead of forming expectations independently, agents must take account of the weight of opinion guiding the activities of other agents in the manner of the Keynes (1936, p. 156) ‘beauty contest’ example. Such complications are ignored under the REH and by the New Classical models that employ it.
Gordon (ed.) (1986, p.9) identifies two further problems with the New Classical EBC models. The first is their inability to explain how an information barrier of a month or two could generate the output persistence observed in the typical four-year US business cycle and in the twelve-year Great Depression.
This is perhaps a bit harsh considering that Lucas (1975) generated persistence using a modified accelerator and the information lag merely gave time for decisions about investment to be made based on misperceptions. Further, the incompleteness of information upon which the investment decisions are made can be alternatively rationalized as being the result of costs of processing information rather than a lag in its availability - a view that Lucas (1987) seems to hold.
The second problem is the internal inconsistency of stale information itself. If it is solely responsible for the phenomenon of business cycles, Gordon argues, then one would have expected an ‘information market’ to develop to defuse the ‘signal extraction problem’. This would also go some way towards removing the alternative rationalization of the information deficiency, based on the cost of processing information, if the market could provide processed information at a sufficiently low price. Lucas would presumably counter, in line with the arguments in Lucas (1987), that because of the excessive volume of information to be processed this would not be the case.
Alternatively, it can be argued that the information market cannot provide full information if the world is one in which uncertainty plays an important role, but it can, and probably already does, provide a picture of how other agents view the economy.
As well as attempting to model economic decision-making under uncertainty, the New Keynesian School, rather than accept the New Classical approach of deriving macroeconomic theory from traditional microeconomic foundations, is attempting to derive rationalizations for the key Keynesian rigidities from alternative micro-foundations. Greenwald and Stiglitz (1987) review some of the contributions to the New Keynesian literature, such as efficiency wage models, which attempt to explain wage rigidities, and analyses of equity and credit rationing and the implications of the latter for the role of monetary policy. The analysis relies heavily on imperfect information and asymmetries of information and, as such, is related to the work of Lucas, referred to above, and Friedman (1968), who used information asymmetries between workers and firms to explain how a rise in the money supply could have real effects in the short term.
Greenwald and Stiglitz (1987) claim that New Keynesian economics provides a general theory of the economy, derived from microeconomic principles, that can explain the existence of an equilibrium level of unemployment, based on efficiency wage theories, and business cycles. The latter result from the effect of shocks on the stock of working capital held by firms.
They note that even in the absence of credit rationing, firms’ willingness to borrow would be limited by their willingness to bear risk. Given risk aversion, the fixed commitments associated with loan contracts imply that, as the available working capital declines, the probability of bankruptcy associated with borrowing increases. Thus a reduction in working capital will lead to a reduction in firms’ desired production levels, and it takes time to restore working capital to normal levels.
The effects of aggregate shocks will, therefore, persist. They also argue that sectoral shocks (e.g. oil price shocks) will have redistributional effects via their influences on the stocks of working capital of firms in various sectors and, because it takes time to restore working capital to desired levels, there will be aggregate effects.
It is clear that the New Keynesian school also tends to adopt the linear Frisch-Slutsky approach and there are certain similarities with Lucasian EBC models in that incomplete information is stressed and shocks have persistent effects because of their influence on capital investment decisions. The New Keynesians do, however, stress that markets are not perfect and that markets do not clear continuously, as assumed by New Classical economists.
They accept that the decisions of economic agents are based on future expectations and influenced by past decisions but reject the view that individuals have perfect foresight or rational expectations concerning the future. Instead, they postulate that events which economic agents confront appear to be unique and that there is no way that they can form a statistical model predicting the probability distribution of outcomes, as assumed in Lucasian EBC models. Decisions are made under uncertainty rather than risk.
Despite these criticisms, the EBC approach itself has not been abandoned. In place of the Lucasian model, real EBC (RBC) models have proliferated. These usually retain the REH but reject the information deficiencies inherent in the Lucasian EBC models. The RBC models retain the Frisch-Slutsky approach but postulate that real, as opposed to unanticipated monetary, shocks are the major source of impulses.
The contributions of Kydland and Prescott (1982) and Long and Plosser (1983) have proved very influential. Kydland and Prescott relied on ‘time to build’ to form the basis of a propagation model which converts technology shocks into a cyclical output. These models will be discussed further in section: Equilibrium Business Cycle (EBC) Modelling. The coexistence of Lucasian EBC and RBC models in the early 1980s revived an old debate about whether the cycle is primarily real or monetary in origin. At present, the theoretical literature appears to be dominated by the view that it is real.
Some of the empirical studies reported in Gordon (ed.) (1986) attempted to identify the major sources of shocks and more work clearly needs to be done in this area. The issue runs deeper, however, because it is also important to decide on the roles real and monetary factors play in the models that propagate these shocks. Acknowledgment of this is evident in the synthetic EBC models suggested by Lucas (1987) and Eichenbaum and Singleton (1986), who attempt to introduce money into RBC models; this issue will be discussed further in section: Equilibrium Business Cycle (EBC) Modelling.
While conforming to the linear Frisch-Slutsky modeling strategy, the RBC approach does attempt to integrate growth and cycle theory by analyzing stochastic versions of the neoclassical growth model. It does not attempt to derive a truly endogenous theory of the cycle. The majority of RBC models stress technological shocks and are therefore related to the work of Schumpeter (1935, 1939), which is discussed in section: Schumpeter on Economic Evolution, and Shackle (1938), which is discussed in section: Shackle on the Business Cycle. Schumpeter and Shackle, however, argued that innovations occurred in ‘swarms’ when favourable economic conditions prevailed, rather than as a random series of exogenous shocks.
The role of innovations and their dissemination has not, however, been completely neglected in the cycle literature. The experiences of the 1970s and 1980s have stimulated a revival of interest in long cycles or waves,7 with the deceleration of growth following the 1950s and 1960s being interpreted as a downswing in the long wave (e.g. Mandel 1980). An alternative, and perhaps dominant, view is that the deceleration marked a response to major historical episodes in the form of oil price shocks.
The long cycle literature includes contributions which emphasise the impact of major innovations and the dissemination of technological innovations (e.g. Metcalfe 1984). No attempt will be made to review the literature in this chapter, but if innovations contribute to the business cycle, as well as the long cycle, then it will be important to pay attention to their dissemination.
To do this, sectoral models that allow for input-output interactions, such as those developed by Goodwin and Punzo (1987), will be required and the tendency towards reduced form vector autoregressive (VAR) modelling will have to be reversed in favour of a more structural approach.
The relative roles of induced and autonomous investment remain unresolved but explanation of the dynamic economic development process will require a judicious mix of these elements with various ‘multipliers’. Hicks (1950) used a trend in autonomous investment to bring about a gradual rise in the floor, while Kaldor (1954) and Goodwin (1955) groped for a richer mix to explain both the business cycle and growth. The division between growth and cycle theory had followed the decision by Harrod (1936) to concentrate on growth, leaving the cycle to be explained by multiplier-accelerator interaction.
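The Hicksian mechanism can be sketched numerically. The rendering below is stylised rather than Hicks's own specification: the coefficients, the scrapping limit on disinvestment, and the full-employment ceiling are all illustrative assumptions. Its point is that a trend in autonomous investment drags the floor of the cycle upward over time.

```python
def hicks_path(c=0.6, v=2.0, a0=10.0, g=0.01, d=2.0, periods=120):
    """Income path for a multiplier-accelerator model with a trended
    floor and ceiling. Induced disinvestment is limited to -d."""
    y = [30.0, 31.0]                          # arbitrary initial income levels
    for t in range(periods):
        autonomous = a0 * (1.0 + g) ** t      # trended autonomous investment
        induced = max(v * (y[-1] - y[-2]), -d)  # accelerator, scrapping limit
        y_next = c * y[-1] + induced + autonomous  # multiplier plus investment
        ceiling = 40.0 * (1.0 + g) ** t       # full-employment ceiling (assumed)
        y.append(min(y_next, ceiling))
    return y

path = hicks_path()

# The floor income implied by autonomous investment, (autonomous - d)/(1 - c),
# rises with the trend in autonomous investment.
floor_start = (10.0 - 2.0) / (1.0 - 0.6)          # floor at t = 0
floor_end = (10.0 * 1.01 ** 119 - 2.0) / (1.0 - 0.6)  # floor at t = 119
```

Income is trapped between a rising floor and a rising ceiling, so the explosive accelerator generates bounded fluctuations around an upward path, which is the sense in which Hicks combined cycle and growth.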
Goodwin and Kaldor, however, felt that the role of innovations in stimulating cyclical growth, as stressed by Schumpeter, had been overlooked and that undue stress had been placed on induced investment via the acceleration process and autonomous investment - which was regarded as a separable trend resulting from replacement investment and a steady stream of innovations. In particular, the Schumpeterian bunching of innovations had been ignored.
They expressed the view that an integrated theory of dynamic economic development was required. As a consequence, it is incorrect to decompose economic time series into a linear trend and business cycle components because they are part of the same process and the statistical trend has no economic meaning. Nevertheless, Goodwin (1955) combined a nonlinear accelerator with a trend in autonomous investment, while Hansen (1951) made autonomous investment depend on a steady stream of innovations.
Attempts to develop a theory of dynamic economic development will be discussed further in Chapter 4. In the next section the equilibrium approach, which adopts the linear Frisch-Slutsky approach and which was dominant in the 1980s, will be discussed and in the final section of this chapter recent developments in nonlinear business cycle modeling will be reviewed.
Before discussing the equilibrium theories of the business cycle, another influential contribution to the literature should be mentioned. Azariadis (1981) considers the possibility that, under uncertainty, business cycles are set in motion by factors, however subjective, that agents happen to believe to be relevant to economic activity.
Such factors could include Keynes’s ‘animal spirits’, consumer sentiment, pronouncements of Wall Street gurus, the growth of certain monetary aggregates, or even sunspots if a sufficient number of people naively believe they influence economic activity, as Jevons (1884, Ch. 7) asserted they did. ‘Sunspot theories’ demonstrate that extraneous or extrinsic uncertainty is consistent with, and commonly associated with, rational expectations equilibria in an aggregate overlapping generations model with no price rigidities and continuous market clearing.
Azariadis finds that even well-behaved economies typically allow rational expectations equilibria in which expectations themselves spark cyclical fluctuations. This is because if individuals naively believe in indicators of future prices, such as sunspots or perhaps certain monetary aggregates, they take actions that tend to confirm their beliefs. These self-fulfilling prophecies are a source of indeterminacy which augment the multiplicity of equilibria that typically emerge in generalised monetary models with perfect foresight.
A significant proportion of the equilibria may result from self-fulfilling prophecies and resemble perpetual cycles. It is also possible that equilibria resembling permanent ‘recessions’ or permanent ‘booms’ will result. Azariadis shows that such ‘perpetual’ and ‘permanent’ states may unravel if the uncertainty is reduced by introducing contracts, or as a result of the development of financial markets for claims contingent on predictions, which permit hedging.
Equilibrium Business Cycle (EBC) Modelling
EBC models differ from the traditional Keynesian business cycle models, which incorporate multiplier-accelerator interaction and wage-price rigidities, in assuming that markets clear continuously throughout the cycle. In this respect they also clearly differ from the alternative Keynesian models which stress constrained demand,9 the New Keynesian models and models of financial instability, such as that of Minsky, which will be discussed in section 3.3.
In the disequilibrium models stressing constrained demand, the importance of the interdependence of markets and sectors is highlighted. This feature is assumed away by the ‘market islands’ paradigm (which originated in Phelps 1972) employed in Lucasian EBC models10 and consequently, the linkages underlying the Keynesian multiplier are decoupled. The cycle generated by Minsky’s disequilibrium model is much more nearly endogenous than in traditional Keynesian multiplier-accelerator formulations, which rely heavily on external shocks.
The potential for financial crises is generated endogenously by the interplay of real and financial factors and crises can be sparked by exogenous shocks or endogenous shocks, such as shifts in Keynesian ‘animal spirits’ leading to reduced optimism. The cycle is normally originated by real factors but money and credit play a significant role in the propagation of the cycle and are primarily responsible for speculative booms and crises. This contrasts with the Lucasian EBC model which utilizes monetary shocks to derive a cycle from a real propagation mechanism.
Dissatisfaction with the Lucasian EBC model, which relies on monetary shocks and on price misperceptions that arise from limited information and create a ‘signal extraction problem’ despite the assumption of RE, prompted experimentation with alternative models. Boschen and Grossman’s (1982) paper was particularly influential in persuading New Classical economists to abandon the Lucasian approach and to focus on real factors instead.
They found contemporaneous monetary data to be significantly (partially) correlated with real activity, as measured by output. This is inconsistent with the Lucasian approach, which stresses incomplete information and the role of unanticipated and unobserved, rather than anticipated or observed, monetary shocks, in particular, as a determinant of macroeconomic fluctuations. Tests of the rational expectations and structural neutrality hypotheses, which underlie the Lucasian approach, are discussed further in Mullineux (1984).
Zarnowitz (1985) distinguishes three main alternative approaches. The first was to reimpose Keynesian wage and price rigidities by introducing explicit (Taylor 1979, 1980a) and implicit (Okun 1980) multi-period contracts into RE models. These contracts were then given a microeconomic rationalization in the New Keynesian literature (Greenwald and Stiglitz 1987).
The second approach was to emphasize the role of the interest rate and harked back to the work of Wicksell, Myrdal, Keynes and Shackle, who had attempted a synthesis of Keynes’s ‘General Theory’ with the work of Myrdal (see Shackle 1938 and section 4.4). This approach emphasizes the importance of deviations from the natural rate of interest.11 Additionally, McCulloch (1981), in the Lucasian incomplete information mould, argued that business cycles are associated with unanticipated changes in the rate of interest that misdirect investment, leading to an incorrect mix of capital goods, and distort the intertemporal production process in a Hayekian manner.
McCulloch’s models stress the role of the financial sector in adding to instability through ‘misintermediation’ or ‘mismatching’, which results when banks raise short-term finance to lend for longer periods. This adds to the uncertainty surrounding the rate of interest by creating imbalances in the term structure of interest rates, McCulloch argues. In another equilibrium model, Grossman and Weiss (1982) utilise a mix of random real shocks which affect production and the real interest rate, leading to investment and output fluctuations, and monetary shocks which affect inflation and the nominal interest rate, leading to amplification of the cycle.
As in Lucasian EBC models, a ‘signal extraction problem’ arises from trying to infer ex ante real interest rates and inflation from observed nominal interest rates when agents cannot observe relative rates of return.
The third approach focuses on real factors and has led to the development of a class of EBC models called real business cycle (RBC) models. Zarnowitz observes that it entails the most extreme reaction to Friedmanite monetarist and Lucasian monetary shock theories by strong believers in general equilibrium models and the neutrality of money.
While retaining the REH, these models assume that information is publicly and costlessly available. The ‘signal extraction problem’ that is such an important ingredient of Lucasian EBC models is thus defused and consequently unanticipated monetary shocks play no role.
The Lucasian approach is highly aggregated and effectively assumes a single product is produced on each market island. Black (1982) rejects the single goods approach to modelling. He argues that because specialisation increases efficiency, multi-product models must be considered. In his model unemployment can be explained by the fact that the effects of a large number of partly independent shocks hitting different sectors of the economy will persist for a considerable time because rapid transfers of resources are costly and will be more so the greater the specialisation.
The most influential contributions to the RBC literature have probably been those of Kydland and Prescott (1982) and Long and Plosser (1983). Long and Plosser adopt a highly restrictive formulation, which assumes the following:
- Rational expectations.
- Complete current information.
- No long-lived commodities.
- No frictions or adjustment costs.
- No government.
- No money.
- No serial correlation in shock elements.
While not disputing the explanatory power of hypotheses inconsistent with these stringent assumptions, their aim was to focus on the explanatory power of fundamental hypotheses about consumer preferences and the production process. The preference hypothesis employed implies that consumers will want to spread unanticipated increases in wealth over both time and consumer goods, including leisure. There is, therefore, persistence in the effects of changes in wealth since they will alter the demand for all goods.
The production possibility hypothesis allows for a wide range of intra- and intertemporal substitution opportunities. In emphasising the latter, their approach is consistent with the Lucasian EBC approach (see Lucas 1977 in particular), but it does not incorporate a Lucasian ‘signal extraction problem’ because it assumes complete information.
The model allows for the interplay of the preference and production hypotheses and is used to analyse their cyclical implications. As in Black (1982), the business cycle equilibrium is preferred to non-business cycle alternatives because agents are willing to take risks to achieve higher expected returns. Random shocks are added to the outputs of numerous commodities, which are used for consumption or as inputs.
Input-output relationships propagate the effects of output shocks both forward in time and across sectors but, unlike Black’s model, there are no adjustment costs and so unemployment is difficult to explain. Despite the problems with the model, identified by Zarnowitz (1985, p.567), it usefully stresses the role of input-output relationships and in so doing is richer than the Kydland and Prescott (1982) single product model and moves away from the ‘islands’ hypothesis, inherent in Lucasian EBC models, which effectively denies the multiplier process.
Long and Plosser use stochastic simulations with random shocks to test whether their propagation model can produce a realistic cyclical output. The propagation mechanism is found to display damped cyclical responses following a shock and can, therefore, generate cycles in the Frisch-Slutsky tradition if hit by a series of shocks of suitable frequency and size. Further, comovements in industrial outputs can be identified as a result of input-output relationships.
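The forward and cross-sector propagation can be illustrated with a stylised three-sector experiment in the spirit of Long and Plosser, in which sectoral (log) outputs evolve according to an input-output share matrix. The matrix below is an assumed illustration, not their calibration; the point is that a shock to one sector spreads to the others over subsequent periods, producing comovement.

```python
# Input-output shares: entry A[i][j] is the weight of sector j's output
# as an input to sector i. Row sums below one imply damped responses.
A = [[0.4, 0.2, 0.1],
     [0.1, 0.5, 0.2],
     [0.2, 0.1, 0.4]]

def step(y, shocks):
    """One period of propagation: y_t = A y_{t-1} + shocks."""
    return [sum(A[i][j] * y[j] for j in range(3)) + shocks[i]
            for i in range(3)]

# Impulse response: hit only sector 0 at t = 0, then no further shocks.
y = step([0.0, 0.0, 0.0], [1.0, 0.0, 0.0])
impulse_path = [y]
for _ in range(20):
    y = step(y, [0.0, 0.0, 0.0])
    impulse_path.append(y)
```

One period after the impulse the other two sectors are already moving, and the whole response decays geometrically, so a continuing stream of sectoral shocks of suitable size is needed to sustain the cycle, exactly as in the Frisch-Slutsky tradition.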
Kydland and Prescott (1982) modify the neoclassical equilibrium growth model (e.g. Solow 1970) by introducing stochastic elements and an alternative ‘time to build’ technology in place of the constant returns to scale neoclassical production function and, in so doing, also reject the adjustment cost technology often emphasised in empirical studies of aggregate investment behaviour.
Their approach is, therefore, to integrate neoclassical growth theory with cycle theory. They aim to explain the cyclical variation of short-term economic time series and especially the autocorrelation of output and the covariation of real output and other series. The main modification of the standard neoclassical growth model is the assumption that multiple periods are required to build new capital goods and only finished capital goods are part of the productive capital stock.
The assumed preference function admits a great deal of intertemporal substitution of leisure, in line with Lucasian EBC models. This feature does not increase persistence in their model. The persistence of the effects of shocks is instead the result of the ‘time to build’ assumption.
The technology parameter is subject to a stochastic process with two components which differ in their persistence. Productivity itself cannot be observed but an indicator or ‘noisy’ measure of it can be observed at the beginning of the period. Consequently a ‘signal extraction problem’ is present, but it is different from the Lucasian one.
The technology shock is the sum of a permanent and a transitory component, in the manner of permanent income models (Friedman 1957). The permanent component is highly persistent, and shocks are therefore autocorrelated.12 When the technology parameter grows smoothly, steady state growth prevails but when it is stochastic, cyclical growth results. When estimated and empirically plausible parameters are introduced to the essentially linear model, investment varies three times as much as output and consumption only half as much.
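The shock structure can be written out schematically (a stylised rendering, not Kydland and Prescott's exact specification):

```latex
% Technology shock z_t as the sum of a highly persistent ("permanent")
% component p_t and a transitory component e_t:
\begin{aligned}
  z_t &= p_t + e_t, \\
  p_t &= \rho\, p_{t-1} + u_t, \qquad \rho \approx 1, \\
  e_t &\;\text{i.i.d.}, \qquad u_t \;\text{i.i.d.}
\end{aligned}
```

With $\rho$ close to one the permanent component dominates, so the technology parameter inherits strong serial correlation, which is what the model requires to match the autocorrelation of output.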
Kydland and Prescott found that most of the variation in technology had to come from the permanent component in order for the serial correlation properties of the model to be consistent with postwar US data. Their results proved to be sensitive to the specification of the investment technology and the ‘time to build’ lag is important, but the cycle is not particularly sensitive to the length of the lag.11 Experiments with adjustment costs as an alternative source of persistence and lags proved unfruitful.
This contrasts with the Black (1982) analysis but may reflect the lack of specialisation. Kydland and Prescott themselves felt that the introduction of more than a single type of productive capital with different ‘time to build’ and patterns of resource requirements would improve the performance of the model. As in Lucas (1975) there is a ‘signal extraction problem’, and capital is used to create persistence but the approach is different. Lucas uses a modified accelerator and relies on random monetary shocks while Kydland and Prescott use ‘time to build’ and autocorrelated real technology shocks.
Zarnowitz (1985) notes that the Kydland and Prescott model consequently lacks the random monetary shocks which EBC theorists had previously regarded as an essential propagator of the business cycle (see also Taylor 1980b). The models of Long and Plosser and Kydland and Prescott are representative agent models with complete markets, so that credit does not enter into the determination of real quantities. King and Plosser (1984) consider an extended version of an RBC model in which certain, perhaps informational, ‘frictions’ in private markets lead to the creation of institutions specialising in the issuance of credit.
They conclude that while credit may have a role to play in the propagation mechanism, the actions of the Federal Open Market Committee (FOMC) may not be an important independent source of fluctuations in real quantities and relative prices.
As a class, RBC models analyse the role of basic neoclassical factors in shaping the characteristics of economic fluctuations. In particular, they concentrate on the specification of preferences, technology and endowments in order to derive a stochastic propagation model. RBCs are driven by shocks entering the economic system via a number of channels, including those to technology and preferences. Within the basic neoclassical model it has proved necessary to incorporate substantial serial correlation in the productivity shocks, which are most commonly utilised in RBC models, to allow the generation of fluctuations resembling the post-war US experience.
This was evident in Kydland and Prescott (1982) and Prescott (1986) and may imply that stochastic growth is at the heart of observed economic fluctuations, as postulated by Nelson and Plosser (1982). RBC models can be regarded as stochastic versions of the neoclassical growth model and as permitting a unified analysis of growth and the cycle. They imply that the business cycle is exactly what should be expected to emerge from industrial market economies in which consumers and firms solve intertemporal optimisation problems in a stochastic environment.
RBC models commonly assume the following:
- The existence of a complete set of contingent claims to future goods and services based on the nonseparability of time preferences.
- That agents have common information sets.
- That the only frictions are due to technological factors modelled variously as ‘time to build’ or costs of adjustment.
They normally abstract entirely from monetary considerations and from the fact that exchange in modern economies uses money as a medium of exchange. They implicitly assume that monetary shocks have played an insignificant role in determining the behaviour of real variables.
To assess the importance, in the post-war period, of the RBC explanation of cycles, which assigns little or no role to monetary shocks, Eichenbaum and Singleton (1986) present and interpret evidence on the importance of monetary shocks as determinants of real economic activity. They note that in empirical investigations of RBC models it is common to assure a good fit by choosing the stochastic process appropriately. Little consideration had been given to the extent to which RBC models emerge as special cases of monetary models of the business cycle.
Acknowledging that the work of King and Plosser (1984) had come closest to such an exercise, they derive an RBC model with money introduced using a cash-in-advance constraint of the type considered in Lucas and Stokey (1984). Eichenbaum and Singleton then use the model to investigate Granger-causal relationships between nominal and real aggregates and find little empirical support for the proposition that monetary growth or inflation Granger-cause output growth.
They interpret this to mean that exogenous shocks to monetary growth are not an important independent source of variation in output and growth in the United States in the post-war period and consequently that real shocks are the predominant source of variation in real quantities over the cycle. In itself this does not imply that the RBC propagation model accurately characterises the economic environment, however, and in any case Granger-causality is difficult to interpret when expectations are taken account of. Like Lucas (1987), they conclude that acceptance or rejection of RBC models must be based in part on the plausibility of the variances and autocorrelations of technology shocks employed to generate realistic cycles.
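A Granger-causality test of the kind at issue here can be sketched as a small F-test: regress one series on its own lags with and without lags of the other series, and ask whether the extra lags reduce the residual sum of squares significantly. The data below are simulated, and the lag length and coefficients are illustrative; this is not Eichenbaum and Singleton's specification.

```python
import numpy as np

def granger_f(y, x, p=2):
    """F-statistic for H0: lags of x add nothing to an AR(p) of y."""
    T = len(y)
    rows, targets = [], []
    for t in range(p, T):
        rows.append([1.0] + [y[t - i] for i in range(1, p + 1)]
                          + [x[t - i] for i in range(1, p + 1)])
        targets.append(y[t])
    X = np.array(rows); Y = np.array(targets)
    Xr = X[:, : p + 1]                      # restricted: constant + own lags only
    rss = lambda A: np.sum((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(X)
    df = len(Y) - X.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df)

rng = np.random.default_rng(0)
n = 400
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):                       # x feeds into y with a one-period lag
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.standard_normal()
print(granger_f(y, x))   # large F: x 'Granger-causes' y
print(granger_f(x, y))   # small F: no feedback from y to x
```

As the text stresses, failure to find Granger-causality in such a test does not settle the structural question: money can be a determinant of output without Granger-causing it.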
Kydland and Prescott (1982), for example, simply chose variances and levels of persistence of shocks that were consistent with those of observed variables and concentrated on the adequacy of the model for propagating them to give output comparable to US data.
There is, therefore, a need to quantify the various shocks that hit the economy in order to gain an insight into the magnitude and nature of real shocks in the real world, to assess the relative importance of different types of real shocks, and to compare the magnitude and persistence of real and monetary shocks. This can only be done within the context of a model complex enough to accommodate various types of shock.
Eichenbaum and Singleton derive an EBC model for a monetary economy in which monetary growth can have real effects. The cash-in-advance constraint is the only source of non-neutrality in the model. It is an extended version of the Garber and King (1983) model and is closely related to the Long and Plosser (1983) model. They examine the conditions under which the RBC special case, in which the cash-in-advance constraint is not binding, provides an accurate approximation to the monetary economy. They find a constant monetary growth rate to be both necessary and sufficient for the real allocations to be identical in the monetary EBC and RBC versions.
There is clear evidence, however, that monetary growth has not been constant. This alone is not sufficient to dismiss the RBC explanation of aggregate fluctuations, since the cash-in-advance constraint may be incorrectly imposed. They also show that when real shocks to tastes and technology predominate, the RBC will be a good approximation to the monetary EBC model with constant monetary growth, and money may not be seen to Granger-cause output.
Commenting on Eichenbaum and Singleton’s paper, Barro (1986) emphasises their warning that the lack of significance of monetary shocks as a determinant of output does not imply that RBC models are correct. Keynes’s model, as outlined in the ‘General Theory’, is clearly not an RBC model and yet would attribute a major role to endogenous shocks in the form of shifts in the optimism and pessimism, or animal spirits, of investors, while its propagation mechanism would rely on wage-price rigidities. Barro’s main concern with RBC models is the lack of important ‘multipliers’, which leads them to rely heavily on the frequency and size of shocks, and he argues that many economists are sceptical about whether shocks to preferences and technology are large and frequent enough.
He notes that the oil price shocks of the 1970s clearly were large enough but this does not convince him that real shocks alone can explain the cycle throughout the post-war period in the United States.
Mankiw (1986), in another commentary, finds Eichenbaum and Singleton’s conclusions surprising in the light of the work of Sims (1972, 1980) and argues that their failure to establish that money Granger-causes output might not have implications as far-reaching as they suggest. Mankiw uses examples, based on the Fischer (1980) contract model, to demonstrate that money need not Granger-cause output for it to be a determinant of output.
He further notes that Granger-causality is unlikely to be detected for most of the post-war period because the Federal Reserve Board’s goal was to allow money supply fluctuations to stabilise interest rates. Finally, Mankiw finds it suggestive that Eichenbaum and Singleton’s results show some evidence of Granger-causality in the post-1979 period, when money supply targets replaced interest rate targeting. In the concluding discussion of Eichenbaum and Singleton’s paper, Rotemberg questions the use of first differencing, which he believes led to the conflicting results on Granger-causality derived by Sims and Eichenbaum and Singleton.
Like Eichenbaum and Singleton, Lucas (1987) suggests a synthesis of RBC with EBC models, in which monetary shocks play an important role. He regards the Kydland and Prescott (1982) model as a useful definition of the frontier of business cycle research but feels that it incorrectly focuses on real, as opposed to monetary, considerations. Nevertheless he is impressed by its coherence and the fact that it is developed to the point where it can be empirically tested.
Lucas demonstrates that money can be grafted on to the Kydland and Prescott model in a way that has no significant effect on its conclusions. He does not, however, believe money to be neutral and argues that the fluctuations observed in the real world are too large to be induced by a combination of real impulses and Kydland and Prescott’s propagation mechanism. He draws attention to the work of Friedman and Schwartz (1963a, 1982), which clearly implies an influence of money on economic activity. If he is correct, he deduces, then either larger shocks, including monetary shocks, are required or the propagation model must be modified to include larger multipliers.
Lucas points out that the problem is not to account for Friedman and Schwartz’s evidence with a model in which the money supply responds passively to real events but plays no causal role. This is easy to do, he argues, and he demonstrates it using a variant of the Lucas-Stokey (1984) model, as utilised by Eichenbaum and Singleton. This leads to a restatement of the ‘Quantity Theory’ in which money is neutral in the long run but monetary shocks can affect real variables. He argues that the problem is rather to account for real fluctuations without candidate shocks of the right order of magnitude.
Kydland and Prescott (1982), he observes, simply chose the variance of the technology shocks to assure consistency with the observed GNP variability. They did not attempt to provide independent evidence that technological shocks will have the required variance. Lucas doubts this to be the case and advocates that empirical work be done to settle the issue.
Lucas argues that the integration of the Lucas-Stokey (1984) insights with the real dynamics of Kydland and Prescott is slightly beyond the frontier of what is possible. Eichenbaum and Singleton (1984) do, however, make a stab at the impossible by introducing a cash-in-advance constraint into a model related to Long and Plosser (1983). Lucas does speculate about how a hybrid model with preferences and technology for producing goods akin to those postulated by Kydland and Prescott (1982), but trading not centralised, would behave. Instead of the centralised trading assumed by Kydland and Prescott he uses the Lucas-Stokey motivation for using money.
Agents are assumed to trade in securities at the beginning of the period and use the cash acquired in the course of this trading to buy consumer goods later in the period. Lucas then postulates that the money supply is erratic, following a stochastic process with parameters fixed and known by agents, and considers the conditions under which monetary expansion will be associated with real expansions. One possibility would be to retain the full public information assumption, utilised by Kydland and Prescott and Lucas and Stokey, but to introduce price rigidity, perhaps using wage contracts in the manner of Taylor (1979). He adds that because of inflation tax considerations, non-neutrality must be recognised.
Lucas’s preferred way of introducing monetary effects is to integrate Lucasian EBC models with RBC models. He notes that in a multi-product version of the Kydland and Prescott (1982) model, the volume of information would explode even if the full public information assumption is retained. If there is any sort of cost of processing information, then economic agents will economise and process only the information which materially sharpens their ability to make production or investment decisions.
As a result, the ‘signal extraction problem’ will remain and a positive supply response to monetary shocks can be expected. Lucas believes that in a modified version of the Kydland and Prescott (1982) model, elaborated to admit limited information due to costs of processing it, shocks of a monetary origin would be ‘misperceived’ by agents as signalling a change in technology or preferences. Monetary shocks would then trigger similar dynamic responses to the technology shocks considered by Kydland and Prescott.
He notes that Lucas (1975) had relied on ‘misperceptions’ over whether the shocks were real or nominal in origin but had not specified the source of the real shocks. They had been simply introduced through random error terms. Lucas cites Grossman and Weiss (1982), Grossman et al. (1983) and other models, surveyed in Scheinkman (1984), as examples of models employing limited information and allowing for an interplay of real and monetary shocks.
Lucas finally draws attention to the Sargent (1976) paper, which demonstrated the ‘observational equivalence’ of models in which monetary non-neutrality is the result of limited information and those in which money affects real variables in some other way. He concludes his analysis by advocating the use of structural models that lay down specific economic hypotheses for testing, rather than the use of reduced form testing, and of dynamic game theoretic analysis, in the light of the ‘Lucas critique’ (see Lucas 1976). This accords with the conclusions of Mullineux (1984) and Chapter 5 of this book.
The proposed synthetic EBC model, in which real and monetary shocks hit a propagation model describing an economy which is always in equilibrium to produce a cycle, accords little importance to the financial sector as a propagator of cycles. Considered in the next section is recent work by economists who refuse to accept the Frisch-Slutsky approach and argue that nonlinearities must be employed to generate realistic cycles. In the next chapter, work on the financial instability hypothesis, which assigns much more importance to the financial sector in business cycle generation, will be reviewed.
Nonlinear Cycle Theory
Zarnowitz (1985, pp. 540 and 544) argues that nonlinearities are likely to be very common in economic relationships. Business cycle models incorporating nonlinearities can generate limit cycles, which can be regarded as the equilibrium motion of the economy, whereas linear models cannot.
Limit cycles occur when the energy dissipated over a period is compensated for endogenously so that there is neither a gain nor a loss of energy and a steady oscillatory state results. There may be equilibrium points or growth paths in nonlinear models that have limit cycle solutions, but these are unstable and the representative point will tend to follow the cycle instead.
Limit cycle solutions need not be unique and may be stable or unstable. The limit cycle followed will, however, be independent of initial conditions. In the stable limit cycle case, the effect of adding shocks will be to increase irregularity by causing temporary deviations from the equilibrium orbit, which itself need not be symmetric. The sine wave produced by linear models is a conservative oscillator in the sense that no energy is dissipated. The cycle followed by the representative point in the conservative oscillator case will depend on the initial conditions.
When hit by shocks, however, the conservative property leads to an explosive motion and consequently the practical usefulness of conservative models, which are also symmetric in linear formulations, is doubtful - unless a Hicksian approach is to be adopted and ceilings and floors imposed. But this entails dropping the original linearity assumption.
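The contrast between a conservative oscillator and a stable limit cycle can be made concrete with the van der Pol equation, a standard textbook limit cycle (not itself an economic model): trajectories started near the unstable equilibrium and far outside the cycle both settle onto the same orbit, so the eventual amplitude is independent of initial conditions, unlike the conservative case.

```python
def van_der_pol(x0, v0, mu=1.0, dt=0.001, settle=60000, measure=10000):
    """Euler-integrate x'' = mu*(1 - x^2)*x' - x, then record the peak |x|
    over a late stretch of the trajectory (the limit-cycle amplitude)."""
    x, v = x0, v0
    for _ in range(settle):                  # let transients die away
        x, v = x + dt * v, v + dt * (mu * (1 - x * x) * v - x)
    peak = 0.0
    for _ in range(measure):                 # ~10 time units, more than one period
        x, v = x + dt * v, v + dt * (mu * (1 - x * x) * v - x)
        peak = max(peak, abs(x))
    return peak

a_small = van_der_pol(0.1, 0.0)   # start just off the unstable equilibrium
a_large = van_der_pol(4.0, 0.0)   # start well outside the cycle
print(a_small, a_large)           # both settle onto the same orbit
```

A pure sine wave started from these two initial conditions would instead oscillate forever at two different amplitudes, which is why shocks make conservative linear models explosive rather than merely irregular.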
This accounts for the adoption of the linear Frisch-Slutsky approach, in which damped motions occur because energy is dissipated over time but new compensatory energy is supplied by exogenous shocks, as an alternative. The existence of a stable limit cycle implies that the economy will always be gravitating towards an endogenously determined cyclical motion.
Nonlinear models with such solutions can be regarded as modern attempts to derive an endogenous theory of the cycle. They are therefore related to previous attempts to develop endogenous theories of the business cycle, which are reviewed briefly by Zarnowitz (1985), and they contrast with the mainstream linear Frisch-Slutsky models, which rely on external shocks to supply energy and have dominated the post-war business cycle literature.
Some of the recent contributions to the literature on nonlinear business cycles will be reviewed in this section, which aims to update the survey in Mullineux (1984). It should be noted that economists working in this area often try to develop models of dynamic economic development, in which cycles and growth are part of the same macrodynamic process. Some of the ideas introduced in this section will be considered further in Chapter 4.
Chiarella (1986) analyses a model with a nonlinear demand for money function which is dependent on the expected rate of inflation. Initially the money market is allowed to adjust sluggishly with inflation expectations being formed adaptively. Chiarella shows that the model has a stable limit cycle if the expectations time lag is sufficiently small. By allowing the time lag to go to zero, perfect foresight is considered as a limiting case.
It is found that the stable limit cycle persists. Apart from providing an additional demonstration16 that plausible nonlinear functions can result in a stable limit cycle, Chiarella is able to cast light on the dynamic instability problem that arises because perfect foresight, in the sense that rate of change variables such as inflation can be correctly perceived, leads to saddle point instability. The introduction of a nonlinearity removed the instability commonly associated with linear perfect foresight models by replacing the unstable local saddle point equilibrium with a global stable limit cycle equilibrium.
Papers presented at an international symposium on nonlinear models of fluctuating growth17 were published in Goodwin et al. (1984) and included a number of further extensions18 to the Goodwin (1967) growth cycle model as well as other contributions to the theory of fluctuating growth.
The symposium rejected, as analytically unsatisfactory, the simple superposition of fluctuations on growth trends because, it was argued, fluctuations and growth interact in a crucial way. Instead it advocated that a general theory of fluctuating growth be pursued. Such issues will be discussed further in Chapter 4.
A central view of the symposium was that economic fluctuations are a natural, endogenously determined, consequence of the internal dynamic structures and conflicts inherent in capitalist economies and that advanced capitalist economies undergo fluctuations whether or not there is state intervention. Linear models were rejected as inadequate and judged incapable of representing the complex relationships inherent in capitalist economies.
The symposium judged the two keystones to an understanding of why capitalist economies evolve cyclically as having been provided by Marx, who stressed class conflict, and Schumpeter, who emphasised the role of technical progress. Goodwin’s (1967) growth cycle model brought these two elements together using a model, drawn from biology, of the symbiotic relationship between predator and prey populations, with capitalists as predators and workers as prey, and spawned a series of studies on Marx/Goodwin cycles.
These are listed in Goodwin et al. (1984). In Goodwin’s seminal paper, technical progress was introduced as a trend and steady growth of the labour supply was also assumed. Growth was introduced via two log-linear trends and was not fully integrated with the cyclical motion around it.
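Goodwin's predator-prey structure can be sketched numerically. The parameter values below (capital-output ratio, productivity and labour-force growth rates, and a linear real-wage Phillips curve) are illustrative only: the wage share and the employment rate chase each other around closed orbits, neither exploding nor settling down, which is the conservative property of the original model.

```python
def goodwin(u0, v0, T=400.0, dt=0.01,
            sigma=3.0, alpha=0.02, beta=0.01, gamma=0.5, r=0.6):
    """RK4 integration of a Goodwin (1967)-style growth cycle:
       u = wage share (the workers' income claim), v = employment rate.
       du/dt = u*(-gamma + r*v - alpha)        real-wage Phillips curve
       dv/dt = v*((1-u)/sigma - alpha - beta)  profit-driven accumulation"""
    def f(u, v):
        return (u * (-gamma + r * v - alpha),
                v * ((1 - u) / sigma - alpha - beta))
    path, (u, v) = [(u0, v0)], (u0, v0)
    for _ in range(int(T / dt)):
        k1 = f(u, v)
        k2 = f(u + dt / 2 * k1[0], v + dt / 2 * k1[1])
        k3 = f(u + dt / 2 * k2[0], v + dt / 2 * k2[1])
        k4 = f(u + dt * k3[0], v + dt * k3[1])
        u += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        v += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        path.append((u, v))
    return path

path = goodwin(0.88, 0.88)
us = [p[0] for p in path]
print(min(us), max(us))   # the wage share oscillates in a band, never settling
```

High employment strengthens wage bargaining and squeezes profits; the squeezed profit share slows accumulation and employment falls, restoring profitability, and the cycle repeats.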
Desai (1973) extended the model by introducing inflation and expected inflation and by allowing the capital-labour ratio, assumed constant by Goodwin, to vary over the cycle as utilisation rates, proxied by employment, changed. The effect of introducing inflation is to complicate the wage bargaining process and has a stabilising influence unless workers are able to incorporate actual wage inflation into their wage demands.
Desai and Shah (1981) further extend the model by reformulating the technical change relationship. They incorporate the Kennedy-Weizsäcker technical change function (discussed in Samuelson 1965) and find that the introduction of induced technical progress changes the stability properties of the model. The conservative oscillations in Goodwin’s original model are the result of the implicit assumption that each side in the class struggle has only one weapon.
Workers can bargain for wages, their bargaining power being dependent on the level of employment, and the capitalists can determine the growth of employment by their investment decisions. Desai and Shah’s model gives the capitalists an additional weapon: the choice of the induced rate of technical progress. As a consequence, an equilibrium point, rather than a conservative cyclical motion, results.
Van der Ploeg’s (1984) contribution to the symposium was to consider the effect of introducing endogenous technical progress, based on Kaldor’s (1957) technical progress function, and allowing workers to save and to receive dividends from share ownership. Goodwin (1967) had assumed that workers consumed all their income. The implication of the analysis is that the class conflict, and with it the cycle, is likely to die away as workers obtain an interest in capitalism.
Di Matteo (1984) considers the implications of introducing money into the Goodwin model and this enables him to examine the interplay between real and monetary factors. Two cases are examined. In the first, the money supply is assumed to be exogenously determined and it is found that the share of profits is inversely related to the rate of growth of the money supply.
If it can control the money supply, the central bank can have a profound effect on the cycle and can in fact adopt a rule to eliminate it if certain initial conditions are satisfied. In the second case, it is assumed that the central bank sets the interest rate rather than the money supply and again, if certain initial conditions are satisfied, the central bank can adopt a rule to eliminate the cycle.
Di Matteo stresses, however, that the analysis is highly abstract since it incorporates no theory of the behaviour of the banking sector. Such a theory is necessary to facilitate an analysis of the interplay between financial and industrial capitalists in the Schumpeterian tradition.
The symposium includes other interesting extensions of Goodwin’s model. Glombowski and Krüger (1984) examine the effects of introducing unemployment benefits while Balducci et al. (1984) use the theory of non-co-operative differential games to analyse the effects of introducing the rational expectations hypothesis (REH) into the model.
Balducci et al. find that the cycle remains under the REH. This indicates that the cycle is due not to myopia but to the fundamental conflicts inherent in economic development under capitalism which the model tries to capture. In an attempt to move away from the high level of aggregation which he now finds unsatisfactory, Goodwin’s contribution was to analyse economic interactions within the framework of multi-sectoral models.
The multi-sectoral approach has subsequently been developed by Goodwin and will be discussed further in section: Goodwin’s Macrodynamics.
Another significant contribution to the nonlinear cycle theory literature is that of Grandmont (1985), who used nonlinearities to generate an endogenous EBC. Unlike the Lucasian EBC and RBC models, it required no shocks to keep it alive and, in contrast to New Classical models, money turned out to be non-neutral. This was despite the fact that as in other EBC models, markets clear, in the Walrasian sense, at every date and in addition traders have perfect foresight.
The latter can be regarded as rational expectations with full information, in contrast with the imperfect information imposed by the ‘market islands’ hypothesis employed in Lucasian EBC models. The equilibrium output is shown to be negatively related to the equilibrium level of the real rate of interest, and the employment of the nominal rate of interest as an instrument of monetary policy is shown to be extremely effective. A simple deterministic counter-cyclical policy can enable the monetary authorities to stabilise business cycles and force the economy back to a unique stationary state.
The source of Grandmont’s endogenous deterministic cycles is the potential conflict between the wealth and intertemporal substitution effects, which are associated with real interest rate movements. Business cycles emerge when the degree of concavity of the traders’ utility functions is sufficiently higher for older than younger agents.
This follows from the assumption that older agents have a higher marginal propensity to consume leisure within the simple structure of an overlapping generations model. Grandmont’s analysis implies that cycles of different periods will typically coexist. He feels his results suggest that economic theorists should look more closely at the sort of mechanisms that might be responsible for significant nonlinearities in the economic system if they wish to have a proper foundation on which to build a sound business cycle theory.
He postulates that relaxation of the ad hoc Walrasian continuous market clearing assumption and the introduction of imperfect competition may lead to the sort of nonlinearities that give rise to endogenous economic fluctuations. It would also reduce reliance on variations in real interest rates through variations in relative prices by allowing quantity adjustment and thus mechanisms akin to multiplier-accelerator effects to play a role.
A sound Keynesian, or non-Walrasian, business cycle theory could then be developed. This, he argues, might form the basis of the ‘New Keynesian’ business cycle research programme, referred to in the introductory section of this chapter, although most contributions20 have so far tended to conform to the linear Frisch-Slutsky approach.
To conclude this section, the contributions of Day (1982) and Varian (1979) are discussed. Varian (1979) employs catastrophe theory to examine a variant of the nonlinear Kaldor (1940) model.21 Kaldor’s model includes sigmoid savings and investment functions that intersect in a manner that generates a stable limit cycle solution (see Chang and Smyth 1970). Catastrophe theory was developed by Thom (1975) (see also Zeeman 1977) to describe biological processes and has since been widely applied.
Catastrophe theoretic models entail a system of differential equations in which the parameters are not constant but change at a much slower rate than the state variables. There are, therefore, essentially two sets of variables. The ‘fast’, or state, variables can be regarded as adjusting towards a short-run equilibrium and the ‘slow’ variables, or parameters, as adjusting in accordance with some long-term process.
Catastrophe theory, therefore, studies the movement of short-term equilibria as long-run variables evolve and would appear to be a particularly useful tool for business cycle analysis and the study of dynamic economic development. When a short-run equilibrium jumps from one region of the state space to another, a catastrophe is said to occur.
Catastrophes have been classified into a small number of qualitative types, the simplest of which is the ‘fold catastrophe’. This occurs when the system contains one ‘slow’ variable and one ‘fast’ variable. For a given value of the slow variable, the fast variable adjusts to a stable equilibrium. If the state space contains ‘bifurcation points’ at which there are abrupt changes in stability characteristics, as in the Kaldor model, then adjustment to a locally stable equilibrium can involve jumps or catastrophes.
Things naturally get more complicated as more fast and slow variables are added. With one fast variable and two slow variables, for example, ‘cusp catastrophes’, which allow jumps and then either fast or slow returns to short-term equilibria, can occur.
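The fold case can be written out schematically (an illustrative potential, not Varian's model):

```latex
% Fold catastrophe: the fast variable x minimises a potential V whose
% shape is controlled by the slow variable a.
\begin{aligned}
  \dot{x} &= -\frac{\partial V}{\partial x}, \qquad
  V(x;a) = \tfrac{1}{3}x^{3} - a\,x, \\
  \dot{x} &= a - x^{2}
  \;\Longrightarrow\; x^{*} = \pm\sqrt{a} \quad (a > 0).
\end{aligned}
```

As the slow variable $a$ drifts down through zero, the stable equilibrium $x^{*}=+\sqrt{a}$ collides with the unstable one and both vanish, forcing the fast variable to jump to a distant region of the state space: the catastrophe.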
Using a cusp catastrophe, Varian shows that if there is a small shock to one of the stock variables in the Kaldor model, a similar story to that analysed using the simpler fold catastrophe unfolds and an inventory recession of minor magnitude results. If the shock is relatively large, however, wealth may decline sufficiently to affect the propensity to save and a depression can result because the recovery can take a long time.
This is related to the idea, discussed by Leijonhufvud (1973),22 that economies operate as if there is a ‘corridor of stability’, within which small shocks are damped out but large shocks are amplified. Large deflationary shocks may, for example, produce financial crises (see Chapter 3) and waves of bankruptcies which throw a normally stable system into a deep depression. Varian suggests that catastrophe theory might usefully form the basis for some further business cycle research.
Day (1982) applies the mathematical theory of chaos, which, like catastrophe theory, is related to bifurcation theory,23 to show that in the presence of nonlinearities and a production lag, the interaction of the propensity to save and the productivity of capital can generate growth cycles that exhibit a wandering, saw-toothed pattern, not unlike observed aggregate economic time series.
These ‘chaotic’ fluctuations need not converge to a cycle of regular periodicity and are not driven by random shocks. Periods of erratic cycling can be interspersed with periods of more or less stable growth. Under such circumstances, the future of the model solution cannot be anticipated from past realisations.
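The flavour of such chaotic dynamics can be conveyed with a minimal sketch. Day’s own model maps capital per head through a humped production relation; the logistic-type one-hump map below is an assumed stand-in for it, and the parameter value 3.9 is an assumption chosen to lie in the chaotic regime, not a value taken from Day (1982).

```python
def growth_map(k, r=3.9):
    # One-hump map: next-period capital as a humped function of current
    # capital. Illustrative stand-in for Day's model; r = 3.9 is an
    # assumed value lying in the chaotic regime.
    return r * k * (1.0 - k)

def trajectory(k0, n, r=3.9):
    """Iterate the deterministic map n times from initial capital k0."""
    ks = [k0]
    for _ in range(n):
        ks.append(growth_map(ks[-1], r))
    return ks

path_a = trajectory(0.300, 60)
path_b = trajectory(0.301, 60)  # an almost identical starting point
# Both paths remain bounded and are fully deterministic, yet they soon
# diverge and never settle into a cycle of regular periodicity.
```

The sensitivity to initial conditions is the point: two economies starting from nearly indistinguishable positions soon follow quite different paths, which is why the future of such a model cannot be anticipated from past realisations even though no random shocks are involved.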
A deterministic single-equation model is found to be consistent with structural change and unpredictability. Such models allow periods of sustained growth, such as that experienced since the early 1980s, but suggest that recent claims that a combination of supply-side initiatives and fine-tuning have eliminated the cycle are likely to prove incorrect.
Day’s work indicates that even if there is substance to the Monte Carlo hypothesis that there are no regular business cycles, economic fluctuations may still be a phenomenon to be reckoned with, and that random shocks may not be as important for driving cycles as the Frisch-Slutsky approach indicates.
Goodwin has employed catastrophe theory and related ideas drawn from bifurcation theory for the analysis of dynamic economic development. He adopts a multi-sectoral approach, which makes nonlinear analysis intractable, and so finds it necessary to make linear approximations, which hold in the short run, and to view the long run as a series of short runs.
The linear approximations are chosen carefully, however, to generate regions of stability and instability between which economic variables can bifurcate back and forth; Goodwin also makes use of fast and slow variables.