HERBERT SIMON, PAUL THAGARD, PAT LANGLEY AND OTHERS ON DISCOVERY SYSTEMS

BOOK VIII - Page 5

Muth’s Rational-Expectations “Hypothesis”

Simon distinguishes three rationality theses: the neoclassical thesis of global rationality still prevailing in academic economics today, his own thesis of bounded rationality, and the rational-expectations hypothesis.  The reader of Simon’s autobiography, however, would never likely guess that about two decades after its first appearance, the rational-expectations hypothesis had occasioned the development of a distinctive type of discovery system, the Bayesian Vector Autoregression or BVAR discovery system.  In fact it is doubtful that even its creator, Robert Litterman, or his colleagues recognize the system as a discovery system, even though it does what discovery systems are intended to do: it makes theories.  This irony is due to the fact that the prevailing philosophy of science in economics is romanticism, which has led economists to call BVAR models “atheoretical.”  But if the term “theory” is understood in the pragmatist sense, the equations created by the BVAR system are economic theories, because they are universally quantified and proposed for empirical testing.  Before taking up the BVAR system, consider the rational-expectations hypothesis.

One of the distinctive aspects of Simon’s autobiography is a chapter titled “On Being Argumentative.”  In this chapter’s opening sentence Simon states that he has not avoided controversy, and he adds that he has often been embroiled in it.  And on the same page he also says that he has usually announced his revolutionary intentions.  But revolutionaries inevitably find reactionaries revolting against them.  In the preceding chapter of his autobiography he describes a tactical retreat in the arena of faculty politics: his eventual decision to migrate from Carnegie-Mellon’s Graduate School of Industrial Administration to its psychology department, which, as it happens, is not an unsuitable place for his cognitive psychology agenda.  This conflict with its disappointing denouement for Simon was occasioned by the emergence of the rational-expectations hypothesis, a thesis that was first formulated by a colleague, John F. Muth, and which was part of what Simon calls the ascendancy of a coalition of economists in the Graduate School of Industrial Administration.

Muth’s rational-expectations hypothesis, which Simon curiously says deserves a Nobel Prize even though he maintains that the hypothesis is unrealistic, was set forth in a paper read to the Econometric Society in 1959, and then published in Econometrica (1961) under the title “Rational Expectations and the Theory of Price Movements.”  Muth explains that he calls his hypothesis about expectations “rational”, because it is a descriptive theory of expectations, and is not just a pronouncement of what business firms ought to do.  The idea of rational expectations is not without pedigree.  It is a variation on an approach in economics known as the Stockholm School, in which expectations play a central rôle, and which Muth references in his article.  Therefore a brief consideration of the Stockholm School is in order, to see how the rational-expectations advocates depart from it, especially in their empirical modeling.

One of the best known contributors to the Stockholm School is 1977 Nobel-laureate economist Bertil Ohlin, who is best known for his Interregional and International Trade (1933), and whose elaboration on the monetary theory of Knut Wicksell anticipated the Keynesian theory in important respects.   He called his theory of underemployment the “Swedish theory of unused resources.”  In 1949 he published his Problem of Employment Stabilization, his macroeconomic theory, which concludes with a critique of Keynes’ General Theory from the Stockholm School viewpoint.  He had earlier published a summary of his “Stockholm Theory of Processes of Contraction and Expansion” as “The Stockholm Theory of Savings and Investment” in the Economic Journal (1937).

In his critique Ohlin draws upon a distinction between the ex ante or forward-looking anticipations perspective and the ex post or backward-looking historical perspective.  The distinction refers not to the viewpoint of economists but to the viewpoint of the economic participants in the economy.  This distinction was first proposed by 1974 Nobel-laureate economist Gunnar Myrdal (1898-1987), Ohlin’s colleague of Stockholm School persuasion and fellow critic of Keynes.  Later in life Myrdal evolved his theory of ex ante perspective into an Institutionalist economic theory, and in his Against the Stream (1973) he uses it to explain a phenomenon that is problematic for Keynesian economics: “stagflation”, the co-existence of economic stagnation and price inflation.  In Keynesian economics price inflation is thought to be due to excessive aggregate demand, the opposite of stagnation.  Myrdal does not address the effect of institutional change on the structural parameters in econometric models, and he dislikes econometrics.

In the first chapter, “Development of Economics: Crises, Cycles”, Myrdal says that when he was still in his “theoretical stage” of thinking, i.e., pre-Institutionalist stage, he was involved in the initiation of the Econometric Society, which he says was planned at the time as a defense organization against the advancing American Institutionalists, an advance which was halted in the economics profession by the Keynesian revolution.  He says that Keynesian theory is now in crisis as a result of problems such as stagflation and structural unemployment, and that the future development of economics will be interdisciplinary and Institutionalist.

Ohlin, who is not an Institutionalist but is a neoclassical economist, also maintains that the ex post perspective alone cannot provide an explanation in economics, because any explanation must reference factors that govern actions, and actions refer to the future.  Any economic explanation must therefore contain the ex ante perspective, which consists of the expectations or plans of the participants in their economic roles.  Ohlin notes that Keynes’ theory may be said to contain an ex ante perspective of investment, because it includes the “marginal efficiency of capital”, which is similar to Wicksell’s “natural rate of interest” – the expected rate of return from newly constructed capital.

But Ohlin took exception to Keynes’ exclusively ex post analysis of saving, in which saving is merely the residual of aggregate income net of aggregate consumption.  On the Stockholm School viewpoint there must be an ex ante analysis of saving, because saving and investment are performed by different persons.  Ohlin maintains that ex ante saving is determined by the difference between current consumption and the level of income in the prior period.  He calls the ex ante saving rate the “average propensity to save”, and says that the saving-investment equilibrium must be expressed in terms of an equality of ex ante aggregate variables.  Then, contrary to Keynes’ law of consumption, Ohlin makes ex post consumption residual to ex ante savings and income.  Oddly he does not also require an ex ante variable for aggregate consumption, which must also partake in macroeconomic equilibrium.  Ohlin’s Stockholm School approach is significant in the present context not only because Ohlin offers an explanation of how expectations are formed, but also because unlike the rational-expectations advocates he accounts for expectations by explicit variables, namely the ex ante variables, so that their effects need not be incorporated implicitly in the statistically estimated parameters of the econometric models.

Ohlin’s elaborate explanation notwithstanding, Muth blithely criticizes the Stockholm School for failing to offer an explanation of the way expectations are formed, and he advances his rational-expectations hypothesis as the needed explanation.  Muth notes two conclusions from studies of expectations measurements, which he says his rational-expectations hypothesis explains.  The first conclusion is that, while admitting considerable cross-sectional differences of opinion, the averages of expectations made by economic participants in an industry are more accurate than the forecasts made with naïve models, and are as accurate as elaborate equation systems.  The rational-expectations hypothesis explains this accuracy by the thesis that expectations viewed as informed predictions of future events are essentially the same as the predictions of the relevant economic theory.  Muth says that he is not asserting that the scratch work of entrepreneurs resembles a system of equations in any way, although he says notably that the way expectations are formed depends on the structure of the entire relevant system describing the economy.  His more convoluted statement of his hypothesis is as follows: he says that the expectations of firms (or, more generally, the subjective probability distribution of outcomes) tend to be distributed, for the same information set, about the prediction of the theory (or, the “objective” probability distributions of outcomes).

Muth argues that if expectations were not moderately rational, then there would be opportunities for economists to make profits from commodity speculation, from running a business firm, or from selling information.  In fact contrary to Muth, economists from Ricardo to Keynes have made large profits in speculation, as do such famous investors as Warren Buffett and George Soros.  In his discussion of price expectations Muth offers an equation for determining expected price in a market, and references a paper to be published by him.  The published equation says that expected price is a geometrically weighted moving average of past prices.  It is actually an autoregressive model.  He also argues that rationality is an assumption that can be modified to adjust for systematic biases, incomplete or incorrect information, poor memory, etc., and that these deviations can be explained with analytical techniques based on rationality.
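A geometrically weighted moving average of past prices can be sketched as follows.  The weight parameter and the normalization over a finite price history are illustrative assumptions for exposition, not Muth’s own notation:

```python
def expected_price(past_prices, beta=0.5):
    """Geometrically weighted moving average of past prices.

    past_prices: most recent price first; beta in (0, 1) controls how
    quickly the weights on older prices decay.  The weights are
    normalized so that they sum to one over the finite history.
    """
    weights = [(1 - beta) * beta ** i for i in range(len(past_prices))]
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, past_prices)) / total
```

A constant price history reproduces itself as the expectation, and recent prices dominate older ones; because the expectation is a fixed linear function of the series’ own past values, the scheme is autoregressive in form.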

The second of his two conclusions is that reported expectations generally underestimate the extent of changes that actually take place.  Like the Stockholm School, Muth’s hypothesis does not assert that there are no expectations errors.  He states that in the aggregate or on average a reported expected magnitude such as a market price is an unbiased predictor of the corresponding actual magnitude except where a series of exogenous disturbances are not independent.  Muth’s explanation of the reported expectations errors of underestimation is his argument that his hypothesis is not inconsistent with the fact that the expectations and actual data have different variances.

Muth references Simon’s “Theories of Decision Making in Economics” in American Economic Review (1959), and describes Simon as saying that the assumption of rationality in economics leads to theories that are inadequate for explaining observed phenomena, especially as the phenomena change over time.  Muth’s view is the opposite of Simon’s: Muth maintains that economic models do not assume enough rationality.  Simon’s critique of the rational-expectations hypothesis is set forth in the second chapter titled “Economic Rationality” in his Sciences of the Artificial (1969).  In the section titled “Expectations” he notes that expectations formed to deal with uncertainty may not result in a stable equilibrium or even a tendency toward stable equilibrium, when the feed forward in the control system has destabilizing consequences, as when each participant is trying to anticipate the actions of others and their expectations.  Simon writes that the paradigmatic example in economics is the speculative price bubble.  Feed forward is also known as a positive feedback loop.

In the next section of Sciences of the Artificial titled “Rational Expectations” Simon references Muth’s 1961 article.  He characterizes Muth’s hypothesis as a proposed solution to the problem of mutual outguessing by assuming that participants form their expectations “rationally”, by which is meant that the participants know the laws that govern the economic system, and that their predictions of the future position of the system are unbiased estimates of the actual equilibrium.  Simon argues that the rational-expectations hypothesis erroneously ignores destabilizing speculative behavior.  More fundamentally Simon maintains that there is no empirical evidence supporting the rational-expectations hypothesis.  And he doubts that business firms have either the knowledge or the computational ability required to carry out the expectations strategy.  He concludes that since economists have little empirical knowledge about how people form expectations about the future, it is difficult to choose among the models that are currently proposed by competing economic theories to account for cyclical behavior of the economy.

The recent Great Recession crash of 2007 has fully vindicated Simon’s 1959 critique.  In his After the Music Stopped (2013) Princeton University economist and former Vice-Chairman of the Federal Reserve System, Alan S. Blinder wrote that beginning in the 1990’s Americans built a “fragile house of cards” based on asset-price bubbles exaggerated by irresponsible leverage, encouraged by crazy compensation schemes and excessive complexity aided and abetted by embarrassingly bad underwriting standards, by dismal performances by the statistical rating agencies, and by lax financial regulation.  Together these elements of the house of cards supported each other to create a positive feedback loop.

Similarly in his Nobel Prize Lecture “Speculative Asset Prices” reprinted as the appendix to his book Irrational Exuberance (2015), 2013 Nobel laureate Yale University economist Robert J. Shiller established that stock markets’ excessive volatility violates the efficient-markets hypothesis.  The rational-expectations hypothesis is also known as the “efficient-market” hypothesis and also as the “random-walk” hypothesis.  Since 1991 Shiller has also been a Director of the National Bureau of Economic Research (NBER) program in Behavioral Economics, which, like the Institutionalists, recognizes the importance of psychological, sociological and epidemiological behaviors in price determination, while depreciating the traditional rationality postulates.  Federal Reserve Board Chairman Alan Greenspan coined the phrase “irrational exuberance” in 1996.  The thesis of Shiller’s book Irrational Exuberance, based on his questionnaire surveys made at the Yale International Center for Finance, is that a positive feedback between investor psychology (irrational exuberance) and rising prices for assets such as equity shares, bonds and real estate, produces speculative price bubbles.  Shiller measures the psychology component with the Yale International Center for Finance’s “Valuation Confidence Index”, a time series spanning 1989-2014, and he likens the deceptive speculative bubbles to “natural” Ponzi scams and pyramid schemes.

In Irrational Exuberance Shiller lists many precipitating factors initiating irrational exuberance in three recent booms: the stock market, the bond market and the real estate market.  Most of the factors are historically unique.  Their irrational effects in turn are amplified by a positive feedback loop, a speculative bubble; as prices continue to rise, the level of exuberance is enhanced by the price rise itself.  The psychological feedback involves changes in thought pattern that infect the entire culture as well as changes in prices, such that investors optimistically believe that a “new era” of opportunity has arrived.  This irrational exuberance drives asset prices to unjustifiable heights.  Shiller demonstrated with data graphs the relation between the volatile real (inflation adjusted) S&P Composite Stock Price Index for the period 1871 to 2013, and the much steadier trend in the calculated present values for the same period of subsequent real dividends that firms paid out.  The excessive volatility shown by the stock prices violates the efficient-markets rational-expectations hypothesis.

Muth had proposed his rational-expectations hypothesis as an explanation of two conclusions about expectations measurements.  Therefore these empirical measurements should be used to provide the independent semantics and magnitudes needed for empirical testing of the rational-expectations hypothesis.  What might rationally have been expected of the rational-expectations advocates therefore is an attempt to construct conventional structural-equation econometric models using ex ante expectations data, in order to demonstrate and test their explanatory hypothesis.  But neither Muth nor the rational-expectations advocates took this approach.  Historical macroeconomic ex ante time-series data are rare.  But on the basis of his hypothesis Muth shifted from an explanation of empirical measurements of reported ex ante expectations to consideration of a forecasting technique using only ex post data.

This semantical shift has had three noteworthy effects on subsequent empirical work by the rational-expectations school: Firstly there was a disregard of available empirical expectations measurements that could serve as values for ex ante variables, however few there are.  Secondly there was an attack upon the conventional structural-equation type of econometric model and the development of an alternative type of empirical model as an implementation of the rational-expectations hypothesis but with no independently collected expectations measurements.  Thirdly there evolved the design and implementation of a computerized procedure for constructing this alternative type of model, a computerized procedure which is a distinctive type of discovery system.

This semantical shift has been consequential for econometric modeling.  Haavelmo’s structural-equation type of econometric model has been definitive of empirical economics for more than three-quarters of a century, and it is still the prevailing practice in the economics profession where neoclassical economics prevails.  To the dismay of conventional econometricians the rational-expectations advocates’ attack upon the conventional neoclassical structural-equation econometric model is, therefore, barely less subversive to the status quo in the science, than Simon’s attack on the neoclassical rationality postulate.  And this outcome certainly has an ironic aspect, because the structural-equation econometric model had been advanced as the empirical implementation (at least ostensibly) of the neoclassical economic theory, while the rational-expectations hypothesis has been advanced as offering greater fidelity to neoclassical theory by extending rationality to expectations.  To understand such a strange turn of events, it is helpful to consider the still-prevailing, conventional concept of the econometric model, the structural-equation model.  And for this we turn to Trygve Haavelmo.


Haavelmo’s Structural-Equations Agenda and Its Early Critics

The authoritative statement of conventional econometric modeling is set forth in “The Probability Approach in Econometrics”, which was initially a Ph.D. dissertation written in 1941 by 1989 Nobel-laureate econometrician, Trygve Haavelmo (1911-1999), and then later published as a supplement to Econometrica (July 1944).  Econometrica is the journal of the Econometric Society, which was founded in 1930, and which describes itself as “an international society for the advancement of economic theory in its relation to statistics and mathematics” and for “the unification of the theoretical-quantitative and the empirical-quantitative approaches in economics”.  The July supplement by Haavelmo advanced certain fundamental ideas for the testing of mathematical hypotheses expressing economic theory by application of the Neyman-Pearson theory of statistical inference.  At the time that the supplement was published the society’s offices were located at the University of Chicago, where econometricians found themselves isolated and unwelcome.  In those days most economists believed that probability theory is not applicable to economic time series data, partly because the data for successive observations are not statistically independent, but mostly because like sociologists today few economists were competent in the requisite techniques, which are now routinely taught to undergraduate students in economics departments.

Haavelmo argued quite unconventionally that time series data points are not a set of successive observations, but are one single observation with as many dimensions as there are independent variables in the model.  This bizarre rationalization is not mentioned in textbooks today.  The more enduring aspect of Haavelmo’s structural-equation agenda consisted of construing the econometric model as a probabilistic statement of the economic theory, so that theory is neither held harmless by data that falsifies it nor immediately and invariably falsified as soon as it is confronted with measurement data.  He says that the model is an a priori hypothesis about real phenomena, which states that every set of numeric values that the economist might observe of the “true” variables, will be one that belongs to the set of numeric values which is admissible as the solution for the model’s equations.  This attempt to construe the model as a third linguistic entity between theory and data leads him to develop an unusual and complicated semantical analysis.

The first chapter titled “Abstract Models and Reality” sets forth his theory of the semantics of measurement variables in econometric models.  Haavelmo distinguishes three types of “variables”, which actually represent three separate meanings associated with each variable symbol that may occur in an empirical economic theory.  The first type is the “theoretical variable”, which is the semantics that a variable symbol has due to its context consisting of the equations of the model, and its values are subject only to the consistency of the model as a system of one or several equations.

The second type is the “true variable”, which has its semantics defined by an ideal test design that the economist could at least imagine, in order to measure those quantities in real economic life that he thinks might obey the laws imposed by the model on the corresponding theoretical variable.  Haavelmo says that when theoretical variables have ordinary words or names associated with them, these words may merely be vague descriptions that the economist has learned to associate with certain phenomena.  And he claims that there are also many indications that the economist nearly always has some such ideal test design and true variables “in the back of his mind”, when the economist builds his theoretical models.  In other words in the verbal description of his model in economic terms the economist suggests either explicitly or implicitly some type of test design to obtain the measurements for which he thinks his model would be empirically adequate.  The measurements for the true variables are not only collected in accordance with an ideal test design, but are also error free.  Thus before estimation and testing of the model the theoretical and true variables are distinguished but are not separated in the fully interpreted theory.

The third type of variable is the “observational variable”, which describes the measurements actually used by the economist for his model construction.  Haavelmo says that the economist often must be satisfied with rough and biased measures, and must dig out the measurements he needs from data that are collected for some other purpose.  For example the National Income Product Accounts (N.I.P.A.) data used for macroeconometric modeling are collected from tax records.  The true variables are those such that if their behavior should contradict a theory, the theory would be conclusively rejected as false.  On the other hand were the behavior of the observational variables to contradict the theory, the contradiction would be due to the fact that the economist is using observational variables for which the theory was not meant to hold.  This may cause confusion, when the same names are often used for both types of variables.  To test a theory against facts or to use it for prediction, either the statistical observations available must be corrected or the theory itself must be adjusted, so as to make the facts the economist considers the true variables relevant to the theory.  Thus in Haavelmo’s approach to econometrics, probability distributions not only adjust for measurement errors, but also adjust for the deviations between the true and observational values due to their semantical differences.

An experienced econometrician, Haavelmo is adequately cognizant of the difficulties in the work that makes economics an empirical science.  In contrast, most of his contemporaries in the 1940’s were windowless ivory-tower theoreticians.  Today there is much more adequate data available to economists from government agencies and private data-collection syndicates.  Nonetheless, economists still sometimes find they must use what they call “proxy” variables, which are recognized as measurements of phenomena other than what the economist is interested in explaining with his models.  And sometimes the government statistical agency will use names to identify data that describe phenomena for which the data are a proxy rather than what the data actually measure.  For example in their Industrial Production monthly releases the Board of Governors of the Federal Reserve System says that when its monthly production index series cannot be based on physical measures of output, such as tons of steel or assemblies of automobiles and trucks, then it reports that monthly input measures, such as hours worked or kilowatt hours of electricity consumed in production are used to develop a monthly output quantity series.  Nonetheless, the Federal Reserve Board calls these proxy data “production.”

Except in these explicit cases involving proxy variables, however, it is questionable whether the economist has “in the back of his mind”, as Haavelmo says, any specific ideal test design setting forth ideal measurement procedures.  Most often the descriptive words associated with theoretical variable symbols contextually defined in a mathematical model are vague with respect to test design and not given further semantical resolution until measurements are actually collected and associated with the model.  Then the description of the actual measurement procedures supplies additional information to resolve this vagueness.  In the case of macroeconometric models for example descriptions of the procedures and sources used by the U.S. Commerce Department’s Bureau of Economic Analysis (B.E.A.) for collecting the N.I.P.A. data, supply the additional semantics that resolves the vagueness in the concepts symbolized by descriptive variables in the macroeconomic theory.  It is only when economists like those with the Federal Reserve Board decide to use proxies for what they wish to measure, that there is more deviation involved in the data than just errors of measurement.  Then such proxies introduce an equivocation like Haavelmo’s “true” and “observational” semantics instead of supplying a resolution to the vagueness in the univocal meanings of the terms in the theory.

The second chapter titled “The Degree of Permanence of Economic Laws” sets forth Haavelmo’s concept of scientific law in economics, and specifically his treatment of the degree of constancy or permanence in the relations among economic variables in econometric models.  Nonconstancy is manifested by structural breakdown of the traditional structural-equation model, the type of model that Haavelmo advocates in this monograph.  The rational-expectations hypothesis is proposed as an explanation for structural breakdown, and it is the rationale for the vector-autoregression type of model that is an alternative to the structural-equation model.  The BVAR discovery system constructs a refined version of the vector-autoregression type of model.
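The contrast between the two model types can be sketched in miniature.  Below is a minimal unrestricted first-order vector autoregression, in which every variable is regressed on the lagged values of all the variables, with no exclusion restrictions imposed by a priori theory.  This is a generic textbook sketch of the vector-autoregression form, not Litterman’s BVAR system, which additionally places Bayesian priors on the coefficients:

```python
import numpy as np

def fit_var1(data):
    """Fit an unrestricted VAR(1) by least squares:
    y_t = A y_{t-1} + c + e_t.

    data: (T, k) array holding k time series over T periods.  Unlike a
    structural-equation model, no variable is excluded a priori: every
    series appears on the right-hand side of every equation.
    """
    Y = data[1:]                                             # y_t
    X = np.hstack([data[:-1], np.ones((len(data) - 1, 1))])  # y_{t-1}, 1
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)             # OLS fit
    return coef[:-1].T, coef[-1]                             # A, c
```

The structural-equation modeler would instead write down behavioral equations from theory and zero out most of the entries of A in advance; the vector-autoregression modeler lets the ex post data determine all of them.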

Haavelmo says that the constancy in a relationship is a property of real phenomena, as the economist looks upon the phenomena from the viewpoint of a particular theory.  This is an unwitting statement of ontological relativity.  At the very opening of his monograph he states that theoretical models are necessary to understand and explain events in real life, and that even a simple description and classification of real phenomena would probably not be possible or feasible without viewing reality through the framework of some scheme conceived a priori.  This statement is equivalent to Popper’s thesis that there is no observation without theory, and to Hanson’s characterization of observation as theory laden; it is a statement of semantical relativity.  But the term “theory” in Haavelmo’s monograph means specifically the neoclassical economic theory with its rationality postulates, and the basic task of his monograph is to describe his probability approach in econometrics understood as the application of Neyman-Pearson statistical inference theory to economic theory for empirical testing.

In the first chapter of the monograph Haavelmo distinguished three types of quantitative economic relations.  The first type is the definitional or accounting identity.  A common example is the gross domestic product (GDP), which is merely the summation of its component sectors on either the income side or the expenditure side.  The second type is the technical relation.  The paradigmatic case of the technical relation is the production function, which relates physical output to physical inputs such as capital and labor inputs.  Estimating technical engineering equations is more properly a task for the applicable natural sciences, but the practice among econometricians has been to estimate aggregate production functions with the same statistical techniques that they use for all econometric equations.  And to measure physical quantities in production functions, econometricians routinely use constant dollars, i.e., deflated current-dollar aggregates.

The third type is the relation describing the economic decisions of the participants.  Neoclassical economists call equations of this type “behavioral equations” or “decision functions”.  The behavioral equations in conventional romantic econometric models are based on economic theory, and are not like the laws and theories developed in the natural sciences such as physics.  Romantic neoclassical economic theory purports to describe a mental decision-making process made by economic participants, notably consuming households and producing business firms.  The econometric equation based on neoclassical theory contains independent variables that represent a set of conditions that are consciously considered by the economic participants in relation to their motivating preference schedules or priorities as they make their best or optimized decisions, and the outcomes of these optimizing decisions are represented by the value of the dependent variable of the equation.  The system of preference schedules is not explicitly contained in the equation.  But Haavelmo says that if the system of preference schedules establishes a correspondence between sets of given conditions and optimized decision outcomes, such that for each set of conditions there is only one best decision outcome, then the economist may “jump over the middle link” of preference schedules in the scheme, and claim that the decisions of the individuals or firms are determined by the set of independent variables representing factors that the participants mentally consider.
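The three relation types can be illustrated with stock textbook instances.  The particular functional forms and parameter values below are conventional illustrations, not Haavelmo’s own examples:

```python
# Illustrative instances of the three types of quantitative
# economic relations distinguished by Haavelmo.

def accounting_identity(c, i, g, nx):
    """Definitional: GDP as the sum of its expenditure components
    (consumption, investment, government spending, net exports)."""
    return c + i + g + nx

def production_function(k, l, a=1.0, alpha=0.3):
    """Technical: a Cobb-Douglas production function relating
    output to capital input k and labor input l."""
    return a * k ** alpha * l ** (1 - alpha)

def consumption_function(income, autonomous=100.0, mpc=0.8):
    """Behavioral: a Keynesian consumption function, in which the
    preference schedules are 'jumped over' and summarized by the
    statistically estimated marginal propensity to consume."""
    return autonomous + mpc * income
```

Only the third type purports to describe a decision process; it is this type whose estimated parameters implicitly absorb whatever expectations the participants hold, which is why a shift in expectations can appear as a structural breakdown.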

In this romantic neoclassical scheme the econometric model is based on the assumption that participating consumers’ decisions to consume and businesses’ decisions to produce can be described by certain fundamental behavioral relations, and that there are also certain behavioral and institutional restrictions upon the participants’ freedom.  A particular system of such relationships with their equations statistically estimated defines one particular theoretical “structure”.  The problem of finding permanent economic laws thus becomes the problem of finding structures in this sense; the failure in particular cases to solve this problem is usually manifested by an erroneous forecast with the model, which is called a “structural breakdown”.

Haavelmo then considers several reasons for the structural breakdown of an econometric model.  In all cases the problem is diagnosed as the absence from the model of a variable representing some operative factor that in reality has a significant effect on the phenomenon represented by the model’s dependent variable, and the solution therefore consists of recognizing the missing factor and then introducing an explanatory variable for it into the model.

First, in the case of a model of supply and demand in a market, one reason for structural breakdown is structural change due to the irreversibility of economic relations.  Such a change is a shift in the demand curve, such that observed price-quantity pairs no longer represent movements along the demand curve, because the economic participants revise their preference schedules as prices change.  Haavelmo rejects claims that demand curves cannot be constructed from time series of observed price-quantity pairs; instead he says that the economist should introduce into his model variables representing the additional factors responsible for the revision of preference schedules and the consequent shifts in the demand curve.  Econometricians routinely do this today.
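Haavelmo’s remedy can be sketched as follows, with symbols assumed here for illustration: a demand equation that omits a shift factor confounds movements along the curve with shifts of the curve, and the cure is an added explanatory variable:

```latex
% Misspecified demand: quantity depends on price alone, so a revision
% of preference schedules appears as a structural breakdown
q_t = \alpha_0 + \alpha_1 p_t + \varepsilon_t

% Remedied demand: z_t (e.g., income) represents the factor
% responsible for the revision of preference schedules
q_t = \alpha_0 + \alpha_1 p_t + \alpha_2 z_t + \varepsilon_t
```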

A second reason for structural breakdown is the simplicity of the model.  Economists like simple models, even though the real world is complex.  From a purely statistical point of view the simpler the model, the less the likelihood of distorting collinearity.  Haavelmo distinguishes potential from factual influences in the real world, and says that models can be simple because only factual influences need be accounted for in the models.  But economists making models may exclude factors mentioned in a theory because the excluded factors do not exhibit a statistically detectable factual influence in the sample history used to estimate the equation; if such an excluded factor later becomes operative in reality, its absence from the model is sufficient to explain an apparent structural breakdown.
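Haavelmo does not use the term, but the modern variance inflation factor (VIF) quantifies the “distorting collinearity” that simplicity avoids.  A minimal sketch, assuming just two standardized regressors with correlation r, in which case the VIF reduces to 1/(1 − r²):

```python
# Variance inflation factor for a regressor in a two-regressor model:
# the R-squared of regressing one regressor on the other is r**2, and
# the coefficient's sampling variance is inflated by 1/(1 - r**2).
def vif(r: float) -> float:
    return 1.0 / (1.0 - r ** 2)

# As the regressors become nearly collinear, the inflation explodes.
for r in (0.0, 0.5, 0.9, 0.99):
    print(f"correlation {r:4.2f} -> VIF {vif(r):6.2f}")
```

With r = 0.99 the coefficient variance is inflated roughly fifty-fold, which is why a simpler model with fewer, less correlated regressors yields more stable estimates.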

One recent example of this reason for structural breakdown is the American domestic cigarette industry.  Statistics collected by the U.S. Federal Trade Commission (FTC) and the U.S. National Center for Health Statistics (NCHS) show that for most of the post-World War II era until the late 1990’s, the quantity of domestic cigarette consumption in the United States was determined almost wholly by changes in the national demographic profile, advertising bans notwithstanding.  And statistics collected by the U.S. Bureau of Labor Statistics (BLS) during this time show that relative prices rose little and only very gradually, making the relative-price variable statistically nonsignificant in a model estimated with data prior to 1997.  But with the “Global Settlement Agreement” with several State governments in 1997 the industry agreed to a $370 billion settlement in response to litigation, and then with the “Master Settlement Agreement” with the remaining State governments in 1998 the industry agreed to an additional $246 billion settlement.  The industry then greatly raised the prices of cigarettes to acquire the funds needed to make the large settlement payments over an agreed twenty-five years.  The result was effectively a high excise tax passed on to consumers, and consumption declined dramatically in spite of significant positive changes in the national demographic profile.  Thus the new and formerly missing factor that produced structural breakdowns in cigarette-industry econometric models estimated with pre-1997 data was the sharply increased relative price of cigarettes, which makes the relative-price variable statistically significant in a model estimated with the longer time series.

Finally, a third reason for structural breakdown is the absence of a semantical property that Haavelmo calls “autonomy.”  Autonomous equations in a multi-equation model have an independence that is not just the syntactical independence of axioms in a deductive system.  The semantical independence or autonomy is due to the success of an equation at identifying the preference schedules of just one social group or social rôle in the economy.  For example the demand equation in a market model represents the decisions of buyers in the market, while the supply equation for the same price-quantity pair represents the decisions of sellers in the same market.  If the supply and demand equations for a market model are autonomous, then a structural breakdown in one equation will not also affect the other.  An autonomous equation is one that has successfully identified a fundamental behavioral relation described by neoclassical theory.
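In a schematic two-equation market model, with notation assumed here for illustration, autonomy means that each equation identifies the preference schedules of one rôle only:

```latex
% Demand: buyers' decisions, shifted by income y_t
q_t = \alpha_0 + \alpha_1 p_t + \alpha_2 y_t + u_t

% Supply: sellers' decisions, shifted by input cost c_t
q_t = \beta_0 + \beta_1 p_t + \beta_2 c_t + v_t
```

If the pair is autonomous, a breakdown of the supply equation (say, a cost shock outside the sample history) leaves the demand equation intact.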

In addition to his semantical theory and his theory of scientific law in economics, Haavelmo also gives lengthy consideration to statistical inference.  One statistical topic he considers is the meaning of the phrase “to formulate theories by looking at the data.”  He is concerned with the problem of whether a well-fitting statistically estimated model is merely a condensed description of the empirical data, i.e., ad hoc, or whether it is an effective test of a valid generalization.  He maintains that how the economist happens to choose a hypothesis to be tested from within a class of a priori admissible theories is irrelevant, and he states that the selection may be made by inspection of the data.  But he says that the class of admissible theories must be fixed prior to applying the statistical testing procedure, so that it is possible to calculate the power of the test and to determine the risk of error involved in accepting the hypothesis tested.  He rejects the practice of selecting the whole class of admissible theories by the empirical testing process.  The class of admissible theories cannot be made a function of the sample data, because then the Neyman-Pearson statistical test no longer controls the two types of errors in testing hypotheses: the error of accepting a false hypothesis and the error of rejecting a true hypothesis.
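Haavelmo’s point presupposes the Neyman-Pearson calculus in which, once the hypotheses and significance level are fixed in advance, both error risks are computable.  A minimal sketch for a one-sided z-test, with illustrative numbers not drawn from Haavelmo:

```python
from statistics import NormalDist

def power_one_sided_z(mu0: float, mu1: float, sigma: float,
                      n: int, alpha: float) -> float:
    """Power of a one-sided z-test of H0: mu = mu0 against H1: mu = mu1 > mu0.

    alpha is the Type I risk (rejecting a true H0), fixed in advance;
    1 - power is the Type II risk (accepting a false H0).
    """
    se = sigma / n ** 0.5
    # Rejection threshold on the sample mean, fixed before seeing the data
    critical = mu0 + NormalDist().inv_cdf(1 - alpha) * se
    # Probability of correctly rejecting H0 when H1 is true
    return 1 - NormalDist(mu1, se).cdf(critical)

# With the test fixed a priori, both error risks are calculable:
print(round(power_one_sided_z(0.0, 0.5, 1.0, 25, 0.05), 3))
```

The calculation is meaningful only because H0, H1, and alpha were fixed before inspecting the sample; selecting the admissible class from the data itself invalidates the computed risks, which is exactly Haavelmo’s objection.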

Haavelmo’s prohibition against using the Neyman-Pearson statistical inference theory for discovery is ignored by the rational-expectations advocates.  It is also ignored by social scientists who have taken up the practice generically referred to as “data mining”, which today is enabled by the enhanced processing power of the electronic computer.  Developers of discovery systems like Hickey, who use regression modeling for computational philosophy of science, also ignore Haavelmo’s prohibition.

Mary S. Morgan states in her History of Econometric Ideas that acceptance of Haavelmo’s approach made econometrics less creative, because data were taken less seriously as a source of ideas and information for econometric models, and the theory-development rôle of applied econometrics was downgraded relative to the theory-testing rôle.  She notes that Haavelmo’s paper was very influential both within the Cowles Commission and with others including Herbert Simon, which may explain why Simon never designed a discovery system for use in social science.


