HERBERT SIMON, PAUL THAGARD, PAT LANGLEY AND OTHERS ON DISCOVERY SYSTEMS



Hickey’s Metascience or “Logical Pragmatism”

Thomas J. Hickey was a graduate student in the philosophy department and in the economics department of the University of Notre Dame, South Bend, Indiana.  After receiving an M.A. degree in economics and completing his philosophy coursework he intended to develop his computerized discovery system for a Ph.D. dissertation in philosophy.  But the Notre Dame philosophers were obstructionist and Hickey got out. Notre Dame has always been better at football than philosophy.  After leaving Notre Dame he developed his METAMODEL computerized discovery system at San Jose City College in San Jose, California. Today development of such discovery systems is recognized as “computational philosophy of science”.  For more than thirty years thereafter Hickey used his discovery system occupationally, working as a research econometrician in both business and government.  He used his system for Institutionalist macroeconometric modeling and regional econometric modeling for the State of Indiana Department of Commerce.  He also used it for Institutionalist econometric and sociodemographic modeling projects for various business corporations.

Hickey described his METAMODEL discovery system in his Introduction to Metascience: An Information Science Approach to Methodology of Scientific Research (1976).  Since publishing this monograph he has also referred to metascience as “Logical Pragmatism”, meaning the contemporary pragmatist philosophy of science.  His “Logic” is emphatically not the irrelevant Russellian “symbolic” logic.  More recently the phrase “computational philosophy of science” has also come into use thanks to Paul Thagard.  Hickey’s intent in using the term “Metascience” is to recognize that philosophy of science is becoming empirical and breaking away from metaphysical foundationalism, just as the modern empirical sciences have done historically.  The first half of Introduction to Metascience set forth the “pragmatist” part of Hickey’s Logical Pragmatist philosophy.  The second half described his METAMODEL discovery system, the computational or “Logical” part of his Logical Pragmatism, and exhibited the system with a simulation of the Keynesian revolution in economics.  His ideas have naturally evolved since Introduction to Metascience was published nearly forty years ago.  The current rendering of his metascience is very briefly summarized above in BOOK I, titled “Introduction to Philosophy of Science”, and in his e-book Twentieth-Century Philosophy of Science: A History (Third Edition).  BOOK I is now also available as an e-book titled Philosophy of Science: An Introduction (Third Edition) with hyperlinks to this web site.

Logical Pragmatism may be contrasted with the alternative psychologistic approach, which descends from Simon and is exemplified in the more recent efforts of Langley and Thagard.  The contemporary pragmatist philosophy of science is in the analytic-philosophy tradition, which originated with the historic “linguistic turn” in early twentieth-century philosophy.  In the United States this linguistic-analysis tradition has since evolved considerably into the contemporary pragmatist philosophy of language, due in no small part to the writings of Harvard University’s Willard Van Orman Quine.  Contemporary pragmatism supersedes the classical pragmatism of Peirce, James and Dewey.  Hickey prefers the linguistic-analysis approach because he believes that the psychologistic approach reveals an inadequate appreciation of the new pragmatist philosophy of language, and he notes that advocates of the psychologistic approach typically retain some residual positivist ideas.  Furthermore Hickey’s metascience agenda with its computerized linguistic constructionalism makes no claims about representing human psychological processes.  Thus no experiments are needed to validate any psychological claims associated with the computer-system designs.  In fact the computational philosopher of science need not understand the intuitive human discovery process in order to produce a system design yielding manifestly superior outcomes.  He need only understand the characteristics of a good theory and develop a procedure whereby such theories can be produced mechanically.  Computational philosophy of science more closely resembles computational linguistics than psychology.


Hickey’s Linguistic Analysis

Hickey’s contemporary pragmatist philosophy of language is detailed above in BOOK I in this web site.


Hickey’s Functional Analysis

Hickey organizes philosophy of science into four functional topics: the aim of science, discovery, criticism and explanation.  His pragmatist philosophy of science is detailed above in BOOK I in this web site.


Hickey's METAMODEL Discovery System

Hickey’s METAMODEL discovery system antedates Simon’s applications of his problem-solving theory of heuristic search to the problem of scientific discovery by about ten years.  Initially Simon did not apply artificial-intelligence systems to scientific discovery.  Hickey developed an original combinatorial generate-and-test design that differs from the heuristic-search design used by Simon and his colleagues at Carnegie-Mellon and by their later followers including Langley, Zytkow and Thagard.  The second part of his Introduction to Metascience sets forth the design of his METAMODEL discovery system together with a description of an application of the system to the trade-cycle specialty in economics in 1936, the year in which John M. Keynes published his General Theory of Employment, Interest and Money.  The METAMODEL performed revisionary construction of Keynes’ theory, an episode now known as the “Keynesian Revolution” in economics.  The applicability of revisionary theory construction to this episode is known in retrospect.  As 1980 Nobel-laureate economist Lawrence Klein says in his Keynesian Revolution (1966 [1947]), all the important parts of Keynes’ theory can be found in the works of one or another of Keynes’ predecessors.

Hickey first translated Keynes’ theory into mathematical form.  His translation was informed by J.R. Hicks’ “Mr. Keynes and the Classics” in Econometrica (1937).  In the Journal of the History of the Behavioral Sciences (1979) Walter A. Sedelow, professor of computer science and sociology, and Sally Y. Sedelow, professor of computer science and linguistics, both at the University of Kansas, wrote that Hickey’s mathematical explication of Keynesian theory reveals a useful way of formalizing the history of science.  And they add that Hickey shows how the history of science in the course of such formalization may contribute to the enhanced effectiveness of science itself by means of computer-implemented procedures.

The METAMODEL performs an extensive cognitive exploration of the revisionary theory-constructional possibilities that are latent in the system’s input state description.  The principal disadvantage of this combinatorial generate-and-test design is its extensive use of computer resources.  On the other hand the principal advantage is that, unlike heuristic search and other more efficient designs, it minimizes the risk of preemptively excluding theories that are worthy of consideration.  Some employers allowed Hickey unlimited mainframe computer resources after he demonstrated successful computer runs for market analysis.  The system is not a satisficing system, but rather an optimizing system, severely constrained by several statistical testing criteria, that outputs a small number of constructionally generated and empirically tested theories.  As computer hardware technology continues to improve (e.g., supercomputing, quantum computing) the trade-off between efficiency and thoroughness will move far toward thoroughness.
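A minimal sketch of the combinatorial generate-and-test idea follows.  It is illustrative only: the METAMODEL itself was written in FORTRAN, and the function names and the test predicate here are hypothetical.

```python
from itertools import combinations

def generate_and_test(candidate_vars, passes_all_tests, max_regressors=3):
    """Exhaustively enumerate nonredundant combinations of candidate
    regressors and retain every combination that survives the tests.
    Unlike heuristic search, no candidate is pruned before it is tested."""
    accepted = []
    for k in range(1, max_regressors + 1):
        for combo in combinations(candidate_vars, k):
            if passes_all_tests(combo):  # e.g., the statistical criteria below
                accepted.append(combo)
    return accepted
```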

The discovery system’s inputs and outputs are called “state descriptions”.  To simulate the Keynesian revolution with the METAMODEL Hickey developed a cumulative input state description containing the descriptive variables in the object language.  The input state description also contained the measurements for the associated historical time-series data.  He researched the literature of the economics profession pertaining to the trade-cycle problem for the interwar years prior to 1937.  The American Economic Association’s Index of Economic Journals was a useful bibliographic source.  The examination of the relevant professional literature yielded ten economic theories of the national trade cycle, which Hickey also translated into mathematical form.  The ten theories were those of J.A. Hobson, Irving Fisher, Foster and Catchings, J.M. Clark, F.A. von Hayek, R.G. Hawtrey, Gustav Cassel, Gunnar Myrdal, Johan Akerman, and A.C. Pigou.  The descriptive vocabulary occurring in these theories was highly redundant, and yielded a set of eighteen unique variables.

The data for these variables are annual time series for the period 1921 through 1934, which were available to economists in 1936, the year Keynes’ book was published.  The selected time-series data were originally published prior to 1937 in annual issues of the U.S. Department of Commerce Statistical Abstract, and are also collected in the U.S. Department of Commerce Historical Statistics of the United States (1976).  The input state description contains these time series converted to index numbers of period-over-period change ratios to minimize collinearity, together with variable symbols for current values and one time lag.
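The text describes this transformation only at this level of detail; the following sketch is one plausible reading of it, with hypothetical function names.

```python
import numpy as np

def change_ratios(levels):
    """Convert a level time series to period-over-period change ratios,
    x[t] / x[t-1]; differencing in ratios reduces the collinearity that
    trending level series typically exhibit."""
    x = np.asarray(levels, dtype=float)
    return x[1:] / x[:-1]

def current_and_lagged(ratios):
    """Pair each current-valued observation with its one-period lag, as
    the input state description does for each variable symbol."""
    return ratios[1:], ratios[:-1]  # (current values, lagged values)
```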

The output state description contains an econometric model of Keynes’ theory constructed by the discovery system.  The original theory is actually a static theory, but it was made dynamic by including considerations contained in a chapter of the General Theory titled “Notes on the Trade Cycle”, in which Keynes explicitly applies his theory of income determination to the phenomenon of the trade cycle.  Keynes’ theory contains ten variables and seven equations with three exogenous variables.

Operating the METAMODEL requires two designations that must be made prior to executing the discovery system.  Firstly the user must designate which descriptive variables among the current-valued variables in the input state description are the problematic variables, i.e., those that identify the problem the theory is to solve.  In the application to the trade-cycle problem, the problematic variables are aggregate employment and aggregate real income for the national economy.  Every macroeconometric model printed in the output state description generated by the system contains these two problematic variables and equations determining their numeric values.

Secondly the user must designate which among the current-valued variables are exogenous variables.  These variables have their values inputted to the system and not generated by it, because the values were determined independently by economic policy decisions of political authorities.  The three exogenous variables designated in the trade cycle application are real aggregate Federal fiscal expenditures, real aggregate Federal fiscal tax revenues, and the Federal Reserve’s measure of the aggregate nominal money stock.  These two types of designations together with other information such as the number of observations in the time series data are entered into a control record, which is the first record read when the system is run.  Records containing the character symbols of the input variables with separate identifiers for current values and lagged-valued variables follow the control record, which in turn is followed by the data records.
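The record layout is described only in outline above; the following is a hypothetical rendering of the control record and designations as a data structure, not the METAMODEL’s actual FORTRAN record formats.

```python
from dataclasses import dataclass, field

@dataclass
class InputStateDescription:
    """Illustrative stand-in for the METAMODEL's input file: a control
    record, then variable-symbol records, then the data records."""
    n_observations: int                 # length of each time series
    problematic: list                   # variables the theory must determine
    exogenous: list                     # values inputted, not generated
    max_forecast_error: float           # tolerance for the out-of-sample test
    variables: dict = field(default_factory=dict)  # symbol -> time series

trade_cycle_input = InputStateDescription(
    n_observations=14,                  # annual data, 1921 through 1934
    problematic=["EMPLOYMENT", "REAL_INCOME"],
    exogenous=["FED_EXPENDITURES", "FED_TAX_REVENUES", "MONEY_STOCK"],
    max_forecast_error=0.05,            # hypothetical value
)
```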

The METAMODEL discovery system is a FORTRAN computer program with an architecture consisting of a main program, SLECTR, and two subroutines named REGRES and SOLVER.  SLECTR is the combinatorial procedure that selects nonredundant combinations of language elements.  The system has a control switch, which is initialized as open.  When the switch is open, SLECTR selects combinations of time series from the input file initially read by the system.  For each selection, if the triangular correlation matrix for the equation’s independent variables is unsatisfactory, control is returned to SLECTR for another selection.  Otherwise SLECTR calls REGRES, an ordinary-least-squares regression procedure that statistically estimates an intercept and coefficients, thereby constructing an equation for the selection of variables passed to it by SLECTR.  If the estimated equation does not have a satisfactory R² coefficient-of-multiple-determination statistic as well as a satisfactory Durbin-Watson statistic and satisfactory Student t-statistics, control is returned to SLECTR for another selection.  But when all these statistical criteria are satisfied, the equation and its statistics are stored as a record in an interim accumulation file, and control is returned to SLECTR for more selections.
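A minimal sketch of the REGRES screening step, in Python rather than FORTRAN; the threshold values are illustrative assumptions, since the text does not state Hickey’s actual cutoffs.

```python
import numpy as np

def regres(y, X, r2_min=0.9, dw_low=1.5, dw_high=2.5, t_min=2.0):
    """Fit OLS and accept the equation only if the R-squared,
    Durbin-Watson, and Student t-statistics all pass. Returns the
    estimates and statistics if accepted, otherwise None."""
    n, k = X.shape
    Xc = np.column_stack([np.ones(n), X])           # intercept column
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)   # OLS estimates
    resid = y - Xc @ beta
    ss_res = resid @ resid
    r2 = 1.0 - ss_res / ((y - y.mean()) ** 2).sum()
    dw = (np.diff(resid) ** 2).sum() / ss_res       # Durbin-Watson statistic
    sigma2 = ss_res / (n - k - 1)                   # residual variance
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xc.T @ Xc)))
    t_stats = beta / se
    ok = (r2 >= r2_min and dw_low <= dw <= dw_high
          and np.all(np.abs(t_stats[1:]) >= t_min))
    return (beta, r2, dw, t_stats) if ok else None
```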

With its switch closed SLECTR makes nonredundant selections of sets of estimated equations from the accumulation file generated by REGRES.  For each selection it calls subroutine SOLVER, which solves each multi-equation model with the Gauss-Jordan simultaneous-equation algorithm, and then executes the model to generate a reconstruction of the historical data.  In order to accomplish this, there are certain criteria that every selected set of equations must satisfy, and SOLVER checks for four conditions.  Firstly the combination of equations constituting the model must contain equations that determine the two designated problematic variables.  Secondly the model must be uniquely determined, such that there are just as many current-valued endogenous variables as there are equations.  Thirdly the model must be recursively executable to generate a time series, such that there is at least one current-valued variable for each lagged-valued variable describing the same phenomenon.  Fourthly the model must be a minimal statement, such that except for the problematic variables it contains no current-valued variable that is not needed to evaluate a lagged-valued variable describing the same phenomenon. 
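The four conditions can be sketched as follows; the equation representation (a left-hand-side variable plus right-hand-side variables, with lagged variables suffixed “_L1”) is a hypothetical simplification, not Hickey’s record format.

```python
def solver_checks(equations, problematic, exogenous):
    """Sketch of SOLVER's four admissibility checks on a candidate set
    of estimated equations. Each equation is (lhs_var, rhs_vars)."""
    lhs = [eq[0] for eq in equations]
    rhs = {v for _, rhs_vars in equations for v in rhs_vars}
    current_rhs = {v for v in rhs if not v.endswith("_L1")}
    lagged_phenomena = {v[:-3] for v in rhs if v.endswith("_L1")}
    current_endog = (set(lhs) | current_rhs) - set(exogenous)

    # 1. The model must determine the designated problematic variables.
    if not set(problematic) <= set(lhs):
        return False
    # 2. Uniquely determined: as many equations as current-valued
    #    endogenous variables.
    if len(equations) != len(current_endog):
        return False
    # 3. Recursively executable: every lagged variable must have a
    #    current-valued counterpart to update it at each iteration.
    if not lagged_phenomena <= set(lhs) | set(exogenous):
        return False
    # 4. Minimal: apart from the problematic variables, every determined
    #    variable must be needed to evaluate some lagged variable.
    return all(v in lagged_phenomena or v in problematic for v in lhs)
```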

When SOLVER finds an equation set that does not satisfy all these criteria, it returns control to SLECTR for another set of equations.  Models that do satisfy all these criteria are capable of being solved, and SOLVER then solves and recursively iterates the model, both to recreate the history with synthetic data for the years 1921 through 1933 and, if the simulation captures all the critical points in the time-series history, to make a one-period out-of-sample postdictive forecast for the year 1934.  The control record for the system also contains an error tolerance for the retrodictive out-of-sample forecasts of the problematic variables, and the final test for the model is its forecast accuracy.  Each model that also satisfies this criterion is outputted to a file for printed display in conventional mathematical form with each equation listed together with its associated statistics.  The output also lists the synthetic data generated by the iteration of the model together with the forecast values for its endogenous variables.
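This final step can be sketched as follows, assuming a model already solved into reduced, recursively executable form; the function signature and the relative error measure are assumptions for illustration.

```python
import numpy as np

def simulate_and_test(reduced_form, y_start, exog_path, actual_final, tol):
    """Iterate a solved model over the sample to produce synthetic
    history, then make a one-period out-of-sample forecast and test it.
    reduced_form(y_lagged, x_current) returns the current endogenous
    vector; exog_path supplies the exogenous values for each period."""
    y = np.asarray(y_start, dtype=float)
    synthetic = []
    for x in exog_path[:-1]:                   # within-sample periods
        y = reduced_form(y, x)
        synthetic.append(y)
    forecast = reduced_form(y, exog_path[-1])  # the out-of-sample period
    rel_error = np.abs(forecast - actual_final) / np.abs(actual_final)
    return synthetic, forecast, bool(np.all(rel_error <= tol))
```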

Four years after designing and testing his METAMODEL discovery system with the simulation of Keynes’ macroeconomic theory, Hickey had occasion and opportunity to use the system to address a contemporary problem.  At that time he was a senior economist in the Analysis and Statistics Bureau of the Finance Department of United States Steel Corporation.  He had completed a conventional Keynesian quarterly macroeconometric forecasting model using Haavelmo’s procedures, but found that the model did not perform satisfactorily.  This occurred during the years following the large increase in crude oil prices imposed by the Organization of Petroleum Exporting Countries (OPEC), and no macroeconometric model available at the time had the consequences of this unprecedented shock in its sample data for statistical modeling.  Many economists reacted to the structural breakdown of their models with patience, updating their databases and revising their models as new data became available.  Others, however, believed that more than oil prices were at fault, and that there were more basic reasons for dissatisfaction with their Keynesian models.  One such group was the rational-expectations economists, and they had their distinctive agenda, as described above.

Hickey also believed that more was involved than inadequate sample data.  But unlike the rational-expectations advocates, he viewed structural breakdown in the same manner as did Haavelmo, who maintained that the problem is remedied by introducing into the model new variables for missing factors, the absence of which had caused the breakdown.  Initially this suggests a theory-elaboration approach.  But unlike Haavelmo, Hickey agrees with Institutionalist economists like Mitchell that neoclassical economics limits economic explanation to an excessively small number of factors, and that it incorrectly assumes that all the other complexities in the real world are irrelevant.  Furthermore Hickey is not philosophically sympathetic to the romanticism in neoclassical economics, and prefers the explicitly pragmatic orientation of the American Institutionalist economists, who were influenced by the classical pragmatists.

However, historically Institutionalists did not make econometric models.  Even today most of them are more interested in the historical evolution of economic institutions. Hickey ventured beyond conventional Institutionalism and decided to integrate functionalist sociology into his econometric model, even though functionalist sociologists do not make econometric models either.  Functionalism in sociology is an equilibrium thesis that all institutions of a national society are interrelated. Therefore he used his METAMODEL discovery system to investigate how variables representing each of five basic institutions of the American society can be related by statistically estimated equations of the type used in econometric models.

The discovery system generates many alternative equations and models that are empirically acceptable, thus exemplifying the contemporary pragmatist theses of the empirical underdetermination of language and of scientific pluralism.  For romantic philosophers of science this is an argument against development of hypotheses by data analysis, and thus an argument for invoking some prior semantics/ontology with its preconceived concepts of causality.  But for the contemporary pragmatist, pluralism is simply an inevitable fact of life in basic-scientific research, routinely encountered in the history of research science.  Einstein called this pluralism an “embarrassment of riches.”

Hickey used his METAMODEL to make macrosociological models with eleven current-valued input variables, each allowed two lagged-valued variables.  The total number of equations estimated and stored by REGRES for further processing by SOLVER was thirteen, and the total number of macrosociological models generated and critically accepted by SOLVER for output was three.  As it happens, two of the three models were actually the same model for reasons that SOLVER cannot detect, so the total number of models actually outputted was only two.

The functionalist macrosociometric model generated by the METAMODEL was used as a guide for integrating sociological, demographic, and human-ecological factors into an integrated model of the U.S. national society for the Indiana Department of Commerce.  A description of the resulting integrated macromodel was published in “The Indiana Economic Growth Model” in Perspectives on the Indiana Economy (March 1985).  Later, in the September 1985 issue of the same publication, Hickey published “The Pragmatic Turn in the Economics Profession and in the Division of Economic Analysis of the Indiana Department of Commerce”, in which he described the METAMODEL and compared it with some VAR models and with the BVAR system constructed by the rational-expectations advocates.

In addition to using his system for the State of Indiana Department of Commerce, Hickey has used a commercial version of the METAMODEL system for many other Institutionalist econometric and sociodemographic modeling projects for various business corporations including USX/United States Steel Corporation, BAT (UK)/Brown and Williamson Company, Pepsi/Quaker Oats Company, Altria/Kraft Foods Company, Allstate Insurance Company, and TransUnion LLC.  Monthly, quarterly, and annual versions of the system were used for both quantitative market analysis and for quantitative risk analysis.  The METAMODEL system has been licensed perpetually to TransUnion for their consumer credit risk analyses using their proprietary TrenData aggregated quarterly time series extracted from their large national database of consumer credit files.  They use the models generated by the discovery system to forecast payment delinquency rates, bankruptcy filings, average balances and other consumer borrower characteristics that constitute risk exposure for lenders.  Hickey has also used the system to discover the underlying sociological and demographic factors responsible for the secular long-term market dynamics of food products and other nondurable consumer goods.

Much of the success of the METAMODEL system in these market analyses is due to Hickey’s Institutionalist approach in economics.  A review of the membership roster of the National Association of Business Economists (NABE) reveals that economists in private industry are almost never employed in the consumer nonfinancial-services and consumer nondurable-goods sectors of the economy that lie outside the financial, commodity, or cyclical industrial sectors.  This is due to the education offered by the graduate schools, which is restricted to neoclassical economics, a kind of romanticist ideology having the status of an orthodox theology.  Employers in the consumer nondurable-goods and nonfinancial-services sectors, whose output accounts for approximately half of the U.S. national Gross Domestic Product, have no need for neoclassical orthodoxy.  They have no need for the macroeconomic aggregate-income theory of the business cycle, and very limited need for the microeconomic relative-price theory of commodities.  Microeconomic theory treats all industries as commodity industries in which there is only price competition, to the exclusion of all franchise or branded products where advertising and other forms of nonprice competition prevail.  And it treats aggregate income as the only aggregate factor, to the exclusion of the many underlying sociodemographic factors considered by the Institutionalist economist.  The doctrinairism of the neoclassical academic economists imposes on their graduates a very high opportunity cost in lost employment opportunities.  And it has also created an occupational vacuum, which Institutionalist economists like Hickey have exploited financially.

Hickey also used his METAMODEL system to develop a macrosociometric Institutionalist model of the American national society with fifty years of historical time-series data.  From 1978 to 1982 Hickey submitted a paper describing this macrosociometric model to four sociological journals.  The paper was acceptable on empirical grounds.  But to the chagrin and dismay of academic sociologists it is not a social-psychological theory.  Hickey was unable to break through the sociologists’ obstructionist complacency barrier, and all of the journals rejected the model for publication.  The paper is reprinted in Appendix I, the referees’ critiques and Hickey’s rejoinders are in Appendix II, and Appendix III is a critique of the sociological literature.

Hickey describes his macrosociometric model as a “post-classical” functionalist theory.  The term “classical” when applied in a science is not a proper name for an historical period like “mediaeval”.  It is better described as the name for a style of thought, or more precisely for analyses using certain basic premises.  It is a relative term like “liberal” or “conservative”, which change with shifts in the political spectrum.  Yesterday’s liberal has often become today’s conservative.  Likewise in science “classical” refers to the immediately preceding view that has been superseded by a new and current one due to a scientific revolution.  Furthermore “classical” cannot be assigned to an historical period, because like an artistic style its residual characteristics often linger for many decades.  In economics “classical” originally referred to the pre-marginalist economists, but Keynes referred to the marginalist economists who preceded his macroeconomics as “classical”.  The classical premises he rejected included Say’s Law, which says that supply creates its own demand, and the full-employment equilibrium outcome of the optimum allocation of resources that was postulated by relative-price theory, later known as microeconomics.  Likewise in physics Niels Bohr referred to a “classical physics” that included relativity theory as well as Newtonian physics but preceded the Copenhagen quantum theory.  The premises of classical physics included determinism, whereas the indeterminacy equations of quantum theory are stochastic.

Similarly Hickey uses “classical” to describe sociological thought in the same manner that sociologist Donald Black used it in his address to the American Sociological Association in 1998, reported in his “The Purification of Sociology” article in Contemporary Sociology.  Black stated that sociology is classical, because its explanations of social behavior are (1) teleological, i.e., in terms of means and goals, (2) psychological, i.e., in terms of subjective mental motivations, and (3) individualistic, i.e., in terms of individual persons.  Black proposed a scientific revolution in sociology in the manner described by Thomas Kuhn, and noted that sociology has never had such a revolution in its short history.  In his macrosociometric modeling Hickey dispenses with all three of these premises of classical sociology.  Hickey refers to the romantic sociology with its social-psychological reductionism as “classical”, because his macrosociological quantitative-functionalist theory supersedes the prevailing social-psychological reductionism, and manifests a basic discontinuity in sociological thought, as evidenced by the criticisms of the orthodox journal referees, who recognize Hickey to be a heretic.


Hendry and Doornik’s AUTOMETRICS Discovery System

In the “Introduction” of their Empirical Model Discovery and Theory Evaluation: Automatic Selection Methods in Econometrics (2014) David F. Hendry and Jurgen A. Doornik, of the Program in Economic Modeling at the Institute for New Economic Thinking at the Oxford Martin School, write that automatic model selection has “come of age.”  Indeed computational philosophy of science is the future that has arrived, even if it is called by other names when practiced by scientists working in their special fields instead of in philosophy or cognitive psychology.

But the news has been slow to get around.  For example in April 2009 the journal Science reported that robotics engineer Hod Lipson and computational biologist Michael Schmidt of Cornell University’s Creative Machines Lab had created a symbolic-regression genetic algorithm that they call the “Eureqa Machine” (pronounced “eureka”).  Their computer system found invariants in the motion of the double pendulum and outputted Newton’s second law of motion, F = ma, in just a few hours of run time.  It was later given data on yeast cells and developed equations that made highly original and successful predictions that do not relate to existing knowledge in microbiology.  The achievements were also reported in the Guardian, which naïvely announced that for the first time a machine had independently made scientific discoveries.  The Guardian reporter was blithely oblivious to the several routinely functioning discovery systems developed over the preceding fifty years.

Hendry was head of Oxford University’s Economics Department from 2001 to 2007, and is presently director of the Program in Economic Modeling at the Institute for New Economic Thinking at the Oxford Martin School.  Doornik is a colleague at the Institute.  These authors explore mechanized determination of the equation specifications for econometric models with their automated computer system AUTOMETRICS, which is contained in their PcGive software package.  The authors’ automatic model selection takes econometrics beyond the Haavelmo agenda, which viewed econometrics as merely the empirical testing of economic theory.  In their summarizing “Epilogue” the authors write that much of the effort of an empirical study is devoted to theorizing about the relevant joint density to explain the economic behavior of interest, selecting the measured variables, incorporating the historical and institutional knowledge of the epoch, and building on previous empirical findings.  But they add that without unjustifiable assumptions of omniscience these steps are insufficient, and they maintain that empirical model discovery inevitably requires search outside the pre-existing framework.

Hendry and Doornik state that an automatic program can outperform experts in formulating models when there are many candidate variables, possibly long lag lengths, potential nonlinearities, outliers, data contamination, or parameter shifts of unknown magnitudes at unknown points in time.  It also outperforms manual selection by its ability to explore many search paths and thus handle many variables, yet achieve high success rates.  Furthermore, despite selecting from a large number of candidate variables, an automatic selection method can achieve desired targets for incorrectly retaining irrelevant variables, and still deliver near-unbiased estimates of policy-relevant parameters.  They call their AUTOMETRICS discovery-system design a “structured path search”, which is controlled by a variety of model-selection criteria.  Their structured-path-search design is more efficient than a combinatorial approach.  Like Simon, they maintain that a combinatorial design is too extensive to be feasible, although they observe that feasibility is in conflict with generality.
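AUTOMETRICS’ algorithm is detailed in the authors’ book; the sketch below merely illustrates the general idea of a multi-path, general-to-specific search, with the model representation (e.g., a frozenset of retained variables) and the supplied test functions assumed for illustration, not Hendry and Doornik’s actual procedure.

```python
def structured_path_search(general_model, insignificant, reduce, still_valid):
    """Multi-path general-to-specific search: starting from a general
    model, follow a separate reduction path for each deletable variable
    and keep every terminal model whose diagnostic tests still pass."""
    terminals, seen = set(), set()
    frontier = [general_model]
    while frontier:
        model = frontier.pop()
        deletable = insignificant(model)  # variables eligible for deletion
        if not deletable:
            terminals.add(model)          # no further reduction possible
            continue
        for var in deletable:             # branch into one path per deletion
            smaller = reduce(model, var)
            if smaller in seen:
                continue                  # this path was already explored
            seen.add(smaller)
            if still_valid(smaller):      # diagnostics remain satisfactory
                frontier.append(smaller)
            else:
                terminals.add(model)      # reduction failed; keep the parent
    return terminals
```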

Hendry and Doornik’s aims are modest and conservative; they express no plan or expectation to revolutionize theoretical economics.  They write that empirical model discovery aims to provide an extension of and improvement upon many existing practices in applied economics, but add that it is not a replacement for analytical reasoning or theory, which they say offers too many crucial insights to be sidelined.  But they also note that it is unwise to impose today’s theory on data, because tomorrow’s theory may be more complete and different, and new theory may lead to earlier theory-based evidence being discarded.  Thus they say that their strategy is for available theory to be “embedded” in the modeling exercise, to be retained in its entirety when it is complete and correct, while at the same time, by including a far larger number of candidate variables, allowing for the possibility that aspects absent from an abstract theory can be captured.  They suggest that embedding both Friedman’s monetarist theory and Modigliani’s Keynesian theory in a general model would have allowed a rapid resolution of their disagreement, perhaps with neither having the complete answer.  Hendry reports in his “Modeling UK Inflation, 1875-1991” in the Journal of Applied Econometrics (2001) that almost every theory – excess demand, monetary, cost-push, mark-up, imported, etc. – played a rôle, but that even when combined they failed to account for many of the major episodes of inflation and deflation experienced historically.


