On a global level, meeting the 2°C target agreed under the Paris Agreement requires a decrease in emissions of 40-70 percent (relative to 2010) by 2050. According to Bloomberg New Energy Finance, global green investments in 2017 exceeded 2016's total of $287.5 billion. With strong government policy support, China has experienced a rapid increase in sustainable investments over the past several years and is now the leader in global renewable investment. In addition, as of 2017, major wind projects were spread across the U.S., Mexico, the U.K., Germany and Australia.

Considering the global market at the beginning of the 21st century, sustainable or green investments have gone through three stages—Envirotech, Cleantech and Sustaintech (2017 White Paper, Tsing Capital Strategy & Research Center).

First, the envirotech stage was driven by environmental technology together with government policy and regulation. Envirotech investments have aimed to address traditional environmental issues, such as solid waste treatment, water treatment and renewable energy. In terms of business models, the envirotech business has been characterized by capital-intensive investments reliant on scaling up for competitive advantage.

After the emergence of the envirotech stage, cleantech investments described green investments driven by technological innovation and cost reduction. Examples include solar photovoltaics, electric vehicles, LEDs, batteries, semiconductors, and energy efficiency-related investments. The long research and development periods required by cleantech business models have created high technical barriers for competitors.

The latest evolution of green investments has been defined as the sustaintech stage, in which digital and cloud-based technologies are being applied to accelerate sustainable investments by removing environmental, energy and resource constraints. Venture capital has successfully funded sustaintech companies over the past several years, such as Opower, Nest, SolarCity and Tesla. Google acquired Nest for $3.2 billion in 2014, and Oracle acquired Opower for $532 million in 2016.

Considering the new business models, sustaintech firms have shifted towards less capital-intensive investments and the proliferation of disruptive technologies. Disruptive technologies such as the Internet of Things, Artificial Intelligence, Augmented Reality/Virtual Reality, Big Data, 3D Printing and Advanced Materials are playing a key role in sustainable development.

  • Internet of Things sensor technology is enhancing sustainability with regard to energy efficiency, water resources and transportation.
  • Artificial Intelligence technology, satellite imagery and computational methods are being used to improve predictions and enhance agricultural sustainability.
  • Augmented Reality and Virtual Reality (AR/VR) technologies have shown potential to transform business processes in a wide range of industries.
  • Big Data has been used to optimize energy efficiency and to reduce the cost of clean technologies related to solar panels and electric vehicles, for instance.
  • 3D Printing technology can improve resource efficiency in manufacturing and increase the use of green materials.
  • Advanced Materials technology can substitute recyclable materials for non-renewable resources, and it can also enable efficiency in power devices.

Indeed, $13.5 trillion of investment in energy efficiency and low-carbon technologies is needed by 2030 to meet the Paris Agreement's targets. Considering the investment landscape, despite the new investment frontier, we are still far from this target. The case for climate action has never been stronger.


Polanyi offers a deep historical study of how European societies based on traditional values of cooperation and social responsibility were transformed into modern secular societies. In Polanyi's terminology, social relations became embedded within the market, creating a market society driven by the imperative of commercialization, which makes money the measure of all things, including human lives. This transformation has affected all dimensions of human existence – politics, economics, society and, most importantly, our ways of thinking about these areas. In particular, modern economic theory is a product of historical forces, and provides an intellectual framework for glorifying the market as the best way of organizing our economic affairs.

I believe that understanding Polanyi is of great importance in understanding the conflict between the values and intellectual frameworks of market societies and traditional societies (and also Islamic ideal societies). Understanding how the great transformation took place also provides some clues as to how we can try to create the counter-revolution in thought and action that is needed to undo the damage caused by this commercialization of all spheres of human existence. Over the past decade, I have spent a lot of time thinking about and studying Polanyi. The links below provide an introduction to my papers, video-talks, and shorter posts about many aspects of Polanyi's work in The Great Transformation:

Summary: My 1000+ word summary of Polanyi's classic, "The Great Transformation: The Political and Economic Origins of Our Times", has been wildly popular, remaining constantly among the top ten on the RWER Blog since it was published nearly five years ago. I have recently (25/12/26) revised and updated the post to clean up extraneous elements and clarify the substance in light of readers' comments as well as my own improved understanding. Perhaps the most important element of this post is that it explains how living in a market society shapes our thoughts to conform with the commercialization it creates. Creating radical change requires the first step of liberating ourselves from these blinders, to be able to imagine radical alternatives. I have also recorded a 28m video-talk on this topic, which has been added to the original post.

Methodology: Moving forward from critique, Polanyi's analysis is based on methodological principles radically different from those currently in use. Understanding and implementing these principles would allow us to create a new approach to economics and social sciences. My 20 page paper explaining the three fundamental principles used by Polanyi was published in the WEA Journal: Asad Zaman (2016) 'The Methodology of Polanyi's Great Transformation.' Economic Thought, 5.1, pp. 44-63. A brief 1000 word explanation of this methodology is available in a WEA Pedagogy Blog post: The Methodology of Polanyi's Great Transformation. The post also provides a link to a 45m video lecture on this topic. (This lecture has been by far my most popular video-lecture, with more than 2000 views.) Polanyi's analysis provides the basis for a radically different approach to economics, which considers politics, society, environment, and economics as inter-related subjects which cannot be understood in isolation. One of the deep insights of Polanyi is that economic theory itself is a product of a power struggle between different social classes and cannot be understood outside its historical context.

Ecological Collapse: The relationship between the Great Transformation and the looming environmental catastrophe which threatens the future of humanity on planet Earth is discussed in Zaman, Asad, "Unregulated Markets and the Transformation of Society" Chapter 18, Routledge Handbook of Ecological Economics: Nature and Society. Editor Clive Spash. 2016. Major points made in this 5000 word paper are summarized in my earlier post on "Markets and Society" which also provides links to the full paper and a 50min Video-Talk on this topic. Very briefly, markets generate profits by appropriating and exploiting resources, eventually exhausting them, before moving on to the next frontier. The dynamics of growth are such that it is threatening to exhaust the last remaining frontiers at the planetary level, leading to collapse. This topic is also addressed in my paper on "Evaluating the Costs of Growth" Real World Economics Review, issue 67, 9 May 2014, pp. 41-51. Available at SSRN: https://ssrn.com/abstract=2499115.

Islamic Economics: One of the central themes of Polanyi is the opposition between values of traditional societies and those of Market Societies. Islamic Economics is aligned with traditional values and opposes the commercialization generated by market societies. Studying these contrasts leads to a sharper understanding of the underlying principles of an Islamic Economy. These relationships are clarified in my 30 page essay on   “The Rise and Fall of the Market Economy,” Review of Islamic Economics, Vol. 14, No. 2, 2010, pp. 123–155. A brief explanation is also available from a post on “The Great Transformation in European Thought” in my “Islamic WorldView Blog”. A longer 5000 word explanation, meant as an entry for an Encyclopedia of Islamic Economics, was never published: The Limits of Market Economy.

Four Lectures on Polanyi: In my Advanced Micro class, I covered “The Great Transformation” in detail in four lectures listed below. Each lecture is about 90 minutes. The links provide both video-recording and transcripts of the lecture for faster reading.

  1. L16: From Hunter-Gatherer to World War 2
  2. L17: The transition from traditional paternalistic and regulatory economies to the market economy.
  3. L18: Three Artificial Commodities – Labor, Land, Money. Analysis of Social Change.
  4. L19: Devastating Impact of Unregulated & Expanding Markets, and how to reverse the Great Transformation – concluding lecture on Polanyi.

In addition to the longer articles/talks above, some short previous posts on the WEA Pedagogy Blog deal with topics related to Polanyi; these are listed below.

Meta-Theory and Pluralism in the Methodology of Polanyi: Post explains the meta-theoretical methodological stance of Polanyi. Polanyi is concerned with the process of social change. He analyzes how theories emerge as attempts by different social classes to understand, explain, control, and harness for their own benefit, changes which are created by external drivers. Thus, his is a meta-theory which studies the emergence of theories about economics, society and politics, and the impact of these theories on the alignment of power between different social groups.

The Neo-Liberal Way of Life: Madi’s post explains how the market society molds our way of life, as well as our ways of thinking, in accordance with Polanyi’s conception of “embeddedness” – that is, social relations are embedded within economic relations in a market society.

Hunter-Gatherer Societies: The idea that political and social structures of a society depend on the economic relations of production is cleanly demonstrated in context of primitive hunter-gatherer societies. This shows how economic theories are situated within historical context, unlike scientific theories which are universal invariants. It also shows the impossibility of analyzing economics in isolation from political, social and historical context.

Three Methodologies: The differences between contemporary, Marxist, and Polanyian methodology are clarified in this post. Contemporary economics treats the economy like a physical system subject to laws which are independent of what observers think – that is, economic theories do not affect the laws governing the economic system. Marx tells us that the economic relations of production are primary, and give rise to the social and political systems. Also, economic theories emerge to justify the powerful (capitalist) classes. Thus economic theories are born out of their historical context. This is well illustrated by the Hunter-Gatherer Societies. Polanyi argues for two-way interactions. Economic theories are born out of the historical context as a result of the struggle for power between different classes. At the same time, these theories are used to explain and control the economic system, so that theories actually influence the behavior of the economic system. For example, Marx's theory of communism influenced the structure of the economies of Russia and China. This idea, that economic theories influence the behavior of the economic system, is alien both to modern economics and to Marx, since material determinism excludes human will and interpretation from influencing the behavior of economic systems. However, human agency is at the heart of Polanyi's analysis. For a link to more materials and a 90m video lecture on this topic, see: Advanced Micro Lecture 15: 19th Century European History

Entanglement of the Objective and Subjective: Western epistemology is built on numerous false dualities which deeply damage our ability to understand the world we live in. The sharp separation of body and soul, of unobservable motivations and observable behaviors, of normative and positive, and of objective and subjective, are just a few examples. As the philosopher Hilary Putnam has said, facts and values are inextricably entangled within the body of economic theory. We cannot separate the two, as economists assume and assert. Many authors have realized how numerous unappealing value judgments are built into the foundations of objective-seeming economic theories. See, for example, "The Normative Foundations of Scarcity," Real-World Economics Review, issue no. 61, 26 September 2012, pp. 22-39, which shows how three major value judgments are involved in making scarcity the fundamental concern of economists. This post shows how the objective and subjective are inextricably entangled, which means that economists must take human agency into account, instead of treating humans as robots subject to mathematical laws of behavior. For a link to more materials, and a 90m video lecture on this topic, see: Advanced Micro Lecture 13: Entanglement of History and Economic Theories

An earlier (unsuccessful) attempt at organizing material on Polanyi: (to be updated later)

Current food challenges involve issues ranging from land and food access to commodity price volatility, as well as national and international regulation. Although the scope and intensity of these challenges vary according to the different economic and social situations of countries, the debate has been global.

Today, once again, these issues raise deep concerns in the wake of the 2017 WTO ministerial conference that has just closed in Buenos Aires, Argentina. Indeed, the WTO does not seem to have advanced effective action on long-standing proposals. Agriculture negotiations remain among the most important and challenging issues. These negotiations began in 2000 as part of the mandated "built-in agenda" agreed at the end of the 1986-1994 Uruguay Round and were then incorporated into the Doha Round launched at the end of 2001.

The process of globalization of capital in agriculture and food production has shaped a global network of institutions that supplies the worldwide food markets. Contract farming and integrated supply chains are deeply transforming the structure of the agriculture and food industries and, as a result, have put the local farm sector under high pressure. Further, the expansion of big investment projects, led by transnational companies and institutional investors, has exposed small farmers to hunger and food insecurity by expelling them from the land where they live. In addition to these challenges, the biotech revolution and the introduction of genetically improved varieties of seeds have fostered structural changes.

While the agriculture and food systemic changes are linked to financial and trade flows – mainly profit-driven – international organizations and non-governmental organizations have shaped hunger reduction projects. More recently, for example, poverty and hunger reduction targets have been included in the Millennium Development Goals (MDGs) of the United Nations Development Program (UNDP). In truth, hunger and poverty are correlated issues. They are primarily linked to land access, income distribution, employment and food prices, among other factors.

In this scenario, even with the global financial crisis, international prices for agricultural commodities remained substantially above historical averages. Several factors contributed to these high prices: growth of the world's population, growth of Chinese GDP and the urbanization of China. As a result, at the end of the 2000s, the FAO predicted the global challenge of "a decade of high food prices" and pointed out the need to increase food production.

Since 2014, global commodity crop prices have come back to pre-food-crisis levels. Indeed, the pre-crisis rise in food prices drew investment into agriculture, mainly in the U.S., Brazil, Argentina, Ukraine and other exporters of commodity crops such as corn and soybeans. However, according to the Institute for Agriculture and Trade Policy (IATP), American exports of corn, soybeans, wheat and cotton have been characterized by significant "dumping margins".

What seems relevant to recall is that the financialization of crop prices and their volatility are systemic challenges. In the face of these challenges, there has been a global increase not only in the vulnerability of small farmers but also in the number of chronically hungry people – which now amounts to more than 800 million. Considering this background, after a decade of high prices, current low crop prices and dumped crops – without effective WTO proposals and actions – will drive the most vulnerable people even deeper into hunger and poverty.

References

FAO. The future of food and agriculture – Trends and challenges. 2017. Rome.

Institute for Agriculture and Trade Policy. Excessive Speculation in Agriculture Commodities: Selected Writings from 2008–2011. Ben Lilliston and Andrew Ranallo (Editors). IATP, 2011. Available online at: http://www19.iadb.org/intal/intalcdi/PE/2011/08247.pdf. Accessed 29 July 2016.

United Nations. The Millennium Development Goals Report 2012. Available online at: http://www.un.org/millenniumgoals/pdf/MDG%20Report%202012.pdf. Accessed 20 April 2016.

WTO. 2017 Ministerial Conference. Agriculture. https://www.wto.org/english/thewto_e/minist_e/mc11_e/briefing_notes_e/bfagric_e.htm

Talk at PIDE Nurturing Minds Seminar on 29th Nov 2017. Based on “Lessons in Econometric Methodology: Axiom of Correct Specification”, International Econometric Review, Vol 9, Issue 2.

Modern econometrics is based on logical positivist foundations, and looks for patterns in the data. This nominalist approach is seriously deficient, as I have pointed out in Methodological Mistakes and Econometric Consequences. These methodological defects are reflected in sloppy practices, which result in huge numbers of misleading and deceptive regression results — nonsense or meaningless regressions. The paper and talk below deal with one very simple issue regarding the choice of regressors which is not explained clearly in textbooks and leads to serious mistakes in applied econometrics papers.

BRIEF SUMMARY OF TALK/PAPER:

Conventional econometric methodology, as taught in textbooks, creates serious misunderstandings about applied econometrics. Econometricians try out various models, select one according to different criteria, and then interpret the results. The significance of the fact that interpretations are only valid if the model is CORRECT is not highlighted in textbooks. The result is that everyone presents and interprets their models as if the model were correct. This relaxed assumption – that we can assume correct any model that we put down on paper, subject to minor checks like high R-squared and significant t-stats – leads to dramatically defective inferences. In particular, ten different authors may present ten different specifications for the same variable, and each may provide an interpretation based on the assumption that his model is correctly specified. What is not realized is that there is only one correct specification, which must include all the determinants as regressors, and also exclude all irrelevant variables (though this is less important). This means that out of the millions of regressions based on different possible choices of regressors, only one is correct, while all the rest are wrong. Thus all ten authors with ten different specifications cannot be right – at most one of them can be. In this particular case, at least 90% of the authors must be wrong. This applies generally to models published in journals – the vast majority of the different specifications must be wrong.
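The combinatorics behind "millions of regressions" are straightforward (a quick illustration I am adding here, not from the paper itself): each candidate variable is either included or excluded, so k candidates generate 2^k possible specifications, only one of which contains exactly the true determinants.

```python
# Number of distinct regressor subsets for k candidate variables:
# each variable is either in or out, giving 2**k possible specifications.
def num_specifications(k):
    return 2 ** k

print(num_specifications(10))   # 1024
print(num_specifications(20))   # 1048576 -- over a million
```

With only twenty candidate regressors there are already over a million possible models, which is why informal specification searches have essentially no chance of landing on the unique correct one.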

Now the question arises as to how much difference this Axiom of Correct Specification makes. If we can get approximately correct results, then perhaps the current relaxed methodology is good enough as a starting point. Here the talk/paper demonstrates that if one major variable is omitted from the regression model, then anything can happen. Typically, completely meaningless regressors will appear to be significant. For instance, if we regress the consumption of Australia on the GDP of China, we find a very strong regression relationship with R-squared above 90%. Does this mean that China's GDP determines 90% of the variation in Australian consumption? Absolutely not. This is a nonsense regression, also known as a spurious regression. The nonsense regression is caused by the OMISSION of an important variable – namely Australian GDP, which is the primary determinant of Australian consumption. A major and important assertion of the paper is that the idea that nonsense regressions are caused by INTEGRATED regressors is wrong. This means that the whole theory of integration and co-integration, developed to resolve the problem of nonsense regression, is searching for solutions in the wrong direction. If we focus on solving the problem of selecting the right regressors – ensuring inclusion of all major determinants – then we can resolve the problem of nonsense or meaningless regressions.
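This omitted-variable effect is easy to reproduce in a small simulation (a sketch I am adding for illustration; the variable names and numbers are hypothetical, not taken from the paper). Two series that share nothing but a common time trend produce an enormous R-squared when one is regressed on the other, and the irrelevant regressor adds almost nothing once the true determinant is included:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
t = np.arange(n)

# Two trending series with independent noise: think "Australian
# consumption" and "Chinese GDP" -- related only through the trend.
aus_gdp = 100 + 0.5 * t + rng.normal(0, 2, n)       # true determinant
aus_cons = 10 + 0.8 * aus_gdp + rng.normal(0, 2, n)
chn_gdp = 50 + 1.5 * t + rng.normal(0, 5, n)        # irrelevant regressor

def r_squared(y, X):
    """OLS R-squared, with an intercept column added automatically."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Omitting the true determinant: a "nonsense" regression with huge R^2.
print(r_squared(aus_cons, chn_gdp))                 # close to 1

# After controlling for the true determinant, the irrelevant regressor
# explains almost nothing extra.
both = r_squared(aus_cons, np.column_stack([aus_gdp, chn_gdp]))
only_true = r_squared(aus_cons, aus_gdp)
print(both - only_true)                             # tiny increment
```

The high R-squared in the first regression comes entirely from the shared trend; nothing about integration is needed to generate it, which is the point of the argument above.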

Next we discuss how we can ensure the inclusion of all major determinants in the regression equation. Several strategies currently in use are discussed and rejected. One of these is Leamer’s strategy of extreme bounds analysis, and some variants of it. These do not work in terms of finding the right regressors. Bayesian strategies are also discussed. These work very well in the context of forecasting, by using a large collection of models which have high probabilities of being right. This works by diversifying risk – instead of betting on any one model to be correct, we look at a large collection. However, it does not work well for identifying the one true model that we are looking for.
The best strategy currently in existence for finding the right regressors is the General-to-Simple (GeTS) modeling strategy of David Hendry. This is the opposite of the standard simple-to-general strategy advocated and used in conventional econometric methodology. There are several complications which make this strategy difficult to apply, and it is because of these complications that it was considered and rejected by econometricians. For one thing, if we include a large number of regressors, as GeTS requires, multicollinearities emerge which make all of our estimates extremely imprecise. Hendry's methodology has resolved these, and many other difficulties which arise in the estimation of very large models. This methodology has been implemented in the Autometrics package within the PC-GIVE software for econometrics. This is the state-of-the-art in automatic model selection, based purely on statistical properties. However, it is well established that human guidance, where the importance of variables is decided by human judgment about real-world causal factors, can substantially improve upon automatic procedures. It is very possible, and happens often in real-world data sets, that a regressor which is statistically inferior, but is known to be relevant on either empirical or theoretical grounds, will outperform a statistically superior regressor which does not make sense from a theoretical perspective. A 70m video-lecture on YouTube is linked below. PPT slides for the talk, which provide a convenient outline, are available from SlideShare: Choosing the Right Regressors. The paper itself can be downloaded from "Lessons in Econometric Methodology: The Axiom of Correct Specification".
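As a rough illustration of the general-to-simple idea (my own toy sketch, not Hendry's Autometrics algorithm, which adds diagnostic testing and multi-path searches), one can start from the full set of candidate regressors and repeatedly delete the least significant one:

```python
import numpy as np

def gets_backward(y, X, names, t_crit=2.0):
    """Crude general-to-simple search (illustrative only): begin with
    all candidate regressors, repeatedly drop the one with the smallest
    |t|-statistic, and stop when every survivor is significant."""
    keep = list(range(X.shape[1]))
    while keep:
        Z = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        sigma2 = resid @ resid / (len(y) - Z.shape[1])
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Z.T @ Z)))
        tstats = np.abs(beta[1:] / se[1:])      # skip the intercept
        worst = int(np.argmin(tstats))
        if tstats[worst] >= t_crit:
            break                               # all survivors significant
        keep.pop(worst)
    return [names[i] for i in keep]

# Simulated data where only x0 and x2 actually matter.
rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)
print(gets_backward(y, X, ["x0", "x1", "x2", "x3", "x4"]))
```

Even this naive version tends to retain the true determinants; the complications mentioned above (multicollinearity, path dependence of deletions) are exactly what Hendry's full methodology is designed to handle.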

 

For this post on my personal website: asadzaman.net, see: http://bit.do/azreg

For my lectures & courses on econometrics, see: https://asadzaman.net/econometrics-2/

For a list of my papers on Econometrics, see: Econometrics Publications

 

 

The title is an inversion of De-Finetti’s famous statement that “Probability does not exist” with which he opens his famous treatise on Probability. My paper, discussed below, shows that the arguments used to establish the existence of subjective probabilities, offered as a substitute for frequentist probabilities, are flawed.

The existence of subjective probability is established via arguments based on coherent choice over lotteries. Such arguments were made by Ramsey, De-Finetti, Savage and others, and rely on variants of the Dutch Book, which show that incoherent choices are irrational – they lead to certain loss of money. So every rational person must make coherent choices over a certain set of specially constructed lotteries. The subjectivist argument then shows that every coherent set of choices corresponds to a subjective probability on the part of the decision-maker. Thus we conclude that rational decision-makers must have subjective probabilities. This paper shows that coherent choice over lotteries leads to a weaker conclusion than the one desired by subjectivists. If a person is forced to make coherent choices for the sake of consistency in certain specially designed environments, that does not "reveal" his beliefs. The decision-maker may arbitrarily choose a "belief", which he may later renounce. To put this in very simple terms, suppose you are offered a choice between the exotic fruits Ackee and Rambutan, neither of which you have tasted. Then the choice you make will not "reveal" your preference. But preferences are needed to ensure the stability of this choice, which allows us to carry it over into other decision-making environments.

The distinction between "making coherent choices which are consistent with quantifiable subjective probabilities" and actually believing in these subjective probabilities was ignored in the era of dominance of logical positivism, when the subjective probability theories were formulated. Logical positivism encouraged the replacement of unobservables in scientific theories by their observational equivalents. Thus unobservable beliefs were replaced by observable actions according to these beliefs. This same positivist impulse led to revealed preference theory in economics, where unobservable preferences of the heart were replaced by observable choices over goods. It also led to the creation of behavioral psychology, where unobservable mental states were replaced by observable behaviors.

Later in the twentieth century, Logical Positivism collapsed when it was discovered that this equivalence could not be sustained. Unobservable entities could not be replaced by observable equivalents. This should have led to a re-thinking and re-formulation of the foundations of subjective probability, but this has not been done. Many successful critiques have been mounted against subjective probability. One of them (Uncertainty Aversion) is based on the Ellsberg Paradox, which shows that human behavior does not conform to the coherence axioms which lead to the existence of subjective probability. A second line of approach, via Kyburg and his followers, derives flawed consequences from the assumption of the existence of subjective probabilities. To the best of my knowledge, no one has directly provided a critique of the foundational Dutch Book arguments of Ramsey, De-Finetti, and Savage. My paper entitled "Subjective Probability Does Not Exist" provides such a critique. A one-hour talk on the subject is linked below. The argument in a nutshell is also given below.

The MAIN ARGUMENT in a NUTSHELL:

Magicians often "force a card" on an unsuspecting victim — he thinks he is making a free choice, when in fact the card chosen is one that has been planted. Similarly, subjectivists force you to create subjective probabilities for an uncertain event E, even when you avow lack of knowledge of this probability. The trick is done as follows. I introduce two lotteries. L1 pays $100 if event E happens, while lottery L2 pays $100 if E does not happen. Which one will you choose? If you don't make a choice, you are a sure loser, and this is irrational. If you choose L1, you reveal a subjective probability P(E) greater than or equal to 50%. If you choose L2, you reveal a subjective probability P(E) less than or equal to 50%. Either way, you are trapped: rational choice over lotteries ensures that you have subjective probabilities. There is something very strange about this argument, since I have not even specified what the event E is. How can I have subjective probabilities about an event E when I don't even know what the event E is? If you can see through the trick, bravo for you! Otherwise, read the paper or watch the video. What is amazing is how many people this simple sleight-of-hand has taken in. One very important consequence of the widespread acceptance of this argument was the removal of uncertainty from the world. If rationality allows us to assign subjective probabilities to all uncertain events, then we only face situations of RISK (with quantifiable, probabilistic uncertainty) rather than genuine uncertainty, where we have no idea what might happen. Black Swans were removed from the picture.
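The coherence requirement behind all of this is the Dutch Book itself: if your announced prices for bets on E and not-E do not sum to one, a bookie can lock in a riskless profit against you whatever E turns out to be. A minimal sketch of the arithmetic (my own illustrative numbers, not from the paper):

```python
# A minimal Dutch-book illustration (numbers are hypothetical). Suppose
# you post prices p_E and p_notE at which you will buy or sell a ticket
# paying $1 if the corresponding event occurs. If the prices do not sum
# to 1, a bookie can guarantee a profit against you.

def dutch_book_profit(p_event, p_not_event, stake=1.0):
    """Bookie's guaranteed profit per $1 ticket pair when your prices
    for E and not-E are incoherent. If the prices sum to more than 1,
    the bookie sells you both tickets; if less, the bookie buys both.
    Exactly one ticket pays out, so the $1 payoff changes hands exactly
    once -- the remainder is pure profit (zero if you are coherent)."""
    total = p_event + p_not_event
    return abs(total - 1.0) * stake

# Incoherent prices: P(E) = 0.6 and P(not E) = 0.6 sum to 1.2, so the
# bookie sells both tickets for $1.20 and pays out exactly $1.00.
print(round(dutch_book_profit(0.6, 0.6), 10))   # 0.2, whatever E is

# Coherent prices leave no arbitrage.
print(round(dutch_book_profit(0.6, 0.4), 10))   # 0.0
```

Note what the arithmetic does and does not establish: it shows that coherent prices must behave like a probability, not that the chooser actually believes those numbers, which is precisely the gap the paper exploits.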

Olivier Blanchard and Lawrence Summers have recently called for reflection on the macroeconomic tools required to manage the outcomes of the 2008 global crisis in their paper Rethinking Stabilization Policy: Back to the Future. The relevant question they address is: should the crisis lead to a rethinking of both macroeconomics and macroeconomic policy similar to what we saw in the 1930s or in the 1970s? In other words, should the crisis lead to a Keynesian approach to macroeconomic policy, or will it reinforce the agenda suggested by mainstream macroeconomics since the 1990s?

Since the 1990s, mainstream macroeconomics has largely converged on a view of economic fluctuations that has become the basic paradigm of research and macroeconomic policy. According to this view, fluctuations result from small, unexplained random shocks to components of demand and supply, with linear propagation mechanisms that do not prevent the economy from returning to the potential output trend. In such a world of regular fluctuations: (1) dynamic stochastic general equilibrium (DSGE) models are used to develop structural interpretations of the observed dynamics, (2) optimal policy is mainly based on monetary feedback rules – such as the interest rate rule – while fiscal policy is avoided as a stabilization tool, (3) the role of finance is often centered on the yield curve, and (4) macroprudential policies are not considered.
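The kind of monetary feedback rule referred to here can be sketched as a Taylor-type interest rate rule (the coefficients 1.5 and 0.5 are the conventional textbook values from Taylor's 1993 formulation, not taken from Blanchard and Summers):

```python
def taylor_rule(inflation, output_gap, r_star=2.0, pi_target=2.0):
    """Taylor (1993)-style interest rate feedback rule: the nominal
    policy rate responds more than one-for-one to inflation deviations
    (the 'Taylor principle') and positively to the output gap.
    All quantities are in percentage points."""
    return r_star + inflation + 1.5 * (inflation - pi_target) + 0.5 * output_gap

print(taylor_rule(2.0, 0.0))   # inflation at target, zero gap: 4.0 (neutral)
print(taylor_rule(4.0, 1.0))   # above-target inflation and boom: 9.5 (tighten)
```

In the pre-crisis paradigm described above, a rule of this form was taken to be sufficient for stabilization, which is exactly the presumption Blanchard and Summers question.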

As the real world of financial crises does not fit this representation of fluctuations, Blanchard and Summers, following Romer's characterization of regular DSGE shocks as "phlogistons", argue that the image of financial crises should be "more of plate tectonics and earthquakes, than of regular random shocks". This happens for a number of reasons: (1) financial crises are characterized by non-linearities that amplify shocks (for instance, bank runs), (2) one outcome of financial crises is a long period of depressed output followed by a permanent decrease in potential output relative to trend, as the propagation mechanisms do not converge to the potential output trend, and (3) financial crises are followed by "hysteresis", either through higher unemployment or lower productivity.

Almost ten years after the 2008 crisis, among the "non-linearities" that have led to the current deep policy challenges, Blanchard and Summers also highlight

  • The large and negative output gaps in many advanced economies, in addition to low growth, low inflation, low nominal interest rates and falling nominal wages.
  • The interaction between public debt and the banking system, a mechanism known as "doom loops": higher public debt might lead to public debt restructuring, which might in turn decrease the level of banks' capital and thereby increase concerns about their liquidity and solvency.

Considering the current policy challenges, they suggest avoiding both a return to the pre-crisis agenda and the adoption of what they call "more dramatic proposals, from helicopter money, to the nationalization of the financial system". In their view, there is a need to use macro policy tools to reduce risks and stabilize adverse shocks. As a result, they suggest:

  • A more aggressive monetary policy, providing liquidity when needed.
  • A more active use of fiscal policy as a stabilization macroeconomic tool, besides a more relaxed behavior in relation to fiscal debt consolidation.
  • A more active financial regulation.

It is interesting that Blanchard and Summers mention the importance of Hyman Minsky in warning about the special role of the complexity of finance in contemporary capitalism. However, in defense of their proposal, they should have remembered the Minskyan concern: who will benefit from this policy agenda?

Any policy agenda  refers to forms of power: there are tensions between private money, consenting financial practices and national targets that emerge in the context of  the neoliberal global governance rules.

Indeed, almost ten years after the 2008 global financial crisis, it is time to rethink the contemporary political, social and economic challenges in a broader and longer perspective. Power, finance and global governance are powerful interrelated issues that shape livelihoods.

Lecture 5 of Advanced Microeconomics at PIDE. The base for this lecture is Hill & Myatt's Anti-Textbook, Chapter 4, on Consumer Theory.

Hill and Myatt cover three criticisms of conventional microeconomic consumer theory.

  1. Economic theory considers preference formation as exogenous. If the production process also creates preferences via advertising, this is not legitimate.
  2. Consumers are supposed to make informed choices leading to increased welfare. However, deceptive advertising often leads consumers to make choices harmful to themselves. The full-information condition assumed by economics is not valid.
  3. Economic theory is based on methodological individualism, and treats all individuals separately. However, many of our preferences are formed within a social context, which cannot be neglected.

Before discussing modern consumer theory, it is useful to provide some historical context.

1. Historical Background:

In a deeply insightful remark, Karl Marx said that capitalism works not just by enslaving laborers to produce wealth for capitalists, but by making them believe in the necessity and justness of their own enslavement. The physical and observable chains tying the exploited are supplemented by the invisible chains of theories which are designed to sustain and justify existing relationships of power. Modern economic consumer theory is an excellent illustration of these remarks.