Read about some of the presentation and discussion topics from previous conferences.
10 May 2013
"Workfare Programme and Household Economic Security: Credibility & Concern - Findings from Household Level Panel Data on World’s Largest Workfare Programme of India"
In 2006, to enhance livelihood security, the Government of India launched the workfare programme National Rural Employment Guarantee Scheme (NREGS), later known as Mahatma Gandhi NREGS, which guarantees 100 days of unskilled employment on demand to rural households. While such employment programmes are not new, the current scheme contains several innovative elements. These include its sheer scale (an annual budget of $8.91 billion and job access to almost 50 million households on average in each year since 2006), a rights-based approach to development, a self-targeting mechanism, the stipulation that a third of the jobs should go to women, monitoring through community-led social audits, and decentralized implementation. Based on three waves (2009, 2010, and 2012) of household level longitudinal data, this paper estimates the household level impact of this self-targeted employment guarantee programme. Apart from different panel estimation techniques, we also estimate average causal effects of different lengths of exposure of the participating households to the programme using the Generalized Propensity Score (GP Score) with continuous treatment, under the assumption that selection into different lengths is based on a rich set of observed covariates and time-invariant factors.
"A Unified Model with Inter- and Intra-Industry Trade"
This paper builds a more general unified model with both inter- and intra-industry trade. Inter-industry trade arises, as in Heckscher-Ohlin, from the variation of comparative advantage by country and industry, and intra-industry trade arises, as in Krugman, from consumers’ love of variety and firms’ increasing returns to scale. By integrating the two approaches, we are able to examine trade patterns both between countries with similar factor endowments and between countries whose factor endowments differ. As our model takes account of more forces in a unified analytical framework than the existing models, it predicts fuller patterns of trade, in terms of trade directions, volumes, and their welfare implications, that are more consistent with the real world.
"A Similarity Based Approach to Forecasting the Variance-Covariance Matrix"
Variance-covariance matrices (VCM) of a vector of asset returns are of particular importance for risk management and asset pricing. Although the modelling of the VCM has improved substantially in the last decade, one largely unexplored area is the relation between macroeconomic developments and variations in the VCM. This study proposes a modified version of the Dynamic Conditional Correlation (DCC) model, which allows the long-run correlation to be time-varying. The time-varying long-run correlation is modelled by a multivariate kernel determined by a set of economic variables. A simulation study shows the superior estimation performance of the proposed model over the standard DCC in cases in which the VCM indeed varies with economic factors. Economic and statistical loss functions compare the forecasting performance with other methods proposed in the literature.
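The modification described above can be sketched in standard DCC notation; the symbols below ($a$, $b$ for the DCC parameters, $\bar{Q}_t$ for the kernel-weighted long-run target) are illustrative assumptions rather than the paper's own notation:

```latex
Q_t = (1 - a - b)\,\bar{Q}_t + a\,\varepsilon_{t-1}\varepsilon_{t-1}^{\prime} + b\,Q_{t-1},
\qquad
R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t \,\operatorname{diag}(Q_t)^{-1/2},
```

where $\varepsilon_t$ are the standardised returns and, in place of the constant target $\bar{Q}$ of the standard DCC, $\bar{Q}_t$ is a weighted average of sample correlations with weights given by a multivariate kernel in the economic state variables.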
"Multiple Break Point Estimation in Linear Econometric Models"
This research considers discrete multiple changes in the parameters of linear econometric models with endogenous regressors, estimated via the Generalised Method of Moments (GMM). The existing literature refers to these discrete multiple changes as “break points”, and the focus of this research is to determine their location by consistently estimating the “break fractions” which index these break points. Hall et al. (2012) show that minimising the GMM criterion over all sets of possible break points yields inconsistent estimators of the true break fractions. This study, in contrast, proposes an alternative method of obtaining estimators of the break fractions, still using GMM. The Wald, Lagrange Multiplier and Difference test statistics are obtained for each candidate break point and the supremum of these test statistics is chosen; the methodology uses the location of this supremum to determine the true break fractions. Preliminary Monte Carlo simulations strongly indicate that this methodology yields consistent estimators of the true break fractions in the cases of one- and two-break linear models, and future work will attempt to formally establish the asymptotic properties of these estimators.
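As a rough illustration of the sup-statistic idea, not the paper's GMM implementation, the sketch below estimates a single break fraction by maximising a Chow/Wald-type statistic over trimmed candidate break points, using plain OLS as a stand-in for the GMM machinery; all function names are illustrative:

```python
import numpy as np

def _ssr(y, X):
    """Sum of squared residuals from an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return e @ e

def sup_wald_break(y, X, trim=0.15):
    """Estimate a single break fraction by maximising a Wald-type
    statistic for a parameter change over candidate break points."""
    n, p = X.shape
    ssr_r = _ssr(y, X)  # restricted model: no break
    best = (-np.inf, None)
    for k in range(int(trim * n), int((1 - trim) * n)):
        # Unrestricted model: separate parameters on each side of k.
        ssr_u = _ssr(y[:k], X[:k]) + _ssr(y[k:], X[k:])
        stat = (n - 2 * p) * (ssr_r - ssr_u) / ssr_u
        if stat > best[0]:
            best = (stat, k / n)
    return best  # (sup statistic, estimated break fraction)
```

In the paper's setting the statistic would instead be built from the GMM criterion and its moment conditions, and the search extended to multiple breaks.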
"Financial Development and Economic Growth: The Role of Financial Liberalization"
This paper provides empirical evidence on the significance of the role of financial reforms in determining the impact of financial development on long-run growth. We use a dynamic panel data analysis for 88 countries over the period 1973 to 2005. Our index of financial reform covers seven aspects: credit controls and reserve requirements, interest rate controls, entry barriers, state ownership, policies on securities markets, banking regulations and restrictions on capital markets. The results indicate that the positive effect of financial development on long-run growth is conditional on the level of financial reforms.
"Corruption and Political Turnover in a Sovereign Default Model"
This paper looks at government spending, bureaucratic corruption, political turnover and centralized government borrowing. Endogenizing corruption in a sovereign default model, we show that sovereign borrowing and default decisions are affected by corruption. We find that, once government spending and corruption are taken into account, the risk of sovereign default increases through the negative effect of corruption on bond prices, government spending and borrowing. Another result is that, in terms of the economy's welfare, a more corrupt policy maker would borrow more to achieve the same level of welfare for the economy than a less corrupt policy maker would. A further finding is that when a negative shock hits the economy, less corruption leads to a quicker improvement in household consumption than more corruption, whereas bureaucratic consumption rises more quickly when there is more corruption. Counterintuitively, we establish that a change in government from more corrupt to less corrupt is more likely to cause default than the reverse. This suggests that, with political turnover, a more corrupt government will not default so easily, as government officials may face prosecution if the government defaults and there is a change in power. As a result, the corrupt government may want to hold onto power and continue with its corrupt deeds. With a change to a less corrupt government, however, it is optimal to announce default. Finally, our empirical results support the negative effect of corruption on sovereign default decisions.
"Excess Reserves, Monetary Policy and Financial Volatility with Credit Market Frictions"
This paper examines the financial and real effects of excess reserves in a New Keynesian Dynamic Stochastic General Equilibrium (DSGE) model with monopoly banking, credit market imperfections and a cost channel. The model explicitly accounts for the fact that banks hold excess reserves and incur costs in holding these assets. Simulations of a shock to required reserves show that although raising reserve requirements is successful in sterilizing excess reserves, it creates a procyclical effect on real economic activity. This result implies that financial stability may come at the cost of macroeconomic stability. The findings also indicate that using an augmented Taylor rule in which the policy interest rate is adjusted in response to changes in excess reserves reduces volatility in output and inflation but increases fluctuations in financial variables. By contrast, using a countercyclical reserve requirement rule helps to mitigate fluctuations in excess reserves but increases volatility in real variables.
"Income Distribution and Human Capital Investment: The Role of Corruption"
This paper presents an analysis of the role of corruption in determining the distribution of income and, with this, the degree of poverty and inequality. The analysis is based on an overlapping generations model in which individuals may seek to improve their productive efficiency (and hence earnings) by supplementing or substituting publicly provided services (such as education and health) with personal expenditures on human capital investment. Because of capital market imperfections, their ability to do this depends on their inherited wealth which serves as collateral for loans. Corruption is reflected in the pilfering of public funds and a reduction in public service provision, the effect of which is to reduce the earnings of those who rely on such services and to exacerbate the extent of credit rationing for these agents. The dynamic general equilibrium of the model is characterised by multiple steady states to which different income classes converge. Higher levels of corruption lead to higher levels of poverty and may result in complete polarisation between the rich and poor by eliminating the middle class.
"Child Labour, Intra-Household Bargaining and Economic Growth"
This paper develops a three-period, gender-based overlapping generations model of endogenous growth with endogenous intra-household bargaining and child labour in home production by girls. Improved access to infrastructure reduces the amount of time parents find optimal for their daughters to spend on household chores, thereby allowing them to allocate more time to studying at home. The model is calibrated for a low-income country and various quantitative experiments are conducted, including an increase in the share of public spending on infrastructure, an increase in time allocated by mothers to their daughters, and a decrease in fathers’ preference for girls’ education. Our analysis shows that poor access by families to infrastructure may provide an endogenous explanation for the persistence in child labour at home and gender inequality in low-income countries.
The conference was held jointly with Lancaster University.
10 - 11 May 2012
"The Environmental Kuznets Curve at Different Levels of Economic Development and Distributions of Income"
The Environmental Kuznets Curve (EKC) theory posits that the early stages of a country's developmental process are associated with increasing environmental damage. However, after the attainment of a threshold level of income, progress leads to greening. Advocates of this theory prescribe economic growth as both the cause of and the cure for environmental degradation, thereby downplaying the role of environmental policies in mitigating pollution. Methodologically, the conventional techniques used in investigating the hypothesis focus on estimating the rate of change in the mean distribution of emissions as a function of income, and are thus incapable of capturing country heterogeneity within the (panel) sample investigated. We employ a relatively new technique in Kuznets Curve estimation - the quantile fixed effects method - in our exploration of the CO2 Kuznets Curve. Further, we extend decomposition methods largely employed in labour economics research to the Kuznets Curve framework to explain the most important factors accounting for the OECD-Non-OECD emissions gap. Our study confirms the existence of the EKC in the global, OECD, Western, Latin American and East Asian samples. Additionally, we find that the OECD countries emitted about 60 to 369 percent more CO2 than their Non-OECD counterparts. In sum, we suggest that concerted efforts are needed to mitigate CO2 pollution and that policies promoting economic progress should move in tandem with those promoting greening.
"Health, Labor Supply and Productivity: An Econometric Study of Cambodian Farm Households and Arsenic Consumption"
The consumption of arsenic through contaminated groundwater and agricultural crops in South-East Asia is a major public health concern, exacerbated by the encouragement of groundwater well construction. Whilst the mortality and morbidity attributable to exposure to geogenic arsenic through drinking, cooking or consumption of rice and other foods are well known and increasingly studied, the economic impacts of geogenic arsenic exposure have been less well investigated. We present the framework and results of an econometric study of the impacts of arsenic consumption in rural Cambodia on household decision making, particularly labour supply choices, and farm productivity. The research analyses how the health impacts of arsenic consumption are manifested in economic decision making and agricultural production, utilizing household data from the Cambodian Socio-Economic Survey (CSES) along with village level arsenic estimates.
"Energy from Microgeneration: Sustainability and Perceptions in the UK"
The drive for climate change mitigation and more secure energy supplies has led to government support of the microgeneration industry in the UK. Microgeneration is the small scale production of heat and/or electricity from a low carbon source such as solar panels, wind turbines and heat pumps. Despite government targets and incentive schemes, microgeneration uptake in the UK remains low, with only 180,000 homes, out of over 25 million, having microgeneration installations. To achieve greater microgeneration uptake, significant demand-side barriers must be reduced, including high capital costs, long payback times, large space requirements within the home, the personal effort required and the perceived bureaucracy associated with planning and installation. The main research question is: can microgeneration contribute to meeting UK climate change, energy security and fuel poverty targets and, if so, how can greater uptake be achieved? The research is framed into three sub-questions: how sustainable is microgeneration; what are the motivations and barriers associated with microgeneration adoption in the UK; and, if microgeneration is sustainable, where does the greatest potential lie in increasing uptake and maximising the benefits? Methodological tools to be used within this research are life cycle analysis (LCA), best-worst scaling and choice experiments. Thus far research has focussed on determining the environmental sustainability of microgeneration and understanding consumer motivations and barriers associated with microgeneration adoption. The presentation will illustrate the work carried out so far and the design of a best-worst scaling survey due to be implemented in the coming months.
"Is India's Outward FDI Consistent with Dunning's Investment Development Path Sequence?"
The purpose of the study is to examine whether India's Outward Foreign Direct Investment (OFDI) pattern suggests consistency with Dunning's Investment Development Path (IDP) sequence or represents a refinement of the established theories. The question is addressed using macro level data on India's growth and investment position over the period 1980-2010. We test whether the level of development, proxied by GDP per capita, is the main factor explaining OFDI, and also extend the hypothesis to examine other major determinants of OFDI — exports, GDP per person (a proxy for labour productivity), Inward FDI (IFDI) and R&D. We adopt the Cointegration and Error Correction Model technique to carry out the analysis. The issues raised here have an important policy implication for developing nations. Should countries seeking internationalization wait for their per capita incomes to grow before undertaking OFDI, or should they invest at an early stage of their development, forming an exception to the IDP theory? The latter emphasizes the importance of other contributing factors, apart from the income level of a country. The paper also contributes towards establishing whether there is a two-way causal relationship between OFDI and the explanatory factors, using the Granger Causality test. An interesting finding is that OFDI Granger-causes R&D, suggesting a possible reverse technology spillover effect. It also calls for further research to determine whether R&D spillovers have a positive impact on domestic productivity, the so-called Feedback Effect.
"The Maximin Value Allocation: Interpretation and Properties"
There are three main reasons, related to Walrasian general equilibrium, to employ maximin (non-Bayesian) preferences, instead of the standard Bayesian ones, for the agents of a Radner (1968) partition-type differential information exchange economy. All of them originate from the private asymmetric information measurability requirement inherent in such an economy. All of them are jointly discussed and formally presented in a recent paper of Castro-Yannelis (2010), who, taking this measurability condition for granted, argue that: (i) Bayesian agents have incomplete preferences because they are only able to compare private information measurable individual allocations; (ii) the efficiency of the general equilibrium outcomes is reduced since we lose the non private information measurable Pareto optimal allocations of this economy; and (iii) it seems "unreasonable" to assume that the agents are Bayesian in the first place, since they cannot possibly assign a probability of occurrence to states between which they cannot distinguish. By adopting Gilboa and Schmeidler's (1989) maximin preferences for the agents of this economy, the aforementioned issues are naturally resolved. On these grounds, we pursue, in an "appropriately" well defined exchange economy of this type, able to accommodate the agents' maximin preferences, a cooperative Walrasian equilibrium allocation: the maximin (Shapley, 1969) value allocation. We do this by extending the (Bayesian) private value allocation of Krasa-Yannelis (1994). This maximin value allocation concept was also introduced by Castro-Yannelis (2010), but it was not axiomatized. By imposing a set of assumptions on the maximin expected utility of the agents trading (ie writing contracts) in such an economy, we prove existence of the maximin value allocation (ie contract). We also establish its other normative maximin properties: Pareto optimality, individual rationality and incentive compatibility.
"Liminal Exponential Discounting"
A decision maker's propensity to forgo current utility for future utility is captured by their discount rate. The classical model of decision making over time, exponential discounting, assumes that the discount rate is constant. This paper introduces a new model we call Liminal Discounting, which generalizes the exponential discounting model in a simple way, yet can accommodate preferences exhibiting decreasing or increasing impatience. An individual with such preferences has a constant rate of time preference up to some threshold point in time. After this threshold the rate may change, but it then remains constant at the new rate. Such preferences are stationary before and after the threshold, and these long periods of stationarity make the model especially tractable for economic applications. Violations of stationarity, such as present bias, may occur when comparing the near and distant future. Our main theorem provides a preference foundation for the Liminal Discounting model. The theorem is proved within the standard framework, so the model is a genuine generalisation of exponential discounting; in particular, the threshold time arises as a consequence of our preference axioms.
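Under the description above, the discount function would take a piecewise-exponential form; the notation here ($\rho_1$, $\rho_2$ for the pre- and post-threshold rates, $\tau$ for the threshold) is illustrative, not the paper's own:

```latex
D(t) =
\begin{cases}
e^{-\rho_1 t}, & t \le \tau, \\
e^{-\rho_1 \tau - \rho_2 (t - \tau)}, & t > \tau,
\end{cases}
```

so that $D$ is continuous at $\tau$, discounting is exponential on each side of the threshold, and $\rho_2 < \rho_1$ yields decreasing impatience (while $\rho_2 > \rho_1$ yields increasing impatience); $\rho_1 = \rho_2$ recovers standard exponential discounting.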
"Foundations for Prospect Theory through Probability Midpoint Consistency"
For the famous prospect theory model there is hitherto no preference foundation for general sets of outcomes. All existing models assume a rich structure for the set of outcomes and propose preference conditions that hinge upon that structure. Yet in many important applications where prospect theory is assumed, such as health or insurance, the set of outcomes lacks such structure. In these more general settings it is unclear what preference conditions are required, beyond the standard assumptions, to pin down prospect theory. This paper proposes a consistency principle for elicited probability midpoints that requires a consistent treatment of probabilities of gains and, similarly, a consistent treatment of probabilities of losses. We show that, in the presence of the other standard preference conditions, this consistency principle implies prospect theory.
"The von Neumann-Gale Model of Financial Markets"
We consider a model of a stochastic financial market that generalizes the classical model by including transaction costs, portfolio constraints and a new notion of hedging. Our model is based on the framework of von Neumann-Gale dynamical systems - the theory that was originally related to the problem of optimal redistribution of resources in a growing economy. The central question of the model consists in determining whether it is possible to hedge a contract (ie to make certain payments at certain dates) starting from a given initial endowment. We provide a criterion of hedging that generalizes the celebrated Fundamental Theorem of Asset Pricing and the Risk-Neutral Pricing Principle. As an example, we apply the criterion to a specific model of a stock market.
"Bank Capital Regulation, Credit Friction and the Cyclical Behaviour of Interest Rates"
This paper examines the macroeconomic effects of bank capital regulation in a simple Dynamic Stochastic General Equilibrium (DSGE) model with credit market imperfections. A key feature of this model is the derivation of the bank loan rate and the endogenous probability of default from break-even conditions. We show that in a model which accounts for bank capital and bank capital regulation in the form of the Basel Accords, the endogenous probability of default impacts the lending rate through multiple channels. We also define the Basel I and Basel II regulatory regimes in terms of the calculation of the risk weight attached to loans, with a distinction made between the Foundation Internal Ratings Based (IRB) and Standardized approaches of Basel II. Our simulation results suggest that the Basel II regulatory regime amplifies the response of macroeconomic variables following both supply and monetary shocks when compared to Basel I, while a comparison between the two variants of Basel II depends on the nature of the shock.
"A Comparative Assessment of Adaptive Learning Algorithms as Representative of Macroeconomic Expectations Formation"
Adaptive learning algorithms have been proposed to provide a bounded rationality view of agents' process of expectation formation and as the means through which expectations shocks provide another source of business cycle fluctuations. Despite the preeminence of the Least Squares (LS) algorithm as the representative of agents' learning, it is now understood that the dynamic properties resulting from adaptive learning depend on the chosen algorithm. We evaluate the empirical plausibility of assuming the LS algorithm as representative of agents' macroeconomic expectations formation by comparing it with a computationally simpler alternative, namely the Stochastic Gradient (SG) algorithm. The latter has been proposed as a more stringent alternative to the LS in terms of bounded rationality. We assess this hypothesis with an empirical exercise comparing the performance of forecasts provided by these algorithms, as well as their resemblance to forecasts obtained from surveys. This is done within the context of vector autoregressions with time-varying coefficients for inflation and growth using US real-time data. We find mixed evidence on the comparison of forecasting performance, with scarce support for statistically significant differences between these algorithms. Compared to the survey forecasts, however, we find that the forecasts from the LS algorithm bear a closer resemblance to the survey forecasts than those from the SG algorithm. A thorough analysis of the initialization of these algorithms is also covered as a by-product of our study.
Maria Paola Rana
"Organized Crime, Corruption and Economic Growth: An Empirical Analysis for the Case of Italy"
The paper examines the impact of corruption on economic growth in the presence of organized criminal activities. Using panel data on 20 Italian regions for the period 1961-2009, the analysis reveals (i) a growth-inhibiting effect of both corruption and organized crime, and (ii) that in the presence of organized crime the impact of corruption is less severe. This finding supports the argument that, with organized corruption arrangements and better coordination of bureaucrats' rent-seeking behaviour, corruption is less distorting for economic growth. The results are robust to different specifications and different estimation methods.
"The Role of the State: Taxation and the Composition of Public Spending"
The paper responds to the prevalent empirical observation that, as an economy grows, the share of government expenditure spent on transfer payments also increases. In my model, the state sets policy by choosing a tax rate and the composition of its spending; specifically, it chooses the ratio between public goods provision and transfer payments. In essence, public goods propel growth while transfer payments mitigate the utility inequality between agents who are heterogeneous in wealth. I argue that there exists a habit formation process on transfer payments that causes the state to choose a policy that harms the growth of the society. This habit induces a welfare loss in the economy and poses a financial challenge to the state.
"Instability and Human Capital Formation"
This paper focuses on the impact that instability of the macroeconomic and political environment has on human capital formation. The transmission mechanism is analysed in the context of an OLG model. With the help of some calibration exercises, instability is shown to have a positive effect on the ratio between unskilled and skilled wages. The resulting decrease in the skill premium discourages education decisions and leads to low levels of human capital accumulation. These predictions are subsequently tested empirically and confirmed by the results of a cross-section panel data analysis.
"Inequality without Imperfection: The Role of Loss Aversion"
We construct a stochastic model to demonstrate how individuals' loss-averse preferences can result in multiple equilibria without the need for capital market imperfections. Individuals face a decision of whether to invest in a risky wealth-enhancing project. We find that an increase in uncertainty, or in the preference to avoid losses, increases the critical value of wealth needed for individuals to invest in these projects. Hence, if initial wealth is distributed heterogeneously, initial inequality may persist and increase over time.
Emma Apps (University of Liverpool)
"Contagion in VaR amongst UK Financial Institutions"
My discussion focuses on the use of "Value-at-Risk" in defining a financial institution's exposure to systemic market risk. Specifically, its application results in the estimation of potential maximum losses in times of sustained market turbulence. Furthermore, I suggest that the current methodologies are, at best, basic but necessary, and ultimately failed to adequately quantify the huge financial losses sustained post-2007. The objective of the research is to introduce the concept of "Conditional Value-at-Risk", a measure that attempts to identify how the risk of one institution may increase when a second institution falls into distress. This suggests that, rather than quantifying your risk in isolation, you should consider the negative "risk-spillover" effects of other financial institutions, ie the contribution of individual financial institutions to the risk of the whole financial system. Consequently, I develop the theory that time-varying Co-VaR and forward delta Co-VaR are more appropriate bases for the setting of regulatory risk constraints on financial institutions.
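A minimal sketch of how a CoVaR-type measure can be estimated by quantile regression of system returns on an institution's returns; the function names and the simple pinball-loss optimiser below are illustrative assumptions, not the author's methodology:

```python
import numpy as np
from scipy.optimize import minimize

def quantile_reg(y, X, q):
    """Linear quantile regression via numerical pinball-loss minimisation."""
    def loss(beta):
        u = y - X @ beta
        return np.mean(np.maximum(q * u, (q - 1) * u))
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting values
    return minimize(loss, beta0, method="Nelder-Mead").x

def delta_covar(sys_ret, inst_ret, q=0.05):
    """CoVaR: the system's q-quantile conditional on the institution
    sitting at its own VaR; Delta-CoVaR: the shift relative to the
    institution's median state."""
    X = np.column_stack([np.ones_like(inst_ret), inst_ret])
    a, b = quantile_reg(sys_ret, X, q)
    var_q = np.quantile(inst_ret, q)       # institution's VaR at level q
    var_med = np.quantile(inst_ret, 0.5)   # institution's median state
    covar = a + b * var_q                   # system VaR given institution distress
    return covar, b * (var_q - var_med)     # (CoVaR, Delta-CoVaR)
```

A time-varying version, as discussed above, would additionally condition the quantile regression on lagged state variables so that both coefficients and quantiles move over time.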
Anwen Zhang (University of Lancaster)
"Returns to Education for the Self-Employed in England and Wales: An Attempt to Correct Income Underreporting"
The paper attempts to correct for income underreporting by self-employed workers in two UK surveys by adopting an expenditure-based approach. A functional form of the Engel curve is first estimated for employee households, then applied to self-employed households to infer their true earnings from self-employment from their food and fuel expenditures. Further, the inferred self-employment earnings are used to estimate rates of return to schooling and vocational qualifications. The results show evidence of self-employment income underreporting and further suggest that underreporting substantially affects the estimates of the rate of return to education for the self-employed.
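The expenditure-based idea can be sketched as follows, assuming a log-linear Engel curve in the spirit of Pissarides and Weber (1989); all function and variable names are illustrative, and the real estimation would of course include household controls:

```python
import numpy as np

def underreporting_factor(log_food_emp, log_inc_emp, log_food_se, log_inc_se):
    """Fit an Engel curve on employee households, then use the excess
    food spending of self-employed households (relative to what their
    reported income predicts) to infer an income multiplier k, where
    true income = k * reported income."""
    # Engel curve for employees: log food = a + b * log income + error
    X = np.column_stack([np.ones_like(log_inc_emp), log_inc_emp])
    (a, b), *_ = np.linalg.lstsq(X, log_food_emp, rcond=None)
    # Average gap between observed food spending of the self-employed
    # and the level their reported income would predict
    gap = np.mean(log_food_se - (a + b * log_inc_se))
    return np.exp(gap / b)  # k > 1 indicates underreporting
```

The inferred earnings (k times reported earnings) would then feed into the returns-to-education regressions described above.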
Vasileios Pappas (University of Lancaster)
"Analysing Default Risk in Islamic Banks"
This paper compares the level and determinants of default risk in Islamic banks versus commercial banks located in 20 Middle and Far Eastern countries. Survivor and hazard functions are estimated conditional on annual firm-level accounting data pertaining to 421 banks and country-level macroeconomic fundamentals over the period 1995 to 2010. Islamic banks are shown to be better positioned in terms of failure risk than commercial banks, suggesting that they contribute favourably to the overall stability of the financial system. The survival rates are shown to be driven by firm-level balance sheet, income statement and financial ratios, inter alia, but with different sensitivities for the two banking systems. Latent factors raise the chance of within-country co-default for commercial banks but not for Islamic banks.
"Structural Breaks in International Inflation Linkages for OECD countries"
The paper proposes an iterative structural break testing methodology, which aims to provide more reliable inference for structural break tests applied to conditional mean and conditional variance parameters by iterating between these two components, while also taking account of outliers. This iterative testing procedure is applied to both univariate and bivariate inflation models; the first is employed to examine the stability of domestic inflation, and the second to investigate changes in the linkage between global and domestic inflation over time. The empirical analysis uses monthly Consumer Price Inflation (CPI) for 19 OECD countries over the period 1970M01 to 2010M09, and the following key results emerge. First, the univariate models yield fewer mean breaks than, but broadly consistent with, the existing literature. We also document clusters of variance breaks occurring around the mid-1970s, early 1980s and early 1990s, whereas only clusters of mean breaks have been documented widely so far. Second, and more importantly, we find a positive and increasing contemporaneous relationship between domestic and country-specific global inflation. These results may be informative with regard to the co-movement of inflation across countries.
"Daily Volatility of Large-Dimensional Portfolios in Financial Markets with Asynchronous Trading and Microstructure Noise"
Volatility plays an important role in financial econometric models. We use high-frequency data to study the daily variance-covariance matrix of asset returns in large-dimensional portfolios in financial markets with market microstructure noise and non-synchronous trading. Our focus is on portfolios where the number of assets is larger than the number of synchronised intra-daily observed returns. We first outline the theoretical shortcomings of conventional daily variance-covariance matrix estimators based on intra-daily returns, which motivates an introduction to, and evaluation of the performance of, recently suggested estimators that address these shortcomings. We use the multivariate realised kernel estimator, which addresses the consequences of microstructure noise, together with the blocking and regularisation method (Hautsch et al., 2008), which reduces the loss of data due to non-synchronous trades, and we discuss the implications of allocating the assets into irregularly sized blocks based on trading frequencies. We analyse the performance and applications of this estimator in a Value-at-Risk context and in forecasting the covariance matrix, using the Conditional Autoregressive Value at Risk by Regression Quantiles method as a measure to evaluate the performance of the blocking and regularisation method.
"First Order and Second Order Asymptotic Analysis of GEL Estimators for Grouped Data Model"
The existing estimation techniques for grouped data models can be analysed as a class of estimators of IV-GMM type, with the matrix of group indicators as the set of instruments. The econometric literature (e.g. Smith, 1997; Newey and Smith, 2004) shows that, in some cases of empirical relevance, the large sample behaviour of the GMM estimator can differ from its finite sample properties. GEL estimators have been developed that are not sensitive to the nature and number of instruments and possess improved finite sample properties compared to GMM estimators. In this paper, under the assumption that the data vector is iid within groups but inid across groups, we develop GEL estimators for the grouped data model with population moment conditions of zero mean errors in each group. First order asymptotic analysis shows that the estimators are consistent and normally distributed. The paper then explores second order bias properties, which demonstrate the sources of bias and the differences between choices of GEL estimators.
"Is the Retirement-Consumption Puzzle Solved in the UK?"
Empirical evidence in some developed countries has found that the life cycle hypothesis cannot fully explain why consumption falls at the retirement age for some individuals. This suggests that the individual's wealth at the retirement age is below the optimal level. However, the buffer-stock saving model, in which income is the driving force behind changes in consumption, can provide an explanation for this puzzle, as it is named in the literature; under the life cycle hypothesis, by contrast, consumption growth is independent of income. Applying a fuzzy Regression Discontinuity design to UK household survey data (the Family Expenditure Survey, 1968-2001; the Expenditure and Food Survey, 2002-2007; and the Living Cost and Food Survey, 2008-2009), I find that consumption fell substantially at the retirement age before the 1980s. The fall is less severe after the 1980s because of tax policies designed to boost pension income. Throughout the data period, however, the falls in consumption at the retirement age are fully explained by the falls in income, which contradicts the life cycle hypothesis but supports the buffer-stock model.
12 - 13 May 2011
"On The Efficiency of Trade in the Ethiopian Agricultural Markets (FFD)"
The 2007-09 food price inflation hit Ethiopia harshly; even as world food prices began a sharp drop, domestic prices kept surging. A long-standing source of price instability has been associated with market failure (Rashid and Assefa, 2006): the inability of agricultural markets to move food crops efficiently from surplus to deficit regions. The recognition that policies aimed at "getting prices right" in less developed countries were failing due to incomplete markets spurred a new wave of reforms, directed instead at "getting markets and institutions right". In this spirit, the Ethiopian Government began implementing post-structural reforms in the late 1990s in order to enhance the "3 I's of market development": Incentives, Infrastructure and Institutions (Gabre-Madhin, 2006). We ask whether the focus on the 3 I's is a necessary, but not sufficient, condition for market efficiency. Is the presence of many small traders in the Ethiopian agricultural markets a source of inefficiency? Should the Ethiopian Government restrict entry into agricultural trade and formalise trading activities? Using detailed data collected in 2006/07 by Gabre-Madhin, IFPRI and EDRI, this paper investigates whether agricultural trade in Ethiopia exhibits increasing returns. Following the approach of Fafchamps, Gabre-Madhin and Minten (2005) for Benin, Madagascar and Malawi, we test for the existence of increasing returns to load size and returns to scale in transport. Instead of price movements, the focus is on traders' marketing costs and margin rates. Findings of unexploited increasing returns to scale would provide evidence that public efforts aimed at concentrating trading activities would be beneficial: such efforts would increase the aggregate efficiency of agricultural marketing, eventually benefiting both farmers and final consumers.
"The Impact of Macroeconomic Volatility on Innovative Investment"
Bearing in mind the importance of investment in innovation and R&D for growth and development, this paper analyses econometrically the impact of macroeconomic volatility on private innovative investment. In the analysis, macroeconomic volatility is measured by the standard deviation of GDP per capita, the standard deviation of the interest rate, and the government deficit, the last serving as a possible proxy for the quality of a country's macroeconomic management. The results show that volatility has a negative impact on innovative investment when the effect is tested in a mixed DC-LDC panel. The coefficients cease to be significant once the same specification is tested on a panel of high-income countries. These findings lend credibility to the theory that cash-constrained SMEs' innovative investment tends to be negatively affected by volatility, whereas the impact is positive for big enterprises and MNCs, which are not cash-constrained. Indeed, it has been claimed that the latter find it more profitable to direct their focus towards research and to invest in R&D when the scope for sales is lower due to decreased aggregate demand, which is precisely what happens during recessions. The difference between SMEs and MNCs can be taken to represent, broadly, the difference in the industrial and productive landscape between developing and developed countries. This paper therefore sheds some light on the implications of macroeconomic volatility for development and growth when its relationship with innovative investment is negative.
"Corruption, Decentralisation and Economic Growth"
Corruption presents a major obstacle to economic development. This has led to extensive research on the subject, with economists, particularly in recent times, playing a central role. However, despite its high priority, the fight against corruption has been largely unsuccessful. There is growing realisation that many anti-corruption measures are ineffective and, in some circumstances, counter-productive. It seems clear that if the fight against corruption is to be successful, new approaches must be devised. Decentralisation has long been viewed as a means to improve the performance of government, and more recently there has also been a focus on the relationship between decentralisation and corruption. The potential benefits of decentralisation include improved information, greater accountability and transparency, and increased competition, traits widely viewed as vital in fighting corruption. However, decentralisation is a complex process: only if implemented in the right conditions and in the right way will these benefits arise, and it has also been suggested that decentralisation can increase corruption. Despite extensive empirical and theoretical research on the topic, the overall effect of decentralisation on corruption remains ambiguous. This paper investigates further the relationship between corruption, decentralisation and economic performance from a macroeconomic perspective, using a fully specified dynamic general equilibrium model in which, by bringing people closer to government, decentralisation enables bureaucrats to internalise the negative externalities caused by their behaviour. As a result, decentralisation can induce public officials to reduce the bribes they charge entrepreneurs, thereby lowering corruption and increasing economic growth.
Maria Paola Rana
"Organized Crime, Corruption and Economic Growth"
It is now well recognized that organized crime and corruption can obstruct economic growth and development; however, only a few empirical and microeconomic studies have considered these phenomena jointly. We develop a simple macroeconomic model in which criminal organizations co-exist with law-abiding productive agents and potentially corrupt law enforcers. The crime syndicate obstructs the legal operations of agents through extortion, and may pay bribes to law enforcers in order to avoid detection. An important implication is that the amount of extortion is higher under corruption, since the mafia needs to pay bribes to law enforcers. In this way, the presence of both organized crime and corruption increases the costs to society by deterring more individuals from setting up businesses.
"Varying Term Loan Contracts and Interest Rate Persistence"
This paper applies the literature on the effects of multiple fixed price and wage contracts on persistence (see Dixon and Kara, 2007, 2010) to fixed loan contracts and their effects on the persistence of real variables. Loan contracts are fixed, in both the interest rate and the loan amount, for a specified period which varies from sector to sector. As a result, as in Dixon and Kara (2007, 2010), wages and prices may be suboptimal for a given firm. Additionally, since a proportion of firms are unable to adjust their contracts in a particular period, their levels of employment, production and the loan rate may also be fixed at a predetermined suboptimal level. We aim to examine the impact that loan contracts of various durations have on the persistence of aggregate variables and on optimal monetary policy.
"Supply Shocks and the Cyclical Behaviour of Bank Lending Rates Under the Basel Accords"
This chapter examines the procyclical effects of bank capital requirements in a simple static general equilibrium model with credit market imperfections. A bank capital channel is introduced by assuming that bank capital buffers increase banks' incentives to screen and monitor borrowers more carefully, thus reducing the borrowers' probability of default and allowing banks to charge a lower interest rate on loans provided for investment purposes. The Basel I and Basel II regulatory regimes are defined in terms of the calculation of the risk weights on loans, with a distinction made between the Standardized and Foundation Internal Ratings Based (IRB) approaches of Basel II. We analyse the role of the bank capital channel in the transmission of changes in prices and of a supply shock, and show that all regulatory regimes amplify the procyclical effects on the lending rate following a productivity shock. Finally, it is crucial to know the direction in which prices move following a supply shock in order to compare the procyclical effects of the different regulatory regimes.
"The Mortality and Economic Costs of Particulate Air Pollution in Developing Countries: A Nigerian Investigation"
The value of statistical life is an essential parameter used in ascribing monetary values to the mortality costs of air pollution in health risk analyses. However, this willingness to pay estimate is virtually non-existent for most developing countries. In the absence of local estimates, two benefit transfer approaches lend themselves to the estimation of the value of statistical life: the back-of-the-envelope method and meta-regression. Using Nigeria as a sample country, a comparison of the two methods reveals that the back-of-the-envelope technique is very likely to underestimate the value of statistical life for very low-income countries. The meta-analytic approach therefore seems a more credible way of providing value of statistical life estimates for these countries, and yields a best estimate of $462,000 for Nigeria's willingness to pay for fatal risk reductions. Combining this estimate with dose-response functions from the epidemiological literature, it follows that had Nigeria mitigated its 2006 particulate air pollution to the World Health Organisation standard, it could have avoided at least 58,000 premature deaths and recorded a potential gross economic saving of about $27,000 million, or 18 per cent of the nation's GDP for that year.
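The headline figure follows directly from the two quoted numbers; a quick back-of-the-envelope check (the inputs are the values quoted in the abstract, not independently sourced):

```python
# Sanity check of the gross economic saving quoted above.
vsl = 462_000              # meta-analytic VSL estimate for Nigeria, in USD
premature_deaths = 58_000  # avoidable deaths at the WHO particulate standard

gross_saving = vsl * premature_deaths
print(gross_saving / 1e6)  # 26796.0 million USD, roughly the $27,000 million cited
```

This reproduces the abstract's order of magnitude exactly: 58,000 avoided deaths valued at $462,000 each come to about $26.8 billion.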
"Risk Perception of Food Safety Hazards"
This interdisciplinary research examines the food safety knowledge of consumers in order to investigate sporadic food poisoning in the home. The Food Standards Agency (FSA) estimates that foodborne illness in England and Wales costs the country approximately £1.5 billion a year; a reduction of 10,000 cases of infection per year is thought to represent an economic saving of £15 million per year. Whilst some food poisoning micro-organisms are well known thanks to their press coverage, Campylobacter is little known, yet it is the most commonly reported cause of bacterial gastrointestinal illness in the UK and indeed in Europe. This lack of media coverage is in part due to the sporadic nature of Campylobacter infection, in contrast to the outbreaks associated with Salmonella and E. coli O157. This paper provides a brief overview of my research questions and methodologies, before focussing on the development, and use, of an interactive food safety video challenge. This innovative survey technique has been developed to measure real-time knowledge and perception of hazards, in this case domestic food safety. Hazard perception testing of this kind can be described as Situation Awareness (SA), which brings together a range of cognitive processes: the active processing of situational information within short-term memory as new information is combined with existing knowledge and a composite picture of the situation is developed. The literature states that hazard perception driving tests assess implicit knowledge, that is, the non-conscious retrieval of previously acquired information, demonstrated by performance on tasks that do not require conscious recollection of past experiences.
A benefit of this type of assessment is that participants are less likely to be prone to social desirability bias: the real-time testing method is thought to reveal (food safety) knowledge and personal habit, rather than giving the participant time to consider what answers the researcher may want. Initial results of the survey will be discussed, highlighting the type of data extracted for analysis, such as hazards, hazard types and correct versus incorrect behaviours, alongside demographic variables.
"Product Differentiation Decisions Under Ambiguous Consumer Demand and Pessimistic Expectations"
The paper examines the effects of violating the common prior assumption embedded in the "Product differentiation and location decisions under demand uncertainty" model by Meagher and Zauner (Journal of Economic Theory). In particular, we discuss a situation in which the firms do not know the exact distribution of the location and price elasticity of consumer demand, but resolve the resulting ambiguity using the Arrow and Hurwicz alpha-maxmin criterion. When the firms are sufficiently pessimistic (alpha is high enough), the results contrast with the existing literature: an increase in demand location uncertainty decreases the equilibrium product differentiation, as well as the resulting second-stage equilibrium prices and profits for any realisation of consumer demand, although the effect is dampened by the possibility of a higher price elasticity of demand. Furthermore, pessimism can serve as a form of strategic deterrence, because any firm that can commit itself to a more pessimistic approach increases its equilibrium share of the market and becomes better off at the competitor's expense. However, this generates a Prisoner's Dilemma, since both firms lose when both become more pessimistic, suggesting that the presence of ambiguity can make the market more competitive.
"On The Bertrand Core and Equilibrium Of A Market"
A striking result in economic theory is that price competition between a small number of sellers producing a homogeneous good may result in the perfectly competitive market outcome. We analyse the formation of price-making contracts when there is the possibility of coalitional deviations from the market. We consider a market with a finite number of buyers and sellers and standard market primitives. In this context we introduce a new core notion, which we term the Bertrand core. A trading price is said to be in the Bertrand core if all sellers quoting this price constitutes an equilibrium and no subset of traders, buyers and sellers, can leave the market and improve their outcomes by trading among themselves. Under standard assumptions we show that the Bertrand core is non-empty. Moreover, we obtain a price-making analogue of the well-known Debreu-Scarf (1963) result by showing that, as the set of market traders is replicated, any price other than the competitive equilibrium can be blocked by some subset of traders, provided that the market is replicated sufficiently many times.
"The Endogenous Poverty Line: Existence and Implications"
This paper provides necessary and sufficient social preference conditions under which a poverty line can be shown to exist endogenously. In so doing, it turns out that the apparently independent "identification" and "aggregation" problems in poverty measurement are inseparable. A necessary condition for the existence of the poverty line is a weaker form of the well known Pigou-Dalton distribution principle.
"Aid Versus Remittance - Which Works Better?"
In this chapter we investigate whether less dependence on foreign aid and more dependence on enhanced remittance earnings could accelerate capital accumulation and growth in recipient countries. Using a panel of low- and middle-income countries, we present empirical evidence of quite opposite effects of these two forms of external capital on growth: foreign aid has a detrimental effect whereas international remittances have a favourable one. We argue that remittances have a favourable impact on growth because they are more effective than foreign aid in enhancing capital accumulation. Our theoretical model shows that foreign aid increases the economy's investment capital by increasing the subsidies the government gives to poor households. In the case of remittances, capital accumulation is even higher, since remittances flow directly to recipients whilst aid is channelled through public institutions that suffer from corruption.
Obbey A Elamin
"Discrete Time Duration Model Estimation Using Nonparametric Kernel Method"
This study employs the kernel method with a fixed bandwidth to estimate discrete time single risk conditional hazard models. It uses two flow samples, for employment and unemployment spells, with data on females of working age (16-59) from 18 waves of the BHPS. Transition out of the state is modelled conditional on the individual's age, fertility and education level. The study compares two kernel estimation approaches with the Mixed Proportional Hazard (MPH) approach. Unobserved heterogeneity is accounted for in the MPH models parametrically, as a mixing distribution, and nonparametrically, using the NPML estimation method. The first kernel approach uses the conditional density method to replicate the estimation of the MPH models. The second uses the weighted kernel method to handle right-censored data and an "external" approach to estimate the conditional hazard. Results from the latter indicate that the kernel external hazard approach is computationally adequate and less affected by spurious duration dependence, even after an initial period of negative dependence. The Mixed Proportional Hazard approach acquires this property only when the restrictive parametric specifications of the baseline hazard and the unobserved heterogeneity distribution are relaxed.
"Estimation and Inference in Grouped Data Models"
The use of group-averaged data has widened the scope of research using repeated cross section surveys. Following Angrist (1991), this study shows that estimators based on group-averaged data are equivalent to instrumental variable estimators with group indicators as instruments. Estimators in the literature can therefore be grouped into a class of IV-GMM type estimators. However, if the instruments are weak and/or many, conventional first order asymptotic theory may not provide a good approximation to the finite sample behaviour of the IV-GMM estimators and test statistics. Here I propose generalized empirical likelihood estimators for grouped data models, an alternative moment condition technique known to have improved finite sample properties. Assuming across-group heteroscedasticity and independence, group size asymptotics can be used to show the consistency and asymptotic normality of the estimators. Further research will provide comparative finite sample properties of the estimators for the model.
"The Effect of Social Safety Net Programmes on the Calorie Consumption Of Poor Households in Bangladesh: An Application of Regression Discontinuity Design"
Using the Bangladesh Household Income and Expenditure Survey 2005, this study examines the effect of social safety net programs on households' per capita daily calorie consumption. Through these programs, the Bangladesh government and various national and international agencies have been providing food, cash or both to poor households in Bangladesh since the famine of 1974. We seek to estimate how much these programs affect the well-being of poor households. Most previous studies, simply computing the raw differential, have found negative impacts of these programs on calorie consumption. However, both observable and unobservable characteristics bias this treatment effect. Using a fuzzy Regression Discontinuity (RD) design, we control for these selection effects and find a significant positive impact of the programs. Local linear regression in the RD setup, with the optimal bandwidth of Imbens and Kalyanaraman (2009), produces a best-estimate treatment effect of 1,132 kcal, which is 52 per cent of a household's per capita daily calorie consumption.
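The core of the RD estimate above is a local linear fit on each side of an eligibility cutoff. The following sketch, on simulated data, shows only that core step (a fixed bandwidth and triangular kernel are assumptions for illustration; the paper uses fuzzy RD with the data-driven Imbens-Kalyanaraman bandwidth, and in the fuzzy case the outcome jump is further divided by the jump in treatment take-up):

```python
import numpy as np

rng = np.random.default_rng(1)

def local_linear_at_cutoff(x, y, cutoff, h, side):
    """Weighted least squares fit of y = a + b*(x - cutoff) on one side of
    the cutoff, with triangular kernel weights of bandwidth h; returns the
    intercept, i.e. the fitted boundary value of y at the cutoff."""
    mask = (x >= cutoff) if side == "right" else (x < cutoff)
    u = x[mask] - cutoff
    w = np.clip(1 - np.abs(u) / h, 0, None)        # triangular kernel weights
    X = np.column_stack([np.ones_like(u), u])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y[mask]))
    return beta[0]

# Simulated running variable and outcome with a true drop of 1 at the cutoff.
x = rng.uniform(-1, 1, 2000)
y = 2.0 + 0.5 * x - 1.0 * (x >= 0) + rng.normal(0, 0.3, x.size)

jump = (local_linear_at_cutoff(x, y, 0.0, 0.4, "right")
        - local_linear_at_cutoff(x, y, 0.0, 0.4, "left"))
print(round(jump, 2))  # close to the true discontinuity of -1
```

The difference of the two boundary intercepts estimates the discontinuity in the outcome at the cutoff, which in the sharp RD case is the treatment effect itself.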
10 - 11 May 2010
Human Capital, Corruption and Economic Growth: A Cross Sectional Analysis
Abstract: This study explores a relatively new channel, corruption, in the link between human capital and economic growth. The empirical literature documents insignificant or even negative impacts of human capital on economic growth, casting doubt on its role. This study takes an innovative step in highlighting the mediating role of corruption in explaining this discouraging effect, by including an interaction between human capital and corruption in the growth equation. The underlying hypothesis is that, in the multiplicative interaction growth model, the direct effect of schooling on growth is positive, while a negative coefficient on the interaction of schooling and corruption indicates that the impact of schooling on growth depends on the level of corruption and may decline as corruption rises.
Crime, fertility and economic growth in endogenous growth models
Abstract: This paper examines the interrelationship between the crime level, the fertility rate, child rearing and economic growth, and how these are affected by public policy. We develop a three-period overlapping generations model in which reproductive agents face a non-zero probability of death in adulthood. As adults, agents allocate their time between work, leisure, child rearing and criminal activities. All adult agents engage in criminal activities and, as in Mauro and Carmeci (2007), thieves are both hunters and hunted. An agent's productivity depends on health status, which exhibits 'state dependence' in that it depends on health in childhood. We show that the fertility rate has a positive effect on crime and that a greater payoff from criminal activities induces a higher fertility rate. Greater government spending on police enforcement has an ambiguous effect on both the level of crime and economic growth, while it leads to a lower fertility rate. These results offer support to a variety of findings in the related empirical literature.
A New Keynesian Model with a Cost Channel and Relative Price Effects
Aggregate output in the standard New Keynesian literature, with flexible wages and constant returns, is assumed to be free of relative price dynamics, because it is driven by an aggregation of output that assumes symmetric prices. We show that when aggregate output is derived from the aggregation of individual product demands under sticky prices, relative price dynamics affect inflation and output. We show that, in the absence of labour heterogeneity and regardless of the returns to scale, the slope of the New Keynesian Phillips Curve is affected by the price elasticity of demand for individual goods. We then compare our model with the cost channel model of Ravenna and Walsh (2006). Our results indicate a flatter Phillips curve, which flattens further as individual firms lose their market power. Hence, as product markets become more competitive, demand and productivity shocks cause output fluctuations to have less of an impact on inflation movements, as well as a smaller expected welfare loss under a Taylor rule policy.
Financial Development and Economic Growth: The Role Of R&D
This study explores the conditional effects of financial development on growth by employing a system GMM estimator for dynamic panel data models, using a panel of 26 OECD and 10 non-OECD countries for the period 1980-2006. Besides looking at the direct effects of financial development and R&D on growth, we address the important question of whether economic policies should promote financial sector development or R&D activities (or both), by exploring the nature of the relationship between financial development and R&D: are they substitutes or complements? Second, we control for influential outliers that may hinder our understanding of the conditional effects of finance on growth via innovation or R&D. Our results show that the effect of financial sector development on economic growth decreases as we move from lower to higher levels of R&D expenditure, suggesting that in economies which spend more on R&D activities, finance performs less well in promoting economic growth.
On credible commitment in Bayesian oligopoly games
Consider a market in which firms producing gross substitute goods have differential information regarding the state of the market. In this context, it is typical to assume that a firm with superior information earns higher ex ante expected profits. We show that this is not necessarily the case: it is possible for a firm with an information disadvantage to earn the highest ex ante expected profits in a Bayesian equilibrium. However, when firms have constant returns to scale cost functions and are ex ante and ex post symmetric, the firm with superior information always earns higher ex ante expected profits. The results suggest that whether credible commitment is beneficial in price/quantity games is determined by the convexity/concavity of the cost function.
A New Class of Inter-temporal Poverty Measures
This paper introduces a new class of inter-temporal poverty measures based on any of a wide range of static poverty measures. An individual's level of inter-temporal poverty is computed as a weighted average of their snapshot poverty in each time period. The weight assigned to the level of poverty in each time period is determined by the number of periods of relative affluence directly preceding each poor period. A societal inter-temporal poverty measure is then obtained by aggregating across individuals. The measures are found to have a number of attractive properties, and axiomatic characterisations are provided.
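The weighting idea can be illustrated with a toy sketch (the specific weight function below, geometric decay in the length of the preceding affluent run, is hypothetical; the paper characterises a whole class of such measures axiomatically):

```python
def intertemporal_poverty(snapshots, decay=0.5):
    """Weighted average of per-period snapshot poverty levels (0 = not poor).
    Each poor period receives weight decay ** (number of non-poor periods
    directly preceding it), so a poor spell that follows a long run of
    relative affluence counts for less than one in a sustained poverty spell."""
    weights = []
    affluent_run = 0
    for p in snapshots:
        if p > 0:
            weights.append(decay ** affluent_run)
            affluent_run = 0
        else:
            weights.append(1.0)
            affluent_run += 1
    return sum(w * p for w, p in zip(weights, snapshots)) / sum(weights)

# Same total snapshot poverty, but the second path concentrates it in one
# uninterrupted spell, so it scores higher:
print(intertemporal_poverty([0.4, 0.4, 0.0, 0.0, 0.4]))
print(intertemporal_poverty([0.4, 0.4, 0.4, 0.0, 0.0]))
```

The comparison in the example captures the property motivating the class: chronic, uninterrupted poverty is measured as worse than the same snapshot poverty interspersed with periods of relative affluence.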
Food Safety Economics
Using Best Worst Scaling to Investigate Perceptions of Control & Concern over Food and Non-food Risks
This research locates a series of risks or hazards within a framework characterised by the level of control respondents believe they have over the risks, and the level of worry the risks prompt. It does this for a set of both food and non-food risks. The means by which this is done is novel and differs from past risk perception analyses in that it asks people directly about their relative assessments of the levels of control and worry regarding the risks presented. The cognitive burden associated with ranking and scaling items in large sets is notoriously heavy, and so this study uses an elicitation method designed to make the process intuitive and cognitively manageable for respondents.
The substantive analysis of the risk perceptions has four main foci, concerning the relative assessment of (i) novel as opposed to more familiar risks (e.g. swine flu vs. heart attack); (ii) food risks as opposed to non-food risks; (iii) perceived levels of control over the risks versus how worrying the risks are considered to be; and (iv) differences in risk perceptions across social groups. For the last of these, we analyse the relative assessments of farmers and consumers, with a particular focus on E. coli.
Food Risks And Food Poisoning: Awareness, Behaviours And Change
There are an estimated 9.5 million cases of food poisoning annually in the UK, costing the economy £0.75 billion. Whilst outbreaks of food poisoning from catering establishments can often be identified, the more prevalent problem of food poisoning arising from behaviour in the home, through poor hygiene, cross contamination and inadequate cooking or cooling, is less open to scrutiny and regulation. Education and the raising of awareness regarding food safety practices remain the more appropriate approaches to the management of this economic and social problem.
This interdisciplinary research examines the food safety behaviour of consumers in order to investigate sporadic food poisoning in the home, focussing particularly on the food poisoning pathogens Campylobacter and Salmonella.
This paper introduces the methodology that will be used in two key segments of the doctoral research:
- The use of innovative interactive, web-based, survey techniques to assess risk perception regarding food behaviours;
- A case-control study to identify differences in food behaviour and kitchen hygiene between those with a diagnosis of Campylobacter or Salmonella and a control sample of the public.
Multivariate Kernel Forecasting of Variance Covariance Matrices
We present a non-parametric, kernel-based method for forecasting the variance-covariance matrix of a portfolio of stocks. Our method utilises a selection of information about the historical values of the covariance matrix and macroeconomic data in order to choose a weighting structure. This allows us to make positive semi-definite forecasts of the variance-covariance matrices, regardless of portfolio size, which we compare to more established parametric techniques, such as the DCC, and to non-parametric approaches that rely exclusively on time structures when determining forecasts of the variance-covariance matrix. We hope to show that it is possible to use a wide range of information to produce accurate forecasts of covariance matrices in a straightforward manner.
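A minimal sketch of the mechanics on simulated data (assumed mechanics for illustration, not the authors' exact method): forecast the next covariance matrix as a kernel-weighted average of past realised covariance matrices, weighting each day by how similar its conditioning variable is to today's. Because the forecast is a convex combination of positive semi-definite matrices, it is positive semi-definite by construction, regardless of portfolio size.

```python
import numpy as np

rng = np.random.default_rng(2)
T, p = 200, 10                      # 200 past days, 10 assets

# Simulated history of daily realised covariance matrices (each PSD).
past_covs = np.empty((T, p, p))
for t in range(T):
    r = rng.normal(size=(30, p))    # 30 intraday returns on day t
    past_covs[t] = r.T @ r / 30

z = rng.normal(size=T)              # one conditioning variable per day
z_today, h = 0.3, 0.5               # today's value and a kernel bandwidth

# Gaussian kernel weights on the conditioning variable, normalised to sum to 1.
weights = np.exp(-0.5 * ((z - z_today) / h) ** 2)
weights /= weights.sum()

# Forecast: convex combination of past covariance matrices (p x p).
forecast = np.tensordot(weights, past_covs, axes=1)

eigs = np.linalg.eigvalsh(forecast)
print(eigs.min() > 0)               # True: the forecast is positive (semi-)definite
```

In practice the conditioning vector would include lagged covariance information and macroeconomic variables, as described above, but the positive semi-definiteness argument is unchanged: any nonnegative weighting of PSD matrices stays PSD.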
Wasel bin Shadat
Consistent Joint and Marginal Specification Tests for GARCH Regression Model using Residual Marked Empirical Processes
In this research, the joint and marginal Integrated Conditional Moment (ICM) testing problem is investigated for conditional mean and conditional variance specification within the univariate GARCH (UGARCH) regression framework. Halunga and Orme (2009) provided a unifying parametric (mis)specification testing framework for UGARCH models based on the Conditional Moment (CM) principle, which includes the score-type tests proposed by Engle and Ng (1993) and Lundbergh and Terasvirta (2002). These tests, however, assume a correctly specified conditional mean; moreover, CM tests employ a finite number of moment conditions, which renders the tests inconsistent. Escanciano (2008) puts forward a joint-marginal (consistent) specification testing principle for conditional mean and variance models using a generalized spectral approach. Under the null of correct specification, Escanciano's test statistics have a limit distribution which depends on the DGP and the null model in a complicated way; hence a bootstrap procedure is required to implement the test. Escanciano advocated a fixed design wild bootstrap procedure (FDWB) and considered an AR-ARCH model in his simulations. We employ Escanciano's framework for the GARCH regression model using the various test variables offered in Halunga and Orme (2009), and investigate other bootstrap schemes in addition to the FDWB.
Obbey E Elamin
Nonparametric Conditional Density Analysis of Female Labour Supply Using The UK Labour Force Survey For 2007
This paper empirically compares two methods of estimating the conditional probabilities of a discrete-choice dependent variable measuring females' labour supply choices in the UK in 2007, using data from the UK Labour Force Panel Survey. The comparison is between the Multinomial Logit (MNL) Model, which is built on the Additive Random Utility Theory, and the Nonparametric (NP) Kernel Method with fixed bandwidth, which imposes no functional form on the covariates' contribution to the conditional probabilities, using techniques that, up to the start of the work in this paper, were available only theoretically in the nonparametric estimation framework. The models estimate unordered labour supply choices for females of working age 15-59 as a dependent variable on a set of covariates of mixed variable types representing age, fertility and education, with a quadratic form assumed for age in the MNL model. The work is motivated by the advantages of nonparametric techniques and aims to illustrate the differences between the predictions analytically and descriptively, with particular concern to assess the precision of the estimated probabilities derived from each framework as well as the specification of the model in each framework.
The MNL model provided biased estimates of the effects of the covariates on female labour supply as a consequence of that model's specific functional form. As a result, it under- or over-estimated the probabilities and performed weakly in the out-of-sample predicted probabilities. In contrast, the NP model correctly revealed the effects of the covariates on females' labour supply.
To examine the differences between the two frameworks and the advantages of the kernel method better, the models have been estimated again separately for:
- Subsamples of the females in the working age 16-59 with three different education levels.
- A subsample of all females in the labour force (all economically active females), from which three more education-level subsamples were generated. In all these subsamples the dependent variable is binary, indicating the employment/unemployment choice.
All the models consistently demonstrated the same quality of estimates as for the main model in this research.
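As a sketch of the nonparametric framework being compared, the kernel (Nadaraya-Watson) estimator of a conditional choice probability can be written in a few lines. The data, bandwidth and choice coding below are purely illustrative; the paper uses mixed continuous/discrete covariates and more sophisticated bandwidth choices:

```python
import math

def gaussian_kernel(u):
    """Standard Gaussian kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def np_cond_prob(x_data, y_data, x0, choice, h):
    """Nadaraya-Watson estimate of P(y = choice | x = x0):
    a kernel-weighted relative frequency of the choice near x0.
    No functional form is imposed on how x shifts the probabilities."""
    weights = [gaussian_kernel((x - x0) / h) for x in x_data]
    num = sum(w for w, y in zip(weights, y_data) if y == choice)
    return num / sum(weights)

# Illustrative data: age and a labour-supply choice
# (0 = inactive, 1 = part-time, 2 = full-time).
ages    = [22, 25, 30, 35, 40, 45, 50, 55]
choices = [ 2,  2,  1,  1,  2,  2,  0,  0]
probs = [np_cond_prob(ages, choices, 33, j, h=5.0) for j in (0, 1, 2)]
assert abs(sum(probs) - 1.0) < 1e-12  # estimated probabilities sum to one
```

Because the estimator is a weighted relative frequency with shared weights, the estimated probabilities automatically sum to one across choices, unlike separate unconstrained regressions.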
The allocation of time and welfare within rural households: Evidence from Tanzania
Using data on individual consumption and time use from Tanzania, we analyse the distribution of well-being within rural couples using Browning and Goertz's (2007) collective household model.
Restricting the sample to couples with individual incomes, we fail to reject collective behaviour and recover the structural model. We find that individual income sources have a positive impact on relative consumption and a negative impact on leisure. We also find that inter-generational influences affect the female's position within the household and that polygamy influences the inter-spousal allocation of goods.
Accepting strong distributional assumptions, we show that about 25% of the unexplained variation in the relative expenditure and 0.3% in the relative leisure equation can be attributed to factors that influence the partner's decision-power within the household.
Once we extend the sample using other income measures, we no longer find that the collective restrictions hold.
Risk, Agricultural Assets and Poverty Traps in Rural China
Despite the significant reduction in the poverty headcount ratio over the last three decades, poverty in rural China has become more concentrated and persistent since the late 1990s. The stagnation of income has been a particular problem for many of the poor. To inform policy, this paper aims to gain new insights into the persistence of poverty for some rural households, stressing the role of shocks and risk. We find that rural households' behavioural responses to uninsured shocks and risk can lead them into low-risk, low-return agriculture. In the long run this results in the persistence of their low income. Our results indicate the importance of establishing productive safety nets for rural households. This would stimulate self-reinforcing growth and improve the poor's chances of escaping low-equilibrium traps.
How to Reduce Unpredictability Facing Ethiopian Farmers: The Use of Brokers in the Ethiopian Grain Market
About 84 percent of Ethiopia's population lives in rural areas and the agricultural sector accounts for almost 45 percent of GDP and 85 percent of total exports. However, rain-fed agriculture makes Ethiopia particularly vulnerable to erratic and unreliable rainfall. Recurrent floods and droughts periodically hit the country, while food insecurity is addressed through food aid shipments. Well-functioning agricultural markets can help to mitigate such unpredictability, but in Ethiopia agricultural markets remain thin, seasonal and segmented. This paper investigates the use of brokers by wholesalers to facilitate the movement of crops from moisture-reliable to drought-prone regions of Ethiopia, based on data collected by the International Food Policy Research Institute (IFPRI) in 2007/08. Incentives, infrastructures and institutions, trading-business and transaction-specific characteristics are controlled for, while traders' transaction costs and social capital enter the analysis through proxy variables. Heckman's sample selection approach models the decisions of `whether' and of `how much' to use brokers for buyers and sellers separately. The results suggest that traders use brokers to compensate for the lack of social capital, market infrastructures and institutions. Furthermore, as brokers can better relate deficit to surplus regions of the country, their use has the potential to improve agricultural markets' functioning to the benefit of Ethiopian farmers.
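Heckman's approach handles the fact that `how much' broker use is observed only for traders who decide to use brokers at all: a first-stage probit of the participation decision yields an inverse Mills ratio, which is added as a regressor to the second-stage equation to correct for selection. A minimal sketch of that correction term, with illustrative fitted first-stage index values:

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def inverse_mills_ratio(z):
    """lambda(z) = phi(z) / Phi(z): the selection-correction regressor
    added to the second-stage ('how much') equation for observations
    that selected into using a broker."""
    return norm_pdf(z) / norm_cdf(z)

# Illustrative fitted probit indices for three traders who use brokers.
indices = [-0.5, 0.0, 1.2]
imr = [inverse_mills_ratio(z) for z in indices]
# lambda is decreasing: traders just on the margin of using a broker
# (low index) receive the largest correction.
assert imr[0] > imr[1] > imr[2]
```

Estimating the `whether' and `how much' equations separately for buyers and sellers, as the paper does, simply repeats this two-step logic on each subsample.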
Macroeconomic Instability, Institutional Quality, and Capital Flight: Evidence from Africa
According to economic theory, capital-scarce less developed countries should be able to retain their own domestic capital, since marginal returns should be higher there. Capital flight, the outflow of foreign exchange from poorer countries, seems to defy that logic. This paper empirically determines the causes of this phenomenon. Using panel data from 40 African countries, we show that macroeconomic instability is associated with a higher incidence of capital flight. We also find that the impact of macroeconomic instability on capital flight is conditional on the level of institutional development, so that fragile macroeconomic environments fuel higher flight in countries with weak institutional structures. This finding controls for the endogeneity of both variables of interest and survives a number of robustness checks, including the use of alternative measures of both macroeconomic instability and institutional quality.
Md Shafiul Azam
Household Income Dynamics in Rural Bangladesh - An Asset Based Approach
Adequate understanding of the degree and nature of non-linearity in household wealth dynamics, and of the potential existence of asset thresholds, can reveal some of the micro-level binding constraints undermining poverty-reduction interventions. The relevant questions are: first, do household asset holdings converge unconditionally to a single long-run equilibrium that is high enough for all poor households to escape poverty over time? Or do asset thresholds exist below the poverty line that households cannot overcome without outside intervention? These are fundamental issues in designing effective poverty-reduction policies, as the answers to these questions might warrant entirely different sets of propositions for different groups of households in terms of policy options.
In contrast to the well-developed theoretical frameworks, the empirical literature on identifying household welfare dynamics and poverty thresholds remains small to date. This paper uses panel data from Bangladesh to generate an asset-based poverty classification scheme. Regression results are used to derive an asset index and classify households into various categories of dynamic poverty groups. Asset index dynamics are also explored to test for the existence of multiple equilibria, evidence of poverty traps. In addition, the paper compares various techniques for identifying poverty dynamics by applying a number of existing methods to the same data set, and checks whether other semi- and nonparametric techniques may be more suitable for locating asset poverty equilibria than these existing techniques.
7 - 8 May 2009
Development Economics and Policy
"Targeting the Poor versus Financial Sustainability and External Funding: Evidence of Microfinance Institutions in Ghana"
Abstract: The creeping effect of the financial crisis and economic turmoil on African economies potentially calls into question the sustainability of microfinance institutions, given the significant investment they have received from both development partners and government. This study tests the hypotheses that: (i) by interacting own-mobilised funds with formal institutions, microfinance organisations reach poorer clients; and (ii) concentrating on the achievement of financial sustainability causes an institution to target non-poor clients. Using data from Ghana, we revisit the argument that microfinance can serve poorer clients on a commercial basis, controlling for the effect of the source of funds and the type of institution. Unlike financial self-sufficiency, operational self-sufficiency appeared to predict the targeting of poorer clients, upholding the sceptics' view of a trade-off. By categorising institutions based on their source of funds, this study adds to knowledge on the future of microfinance. Formal institutions dispensing their own funds appeared to target poorer clients. Using instrumental variable estimation, plausible problems of endogeneity arising via measurement error were observed. We instrument financial and operational self-sufficiency with the density of microfinance institutions in a given location and the group-lending mechanism to resolve attenuation bias. These findings point to complementary development strategies and a deliberate harmonisation of microfinance interventions, irrespective of the source of funds.
Md Shafiul Azam
"Vulnerability and Poverty in Bangladesh"
Abstract: This study estimates ex ante poverty and vulnerability of households in Bangladesh using the 2005 Household Income and Expenditure Survey (HIES) data. Our results show that poverty is not the same as vulnerability, as a substantial share of those currently above the poverty line are highly vulnerable to poverty in the future. The study finds that households without education or agricultural households are likely to be the most vulnerable. The geographical diversity of vulnerability is considerable; for example, vulnerability in the coastal Chittagong Division is almost double that of Dhaka and almost four times higher than in Khulna Division. It is suggested that ex ante measures to prevent households from becoming poor and ex post measures to alleviate those already in poverty should be combined in evaluating poverty. In designing policies, one should take note of the diverse nature of poverty and vulnerability. For the chronic poor who lack economic assets, priority should be given to reducing consumption fluctuations and building up assets through a combination of protective and promotional programmes. Access to financial services, for example through microcredit programmes, might help poor households build up assets, as it smooths income and consumption, enables the purchase of inputs and productive assets, and provides protection against crises. On the other hand, the transient poor and highly vulnerable non-poor households are most likely to benefit from a combination of prevention, protection, and promotion, which would give them a more secure base from which to diversify their activity into higher-return, higher-risk activities.
"PRSP-adoption and poverty outcomes: a country-level evaluation"
Abstract: This paper presents the results of a cross-sectional appraisal of the Poverty Reduction Strategy initiative sponsored by the International Financial Institutions (IFIs). It examines whether any aggregate evidence can be found of a beneficial impact on poverty levels, or the proximate drivers of poverty reduction (higher economic growth and more equitable income distribution). It makes use of a specially constructed panel of Poverty Reduction Strategy Paper (PRSP) adopting and non-adopting countries, and applies a series of analytical techniques based on counterfactual comparisons. The results reveal only very weak evidence of a post-adoption improvement in performance, and moreover, this effect disappears as the appraisal methods become more sophisticated. The analysis also makes clear the variety of individual country experiences and thus the limitations of aggregative approaches for examining such complex policy questions.
Development Economics and Policy/ Environment and Resource Economics
"Vulnerability to Poverty in Post-reform Rural China"
Abstract: This paper examines households’ vulnerability, defined as expected low utility, in rural China over the period 1989-2006. We also present a two-dimensional dynamic measure of vulnerability. One of our findings is that per-period vulnerability consistently increases over the sample period, and is predicted to be higher if future vulnerability is also of concern to households. The majority of rural households are vulnerable in each sub-period, indicating that transient hardship might easily become a chronic phenomenon. The bidimensional poverty status, in terms of both vulnerability and conventional poverty measures, of those lying at the bottom of the consumption distribution has deteriorated. Vulnerability in coastal provinces appears as high as in the west, the poorest region. Farmers are particularly subject to lower expected welfare than others. Vulnerability is mainly driven by the intra-province poverty/inequality component, but we also find, via our decomposition, that idiosyncratic risk seems to be playing an increasingly important role. Based on these estimates, we also explore the determinants of households’ vulnerability and its components. Primary education and health insurance tend to have a significantly positive effect in alleviating people’s vulnerability, while diversification of income away from traditional agricultural production has a relatively limited effect. We find inter alia that China's long-standing "development"-oriented anti-poverty policy may not have been entirely successful in reducing the dangers of welfare loss to the chronically poor.
"Food Safety, Risks, and Responsibility"
Abstract: There has been increasing concern in recent years over the human health risks posed by biological, chemical, and physical hazards in the food chain. Consumers have become more aware of food quality and safety, and trust in the food chain has been strained by BSE as well as by high-profile food poisoning outbreaks such as the E. coli outbreak in Lanarkshire that killed 17 people.
The economic benefits of safer food include a reduced sickness burden (NHS costs, lost workdays), but there is also a more general benefit: the overall reduction in exposure to risk, which people may value.
My doctoral research investigates consumers’ perception of food safety, risks and responsibility. The questions it addresses are:
- What is the value to consumers of reductions in the risk of getting a foodborne illness?
- How are these valuations affected by the numerical and graphical representations of these risk reductions within surveys?
- How do people allocate responsibility, between the stages of the food chain, for ensuring their meat is safe? How does this differ between food types?
- How much control do people believe they have over avoiding foodborne illnesses, and how does this perception of control differ across food risks, and between food and non-food risks?
- How worrying or severe do people consider foodborne illnesses to be, and how do these perceptions of worry/fear differ across food risks, and between food and non-food risks?
This presentation introduces the overall methodology used in the research and presents some initial results from Parts 3-5.
These results are from a pilot using MaxDiff (or "best-worst scaling") Conjoint techniques. These are used to analyse consumers’ perceptions of food safety responsibility across the food chain. In addition they are used to investigate food and non-food risks in terms of how worrying they are seen to be and the level of control people believe they have over them.
Microeconomics and Mathematical Economics I
"Strategies of survival in dynamic asset market games"
Abstract: The paper examines a game-theoretic evolutionary model of a financial market with endogenous equilibrium asset prices. Assets pay dividends that are partially consumed and partially reinvested. The traders use general, adaptive strategies (portfolio rules), distributing their wealth between assets depending on the observed history of the game and the exogenous random states of the world. A strategy profile of the investors determines the market dynamics, with asset prices derived from a short-run equilibrium of supply and demand. This random dynamical system generates a path of the unfolding simultaneous-move N-player stochastic game, the outcome of which for every player is characterized by a sequence of (time-dependent) shares of total market wealth. The main goal is to identify strategies allowing an investor to "survive", i.e. to keep a share of total wealth that is positive and bounded away from zero over the whole infinite time horizon, irrespective of the portfolio rules used by the other traders. We construct a strategy, generalizing Kelly's portfolio rule of "betting your beliefs" well known in capital growth theory, which possesses this remarkable property of unconditional survival. The present work brings together recent studies on evolutionary finance (Blume, Easley, Evstigneev, Hens, Schenk-Hoppé) with the classical topic of non-cooperative market games (Shapley, Shubik and others). The talk will give a brief introduction to the field and outline new results obtained by the authors.
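In the simplest i.i.d. benchmark of this literature, the survival rule generalising Kelly's "betting your beliefs" invests in each asset in proportion to its expected relative dividend. A minimal sketch of that portfolio rule, with illustrative states, probabilities and dividends (the strategy constructed in the paper is adaptive and far more general):

```python
def kelly_portfolio(probs, dividends):
    """Invest in each asset i in proportion to its expected relative
    dividend sum_s p_s * d_i(s) / sum_j d_j(s) -- the benchmark
    'betting your beliefs' rule of capital growth theory."""
    n_assets = len(dividends)
    weights = []
    for i in range(n_assets):
        w = sum(p * dividends[i][s] / sum(dividends[j][s] for j in range(n_assets))
                for s, p in enumerate(probs))
        weights.append(w)
    return weights

# Two states of the world, two assets; dividends[i][s] is asset i's payoff
# in state s, and the states occur with probabilities probs.
probs = [0.6, 0.4]
dividends = [[1.0, 0.0],   # asset 0 pays only in state 0
             [0.0, 2.0]]   # asset 1 pays only in state 1
weights = kelly_portfolio(probs, dividends)
assert abs(sum(weights) - 1.0) < 1e-12   # a complete portfolio of wealth shares
```

With these pure-state ("Arrow security") dividends the rule reduces exactly to betting the state probabilities, which is the origin of its name.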
"Bertrand competition with cost uncertainty"
Abstract: We analyse the classical model of Bertrand competition in a homogeneous product market with constant marginal costs and uncertainty regarding rivals' costs. First, we show that there exists a mixed strategy Nash equilibrium under the conventional equal sharing rule. Second, we illustrate the result for the case of linear market demand.
"Platform Competition and the Emergence of Alternate Pricing Policies"
Abstract: Otherwise identical platforms are able to differentiate themselves using price-structure (ratio of buyer-fee to seller-fee). Platforms then compete on price-level (buyer-fee + seller-fee) but are able to sustain supernormal profits.
Microeconomics and Mathematical Economics II
"The role of demand uncertainty in the two-stage Hotelling model"
Abstract: This paper re-examines the Hotelling location-then-price duopoly game, with the firms uncertain of the exact location of demand at the time of choosing locations. A model is proposed that allows changes in the degree of demand uncertainty while preserving the average demand across all states of nature. This adjustment leads to strikingly different comparative statics results from those in the existing literature. The effect of uncertainty is found to be similar to that of price discrimination in the `certainty' model, as it leads to a decrease in product differentiation, a reduction in profit and an improvement in social welfare, with the standard `certainty' results appearing as limiting cases.
"A Good Reason To Give Me a Discount: Measuring Time Preferences and Incentive Compatibility"
Abstract: This paper introduces two new experimental methods for eliciting time discount functions and utility functions at the individual level. Each of these methods gives precise and incentive compatible measurements, achieved in both cases without extending the domain to risky lotteries. The first method, Intertemporal Scoring Rules, elicits any discount function and utility function when the utility function is known to belong to some parametric family. The second method, Timing Rules, elicits discount functions when utility is arbitrary and the discount function is known to belong to some parametric family.
Wasel Bin Shadat
"Testing CCC assumption in MGARCH Framework"
Abstract: In this paper we introduce an asymptotically valid Conditional Moment (CM) test of the Constant Conditional Correlation (CCC) hypothesis in the multivariate GARCH (MGARCH) framework. The test examines the conditional moment restriction that the covariance matrix of the standardized residuals is constant over time. It is constructed using the estimates of the individual GARCH equations only (and the indirect estimates of the correlation parameters derived from these), and is computationally convenient. Some Monte Carlo results are reported on the finite-sample properties of the CM test. We also consider the implicit null of Tse's Lagrange Multiplier (LM) test for comparison with our test.
"The Cholesky-Decomposition-MIDAS model for forecasting the variance-covariance matrix of a stock portfolio."
Abstract: This paper introduces a model which uses realised variance-covariance data to model and forecast the variance-covariance matrix of a stock portfolio. The model makes use of the properties of the Cholesky decomposition of positive definite matrices and Mixed Data Sampling (MIDAS) techniques to provide a flexible framework for obtaining a forecast which is symmetric and positive definite, thus qualifying as a valid covariance matrix forecast. The model and its estimation technique are introduced before simulation evidence is provided to show that this method produces forecasts which are more statistically accurate than those obtained by the widely used DCC model which employs only returns data.
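The property the model exploits is that any product L L^T of a lower-triangular matrix with its transpose is symmetric and positive (semi-)definite, so forecasting the Cholesky elements and reassembling always yields a valid covariance matrix. A minimal sketch, where the forecast values of the Cholesky elements are purely illustrative (in the model they would come from MIDAS regressions on realised data):

```python
def reconstruct_covariance(L):
    """Rebuild S = L L^T from a lower-triangular Cholesky factor L.
    Any such product is symmetric and positive (semi-)definite, so a
    forecast built this way is always a valid covariance matrix,
    however the individual elements of L were forecast."""
    n = len(L)
    return [[sum(L[i][k] * L[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Illustrative forecasts of the Cholesky elements for a 2-asset portfolio.
L = [[0.20, 0.00],
     [0.05, 0.30]]
S = reconstruct_covariance(L)
assert S[0][1] == S[1][0]            # symmetric by construction
assert S[0][0] > 0 and S[1][1] > 0   # strictly positive variances
```

Forecasting in Cholesky space thus sidesteps the constraints that make direct element-by-element forecasting of a covariance matrix invalid.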
"Kernel Smoothed Empirical Likelihood and Structural Stability"
Abstract: Simple structural stability tests for a known breakpoint for IID and weakly dependent data are investigated after the review of kernel smoothed Empirical Likelihood (EL), which enables EL to accommodate weak dependence in the data (eg Smith 2004). For structural stability testing, the null hypothesis of a stable structure is decomposed according to the over-identification of moment conditions, and due to this decomposition, the structural stability tests reveal some extra information on the source of instability if a break is detected. This decomposition was implemented by Hall and Sen (1999) for GMM. Asymptotic results of these test statistics based on (kernel smoothed) EL, under the null hypothesis and local alternatives, agree with the findings of Hall and Sen (1999) using GMM.
"Social Stigma in Models of Corruption"
Abstract: We present an equilibrium theory of social stigma and bureaucratic corruption under alternative corruption regimes. By social stigma is meant an individual’s non-pecuniary cost of engaging in illicit activities, a cost that depends on the number of other individuals who engage in such activity. By corruption regime is meant the extent to which public officials collude or do not collude with private citizens in their illegal profiteering. We compare and contrast the influence of social stigma on individual decision making under different corruption regimes. In each case we compute the equilibrium levels of stigma and corruption, and establish conditions under which the equilibrium is unique or multiple equilibria exist.
Yuan Yuan Wang
"Growth and Development Under Alternative Corruption Regimes"
Abstract: Empirical observation suggests that not all countries of the world have suffered as a result of widespread corruption. Whilst many countries have undoubtedly suffered considerably, others appear to have coped well – in some cases, very well – with the problem. The analysis that follows seeks to provide an explanation for this puzzle. It does so by differentiating alternative types of corruption regime according to the way that corruption is practised. Specifically, we distinguish between organised and disorganised, collusive and non-collusive corruption. This gives four possible scenarios, the implications of which are compared and contrasted to provide a ranking of regimes in terms of their impact on growth. We find that the least (most) damaging regime is one in which corruption is both organised and collusive (disorganised and non-collusive), as broadly characterises the situation in China and its fast-growing neighbours (many African countries).
"Dynamic Effects of Migrants’ Remittances on Inequality and Income Distribution"
Abstract: This paper presents an analysis of the effects of overseas remittances on the evolution of income inequality and wealth distribution. The analysis is based on an overlapping generations model in which inequalities are explained by a combination of capital market imperfections and fixed costs of investment. Together, these features give rise to credit rationing such that some members of the population are denied opportunities that would make them better off. Within this framework, we study the implications of remittances associated with child migration. We consider two alternative scenarios which differ according to who receives remittances – parents or siblings. Our results show that both types of intra-family transfer reduce inequality, but that the latter is more potent in doing this by relaxing borrowing constraints.
"Human Capital and Economic Growth: The Role of Corruption"
Abstract: This study attempts to provide an additional explanation for the weak effect of human capital on economic growth highlighted in the literature. The analysis considers the important role of corruption as an additional channel for the heterogeneous effect of human capital on economic growth. The study suggests that corruption may be a channel through which human capital loses its influence on economic growth.
"Credible Inflation Targets and the Inflation-unemployment Variability Trade-off"
Abstract: This paper uses a simple New Keynesian Model to analyse the effects of credible inflation targets on the inflation-unemployment variability trade-off. We argue that adopting an explicit inflation targeting framework provides clarity to the inflation stabilisation objective of the central bank, allowing the public to attribute higher credibility to the conduct of monetary policy, thus improving the efficiency frontier. On the other hand, increasing the policy weight on achieving an inflation target with less clear monetary policy objectives merely moves an economy along a frontier. Empirically, several key explicit inflation targeters show reduced inflation variability as well as unemployment variability. In contrast, non-inflation targeting economies that have seen reduced inflation variability have also experienced increased unemployment variability. These suggest that in terms of the inflation unemployment variability trade-off, explicit inflation targeting could result in a superior outcome, confirming the findings of our model.
"Analysing Collective Farm Behaviour in Uganda"
Abstract: Assuming Pareto efficiency of intra-household decision outcomes, Chiappori's (1992) collective model has become established as the 'workhorse' for the analysis of household behaviour. Though many studies confirm efficiency in developed countries, there is evidence of inefficiencies in the intra-household allocation of productive resources in the rural context of developing countries that seems to contradict the model (Udry, 1996; Jones, 1983): females and males appear to struggle over productive resources such that Pareto improvements could be possible. Based on Browning and Gortz's (2007) collective model, we develop a household model with farm production. The model distinguishes cash and food crops, which are often said to be controlled by males and females respectively, and labour inputs by gender. The presentation highlights the model's implications under the assumption that households face no limitations in accessing input or output markets. Under these assumptions, the model imposes no restriction on productive behaviour except that individuals allocate their resources so as to maximise on-farm profits, as pointed out by Udry and Bardhan (1999) and Donni (2008). In particular, with regard to on-farm labour supply, females and males allocate their labour so as to equate the marginal revenue product to the marginal cost, that is, the wage rate.
In line with the agricultural literature on shadow pricing, we test whether this condition holds using the latest Uganda National Household Survey (UNHS) and applying OLS and a (farm-level) fixed-effects estimator for the different crops grown. The results highlight the need to consider the implications of limitations in labour markets within the collective model for the analysis of efficiency in the allocation of on-farm labour supply.
"Financial Development and Economic Growth: The Role of R&D"
Abstract: This study explores a relatively new channel, R&D activity, through which financial sector development may affect economic growth, using annual cross-sectional data on 86 countries for the period 1997-2007. It is an attempt to capture precisely the conditions under which finance affects economic growth. Employing the LTS method, we test whether the effect of financial development on economic growth is positive and whether it varies with the level of R&D. Our results show that the effect of financial sector development on economic growth decreases as we move from lower to higher levels of R&D expenditure. This suggests that in economies whose governments spend more on R&D activities, finance performs poorly in promoting economic growth.
"Inflation Persistence and Monetary Policy Regimes: Long-run Evidence from UK and US"
Abstract: This paper tests the hypothesis that inflation persistence varies with monetary policy regimes, using data from 1850 onwards for the UK and US. The paper employs recent econometric methods for structural breaks and the estimation of unobserved components-stochastic volatility (UC-SV) models. The results suggest that there is considerable time variation in inflation persistence, which can partly be explained by changes in monetary regimes. The results also imply that serial correlation of inflation should not be treated as an intrinsic feature of the economy but rather as a historical outcome that is partly contingent upon the monetary policy framework.
6 - 7 May 2008
Econometrics and Applied Economics
"Bootstrap Based Unit-Root Tests in Nonlinear STAR Models"
Abstract: While some unit root testing procedures available in the literature allow for the presence of non-linearities under the alternative hypothesis of stationarity, they uniformly assume a linear DGP under the null hypothesis of a unit root. The only exception to this is a testing procedure developed by Caner and Hansen (2001) which, however, imposes a particularly strong restriction on the form of nonlinearity. This paper develops a new unit root test that allows for a much wider class of nonlinear processes even under the null hypothesis with the specification considered by Caner and Hansen being a special case. A model-based bootstrap procedure delivers a test that is correctly sized for near-linear and smooth transition type models. Moreover when applied to near-linear DGPs the test’s power is comparable to that of the standard ADF test.
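The general logic of a model-based bootstrap unit-root test can be sketched as follows: compute residuals with the null of a unit root imposed, resample them to build random-walk pseudo-series, and compare the Dickey-Fuller t-statistic of the data with its bootstrap distribution. The toy version below imposes a driftless linear random walk with i.i.d. resampling; the paper's procedure instead fits and resamples from the estimated nonlinear STAR model:

```python
import random

def df_tstat(y):
    """Dickey-Fuller t-statistic from the regression dy_t = rho * y_{t-1} + e_t."""
    num = sum(y[t - 1] * (y[t] - y[t - 1]) for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    rho = num / den
    resid = [(y[t] - y[t - 1]) - rho * y[t - 1] for t in range(1, len(y))]
    s2 = sum(e * e for e in resid) / (len(resid) - 1)
    return rho / (s2 / den) ** 0.5

def bootstrap_unit_root_test(y, n_boot=499, seed=42):
    """Bootstrap p-value for H0: unit root. Residuals are obtained by
    imposing the null (first differences), then resampled with
    replacement to rebuild random-walk pseudo-series."""
    rng = random.Random(seed)
    stat = df_tstat(y)
    diffs = [y[t] - y[t - 1] for t in range(1, len(y))]  # residuals under H0
    count = 0
    for _ in range(n_boot):
        level, yb = 0.0, [0.0]
        for _ in diffs:
            level += rng.choice(diffs)   # i.i.d. resample of the residuals
            yb.append(level)
        if df_tstat(yb) <= stat:         # left-tailed test
            count += 1
    return (count + 1) / (n_boot + 1)

rng = random.Random(0)
rw = [0.0]
for _ in range(200):                     # simulate a series with a true unit root
    rw.append(rw[-1] + rng.gauss(0, 1))
p = bootstrap_unit_root_test(rw)
assert 0.0 < p <= 1.0
```

Replacing the random-walk rebuild with simulation from a fitted STAR process is what lets the bootstrap null distribution reflect nonlinearity under the null.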
"Empirical Likelihood for Time Series and Structural Stability Testing"
Abstract: This presentation reviews the consistency, efficiency and flexibility of empirical likelihood (Owen, 1988) as an estimation and inference method. It discusses the application of empirical likelihood to weakly dependent data and extends the method to simple structural stability tests. More precisely, parametric model fitting, blocking and kernel smoothing are the techniques employed to enable empirical likelihood to account for weak dependence in the data, while the structural stability testing rests on a simple linear model with one break. In addition, Monte Carlo experiments are designed to illustrate the theoretical results.
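Blocking is one of the techniques mentioned for handling weak dependence: empirical likelihood is applied to block means rather than raw observations, so that dependence within blocks is absorbed. A minimal sketch for inference on a mean, on simulated AR(1) data, is given below; the block length and data-generating process are illustrative assumptions.

```python
import numpy as np

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean of x at mu (scalar case).
    Solves sum_i z_i / (1 + lam * z_i) = 0, z_i = x_i - mu, by bisection."""
    z = x - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf  # mu lies outside the convex hull of the data
    lo = -1.0 / z.max() + 1e-9   # bounds that keep all weights positive
    hi = -1.0 / z.min() - 1e-9
    for _ in range(200):
        lam = 0.5 * (lo + hi)
        if np.sum(z / (1.0 + lam * z)) > 0:
            lo = lam
        else:
            hi = lam
    return 2.0 * np.sum(np.log1p(lam * z))

def blocked_el_log_ratio(x, mu, block_len):
    """Blockwise EL for weakly dependent data: apply EL to block means."""
    n_blocks = len(x) // block_len
    blocks = x[: n_blocks * block_len].reshape(n_blocks, block_len).mean(axis=1)
    return el_log_ratio(blocks, mu)

rng = np.random.default_rng(3)
T, phi = 1000, 0.6
x = np.empty(T)
x[0] = 2.0
for t in range(1, T):                    # AR(1) data with mean 2
    x[t] = 2.0 + phi * (x[t - 1] - 2.0) + rng.normal()

stat_true = blocked_el_log_ratio(x, 2.0, block_len=20)
print(stat_true)  # compare with the chi-squared(1) critical value 3.84
```

The ratio statistic is small at the true mean and blows up away from it, which is what makes confidence regions and tests possible without specifying a parametric likelihood.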
"An investigation of the benefits of using disaggregated stock market data to forecast portfolio volatility"
Abstract: The issue of volatility forecasting is of interest to investors, and its modelling has been of significant interest to econometricians. This research investigates whether volatility forecasts can be improved by using disaggregated data, in the form of data on individual stocks, in the forecasting process, as has been possible in other areas. This presentation introduces the methodology used to investigate this question and presents the results of an empirical investigation which identifies the value of using disaggregated data for forecasting the volatility of a stock portfolio.
Macroeconomics, Growth and Development I
Yuan Yuan Wang
"Uncertainty, Entrepreneurship and the Organisation of Corruption"
Abstract: Empirical evidence shows that not all countries with high levels of corruption have suffered poor economic performance. Bad quality governance has clearly been much less damaging (if at all) in some economies than in others - most notably perhaps, China. Why this is so is a question that has largely been ignored, and the intention of this paper is to provide an answer. The analysis is based on a general equilibrium model of occupational choice in which agents may work in either some basic (traditional or subsistence) activity or a more speculative (advanced or entrepreneurial) venture. The latter is risky and requires external funding from financial intermediaries, together with licenses from public officials (bureaucrats) who are able to exploit their monopoly power by demanding bribes in exchange for these (otherwise free) permits. This imperfection in governance is combined with an imperfection in capital markets through costly state verification. The analysis shows that the effects of corruption depend on the extent to which bureaucrats coordinate their rent-seeking behaviour. Non-coordinated (or disorganised) rent-seeking creates uncertainty for both agents and banks about the total bribe payment that will be demanded and, with this, uncertainty about the ability to repay loans. Coordinated (or organised) rent-seeking implies a total bribe payment that is certain and that avoids bankruptcy, being chosen deliberately by bureaucrats in order to maximise their joint interests. The implication is that, under appropriate conditions, entrepreneurial activity is higher in the case of the latter than in the case of the former. China and some of its fast-growing neighbours are prime examples of countries that have well-organised corruption networks.
"Monetary Policy and Real Wage Cyclicality"
Abstract: Several studies have highlighted the potential biases that may arise in measuring real wage cyclicality. This paper points to the potential biases that may arise when monetary policy is ignored or assumed exogenous. Using a simple model, we show that systematic monetary policy affects the real wage-output correlation. Under demand shocks, real wages are typically countercyclical, but this countercyclicality is significantly reduced when monetary policy is endogenous to price and output gaps. Supply shocks typically result in procyclical real wages, but more active monetary policy with a relatively higher policy weight on price stabilisation is shown to raise the procyclicality of real wages, whereas a higher policy weight on output stability is shown to reduce it.
Keywords: Monetary Policy Rules, Real Wage Cyclicality, Nominal and Real Shocks.
"Natural Rate Shocks, Inflation Persistence and Exchange Rate Regimes"
Abstract: This paper presents an open economy extension of the Barro-Gordon model, incorporating a degree of exchange rate flexibility and shocks to the natural rate of employment, to analyse the implications of different exchange rate regimes for inflation persistence. Higher exchange rate flexibility results in more persistence in the inflation process. Inflation persistence exhibits a non-linear response to varying degrees of autocorrelation in the shocks. In the absence of asymmetric information, increased volatility in transitory shocks would not increase persistence. By contrast, asymmetric information makes inflation persistence responsive to past transitory shocks, whereby an increase in their volatility would cause more persistence. Overall, the presence of asymmetric information brings more persistence into the inflation process, because inflation expectations are `contaminated' with the effects of past transitory shocks and the policymaker partially accommodates current inflation expectations in setting the optimal inflation rate.
Keywords: Inflation persistence; Exchange rate regimes; Asymmetric information; The Lucas critique
JEL Classification: E31; E42; E52; F41
Macroeconomics, Growth and Development II
"On the Macroeconomics of Microfinance"
Abstract: Microfinance (small scale lending to the poor) is integrated into a dynamic macroeconomic model of income distribution. Two-period-lived agents, belonging to overlapping generations of dynastic families, choose between three alternative occupations: subsistence production, small-scale project investment and large-scale project investment. Subsistence activity is costless and riskless, whilst project investment is the opposite and may require external funding from financial institutions with imperfect powers of contract enforcement. In the absence of microfinance, only large-scale, collateralised loans are available through the traditional banking sector. Under such circumstances, initial inequalities persist as only the wealthy are able to acquire these loans, and as small-scale enterprise is either not feasible or not profitable. With the introduction of microfinance, this venture is made both possible and attractive through the provision of non-collateralised loans and other features of microlending arrangements. Poverty and inequality are reduced as a result.
"Aid Allocation, Growth and Welfare with Productive Public Goods"
Abstract: This paper develops an open-economy intertemporal growth model with endogenous relative prices and an imperfect world capital market. The government provides two categories of public services, infrastructure and health, which are both productive. Externalities associated with infrastructure in the production of health services are also accounted for. The model is calibrated for a ‘typical’ low-income country and used to examine the growth and welfare effects of both permanent and temporary, tied and untied, increases in aid. The analysis highlights the existence of dynamic trade-offs between the short- and the longer-run effects of aid on the real exchange rate.
Development Economics and Policy
"Determinants of Poverty and Vulnerability in Rural Vietnam"
Abstract: This paper analyses the incidence and determinants of poverty and vulnerability (defined as the probability of falling into poverty in the future) in rural Vietnam between 2002 and 2004. It shows that vulnerability translates well into poverty. Although both indicators measure deprivation in households and many of the determinants are common to both, the strengths of the associations vary considerably, and some factors are more particularly associated with one rather than the other of these indicators. This may allow a more accurate identification of the poor and their vulnerability and thus enable a more durable poverty reduction policy for rural Vietnam. For example, while agricultural land holding, receipt of higher education or access to roads is likely to reduce poverty more than vulnerability, sylvicultural land holding, ethnicity and access to electricity have a greater impact on vulnerability. An implication arising from our quantile regression analysis of estimated vulnerability is that a detailed knowledge of the distribution of vulnerability across households is desirable, because we find that the coefficient estimates change over time across the percentiles. For instance, one of our findings is that in both periods the biggest impact on household vulnerability comes from different forms of land holding. Interestingly, the relative sizes of the impacts of these forms of land holding on different percentiles’ vulnerability change substantially between the periods.
"Tracking Economic Policy and Poverty Outcomes in Mongolia"
Abstract: Mongolia’s transition strategy is unique in Asia and has been accompanied by very high levels of poverty. For these reasons, policy choices have been the focus of substantial national and international attention. This paper examines the relationship between these policy choices and the evidence on which they were based. The salient features of Mongolia, its transition and the evolution of its policy stance are presented first. This is followed by an examination of the poverty surveys undertaken in 1995, 1998 and 2002, and their degree of comparability. The paper then maps poverty outcomes back to policy choices using standard analytical techniques. These include a growth-inequality decomposition, the compilation of pro-poor growth statistics and the derivation of growth incidence curves. The results of these analyses demonstrate severe weaknesses in the evidential record and in the degree of transparency with which it has been presented by the agencies responsible for undertaking the poverty surveys, principally the Mongolian Statistical Office and the World Bank. Nevertheless, we conclude that there has been poverty reduction in Mongolia, although this is based on a ‘trickle-down’ effect and the reduction would have been greater had more attention been paid to managing inequality.
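A growth incidence curve, one of the standard techniques the paper applies, plots the growth rate of income at each percentile between two surveys; a rising curve indicates growth concentrated at the top. The sketch below uses synthetic lognormal incomes, not the Mongolian survey data, and its parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def growth_incidence_curve(y0, y1, percentiles):
    """Growth rate of income at each percentile between two surveys."""
    return np.percentile(y1, percentiles) / np.percentile(y0, percentiles) - 1.0

# Synthetic lognormal incomes with growth concentrated at the top,
# a 'trickle-down' pattern.
y0 = np.exp(rng.normal(0.0, 0.5, size=5000))
y1 = np.exp(rng.normal(0.1, 0.6, size=5000))

p = np.arange(5, 100, 5)
gic = growth_incidence_curve(y0, y1, p)
print(gic[0], gic[-1])   # lower percentiles grow less than upper ones
```

When the curve lies above zero everywhere growth is pro-poor in the weak sense; when it slopes upward, as here, mean growth overstates the gains of the poor, which is the pattern the paper associates with trickle-down poverty reduction.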
Microeconomics and Mathematical Economics
"When is Social Welfare Increased by Permitting Multi-Card Payment Platforms to Impose a Tie-in on Merchants?"
Abstract: Payment card associations, such as Visa, charge both cardholders and merchants but tend to adopt a fee structure that favours cardholders over merchants to maximize the volume of transactions. Visa had a dominant position in the credit card market but faced competition in the debit card market. Rival debit cards used low merchant fees to steer merchants away from Visa’s debit card. Visa responded by demanding that merchants who wished to join its credit card platform also join its debit card platform; but an antitrust investigation forced Visa to abandon this tie-in. However, Rochet and Tirole claimed that this was not a socially desirable outcome.
Rochet and Tirole argued that Visa’s preferred fee structure was close to the social optimum and that competition in the debit card market distorted the price structure (without reducing the price level). Moreover, they claimed that the tie-in afforded flexibility to rebalance the fee structure and increase welfare.
However, Rochet and Tirole assumed that total demand for final goods was inelastic and claimed that ‘introducing price elasticities would not fundamentally change the analysis.’ My paper presents evidence to the contrary and claims that if demand is sufficiently elastic then tying does not benefit end-users.
Keywords: payment platforms, tying, price structure.
JEL numbers: L5, L82, L86, L96
"Expected Utility with Consistent Certainty and Impossibility Effects."
Abstract: This paper presents several methods for measuring Certainty and Impossibility Effects. It is shown that the Non-Extreme Outcome Expected Utility (NEO-EU) model has the property that, for each effect, all of the proposed measures are equivalent. Further, for establishing a ‘more affected by certainty’ or ‘impossibility’ relation, any NEO-EU maximiser has a unique index for each effect in terms of the NEO-EU model's two parameters. The NEO-EU model can then be thought of as ‘Expected Utility with Consistent Certainty and Impossibility Effects.’ An axiomatic foundation, the first for risky lotteries over an arbitrary outcome set, is given. In particular, the axiomatisation shows that NEO-EU is the only model that can be thought of in this way.
Journal of Economic Literature Classification D8