tinbergen institute

magazine 12

Fall 2005

On the road to KNAW re-accreditation: Commendation and critique of TI by the International Peer Review Committee
An interview with Dale Jorgenson
Changing how economists think about risk attitudes
Common factors in credit risk
Letters from Alumni

Tinbergen Magazine is published by




Institute for economic research of Erasmus Universiteit Rotterdam, Universiteit van Amsterdam and Vrije Universiteit Amsterdam.

In this issue

Highlighting ongoing research at Tinbergen Institute for policymakers and scientists.

Up close
On the road to KNAW re-accreditation: Commendation and critique of TI by the International Peer Review Committee
An interview with Dale Jorgenson, Harvard University, by Beata Bierut

In depth
Changing how economists think about risk attitudes
Han Bleichrodt and Peter P. Wakker

Common factors in credit risk
André Lucas

Letters from Alumni
Marcel Canoy, European Commission

In short
Papers in journals
Discussion papers

Up close

On the road to KNAW re-accreditation: Commendation and critique of TI by the International Peer Review Committee

An interview with Dale Jorgenson, Harvard University
By Beata Bierut

Soon, Tinbergen Institute will apply to the Royal Netherlands Academy of Arts and Sciences (KNAW) for re-accreditation as an official Research School for the period 2006-2011. Part of the procedure is an evaluation of the quality of the Institute’s scientific research and graduate programme by an International Peer Review Committee.

This is actually the second time that TI has been evaluated by a committee of distinguished academics. In 1999, the committee chaired by Prof. Edmond Malinvaud prepared the report “The Tinbergen Institute, a Successful Cooperative Graduate School and Research Centre”. This year’s report is entitled “Tinbergen Institute: Building a Top Research School in Economics”. I suppose the difference between ‘successful’ and ‘top’ reflects the Committee’s assessment of the progress made.

As we emphasise in the report, important progress has been made in both key dimensions of graduate training and research. We were very impressed.

The Committee, consisting of Professors Dale W. Jorgenson, David F. Hendry, Arie Kapteyn, Robert C. Merton, and Torsten Persson, visited the Institute in February 2005. This issue of TI Magazine features an interview with the Chair of the Committee, Dale Jorgenson from Harvard University.


Let’s then move along to specific areas, beginning with the MPhil programme. The report says “... The MPhil degree [now] satisfies international standards for the initial two years of doctoral training in economics...” What were the major improvements that you noted?

Substantial improvement has been made in the core training in microeconomics, macroeconomics and the combination of econometrics and statistics. In each of those three areas, the Malinvaud Committee (on which I served) concluded that the earlier educational programme fell short of international standards. Since then, the first-year core has been completely restructured, and is now highly demanding. It covers a full range of topics with lots of homework, weekly classes to review homework, and examinations at the end of each of the three segments. The second year of the MPhil is devoted to more specialised training, including an excellent selection of courses on specific areas of economics taught by top people. Finally, the MPhil thesis in the second year is taken much more seriously at TI than any corresponding requirement at North American or British graduate schools, even at Harvard. The thesis requirement, in my opinion, greatly smoothes the transition between the coursework of the first two years and the PhD research during the remaining three years of the programme.

But there’s still some work to be done...

Let me focus on the most important issues. The first is to have a student body consistently at the level of 30 people entering each year. The second is that the system used by TI is quite different from the traditional Dutch 4:4 system, with four years leading to a Doctorandus degree and four years devoted to PhD research. The TI 3:2:3 system is fairly close to the British system: three years leading to a Bachelor’s degree, two years of the MPhil programme, and then three years for PhD research. The Committee maintains that the old system should be phased out as soon as possible. We think that new employment contracts for graduate students should be limited to people who have MPhil training. Finally, we recommend that students become accustomed to making regular presentations at informal seminars, where the typical audience would be other graduate students and two or more faculty members in a particular area...

The need for “regular presentations” is repeatedly emphasised throughout the report. Why is it so important?

It is important for students to interact with other people and to get regular feedback as they proceed with their research. The traditional apprenticeship system involves one-on-one interaction between the student and the faculty member, but this is not sufficient. There should be a public dimension to the research supervision process, and students themselves should play an important role through interacting with their peers in producing good research. It is vital to have several faculty members with an interest in a particular area sharing responsibility for all of the students working in that area. That is the way things work in all of the major North American and British institutions, and this is the best model for TI.

Of course, life does not end with the PhD, and students need to find a job after receiving their degree. The report recommended increasing the distance between the hiring faculties and the placement of PhDs.

There should be more internationalisation in PhD placement. Tinbergen students, once they finish their dissertations with solid MPhil training, are going to be able to compete with students trained in the UK and the US. They should therefore think about competing in an international job market. As far as the participating institutions are concerned, they should begin to hire what we call ‘high-impact researchers’: top people in the international market who have potentially high impact in the profession. You may want to consider how to recruit these people and how to enable them to develop their research capabilities. Actually, that is a task for the participating institutions. We do not see this as a responsibility of TI.


Should Tinbergen Institute play some role in attracting top researchers to the Netherlands, even if only for visiting programmes?

A visiting researchers’ programme would be an excellent idea. It would be desirable for Tinbergen Institute to raise private funds for various kinds of outreach activities, including visitors (perhaps on a short-term basis), as well as visits by TI faculty and students to other institutions. Since the participating faculties are already attracting a lot of research support for projects in areas of interest to faculty members, the Institute could try to attract support for what we characterise as outreach. We do not think that the participating universities should be devoting their scarce institutional resources to this; their resources ought to be concentrated on the teaching programme. This is also emphasised in the report.

Let’s now turn to research activities. The report applauded a rapid increase in TI’s research output, both in terms of quantity and quality. Still, a number of suggestions for improvement were put forward...

The main thing that we thought was missing is that there is no system for bringing to the fore people who are really going to have a major impact on the economics profession, people who are going to be international high-flyers. The Netherlands should be, as it has been historically, a great centre of economics. You have to think about how to recruit and develop people of the very highest calibre, like Tinbergen and Theil. This should be discussed and resolved between now and the next review five years from now.

[Photo caption: The Peer Review Committee together with some TI students.]

On the organisational side, the report discussed more active involvement of TI alumni and the issues of funding.

I think that these two things actually go together. TI alumni are represented throughout Dutch business, government and academia. They should feel a sense of affiliation with TI and understand the way in which the institute is evolving. It is again a question of outreach, and I think the alumni should be the first targets of the outreach activities. Beyond that, I think that it is important to reach out to all of the Dutch business organisations in which economics plays a role. Every Dutch multinational and every major financial institution has some kind of economic staff. It is vital to reach out to those people because they could help to guide TI’s future development and to generate financial support. As far as fundraising is concerned, the report recommends strongly that the Board of Tinbergen Institute play a key role.

That covers the report, as far as I am concerned. Unless you feel that we missed something important...

Let’s discuss the business end of the report. We think that the participating institutions are getting extremely good value for money in Tinbergen Institute. The report spells out what we feel are the positive contributions of TI to the participating institutions, for example, having centralised Ph.D.-level training of high quality, which I think none of the institutions could have produced by itself. Most importantly, as a research network, TI sets high standards for research that have led to upgrading new recruits into the participating faculties. Participating institutions should be happy to renew their contracts with TI. We eagerly expect that KNAW will renew its accreditation. We feel that TI has made a great deal of progress in both teaching and research, and this should be more than sufficient to justify re-accreditation. Those are the two things that I would like to emphasise, in conclusion.

I suggest that we finish the interview with a couple of questions regarding your own work. Could you please provide us with some brief insights into the book that has been published this year, “Information Technology and the American Growth Resurgence”?

This book has grown out of research on productivity that I have been conducting since I came to Harvard. What attracted my attention again (if you look at my research record, you will see that I like to move on and not get stuck in a particular niche) is the special role of information technology. The book spells out how information technology has changed the prospects for growth in the United States. On my website I have a version that extends the story to the other G7 economies. The theory of economic growth, as it is presented in textbooks, still owes a lot to the idea that the main source of economic growth is productivity change. I find that that was never true historically, and it is even less true today. The most important source of growth is investment. Capital embodies new technologies; that is the key idea. This shows up in the new methods we have developed for growth accounting, especially adapted to information technology.

Last question, again related to your recent work: What is ‘A smarter type of tax’?

Tax reform is currently on the agenda in the United States. The Tax Commission appointed by the President in January has decided to reform our system of income taxes, rather than to adopt a European-style value-added tax. I think that this is the right direction, but that we need to preserve progressivity and to improve the efficiency of capital allocation. The system that I have developed for doing both is called ‘efficient taxation of income’. It involves two basic principles. The first is that capital income would be taxed at a rate of about 30%, and labour income at a rate of about 10%. This would preserve progressivity. The second principle is that all capital income would be taxed at the same effective rate. There is huge potential for gains from the tax reform, mainly due to increased efficiency in capital allocation. I estimate that a tax reform based on the combination of these two principles would produce an increment to US national wealth of about 20 percent.

I am finished with my questions. Thank you very much for your thoughtful comments.

Let me say on behalf of the Evaluation Committee that I am pleased that our work will reach the readership of TI Magazine. Having put a great deal of work into this report, we are proud that it will contribute to the discussion about the future of Tinbergen Institute.

Selected Bibliography

Accounting for growth in the information age, in: P. Aghion and S. Durlauf, eds., Handbook of Economic Growth, Amsterdam, North-Holland, 2005.

Blueprint for expanded and integrated U.S. national accounts: Review, assessment, and next steps (with J.S. Landefeld), in: A New Architecture for the U.S. National Accounts, 2005.

Efficient taxation of income (with K.-Y. Yun), in: T.J. Kehoe, T.N. Srinivasan, and J. Whalley, eds., Frontiers in Applied General Equilibrium Modeling, Cambridge, Cambridge University Press, 2005.

Information technology and the Japanese economy (with K. Motohashi), Journal of the Japanese and International Economies, 19(4), December 2005.

Information technology and the world economy (with K. Vu), Scandinavian Journal of Economics, 107(4), December 2005.

Will the U.S. productivity resurgence continue? (with M.S. Ho and K.J. Stiroh), Current Issues in Economics and Finance, 10(13): 1-7, December 2004.

Information technology and the G7 economies, World Economics, 4(4): 139-169, October-December 2003.

Information technology and the U.S. economy, American Economic Review, 91(1): 1-32, March 2001.

Raising the speed limit: U.S. economic growth in the information age (with K.J. Stiroh), Chapter 3 in: Economic Growth in the Information Age, 2001.

U.S. economic growth at the industry level (with K.J. Stiroh), American Economic Review, 90(2): 161-67, May 2000.

A New Architecture for the U.S. National Accounts (with J.S. Landefeld and W.D. Nordhaus, eds.), Chicago, University of Chicago Press, 2005.

Information Technology and the American Growth Resurgence (with M.S. Ho and K.J. Stiroh), Cambridge, MIT Press, 2005.

Economic Growth in the Information Age, Cambridge, MIT Press, 2001.

Lifting the Burden: Tax Reform, the Cost of Capital, and U.S. Economic Growth (with K.-Y. Yun), Cambridge, MIT Press, 2001.









Changing how economists think about risk attitudes

Han Bleichrodt and Peter P. Wakker

Han Bleichrodt, professor of health economics at Erasmus Universiteit Rotterdam, investigates the measurement of utility in health and economics. Peter P. Wakker, professor in econometrics at Erasmus Universiteit Rotterdam, investigates risky decisions in economics. This contribution reviews some important changes in current economic thinking about risk attitudes and presents the authors’ research on this topic.

Defining risk aversion: Economists vs. the rest of the academic world

The classical economic theory of decision under risk is expected utility. Consider a “prospect” (0.3:200, 0.7:100), yielding €200 with probability 0.3 and €100 with probability 0.7. The prospect is evaluated through a probability-weighted average utility 0.3·U(200) + 0.7·U(100), with U denoting the utility function of money. U is a subjective factor, depending on the decision maker and describing his risk attitude. A classical result is that risk aversion (valuing a prospect below its expectation) then holds if and only if the utility function is concave, implying diminishing marginal utility (as in Figure 1). In other words, the prospect is undervalued relative to its expectation because the marginal utility gained through outcomes above the expectation is less than the marginal utility lost because of outcomes below the expectation.
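This calculation is easy to make concrete. The sketch below computes the expected utility of the prospect (0.3:200, 0.7:100) and checks that a concave utility function yields risk aversion, i.e. a certainty equivalent below the expectation. The square-root utility is an illustrative choice of ours, not one taken from the text.

```python
import numpy as np

def expected_utility(prospect, U):
    """Probability-weighted average utility of a list of (p, outcome) pairs."""
    return sum(p * U(x) for p, x in prospect)

U = np.sqrt                                   # concave: diminishing marginal utility
prospect = [(0.3, 200.0), (0.7, 100.0)]

eu = expected_utility(prospect, U)
expectation = sum(p * x for p, x in prospect)  # 130

# Certainty equivalent: the sure amount with the same utility as the prospect.
ce = eu ** 2                                   # inverse of the square root
print(ce, expectation)                         # CE falls below the expectation
```

With this utility, the certainty equivalent comes out around 126.4, below the expectation of 130: the concavity alone produces risk aversion, exactly the classical result described above.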

Since economists are trained to express everything in life in terms of money, the assertion that “risk aversion means concave utility of money” does not set alarm bells ringing for economists. For the rest of the academic world this is different, however. The psychologist Lola Lopes (1987), for instance, wrote, “risk aversion is more than the psychophysics of money” (p. 283). Intuitively, it somehow does not seem natural that risk attitude has to do with how we feel about money; it seems more natural that it has to do with how we feel about probabilities. Since the 1950s, psychologists have therefore used an evaluation w(0.3)·U(200) + w(0.7)·U(100) of a prospect (using the prospect described above). The weighting function w captures sensitivity towards probability, and can be non-linear. For example, the weight w(0.7) of 70% probability mass is usually less than 70% of the weight w(1) = 1 of certainty. The most prominent theory of this kind was Kahneman and Tversky’s (1979) prospect theory.

A long wait pays off

It took thirty years for one of the most important ideas in risk theory to be discovered: in full depth and generality by Schmeidler (1989; first version 1982), and for the special case of risk by Quiggin (1981). The prospect mentioned above should be evaluated through the formula w(0.3)·U(200) + (1 − w(0.3))·U(100), rather than through the formula used by psychologists. We will not explain the subtlety of replacing the weight w(0.7) used by psychologists by the weight 1 − w(0.3) used here. A consequence of this new formula is that the weights of the outcomes always sum to one, and that the worst outcome is treated differently than the best outcome (rank dependence).

Note that rank dependence allows people to be risk averse while having linear or even convex utility, for instance when they heavily overweight the worst outcome. People are also allowed to have concave utility and nevertheless be risk seeking (when they overweight the best outcome). The relationship between risk attitude and utility curvature is thus no longer one-to-one.

Rank dependence was incorporated in the new version of prospect theory (Tversky and Kahneman 1992). This new version added to the ideas of Quiggin and Schmeidler a different treatment of gains than of losses; discussion of the latter topic we leave for another day, and we restrict ourselves here to a discussion of gains. At long last, a theory had been developed that is both intuitively satisfactory (through incorporation of the indispensable probabilistic sensitivity into risk attitudes) and theoretically sound. Thus, only since 1992 do we have a satisfactory theory for risky decisions.

The theoretical soundness of models is usually verified through so-called preference foundations. The properties of the model are then identified in directly observable terms: conditions expressed directly in terms of preferences that show how to verify or falsify the model empirically. This was established for the new version of prospect theory by papers including Wakker and Tversky (1993) (for monetary outcomes) and Bleichrodt and Quiggin (1997) (for health outcomes).

Traditional economics focus

Economists, trained to focus on the conditions that ensure the existence of equilibria and risk aversion, have focused on such conditions for probability weighting. Given the natural w(0) = 0 and w(1) = 1, a low and convex w leads to a low weighting of the best outcomes of a risky prospect and, hence, to a relative under-evaluation of risky prospects when compared to sure outcomes. This under-evaluation enhances risk aversion. Convex probability weighting functions (as in Figure 2), and the corresponding ambiguity aversion for unknown probabilities, have therefore been the focus of most economic studies so far.
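A minimal numerical sketch shows how rank dependence decouples risk aversion from utility curvature. The convex weighting function w(p) = p² and the linear utility below are hypothetical illustrations of ours, not specifications from the text.

```python
# Rank-dependent evaluation of the prospect (0.3: 200, 0.7: 100): the best
# outcome gets weight w(0.3); the worst gets the complement 1 - w(0.3).
def rank_dependent_value(p_best, x_best, x_worst, w, U):
    return w(p_best) * U(x_best) + (1.0 - w(p_best)) * U(x_worst)

w = lambda p: p ** 2          # convex: overweights the worst outcome
U = lambda x: x               # linear utility: no diminishing marginal utility

rdu = rank_dependent_value(0.3, 200.0, 100.0, w, U)
expectation = 0.3 * 200.0 + 0.7 * 100.0   # 130

# The prospect is valued at 109, well below its expectation of 130:
# risk aversion arises purely from probability weighting.
print(rdu, expectation)
```

With w(0.3) = 0.09, the worst outcome receives weight 0.91 instead of 0.7, and the decision maker is risk averse even though utility is perfectly linear, as the text describes.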

Empirical studies and the inverse-S shape

Psychologists, on the other hand, are more interested in empirical facts than in theoretical wishes. They found that the probability weighting function is generally not convex but inversely S-shaped (as in Figure 3). The inverse S-shape reflects the tendency of people to be overly sensitive to probabilities close to zero (reflecting the shift from something that is impossible to something that is possible), and to probabilities close to one (reflecting the shift from possible to certain). People tend to be insensitive to intermediate probabilities. The journal Management Science received two independently devised empirical studies that found this same phenomenon: Abdellaoui (2000) (for monetary outcomes) and Bleichrodt and Pinto (2000) (for health outcomes). The papers were published side by side; the impact of two such independent verifications in different domains is usually more far-reaching than what studies in isolation achieve. These two papers, together with Gonzalez and Wu (1999), finally established inverse-S as the prevailing empirical phenomenon.
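The inverse-S shape can be illustrated with the one-parameter weighting function from Tversky and Kahneman (1992); the functional form and their median parameter estimate of 0.61 for gains are theirs, while the code itself is only a sketch.

```python
def tk_weight(p, gamma=0.61):
    # Tversky and Kahneman's (1992) inverse-S probability weighting function;
    # gamma = 0.61 is their median estimate for gains.
    return p ** gamma / ((p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma))

# Small probabilities are overweighted, large ones underweighted,
# and the curve is relatively flat (insensitive) in the middle.
for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"w({p:.2f}) = {tk_weight(p):.3f}")
```

For example, w(0.01) is roughly 0.055 (more than five times the objective probability), while w(0.90) is only about 0.71, matching the over- and undersensitivity near the endpoints described above.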

Focussing on known probabilities in the health domain

Our research concerns theoretical preference foundations, empirical tests and quantitative measurements of the general models discussed, and the various concepts in these models. Inverse-S shapes, and their generalisations to the case of unknown probabilities, suggest that not so much aversion to risk, but rather a cognitive lack of comprehension prior to any attraction or aversion, is central in explaining people’s deviations from rational models. People simply do not understand adequately the concepts of probability and uncertainty and, consequently, fail to discriminate sufficiently between different levels of likelihood. This phenomenon generates the curve in Figure 3, and calls for new concepts and factors affecting risk attitudes.

Decisions taken under risk are central in the health domain, where symptoms only partially signal the relevant disease and yet treatments must be chosen, and where budgets have to be allocated to treatments with uncertain effects. For instance, the new risk theories shed new light on quality-of-life measurements, where new formulas have been proposed to improve classical evaluations (Bleichrodt, Pinto and Wakker 2001).

New comprehensive theory of risk attitude calls for paradigm shift

For the sake of simplicity, we have focused on known probabilities, a common case in the health domain. The case of unknown probabilities is more important, and more common in economics. The importance of developing separate theories for unknown probabilities had been understood since Keynes (1921) and Knight (1921) raised the issue. It took more than sixty years, however, before someone as creative as Schmeidler (1989) could develop a sensible theory for unknown probabilities, from which a sensible version of prospect theory could be derived. Only since that time do we have a sound theory of risk attitude, and much of the economics and health literature will have to be rewritten in light of these new ideas. Given that the (in our opinion outdated) equation of risk attitude with curvature of utility, with the index of relative risk aversion serving as the most commonly used parameter for risk aversion, permeates all of economic thinking and literature, such a shift in paradigm will take time. We hope to contribute to this development, and to receive and generate such contributions from our colleagues and the students of Tinbergen Institute.

References

Abdellaoui, M. (2000), Parameter-free elicitation of utilities and probability weighting functions, Management Science 46, 1497-1512.

Bleichrodt, H. and J.L. Pinto (2000), A parameter-free elicitation of the probability weighting function in medical decision analysis, Management Science 46, 1485-1496.

Bleichrodt, H., J.L. Pinto and P.P. Wakker (2001), Making descriptive use of prospect theory to improve the prescriptive use of expected utility, Management Science 47, 1498-1514.

Bleichrodt, H. and J. Quiggin (1997), Characterizing QALYs under a general rank dependent utility model, Journal of Risk and Uncertainty 15, 151-165.

Gonzalez, R. and G. Wu (1999), On the shape of the probability weighting function, Cognitive Psychology 38, 129-166.

Kahneman, D. and A. Tversky (1979), Prospect theory: An analysis of decision under risk, Econometrica 47, 263-291.

Keynes, J.M. (1921), A Treatise on Probability, Macmillan, London. Second ed. 1948.

Knight, F.H. (1921), Risk, Uncertainty, and Profit, Houghton Mifflin, New York.

Lopes, L.L. (1987), Between hope and fear: The psychology of risk, Advances in Experimental Social Psychology 20, 255-295.

Quiggin, J. (1981), Risk perception and risk aversion among Australian farmers, Australian Journal of Agricultural Economics 25, 160-169.

Schmeidler, D. (1989), Subjective probability and expected utility without additivity, Econometrica 57, 571-587.

Tversky, A. and D. Kahneman (1992), Advances in prospect theory: Cumulative representation of uncertainty, Journal of Risk and Uncertainty 5, 297-323.

Wakker, P.P. and A. Tversky (1993), An axiomatization of cumulative prospect theory, Journal of Risk and Uncertainty 7, 147-176.








Common factors in credit risk

André Lucas



André Lucas is professor of Finance at Vrije Universiteit Amsterdam. His main areas of interest include risk management and financial econometrics. He is a fellow at Tinbergen Institute and a member of the European Academic Panel on credit risk research of Standard and Poor’s. What he likes best about his area of research are the rapid developments that take place both academically and industry-wise, which require the application of up-to-date technical tools to answer empirically relevant questions.

Credit risk management is as old as the banking profession itself. As part of their role as intermediaries, banks have always had to keep track of the market value of their asset portfolio in order to maintain an adequate level of solvency and liquidity. An obvious example is a bank granting a loan to a counterparty (i.e., to a firm or an individual). In the worst case, a default, the counterparty does not repay its loan (in full). The resulting loss directly translates into a reduction in the bank’s cash flows and profits. It is therefore not surprising that banks have always put much effort into assessing the credit risk of each counterparty, both at the outset of each loan contract and during the loan period itself. This overview touches upon some recent research directions in this field of measuring and managing counterparty credit risk.

Credit scoring

The issue of counterparty credit risk assessment has a long history. The first credit-scoring models were introduced as early as the 1960s (see the overview by Altman (1983)). Most scoring models make use of accounting variables, such as profitability and liquidity ratios, to predict future default events. More recent empirical models also include additional information from financial markets (if such information is available). Typical examples include stock market returns and changes in stock volatility. The information from financial markets has added significant explanatory power for default prediction beyond the well-established accounting variables.
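A stylised scoring model along these lines might combine accounting ratios with market variables through a logistic link. All variable names and coefficients below are hypothetical illustrations of ours, not estimates from the models cited in the text.

```python
import math

# Toy credit score in the spirit of accounting-ratio scoring models, extended
# with market information (stock returns and volatility changes).
def default_probability(profitability, liquidity, stock_return, volatility_change,
                        coef=(-2.0, -3.0, -1.5, -2.0, 4.0)):
    b0, b1, b2, b3, b4 = coef                       # hypothetical coefficients
    score = (b0 + b1 * profitability + b2 * liquidity
             + b3 * stock_return + b4 * volatility_change)
    return 1.0 / (1.0 + math.exp(-score))           # logistic link

# A profitable, liquid firm with a rising stock price and stable volatility
# receives a lower predicted default probability than a deteriorating firm.
healthy = default_probability(0.15, 1.8, 0.10, 0.00)
distressed = default_probability(-0.05, 0.6, -0.30, 0.50)
print(healthy, distressed)
```

In practice the coefficients would be estimated from historical default data; the point of the sketch is only the structure, where market variables enter alongside the accounting ratios.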

Changing regulations Changes in current banking regulation have led to a strong revival in the attention devoted to assessing counterparty default and credit risk (see the Basel Committee for Banking Supervision (BCBS, 2004)). These regulations require banks to hold capital buffers in line with inter alia the credit risk of their activities. The earlier supervisory framework, effective since 1988, had a similar objective, but was too unrefined. For example, firms had to hold more capital for corporate exposures than for OECD government exposures, but capital requirements for large, internationally active companies were identical to those for small neighbourhood retail shops. The economic incentives created by this incongruity, the increased liquidity in financial markets, and the extended set of products available in financial markets to shift risk between


market participants, made a change in the regulations unavoidable. The key novelty in the new capital accord is that (under strict conditions) banks are allowed to come up with their own estimates of the default probability of each counterparty. This approach is called the internal ratings-based (IRB) approach. The internal estimates of default probabilities are translated directly into a capital requirement for that specific counterparty. As a consequence, the supervisor’s role has shifted from prescribing the capital requirements for each exposure to using historical and current data to assess the performance of the bank’s internal model in predicting default. Methods for constructing reliable credit scores, and evaluation techniques for assessing their adequacy, therefore enjoy obvious popularity.

From counterparties to portfolios

Thus far, the focus has been on individual counterparty credit risk. A bank, however, maintains a portfolio of many different exposures. While some of these may be non-listed loans, others may consist of listed securities, like bonds, stocks, and interest rate derivatives. Each of these instruments may be subject to counterparty default risk. If all exposures in the bank’s portfolio were independent, and the number of exposures sufficiently large, then one could easily compute an expected loss for the portfolio as a whole. Capital buffers and interest rate spreads could be determined accordingly. The independence assumption, however, obviously needs to be relaxed.

Figure 1 presents default rates for counterparties of different credit quality, which are (in this case) determined by Standard and Poor’s (one of the major rating agencies). The figure clearly shows that default probabilities are higher during bad economic times. The presence of such common risk factors has important consequences for credit risk management at the portfolio level. In particular, whatever the size of the portfolio, the common or systematic risk factors can never be completely diversified away. This results in a portfolio credit loss rate that remains stochastic, irrespective of the portfolio size.

[Figure 1: Default probabilities per rating class in expansions and recessions (based on data from Nickell, Perraudin and Varotto (2000)).]

The effect is illustrated using the following simple experiment. Consider a large portfolio of counterparties and a bank that has an identical exposure to each. A counterparty defaults if it experiences an asset return drop of more than two standard deviations. We assume that returns are normally distributed and that they have a constant pairwise correlation. Figure 2 presents the resulting portfolio loss distribution for various levels of correlation. As the correlation decreases, the portfolio loss distribution becomes more peaked at the expected loss. For larger correlations, portfolio losses clearly remain stochastic, with their typical long tail towards large losses.

[Figure 2: Portfolio credit loss distributions for correlated firms. Losses as a percentage of portfolio value on the horizontal axis.]
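The experiment can be sketched with a one-factor Gaussian model, a standard way to induce a constant pairwise asset correlation; the one-factor representation and all simulation sizes are our assumptions, not details spelled out in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_rates(rho, n_firms=500, n_portfolios=5000):
    """Simulated portfolio default rates under a one-factor Gaussian model.

    Asset returns X_i = sqrt(rho)*Z + sqrt(1-rho)*e_i share a common factor Z,
    giving pairwise correlation rho. A firm defaults when its return falls
    more than two standard deviations, as in the experiment in the text.
    """
    z = rng.standard_normal((n_portfolios, 1))        # common (systematic) factor
    e = rng.standard_normal((n_portfolios, n_firms))  # idiosyncratic shocks
    x = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * e
    return (x < -2.0).mean(axis=1)                    # fraction of defaults

for rho in (0.0, 0.2, 0.5):
    losses = loss_rates(rho)
    print(f"rho={rho}: mean={losses.mean():.4f}, sd={losses.std():.4f}")
```

With zero correlation the loss rate concentrates tightly around the unconditional default probability of about 2.3 percent; as the correlation rises, the common factor cannot be diversified away and the loss distribution spreads out, with a long right tail, mirroring Figure 2.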

Recovering common credit risk factors
Given the importance of dependence in credit exposures for risk management, several interesting questions can be asked. What concept of dependence is most appropriate for credit risk? How important are credit risk correlations empirically? To what extent do common credit risk factors coincide with macro-economic variables? And what are the dynamic properties of common credit risk factors? Some of our recent papers have contributed to this literature from a time-series perspective (see Koopman et al. (2005a,b,c) and Koopman and Lucas (2005)). The main idea is to use default and rating data from the major rating agencies directly to retrieve the common credit risk factors. This can be contrasted with an approach where one a priori imposes the restriction that the credit cycle should coincide with the business cycle.

Figure 3: Common credit risk factor estimated from Standard and Poor's rating and default data

A key result is shown in Figure 3 in the form of the estimated credit cycle. The common credit risk factor shows clear troughs in the high-default years of the mid-1980s, early 1990s, and early 2000s. Moreover, the common risk factor shows strong persistence over time. This requires a careful distinction between the conditional and unconditional variances of the common risk factor. This distinction appears to be much more blurred in the new supervisory framework, Basel II. In particular, the conditional variability of default rates at the portfolio level may be much less pronounced than the values prescribed by the supervisor. It also appears that one should be very careful in mapping business cycles one-on-one to credit cycles (see Couderc and Renault (2005)). Alternative dynamics, such as banking competition in lending conditions, the availability of alternative sources of funding, and even aggregate rating dynamics, may also play an important role.

Many interesting questions remain in this area, some of which have direct policy implications. The sticky nature of common credit risk factors should obviously be included in any sound portfolio credit risk management system. Interest in the dynamics of this risk factor then arises naturally. Another open issue is the magnitude of, and potential mismatch in, default dependencies. A mismatch between empirically relevant estimates of default correlations and those imposed by regulators may spur a new range of financial innovations, similar to the one prompted by the previous Capital Accord of 1988. If prescribed regulatory correlations are higher than their empirical (or market-implied) estimates, then banks may have an incentive to transfer the exposures with (too) low correlations to the market in order to obtain capital relief. This may jeopardize financial stability (despite the sophisticated, model-based approach), and thus be of major concern to all parties involved.

tinbergen magazine 12, fall 2005
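The role of persistence can be made concrete with a small numerical illustration. Assume, purely for illustration (the specification and coefficients below are not estimates from the papers cited), that the common factor follows a first-order autoregression f_t = φ·f_{t-1} + ε_t. Then the one-step-ahead (conditional) variance equals σ², while the unconditional variance is σ²/(1 − φ²), which for φ = 0.9 is more than five times larger:

```python
import random
import statistics

# Hypothetical AR(1) credit-risk factor: f_t = phi * f_{t-1} + eps_t,
# with eps_t ~ N(0, sigma^2). Coefficients are illustrative assumptions.
phi, sigma = 0.9, 1.0
rng = random.Random(42)

f, path = 0.0, []
for _ in range(200_000):
    f = phi * f + rng.gauss(0.0, sigma)
    path.append(f)

uncond_var = statistics.pvariance(path)  # simulated unconditional variance
print(f"conditional variance (one step ahead): {sigma**2:.2f}")
print(f"unconditional variance: {uncond_var:.2f} "
      f"(theory: {sigma**2 / (1 - phi**2):.2f})")
```

A risk manager who calibrates buffers to the conditional variance of a sticky factor will understate the range over which the factor, and hence the default rate, can wander over longer horizons.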

References
Altman, E. (1983), Corporate financial distress. A complete guide to predicting, avoiding, and dealing with bankruptcy, New York: Wiley.
Basel Committee on Banking Supervision (2004), Basel II: International convergence of capital measurement and capital standards: a revised framework, Report 107, Bank for International Settlements, Basel.
Couderc, F. and O. Renault (2005), Times-to-default: Life cycle, global and industry cycle impacts, FAME working paper, University of Geneva.
Koopman, S.J. and A. Lucas (2005), Business and default cycles for credit risk, Journal of Applied Econometrics, 20, 311-323.
Koopman, S.J., A. Lucas and P. Klaassen (2005a), Empirical credit cycles and capital buffer formation, Journal of Banking and Finance, forthcoming.
Koopman, S.J., A. Lucas and R. Daniels (2005b), A non-Gaussian panel time series model for estimating and decomposing default risk, TI Discussion Paper 2005-060/4.
Koopman, S.J., A. Lucas and A. Monteiro (2005c), The multi-state latent factor intensity model for credit rating transitions, TI Discussion Paper 2005-071/4.
Nickell, P., W. Perraudin and S. Varotto (2000), Stability of rating transitions, Journal of Banking and Finance, 24(1-2), 203-227.



Letters from Alumni: life after the PhD thesis defense

Tell a story
Marcel Canoy, European Commission


Marcel Canoy graduated in 1993 with a thesis entitled "Bertrand meets the fox and the owl: essays in the theory of price competition."

Tell a story: my leitmotif for many years. During my years as a PhD student at UvA, I wondered what the fun was in writing stuff that nobody reads. So I tried a few dialogues between a fox and an owl, illustrated by my sister ("Bertrand meets the fox and the owl"). I've never regretted this creative detour: there's nothing wrong with science, but there is a world out there.

This world was opened for me after a few post-docs in Leuven, Paris and Maastricht. Until recently I worked at CPB Netherlands Bureau for Economic Policy Analysis, which has an excellent mix of scientifically and analytically based economics and the policy world. I was lucky enough to work in a field that has become increasingly important (competition and regulation). Knowledge of this field in the Netherlands was (and to a certain extent still is) rather poor.

Recently, I joined the European Commission, replacing André Sapir as the chief economist of the Bureau of European Policy Advisers (BEPA), the think tank of President Barroso. How does BEPA compare with CPB? Well, both think tanks rely on the quality of their work as the principal source of their reputation. BEPA is paradoxically both more influential and less influential than CPB. It is less influential than CPB, since it has less tradition and (much) less outside exposure. It is more influential, since it works directly for the President and is therefore closer to the political 'heat'. I experienced both aspects recently. I worked the entire summer on the so-called 'European Social Model', which was discussed on October 26 at an informal Summit in Hampton Court between Barroso and the Heads of State (the 'heat'). At the same time, I also saw the other side of the coin, however, since nobody realised the work that BEPA did (and probably nobody has ever heard of BEPA to start with!). Getting back to storytelling... Within the Social Model discussion a story needed to be told, too. The political debate is dominated by vested interests preaching fear that their social model will be destroyed by neo-liberals. The real story is that appropriate economic modernisation yields both social and economic benefits, albeit at the expense of (some) vested interests (think of early retirements), but ultimately to the benefit of many.



papers in journals

How (not) to raise money
What do Eric Clapton's guitar, Margaret Thatcher's handbag, and Britney Spears' pregnancy test kit have in common? The answer: all were auctioned for the benefit of charity. Charities not only use auctions, but also organise lotteries and voluntary contributions to collect money. The coexistence of these mechanisms raises the obvious question: "Which mechanism is superior at raising money?" This article shows that "all-pay" auctions are better fundraising mechanisms than "standard" auctions, lotteries and voluntary contributions. The study assumes that bidders obtain extra utility for every dollar that is transferred to a charitable organisation. It was found that in standard auctions (in which only the winner pays), revenues are relatively low. The reason is that all bidders forgo the extra utility they obtain from a high bid by one bidder if they top this bid. Bids are suppressed as a result, and so are revenues. This problem does not occur in lotteries and all-pay auctions, where bidders pay irrespective of whether they win or lose. Bidders are willing to bid more in all-pay auctions than in lotteries, moreover, because in all-pay auctions the highest bidder always wins (in contrast to lotteries). The study introduces a general class of all-pay auctions, ranks their revenues, and illustrates how they dominate lotteries, standard auctions, and voluntary contributions. The optimal fundraising mechanism is an all-pay auction augmented with an entry fee and a reserve price. The findings of this study are not merely of theoretical interest. The frequent use of lotteries as fundraisers indicates that people are willing to accept an obligation to pay even though they may lose. All-pay auctions may be characterised as incorporating "voluntary contributions" into a standard auction. They are easy to implement and may revolutionise the way in which money is raised.

Jacob K. Goeree (California Institute of Technology), Emiel Maasland (EUR), Sander Onderstal (UvA), and John L. Turner (University of Georgia), 2005, How (not) to raise money, Journal of Political Economy 113(4), 897-918.


Business and default cycles for credit risk
Various economic theories are available to explain the existence of credit and default cycles. Some empirical ambiguity remains, however, regarding whether these cycles coincide. Recent papers suggest that defaults and credit spreads tend to co-move with macroeconomic variables. If true, this tendency is important for credit risk management as well as for regulation and systemic risk management.

This paper studies the dynamic behaviour of two important determinants of credit risk (namely the default rate and the credit spread) in their relation to business cycle developments. A multivariate unobserved components approach makes it possible to disentangle long-term patterns from shorter-term cyclical patterns. The paper explores whether cycles in credit risk factors coincide with business cycles. Toward this end, the model explicitly allows for different cyclical movements in credit risk factors and economic activity, as measured by real GDP. Under investigation is the claimed lead relationship of credit spreads over growth, the (in)congruence between credit and business cycles, and the dynamics of default rates in one unified framework. The model uses 1933-1997 US data on real GDP, credit spreads and business failure rates to shed new light on the empirical evidence. Two types of cycles are found. The first has a frequency of around six years. There is clear (positive) co-cyclicality between spreads and business failures, and (negative) co-cyclicality between spreads and GDP. The relation between GDP and business failures is insignificant at this frequency. The second type of cycle has a longer period of around 11 years. This frequency features a clear positive relation between spreads and failures, and a negative relation between GDP, on the one hand, and spreads and failures, on the other.

Siem Jan Koopman and André Lucas (VU), 2005, Business and default cycles for credit risk, Journal of Applied Econometrics, 20(2), 311-323.

Comparative advantage, relative wages, and the accumulation of human capital
How do changes in the composition of labour supply affect the relative wages of various skill types? A typical example is a reduction of the minimum wage, which effectively raises the labour supply of the least skilled workers. One would expect this to reduce the wages of this group. But what happens to the wages of other worker types? Again, one would expect this to depend on the degree of similarity of these workers to the least skilled: the more similar, the better substitutable, and the greater the extent to which their wages move in parallel with those of the least skilled. More generally: the degree of substitutability between worker types declines with their 'distance' in skill level. Although this idea seems obvious, none of the aggregate production functions currently in use yields this result. This paper shows that a simple production function, based on Ricardo's notion of comparative advantage of high-skilled workers in complex jobs, yields exactly this implication. Further, investment in the human capital of almost any worker type is shown to reduce wage differentials, except in the extreme left tail of the skill distribution. The latter exception helps to explain why active labour market programmes aiming to increase the human capital of the least skilled tend to be so ineffective: they run into adverse general equilibrium effects.

Coen Teulings (UvA, SEO), 2005, Comparative advantage, relative wages, and the accumulation of human capital, Journal of Political Economy, 113(2), 425.

discussion papers

25 years of IIF time series forecasting: A selective review
What do daily stock prices, monthly rainfall figures, weekly sales data and annual gross domestic product have in common? Answer: all are time series. That is, they are data observed at regular intervals over time. While these data may arise in very different contexts, they are, from a mathematical perspective, all very similar. People who collect such data usually want to know one thing: what does the future hold? Predicting the future values of such data is known as "time series forecasting".

Modern time series forecasting is a highly advanced computational science, involving complex mathematical models and fast computers. But 25 years ago, time series forecasting was in its infancy; the models on hand were much simpler and the computers available were primitive compared to the home PC. This paper explores the evolution and development of time series forecasting over the past 25 years. The paper marks the 25th anniversary of the formation of the International Institute of Forecasters (IIF) and the founding of the first scholarly journal of forecasting. It reviews over 300 academic papers and 17 books that have been published on the topic. One major advance during this period is the use of prediction intervals. An example: rather than give a single value for tomorrow's predicted maximum temperature, forecasters now routinely provide an interval within which the temperature is expected to fall with probability 95%. This provides a measure of uncertainty associated with the forecast. A great deal of work in the past 25 years has gone into methods for calculating such intervals accurately. The paper concludes with some forecasts of its own, regarding the future of forecasting. The authors predict that time series forecasting in the future will involve even heavier computation and that methods will be developed to deal with hundreds of time series simultaneously.

By Jan G. de Gooijer (UvA) and Rob J. Hyndman (Monash University, Australia), 25 Years of IIF Time Series Forecasting: A Selective Review, TI 05-068/4.

Second-best road pricing through highway franchising
The private supply of highway capacity offers one alternative to deal with growing traffic congestion when there are insufficient public funds to finance new capacity, and insufficient support for public road pricing. Proclaimed potential advantages of private over public highways include cost-efficiency, innovativeness, and availability of funds. A main disadvantage is the divergence between the private objective of profit maximisation and the social objective of welfare maximisation. An important question is therefore whether there are ways, particularly through the design of auctions for highway concessions, to make the private operator behave more in line with welfare-maximising price- and capacity-setting. Moreover, since the use of auctions or comparable allocation mechanisms seems to be unavoidable in the awarding of concessions for private highways, one needs to understand the potential efficiency impacts of the design of such auctions. This paper considers the welfare impacts of a range of franchising regimes for congested highways. For a single road in isolation, a

competitive auction, with the level of road use as the decision criterion, is shown to produce the socially optimal road (in terms of capacity and toll level) as the equilibrium outcome, provided neutral scale economies characterise highway operations. The auction outperforms various alternatives in which the bidders are asked to minimise the toll level or toll revenues, or to maximise capacity or the bid for the franchise. When second-best network aspects are taken into account, the patronage-maximising auction is no longer optimal. When unpriced congestion on parallel capacity dominates (i.e., there are unpriced roads or lanes parallel to the one under consideration), the second-best highway would generate losses, and the zero-profit condition becomes binding. The auction produces a below-optimal capacity. When unpriced congestion on serial capacity dominates (i.e., there are unpriced roads or lanes upstream or downstream of the one under consideration), the auction produces an above-optimal capacity. In both cases, however, the auction remains second-best optimal: it produces the highest efficiency possible under a zero-profit constraint for the road operator.

By Erik T. Verhoef (VU), Second-best road pricing through highway franchising, TI DP 05-082/3.

Characterisations of network power measures
Networks play an important role in economics and the social sciences. This study considers symmetric networks, in which the roles of the two positions on each link are symmetric. Examples are exchange, communication and disease-transmission networks. A power measure for networks assigns to every position in a network a real number that somehow reflects the importance of the positions. One of the best-known power measures is the degree measure, assigning to every position its number of direct neighbours. Another measure is the beta-measure, which distributes the power over each position equally among its direct neighbours. Taking the process one step further, the second-order beta-measure distributes the beta-power value of each position equally among its direct neighbours. By repeating this procedure, each time distributing the newly obtained power values, the study shows that this procedure has a limit, which is equal to the degree measure. Although the degree measure is usually considered to be a local power measure, the study thus shows that it can also be seen as a global measure within power-dependence theory. Finally, the study provides full axiomatic characterisations of the two measures mentioned above, showing that they differ only in the normalisation that is used. The conclusion: choosing the normalisation in particular applications should be done carefully, since different normalisations yield different power distributions.

By René van den Brink (VU), Peter Borm and Ruud Hendrickx (both Tilburg University), and Guillermo Owen (Naval Postgraduate School, Monterey, CA), Characterizations of Network Power Measures, TI 2005-06/1.
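The iteration behind this limit result is easy to verify numerically. The sketch below uses a hypothetical four-node network (connectedness and non-bipartiteness are assumed, which guarantees convergence): starting from the beta-measure and repeatedly redistributing each position's power equally among its neighbours drives the normalised power vector to the normalised degree measure.

```python
# Hypothetical symmetric network as an adjacency list (assumed example;
# connected and non-bipartite, so the iteration below converges).
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
degree = {i: len(nbrs) for i, nbrs in graph.items()}

# Beta-measure: each position shares one unit of power equally among its
# direct neighbours, so node i receives 1/deg(j) from each neighbour j.
power = {i: sum(1 / degree[j] for j in graph[i]) for i in graph}

# Higher-order beta-measures: redistribute the current power values
# equally among direct neighbours, over and over.
for _ in range(100):
    power = {i: sum(power[j] / degree[j] for j in graph[i]) for i in graph}

total_power = sum(power.values())
total_degree = sum(degree.values())
for i in graph:
    print(i, round(power[i] / total_power, 6), degree[i] / total_degree)
```

After normalisation, each node's iterated power share matches its degree share, illustrating that the degree measure is the limit of the higher-order beta-measures on this network.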


Money supply and the implementation of interest rate targets
The stance of monetary policy in industrialised countries is commonly summarised in terms of changes in a short-run nominal interest rate target. Less attention is paid to the behaviour of monetary aggregates, although money supply still serves as the main instrument of many real-world central banks. According to the conventional view on monetary policy, a rise in the supply of money is expected to lead to a decline in the interest rate. A closer look at the structural relation between money supply and interest rates in the standard general equilibrium model used for macroeconomic policy analysis casts doubts on this view. An increase in the money growth rate is actually associated with higher nominal interest rates. In these models, highly stylised interest rate targets (specifically, forward-looking Taylor rules: policy rules that are designed to stabilise inflation) cannot be implemented by non-destabilising money supply adjustments (for example, those that avoid hyperinflation equilibria). However, an interest rate target rule can be implemented by stabilising money supply changes if the rule is sufficiently inertial. This observation can be used to reconcile theory with the data. Empirical studies usually find short-run interest rates to be highly inertial. Standard macroeconomic theory, however, can hardly rationalise why a central banker smoothes interest rates. Efficiency requires non-inertial policy responses. The analysis in this paper provides an alternative explanation for interest rate inertia, namely that it is caused by policy implementation constraints (rather than being attributed to the central banker's preference for interest rate smoothing).

By Andreas Schabert (UvA), Money supply and the implementation of interest rate targets, TI 05-059/2.
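What "inertial" means here can be illustrated with a stylised, textbook-style Taylor rule with interest rate smoothing. The coefficients below (smoothing parameter, natural rate, inflation target and response coefficient) are illustrative assumptions, not values from the paper:

```python
# Stylised inertial Taylor rule (all coefficients are assumptions):
#   i_t = rho * i_{t-1} + (1 - rho) * (r_star + pi_t + phi_pi * (pi_t - pi_target))
def taylor_rate(prev_rate, inflation, rho=0.8, r_star=2.0,
                pi_target=2.0, phi_pi=0.5):
    desired = r_star + inflation + phi_pi * (inflation - pi_target)
    return rho * prev_rate + (1 - rho) * desired

# Response to inflation jumping from 2% to 4%: the desired rate is 7%,
# but with rho = 0.8 the policy rate moves there only gradually.
rate = 4.0                      # steady state when inflation equals target
path = []
for t in range(8):
    rate = taylor_rate(rate, 4.0)
    path.append(round(rate, 3))
print(path)
```

With rho = 0 the rate would jump to 7% immediately; the smoothing term rho·i_{t-1} is exactly the inertia that, in the paper's argument, can make the rule implementable through stabilising money supply changes.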


theses

The quality of political decision-making
In representative democracies, citizens give politicians the authority to decide on the implementation of a variety of policies. Although delegation has clear advantages in terms of benefits of specialisation, it may also create agency problems between citizens and the representative politicians. Politicians may exert too little effort, implement inefficient policies, extract rents, or otherwise perform badly. Principal-agent problems between citizens and politicians are thus central to the analysis of this thesis. The main objective is to explain several institutional arrangements, observed in governments, in light of the agency problems. Particular attention is paid to the role of information and politicians' motivation in the political decision-making process.

The thesis also examines the incentives of office holders to admit to a policy failure. The electoral consequences of admitting a mistake are more severe in an environment in which other politicians hardly ever admit policy failures than in one in which other politicians are likely to put their reputation at risk as well. An opportunistic political culture may consequently breed opportunistic behaviour. Further under study are the incentives for a party leader and a party's rank-and-file to replace sitting members of parliament. An incompetent leader replaces competent parliamentarians and retains other incompetent colleagues in order to reduce the risk that a future policy failure is discovered. In a decentralised party, parliamentarians are more often replaced to improve the quality of decision-making. Empirical results from the Netherlands show that political turnover is higher if parties are organised in a decentralised manner. Finally, the thesis also provides an explanation for the observed variety in the composition of committees in the U.S. Congress and for the sequential nature of information collection in budgetary systems.

Thesis: 'The quality of political decision-making: Information and motivation' by Klaas Beniers. Published in the Tinbergen Institute Research Series #359.

Risk measures and stochastic dependence, with applications to insurance and finance
Risk measures have for many years been important objects of study. Mathematically, a risk measure is a mapping from a class of random variables to the real line. Economically, a risk measure should capture the preferences of the decision maker in the economic situation under study. New postulates of risk measures should be equipped with rigorous justificative arguments. The appropriate tool to justify a risk measure is an axiomatic characterisation, which is intended to demonstrate the essential assumptions that must be imposed. A risk measure is appropriate if and only if its characterising axioms are.

This thesis presents two new axiomatic characterisations of risk measures. The first is an axiomatisation of risk measures that are additive for independent random variables. The second is an axiomatisation of risk measures that are subadditive (superadditive) for comonotonic random variables. Once a risk measure has been selected (or rather: axiomatised), the next step is to actually calculate it. This can be a non-trivial exercise, particularly in a multivariate setting, when there is stochastic dependence between the risks under consideration.

Many problems found in the areas of insurance and finance feature sums of dependent random variables. Under law invariance, the measurement of risk in such problems is reduced to determining the distribution function of the sum. However, distribution functions of sums of (dependent) random variables are typically of a complex form. Moreover, appropriately capturing the dependence structure within a random vector is a problem in its own right. For several law-invariant risk measures, such as the Value-at-Risk and the Tail-Value-at-Risk, it is in practice only the tail of the distribution function that is relevant. This work studies the limiting behaviour of the tail distribution for specific sums of dependent random variables encountered in insurance: in particular, the discounted sums of heavy-tailed losses. The study then investigates applications of risk measures to two main problems in insurance and finance: valuation in incomplete markets and solvency capital allocation.

Thesis: 'Essays on Risk Measures and Stochastic Dependence, with Applications to Insurance and Finance' by Roger J.A. Laeven. Published in the Tinbergen Institute Research Series #360.
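The two tail-based risk measures mentioned in the thesis summary, Value-at-Risk and Tail-Value-at-Risk, have simple empirical counterparts. The following minimal sketch computes both on simulated heavy-tailed losses; the Pareto distribution, its tail index of 3 and the confidence level are illustrative assumptions, not choices from the thesis.

```python
import random

def var_tvar(losses, alpha=0.99):
    """Empirical Value-at-Risk and Tail-Value-at-Risk at level alpha.

    VaR is the alpha-quantile of the loss distribution; TVaR is the
    average of the losses beyond that quantile.
    """
    xs = sorted(losses)
    k = int(alpha * len(xs))
    tail = xs[k:]
    return xs[k], sum(tail) / len(tail)

# Illustrative heavy-tailed losses: Pareto with tail index 3 (assumption).
rng = random.Random(7)
losses = [rng.paretovariate(3.0) for _ in range(100_000)]

var99, tvar99 = var_tvar(losses)
print(f"VaR(99%)  = {var99:.2f}")   # theoretical value is 0.01**(-1/3) ≈ 4.64
print(f"TVaR(99%) = {tvar99:.2f}")  # always exceeds the VaR
```

Because TVaR averages the whole tail rather than reading off a single quantile, it is the more informative of the two for heavy-tailed losses, which is one reason the tail behaviour studied in the thesis matters in practice.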

Papers in TI-ranked journals by TI Fellows 2005

AA-ranked journals
Goeree, J.K., E. Maasland, S. Onderstal and J.L. Turner, 2005, How (not) to raise money, Journal of Political Economy, 113(4), 897-918.
Teulings, C.N., 2005, Comparative advantage, relative wages and the accumulation of human capital, Journal of Political Economy, 113(2), 425-61.

A-ranked journals
Abbring, J.H., G.J. van den Berg and J.C. van Ours, 2005, The effect of unemployment insurance sanctions on the transition rate from unemployment to employment, Economic Journal, 115(505), 602.
Baye, M.R., D. Kovenock and C.G. de Vries, 2005, Comparative analysis of litigation systems: An auction-theoretic approach, Economic Journal, 115(505), 583.
Bleichrodt, H., J. Doctor and E. Stolk, 2005, A nonparametric elicitation of the equity-efficiency trade-off in cost-utility analysis, Journal of Health Economics, 24(4), 655-78.
Bleichrodt, H. and J.L. Pinto, 2005, The validity of QALYs under non-expected utility, Economic Journal, 115(503), 533-51.
Boswijk, H.P. and P.H. Franses, 2005, On the econometrics of the Bass diffusion model, Journal of Business & Economic Statistics, 23(3), 255-69.
Dellaert, B.G.C. and S. Stremersch, 2005, Marketing mass-customized products: Striking a balance between utility and complexity, Journal of Marketing Research, 42(2), 219-27.
Franses, P.H., 2005, On the use of econometric models for policy simulation in marketing, Journal of Marketing Research, 42(1), 4-14.
Franses, P.H., 2005, Diagnostics, expectations and endogeneity, Journal of Marketing Research, 42(1), 27-29.
Goeree, J.K. and C.A. Holt, 2005, An experimental study of costly coordination, Games and Economic Behavior, 51(2), 349-64.
Hommes, C., J. Sonnemans, J. Tuinstra and H. van de Velden, 2005, Coordination of expectations in asset pricing experiments, Review of Financial Studies, 18(3), 955.
Hordijk, A. and D. van der Laan, 2005, On the average waiting time for regular routing to deterministic queues, Mathematics of Operations Research, 30(2), 521-45.
Köbberling, V. and P.P. Wakker, 2005, An index of loss aversion, Journal of Economic Theory, 122(1), 119-31.
Laan, D. van der, 2005, Routing jobs to servers with deterministic service times, Mathematics of Operations Research, 30(1), 195-225.
Post, T. and H. Levy, 2005, Does risk seeking drive stock prices? A stochastic dominance analysis of aggregate investor preferences and beliefs, Review of Financial Studies, 18(3), 925.
Sandor, Z. and M. Wedel, 2005, Heterogeneous conjoint choice designs, Journal of Marketing Research, 42(2), 210-18.
Schabert, A., 2005, Identifying monetary policy shocks with changes in open market operations, European Economic Review, 49(3), 561-77.
Siegmann, A. and A. Lucas, 2005, Discrete-time financial planning models under loss-averse preferences, Operations Research, 53(3), 403-14.
Verbeek, M. and F. Vella, 2005, Estimating dynamic models from repeated cross-sections, Journal of Econometrics, 127(1), 83-102.
Vogelsang, T.J. and P.H. Franses, 2005, Testing for common deterministic trend slopes, Journal of Econometrics, 126(1), 1-24.
Wakker, P.P., 2005, Decision-foundations for properties of nonadditive measures: General state spaces or general outcome spaces, Games and Economic Behavior, 50(1), 107-25.

B-ranked journals
Berg, B. van den, H. Bleichrodt and L. Eeckhoudt, 2005, Contingent valuation: The economic value of informal care: a study of informal caregivers' and patients' willingness to pay and willingness to accept for informal care, Health Economics, 14(4), 363-76.
Bleichrodt, H. and L. Eeckhoudt, 2005, Saving under rank-dependent utility, Economic Theory, 25(2), 505-.
Bloemen, H.G. and E.G.F. Stancanelli, 2005, Financial wealth, consumption smoothing and income shocks arising from job loss, Economica, 72(3), 431-52.
Boot, A.W.A., T.T. Milbourn and A.V. Thakor, 2005, Sunflower management and capital budgeting, Journal of Business, 78(2), 501-28.
Bosman, R., M. Sutter and F. van Winden, 2005, The impact of real effort and emotions in the power-to-take game, Journal of Economic Psychology, 26(3), 407-29.
Dur, R. and H. Roelfsema, 2005, Why does centralisation fail to internalise policy externalities?, Public Choice, 122(3-4), 395.
Francois, J., H. van Meijl and F. van Tongeren, 2005, Trade liberalization in the Doha Development Round, Economic Policy, 20(42), 349-79.
Gelderen, M. van, R. Thurik and N. Bosma, 2005, Success and risk factors in the pre-startup phase, Small Business Economics, 24(4), 365.
Herings, P.J.-J., G. van der Laan and D. Talman, 2005, The positional power of nodes in digraphs, Social Choice and Welfare, 24(3), 439.
Hoekstra, J. and J.C.J.M. van den Bergh, 2005, Harvesting and conservation in a predator-prey system, Journal of Economic Dynamics & Control, 29(6), 1097-1120.
Hommes, C., H. Huang and D. Wang, 2005, A robust rational route to randomness in a simple asset pricing model, Journal of Economic Dynamics & Control, 29(6), 1043-72.
Houweling, P., A. Mentink and T. Vorst, 2005, Comparing possible proxies of corporate bond liquidity, Journal of Banking & Finance, 29(6), 1331-58.
Janssen, M.C.W., J.L. Moraga-Gonzalez and M.R. Wildenbeest, 2005, Truly costly sequential search and oligopolistic pricing, International Journal of Industrial Organization, 23(5-6), 451-66.
Koning, A.J., P.H. Franses, M. Hibon and H.O. Stekler, 2005, The M3 competition: Statistical tests of the results, International Journal of Forecasting, 21(3).
Lejour, A.M. and R.A. de Mooij, 2005, Turkish delight: Does Turkey's accession to the EU bring economic benefits?, Kyklos, 58(1), 87-120.
Listes, O. and R. Dekker, 2005, A scenario aggregation-based approach for determining a robust airline fleet composition for dynamic capacity allocation, Transportation Science, 39(3), 367-83.
Mooij, R.A. de, 2005, Empirical models and policy-making: Interaction and institutions, Economica, 72(286), 366.
Moraga-Gonzalez, J.L. and J.-M. Viaene, 2005, Dumping in a global world: Why product quality matters, The World Economy, 28(5), 669-82.
Paap, R., P.H. Franses and D. van Dijk, 2005, Does Africa grow slower than Asia, Latin America and the Middle East? Evidence from a new data-based classification method, Journal of Development Economics, 77(2), 553-70.
Pennings, E., 2005, How to maximize domestic benefits from foreign investments: The effect of irreversibility and uncertainty, Journal of Economic Dynamics & Control, 29(5), 873-89.
Stel, A. van, M. Carree and R. Thurik, 2005, The effect of entrepreneurial activity on national economic growth, Small Business Economics, 24(3).
Wennekers, S., A. van Wennekers, R. Thurik and P. Reynolds, 2005, Nascent entrepreneurship and the level of economic development, Small Business Economics, 24(3), 293-309.

theses
353 YIU CHUNG CHEUNG (31/5/2005), Essays on European Bond Markets.
354 ALJAŽ ULE (7/6/2005), Exclusion and Cooperation in Networks.
355 IBOLYA SCHINDELE (20/5/2005), Three Essays on Venture Capital Contracting.
356 MARTIJN VAN DER HEIDE (5/7/2005), An Economic Analysis of Nature Policy.
357 YONGJIAN HU (16/6/2005), Essays on Labour Economics: Empirical Studies on Wage Differentials across Categories of Working Hours, Employment Contracts, Gender and Cohorts.
358 SIMONETTA LONGHI (9/9/2005), Open Regional Labour Markets and Socio-economic Developments. Studies on Adjustment and Spatial Interaction.
359 KLAAS JAN BENIERS (1/9/2005), The Quality of Political Decision-making: Information and Motivation.
360 ROGER LAEVEN (21/9/2005), Essays on Risk Measures and Stochastic Dependence. With Applications to Insurance and Finance.
361 NEELTJE VAN HOREN (9/9/2005), Economic Effects of Financial Integration for Developing Countries.

Discussion papers

Institutions and Decision Processes
Maarten C.W. Janssen, EUR, Rob van der Noll, EUR and CPB Netherlands Bureau for Economic Policy Analysis, The Hague, Internet Retailing as a Marketing Strategy, 05-076/1
Astrid Hopfensitz, Ernesto Reuben, UvA, The Importance of Emotions for the Effectiveness of Social Punishment, 05-043/1
Harold Houba, VU, Alternating Offers in Economic, 05-065/1
Martijn Egas, Institute for Biodiversity and Ecosystem Dynamics, UvA, Arno Riedl, UvA, The Economics of Altruistic Punishment and the Demise of Cooperation
Cees Diks, Valentyn Panchenko, CeNDEF, UvA, Nonparametric Tests for Serial Independence Based on Quadratic Forms
Cees Diks, Florian Wagener, UvA, Equivalence and Bifurcations of Finite Order Stochastic Processes
Maarten Pieter Schinkel, Jan Tuinstra, UvA, Jakob

Klaas J. Beniers, EUR, Party Governance and the

Rüggeberg, LECG, Madrid and Brussels, Illinois Walls

Selection of Parliamentarians



Phongthorn Wrasai, EUR, Politicians’ Motivation, Role

Viktória Kocsis, Corvinus University of Budapest,

of Elections, and Policy Choices

Network Asymmetries and Access Pricing in Cellular Telecommunications

05-052/1 Peter Boswijk, Cars H. Hommes, Sebastiano Manzan,


UvA, Behavioral Heterogeneity in Stock Prices

Josse Delfgaauw, EUR, The Effect of Job Satisfaction on Job Search

05-055/1 Cars Hommes, UvA, Heterogeneous Agent Models: Two Simple Case Studies

Financial and International Markets 05-040/2


Leon Bettendorf, Stephanie van der Geest, EUR,

Cars H. Hommes, UvA, Heterogeneous Agent Models

Gerard Kuper, University of Groningen, Do Daily

in Economics and Finance

Retail Gasoline Prices adjust Asymmetrically?



Carl Chiarella, Tony He, University of Technology,

Harry P. Bowen, Vlerick Leuven Gent Management

Sydney, Cars H. Hommes, UvA, A Dynamic Analysis

School, Haris Munandar, and Jean-Marie Viaene, EUR,

of Moving Average Rules

The Limiting Distribution of Production in Integrated Economies: Evidence from US States and EU Countries

05-058/1 Stefano Ficco, Vladimir Karamychev, EUR, Evaluation


Problem versus Selection Problem in Organizational

Harry P. Bowen, Vlerick Leuven Gent Management


School, Haris Munandar, Jean-Marie Viaene, EUR, Zipf’s Law for Integrated Economies

05-061/1 René van den Brink, VU, Peter Borm, and Ruud


Hendrickx, CentER, Tilburg University, Guillermo

Ludger Linnemann, University of Cologne, Andreas

Owen, Naval Postgraduate School, Monterey, Ca,

Schabert, UvA, Productive Government Expenditure

USA, Characterizations of Network Power Measures

in Monetary Business Cycle Models



Hendrik P. van Dalen, EUR, Mieke Reuser,

Andreas Schabert, UvA, Money Supply and the

Netherlands Interdisciplinary Demographic

Implementation of Interest Rate Targets

Institute, The Hague, What Drives Donor Funding in Population Assistance Programs?

05-072/2 Joseph Francois, EUR and CEPR, Preferential Trade


Arrangements and the Pattern of Production and

Harold Houba, VU, Stochastic Orders of Proposing

Trade when Inputs are Differentiated

Players in Bargaining 20




J. Francois, EUR and CEPR, B. Hoekman, World Bank, Institut d'Etudes Politiques, Paris, and CEPR, M. Manchin, EUR, Preference Erosion and Multilateral Trade Liberalization

Gert-Jan M. Linders, VU, Arjen Slangen, EUR, Henri L.F. de Groot, VU, Sjoerd Beugelsdijk, Radboud University Nijmegen, Cultural and Institutional Determinants of Bilateral Trade Flows

Ludger Linnemann, University of Cologne, Andreas Schabert, UvA, Debt Non-Neutrality, Policy Interactions, and Macroeconomic Stability

Erik T. Verhoef, VU, Second-best Road Pricing Through Highway Franchising

05-088/3
Jos Van Ommeren, Willemijn Van der Straaten, VU, Identification of 'Wasteful Commuting' using Search

05-078/2
John B. Davis, UvA, Social Identity Strategies in Recent Economics

Joseph F. Francois, Hugo Rojas-Romagosa, EUR, The Construction and Interpretation of Combined Cross-Section and Time-Series Inequality Datasets

Thomas de Graaff, Henri L.F. de Groot, VU, Caroline A. Rodenburg, VU and Ernst & Young, Erik T. Verhoef, VU, The WTP for Facilities at the Amsterdam Zuidas

Roger Lord, EUR and Rabobank International, Utrecht, Antoon Pelsser, EUR and ING Group, Amsterdam, Level-Slope-Curvature - Fact or Artefact?

Wouter Vermeulen, CPB Netherlands Bureau for Economic Policy Analysis, The Hague, Jos van Ommeren, VU, Compensation of Regional Unemployment in Housing Markets

05-087/2
Antonio G. Chessa, Marije C. Schouwstra, UvA, Total Factor Productivity and the Mongolian Transition

Rob F.T. Aalbers, UvA, Herman R.J. Vollebergh, EUR, An Economic Analysis of Mixing Wastes

05-098/2
Andreas Schabert, UvA, Discretionary Policy, Multiple Equilibria, and Monetary Instruments

Labour, Region and Environment

Chris van Klaveren, Henriëtte Maassen van den Brink, UvA, Intra-household Work Time Synchronization: Togetherness or Material Benefits?

05-039/3
Jan Rouwendal, Jaap Boter, VU, Assessing the Value of Museums with a Combined Discrete Choice / Count Data Model

Chris van Klaveren, Bernard M.S. van Praag, Henriëtte Maassen van den Brink, UvA, Empirical Estimation Results of a Collective Household Time Allocation Model

Simonetta Longhi, Peter Nijkamp, VU, Forecasting Regional Labour Market Developments Under Spatial Heterogeneity and Spatial Autocorrelation

Sebastian Buhai, EUR and Aarhus School of Business, Coen N. Teulings, UvA, Tenure Profiles and Efficient Separation in a Stochastic Productivity Model

Jaap H. Abbring, VU, Jeffrey R. Campbell, Federal Reserve Bank of Chicago, A Firm's First Year

Econometrics

05-016/4
Ad Ridder, VU, Adam Shwartz, Technion Israel Institute of Technology, Large Deviations Methods and the Join-the-Shortest-Queue Model

Jaap H. Abbring, Gerard J. van den Berg, VU, Social Experiments and Instrumental Variables with Duration Outcomes

05-042/4
Robin P. Nicolai, Rommert Dekker, EUR, Automated Response Surface Methodology for Stochastic Optimization Models with Unknown Variance

05-069/3
Jos Van Ommeren, VU, Mihails Hazans, University of Latvia, Riga, The Workers' Value of the Remaining Employment Contract Duration

05-044/4
Dick van Dijk, Haris Munandar, Christian M. Hafner, EUR, The Euro Introduction and Non-Euro Currencies

05-070/3
Pieter A. Gautier, VU, Coen N. Teulings, UvA, Aico van Vuuren, VU, On-the-Job Search and Sorting

05-051/4
Reza Anglingkusumo, VU and Bank-Indonesia, Jakarta, Stability of the Demand for Real Narrow Money in Indonesia




Reza Anglingkusumo, VU and Bank-Indonesia, Jakarta, Money - Inflation Nexus in Indonesia: Evidence from a P-Star Analysis

05-060/4
Siem Jan Koopman, André Lucas, VU, Robert Daniels, De Nederlandsche Bank, Amsterdam, A Non-Gaussian Panel Time Series Model for Estimating and Decomposing Default Risk

05-066/4
Bruno Gaujal, INRIA Rhône-Alpes, Montbonnot Saint Martin, France, Arie Hordijk, Leiden University, Dinard van der Laan, VU, On the Optimal Policy for Deterministic and Exponential Polling Systems

05-067/4
Yebin Cheng, Jan G. de Gooijer, UvA, Bahadur Representation for the Nonparametric M-Estimator Under Alpha-mixing Dependence

Jan G. de Gooijer, UvA, Rob J. Hyndman, Monash University, Australia, 25 Years of IIF Time Series Forecasting: A Selective Review

05-071/4
Siem Jan Koopman, André Lucas, André Monteiro, VU, The Multi-State Latent Factor Intensity Model for Credit Rating Transitions

05-081/4
Siem Jan Koopman, Kai Ming Lee, VU, Measuring Asymmetric Stochastic Cycle Components in U.S. Macroeconomic Time Series

05-084/4
J.S. Cramer, UvA, Omitted Variables and Misspecified Disturbances in the Logit Model

05-086/4
Bernd Heidergott, VU, Arie Hordijk, Leiden University, Miranda van Uitert, VU, Series Expansions for Finite-State Markov Chains

05-089/4
Michiel de Pooter, Martin Martens, Dick van Dijk, EUR, Predicting the Daily Covariance Matrix for S&P 100 Stocks Using Intraday Data - But Which Frequency to Use?

05-091/4
Siem Jan Koopman, Marius Ooms, VU, M. Angeles Carnero, University of Alicante, Periodic Seasonal Reg-ARFIMA-GARCH Models for Daily Electricity Spot Prices

05-092/4
Jurgen A. Doornik, Nuffield College, University of Oxford, Marius Ooms, VU, Outlier Detection in GARCH Models


Tinbergen Magazine is published by the Tinbergen Institute, an economic research institute operated jointly by the Economics and Econometrics faculties of three Dutch universities: Erasmus Universiteit Rotterdam, Universiteit van Amsterdam and Vrije Universiteit Amsterdam. Tinbergen Magazine highlights ongoing research at Tinbergen Institute and is published twice a year.

Photographs
Henk Thomas, Amsterdam
Levien Willemse, Rotterdam

Editorial services
Etc. Editorial, Breda

Design
Crasborn Grafisch Ontwerpers bno, Valkenburg a.d. Geul | 05554

Printing
Drukkerij Tonnaer, Kelpen

ISSN 1566-3213

Addresses
Tinbergen Institute Amsterdam
Roetersstraat 31
1018 WB Amsterdam
The Netherlands
Telephone: +31 (0)20 551 3500
Fax: +31 (0)20 551 3555

Tinbergen Institute Rotterdam
Burg. Oudlaan 50
3062 PA Rotterdam
The Netherlands
Telephone: +31 (0)10 408 8900
Fax: +31 (0)10 408 9031




Tinbergen Research Institute

Four themes distinguish Tinbergen Institute's research programme:
I. Institutions and Decision Analysis
II. Financial and International Markets
III. Labour, Region and the Environment
IV. Econometrics and Operations Research

Each theme covers the whole spectrum of economic analysis, from theoretical to empirical research. Stimulating discussions on theories, methodologies and empirical results arise from the interaction of the Institute's faculty, comprising approximately 96 fellows. These fellows are faculty members with excellent track records in economic research, active in organising research activities, teaching graduate courses and supervising Ph.D. students.

Discussion Papers
Research is pre-published in the Institute's own Discussion Paper Series. Download discussion papers at (section 'Publications'). E-mail address for correspondence:

Tinbergen Graduate School
Tinbergen Institute offers a five-year graduate programme, consisting of two years of intensive graduate coursework in its Master of Philosophy (M.Phil.) in Economics programme and three years of Ph.D. thesis research. The M.Phil. programme is a two-year research master in economics, econometrics and finance that leads to an M.Phil. degree in economics. Due to the demanding nature of the programme, the M.Phil. is open only to a rigorously selected group of students. An excellent preparation for Ph.D. thesis research, the M.Phil. programme is connected to three-year Ph.D. positions in the economics departments of the Erasmus Universiteit Rotterdam, the Universiteit van Amsterdam and the Vrije Universiteit Amsterdam. The M.Phil. in Economics has been accredited by the Dutch and Flemish Accreditation Organisation for higher education (NVAO), and eligible students can claim two years of financial aid ("studiefinanciering"). In addition, Tinbergen Institute allocates a limited number of scholarships each year based on academic merit.

Detailed information on the institute's graduate programme and the application procedure can be found in the Graduate School section of Please send any questions to

Board
A.G.Z. Kemna (Chair), J.-W. Gunning, H. Oosterbeek, J.J.M. Kremers, C.G. de Vries

General Director
M.C.W. Janssen

Director of Graduate Studies
J.H. Abbring

Research Programme Co-ordinators
Institutions and Decision Analysis: G. van der Laan, O.H. Swank
Financial Economics and International Markets: F.C.J.M. de Jong, J.-M. Viaene
Labour, Region and the Environment: J.C.J.M. van den Bergh, E. Plug
Econometrics: R. Dekker, S.J. Koopman

Scientific Council
D.W. Jorgenson (Harvard University), M. Dewatripont (CORE), P. de Grauwe (Leuven University), D.F. Hendry (Oxford University), R.C. Merton (Harvard University), D. Mortensen (Northwestern University), S. Nickell (London School of Economics), T. Persson (Stockholm University), L. Wolsey (CORE)

Social Advisory Council
C.A.J. Herkströter (Chair), R.G.C. van den Brink (ABN-AMRO), H.J. Brouwer (DNB), M.J. Cohen (Mayor of Amsterdam), F.J.H. Don (CPB), C. Maas (ING), F.A. Maljers, I.W. Opstelten (Mayor of Rotterdam), A.H.G. Rinnooy Kan (ING), H. Schreuder (DSM), R.J. in 't Veld, P.J. Vinken, L.J. de Waal

Editorial Board Tinbergen Magazine
B.K. Bierut, T.R. Daniëls, M.C.W. Janssen, R. Mendes, F. Ravazzolo

How to subscribe?
Address for correspondence/subscriptions:
Tinbergen Institute Rotterdam
Burg. Oudlaan 50
3062 PA Rotterdam
The Netherlands
E-mail:
Address changes may be sent to the above e-mail address.
