Blanchard, P., B. Rihoux & P. Álamos-Concha. 2013. Mapping Methods: An Instructors’ Survey. ECPR General Conference, Bordeaux September 2013. CONFERENCE PAPER · SEPTEMBER 2013



ECPR General Conference Bordeaux September 2013 Panel 502: Mapping Methods, Mapping Research Traditions (chairs: Philippe Blanchard and Benoît Rihoux)

Mapping Methods: An Instructors’ Survey

Philippe Blanchard1, Benoît Rihoux2, and Priscilla Álamos-Concha3

1. Introduction4

T. Kuhn (1970) saw methods as part of the core of the paradigms that structure the progress of scientific knowledge. As in other disciplines, methods in the social and political sciences unearth facts, justify evidence and guarantee the reliability of theories (Blanchard 2003). They contribute to the history of scientific disciplines and shape scientific traditions (Bryman 1984; Mahoney and Goertz 2012).

Recent developments have modified the role of methods in the social sciences and prompted its reappraisal. The decline of universal theories and explanatory narratives has somewhat displaced the focus towards procedural sophistication. Universal numeric formats and easy web publication have made massive data available, concurrently with new techniques of data collection and treatment. The Internet has also contributed to making research projects more collaborative. Research networks unify at the national, continental and world levels, thereby intensifying exchanges and competition. At the same time, since the 1960s-1970s, following an increased division of scientific work, methods have gained intellectual and institutional autonomy. Some journals, book collections, academic departments, courses and academics themselves now specialize in methods or methodology, increasing the pressure towards methodological refinement. As a consequence, taking a systematic overview of methods is of crucial importance to understand a discipline’s development.

In this paper, we see methodology as the systematic study of methods, that is, “as a wide-ranging framework for choosing analytical strategies and research designs that underpin substantive research. As such, the concept of methodology incorporates epistemological and ontological concerns; but it also concerns wrestling with specific questions about the appropriateness of a given method” (Moses, Rihoux and Kittel 2005: 56). Hence this paper takes into account any aspect of the discipline’s “know-how”.
Our target sample, made of methods instructors in the ECPR Methods Schools, reflects a significant part of the diversity of understandings of (and importance granted to) methods and methodology. Their views on methods range from the most specific (techniques and sets of techniques, method stricto sensu) to the most general (research approaches, even paradigms). This diversity of views on methods connects with the diversity of conceptions of research: What is at stake in political science today? How scientific is political science? Are there any criteria of good political science? Should these criteria be the same for all topics and all data? In a previous method mapping project, Moses, Rihoux and Kittel (2005) examined successively large-N, medium-N and small-N methods, in a comparison between European and American political

1 University of Lausanne, philippe.blanchard@unil.ch.
2 Université catholique de Louvain, benoit.rihoux@uclouvain.be.
3 Université catholique de Louvain, priscilla.alamos@gmail.com.
4 The authors warmly thank Derek Beach, Raoul Cordenillo, Clara Egger, Patrick T. Jackson, Bernhard Kittel and Luc Tonka for their suggestions about a previous version of this project. They also thank the ECPR instructors for the time spent on the questionnaire and for their encouragements. The analyses presented here are the sole responsibility of the authors. Comments welcome.



science. Here we take a more inductive perspective. N (sample size) is not used a priori as the structuring factor for method maps; it is only one of our survey questions, one out of many potential factors, in an exploratory approach. Second, we take up a non-comparative view. Although most students and a majority of instructors at ECPR Schools are affiliated with European universities, neither students’ expectations nor instructors’ pedagogy are limited to the European methodological tradition (as far as such a tradition exists). Courses rather target methods at their best in their respective fields.

2. The usual ways of dealing with methods

What is usually done with methods? How are they used to contribute to the elaboration of scientific knowledge?

The first use of methods is teaching. Most political science programmes include courses and seminars about accepted methods and techniques. Transmitting methodological know-how is increasingly considered one of the basic building blocks of a discipline. This is done in regular university curricula but also in doctoral schools and in methods schools.

The second use of methods is innovation. Improving a given method or family of methods is part of one’s recognized contribution to a discipline. This means providing new ways of doing research, based on new techniques, new designs and new field data that challenge the previous ways. This is the function of the many methodology monographs available, and of methods journals such as Sociological Methods & Research.

Combining methods is a third aspect of methodological work. Combining enables triangulating the object under study, that is, adding up the benefits of different angles, different levels of treatment, different tools. This is the motto of mixed-method or multi-method studies (Fielding and Fielding 2008; Tashakkori and Teddlie 1998).
This literature provides general rules and examples of combinations, as well as epistemological or philosophical statements about what combining means and what you should and should not do when doing so. But it does not provide any general view of the space – or respective locations – of methods.

A fourth use of methods consists in taking a view further away from daily research practices by listing and explaining methods, as well as providing an introduction and references for each of them. This is the function of methods handbooks, encyclopaedias and dictionaries, such as Goodin and Klingemann (1996) or Lewis-Beck, Bryman and Liao (2004). This kind of book helps beginners get started with, get a quick introduction to and find further readings about this or that method. It can also be useful for advanced users of a given method to get a wider and more systematic view of it.

Teaching, innovating, combining and listing: these four approaches have proved useful, especially when one wishes to learn a method already identified as relevant for her/his research project. But to do so one usually follows the advice given by her/his instructor or her/his fellow scholars or students. This way, one ends up reproducing the tracks traditional to her/his own sub-field. Methodology often complies with the endogenous logic of normal scientific progress, that is, continuing what works as long as no crisis puts an end to it (Kuhn 1970).

There are multiple, intricate causes of this endogeny. The first is ever higher methodological sophistication, with dedicated concepts, know-how and software. The cost of acquiring one method is higher than it used to be, let alone that of getting familiar with more than one. Sophistication goes with specialization, for individual scholars, research units, sometimes even disciplinary sub-fields.
Electoral studies, for example, overwhelmingly focus on regression models applied to individual profiles, closing the door to alternatives such as ethnographic observation. Sophistication and specialization also go with educational divides in the academic community. The quantitative-qualitative fracture, a sort of replica of the separation between the humanities and science criticized by Charles Snow (1969), limits the ability of young scholars to bridge divides. Last but not least, there is a lack of incentives to publish beyond the methods seen as ‘normal’ in a given field, not to mention multi-method


approaches (Bennett, Barth and Rutherford 2003). These three factors must be seen as a complex but steady system with positive feedback. The best way to win the academic war (publishing, rallying, funding) is to take sides. This paper aims at helping to limit this endogeny and to open up collaborations.

3. Why we need to map methods

What can we do to improve our view of the methodological landscape? How can we facilitate exchanges and collaboration? We propose mapping as an innovative, comprehensive approach to these questions (Moses, Rihoux and Kittel 2005). A map of political science methods requires locating the methods in use relative to each other and arranging all of them within the same space. It requires combining all the skills demanded by the tasks listed above: specific methodological know-how, a global view so as to make the map as comprehensive as possible, and knowledge of traditional methodological tracks so as to understand, comparatively, how methods are born and develop.

What are the uses of a methodological map? It enables one to locate one's particular methodological practice relative to others' and to compare them systematically. It also helps one make the best next methodological step, instead of following disciplinary traditions, available training sessions or software trends, all factors often independent of a rational or cumulative logic of scientific research. It also provides a more comprehensive view to method trainees: beyond "mixed-methods" and "multi-methods" courses, methodological maps provide synthetic views of relevant connections and combinations. It helps one figure out what methodological complementarity, combination or chaining over time is worth considering for one's research topic.

From the point of view of epistemology and the history of science, mapping methods provides a global view of the logic of disciplinary development. We assume that methods are part of the collective dynamic of science.
Disciplinary and sub-disciplinary fields are not only framed by famous scholars, central theories and reference case studies, but also by methods. The methods used in a field also contribute to its unity and continuity. In some cases, methods are part of the core disciplinary knowledge. Therefore a map of methods is also a map of a discipline. As shown by M. Pehl (2012), methodological tracks connect with sub-fields within a discipline. Reviews of scientific journals and institutional curricula (Bennett, Barth and Rutherford 2003; Billordo 2005; Boncourt 2007; Kittel 2009) provide interesting insights into the methods in use and their connection with the discipline’s sub-fields. Some sub-fields are eclectic in their methodological choices, while others are clearly tied to a given method or set of methods.

Our first try at mapping methods is based, not on methods as they are used and published, but on methods as they are taught. The colour graph below (fig. 1) was drawn from the abstracts of all courses given at the ECPR Winter and Summer Schools in Methods and Techniques in 2012-2013, respectively in Vienna (Austria) and Ljubljana (Slovenia). This course programme results from the priorities decided by the Schools’ convenors and from the availability of instructors. The Schools try to cover as much as possible of good, up-to-date methodological practice, especially methods that are not taught by universities and those that are requested by students. Hence we have at hand an interesting sample, although not an exhaustive one, of the discipline’s methodological landscape in 2012-2013. On the map we placed close to each other courses that bear similarities, based on our knowledge and on the abstracts provided by instructors for the Schools’ programmes.
We distinguished four main methodological families: case-based, interpretive, formal/experimental and statistical courses; plus fundamental courses, which prepare for specialization within a family; and software courses, which stand on the side because their function is to assist the use of different kinds of methods. We also identified basic or core groups of courses within families (inside dashed curves) and between-family ties (continuous lines). This map had two goals: describing the content of the courses and their substantive proximity, and helping students orientate themselves in a complex and growing methodological landscape.


Yet this map was not fully satisfactory. We submitted it to a group of ECPR Methods School instructors and they expressed objections to the position of their course or to the structure of a region of the map. We had to admit that so many specialized methods, with diverse technical, epistemological and sometimes ontological backgrounds, could hardly be arranged from a single observation point.

4. A bottom-up approach to methods mapping

Acknowledging the limits of our first, top-down perspective, we opted for a better documented approach: a collaborative, horizontal and inductive survey among methods experts. The rapidity with which methods develop, the diversity of tools they borrow from other fields, the fast growth of dedicated scientific software, the exponential growth of available empirical sources and the many specialized applications giving birth to sub-methods: all these factors compel us to make method mapping a collaborative task. In this respect, the ECPR instructors form a unique pool of experts combining technical knowledge and epistemological insight.

Taking stock of the recent debates in the literature on method and epistemology, such as in King et al. (1994) and in Brady and Collier (2004), we set up a grid of seventeen dimensions suited to contrasting methods (Table 1). The dimensions include research design (for example: q2. Stage of the research process at which the method operates), techniques (e.g. q8. Intensiveness of software used) and epistemology (e.g. q12. Kind of causality involved5). Besides, we assumed that methods carry conceptions and teaching practices within the discipline. Hence we also included questions about their place within the scientific field (e.g. q15. Degree of acceptance of the method) and the pedagogy used (e.g. q3. Level taught).

At the same time we kept the size of the questionnaire limited so as not to discourage respondents. We also avoided questions that would have “spoken” only to a minority of instructors, thereby generating misunderstandings and missing or inadequate replies. The seventeen questions were each associated with three to ten responses, aiming at covering most of what the instructors would feel like replying. Multiple answers were always available, making it possible to express uncertain, evolving, ambiguous positions or other kinds of non-standard responses that we would not have been able to anticipate.
We also added a “Non applicable” option and an open cell for optional “Comments” so as to catch the diversity of views and practices. A limit was that for most methods only one expert was surveyed, generating biases due to the diversity of conceptions and pedagogies related to some methods. Consequently, all questions were presented in two columns: “My own view on the method as taught in the course”/“The predominant view of the larger community of users of this method (as much as you know about this predominant view)”. This way we collected the respondent’s view, a well documented assessment of the representativeness of this personal view, as well as an assessment of the diversity of practices in the field.

The survey was submitted online in spring 2013 to all the instructors who had proposed one course or more over the most recent venues of the ECPR Methods Schools: Winter 2012, Winter 2013 and Summer 2013. We kept in a few courses that had been devised but were finally cancelled because the threshold of applying students was not reached. The resulting sample was a unique set of 55 courses, given by 49 expert respondents, out of a total of 82 courses, given by 64 experts (response rate: 67% for the courses and 77% for the respondents). A large part of the missing courses can be explained by two factors: some instructors giving multiple courses replied just once (including some

5 Q16 (Epistemology: “Could you define in one or a few words the main epistemological position attached to the method taught in your course?”) was kept open so as to catch as much as possible of the diversity of positions. The result is uneven. On one side, a third of the instructors did not reply, either refusing to declare too rigid, general positions, or uncertain about the precise technical term(s) that would fit their research practice. On the other side, the replies collected could be coded into seven quite clear stances: Analyticist; Constructivist or Interpretivist; Empirical or Empiricist; Neopositivist, Postpositivist, Positivist or Objectivist; Pluralist; Rationalist; Realist; and a few Others.



Figure 1. Training Offer at ECPR Schools in Methods and Techniques in 2012


of them who received two invitations for the same or similar courses); some instructors did not reply because the questionnaire did not fit their course well, especially courses that did not include any empirical manipulation, such as software training and mathematical courses. Therefore no obvious bias is expected due to missing courses.

Table 1. Survey questions and responses
(All questions also include "Other" and "na" response options.)

1. Course — Name of the course
   Responses: [N=82]
2. Stage — Which stage of the empirical research process does the method taught in your course mainly address?
   Responses: Research design / Data collection / Analysis / Reporting
3. Level — At which level do you teach the method your course deals with?
   Responses: Introductory / Intermediate / Advanced
4. Evidence — Which type of evidence (data, empirical material) does the method taught in your course mainly refer to?
   Responses: Individual categorical / Individual numerical / Aggregate categorical / Aggregate numerical / Visual and sound / Interviews / Focus groups / Text / Ethnographic material / Secondary data
5. AnalLevel — At which level of analysis is the method taught in your course mainly situated?
   Responses: Macro / Meso / Micro
6. N — How many empirical units does the method taught in your course mainly address?
   Responses: Single / Small or intermediate N / Large N
7. Generalization — What is the main scope of the method taught in your course?
   Responses: Case-centric / Limited generalization / Broad generalization-Inference
8. Software — How software-intensive is the method taught in your course?
   Responses: None / Some software treatment / Software-based
9. Formalization — How formalized (use of mathematical symbols) is the method taught in your course?
   Responses: Not formalized / Formalized but non-statistical / Formalized (statistical)
10. Theory — How is the method taught in your course connected with theory?
   Responses: Theory-building / Rather theory-building / Rather theory-testing / Theory-testing
11. Goal — What is the main goal of the method taught in your course?
   Responses: Comprehensive understanding / Rather more comprehension than explanation / Rather more explanation than comprehension / Explanation-causality-full inference
12. Causality — How is causality considered in the method taught in your course?
   Responses: Main attention on variation and difference-making / Main attention on invariant causal processes / Main attention on set relations / ‘Causal’ analysis not a relevant issue
13. Time — How is the time dimension considered in the method taught in your course?
   Responses: Synchronic / Diachronic (discrete) / Diachronic (process)
14. Standardization — How standardized is the method taught in your course?
   Responses: Fully standardized / Semi-standardized / Emerging
15. Acceptance — How widely accepted and practiced within political science is the method taught in your course?
   Responses: Widely accepted and practiced / Somewhat / Modestly / Not at all
16. Epistemology — Could you define in one or a few words the main epistemological position attached to the method taught in your course?
   Responses: Open-ended question, coded in 9 response items: Analyticist / Constr.Interpr / Empirical.Empiricist / Neo.Post.Positivist.Objectivist / Pluralist / Rationalist / Realist
17. Discipline — In which discipline is the method taught in your course most used?
   Responses: Political science / Sociology / Anthropology / Economics / Other social & behavioural science / Philosophy
18. Scope — What is the main scope of your course? (different research approaches, a research approach, a method, a set of techniques within a method, a specific technique)
   Responses: Different research approaches (broadest scope) / A research approach / A method / A set of techniques (within a method) / A specific technique (narrowest scope)
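Before analysis, multiple-answer categorical questions such as those in Table 1 are conventionally recoded into a complete disjunctive (indicator) matrix, with one 0/1 column per response category. The following is a minimal sketch of that recoding in Python; the course names and ticked responses are invented for illustration and do not come from the actual survey data.

```python
# Hypothetical sketch: recoding multiple-answer survey responses into a
# complete disjunctive (indicator) matrix, the usual input format for CA.

def to_indicator_matrix(answers, categories):
    """One row per course, one 0/1 column per response category.

    `answers` maps a course name to the set of categories ticked
    (multiple answers allowed, as in the survey).
    """
    courses = sorted(answers)
    matrix = [[1 if cat in answers[course] else 0 for cat in categories]
              for course in courses]
    return courses, matrix

# q6 (N): number of empirical units addressed by the method
categories = ["Single", "Small or intermediate N", "Large N", "Other", "na"]
answers = {
    "Binary logistic regression": {"Large N"},
    "Expert interviews": {"Single", "Small or intermediate N"},  # multiple ticks
    "Comparative designs": {"Small or intermediate N"},
}
courses, matrix = to_indicator_matrix(answers, categories)
```

Stacking such blocks for all 17 questions yields the kind of courses-by-responses table that correspondence analysis takes as input.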

Three preliminary remarks should be made. First, we note very high correlations between the respondent’s view and the “predominant view” on the 17 questions. Neither of these two “views” should



be seen as perfectly representing the method at hand. But used together they make the results more robust. Second, ticking several answers to a given question may be an indicator of the ambiguity of the proposed answers, or of the difficulty of locating one’s method. But neither obviously happens. Respondents did make use of multiple answers, but in a reasonable manner. The number of answers is very similar for the respondent’s and the predominant view, showing no sign of specific hesitation regarding one or the other view. Third, the final “Overall comment” cell did not reveal any major flaw or deficiency in the questionnaire. Two substantive difficulties were expressed: how to describe one predominant view if practices in a given field are diverse, and how to choose between responses that rely on categories that contradict the course as it is given. Both difficulties could be solved partly by using multiple responses. As for the second criticism, made by an expert in multi-method research and probably shared by a few other instructors, we fully agree about the limits of the survey’s wording. Let us just recall at this stage that this wording is precisely the one used in the current methodological debates and conflicts.

5. The qualitative vs. quantitative cleavage: still present but not consensual

We chose to apply correspondence analysis (CA)6. CA is unrivalled as a method to explore survey data made from multiple multiple-answer categorical questions (Blanchard and Patou 2003). Examining all questions one by one, for example using cross-tabulations, would be inefficient and time-consuming, and it would not provide a broad picture of what responses combine with what other responses. A regression model would imply an early decision on what should explain what. On the contrary, CA reveals the overall structure of the data, that is, how questions, responses and courses associate with each other and in what proportion, without requiring precise prior hypotheses.
This is done in three steps (Greenacre 2007; Le Roux and Rouanet 2004). First, the data structure is summarized into “factors”, that is, recurring combinations of responses that characterize some of the responses and courses. Then the factors are represented as axes on “factorial maps”, with responses and courses as points on the map. Each factor contrasts two groups of responses that tend not to be associated with the same courses, for example left-hand responses against right-hand responses on the horizontal axis. The closer two response-points are, the more they were chosen by common respondents. The closer two course-points are, the more similar their description by instructors. Finally, clouds of points can be gathered into clusters that form a typology of responses, associated with a typology of courses. Types are described by means of scores on axes and cross-tabulations.

We include all 18 variables (see Table 1: course names + 17 questions) together in the CA. At this point, no hypothesis needs to be formulated, except the major assumption that the 17 questions will provide a relevant description of the courses’ methodological and epistemological profiles. The most structuring responses and courses will emerge on the maps, together with additional statistics that are used here but not reproduced for lack of space.

Figure 2 is based on the first (i.e. main) two axes. Together they account for 21% of the total variance, i.e. of all the information contained in the questionnaire. Three clusters of courses clearly distinguish themselves. The instructors on the left-hand side (i.e. the first cluster of a typology in 3 clusters, C1/3, gathering 35 courses) see causality and inference as the main Goal (q11) of their method, with broad generalization as the dominant Scope. They apply formal and statistical models (q9: Formalization) to large-N datasets (q6: N) made of individual and aggregate, categorical and ordinal-numerical data (q4: Evidence).
They make ample use of software tools (q8: Software), and economics is their most specific Discipline (q17).
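The factor-extraction step described above can be illustrated in code. One standard way to compute CA is a singular value decomposition of the matrix of standardized residuals; the sketch below is a minimal illustration in Python with an invented toy table, not the software actually used for the analyses in this paper.

```python
import numpy as np

def correspondence_analysis(table):
    """Minimal CA: SVD of the matrix of standardized residuals.

    Returns principal row/column coordinates and the share of total
    inertia (variance) carried by each axis.
    """
    P = table / table.sum()                 # correspondence matrix
    r = P.sum(axis=1)                       # row masses
    c = P.sum(axis=0)                       # column masses
    # standardized residuals: (P - r c^T) scaled by sqrt of the masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U * sv / np.sqrt(r)[:, None]     # principal row coordinates
    cols = Vt.T * sv / np.sqrt(c)[:, None]  # principal column coordinates
    inertia = sv**2 / (sv**2).sum()         # share of variance per axis
    return rows, cols, inertia

# Toy counts: 4 courses x 3 response categories (invented values).
# Courses 1-2 share a response profile; courses 3-4 diverge from them.
table = np.array([[8., 1., 1.],
                  [7., 2., 1.],
                  [1., 6., 3.],
                  [1., 2., 7.]])
rows, cols, inertia = correspondence_analysis(table)
```

On such data the first axis separates the two profile groups, and courses with similar response patterns land close together, which is exactly the property the maps below exploit.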

6 A course on Correspondence Analysis and Related Descriptive Multivariate Statistics will be proposed at the ECPR Winter School, 2014. See http://new.ecpr.eu/Events/PanelList.aspx?EventID=17.



The bottom-right corner (C2/3, n=7 courses) gathers instructors who did not reply to questions that make no or little sense for their topic. Indeed, software and mathematics training do not bear much connection with Theory (q10), they are not applied to specific Goals (q11) and, being used by all Disciplines (q17), they do not belong to any specific one. The top-right corner (C3/3, n=23) is composed of courses whose Evidence (q4) is interviews, text, visual and sound material, or material from focus groups and ethnographic investigation. These courses are specifically interested in case studies and rely mainly on constructivist, interpretivist or realist Epistemologies (q16). They are hardly formalized and they understand method in a pluralistic way, as an introduction to different research approaches (q18: Scope).

To summarize, this first map reveals, first, the specific position of some courses limited to one step of the research process and therefore not concerned by the survey as much as others are (C2/3). Prototypical examples are an Introduction to SPSS and Linear Algebra and Calculus. This is not an artefact of the survey sample: some research communities actually define their method in the first place by a given tool or family of tools, such as lexicometry (e.g. the French-speaking community using the Alceste software), Computer Assisted Qualitative Data Analysis Software (CAQDAS, especially Atlas.ti or NVivo), or network analysis. Centring research on a new and innovative tool can be a fruitful step, before a real method develops. More discreetly, some tools may orient research in a community without its members realizing it, simply by not providing some useful options. For example, much of the descriptive and inferential statistics in political science has long been guided by what SPSS or Stata programmers decided to program.
Only statistics specialists would complement these with specialized software, or move to more demanding generalist software such as R.

As a second result from this first map, we find the usual contrast between so-called quantitative and qualitative (QQ) methods (C1/3 vs. C3/3). Prototypical examples are, respectively, courses about Spatial voting and Binary logistic regression, and Expert interviews and Participatory and Deliberative Methods. This split is rooted in a long-standing methodological and epistemological tradition. Vivid debates took place between German and Austrian historians and social scientists during the Methodenstreit of the 1880s and 1890s. They were continued by the controversies and confrontations between the schools of Frankfurt and Vienna (Adorno et al. 1976), as well as between the schools of Columbia and Chicago. The present methodological landscape largely inherits from rich debates that degenerated into rigid borders with high risks for trespassers. Nowadays the QQ opposition, and some desperate efforts to overcome it, is the prominent figure of this evolution (King et al. 1994; Brady and Collier 2004; Monroe 2005; Blanchard 2010).

What our sample shows is that instructors in C3/3 take sides more clearly in matters of general epistemological position (mainly constructivism and interpretivism) than those in C1/3 (mainly empiricism and analyticism). The qualitative side seems to be either more coherent epistemologically, or more conscious of this coherence, or more prone to affirm it in order to distinguish itself. Conversely, the quantitative side shows more epistemological diversity, or just less interest in defining itself in epistemological terms. Some of the related methods might not feel the need to worry about their philosophical meaning, which they see as trivial or of secondary importance compared to concrete research itself. Yet the QQ divide is not consensual.
Several courses, including courses based on empirical data and with a comprehensive view of the research process, are very weakly connected with axis 1. This is the case for Comparative designs, Game theory, Qualitative comparative analysis and fuzzy sets, or Sequence analysis. The related instructors apparently do not define their teaching along the QQ line. Second, “QQ” is only an a posteriori label for the opposition between C1/3 and C3/3. Instructors might not agree with what is commonly implied by this label. We also see “quantitative” and “qualitative” as very weak concepts, which obscure our understanding of what is at work in scientific practice more than they enlighten it. What is meant here is that there are similarities between methodological and epistemological options within C1/3 and within C3/3. Further analysis should detail what these similarities imply for concrete research: Are methods within one family really close to each other? Do they complement each other and can they be implemented in “mixed” projects? Are methods from distinct families incompatible?


Figure 2. A map of methods (factorial map 1). Green dots represent the 15% of responses that contribute the most to the map, with size proportionate to their contribution. Black names represent courses, as illustrative variables, i.e. they do not contribute to the map’s structure. The three-cluster typology (red dots) is created by means of ascending hierarchical cluster analysis applied to CA scores. Reading example: Causality.p.Regul:y (top-left green name) represents the courses whose instructors replied that, regarding Causality (see Table 1: q12), the main attention is put on variation and difference-making (regularities). This holds for the predominant view (.p.), as opposed to the respondent’s view (.r.). The final :y differentiates courses that do so from courses that do not (:n) (top-left point in the lower-right quadrant). Courses ending with “I/II” are basic/advanced training steps in the same topic. “W12/S12/W13” mark different versions of the same course given in the successive Schools.
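The legend above mentions ascending hierarchical cluster analysis applied to CA scores. The agglomerative logic behind it (start from singletons, repeatedly merge the two closest clusters) can be sketched as follows; the two-dimensional scores, the number of clusters and the average-linkage choice are invented for illustration, and the authors' actual clustering software is not specified in the text.

```python
import numpy as np

def agglomerative_clusters(points, k):
    """Toy ascending (agglomerative) clustering with average linkage:
    start from singleton clusters and repeatedly merge the two closest
    clusters until only k clusters remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # average pairwise distance between the two clusters
                d = np.mean([np.linalg.norm(points[i] - points[j])
                             for i in clusters[a] for j in clusters[b]])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] = clusters[a] + clusters[b]  # merge b into a
        del clusters[b]
    return clusters

# Invented CA scores on the first two axes for six hypothetical courses:
scores = np.array([[-1.9,  0.1], [-2.1, -0.2],   # one tight pair
                   [ 1.8,  1.2], [ 2.0,  1.0],   # a second pair
                   [ 1.5, -1.6], [ 1.7, -1.4]])  # a third pair
clusters = agglomerative_clusters(scores, 3)
```

Cutting the merge sequence at three clusters recovers the three pairs, which mirrors how the three-cluster (and later six-cluster) typologies are read off the CA maps.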


Figure 3. A map of methods (factorial map 2). Same legend as map 1, but with six-cluster typology.


6. Varieties of “qualitative” methods

By contrast with axes 1 and 2, axes 3 and 4 clearly materialize epistemological contrasts (Figure 3). As the first three clusters are not well distinguished on this map, we refine the classification and move up to 6 clusters. This refinement does not impinge on the courses that generate many missing values: C2/3 is equivalent to C3/6, showing that this set of courses is cohesive and that our interpretation of missing values was reliable. But the new clustering generates three interesting new clusters.

On the left-hand side, C6/6 gathers eleven courses, among which Discourse analysis, Introduction to Atlas.ti, Foreign Languages in Qualitative Research and Participatory and Deliberative Methods. These methods put the accent on case studies, on data collection at the micro level, on data analysis and theory-building, with constructivist and interpretivist epistemological stances. They refer to anthropology and “other social sciences”, probably linguistics, iconography, ethnography or literature.

In the upper-right quadrant, C4/6 is composed of six courses, among which QCA and fuzzy sets and Comparative designs, which take intermediate stances in several respects. Instructors giving these courses show an interest in causality, non-statistical formalization, a limited ambition to generalize (that is, more than case-centric but less than inference) and modest software use. They favour intermediate Ns, analysis at the macro level and a synchronic time dimension. They take up (neo/post)positivist and/or objectivist epistemological positions. On the first map, C4/6 is located in between C1/6 and C6/6.

In the lower-right quadrant, C5/6 gathers seven courses, including Methodological pluralism and problem-focused research, Issues in Political language and Writing ethnographic and other qualitative/interpretive research.
These courses focus on theory-building, as C4/6 does, but they distinguish themselves by putting the accent on single-case studies with no formalization and no software use. They have no causal purpose, or a causality focused on invariant causal processes (“mechanisms”). Their epistemology is mostly realist. Two courses located in the corner, Knowing and the Known and Mathematics for political science, can be seen as nearly independent cases in this cluster. They express their universal significance and applicability by citing philosophy as a reference discipline. Conversely, they neutralize questions about empirical materials and designs, either by ignoring them or by ticking many responses. These two courses mainly fit within C5/6 through their similarities with two other courses that provide an overview of social and political research: Methodological pluralism and problem-focused research and An Introduction to Qualitative Methods for Political Scientists. To summarize, this second map mainly distinguishes three varieties within the big initial “qualitative” cluster (C3/3). Each of these varieties considers specific evidence, at specific scales and using specific tools. The “quantitative” paradigm previously represented by C1/3 is much less differentiated on this second map. C1/6 keeps together most of the courses from C1/3, with similar response patterns. This shows that C1/3 was more cohesive than C3/3. The few “quantitative” courses missing from C1/6 (Data access and management, Introduction to R and Structural equations) have “emancipated” themselves and now form C2/6. They claim a specific focus on individual-level data, economics as the main discipline of application, as well as causality, theory-testing and broad generalization as goals. We do not have enough space here to present an extended analysis of axes 5 and beyond. Suffice it to say that these axes do provide indications of further cleavages, including between the courses that formed C1/3.
They also show the role played by questions that have not been cited so far, such as the stage of the research process involved (q2), the role of time (q13) and disciplines (q17).
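Moving from the three-cluster to the six-cluster typology amounts to cutting the same hierarchical tree at a finer level, which is why each finer cluster nests inside exactly one coarser cluster (and why a cluster such as C2/3 can survive unchanged as C3/6). A small illustration on synthetic course coordinates, using SciPy’s hierarchical clustering as an assumed toolchain:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
pts = rng.normal(size=(40, 2))    # synthetic course coordinates on a factorial map

Z = linkage(pts, method="ward")
coarse = fcluster(Z, t=3, criterion="maxclust")   # three-cluster typology
fine = fcluster(Z, t=6, criterion="maxclust")     # refined six-cluster typology

# Cuts of the same dendrogram are nested: each fine cluster
# sits entirely inside a single coarse cluster.
for f in np.unique(fine):
    assert len(np.unique(coarse[fine == f])) == 1
```

Because the cuts are nested, refining the typology can only split existing clusters, never reshuffle courses across them.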



7. Further directions
The main added value of method mapping through an expert survey and correspondence analysis, over other approaches (dictionaries, encyclopaedias, handbooks and monographs), is to provide at the same time a global view of the methodological landscape and up-to-date conceptual and technical detail on local similarities between methods. Yet the results presented here are preliminary. We drew the global maps, but many sites (courses and course features) and connections between sites (similarities between courses) remain to be explored. More light should be brought onto the moving regions and continents, whether methodological families with known common roots or distinct methods that converge unintentionally. One main result already emerges: the so-called QQ divide does structure the field, but not as substantially as is often assumed. Previous descriptions of the two “traditions” (Mahoney and Goertz 2012) have often amplified the most contrasting dimensions in order to understand why the divide is so entrenched. The method mapping survey makes it possible to take more dimensions seriously, in an inductive manner, and lets concrete similarities and dissimilarities appear. This should help understand new methodological comparisons and combinations, as well as new ways of teaching methods. In this respect, a group of courses seems to experiment with different ways out of the QQ story: exploring the concepts that found the social sciences so as to emancipate from rigid traditions (e.g. Knowing and the Known); exploring social reality from new angles such as intermediate levels and Ns (QCA and Fuzzy Sets); or combining diverse tools in the study of new objects (Process-tracing, Sequences). Only the detailed examination of each method can help understand the global evolution of methodology. The survey could be continued in several directions. We could interview more experts, be they method instructors or other scholars.
We could survey them on new methods to enlarge the corpus, and on the same methods in order to capture alternative conceptions. We could add new questions about forgotten dimensions, or about the methods that experts see as connected to (or disconnected from) the one they teach or practice. We could also make it a collaborative survey, for example by letting respondents suggest additional dimensions they find relevant. Suggestions are welcome.

Bibliography
ADORNO Theodor W., Hans ALBERT, Ralf DAHRENDORF, Jürgen HABERMAS, Harald PILOT and Karl POPPER. 1976. The Positivist Dispute in German Sociology. London: Heinemann and Harper Torchbook
ALMOND Gabriel A. 1988. “Separate Tables: Schools and Sects in Political Science.” PS: Political Science and Politics 21 (4): 828-842
BENNETT Andrew, A. BARTH and K.R. RUTHERFORD. 2003. “Do we preach what we practice? A survey of methods in political science journals and curricula.” PS: Political Science and Politics 36 (3): 373-378
BILLORDO Libia. 2005. “Publishing in French Political Science Journals: an Inventory of Methods and Sub-fields.” French Politics 3: 178-186
BLANCHARD Philippe. 2003. “Introduction.” Pp. 9-18 in Méthodes et outils des sciences sociales. Innovation et renouvellement, edited by P. BLANCHARD and T. RIBÉMONT. Paris: L'Harmattan



BLANCHARD Philippe. 2010. “Sorcery and the sociology of academic publishing.” Conference Selective affinities, friendship and obligations in the investigations in sociology and political science: fieldwork in a comparative perspective in Europe, Freiburg, Germany
BLANCHARD Philippe and Charles PATOU. 2003. “Les usages de l’analyse factorielle dans les sciences sociales en France.” Pp. 85-110 in Méthodes et outils des sciences sociales. Innovation et renouvellement, edited by P. BLANCHARD and T. RIBÉMONT. Paris: L'Harmattan
BONCOURT Thibaud. 2007. “The Evolution of Political Science in France and in Britain: A Comparative Study of Two Political Science Journals.” European Political Science 6 (3): 276-294
BRADY Henry and David COLLIER (eds.). 2004. Rethinking social inquiry: Diverse tools, shared standards. Lanham, MD: Rowman and Littlefield
BRYMAN Alan. 1984. “The Debate About Quantitative and Qualitative Research: A Question of Method or Epistemology?” British Journal of Sociology 35 (1): 75-92
FIELDING Jane and Nigel FIELDING. 2008. “Synergy and synthesis: integrating qualitative and quantitative data.” Pp. 555-571 in P. Alasuutari, J. Brannen and L. Bickman (eds.), The Sage Handbook of Social Research Methods. London: Sage
GOODIN Robert E. and Hans-Dieter KLINGEMANN. 1996. “Political Science: The Discipline.” Pp. 3-49 in R.E. Goodin and H.-D. Klingemann (eds.), A New Handbook of Political Science. Oxford: Oxford University Press
GREENACRE Michael J. 2007. Correspondence analysis in practice (2nd ed.). Boca Raton: Chapman & Hall
KAUFMAN-OSBORN Timothy V. 2006. “Dividing the Domain of Political Science: On the Fetishism of Sub-fields.” Polity 38: 41-71
KING Gary, Robert O. KEOHANE and Sidney VERBA. 1994. Designing social inquiry: Scientific inference in qualitative research. Princeton, NJ: Princeton University Press
KITTEL Bernhard. 2009. “Eine Disziplin auf der Suche nach Wissenschaftlichkeit: Entwicklung und Stand der Methoden in der deutschen Politikwissenschaft” [A discipline in search of scientificity: development and state of methods in German political science]. Politische Vierteljahresschrift 50 (3): 577-603
KUHN Thomas. 1970. The structure of scientific revolutions. Chicago: University of Chicago Press
LE ROUX Brigitte and Henry ROUANET. 2004. Geometric data analysis: from correspondence analysis to structured data analysis. Dordrecht: Kluwer Academic Publishers
LEWIS-BECK Michael S., Alan BRYMAN and Tim Futing LIAO (eds.). 2004. The Sage encyclopedia of social science research methods. Thousand Oaks: Sage
MAHONEY James and Gary GOERTZ. 2012. A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences. Princeton: Princeton University Press
MONROE Kristen (ed.). 2005. Perestroika!: The Raucous Rebellion in Political Science. New Haven: Yale University Press
MOSES Jonathon, Benoît RIHOUX and Bernhard KITTEL. 2005. “Mapping political methodology: reflections on a European perspective.” European Political Science 4: 55-68
PEHL Malte. 2012. “The study of politics in Germany: a bibliometric analysis of sub-fields and methods.” European Political Science 11: 54-70



RIHOUX Benoît, Bernhard KITTEL and Jonathon MOSES. 2008. “Political science methodology: Opening windows across Europe and the Atlantic.” PS: Political Science and Politics 41 (1): 255-258
SNOW Charles P. 1993 (1959). The two cultures. Cambridge: Cambridge University Press
TASHAKKORI Abbas and Charles TEDDLIE. 1998. Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage


