Research design and methods

Validity: Challenges in Conception, Methods, and Interpretation in Survey Research

In response to our initial call for extended abstracts for articles for this special issue on Validity in Survey Research, we were delighted, but taken aback, to receive 45 submissions! We think this high submission rate speaks to a strong interest in, and response to, the challenges related to validity and validation in survey research. We received many very strong and interesting proposals, but we were restricted in the number we could pursue and, after much debate, invited authors to submit full articles on the basis of abstracts that made a strong link to validity, had broad appeal or relevance to the journal’s readership, and provided a strong level of detail with respect to research questions, research design and results. After peer review, the special issue consists of five feature articles.

Research Design Meets Market Design: Using Centralized Assignment for Impact Evaluation

A growing number of school districts use centralized assignment mechanisms to allocate school seats in a manner that reflects student preferences and school priorities. Many of these assignment schemes use lotteries to ration seats when schools are oversubscribed. The resulting random assignment opens the door to credible quasi-experimental research designs for the evaluation of school effectiveness. Yet the question of how best to separate the lottery-generated variation integral to such designs from non-random preferences and priorities remains open. This paper develops easily implemented empirical strategies that fully exploit the random assignment embedded in a wide class of mechanisms, while also revealing why seats are randomized at one school but not another. We use these methods to evaluate charter schools in Denver, one of a growing number of districts that combine charter and traditional public schools in a unified assignment system. The resulting estimates show large achievement gains from charter school attendance. Our approach generates efficiency gains over ad hoc methods, such as those that focus on schools ranked first, while also identifying a more representative average causal effect. We also show how to use centralized assignment mechanisms to identify causal effects in models with multiple school sectors.
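
To make the logic of such lottery-based designs concrete, here is a minimal sketch, not the paper's estimator, which exploits the full assignment mechanism: a randomized offer serves as an instrument for charter attendance, with assignment-propensity strata as controls. The variable names, simulated data and effect sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Assignment propensities differ across applicant "types" (preferences and priorities);
# within a type, offers are randomized -- the lottery variation the design exploits.
p_offer = rng.choice([0.2, 0.5, 0.8], size=n)             # propensity-score strata
offer = rng.binomial(1, p_offer)                          # randomized offer (instrument)
ability = rng.normal(size=n)                              # unobserved confounder
attend = (offer * (ability + rng.normal(size=n) > 0)).astype(float)   # selective take-up
score = 0.3 * attend + 0.5 * ability + rng.normal(size=n)             # true effect = 0.3

# Just-identified 2SLS: instrument `attend` with `offer`, control for the strata.
strata = np.column_stack([(p_offer == p).astype(float) for p in (0.5, 0.8)])
X = np.column_stack([np.ones(n), attend, strata])
Z = np.column_stack([np.ones(n), offer, strata])
beta = np.linalg.solve(Z.T @ X, Z.T @ score)
print(f"IV estimate of the attendance effect: {beta[1]:.2f}  (true value 0.3)")
```

Conditioning on the propensity strata is what separates the random lottery variation from the non-random preferences and priorities; the just-identified step above is ordinary two-stage least squares with a single instrument.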

Mixed methods and triangulation in history education research: Introduction

In their articles on triangulation and mixed methods in general, Udo Kelle (Germany) and Christoph Kühberger and Roland Bernhard (Austria) give a brief overview of developments in triangulation and mixed-methods research design in the social sciences and of how these are reflected in the field of history education research. After reviewing the theoretical and epistemological debate concerning qualitative and quantitative research, the authors show how the ‘war’ between these paradigms was overcome by promoting triangulation and mixed-methods designs as a new ‘third’ way for research. A consistent typology of mixed-methods designs, however, remains a work in progress. With regard to history education, especially in German-speaking Europe in the twentieth and twenty-first centuries, the authors conclude that there has been a noticeable increase in the use of triangulation and mixed-methods designs, even though conscious methodological reflection on this is very rare.

Methods in bi- and multilingualism research

Larsen-Freeman and Cameron (2008) rightly argue that when we work with a DCT approach “…the nature of explanation changes, cause and effect no longer operate in the usual way, and reductionism does not produce satisfying explanations that are respectful to the interconnectedness of the many nested levels and timescales that exist” (2008: 241). They also provide a number of methodological perspectives to be followed from a DCT perspective and suggest the adoption of modified research methodologies, ranging from ethnography, formative experiments and action research to longitudinal, case-study and time-series approaches, micro-developmental studies, computer modelling, brain imaging and combinations of several methodologies, in order to provide valid answers to new research questions (241–50; see also De Bot, Lowie and Verspoor 2011). Examples of DCT-methodology-driven studies in the field of bilingualism and SLA are Verspoor, Lowie and Van Dijk (2008) on individual variability in the development of learners of English, Meara (2007) on the application of Boolean networks to the growth of vocabulary, and Larsen-Freeman (2013) on how to combine CALL and design-based research. A longitudinal study on the development of linguistic awareness in multilingual attrition (LAILA) is currently under way at Innsbruck University.
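
As a rough illustration of the kind of computer modelling mentioned here, the sketch below sets up a small random Boolean network and tracks how many “words” are active over time. The network size, connectivity and update rule are assumptions chosen for illustration, not a reimplementation of Meara's (2007) model.

```python
# Illustrative random Boolean network: each "word" is a node whose on/off state
# is updated from K other words via a fixed random Boolean rule.
import numpy as np

rng = np.random.default_rng(1)
n_words, k_inputs, steps = 200, 2, 50

# Each word listens to k randomly chosen other words.
inputs = np.array([rng.choice(n_words, size=k_inputs, replace=False) for _ in range(n_words)])
# One random Boolean function per word: a lookup table over the 2**K input patterns.
rules = rng.integers(0, 2, size=(n_words, 2 ** k_inputs))

state = rng.integers(0, 2, size=n_words)   # initial lexicon: some words active, some not
active_counts = []
for _ in range(steps):
    # Encode each word's input pattern as an index into its rule table, then update.
    idx = (state[inputs] * (2 ** np.arange(k_inputs))).sum(axis=1)
    state = rules[np.arange(n_words), idx]
    active_counts.append(int(state.sum()))

print(active_counts[:10], "...", active_counts[-1])
```

Running the simulation repeatedly with different seeds shows the attractor-like growth and collapse patterns that make such networks attractive as toy models of vocabulary dynamics.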

Sustainable and resilient building design: Approaches, methods and tools

The passive design concept plays an important role in reducing energy consumption, achieving energy efficiency and decreasing dependence on external energy sources, but resilience demands could nonetheless change the traditional utilisation of passive systems. To this end, the main research question concerns the functioning of region-typical passive mechanisms under future climatic conditions. In principle, the future performance of passive mechanisms applied to a building of a certain type will depend on how climate change manifests locally, as well as on the intensity and frequency of those manifestations. For instance, given the predicted temperature increase in Northern Europe, the application of passive solar design principles to maximise daylight and achieve solar heat gains will no longer be appropriate (ArupResearch+Development, 2004); new passive solutions typical of areas that already experience the corresponding climate patterns, and have developed adequate responses, could be used through a transposed regionalism approach as a basis for design redevelopment. In some warmer regions, such as the Mediterranean, passive mechanisms for combating increasing heat are already in place, as is a social adaptation that is deeply rooted in the regional culture. According to ArupResearch+Development (2004), cultures in Northern Europe will have to alter their lifestyles to accommodate the emerging climate change. Analogously, transposed regionalism may refer not just to architecture but also to culture, meaning that the social dimension of resilience inevitably calls for change. When the threshold of habits and the capacity of traditional passive systems are exceeded (and they therefore become non-responsive to climate change manifestations), adaptation to the emphasised climatic parameters can easily imply new energy demands, which is why passive measures in today’s design for the future should be maximised (Gupta & Gregg, 2012).

Research through DESIGN through research - a problem statement and a conceptual sketch

Firstly: the failure of de-contextualized scientific approaches to handle the systemic complexity of real-world situations. For a kind of programmatic statement see Weaver's initial concept of organized complexity (1948); for an account of the inherent problems in analysing / controlling / designing social systems see Luhmann (1984, 1997). Secondly: the failure to deal with future developments of real-world systems. Design is involved in proposing the new, which, by definition, is not predictable. Early Futures Studies still aimed at prediction; today there are projective and evolutionary approaches, which explore multiple futures and treat the methods as learning devices rather than as forecasting tools. These failures demand that we reconfigure and combine the above two questions into one and ask:

Final Research Report for Sound Design and Audio Player

Possible uses of the dataset are the evaluation of track identification methods when monitoring DJ mixes, or the precise annotation or even reverse engineering of DJ mixes when the constituent tracks are available. In the latter project, we perform alignment to determine the exact offset of each track in the mix, and then estimate cue points and volume fade curves in order to learn about the decisions a DJ makes when creating a mix. The UnmixDB dataset is based on the curatorial work of Sonnleitner et al., who used fingerprinting to identify the tracks within human-made DJ mixes. They collected Creative-Commons-licensed source tracks of 10 free dance music mixes from the Mixotic
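
As a hedged sketch of the alignment step described above, the snippet below estimates a track's offset within a mix by cross-correlating coarse energy envelopes. The file names, feature choice and parameters are assumptions; the actual UnmixDB tooling may use different features and alignment methods.

```python
import numpy as np
import soundfile as sf
from scipy.signal import correlate

def envelope(x, sr, hop=0.1):
    """RMS energy envelope with a hop of `hop` seconds."""
    win = int(sr * hop)
    n = len(x) // win
    return np.sqrt((x[: n * win].reshape(n, win) ** 2).mean(axis=1)), hop

mix, sr = sf.read("mix.wav")            # placeholder paths; same sample rate assumed
track, _ = sf.read("track.wav")
if mix.ndim > 1:                        # downmix to mono if needed
    mix, track = mix.mean(axis=1), track.mean(axis=1)

env_mix, hop = envelope(mix, sr)
env_trk, _ = envelope(track, sr)

# The peak of the cross-correlation gives the most likely start position of the track.
corr = correlate(env_mix - env_mix.mean(), env_trk - env_trk.mean(), mode="valid")
offset_s = int(np.argmax(corr)) * hop
print(f"estimated offset of track in mix: {offset_s:.1f} s")
```

In practice tempo changes and effects applied by the DJ make the problem harder, which is one reason fingerprint-based identification and finer alignment are needed.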

Efficient modeling and computation methods for robust AMS system design

Behavior abstraction techniques for modeling analog components at system level are an active research topic in heterogeneous SoC design. In particular, efficient power system modeling is relevant for the design of low-power embedded systems. Modeling semiconductor components such as diodes and transistors as ideal instantaneous switches leads to efficient circuit response computations. The resulting switched electrical networks are well suited for the integration of power electronic circuits into system-level models, and they provide good accuracy and simulation speed for design and validation purposes. Circuit behavior abstraction based on ideal switches allows fast and stable power control simulations, but it requires the handling of voltage and current impulses in order to determine the correct topology after switching. This chapter presents an efficient impulse analysis method at topology level that allows the prediction and logic representation of voltage and current impulses as well as fast and reliable computation of topology switching. The main advantage of the proposed topological impulse analysis method is the efficient clustering of multiple topologies that produce electrical impulses. It enables the fast identification of such topologies, for circuits containing a large number of switches, before the simulation starts. Furthermore, the proposed topological analysis method does not depend on the form in which the circuit equations are formulated, which is a key feature. To validate the proposed methodology, a SystemC AMS extension was implemented and used for the modeling and simulation of several power converters.
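
The following sketch illustrates, in a very simplified form, the idea of screening switch-state combinations before simulation: it enumerates all states of the ideal switches in a toy netlist and flags configurations in which an inductor is left without a closed current path, a classic source of voltage impulses. The netlist, the impulse criterion and the brute-force enumeration are assumptions for illustration only, not the chapter's method.

```python
from itertools import product

# Toy netlist: (name, kind, node_a, node_b); a buck-converter-like circuit (assumed).
branches = [
    ("Vin", "source",    "a",   "gnd"),
    ("S1",  "switch",    "a",   "b"),
    ("D1",  "switch",    "gnd", "b"),   # ideal diode modelled as a switch
    ("L1",  "inductor",  "b",   "c"),
    ("C1",  "capacitor", "c",   "gnd"),
]

def connected(u, v, edges):
    """Breadth-first search over the node graph spanned by `edges`."""
    frontier, seen = [u], {u}
    while frontier:
        x = frontier.pop()
        if x == v:
            return True
        for _, _, a, b in edges:
            for nxt in ((b,) if a == x else (a,) if b == x else ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return False

switches = [b for b in branches if b[1] == "switch"]
inductors = [b for b in branches if b[1] == "inductor"]

for states in product((0, 1), repeat=len(switches)):
    closed = {s[0] for s, on in zip(switches, states) if on}
    # Keep every non-switch branch plus the closed switches only.
    active = [b for b in branches if b[1] != "switch" or b[0] in closed]
    for name, _, a, b in inductors:
        rest = [e for e in active if e[0] != name]
        if not connected(a, b, rest):
            print(f"switch states {dict(zip([s[0] for s in switches], states))}: "
                  f"{name} current path broken -> impulse expected")
```

A real topological impulse analysis would of course avoid exhaustive enumeration and also cover capacitor/source loops, but the sketch shows why such configurations can be found symbolically before any numerical simulation is run.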

Trustworthy spacecraft design using formal methods

and size, we highlighted additional points on the offered modelling constructs. We recognized a need to support flows on continuous variables, used for the hybrid aspects. This would allow a component to expose its continuous evolution to its neighbouring components. Along the same lines, it would be useful to develop efficient algorithms for verifying systems with (decidable fragments of) non-linear equations, allowing for more fine-grained hybrid behaviour. Additionally, we encountered Zeno behaviour and time divergence several times (cf. Section 4.4), and found it difficult to pinpoint them manually in the model. The algorithmic detection of Zeno behaviour is an active field of research, and once it matures, it would be desirable to include it. Regarding the COMPASS toolset itself, it is pleasant not to be exposed to the underlying logic and model checking tools. For most analyses, the performance and the features are sufficient; other analyses are subject to improvement. In model checking, for example, expressing fairness constraints for the absence of starvation in the requirements is more elegant than expressing them in the model itself. For certain temporal logics such as LTL, it is possible to encode a rich class of fairness assumptions in the requirements. Regarding FMEA, we found that FMEA generation is effectively a reverse mapping of fault tree generation and hence does not complement the information provided by fault trees. What would be more useful is to understand the chain of effects (i.e. fault propagation) that starts from a failure. This would give more information on detection means and possibly on the design of the recovery procedures. Regarding performability analysis, this analysis ran out of memory after nine hours (cf. Table 6.3). The cause is the weak
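
As a generic illustration of that last point (standard LTL, not COMPASS-specific syntax), a strong-fairness assumption can be folded into the requirement itself rather than into the model, for example:

```latex
% Strong fairness discharged as a premise of a liveness requirement (illustrative pattern).
\Big( \bigwedge_i \big( \mathbf{G}\mathbf{F}\, \mathit{enabled}_i \rightarrow \mathbf{G}\mathbf{F}\, \mathit{taken}_i \big) \Big)
\;\rightarrow\;
\mathbf{G}\big( \mathit{request} \rightarrow \mathbf{F}\, \mathit{grant} \big)
```

Read: if every process that is enabled infinitely often is also taken infinitely often, then every request is eventually granted; a model that starves a process only by violating the fairness premise does not violate the requirement.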

Research and Educational Federation Testbed: A New Architecture Design for the Collaborative Research Cloud

The credibility of research results, new standards, and prototype testing depends on the method of research validation. Several methods exist for the evaluation of research. The first and most popular method is using a simulation program, because of its relative ease and low cost. The problem with simulations is that they make many assumptions to simplify calculations, which might compromise the credibility of the results. In addition, large-scale simulations require resources, or take so long, that they can go beyond what is feasible on personal computers. The second method is to acquire the hardware, which has the advantage of leading to accurate results that reflect reality; but, in addition to the high cost, it is limited in how far it can be reused across groups of researchers, and it carries the overhead of setting up both hardware and software. The last method, remotely accessible testbeds, offers researchers a more realistic validation tool while avoiding most of the problems associated with the management of hardware. Thus, it can be used to increase trust in the proposed systems in an economically feasible manner. With the massive deployment of equipment and research centers, a unified testbed center has become a prerequisite for the provision of resources. However, the independent provision

Foundations of Digital Methods: Query Design

I would like to present, first, an example of research conducted using unambiguous queries. The project in question concerns the Google image results of the query for two different terms for the same barrier: [“apartheid wall”], which is the official Palestinian term for the Israeli-Palestinian barrier mentioned previously, versus the Israeli term, [“security fence”] (see Figure 5.7). The results from these two queries present images of objects distinct from one another. The image results for [“apartheid wall”] contain graffitied, wall-like structures, barbed wire, protests, and people being somehow excluded, whereas with [“security fence”] there is another narrative, one derived through lightweight, high-tech structures. Furthermore, there is a series of images of bomb attacks in Israel, presented as justification for the building of the wall. There are also information graphics, presenting such figures as the number of attempted bombings and the number of bombings that met their targets before and after the building of the wall. In the image results we are thus presented with the argumentation behind the building of the fence. The two narratives resulting from the two separate queries are evidently at odds, and these are the sorts of findings one is able to tease out with a query design in the programme/anti-programme vein. Adding neutral terminology to the query design would enrich the findings by showing, for example, which side’s images (so to speak) have become the neutral ones.

Research through DESIGN through research

According to Rittel (1972), these “wicked problems” can only be overcome by opening up the closed algorithmic problem-solving process (1st generation methods) and instead initiating a process of argumentation and negotiation among the stakeholders (2nd generation methods). In other words, he suggests a change from 1st order observation to 2nd order observation: not systems being observed, but systems observing systems (von Foerster, 1981). Under conditions of 2nd order observation, we have to account for the fact that the problem itself is not “given” but will be designed by the stakeholders. In consequence, problems change their character in the course of the solution process. No information is available if there is no idea of a solution, because the questions that arise depend on the kind of solution one has in mind. One cannot fully understand and formulate the problem before it is solved; thus, in the end, the solution is the problem. Therefore, Rittel argues for the further development and refinement of the argumentative model of the design process and for the study of the logic of designers’ reasoning, where logic means the rules of asking questions, generating information, and arriving at judgements. Given this situation, Rittel (Cross, 1984, p. 326) states slightly ironically:

Mixed methods research: An opportunity to improve our studies and our research skills - Editorial

expansion (seeking to extend the breadth and range of inquiry by using different methods for different inquiry components). Another important issue about mixed methods is how to conduct a mixed methods study. There are two main factors that help researchers determine the type of mixed methods design that is best suited to their study: priority and implementation of data collection. Regarding priority, the mixed methods researcher can give equal priority to both the quantitative and qualitative parts, emphasize the qualitative more, or emphasize the quantitative more. This emphasis may result from the research question, from practical constraints on data collection, or from the need to understand one form of data before proceeding to the next. Implementation of data collection refers to the sequence the researcher uses to collect both quantitative and qualitative data. The options consist of gathering the information at the same time (concurrent design) or introducing the information in phases (sequential design). In gathering both forms of data concurrently, the researcher may seek to compare them to search for congruent findings. When the data are introduced in phases, the sequence relates to the objectives of the research. Thus, when qualitative data collection precedes quantitative data collection, the intention may be first to explore the problem being studied and then to follow up on this exploration with quantitative data that are amenable to studying a large sample, so that the results can be applied to a population. Alternatively, when quantitative data precede qualitative data, the intention may be to test variables with a large sample and then to explore in more depth with a few cases during the qualitative phase.

Community Health - Public Health Research Methods and Practice

Throughout the sixties, seventies and eighties much of the rhetoric in public health paid lip service to the value of community, empowerment, community-based care, population-based needs assessment and so on, but we could not see much evidence of this commitment in the day-to-day service provision of practitioners or in the design of public health interventions. Potential contributions from the social sciences tended to be overwhelmed by the appeal of the biomedical and behavioural sciences. The most common notion of community in public health was the simplest: lots and lots of people, or community as the population. This notion is illustrated in large-scale community interventions propelled by the concern to reach as many people as possible and make the best use of scarce program resources. The outcome evaluation of these interventions usually amounts to summing up the changes made by individuals in relation to the problem of interest: the greater the number of people who change, the more successful the intervention (1). The second approach to community, born out of the first, could be described as community as a “giant reinforcement schedule”, or community as setting, with aspects of that setting being used as levers to support and maintain individual behaviour change. In this approach, organizations, groups and key individuals in the community are valued for their capacity to translate the health messages of the campaign into the local culture. The evaluation of this model rests principally on aggregating changes made by individuals in the population (1,2,3,4). The third and newest approach, developed throughout the nineties, sees community as an “eco-system with capacity to work towards solutions to its own community-identified problems”, that is, as a social system. This notion of community focuses on strengths instead of merely on deficits. The evaluation in this case attempts to capture changes in community processes and structures as outcomes (1).

Towards a pluralistic conception of research methods in information systems research

development of a sound and useful terminology is a key criterion for indicating the progress of the field. Last but not least, critical rationalism restricts science to empirical hypotheses that can be tested against reality; in other words, only propositions that satisfy the claim to truth according to the correspondence theory of truth qualify as scientific. Such a restriction does not necessarily exclude the design of possible future worlds from the body of scientific knowledge: the designs could be implemented and the core hypotheses underlying the design could then be tested. However, implementing a construction in a realistic setting is often not an option, since it would require too many resources and too much time. While blueprints of future worlds cannot be entirely justified through empirical studies, Albert suggests catering for a more substantial evaluation by bridging the proposed scenario to propositions with empirical content. Among other things, this so-called “Brückenprinzip” (bridging postulate) ([Albe91], pp. 76) implies showing the feasibility of a possible world.

Game Research Methods. An Overview

in people. Differing groups do not pose a problem to within-group designs. Measuring people three times and creating two intervals in which changes can ensue freely (one before and one after the game intervention) will provide baseline knowledge of naturally occurring changes over one month; hence the baseline problem will disappear. Will it then, with reasonable confidence, be possible to conclude that a difference in antisocial tendencies between time points two and three results from the introduction of the game? If we assume that the game was introduced in September 2001, would it then be possible to rule out that a difference between the intervals was not caused by fear induced by the September 11 attack? Probably not. An elegant change can solve the problem: instead of letting all participants play the game during the second month, the participants are distributed into two groups: one plays the game, while the other continues to live their ordinary lives. Both groups would experience and react to the tragedy, while only one group had been playing the violent game. So, if a difference is seen not only between intervals one and two, but also between the two groups in interval two, it would finally be possible to establish that the game actually leads to violent behavior. A design mixing repeated measures with the creation of different groups is called a mixed design (not to be confused with mixed methods, Chapter 16), and is, if practically possible, the way to go when conducting real-world effect studies. Even though in vivo effect studies use the same logic as lab experiments, they are much closer to everyday reality: they pragmatically aim at testing the effect of something that is already happening. This makes most in vivo studies ecologically valid, but relinquishes a lot of the control scientists usually have over research environments, and thus the ability to pinpoint exactly what caused the changes seen. The dance game study cannot, for instance, be considered an adequate experiment for inferring systematic effects generalizable to all old women because of an insufficient N (number of participants), lack of blinding, and no control group. As a feasibility effect study, however, the results were interesting and even garnered mention in Nature (Payton, 2014). In the world of medical or educational research it will often be reasonable to run smaller ‘studies of convenience’ as a pilot study before larger interventions. Indeed, many research committees and funding bodies require statistically sound proof of concept before approving large-scale studies and expensive experiments.
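
As a hedged sketch of such a mixed design (a between-groups factor plus repeated measures), the snippet below simulates two groups measured at three time points, with a game "introduced" to one group before the third measurement and a shared shock hitting everyone from the second measurement onward; the group-by-time interaction is what isolates the game effect from shocks affecting both groups. Column names, effect sizes and the use of statsmodels' MixedLM are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_per_group = 40
rows = []
for group in ("game", "control"):
    for subj in range(n_per_group):
        baseline = rng.normal(50, 10)                              # stable individual level
        for t in (1, 2, 3):
            effect = 5.0 if (group == "game" and t == 3) else 0.0  # game effect after t2
            shared_shock = 2.0 if t >= 2 else 0.0                  # event hitting everyone
            rows.append({
                "subject": f"{group}{subj}",
                "group": group,
                "time": t,
                "antisocial": baseline + shared_shock + effect + rng.normal(0, 3),
            })
df = pd.DataFrame(rows)

# A random intercept per subject handles the repeated measures; the time x group
# interaction term carries the effect attributable to the game itself.
model = smf.mixedlm("antisocial ~ C(time) * C(group)", df, groups=df["subject"])
print(model.fit().summary())
```

The shared shock shows up in the main effect of time for both groups, while only the interaction term at time point three picks up the game, which is exactly the logic described in the passage above.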

Machine Learning Methods for the Design and Operation of Liquid Rocket Engines - Research Activities at the DLR Institute of Space Propulsion

Since rocket engines operate at the limits of what is technically feasible, they are inherently susceptible to anomalies [23]. The immense costs associated with the loss of a launch vehicle or a test bench clearly show the importance of a suitable condition monitoring system. Machine condition monitoring systems may have to provide a proper diagnosis in real time from existing sensor data in order to detect abnormal behavior and, for example, to trigger an emergency shutdown. Usually, the detection of faults is realized either by simply monitoring whether a sensor signal exceeds a certain threshold or by analyzing the discrepancy between sensor readings and expected values derived from a theoretical model. In cases where exact theoretical modeling is not possible or would be very costly, statistical methods or direct pattern recognition algorithms are used. ML techniques belong to this second category and have received significant attention in recent years.
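
To make the contrast concrete, here is a minimal sketch of the two detection styles mentioned above: a fixed threshold on the residual against a theoretical model, and a data-driven score over signal windows (an IsolationForest as a stand-in for the ML techniques discussed). The signal, the "model" and all thresholds are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)
expected = 20.0 + 2.0 * t                      # "theoretical model" of, say, chamber pressure
measured = expected + rng.normal(0, 0.5, t.size)
measured[700:720] += 6.0                       # injected anomaly

# (1) Threshold on the residual against the model prediction.
residual_alarm = np.abs(measured - expected) > 3.0

# (2) Pattern-recognition style: score sliding windows of the raw signal.
window = 20
segments = np.lib.stride_tricks.sliding_window_view(measured, window)
scores = IsolationForest(random_state=0).fit(segments).decision_function(segments)
ml_alarm = scores < np.quantile(scores, 0.02)

print("residual alarms:", int(residual_alarm.sum()),
      "| windows flagged by IsolationForest:", int(ml_alarm.sum()))
```

The data-driven detector needs no explicit model of the engine, which is precisely its appeal when exact theoretical modeling is impractical, but it shifts the burden to representative training data and careful threshold selection.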

Design adaptation methods for genetic association studies

Technological progress allows genomewide association studies of common SNPs to be conducted in order to identify DNA sequence variants related to disease risk. Despite substantial differences in technology, all available first-generation SNP panels rely on indirect LD mapping and offer similar levels of genomic coverage for common variants (Barrett and Cardon, 2006; Jorgenson and Witte, 2006b). For this reason, planning a genomewide association study is less constrained by LD coverage considerations and more strongly requires knowledge about the genetic model for the investigated complex trait. In the majority of cases, however, such information is missing (e.g., Ioannidis et al., 2006). A procedure for conducting such a study despite these uncertainties was developed in chapter 5. Using arbitrary information for ranking genetic markers, it was shown how to adapt the subsequent part of a genomewide study while controlling the genomewide type I error rate (FWER in the strong sense). Compared to all previously proposed genomewide multi-stage designs, where formal statistical rules have to be met, the procedure addresses the practical need for increased flexibility in ongoing genetic research using genomewide SNP panels (e.g., Hampe et al., 2007; Sladek et al., 2007; Frayling et al., 2007; McPherson et al., 2007). Moreover, the requirement to comply with formal statistical rules may be one of the main reasons why truly sequential study plans with a joint final analysis (Skol et al., 2006) are rarely found in practice.
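
As a simplified illustration of the underlying idea (not the procedure developed in chapter 5): if stage-1 data are used only to rank and select markers, then testing the selected markers on independent stage-2 data with a Bonferroni correction over the selected set controls the FWER, whatever ranking criterion was used. All numbers below are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n_markers, n_selected, alpha = 100_000, 50, 0.05

# Stage 1: any ranking criterion may be used (here: arbitrary scores under the null).
stage1_scores = rng.normal(size=n_markers)
selected = np.argsort(stage1_scores)[-n_selected:]           # carry the top hits forward

# Stage 2: independent data for the selected markers only (simulated under the null).
stage2_z = rng.normal(size=n_selected)
stage2_p = 2 * stats.norm.sf(np.abs(stage2_z))

significant = selected[stage2_p < alpha / n_selected]         # Bonferroni over selected set
print(f"markers declared significant (expected ~0 under the null): {significant.size}")
```

Because the stage-2 sample is independent of the ranking, the error guarantee does not depend on how the markers were prioritised, which is the kind of flexibility the abstract refers to; joint analyses of both stages require more careful corrections.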

Lightweight Design: Construction Methods and Vehicle Concepts

• new vehicle concepts, lightweight design concepts and a push for lighter materials.
• Focus for research and development:

Research-Based Design: Building for Children in Theory and Practice

It is therefore reasonable to expect that the traditional kindergarten model of the West German states will become a thing of the past. Day-care facilities for children are not, as a rule, able to choose the children in their care; rather, they must respond to local needs (parental wishes) and demand (that demand which is politically acknowledged as requiring to be fulfilled) and then conform to these changes. Even just taking this large age cohort into account presents a major challenge to the designers of such facilities: the age bracket encompassing children from crèche age up to nursery age is similar to that between children of primary school age and those sitting the Abitur graduation exam, and in terms of physical and mental development the margin is much greater. While study groups comprising pupils of the same age are projected to be retained in schools, at day-care centres there is a significant trend towards organisational frameworks based upon mixed age groups. Moreover, the construction of schools affords a planning phase of several years, whereas the generally smaller-scale nursery schools are forced to adapt to parental wishes and demands essentially without proper warning. A shift in the provision or scheme of local public transport, job creation, job losses or the construction of new housing estates can all decisively alter demand and the age composition of those to be accommodated, and all in the blink of an eye. A building designed nowadays as a crèche for infants may in five years' time have to offer facilities to a broad age spectrum, or even open up the premises to the local community.
