
3 Research design

3.8 Method of data collection

and had already completed several questionnaires in their lives. The respondents were also expected to be familiar with Likert-type questionnaires as they are widely used in the profession.

The questionnaire might also come in for criticism for using an odd number of choices. Researchers have pointed out the difficulties of interpreting the mid-point (McDonough & McDonough, 1997, p. 176), namely whether choosing the middle point means that the respondent has no opinion or is simply not interested in the topic. Generally speaking, the researcher can never be certain how respondents interpret the statements or questions in a questionnaire, or whether their interpretation is similar enough to that of the researcher or to those of the other respondents. However, the questionnaire was not the primary research tool in the present investigations, and it was hoped that it would prove sufficiently reliable. The reliability statistics presented in Table 9 and Table 10 provide evidence that the tool was indeed reliable.
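To illustrate what such a reliability check involves, the sketch below computes Cronbach's alpha, a common internal-consistency statistic for Likert-type scales. It is offered only as an illustration: the statistic actually reported in Table 9 and Table 10 is assumed, not confirmed, to be Cronbach's alpha, and the response matrix below is invented rather than taken from the questionnaire data.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                          # number of items
    item_variances = item_scores.var(axis=0, ddof=1)  # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)  # variance of summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses: five respondents answering four five-point Likert items
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```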

In the last section of the online questionnaire, factual questions were asked about the participants’ mother tongue, age, gender, degree(s), teaching experience and teacher training courses. The response boxes were constructed in such a way that they would not limit participants’ choices, allowing them, for example, to tick more than one kind of institution or to add several different courses.

Participants were assured of confidentiality. The interviews lasted 16 to 45 minutes; some were recorded in the participants’ schools or workplaces, while others were recorded in public places such as cafés and a park. Three interviews were recorded in either the researcher’s or the participants’ homes, at both parties’ convenience. Because of technical difficulties, one of the participants, Maureen, was hardly audible on the recording, and her interview was therefore reconstructed on the basis of notes rather than transcribed. All the other interviews were fully transcribed. All the participants were interviewed in their mother tongue, and they all gave their consent to the use of the transcripts for research purposes.

The learner interviews were conducted with the principle in mind that they ought to resemble real-life conversations (Cohen et al., 2007) and that “qualitative research is basically communication” (Szabolcs, 2001, p. 46). Even though a semi-structured interview guide was used with each learner participant, no two interviews were conducted using exactly the same wording. The questions were slightly changed to accommodate individual differences and the relationship between researcher and interviewee. This way, as suggested by McDonough & McDonough (1997), more room was allowed for individual expression. The majority of the interviews included spontaneous, relevant answers from the participants, with questions typically remaining short and answers long, although the three youngest participants needed more prompting during the interviews, and these interviews were not as rich in data as expected. I conducted all the interviews myself and thus had the opportunity to ask for clarification, so the interviews were interpreted as they went on. Hence, the interviews met most of Kvale’s quality criteria (cited in Cohen et al., 2007).

Before conducting the teacher interviews, the teacher participants had been informed what the interview would be about. All the participants had been sent the interview guide, although not all of them wished to study it in great detail. All the teacher participants had been assured of confidentiality. The interviews lasted 24 to 60 minutes. All the interviews were conducted in the participants’ mother tongue. The interviews were recorded in the participants’ schools with the exception of one, which was conducted in an eatery. Out of the 12 interviews, one could not be recorded because of organizational difficulties; in that case, detailed notes were taken and the interview was reconstructed from them. The reconstructed interview was later sent to the participant for verification. In the case of another interview, one half had to be reconstructed because the dictaphone broke down, but the participant was willing to work on and edit the reconstructed text, so the data were not lost. All the other interviews were transcribed word for word, verified by the participants and, where necessary, amended accordingly. The transcription proceeded in several stages: I listened to the full recording once, then typed it out chunk by chunk, then listened again to check the text, and finally sent the transcript to the participants. Not every participant wished to comment on the transcripts, but some even added a few thoughts of their own.

The teacher interviews were conducted in a way that allowed participants to contribute freely whatever they felt was relevant; the semi-structured guide was there merely to provide a framework for the discussion. In this way, teachers’ narratives about their own development came to the surface. All the interviews were interactive situations in which the participants, be they learners or teachers, were given the opportunity to ask questions in return and to clarify anything that was unclear. Many of the participants actually used this opportunity, and interviewee and interviewer shaped each other’s understanding of the terms used, which is a distinctive characteristic of qualitative pedagogical enquiry (Szabolcs, 2001, p. 21).

Below is a description of the way the quantitative data were collected. After traditional piloting with print-outs, which comprised data entry and a subsequent analysis (see Soproni, 2007), a website for the questionnaire was established and the link was sent to fellow professionals, schools, school principals, school secretaries and network administrators all over the country. Over 1,200 letters were sent over a period of 18 months encouraging fellow English teachers to visit the website and submit their opinions electronically. The responses were recorded between 1 October 2008 and 21 May 2010. Some schools were also approached over the telephone to improve the response rate; for example, when official school email addresses failed to work, schools were contacted by telephone. The return rate was nevertheless low; some colleagues even contacted me to apologize for their unwillingness to participate, either for lack of time or because they were irritated by being contacted all too often for questionnaire and market surveys. Snowball sampling (Dörnyei, 2002) was likewise attempted: in the accompanying letter, teachers were asked to forward the link to fellow professionals, but it is unclear to what extent this actually happened.
