The effect of a teaching intervention on students’ online research skills in lower secondary education
Tuulikki Alamettälä and Eero Sormunen.
Introduction. Information literacy skills are crucial in today’s world, but teaching these skills is challenging and calls for new pedagogical approaches. This paper reports the results of a teaching intervention designed by practicing teachers in a lower secondary school.
Method. A quasi-experimental pre-test/post-test design was used to investigate the effect of the intervention. Students’ learning outcomes were measured in four component skills of online research: search planning and query formulation, Web searching, critical evaluation, and argumentative use of Web information.
Analysis. A mixed between-within subjects ANOVA (analysis of variance) was conducted to investigate the impact of the intervention on students’ online research skills.
Results. The intervention group outperformed the control group in an online research performance test. The intervention effect was most powerful among the students who were less active Web searchers or social media users or among those with lower self-efficacy in online research. Surprisingly, the students who had a positive attitude towards traditional teacher-centred learning improved their skills, but the attitude towards independent online learning did not make a difference in learning outcomes.
Conclusions. Even individual teachers may draw inspiration and ideas from research-based pedagogies, develop their professional practice effectively, and produce measurable effects on students’ learning.
Introduction
In today’s Internet-centred information environment, people need online research skills to make sense of controversial issues typical of public debates and everyday life. Online research skills refer to the competences of searching, evaluating, and synthesising information on the Internet. In information sciences, these skills are traditionally referred to as information literacy (Limberg et al. 2008; Kuhlthau et al., 2015). In this paper, we focus solely on students’ work on the Web, and we call these competences online research skills. This is in line with the conceptualisation adopted in the study of online reading and comprehension (Leu et al., 2015).
Student-centred learning, including learning by searching, evaluating and integrating information from multiple sources, has become a common practice in schools (Alexandersson and Limberg, 2012; Lundh, 2011; Rouet and Britt, 2011). Yet, recent research shows that students’ skills are underdeveloped (Kaarakainen et al., 2018; Coiro et al., 2015; Kiili and Leu, 2019). Students’ and teachers’ blind trust in search engines and their unawareness of the biases caused by search algorithms are also fundamental aspects of the problem (Haider and Sundin, 2019; Sundin and Carlsson, 2016). All in all, this is a complex set of skills that is not easy to acquire (Brand-Gruwel et al., 2005).
Teaching online research skills is a challenge for teachers. Information literacy (including online research skills) receives little explicit attention in teacher education (Duke and Ward, 2009; Tanni, 2013). Practicing teachers are uncertain about effective teaching practices (e.g., Colwell et al., 2013). Studies suggest that information literacy instruction is often weakly designed and technically oriented, leaving the learning process without sufficient attention (Limberg et al., 2008). However, some studies show that individual teachers do actively develop their professional practice in information literacy instruction (Sormunen and Alamettälä, 2014).
Research-based pedagogical models have been developed for information literacy instruction in library and information science. Guided inquiry design is one of the most established frameworks (Kuhlthau et al., 2015). This framework is based on extensive studies of students learning through research assignments. It provides a framework to help schools develop their curricula and detailed guidelines for teachers to guide students through the inquiry process. It is grounded in the idea that information literacy is developed by training information practices in a collaborative inquiry process. In the process, the teacher identifies the crucial points at which students need support and offers targeted guidance (Kuhlthau et al., 2015).
Although novel pedagogical frameworks have been developed, it is not easy for an individual teacher or teacher team to implement them in the classroom. The suggested models, for example guided inquiry design, assume that the renewal of the pedagogical practice is a school-wide process. Overall, we have little research-based evidence of the effects of novel pedagogies in online research. Some researcher-designed teaching interventions, especially those that are curriculum-embedded, have been shown to be helpful for students (Macedo-Rouet et al., 2013; Chu et al., 2011). However, we lack studies in which a pioneering teacher or teacher team designs an intervention to improve their professional practice of teaching online research skills. Another problem is that most studies deal with short-term interventions in which a reliable measurement of development in complex competences is difficult. Studies on the long-term effects of teaching interventions on online research skills are urgently needed (Bråten et al., 2011).
This study aims to fill the gap in research regarding teacher-designed interventions to develop students’ online research skills. In a previous paper (Alamettälä and Sormunen, 2018), we reported how teachers designed and experienced a teaching intervention informed by guided inquiry design. We have also reported the effect of the intervention on students’ self-efficacy and attitudes (Alamettälä et al., 2019). In the present study, the goal is to show its effect on students’ online research skills. We are also interested in how students’ success in the pre-test, their activity in using information and communication technologies, their self-efficacy beliefs and behavioural intentions in online research, their attitudes towards learning, and sex predict their learning outcomes in the intervention.
We build our study on the constructivist approach to learning (Phillips, 2000). In this theoretical framework, the student is seen as an active participant in the learning process, constructing his or her own knowledge and skills based on prior knowledge and experiences. We also draw on the task-based information interaction framework, in which information interaction is understood and examined as cognitive and behavioural activities throughout the stages of task performance (Järvelin et al., 2015). Like most intervention studies (e.g., Argelagós and Pifarré, 2012; Chen et al., 2014), this study assumes that the effect of teaching on students’ skills can be measured in a pre- and post-test design simulating a realistic online research process.
Previous research
Only a few studies have been published on online research teaching interventions in lower (or upper) secondary education. Online research skills have been approached from two angles: some studies have focused on online research skills themselves and how the intervention changed them (Argelagós and Pifarré, 2012; Baji et al., 2018). In other studies, online research has played only an instrumental role, and learning has been measured as a change in subject content knowledge and problem-solving skills (Chen et al., 2014; Chen et al., 2017).
Both Argelagós and Pifarré (2012) and Baji et al. (2018) carried out quasi-experimental studies with a pre- and post-test control group design. Argelagós and Pifarré worked with 7th and 8th graders for two academic years, and Baji et al. studied 6th graders for six weeks. Argelagós and Pifarré utilised a Web-based learning environment in their intervention, and Baji et al. used the Big6 model (Eisenberg and Berkowitz, 1990) as the framework for their intervention. Both studies showed an intervention effect.
Argelagós and Pifarré (2012) measured students’ skills in defining the problem and searching for information, scanning and processing information, and organizing and presenting information. The experimental students outperformed the controls in defining the problem and searching the Web. As the experimental group searched more effectively, they could devote more time to scanning and processing information, which in turn helped them organize and present information; their task performance scores were higher. The researchers concluded that it is important to support students as they develop their searching skills.
Baji et al. (2018) concluded that the Big6 model, integrated into the curriculum, improved the students’ information literacy skills and helped them gain a deeper understanding of the research process. To measure learning, they used a modified version of the tool for real-time assessment of information literacy skills (TRAILS) for 6th-grade students, which consists of multiple-choice questions.
Chen et al. (2014) also conducted a quasi-experimental pre-post study with an experimental group and a control group. Chen et al. (2017), in contrast, had no control group and instead used students’ academic achievement as a moderating factor. Chen et al. (2014) conducted their three-week study among 7th graders, and Chen et al. (2017) studied elementary school students for six years, following them from grade 1 to grade 6. Both studies applied the Big6 framework in their interventions and found an intervention effect.
Chen et al. (2014) measured students’ memory of the learned subject content, their comprehension of scientific concepts and problem-solving skills. Their results showed that the experimental group outperformed the controls on comprehension and problem-solving tests but not on the subject content (memory) test.
Chen et al. (2017) examined students’ memory and comprehension of subject contents. Regardless of prior academic success, students’ fact memorisation and conceptual understanding of subject content improved. In general, the progress level in comprehension was higher than in memory learning. Low-achieving students progressed most in both memory and comprehension learning compared to the medium- and high-achievers.
There are plenty of studies on how different student-related factors explain differences in higher-level Internet skills, i.e., online research skills. For example, it is suggested that students’ digital skills benefit from active use of information technology at home (Fraillon et al., 2014). There is also empirical evidence that self-efficacy beliefs (Rohatgi et al., 2016) and attitudes towards information technology (Petko et al., 2017) are associated with the students’ skill level. Sex has been related to computer competences, but no consensus has been reached on sex differences in online research skills (Fraillon et al., 2014; Kaarakainen et al., 2018).
We did not find any empirical studies on how the above-mentioned student-related factors predict how students benefit from online research teaching interventions. Aesaert et al. (2017) speculated that slightly overoptimistic self-efficacy judgements could be ideal for effective learning of competences; the authors argue that moderate overestimation motivates students to persist in their efforts. Discussions of how attitudes towards online research, active free-time use of the Internet, and sex are associated with the effect of online research teaching interventions are even rarer in the research literature.
To summarise, only a few longitudinal teaching intervention studies have been conducted on online research skills in secondary education. Some have measured the effect of the intervention with performance tests (e.g., Argelagós and Pifarré, 2012). Most studies do not elaborate on how different student groups are affected by the interventions. One exception is Chen et al. (2017), who used students’ academic achievement as a moderating factor for their learning benefits in an intervention, concluding that low-achieving students displayed the most progress. Three studies (Baji et al., 2018; Chen et al., 2014; 2017) applied a specified pedagogical framework (Big6), but we lack studies on other frameworks. For example, guided inquiry design (Kuhlthau et al., 2015) has been assessed only in studies that lack a rigorous design for measuring learning outcomes (e.g., Scott, 2017; Chu et al., 2008; Chu et al., 2011). Further, we found only two studies that adopted a longitudinal approach to follow students’ progress across school years (Argelagós and Pifarré, 2012; Chen et al., 2017). Finally, we observed that previous research has neglected the viewpoint of an individual teacher who wants to adapt ideas from research-based pedagogical frameworks to everyday professional practice in the school.
Research questions
The aim of this study is to find out how students’ online research skills develop in a teaching intervention informed by guided inquiry design in the 7th grade and how the development of skills is associated with various student-related factors. The research questions are:
- Does the teaching intervention improve students’ online research skills?
- In the teaching intervention on online research skills, are students’ benefits associated with their success in the pre-test of online research skills, information technology- and Internet-related activity, self-efficacy beliefs and behavioural intentions in online research, attitudes towards learning or sex?
Methods
The research was conducted as a pre-post intervention study with a control group. The study was quasi-experimental and used a non-equivalent groups design. The groups were selected based on a convenience sample without randomisation. Quasi-experiments are useful when random allocation is difficult, for example, in educational field interventions (Bryman, 2008, pp. 40-41; Price et al., 2015).
Participants
The study was conducted in an urban school in a medium-sized city in southern Finland. The school serves as a teacher training school and has about 300 students who come from the neighbouring area without entrance examinations.
Data were collected during the 2015–2016 school year, when the students were 7th graders aged 12–14 years. The intervention group comprised three parallel classes with 58 students in total (35 girls, 23 boys). Two classes not exposed to the intervention served as a control group of 36 students (19 girls, 17 boys).
Procedure
The teaching intervention was integrated into two courses, one in autumn 2015 and one in spring 2016. A Finnish language teacher designed and implemented the intervention, and two history teachers were involved in the second course. All were experienced teachers who also supervised student teachers. The teachers of the control group were not involved in the study.
Guided inquiry design was introduced to the teachers, but they were free to decide how to apply it in their pedagogical practice within the requirements of the curriculum. As a result, the teachers incorporated selected features of the framework into their school practices. They aimed to follow two instructional principles of the framework: to emphasise the first stages of the inquiry process (open, immerse, explore, identify) and to let students choose their own topics of interest. To keep track of information sources, inquiry logs were introduced to the students. In addition, the students worked in groups (shared knowledge building), and the teachers also worked as a team.
Course 1 was part of the Finnish language curriculum, and course 2 was a joint project of Finnish language and history. In course 1 (September 2015), students made a brochure about recommended practices in social media. The theme of course 2 (April–May 2016) was the Finnish Civil War. The students worked on two end-products: a source-based presentation in history and a fictive text in the Finnish language. The main learning goals were that the students learn to search for information on the Internet, evaluate the information and use it appropriately in a given task. The teaching intervention is described in more detail in Alamettälä and Sormunen (2018). The control group followed the school’s regular curriculum.
Before the intervention, the school librarian briefly introduced information searching to all students, including controls. The lesson dealt with the school library and information searching on the Web (e.g., how to use search engines and formulate queries). Then, the teachers continued with another lesson with textbooks to familiarise students further with the basics of information searching (including planning and defining the search, evaluating sources and listing references). Thus, all students were able to learn the basics of information searching, but the intervention group was exposed to extra activities informed by guided inquiry design.
Materials
Evidence of students’ skills in online research can be collected in various ways, for example, by knowledge tests, self-assessments (including self-efficacy scales) and performance tests. The biggest limitation of knowledge tests is that they measure factual knowledge rather than practical skills (Sparks et al., 2016). The problem with self-assessments is that students easily underestimate or overestimate their skills (Bussert and Pouliot, 2010, pp. 136–137). Authentic tests or exercises have been shown to be the most effective way to document actual applied skills (Schilling and Applegate, 2012). Integrated performance tests such as ORCA (online reading comprehension assessment) (Kennedy et al., 2016) and NEURONE (oNlinE inqUiRy experimentatiON systEm) (Sormunen et al., 2017) seek to expose the participants to the challenges of an online research process.
The test used in this study applied the ideas of integrated performance tests. Pre- and post-tests covered four competence dimensions: 1) search planning and query formulation skills, 2) search performance skills, 3) critical evaluation skills, and 4) argumentation skills. The pre- and post-tests had the same form but different themes in order to prevent memorisation. In the first test, the students were asked to find an answer to the following question: “Can shopkeepers refuse to sell energy drinks to schoolchildren?” In the second test, the question was: “In which school subjects might computer gaming have positive effects?” The students performed the test assignment online but wrote the answers on paper. Neither task was a simple fact-finding task; both required information searching and interpretation. However, they were formulated so that it was possible to find straightforward, justified answers.
Before seeking information, the students were asked to think up various search terms. Next, they were allowed to use laptops and search with the help of online search engines. Students were required to list the search terms they used, name two of the best sources and justify their choices. At the end, they were asked to give a well-justified answer to the question. The search plans, queries, sources and their justifications and the answers were assessed and scored.
The basic data on students’ backgrounds, including their computer and Internet use, attitudes towards learning, and behavioural intentions and self-efficacy beliefs in online research, were surveyed with a questionnaire reported in a previous publication (see Alamettälä et al., 2019). Students’ information technology activity was measured in three dimensions: school-related technology activity (two items), free time information-seeking activity (two items) and social media activity (two items). Two sets of items measured attitudes towards independent online learning (four items) and traditional teacher-centred learning (four items). The third attitude component measured behavioural intentions (the intent to act a certain way with regard to the attitude object) in online research, including searching (seven items), evaluation (five items) and writing (four items). Self-efficacy beliefs targeted information searching (three items) and writing (three items).
Data collection
The tests were carried out before the first intervention course and after the second. Of the 94 students in total, 87 completed both tests. Each test took about 30 minutes. The students were tested in their regular classrooms during their Finnish language lessons. The questionnaire on self-efficacy and attitudes was administered twice: before and after the intervention. The background information was collected between the first and second intervention courses.
Scoring
The scoring of search plans aimed to assess students’ abilities to identify the core and auxiliary concepts of the search topic and find appropriate search terms to represent those concepts. Any string-level word form was accepted (various inflectional and derivational forms). The student could earn 0–6 points by
- identifying the core concept of the topic and presenting it in meaningful search terms (0–2 p)
- identifying auxiliary search concepts and presenting them in meaningful search terms (0–2 p)
- suggesting optional search plans (0–2 p)
The quality of queries used in searches was assessed similarly to search plans. The student could earn 0–4 points by
- formulating a query in which the core concept of the topic is represented with a meaningful search term (0–2 p)
- applying auxiliary search concepts by representing them with meaningful search terms (0–2 p)
The meaningfulness of search terms to represent a search concept was estimated using a three-category classification: strong (1 point), weak (0.5 points) and off-topic (0 points). Altogether, one could earn a maximum of 10 points in search planning (scaled by a factor of 0.2 in the overall test score).
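For illustration, the scoring arithmetic for search plans and queries can be sketched in Python as below. The function names, the input format and the capping of term sums at each sub-maximum are our assumptions for the sketch, not the scoring procedure’s actual implementation.

```python
# Illustrative sketch of the search planning and query scoring arithmetic.
# Helper names and the capping-by-min convention are assumptions.

TERM_POINTS = {"strong": 1.0, "weak": 0.5, "off-topic": 0.0}

def planning_score(core_terms, auxiliary_terms, optional_plan_points):
    """Search plan: core concept (0-2 p), auxiliary concepts (0-2 p),
    optional search plans (0-2 p); maximum 6 points."""
    core = min(sum(TERM_POINTS[t] for t in core_terms), 2.0)
    aux = min(sum(TERM_POINTS[t] for t in auxiliary_terms), 2.0)
    return core + aux + min(optional_plan_points, 2.0)

def query_score(core_terms, auxiliary_terms):
    """Queries: core concept (0-2 p) plus auxiliary concepts (0-2 p)."""
    return (min(sum(TERM_POINTS[t] for t in core_terms), 2.0)
            + min(sum(TERM_POINTS[t] for t in auxiliary_terms), 2.0))

# Two strong core terms, one weak auxiliary term, one optional plan: 3.5 p
print(planning_score(["strong", "strong"], ["weak"], 1))
```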
The first author assessed the two sources selected by each student based on their relevance to task completion. A source was relevant if it correctly answered the question in the test task. The student earned one point for each relevant source, so the maximum score for search performance was 2 points.
The students were also asked to justify the chosen sources, and the justifications were evaluated. The criteria for evaluation were relevance (factual content) and reliability (Borlund, 2003). The student earned 1 point by justifying the chosen information source based on its relevance and 1 point based on its reliability. The maximum was 2 points regardless of whether the student evaluated one or two sources.
Measuring the use of information was simplified: the evaluation was based on whether the student was able to give the correct answer (max 1 point) and justify it (max 1 point). Relevance was assessed in the same way as above (see search performance skills). The student could earn one point by giving the correct answer and another point by presenting a source-based argument to support it (for example, by referring to authorities).
The overall test score was the sum of the component scores, each equalised to a maximum of 2 points. It is noteworthy that, in this kind of test, the search performance component dominates the last two components: without relevant search results, it is difficult to achieve high scores in the evaluation and use of sources.
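As a minimal sketch of the aggregation, assuming the components are simply summed after scaling: search planning and query formulation (max 10 points) is multiplied by 0.2 to the common 2-point range, and the other three components already have 2-point maxima, giving an overall maximum of 8 points.

```python
def overall_score(planning_queries, searching, evaluation, argumentative_use):
    """Sum of component scores, each equalised to a 2-point maximum."""
    return 0.2 * planning_queries + searching + evaluation + argumentative_use

print(overall_score(10, 2, 2, 2))  # maximum overall score -> 8.0
```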
Data analysis
SPSS version 25 was used for statistical analyses. A mixed between-within subjects ANOVA (analysis of variance) was conducted to investigate the impact of the teaching intervention on students’ online research skills. A mixed between-within subjects ANOVA is an extension of the repeated measures ANOVA and can be used in a study with two independent variables: one is a between-subjects variable (e.g., group: intervention or control) and the other a within-subjects variable (e.g., time: pre or post). (Pallant, 2013, pp. 284–292)
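For readers working outside SPSS, a minimal sketch of the same analysis in Python is given below, using the pingouin package’s mixed_anova function; the file and column names are hypothetical.

```python
# A minimal mixed between-within ANOVA sketch in Python (pingouin) as an
# alternative to SPSS; the file and column names are hypothetical.
import pandas as pd
import pingouin as pg

# Long format: one row per student per measurement occasion.
df = pd.read_csv("test_scores.csv")  # columns: student, group, time, score

aov = pg.mixed_anova(data=df, dv="score", within="time",
                     subject="student", between="group")
print(aov)  # the Interaction row (group x time) tests the intervention effect
```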
Prior to the analysis, assumptions of normality were tested using the Kolmogorov-Smirnov test and the Shapiro-Wilk test (Field, 2009, pp. 145–148). Data in both groups were normally distributed (p > 0.05). There were no outliers, as assessed by box plots. The homogeneity of variances and covariances was verified by Levene’s test of homogeneity of variances (p > 0.05) and Box’s M test (p > 0.05), respectively (Field, 2009, pp. 150–152, 604).
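The following sketch shows comparable assumption checks in Python, continuing the hypothetical data frame df from the previous sketch; SciPy provides Shapiro-Wilk, Kolmogorov-Smirnov and Levene tests, and pingouin offers Box’s M.

```python
# Hedged sketch of the assumption checks; continues the hypothetical df above.
from scipy import stats
import pingouin as pg

post_interv = df.query("group == 'intervention' and time == 'post'")["score"]
post_ctrl = df.query("group == 'control' and time == 'post'")["score"]

print(stats.shapiro(post_interv))  # Shapiro-Wilk normality test
print(stats.kstest(post_interv, "norm",
                   args=(post_interv.mean(), post_interv.std())))  # K-S test
print(stats.levene(post_interv, post_ctrl))  # homogeneity of variances

# Box's M test of homogeneity of covariance matrices on wide-format data
wide = df.pivot_table(index=["student", "group"], columns="time",
                      values="score").reset_index()
print(pg.box_m(wide, dvs=["pre", "post"], group="group"))
```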
Students in the intervention group were divided into high and low subgroups based on their pre-test scores, information technology activity, self-efficacy, attitudes and behavioural intentions to examine whether these factors are related to the measured intervention benefits. The mean value of each variable was used as the cut-off point, and the relationships were analysed one variable at a time. The effect of sex was also analysed.
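A sketch of the subgroup analysis, assuming a hypothetical student-level data frame (one row per intervention-group student) with pre- and post-test scores; self_efficacy stands in for any of the background variables.

```python
# Illustrative mean-split into high/low subgroups followed by the t-tests
# reported in Table 2; the data frame and column names are assumptions.
import numpy as np
from scipy import stats

cut = students["self_efficacy"].mean()
students["subgroup"] = np.where(students["self_efficacy"] >= cut, "high", "low")

high = students[students["subgroup"] == "high"]
low = students[students["subgroup"] == "low"]

# Between-subgroup comparisons (column sets A and B in Table 2)
print(stats.ttest_ind(high["pre_score"], low["pre_score"]))
print(stats.ttest_ind(high["post_score"], low["post_score"]))
# Within-subgroup improvement from pre- to post-test (column sets C and D)
print(stats.ttest_rel(high["post_score"], high["pre_score"]))
print(stats.ttest_rel(low["post_score"], low["pre_score"]))
```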
Results
Does the teaching intervention improve students’ online research skills?
The mean overall test score rose from 3.93 to 4.73 in the intervention group while staying around 3.7 in the control group, F(1, 85) = 6.43, p = 0.013, partial η² = 0.070. The results of the mixed between-within subjects ANOVA (Table 1) show that the intervention had a measurable effect on students’ learning. The effect size was medium (partial eta-squared thresholds: small = 0.01; medium = 0.06; large = 0.14; see Pallant, 2013, pp. 217–218).
Examining one test component at a time reveals that the intervention effect was most remarkable in search planning and query formulation skills (p = 0.023, partial η² = 0.059). The effect on the argumentative use of Web information approached significance (p = 0.087). However, the scores improved in both groups, suggesting that the change reflects learning external to the intervention (for example, an effect of the school’s regular curriculum, or a post-test topic for which valid arguments were easier to build). In Web searching and critical evaluation, the difference between pre- and post-tests indicated a weak intervention effect, but the results were not statistically significant. It is worth noting that Web searching scores dropped by 0.33 points in the control group and by 0.08 points in the intervention group, suggesting that relevant sources were more difficult to find for the post-test topic than for the pre-test topic. No change was observed in the critical evaluation scores, indicating that the intervention was ineffective in this sub-task.
Table 1. Pre- and post-test scores by component skill, with within-group t-tests and the repeated ANOVA effect.

| Component skill | Intervention pre-test M (SD) | Intervention post-test M (SD) | t | p | Control pre-test M (SD) | Control post-test M (SD) | t | p | F | p | ηp² |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Search planning and query formulation | 1.37 (0.2) | 1.46 (0.3) | -2.35 | 0.022* | 1.44 (0.2) | 1.36 (0.4) | 1.14 | 0.264 | 5.36 | 0.023* | 0.059 |
| Web searching | 1.02 (0.5) | 0.94 (0.7) | 0.74 | 0.463 | 0.89 (0.4) | 0.56 (0.7) | 2.32 | 0.027* | 1.86 | 0.177 | 0.021 |
| Critical evaluation | 0.80 (0.4) | 0.92 (0.5) | -1.66 | 0.102 | 0.73 (0.5) | 0.6 (0.4) | 0.50 | 0.620 | 1.98 | 0.163 | 0.023 |
| Argumentative use of Web information | 0.74 (0.6) | 1.42 (0.6) | -7.56 | 0.000* | 0.67 (0.6) | 1.09 (0.6) | -3.48 | 0.002* | 3.00 | 0.087 | 0.034 |
| Overall score | 3.93 (1.1) | 4.73 (1.4) | -4.20 | 0.000* | 3.74 (1.1) | 3.71 (1.5) | 0.11 | 0.915 | 6.43 | 0.013* | 0.070 |

Intervention group n=55; control group n=32. * statistically significant, p < 0.05.
What kind of student-related factors explain students’ learning benefits?
Table 2 presents a comparison of intervention benefits achieved in subgroups formed on the basis of students’ pre-test scores, information technology activity, self-efficacy and behavioural intentions in online research, attitudes towards learning and sex. Column set A presents the comparison of online research scores between the high/low and male/female subgroups in the pre-test, and column set B the corresponding comparison in the post-test. Column sets C and D present the measured improvement of scores from the pre-test to the post-test within high/low and male/female subgroups.
Table 2. Comparison of intervention benefits between and within student subgroups (A, B: between-subgroup comparisons in the pre- and post-test; C, D: within-subgroup improvement from pre- to post-test).

| Subgroup variable | A: high M (SD) | A: low M (SD) | A: t | A: p | B: high M (SD) | B: low M (SD) | B: t | B: p | C: high improvement M (SD) | C: t | C: p | D: low improvement M (SD) | D: t | D: p |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Pre-test score (n_high=26, n_low=29) | 4.9 (0.6) | 3.1 (0.8) | 10.17 | 0.000* | 5.5 (1.1) | 4.1 (1.4) | 4.07 | 0.000* | 0.6 (1.2) | 2.29 | 0.031* | 1.0 (1.6) | 3.56 | 0.001* |
| School-related information technology activity (n_high=32, n_low=23) | 4.3 (0.1) | 3.4 (1.2) | 3.06 | 0.004* | 4.9 (1.4) | 4.5 (1.5) | 0.94 | 0.352 | 0.6 (1.2) | 2.67 | 0.012* | 1.1 (1.6) | 3.28 | 0.003* |
| Free time-related information-seeking activity (n_high=33, n_low=22) | 3.9 (1.2) | 3.9 (1.1) | 0.13 | 0.898 | 4.7 (1.5) | 4.8 (1.3) | -0.43 | 0.667 | 0.8 (1.5) | 2.72 | 0.010* | 0.9 (1.3) | 3.39 | 0.003* |
| Social media activity (n_high=23, n_low=32) | 4.4 (1.2) | 3.6 (1.0) | 2.56 | 0.013* | 4.8 (1.5) | 4.7 (1.4) | 0.17 | 0.864 | 0.4 (1.1) | 1.68 | 0.107 | 1.1 (1.5) | 4.01 | 0.000* |
| Self-efficacy (n_high=27, n_low=26) | 4.4 (0.9) | 3.5 (1.1) | 3.16 | 0.003* | 5.0 (1.3) | 4.5 (1.5) | 1.3 | 0.199 | 0.6 (1.2) | 2.55 | 0.017* | 1.0 (1.5) | 3.28 | 0.003* |
| Attitude towards online learning (n_high=27, n_low=26) | 4.1 (1.0) | 3.8 (1.2) | 0.93 | 0.358 | 4.7 (1.3) | 4.8 (1.9) | -0.16 | 0.871 | 0.7 (1.3) | 2.38 | 0.025* | 1.0 (1.4) | 3.46 | 0.002* |
| Attitude towards traditional learning (n_high=28, n_low=25) | 4.1 (1.2) | 3.9 (0.9) | 0.52 | 0.605 | 5.2 (1.5) | 4.3 (1.2) | 2.14 | 0.037* | 1.1 (1.5) | 1.21 | 0.001* | 0.4 (1.1) | 1.93 | 0.066 |
| Behavioural intentions (n_high=27, n_low=26) | 4.2 (0.8) | 3.8 (1.3) | 1.14 | 0.256 | 5.0 (1.4) | 4.5 (1.4) | 1.38 | 0.173 | 0.8 (1.3) | 3.61 | 0.001* | 0.7 (1.5) | 2.34 | 0.028* |
| Sex (n_female=34, n_male=21) | 4.2 (1.1) | 3.5 (1.1) | 2.19 | 0.033* | 5.0 (1.5) | 4.3 (1.1) | 1.71 | 0.094 | 0.8 (1.3) | 3.46 | 0.002* | 0.8 (1.6) | 2.38 | 0.027* |

* statistically significant, p < 0.05. In the sex row, the high columns report girls and the low columns boys.
The high subgroup in the pre-test scores scored 1.8 points higher than the low subgroup (4.9 vs. 3.1, p = 0.000). In the post-test, the difference was 1.3 points (5.5 vs. 4.1, p = 0.000). Both groups learned, and the difference between the groups levelled out a bit (improvement 0.6 vs. 1.0). Still, the difference remained clear after the intervention.
The high subgroup in school-related information technology activity scored 0.9 points higher in the pre-test than the low subgroup (4.3 vs. 3.4, p = 0.004). In the post-test, the difference levelled out (4.9 vs. 4.5, p = 0.352). In other words, students actively using the technology for schoolwork at home did better in the pre-test, but the difference faded during the intervention. Both groups learned, but the mean measured improvement was higher for the low subgroup (0.6 vs. 1.1).
The free time information-seeking activity variable was not connected with intervention benefits. No difference was found between the active and passive free time information seekers in the pre-test (3.9 vs. 3.9) or in the post-test (4.7 vs. 4.8). Students’ skills developed equally in both subgroups.
The high subgroup in social media activity scored 0.8 points higher in the pre-test than the low subgroup (4.4 vs. 3.6, p = 0.013). In the post-test, the difference levelled out (4.8 vs. 4.7, p = 0.864). Thus, the results provided evidence that students who are less active on social media improved their online research skills (difference 1.1, p = 0.000), but those who were more active on social media made hardly any progress (difference 0.4, p = 0.107).
The high subgroup in self-efficacy in online research scored 0.9 points higher in the pre-test than the low subgroup (4.4 vs. 3.5, p = 0.003). In the post-test, the difference levelled out (5.0 vs. 4.5, p = 0.199). Both groups learned, but the low subgroup seemed to benefit more from the intervention.
Surprisingly, attitude towards online learning did not predict an intervention effect. The high and low subgroups were approximately at the same level in the pre-test (4.1 vs. 3.8, p = 0.358) and in the post-test (4.7 vs. 4.8, p = 0.871). Contrary to expectations, attitude towards traditional teacher-centred learning seemed a more likely predictor of effective learning. The high and low subgroups performed similarly in the pre-test (4.1 vs. 3.9, p = 0.605), but in the post-test, the high subgroup scored 0.9 points higher than the low subgroup (5.2 vs. 4.3, p = 0.037). The improvement in test results was evident in the high subgroup (1.1, p = 0.001) but remained inconclusive in the low subgroup (0.4, p = 0.066).
No difference in learning was observed between the subgroups based on behavioural intentions in online research: the subgroups did not differ in the pre-test (4.2 vs. 3.8, p = 0.256) or in the post-test (5.0 vs. 4.5, p = 0.173), and both improved their test scores about equally (0.8, p = 0.001 vs. 0.7, p = 0.028).
Regarding sex, girls outperformed boys in the pre-test, but in the post-test, the difference was not conclusive. Girls scored 0.7 points higher than boys in the pre-test (4.2 vs. 3.5, p = 0.033). The difference remained the same in the post-test, but it did not reach statistical significance (5.0 vs. 4.3, p = 0.094). The lack of statistical significance may be due to the relatively low number of boys in the sample (n = 21) and the higher standard deviation in the girls’ post-test scores (SD = 1.53).
Discussion
The aim of our case study was to investigate the effect of a long-term inquiry-based information literacy intervention designed and implemented by a teacher and her colleagues in a lower secondary school. The teachers were informed of the guided inquiry design framework (Kuhlthau et al., 2015) and applied some aspects of it in their pedagogical practice (see Alamettälä and Sormunen, 2018). We also examined how student-related factors such as pre-test success (online research skills before the intervention), information technology-related and Internet-related activity, self-efficacy and behavioural intentions in online research, attitudes towards learning and sex explain the benefits gained by students from the intervention.
Our first finding was that the teaching intervention improved students’ performance in online research. The sum of the test scores developed more in the intervention group than in the control group. The results are in line with the findings of Argelagós and Pifarré (2012) and Baji et al. (2018). In all studies, the intervention group outperformed the control group in the post-test. A closer examination of one component skill at a time showed that the intervention effect was most powerful in search planning and query formulation. The substantial progress in this sub-task can be explained by factors related to the pedagogical design of the intervention (Alamettälä and Sormunen, 2018):
- All students, including controls, participated in the information searching unit arranged by the school librarian and the teacher at the beginning of the first semester, just after the pre-test. All students were expected to learn the basics of information searching. However, the practical skills seemed to develop and be sustained only in the intervention group, which practised these skills during the intervention.
- The teachers paid attention to the first phases of the inquiry process, as recommended in the guided inquiry framework. The students were evidently supported in conceptualising the topic of the task at hand and formulating their personal focus on it before the searches were conducted, that is, before the technical part. Past research has shown that practising teachers tend to teach searching as a technical skill (Limberg et al., 2008). The application of the guided inquiry framework helped to overcome this limitation in professional practice.
Similarly, Argelagós and Pifarré (2012) found that the students in the intervention group outperformed the controls in the component skill defining the problem. Their study design supported activating prior knowledge and specifying the information needed.
We were not able to demonstrate an intervention effect in the other sub-tasks. Web searching scores (based on the relevance of the chosen documents) dropped in the control group and did not improve in the intervention group. This suggests that the relevant sources were more difficult to find in the post-test topic than in the pre-test topic. However, the difference in search task difficulty does not explain the failure to demonstrate the intervention effect. A likely explanation is that searching on the Web is always a trial-and-error process through which various details in the implementation of a search plan (e.g., choice of search words, writing mistakes) may lead the searcher to fail or succeed. Especially if there are only a few relevant sources available, the test measure suffers from low stability (see Soboroff, 2004). The additional random error component in a relatively small dataset makes it difficult to achieve statistically significant results.
No intervention effect was observed in the critical evaluation of sources (justifications for the relevance and reliability of sources used). This is a disappointing result, since the team of teachers in the second unit emphasised the importance of critical evaluation skills as a learning goal. Teachers introduced the evaluation criteria and a sample of high-quality sources to the students. However, no organized activities or guidance were included in the intervention to practise these skills, and some students ignored the sources offered by the teachers and used low-quality Web sites as their sources (Alamettälä and Sormunen, 2018). In the guided inquiry framework, the teacher team (including the school librarian) pays a lot of attention to the critical evaluation of sources and actively guides students in locating and evaluating information sources. In this intervention, the teachers did not take such an active role, and the librarian was not involved at all. Nor did they look at evaluation from the angle of search engine infrastructures (cf. Haider and Sundin, 2019, pp. 107-110).
The last sub-task measured how skilful students were at finding an answer to the assignment and presenting source-based arguments to support it. Here, we found only a weak indication that the intervention developed the skills of the argumentative use of Web sources. As reported in Alamettälä and Sormunen (2018), we observed that the teachers paid at least indirect attention to the use of sources. They instructed the students to use inquiry logs, and they reported that students who used the log seemed to synthesise information better across various sources than those who had not used it. However, we did not find active, teacher-guided activities to improve students’ skills in the argumentative use of sources. For example, Argelagós and Pifarré (2012) argued that scaffolds related to scanning and processing information helped students in their study to construct knowledge in a more efficient way. In their study, the intervention improved students’ task performance, i.e., to present a justified answer to a question.
We also wanted to find out how several student-related factors were connected to the intervention effects. The pre-test of online research skills indicated that various student-related variables are associated with the levels of online research performance. It is important that novel pedagogical approaches do not increase the gaps between high- and low-performing students. We followed the example of Chen et al. (2017), who found that the teaching intervention helped low-achieving students (in terms of overall academic achievement) improve their online skills more than their medium- and high-achieving peers.
The pre-test revealed four factors related to skill differences in online research: using information technology for schoolwork at home, using social media, self-efficacy in online research and sex. For each factor, the post-test results indicated that the intervention decreased the gap between the subgroups in online research skills. This finding suggests that those who were less active Web searchers or social media users, or those who were not convinced of their Web skills, got a boost to learn new Internet skills. The evidence for gap reduction regarding sex was weakest: girls clearly outperformed boys in the pre-test, and in the post-test, the measured gap was still a borderline case in terms of statistical significance.
Free time information-seeking activity, behavioural intentions in online research, attitude towards online learning and attitude towards traditional teacher-centred learning did not differentiate students in the pre-test. The high and low groups for the first three variables performed equally well in the post-test. Attitude towards traditional learning, however, emerged as a variable that influenced learning outcomes: the high group, students preferring traditional teacher- and textbook-centred pedagogies, outperformed the low group in the post-test. This finding is rather puzzling, especially because, at the same time, the high group for students preferring independent learning with the help of the Internet did not improve their online research skills more than the low group.
The relationship between the intervention effect and a positive attitude towards traditional teacher-centred learning may indicate that our attitude scale also measures students’ engagement and motivation to learn in school. Their basic orientation appears not to have changed even though the learning object was now online research skills, and their motivation to learn might have been greater than that of students who preferred working online. 7th-grade students do not have much experience with online research, and their conceptions of online learning might reflect a hope of easy learning on the Internet.
Conclusions
This study contributes to the pedagogy of online research skills. We lack intervention studies on teachers’ professional practices in teaching online research skills. The results revealed that even individual teachers may draw inspiration and ideas from research-based pedagogies such as the guided inquiry framework. This kind of pioneering work helps to improve teaching practices in schools. However, as most of the pedagogical models emphasise, the renewal of pedagogical practices is a school-wide process (e.g., Kuhlthau et al., 2015). In an ideal situation, the whole school would be involved, including the school library.
The study demonstrated the importance of organized activities. The intervention was effective, but the effect was limited to one sub-task of online research: search planning and query formulation. We expect that balanced progress in all sub-skills could be achieved by operationalising learning goals for each sub-task more concretely. Furthermore, the learning of complex skills requires that students encounter each sub-task several times in different contexts during their school years. Obviously, the opportunities of an individual teacher or teacher team are limited in this respect.
The case intervention tended to decrease the gaps in students’ online research skills. This might not happen in all teaching interventions. It might be that the guided inquiry framework helps students of lower-than-average performance to narrow some of the skill gaps. However, those with a positive attitude towards traditional teacher-centred learning seemed to increase their advantage over their less-engaged peers. Motivating all students in the learning process is an overall challenge in schools and needs further research in the context of online research.
We encourage teachers to collaborate and try out new pedagogical models such as guided inquiry, paying attention to every phase of the online research process. Researchers should report these experiments to build more research-based knowledge of online research instruction and its effects on students. Future research could be conducted among students from different age groups and could also take students’ overall academic achievement into consideration.
This study shares the limitations of quasi-experiments. Intervention and control groups may not be comparable, and prior differences between the groups may affect the outcome of the study (Bryman, 2008, pp. 40-41). Our students came from the same neighbourhood and were not exposed to any entry exams or ability-based selection procedures. Therefore, we assume the starting-level distributions were similar in both groups. In addition, the pre-test scores showed that students’ online research skills were at the same level in the beginning. However, other factors that are impossible to control might have influenced students’ development during and between the intervention units. On the other hand, the ecological validity of our quasi-experiment is expected to be high because it was conducted in a natural environment (cf. Bryman, 2008, pp. 40-41).
Acknowledgements
This research was funded by the Faculty of Information Technology and Communication Sciences at Tampere University, Finland. We are grateful to the students and teachers participating in the study.
About the authors
Tuulikki Alamettälä is a PhD Candidate at the Faculty of Information Technology and Communication Sciences, 33014 Tampere University, Finland. She can be reached at tuulikki.alamettala@tuni.fi
Eero Sormunen is Professor Emeritus at the Faculty of Information Technology and Communication Sciences, 33014 Tampere University, Finland. He received his Ph.D. from the same university and his research interests are in information literacy, online research and learning. He can be contacted at eero.sormunen@tuni.fi.
References
- Aesaert, K., Voogt, J., Kuiper, E. & van Braak, J. (2017). Accuracy and bias of ICT self-efficacy: an empirical study into students’ over- and underestimation of their ICT competences. Computers in Human Behavior, 75, 92-102. https://doi.org/10.1016/j.chb.2017.05.010
- Alamettälä T. & Sormunen E. (2018). Lower secondary school teachers’ experiences of developing inquiry-based approaches in information literacy instruction. In S. Kurbanoğlu, J. Boustany, S. Špiranec, E. Grassian, D. Mizrachi, & L. Roy, (Eds.), Information Literacy in the Workplace. ECIL 2017, Saint Malo, France, September 18-21, 2017 (pp. 683-692). Springer. (Communications in Computer and Information Science, 810). https://doi.org/10.1007/978-3-319-74334-9_70
- Alamettälä T., Sormunen E. & Hossain, M. A. (2019). How does information literacy instruction in secondary education affect students’ self-efficacy beliefs and attitudes? In S. Kurbanoğlu, S. Špiranec, Y. Ünal, J. Boustany, M. L. Huotari, E. Grassian, D. Mizrachi, & L. Roy (Eds.), Information Literacy in Everyday Life. ECIL 2018, Oulu, Finland, September 24–27, 2018 (pp. 443-453) Springer. (Communications in Computer and Information Science, 989). https://doi.org/10.1007/978-3-030-13472-3_42
- Alexandersson, M. & Limberg, L. (2012). Changing conditions for information use and learning in Swedish schools: a synthesis of research. Human IT, 11(2), 131-154. https://humanit.hb.se/article/download/70/52 (Archived by the Internet Archive at https://bit.ly/2zfI7sV)
- Argelagós, E. & Pifarré, M. (2012). Improving information problem solving skills in secondary education through embedded instruction. Computers in Human Behavior, 28(2), 515-526. https://doi.org/10.1016/j.chb.2011.10.024
- Baji, F., Bigdeli, Z., Parsa, A. & Haeusler, C. (2018). Developing information literacy skills of the 6th grade students using the Big 6 model. Malaysian Journal of Library & Information Science, 23(1), 1-15. https://doi.org/10.22452/mjlis.vol23no1.1
- Borlund, P. (2003). The concept of relevance in IR. Journal of the American Society for Information Science and Technology, 54(10), 913-925. https://doi.org/10.1002/asi.10286
- Brand-Gruwel, S., Wopereis, I. & Vermetten, Y. (2005). Information problem solving by experts and novices: Analysis of a complex cognitive skill. Computers in Human Behavior, 21(3), 487-508. https://doi.org/10.1016/j.chb.2004.10.005
- Bråten, I., Strømsø, H. I. & Salmerón, L. (2011). Trust and mistrust when students read multiple information sources about climate change. Learning and Instruction, 21(2), 180-192. https://doi.org/10.1016/j.learninstruc.2010.02.002
- Bryman, A. (2008). Social research methods (3rd. ed.). Oxford University Press.
- Bussert, L. & Pouliot, N. (2010). A model for information literacy self-assessment: enhancing student learning in writing courses through collaborative teaching. In T. P. Mackey, & T. E. Jacobson (Eds.), Collaborative information literacy assessments: strategies for evaluating teaching and learning (pp. 131-149). Neal-Schuman.
- Chen, L. C., Chen, Y. H. & Ma, W. I. (2014). Effects of integrated information literacy on science learning and problem-solving among seventh-grade students. Malaysian Journal of Library & Information Science, 19(2), 35-51. https://mjlis.um.edu.my/article/view/1788 (Archived by the Internet Archive at https://bit.ly/2Xp0Vxz)
- Chen, L. C., Huang, T. & Chen, Y. (2017). The effects of inquiry-based information literacy instruction on memory and comprehension: a longitudinal study. Library & Information Science Research, 39(4), 256-266.
- Chu, S. K. W., Chow, K., Tse, S. & Kuhlthau, C. C. (2008). Grade 4 students’ development of research skills through inquiry-based learning projects. School Libraries Worldwide, 14(1), 10-37. https://www.iasl-online.org/Resources/Documents/slw/v14/14_1chu.pdf (Archived by the Internet Archive at https://bit.ly/3cYaCK8)
- Chu, S. K. W., Tse, S. K. & Chow, K. (2011). Using collaborative teaching and inquiry project-based learning to help primary school students develop information literacy and information skills. Library & Information Science Research, 33(2), 132-143. https://doi.org/10.1016/j.lisr.2010.07.017
- Coiro, J., Coscarelli, C., Maykel, C. & Forzani, E. (2015). Investigating criteria that seventh graders use to evaluate the quality of online information. Journal of Adolescent & Adult Literacy, 59(3), 287-297. https://doi.org/10.1002/jaal.448
- Colwell, J., Hunt-Barron, S. & Reinking, D. (2013). Obstacles to developing digital literacy on the internet in middle school science instruction. Journal of Literacy Research, 45(3), 295-324.
- Duke, T. S. & Ward, J. D. (2009). Preparing information literate teachers: a metasynthesis. Library & Information Science Research, 31(4), 247-256. https://doi.org/10.1016/j.lisr.2009.04.003
- Eisenberg, M. B. & Berkowitz, R. E. (1990). Information problem-solving: The big six skills approach to library and information skills instruction. Ablex.
- Field, A. P. (2009). Discovering statistics with SPSS (3rd ed.). Sage Publications.
- Fraillon, J., Ainley, J., Schulz, W., Friedman, T. & Gebhardt, E. (2014). Preparing for life in a digital age: the IEA International Computer and Information Literacy Study international report. Springer.
- Haider, J. & Sundin, O. (2019). Invisible search and online search engines: the ubiquity of search in everyday life. Routledge.
- Järvelin, K., Vakkari, P., Arvola, P., Baskaya, F., Järvelin, A., Kekäläinen, J., Keskustalo, H., Kumpulainen, S.W., Saastamoinen, S., Savolainen, R., & Sormunen, E. (2015). Task-based information interaction evaluation: the viewpoint of program theory. ACM Transactions on Information Systems, 33(1), 1-30. https://doi.org/10.1145/2699660
- Kaarakainen, M.-T., Saikkonen, L. & Savela, J. (2018). Information skills of Finnish basic and secondary education students: the role of age, gender, education level, self-efficacy and technology usage. Nordic Journal of Digital Literacy, 13(4), 56-72. https://www.idunn.no/dk/2018/04/information_skills_of_finnish_basic_andsecondary_education (Archived by the Internet Archive at https://bit.ly/2XgN8ZS)
- Kennedy, C., Rhoads, C. & Leu, D. J. (2016). Online research and learning in science: a one-to-one laptop comparison in two states using performance based assessments. Computers & Education, 100, 141-161. https://doi.org/10.1016/j.compedu.2016.05.003
- Kiili, C. & Leu, D. J. (2019). Exploring the collaborative synthesis of information during online reading. Computers in Human Behavior, 95, 146-157. https://doi.org/10.1016/j.chb.2019.01.033
- Kuhlthau, C. C., Maniotes, L. K. & Caspari, A. K. (2015). Guided inquiry. Learning in the 21st century. (2nd. ed.). Libraries Unlimited.
- Ladbrook, J. & Probert, E. (2011). Information skills and critical literacy: where are our digikids at with online searching and are their teachers helping? Australasian Journal of Educational Technology, 27(1), 105-121. https://doi.org/10.14742/ajet.986
- Leu, D. J., Forzani, E., Rhoads, C., Maykel, C., Kennedy, C. & Timbrell, N. (2015). The new literacies of online research and comprehension: rethinking the reading achievement gap. Reading Research Quarterly, 50(1), 37-59. https://doi.org/10.1002/rrq.85
- Limberg, L., Alexandersson, M., Lantz-Andersson, A. & Folkesson, L. (2008). What matters? Shaping meaningful learning through teaching information literacy. Libri, 58(2), 82-91. https://doi.org/10.1515/libr.2008.010
- Lundh, A. (2011). Doing research in primary school: information activities in project-based learning. [Doctoral dissertation, University of Borås]. Valfrid. (Skrifter från Valfrid, no. 47). http://www.diva-portal.org/smash/get/diva2:876983/FULLTEXT03
- Macedo-Rouet, M., Braasch, J. L. G., Britt, M. A. & Rouet, J. (2013). Teaching fourth and fifth graders to evaluate information sources during text comprehension. Cognition and Instruction, 31(2), 204-226. https://doi.org/10.1080/07370008.2013.769995
- Miller, C. (2016). TRAILS: tool for real-time assessment of information literacy skills. The Charleston Advisor, 17(3), 43-48. https://doi.org/10.5260/chara.17.3.43
- Pallant, J. (2013). SPSS survival manual. A step by step guide to data analysis using IBM SPSS. (5th. ed.). Open University Press.
- Petko, D., Cantieni, A. & Prasse, D. (2017). Perceived quality of educational technology matters: a secondary analysis of students’ ICT use, ICT-related attitudes, and PISA 2012 test scores. Journal of Educational Computing Research, 54(8), 1070-1091. https://doi.org/10.1177/0735633116649373
- Phillips, D. (Ed.). (2000). Constructivism in education. University of Chicago Press.
- Price, P. C., Jhangiani, R. S. & Chiang, I-C. A. (2015). Research methods in psychology (2nd. ed.). OpenEd. https://opentextbc.ca/researchmethods/
- Putman, S. M. (2014). Exploring dispositions toward online reading: analyzing the survey of online reading attitudes and behaviors. Reading Psychology, 35(1), 1-31. https://doi.org/10.1080/02702711.2012.664250
- Rohatgi, A., Scherer, R. & Hatlevik, O. E. (2016). The role of ICT self-efficacy for students’ ICT use and their achievement in a computer and information literacy test. Computers & Education, 102, 103-116. https://doi.org/10.1016/j.compedu.2016.08.001
- Rouet, J. F. & Britt, M. A. (2011). Relevance processes in multiple document comprehension. In M. T. McCrudden, J. P. Magliano, & G. Schraw (Eds.), Text relevance and learning from text (pp. 19-52). Information Age.
- Schilling, K. & Applegate, R. (2012). Best methods for evaluating educational impact: a comparison of the efficacy of commonly used measures of library instruction. Journal of the Medical Library Association, 100(4), 258-269. https://doi.org/10.3163/1536-5050.100.4.007
- Scott, R. (2017). Assessing the impact of a guided inquiry unit on year 5 pupils’ information literacy: a student case study. Journal of Information Literacy, 11(1), 220-226. https://doi.org/10.11645/11.1.2211
- Soboroff, I. (2004). On evaluating web search with very few relevant documents. In SIGIR ’04 Proceedings of the 27th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 530-531). ACM. https://doi.org/10.1145/1008992.1009105
- Sormunen, E. & Alamettälä, T. (2014). Guiding students in collaborative writing of Wikipedia articles – how to get beyond the black box practice in information literacy instruction. In Viteli, J. & Leikomaa, M. (Eds.), Proceedings of EdMedia 2014 – World Conference on Educational Multimedia, Hypermedia and Telecommunications, Jun 23, 2014, Tampere, Finland (pp. 2122-2130). Association for the Advancement of Computing in Education.
- Sormunen, E., González-Ibáñez, R., Kiili, C., Leppänen, P. T., Mikkilä-Erdmann, M., Erdmann, N. & Escobar-Macaya, M. (2017). A performance-based test for assessing students’ online inquiry competences in schools. In S. Kurbanoğlu, J. Boustany, S. Špiranec, E. Grassian, D. Mizrachi, & L. Roy (Eds.), Information Literacy in the Workplace. ECIL 2017 (pp. 673-682). Springer. (Communications in Computer and Information Science, 810). https://doi.org/10.1007/978-3-319-74334-9_69
- Sparks, J. R., Katz, I. R. & Beile, P. M. (2016). Assessing digital information literacy in higher education: a review of existing frameworks and assessments with recommendations for next-generation assessment. Educational Testing Service. (Research Report ETS RR–16-32). https://doi.org/10.1002/ets2.12118
- Sundin, O. & Carlsson, H. (2016). Outsourcing trust to the information infrastructure in schools. Journal of Documentation, 72(6), 990-1007. https://doi.org/10.1108/JD-12-2015-0148
- Tanni, M. (2013). Teacher trainees’ information seeking behaviour and their conceptions of information literacy instruction. [Doctoral dissertation, University of Tampere]. Tampere University Press. https://trepo.tuni.fi/handle/10024/68249
- Tu, Y., Shih, M., & Tsai, C. (2008). Eighth graders’ web searching strategies and outcomes: the role of task types, web experiences and epistemological beliefs. Computers & Education, 51(3), 1142-1153. https://doi.org/10.1016/j.compedu.2007.11.003