
Results and Discussion

Characteristics of the Studies

Before considering the findings reported in research on the effectiveness of study skills programs, we examined the characteristics of the studies from which conclusions about this research will be drawn. The first characteristic was the quality of the studies, as we wished to exclude studies of low quality. Quality was assessed on the basis of consensus among the independent ratings of the three authors. The prime concerns were that a study be based on reasonable sample sizes, have a control (e.g., pretest and posttest, or control and experimental groups), and use reliable tests. The eight low-quality studies (with 14 effect sizes) had a greater mean effect size than the remaining studies.
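Each effect size in a review of this kind is a standardized mean difference between an experimental and a control condition. A minimal sketch of that computation, using Cohen's d with a pooled standard deviation and entirely hypothetical posttest scores (not data from any study in the review):

```python
import statistics

# Hypothetical posttest scores -- illustrative only, not drawn from any
# study included in this meta-analysis.
experimental = [78, 85, 90, 72, 88, 81]
control = [70, 75, 80, 68, 77, 73]

n_e, n_c = len(experimental), len(control)
var_e = statistics.variance(experimental)  # sample variance (ddof = 1)
var_c = statistics.variance(control)

# Pooled standard deviation across the two groups.
pooled_sd = (((n_e - 1) * var_e + (n_c - 1) * var_c) / (n_e + n_c - 2)) ** 0.5

# Standardized mean difference (Cohen's d).
d = (statistics.mean(experimental) - statistics.mean(control)) / pooled_sd
```

Dividing by the pooled standard deviation puts group differences from studies with different outcome measures onto a common scale, which is what allows effects to be aggregated across studies.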

Following typical advice for conducting these analyses, the low-quality studies could not be meaningfully included in the final sample and thus were dropped from all further analyses. Table 2 shows many of the characteristics of the various effect sizes. As shown by the central tendencies of these characteristics, studies generally were based on reasonably large sample sizes using school-age students and were published relatively recently in journals (median = 1986). The programs were implemented by teachers for classes of students, although, as will be shown later, the majority of study skills packages (a) were implemented in universities wherein students self-selected to participate, (b) were conducted for atypical students (the low, high, and underachievers), (c) used a variety of study skills assessments, and (d) included 96 students (range 7 to 226). There were 30 effect sizes that included follow-up evaluations, typically of 108 days, and at follow-up the effect sizes declined to an average of 0.10.

Overall Summary of the Relative Evaluation of Study Skills Programs

In presenting the findings of our meta-analysis, we first consider the overall effect sizes of the study skills programs and then report a number of models showing that several characteristics of the studies moderated the overall results. The mean weighted effect size was 0.45, with a standard error of 0.03, and the overall homogeneity statistic was 3,246.99 (df = 269, p < .001), which indicates that the overall mean may not be the most typical value, as there are many moderator variables that mediate this mean. When the study was the unit of analysis, the mean weighted effect size was 0.63. A stem-and-leaf diagram of these effects is presented in Figure 1. As can be seen, there is marked positive skew. A close inspection of the quality of the studies that produced the 26 effect sizes greater than 1.4 did not reveal any pattern. These 26 effects came from 11 different studies, and the mean across all effect sizes within these 11 studies was close to the overall mean, which indicates that the largest effects were not unique to any particular cohort. A mean of 0.45 can be interpreted with reference to other influences on outcomes in education. Hattie (1987, 1992) outlined a measurement procedure for ascertaining the typical effect of most innovations in education. Based on a synthesis of 304 meta-analyses, he ascertained that an effect size of 0.40 was a benchmark against which various innovations could be interpreted (Table 3).
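The weighted mean effect size and homogeneity statistic reported above follow the standard fixed-effect formulas: each effect size is weighted by its inverse variance, the standard error of the mean is the square root of the inverse summed weights, and Q sums the weighted squared deviations from the weighted mean. A small sketch with hypothetical effect-size/variance pairs (not the 270 effects analyzed here):

```python
import math

# Hypothetical (effect size, variance) pairs -- illustrative only.
effects = [(0.45, 0.02), (0.80, 0.05), (0.10, 0.03), (0.63, 0.04)]

# Fixed-effect weights are inverse variances.
weights = [1.0 / v for _, v in effects]

# Inverse-variance weighted mean effect size.
mean_es = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)

# Standard error of the weighted mean.
se = math.sqrt(1.0 / sum(weights))

# Homogeneity statistic Q, distributed as chi-square with df = k - 1 under
# the hypothesis of a common effect; a Q far exceeding df (as with the
# 3,246.99 on 269 df reported above) signals that moderator variables
# likely explain variation around the mean.
q = sum(w * (d - mean_es) ** 2 for (d, _), w in zip(effects, weights))
df = len(effects) - 1
```

Weighting by inverse variance gives more precise (typically larger-sample) studies more influence on the pooled mean, which is why the weighted mean can differ from a simple average of the effects.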

That is, across the 304 meta-analyses, based on more than 40,567 studies, the typical effect size in educational interventions was 0.40. Of course, this is a global benchmark, and more refined comparisons can be made to interventions similar to the study skills interventions considered here. Table 3 presents a range of innovations, and it can be seen that the overall effect of study skills programs is close to the typical benchmark figure of 0.40. There were six effects relating to study skills embedded within other meta-analyses. These are listed in Table 4, and three are greater than the typical classroom innovation effect size.
