Use of web-based research materials in education: Is uptake occurring?


A preliminary example from piloting the project with CEA

The research partner whose data we report in this paper is the Canadian Education Association (CEA). The mission of the CEA, an organization founded more than a century ago, is to initiate and sustain dialogue throughout Canada influencing public policy issues in education for the ongoing development of a robust, democratic society and a prosperous and sustainable economy. The CEA relies on good theory and research evidence as the foundation on which to build shared understanding and commitment with organizations that share its values and purposes (http://cea-ace.ca/abo.cfm, 2009). Because it is a national organization with a small staff, it relies heavily on dissemination strategies, including its website.

We focus our exploratory findings from the Google Analytics data for CEA on three targets:


  1. Comparing CEA’s research and policy page to other pages in regard to page views, average time on page and bounce rate

  2. Comparing which products (PDFs) are accessed the most, with a focus on comparing full reports versus executive summaries

  3. Comparing the uptake of two research-based initiatives: What Did You Do in School Today (WDYDIST) and the CEA’s study of the Ontario Primary Class Size initiative.

What Did You Do in School Today (WDYDIST) is a research project that gathers survey data from middle and secondary school students across Canada to explore their social, academic and intellectual engagement. We tracked five research-related products from this project:



  • National report (52 pages)

  • Summary report (4 pages)

  • Two supporting document reports that included a report on student engagement (26 pages) and a teaching effectiveness framework and rubric (18 pages)

  • FAQ document (5 pages).

Each of these documents is available as a PDF in English and in French in several parts of the CEA website, including on the homepage, the main research page, and a specific WDYDIST page. The New & Noteworthy page also includes various announcements pertaining to the project in June 2009, August 2009, and September 2009 as the media picked up on the project.
The Class Size Reduction Initiative is a research project that evaluates the Ontario government’s implementation of a class size reduction policy, which, as of 2008, reduced class sizes to 20 or fewer students in 90% of Ontario primary classrooms. We tracked six research-related products from this initiative:

  • National Report (22 pages)

  • Executive Summary (2 pages)

  • Evaluation Report (140 pages)

  • Question and Answer document (1 page)

  • Literature review on class size reduction (36 pages)

  • A paper published in the CEA quarterly magazine in the fall of 2008 (4 pages).

As with WDYDIST, these documents are available as PDF files in both official languages in multiple locations on the CEA website, including the homepage, two locations on the main research page, and the New & Noteworthy page, which carried announcements pertaining to the release of the full report in February 2010.
We have analytics data for overall site usage from September 2009 through April 2010 (page views, unique page views, average time spent on page, and bounce rate), with a focus on comparing research-related pages to non-research pages. For the research initiatives, we report data for the product-specific targets from February 2010 (when the appropriate tracking code was inserted) through April 2010.
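The paper does not reproduce the tracking code itself. As a rough illustration of the standard approach in this period, the sketch below uses Google Analytics' asynchronous ga.js API, in which a PDF download (a file that cannot execute JavaScript itself) is recorded as a "virtual" pageview fired from the link's click handler. The property ID, file path, and helper name are placeholders, and whether the CEA site used exactly this pattern is an assumption.

    // Hedged sketch of download tracking with the asynchronous ga.js API of
    // this period; the property ID and paths are placeholders, not CEA's code.
    declare var _gaq: any[][];

    _gaq = (window as any)._gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXXX-X']); // placeholder web property ID
    _gaq.push(['_trackPageview']);              // ordinary pageview for the HTML page

    // A download is recorded as a "virtual" pageview so that it appears in
    // reports alongside real pages (e.g. the product-specific targets above).
    function trackPdfDownload(virtualPath: string): void {
      _gaq.push(['_trackPageview', virtualPath]);
    }

    // Usage in a link (hypothetical file name):
    // <a href="/pdf/wdydist-report-en.pdf"
    //    onclick="trackPdfDownload('/pdf/wdydist-report-en.pdf')">Report</a>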
From these three targets we noticed:


  1. Visitors tend to view non-research pages more often but spend more time on pages with research-related content

  2. The WDYDIST initiative showed greater uptake than the Class Size Reduction project

  3. Visitors accessed longer versions of reports more than short versions where both were available


Visitors tend to spend the most time on pages that have research-related content but view non-research pages more
From September 2009 through April 2010 the CEA website was visited more than 200,000 times. The pages with the most views, shown in Table 3, are not research related. On these non-research pages, visitors spent an average of 30-50 seconds. In contrast, visitors spent the most time, on average, on pages with research-related content: 2:33 on the WDYDIST page and 3:49 on the Focus on Literacy page. Although visitors spent more time on these research-related pages, the bounce rate (see Table 3) was also highest on these pages and lowest on the pages that had general information about what CEA does.
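Two of these metrics interact in a way that helps in reading Table 3 below. In classic Google Analytics a "bounce" is a single-page session, and average time on page is broadly the gap to the next pageview within the same session, so bounced visits and exits contribute no time at all; a page can therefore show a high bounce rate and a high average time on page at once. The sketch below, on hypothetical hit data, mirrors these definitions.

    // Minimal sketch, on hypothetical hit data, of how classic Google
    // Analytics derives average time on page and bounce rate.
    type Hit = { session: string; t: number; page: string }; // t in seconds

    const hits: Hit[] = [
      { session: 's1', t: 0, page: '/wdydist' },    // single hit: a bounce
      { session: 's2', t: 0, page: '/wdydist' },    // single hit: a bounce
      { session: 's3', t: 0, page: '/wdydist' },
      { session: 's3', t: 229, page: '/' },         // 229 s spent on /wdydist
    ];

    function metrics(all: Hit[], page: string) {
      const sessions = new Map<string, Hit[]>();
      for (const h of all) {
        if (!sessions.has(h.session)) sessions.set(h.session, []);
        sessions.get(h.session)!.push(h);
      }
      const gaps: number[] = [];
      let landings = 0;
      let bounces = 0;
      for (const s of sessions.values()) {
        s.sort((a, b) => a.t - b.t);
        if (s[0].page === page) {
          landings += 1;
          if (s.length === 1) bounces += 1;  // one-page session = bounce
        }
        // Time on page = gap to the next pageview; exits contribute nothing.
        for (let i = 0; i + 1 < s.length; i++) {
          if (s[i].page === page) gaps.push(s[i + 1].t - s[i].t);
        }
      }
      const avgTime = gaps.length ? gaps.reduce((a, b) => a + b, 0) / gaps.length : 0;
      const bounceRate = landings ? (100 * bounces) / landings : 0;
      return { avgTime, bounceRate };
    }

    console.log(metrics(hits, '/wdydist')); // avgTime: 229, bounceRate: ~66.7

On this toy data two of the three landings bounce, yet the average time is high because only the one non-bounced view contributes any time, consistent with the pattern seen on the WDYDIST and Focus on Literacy pages.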
Table 3
All page views, September 2009-April 2010

Rank | Page | Page views | Unique page views | Average time on page (min:sec) | Bounce rate
1 | Home page | 25,070 | 18,608 | 1:04 | 41.60%
2 | Education Canada publication page | 6,663 | 5,007 | 0:47 | 51.55%
3 | About CEA | 6,537 | 4,565 | 0:38 | 39.46%
4 | CEA publication page | 6,090 | 4,383 | 0:27 | 20.34%
5 | Research and policy main page | 6,089 | 4,169 | 0:45 | 43.90%
6 | WDYDIST page | 5,702 | 4,012 | 2:33 | 68.68%
7 | FAQ | 5,545 | 4,932 | 1:50 | 68.60%
8 | Focus on Literacy page | 4,467 | 3,752 | 3:49 | 78.76%
9 | Education Canada – Spring 2010 page | 3,693 | 2,521 | 1:18 | 52.70%
10 | Focus On – main page | 3,484 | 2,478 | 0:29 | 41.09%

Not surprisingly, the CEA home page had substantially more views than the target research pages across the eight months of tracking.



Figure 4. Comparison of home page views to research page targets for CEA.
We also compared time spent on the target pages (Table 4).
Table 4
Average time spent on page (min:sec) per month: home page, Research and Policy page, and WDYDIST page

Month | Home page | Research and Policy | WDYDIST
September | 1:04 | 0:39 | 2:17
October | 1:01 | 0:48 | 3:02
November | 0:59 | 0:51 | 2:54
December | 0:58 | 0:54 | 3:07
January | 1:18 | 0:49 | 3:45
February | 1:06 | 0:44 | 2:24
March | 1:03 | 0:42 | 1:23
April | 0:56 | 0:34 | 1:49

Visitors spent the most time on the WDYDIST page; it should be noted that this page has a series of 2-3 minute videos embedded in it. Because the videos are embedded directly on the page, we cannot track access to them separately, but they may account for the additional time visitors spent there.
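As an aside, the ga.js API of the period did include event tracking, so video plays could in principle have been captured had the embedded player exposed a play callback; the sketch below is hypothetical and assumes such a hook exists.

    // Hypothetical sketch: recording plays of an embedded video as a Google
    // Analytics event. onVideoPlay is assumed to be wired to the player's
    // (unspecified) play callback; the category/action labels are invented.
    declare var _gaq: any[][];

    function onVideoPlay(videoId: string): void {
      // _trackEvent(category, action, label) is part of the classic ga.js API.
      _gaq.push(['_trackEvent', 'WDYDIST videos', 'play', videoId]);
    }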


To compare these data in another way, Figure 5 shows views of the Research and Policy and WDYDIST pages as a percentage of homepage views.


Figure 5. Comparison of CEA homepage, Research and policy page and WDYDIST page.
While access to both research page targets is low in comparison to the homepage, WDYDIST activity peaked in November 2009 with 1,046 page views that month. This peak corresponded to a media release and additional media attention surrounding the initiative.
We also examined the ten most accessed products (PDFs) on the whole CEA site to see how many were research related (Table 5).
Table 5
Top 10 accessed PDFs from the CEA website, February through April 2010

Rank | Page | Page views | Unique page views | Average time on page (min:sec)
1 | 2009-2010 School Calendar | 777 | 686 | 2:22
2 | WDYDIST National Report | 284 | 270 | 3:35
3 | Public Education in Canada: Facts, trends and attitudes (2007) | 176 | 169 | 3:53
4 | Beyond doing school: From stressed-out to engaged in learning | 137 | 119 | 2:11
5 | WDYDIST Teaching effectiveness framework and rubric | 134 | 123 | 2:52
6 | Democracy at Risk article | 119 | 106 | 2:09
7 | WDYDIST Student engagement report | 119 | 106 | 2:22
8 | KI-ES-KI contact handbook order form | 98 | 86 | 2:04
9 | Class size National Report | 68 | 58 | 3:20
10 | A vision for early childhood education and care article | 62 | 61 | 2:08

The PDF with school calendar information from across Canada was by far the most frequently accessed document. Also among the top ten were the WDYDIST National Report, the WDYDIST teaching effectiveness report, the WDYDIST student engagement report, and the KI-ES-KI contact handbook order form.


In addition to being the most accessed product, the school calendar had an average view time of 2:22, which is similar to the view times of the research-related products. The KI-ES-KI order form had the shortest average time on page among the top ten, at 2:04.
Comparing the uptake of two research-based initiatives
We were interested in comparing the uptake (measured as frequency of access to the research-related products) of the two CEA target initiatives: WDYDIST and the Class Size Reduction project. We found that the WDYDIST initiative had greater uptake than the Class Size Reduction project (Tables 6 and 7).
Table 6
Top content: PDFs relating to the What Did You Do In School Today project, February 2010-April 2010

Rank (site-wide) | Page | Page views | Unique page views | Average time (min:sec)
5 | WDYDIST Teaching effectiveness report | 134 | 123 | 2:52
7 | WDYDIST Student engagement report | 119 | 106 | 2:09
19 | WDYDIST National Report Summary | 39 | 38 | 2:59
52 | WDYDIST FAQ document | 16 | 15 | 1:36
72 | WDYDIST National Report – French | 11 | 9 | 3:41

Table 7
Top content: PDFs relating to the Class Size Reduction project, February 2010-April 2010

Rank (site-wide) | Page | Page views | Unique page views | Average time (min:sec)
9 | External view of the Class Size National Report | 68 | 58 | 3:20
15 | External view of the Class Size Evaluation Report | 51 | 44 | 2:03
18 | Literature review of Class Size Reduction | 41 | 35 | 1:29
27 | Class Size Executive Summary | 28 | 24 | 3:06
61 | Q and A document | 13 | 13 | 0:38
62 | Evaluation Report | 13 | 12 | 0:55

The WDYDIST materials have been on the CEA website longer (the project launched in May 2009). The initiative also has its own webpage and a more diverse set of products, including videos and downloadable documents. WDYDIST has thus applied more mobilization strategies in terms of both products and media attention, with frequent news releases at key points within the two-year research project. In contrast, the Class Size Reduction report was only released in February 2010 and, so far, less space has been devoted to this initiative on the CEA website and it has received less media attention (there has been only one news release on the project).


Visitors accessed longer versions of reports more than they did short versions where both were available
We were interested in exploring the frequent claim that readers prefer to access shorter versions of research such as executive summaries. For both of these initiatives on the CEA site, in fact, the longer reports were viewed more often than the shorter versions. The WDYDIST teaching effectiveness framework (18 pages) was viewed 134 times and the student engagement report (26 pages) 119 times, while the summary report was viewed only 39 times in the reported time frame. For the Class size project, visitors viewed the National Report (22 pages) 68 times and the Evaluation Report (140 pages) 51 times externally and 13 times internally, whereas the Executive Summary (2 pages) was viewed 28 times. In addition, visitors spent more time on the longer reports than on the summaries: in the case of WDYDIST, an average of 2:59 on the summary version of the national report versus 3:35 on the long version; for the Class size project, 3:06 on the Executive Summary but 3:20 on the National Report.
Survey findings
At the time of writing, we do not have enough responses to our online surveys to report any data. Currently about 1% of visitors are responding to the initial survey, while the second, follow-up survey is too new to report uptake or results. As we add more partners we will have more data from both of these instruments.

Conclusion

Dissemination of research materials through the internet is a ubiquitous practice that consumes considerable resources, yet we have virtually no knowledge about its impact. This paper outlines a study currently underway that seeks to fill some of that gap, including its conceptual basis and examples of the kind of data it will provide. Web analytics applied over time and across organizations will increase our understanding of the kinds of strategies and products that are most effective in attracting attention. Our two-part survey, if effective, will begin to provide information on how, and how much, people actually use the materials they obtain from various websites. Both approaches will add to the base of empirical knowledge on effective mobilization of research in education.


Appendix A: Initial CEA Survey


Appendix B: Follow-up Survey