Use of web-based research materials in education: Is uptake occurring?
Original proposal title: The use of web-based research materials: Using web analytics in conjunction with survey data to better understand how online research materials are used
Paper presented at CSSE, Montreal, 2010
Amanda Cooper, Hilary Edelstein, Ben Levin, Joannie Leung
Ontario Institute for Studies in Education
University of Toronto
Theory and Policy Studies in Education Department
252 Bloor Street West
The internet age
The rise of the internet offers new possibilities for research dissemination globally. The emergence of information technology has enabled people to access information at an ease and rate like never before (Dede, 2000). With nearly 2 billion internet users worldwide, the level of online activity is staggering and increasing exponentially: between 2000 and 2009 alone, internet usage worldwide increased by 400% (http://www.internetworldstats.com/stats.htm). Every sector has been transformed by the internet age, and the education sector is no exception.
Knowledge mobilization (KM) is about using research more to improve policy and practice in education. There is growing interest globally in understanding KM processes (Cooper, Levin & Campbell, 2009; Levin, 2010; Nutley, Walter & Davies, 2007). Researchers and educational organizations are increasingly using websites and the internet as the primary vehicle for the dissemination of research findings in the form of reports, knowledge sharing events, and the creation of interpersonal networks to support KM efforts (Hartley & Bendixen, 2001; Greenhow, Robelia & Hughes, 2009); consequently, investigating web-based dissemination and collaboration strategies (websites, social media, virtual networks, and so on) might better inform our understanding of KM in the current technological societal context.
We have begun exploring online KM strategies as part of the Research Supporting Practice in Education (RSPE) program (www.oise.utoronto.ca/rspe). RSPE, housed at OISE and funded through Dr. Ben Levin’s Canada Research Chair, is a program of research and related activities that investigates KM across different areas of the education system including perspectives of research producers, research users, and in relation to the emerging technological landscape. The increasing importance of the internet draws our attention to use of web-based research materials as an important area for additional research.
Currently, we have two studies underway which attempt to explore and evaluate the internet’s growing role in KM in education. One study is an analysis of KM strategies in educational and other organizations based on analysis of their websites (Qi & Levin, 2010). In this work we have developed inductively a common metric for assessing KM work as revealed on websites in terms of strategies (products, events and networks) and indicators as they relate to strategies (different types, ease of use, accessibility, focus of audience and so on). This analysis helps us understand the range of KM strategies being employed by different kinds of organizations, including research producers, research users and intermediaries.
Our data show that few organizations display a wide range of practices related to KM, and many organizations have virtually no KM activity (Qi & Levin, 2010). Many organizations focus on posting reports and research-related products online, with far less attention to building interaction through events or networks. However, as we looked at the range of different research products in a variety of formats for a variety of audiences, we also wondered whether and how much people are actually using these web-based research resources, a subject on which there appears to be very little empirical evidence.
While a great deal of effort goes into developing websites for sharing of research materials and resources, there is little or no empirical evidence on the value or impact of these strategies. In fact, we could find no studies of how people actually use web-based research material in education.
This study was conceived to explore the use of online research dissemination.
The research question guiding this work is:
How much and by whom are web-based research findings and analyses being used?
The study uses two data sources to determine the extent and nature of use of web-based research products. First, it uses web analytics to track website use. Second, survey data extend the web analytics by asking users directly about research use in ways that cannot be answered from usage data alone.
While our data are only beginning to accumulate from our various partner organizations, we hope this paper will stimulate discussion about this research design or other ways of studying the use of web resources. We expect that data from this study will shed light on what forms of online dissemination strategies are effective.
An overview of the paper
This paper is organized into four parts. First, we review related literature on knowledge mobilization and technology. Second, we draw on that review to develop a conceptual framework for studying the use of web-based research materials. Third, we describe the challenges and opportunities in studying research use online through our approach of using Google Analytics (GA) in conjunction with online surveys. Fourth, we provide some initial web metrics data from one of our partner organizations in order to begin a discussion about the possibilities and limitations of these data for gauging research use and answering KM questions.
This literature review is organized into three sections. The first outlines some key findings about knowledge mobilization generally that set the context for this study. The second examines the sparse literature that discusses KM in relation to the internet. The third section provides some introduction to the literature on web analytics and web metrics related to the study of KM.
What we know about KM
Research use is a multifaceted, nonlinear process that takes place within and between diverse organizations in the education system (Lemieux-Charles & Champagne, 2004; Levin, 2004; McLaughlin, 2008; Nutley et al., 2007). Factors affecting KM also arise at multiple levels including individual, organizational and structural, as well as environmental and contextual (Berta & Baker, 2004). From a cross-disciplinary review of the literature, Levin (2004) outlines that KM is a function of the interaction among three main areas: research producing contexts, research using contexts, and the organizations and processes that mediate between these two contexts. All of this takes place over time within a larger societal context.
Multiple iterations of use
Understanding of KM has been growing in the past decade due to increasing interest in the topic as a way to improve public services (Cooper et al., 2009; Davies, Nutley & Smith, 2000). However many important issues remain unexplored, particularly in education (Cooper & Levin, 2010; Nutley et al., 2007). The empirical evidence suggests that research use remains modest across sectors, especially in education (Behrstock, Drill & Miller, 2009; Biddle & Saha, 2002; Cordingley, 2008; Hemsley-Brown, 2004; Hemsley-Brown & Sharp, 2003; Lemieux-Charles & Champagne, 2004; Levin, Sá, Cooper & Mascarenhas, 2009; Pfeffer & Sutton, 2000).
The use and impact of research are difficult to measure (Amara, Ouimet & Landry, 2004). One reason is that research may inform our thinking in ways that are not overtly visible in behaviour, sometimes referred to as conceptual use. Research can be used in direct and observable ways (instrumental use) though this is typically less frequent (Amara et al., 2004; Landry, Amara & Lamari, 2001). So it can be hard to know whether or to what extent research has actually informed the thinking or actions of people or organizations.
The discussion of multiple kinds of research use is at least 30 years old, and still relies on Weiss’ (1979) foundational work on the many meanings of research utilization. Knott and Wildavsky (1980) also proposed seven levels of research utilization that remain relevant today: reception, cognition, reference, effort, adoption, implementation, and impact.
These sequential stages attempt to trace the different components involved from the time that someone actually receives a research related product to the point of impact resulting from that use.
The literature on KM indicates that research use happens over time. Incorporating research into policy and daily practice is not an instantaneous process; rather, a multitude of factors - from quality of the evidence to the credibility of the messenger, to the effort it takes on the part of practice organizations to implement evidence-based changes - all affect how quickly (if at all) KM occurs (Levin, 2004; Nutley et al., 2007; McLaughlin, 2008).
Timperley (2010) proposes that behaviour change takes at least three years to be fully incorporated. Her work involves intense and sustained interaction with teachers in order to have them use evidence (predominantly student assessment data disaggregated into different areas) to guide their practice. Others contend that incorporating research substantively takes much longer:
Studies in healthcare show that it can take a decade or more before research evidence on the effectiveness of interventions percolates through the system to become part of established practice. The abandonment of ineffective treatments in the light of damning evidence can be equally slow (Davies, Nutley & Smith, 2000, p. 10).
Many examples from the health and education sectors reinforce this point, such as the long road to increasing hand washing among health practitioners or the amount of time it took to end corporal punishment in schools.
These studies suggest that in order to understand how much research use is actually going on in the education system, studies need to attend to the issue longitudinally. This study includes a longitudinal element.
Audience and the format of research matters
Many studies have reported that tailoring research products for groups of stakeholders increases the likelihood of use (Cordingley, 2008; Biddle & Saha, 2002; Levin, Sá, Cooper & Mascarenhas, 2009). Our team found similar results in studying research use by principals in school districts (Levin, Sá, Cooper & Mascarenhas, 2009). On the other hand, Belkhodja et al. (2007) found that interaction and contextual considerations of production and practice environments were much more influential than the format of products.
Practitioners in the field have time and time again insisted that the format of the research influences whether or not they actually use it (Cordingley, 2008; Behrstock, Drill & Miller, 2009; Biddle & Saha, 2002; Levin, Sá, Cooper & Mascarenhas, 2009); however, this claim does not appear to have been tested. There is simply not enough empirical evidence yet to know whether adaptation of products or interaction and recognition of context are most important to research use. Our study will explore this issue to some extent by assessing which products are actually accessed and downloaded.
The importance of active mobilization of research
A considerable amount of research suggests that passive dissemination of research products has limited effectiveness (Armstrong, Waters, Crockett & Keleher, 2007; Grimshaw et al., 2006; Lavis, Robertson, Woodside, McLeod, & Ableson, 2003). If this is so, investing time and resources in passive online dissemination mechanisms also seems a doubtful strategy, yet one that is common. One cannot assume that research is being used just because it is freely available online. Research also provides growing evidence that successful dissemination efforts need to consider the audience and have dedicated staff and resources (Levin, 2008; Cooper et al., 2009).
Dede (2000) similarly cautions that the internet, if utilized in the same way that traditional research dissemination has occurred (for example simply transferring large quantities of data to practice settings), will not yield different results. Hence, he suggests that “reconceptualising the historic role of information technology in knowledge mobilization and use is central to its future effectiveness” (p. 3).
Linking KM and technological literature
The literature on KM in relation to technology is sparse. Although many contend that the internet and various websites can facilitate this work, we found only a few studies in the health sector that explicitly addressed the use of the internet to mobilize research knowledge.
Ho et al. (2004), in a conceptual paper, explore the potential synergy of research knowledge transfer and information technology, which they refer to as technology-enabled knowledge translation (TEKT). They provide evaluation dimensions and methodologies for TEKT including structural, subjective, cognitive, behavioural and systemic elements in order to help researchers compare successful models and characterize best practices of TEKT. However they do not provide any empirical data on these practices or ideas.
Dede (2000) discusses the role of emerging technologies explicitly in relation to knowledge mobilization, dissemination and use in education. He elaborates on three ideas to use the internet to spread best practice across educational organizations. First, “emerging information technologies enable a shift from the transfer and assimilation of information to the creation, sharing and mastery of knowledge” (p. 2). Here, active collaboration among stakeholders, facilitated through the internet, is seen as a way to co-construct knowledge in a more meaningful way, because it takes into account contextual factors and, as a result, increases uptake. Second, Dede highlights that “dissemination efforts must include all the information necessary for successful implementation of an exemplary practice, imparting a set of related innovations that mutually reinforce overall systemic change” (p. 2). He argues that interactive media can facilitate this process, but must include detailed plans along a number of important areas – leadership, professional development, and so on. Third, “a major challenge in generalizing and scaling up an educational innovation is helping practitioners ‘unlearn’ the beliefs, values, assumptions, and culture underlying their organization’s standard operating practices” (p. 3). He argues that professional rituals are deeply entrenched and that changing practitioners’ behaviours can be supported through virtual communities that provide social support for this difficult and sometimes threatening process.
Jadad (1999) argues that the internet provides opportunities for networking and partnerships in the health sector. But he also lists a number of conditions necessary for online KM to be effective: a better understanding of the way service users and practitioners use the internet; systems that are easy to access and use; rapid transmission systems (bandwidth, he argues, is still too slow in many parts of the world); and information that is relevant and in a format that is ready to use. Different strategies are needed to integrate the large volumes of available information in a meaningful way; virtual interaction may still need to be complemented by face-to-face meetings; and global access to technology is still needed to ensure global equity.
While these are interesting ideas, they provide little evidence on the actual use of web-based research materials.
Conceptual Framework
For purposes of this study we conceptualize use of web-based research material in terms of the interaction between three elements (Figure 1):
Research evidence: Various aspects of the research products influence use.
Type of resource (idea, product, contact, link)
Format (long or short print version, video, language)
Relevance (how tailored to particular users)
Audience: Characteristics of the website visitors influence use.
Role (parent, teacher, student, researcher, district administrator, journalist, interested citizen)
Purpose of visit to website (work, study, personal reasons)
Actual use over time: Comparing original intention to actual use.
Use over time (no use, undetermined usefulness, immediately useful, intended future use, actually used)
Sharing of materials (formally and informally; internally or externally to their workplace)
Type of use (conceptual, symbolic, instrumental)
Figure 1. Conceptual framework: Online research use as the interplay between evidence, audience and use over time.
This study involves our team partnering with educational organizations in Canada and abroad to investigate use of web-based research in education. The organizations vary in form and function; for example, one partner is a unit within a school district, while others are intermediary research organizations or have websites designed to be databases of relevant research.
The study uses two data sources to assess the extent and nature of use of research products found on the websites of participating organizations. First, web analytics track website usage in various ways. Second, we developed two surveys, administered at two different points in time, that ask visitors directly about their use of these web-based resources.
Using web analytics
Web analytics software provides useful data on the use of research materials from websites (Wikipedia, http://en.wikipedia.org/wiki/Web_analytics). Tracking and understanding web analytics allows us to understand the specific activity on a website, translated into metrics (Ledford & Tyler, 2007). Types of metrics include: hits, page views, visits, unique visitors, referrers, search engines, keywords, time spent on site, exit pages, entrance pages, bounce rate, repeat visits, subscribers and conversion rate (Table 1). These data exist for each page on a website and for each product on the site, allowing comparisons over time and across sites.
For this study we chose Google Analytics (GA) software because it is widely used already, including by most of our partner organizations, and because it offers a range of useful tools to analyze the data it provides. GA also allows our partner organizations to give us access directly to their data, facilitating our analysis.
Table 1. Google Analytics web metrics (Clifton, 2008; Ledford & Tyler, 2007; Page, 2008)
Dashboard: The general entry point for all analytics information. Clicking on any clickable element on the dashboard leads to the in-depth analytics section for that element.
Map overlay: A visual display of how many visitors from which countries have visited the site. The map overlay can also be broken down by region, province/state, and city, allowing specific parts of the world to be compared with one another.
Visitors overview: A segmentation of visitors: what language(s) they speak, where their network is located, and which browser/operating system they use. A pie graph shows the percentage of new versus returning visitors. Although this section is helpful on its own, it is most useful as a month-to-month comparison tool; comparing visitor counts and the visitor loyalty breakdown can indicate whether new visitors in one month have become returning visitors.
Traffic sources overview: Shows where visitors are referred to the website from, such as a search engine link, another website, or direct traffic to the site.
Content overview: The specific information relating to each page and/or document on the website, including how many people visited the page and the percentage of page views; often includes the unique views of each page.
Site usage: A summary of visits, page views, pages viewed per visit, bounce rate, average time on the site, and new visits. Clicking on any one of the headers brings up a further analysis of that metric.
Visits: A line graph plotting visits per day, marked at weekly intervals. Hovering over each point on the graph shows the number of visitors for that day.
Visitors: The section in which all visitor-related analytics can be found.
Time on site: The average time each visitor spent on the site per day. From this number we can infer how many pages a visitor read, whether they downloaded something, or, by breaking the number down, how many visitors bounced in and out of the site within a few seconds.
Bounce rate: The percentage of visitors on a given day who left the site within a few seconds of arriving. From this statistic we can infer that the visitor did not find what they were looking for, or landed on the wrong site.
Absolute unique visitors: The percentage, and the number (in brackets after the percentage), per day of completely new IP addresses tracked coming to the site.
Average page views: The approximate percentage and number of pages visitors viewed when coming to the site.
Unique page views: The number of unique page views, representing the number of individual visitors who have viewed each page.
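To make these metrics concrete, the basic site-usage figures can be derived from raw visit records. The sketch below uses entirely hypothetical visit data (not drawn from any partner site) to compute visits, page views, bounce rate, pages per visit, and average time on site:

```python
# Hypothetical visit records: (pages_viewed, seconds_on_site) per visit.
# These numbers are illustrative only, not data from any partner organization.
visits = [
    (1, 5),    # a "bounce": one page, left within seconds
    (4, 310),  # read several pages
    (2, 95),
    (1, 8),    # another bounce
    (6, 600),
]

total_visits = len(visits)
total_page_views = sum(pages for pages, _ in visits)

# Bounce rate: share of single-page visits.
bounce_rate = sum(1 for pages, _ in visits if pages == 1) / total_visits

# Average pages per visit and average time on site.
avg_pages_per_visit = total_page_views / total_visits
avg_time_on_site = sum(secs for _, secs in visits) / total_visits

print(f"Visits: {total_visits}")
print(f"Page views: {total_page_views}")
print(f"Bounce rate: {bounce_rate:.0%}")
print(f"Pages/visit: {avg_pages_per_visit:.1f}")
print(f"Avg time on site: {avg_time_on_site:.0f}s")
```

Analytics packages such as GA report these same aggregates per day and per page, so the figures above correspond to one row of the site usage summary.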
The first image that one sees on logging in to GA is the dashboard (Figure 2).
Figure 2. CEA Dashboard from Google Analytics September 1, 2009- April 26, 2010.
From the dashboard, users can view different reports to understand what pages visitors view, where visitors come from, and what products visitors access. Table 1 describes the definitions of different web metrics. In this paper, we specifically report on nine metrics: Content, site usage, visits, time on site, bounce rate, absolute unique visitors, page views, average page views and unique page views.
There are, however, limits to what Google Analytics can tell us. While the analytics tell us about the frequency of downloads of different formats of products (for instance, full reports versus executive summaries), they do not provide information about who visits the site or, more importantly, about what people do with the research information after their visit. Since actual use of resources is our fundamental interest, we developed a two-survey model to use in conjunction with web analytics to deepen our understanding of the use of web-based materials.
A two-part survey
We are using a two-part survey. When people visit one of our partner sites, they are invited to take part in a short survey (Appendix A) that asks whether they found useful information on that visit to the site and about their plans for using any such information. They are also invited to take part in a second survey (Appendix B), sent to them at a later date, that asks about their actual use of the materials or resources since their initial visit. The second survey is circulated to those who volunteer either 30, 60 or 90 days after their initial visit.
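The follow-up timing described above can be sketched as a simple scheduling step. The function name and the random assignment of volunteers to a 30-, 60-, or 90-day wave are our illustrative assumptions, not the study's actual procedure:

```python
import datetime
import random

FOLLOW_UP_WAVES = (30, 60, 90)  # days after the initial visit

def schedule_follow_up(initial_visit: datetime.date,
                       rng: random.Random) -> datetime.date:
    """Assign a volunteer to a 30-, 60-, or 90-day follow-up wave
    (hypothetical assignment rule for illustration)."""
    wave = rng.choice(FOLLOW_UP_WAVES)
    return initial_visit + datetime.timedelta(days=wave)

# Fixed seed so the sketch is reproducible.
rng = random.Random(42)
visit = datetime.date(2010, 4, 26)
print(schedule_follow_up(visit, rng))
```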
Both surveys (Table 2) focus on whether the research-related products or resources are used at all and, if so, whether this use is conceptual (informing thinking on future issues, and so on) or instrumental (affecting the user's research, work, or practice, or how the user does their work in context). The surveys also ask whether participants share the information with others, formally or informally, inside or outside their organization.
Table 2. Survey questions in relation to type of use and time
Intention /Use over Time
Type of Use
Undetermined Use at this time
Intended future use
Actual Use (as determined by follow-up survey)
Q9, Q13, Q14, Q15, Q16
Level of Impact
We currently have, or are about to have, eight partner organizations in place: two in Canada and six in England. Each partner organization attempts to share research information in education by making it available on its website. We hope to recruit additional partners in the coming year; there is in principle no limit to how many organizations could take part. Partners have very little work to do: they provide us with access to their GA data, embed some tracking codes on particular pages and products, and embed our initial survey on their site.
The benefit for the partner organizations is the data analysis and reporting we provide on the use of their web-based research-related materials. We also provide each partner with anonymized comparative data on the other study participants. This will allow organizations to see how the take-up of research resources on their site compares with other educational organizations and should help them improve their sharing of research-related products.
Since each educational organization has different goals and, as a result, different content and layouts of their websites, we work with each partner to identify and track some particular ‘targets’ on their website. Targets can refer to a number of different things depending on the website – a particular web page, a product, an initiative that is linked to multiple products, and so on. Our analysis then focuses on these targets. We use ratios to compare different targets in order to gauge intensity of uptake of research materials in relation to other kinds of information within and between organizations (Figure 3).
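The ratio analysis can be sketched as follows; the target names and page-view counts are hypothetical, chosen only to illustrate how uptake of research targets might be compared within a site:

```python
# Hypothetical monthly page views for tracked 'targets' on one partner site.
# All names and numbers are invented for illustration.
page_views = {
    "research_report": 420,
    "executive_summary": 1260,
    "events_page": 300,
    "site_total": 8400,
}

def uptake_ratio(target: str, baseline: str, views: dict) -> float:
    """Traffic to one target expressed relative to another target."""
    return views[target] / views[baseline]

# Within-site comparison: executive summaries vs. full reports.
summary_to_report = uptake_ratio("executive_summary", "research_report",
                                 page_views)

# Research targets as a share of all traffic on the site.
research_share = (page_views["research_report"]
                  + page_views["executive_summary"]) / page_views["site_total"]

print(f"Summary views per full-report view: {summary_to_report:.1f}")
print(f"Research share of total page views: {research_share:.0%}")
```

Computing the same ratios for each partner site would allow the between-organization comparisons described above, since ratios normalize for differences in overall site traffic.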
Figure 3. Metrics analysis framework examining research-based targets within and between educational organizations.
Tracking different targets within a single website allows an organization to compare uptake of different initiatives or products. Tracking several sites over time provides the opportunity to compare them in terms of their ‘power’ to generate visitors to and downloads of material related to research findings in education. Looking at these data across sites and times will allow us to understand more about how, in general, web-based products are used and which kinds of approaches seem to have the greatest impact.