
 From Now On
The Educational Technology Journal


 
Vol 10|No 1|September|2000

 

Finding Your Way
Through The Data Smog:

Enabling Empowered Decision Making
with Free Online Tools

by Joe Slowinski
joe.Slowinski@chadwick-k12.com
about the author

My intent in this article is to offer a process to help technology leaders prepare, collect, and analyze data associated with technology integration at their schools. Do you sometimes feel overwhelmed by the amount of information your school or district is attempting to capture or analyze? Well, you are not alone. Clearly, one of the most important issues facing school leaders today, in an increasingly high-stakes accountability environment, is managing data flow. For our purposes, data flow is the process of collecting and analyzing information in a school environment for the purpose of improving learning.

From my discussions with other educators, most perceive this challenge in the terms of David Shenk’s recent title, "Data Smog: Surviving the Information Glut," or as Jamie McKenzie articulated in a past summer issue of FNO: "[n]ow we may drown in infoglut - oceans of unreliable and undifferentiated data that often confound and confuse rather than illuminate. We suffer from noise. We search for relief. We need help!"

This call for assistance is justified. We have entered an era of rigid accountability where educational leaders have been called upon to justify their decisions. In fact, policy makers are applying pressure on school personnel to become data collection agents in the almighty name of accountability. Is this justified? If you consider the cost, especially of technology, perhaps it is.

To put the cost issue in perspective, consider the amount we have allocated to schools. Nearly $25 billion has been spent on K-12 educational technology since 1991. In 1999 alone, school district allocations combined to exceed $5 billion. In addition, between 1995 and 1999, state legislature allocations exceeded $4 billion. What has this money really done for schools? We do know that these funds have increased access to technology. As of today, approximately 95 percent of schools and 65 percent of classrooms have access to the Internet (Williams, 2000).

Clearly access is fundamentally important to education, but, in my opinion, most have not yet imagined the real power of technology - the ability to facilitate learner-centered environments AND the ability to enable decision makers to collect and analyze data more effectively, more frequently, and more easily. Once we begin to utilize technology to support formative decision making, we will move to the cusp of improving education on a more consistent and ongoing basis. Yet, to be powerful, this will require clearing away the data smog in an effort to eliminate, as much as possible, the infoglut that can exist in every room, building, district, and community.

To clear away the data smog, I recommend that schools engage in a cycle of improvement: preparing for data, collecting it, analyzing it, and revising questions and practices before beginning again. The sections that follow walk through this cycle.

Clearing the Smog:
Preparing for Data

Using data is a process that begins with conceptualizing what it is you want to do and why you want to do it. In essence, this requires brainstorming and reflection on the part of the school in an effort to develop and ask fundamentally good questions. In fact, before thinking about adapting any tool, whether free or not, you should try to articulate and document answers to questions similar to these:

  • Why do you need to collect data? What is the purpose of data collection?
  • What questions are you seeking answers to?
  • What information is necessary to answer these questions?
  • What method is best to collect this information without demanding too much of participants’ time?
  • Who should participate? Which variety of voices should be heard? When do I gather information from students, parents, and other community members?
  • What questions did other schools or districts attempt to answer?
  • What other research can I do to improve the quality of questions?

To illustrate this process in a case, imagine the arrival of a new Director of Technology charged with improving the school’s technology. Sally (we might call her) would be interested in answering the following:

Reason for collecting data

(1) to gather baseline and formative information in an effort to measure the impact of her professional development program;
(2) to obtain information about teachers’ professional development practices and perceptions about technology, as well as their willingness to utilize technology (correlations exist between these factors and technology use; see Becker, 1998; Rockman, 1998).

What beginning questions are you seeking answers to?

  1. What barriers exist for faculty in regard to integrating technology into the curriculum?
  2. What is the professional development culture of individual staff members as well as the school in general?
  3. What is the best time for faculty to engage in professional development activities?
  4. What are teachers interested in learning?

What information is necessary to answer these questions? What method is best to collect it without demanding too much of participants’ time? Who should participate?

(1) survey data gained from asking teachers what barriers have existed for them, plus interviews with administrators to determine which past programs were successful in promoting cultural change;
(2) survey and interview questions about faculty, gathered from faculty as well as administrators;
(3) survey data, interviews, and focus groups;
(4) survey data, interviews, and focus groups.
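
Because much of this information will arrive as survey responses, it helps to plan early for how raw responses will be tallied. The sketch below is only an illustration, not part of any tool discussed later in this article: it assumes a hypothetical barriers_survey.csv export with one row per teacher, a respondent_id column, and a 1-5 rating for each barrier, and it reports how many teachers rate each barrier as serious.

    import csv
    from collections import Counter

    # Hypothetical export: one row per teacher, a respondent_id column, and a
    # 1-5 rating per barrier (5 = severe). File and column names are assumptions.
    SERIOUS = 4  # treat ratings of 4 or 5 as a serious barrier

    def tally_serious_barriers(path="barriers_survey.csv"):
        counts = Counter()
        total = 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                total += 1
                for barrier, rating in row.items():
                    if barrier == "respondent_id" or not rating:
                        continue
                    if int(rating) >= SERIOUS:
                        counts[barrier] += 1
        # Report the share of staff naming each barrier as serious
        for barrier, n in counts.most_common():
            print(f"{barrier}: {n} of {total} teachers ({100 * n / total:.0f}%)")

    if __name__ == "__main__":
        tally_serious_barriers()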

What questions did other schools or districts attempt to answer?

What other research can I do to improve the quality of questions?

Resources such as the following, offering school case studies written by individuals who have been through this process, are excellent for improving your own questions:

Data Collection

In the grand scheme of things, reflection is the most important phase of a data collection and analysis process. Consider the fact that the questions you form will drive the information that you collect. Consequently, be patient and attempt to consider diverse perspectives. As William James once wrote, "[g]enius, in truth, means little more than the faculty of perceiving in an unhabitual way."

Second, after reflecting on important driving questions, you will need to think about the collection of data as a cultural process. In other words, prior to implementing a data collection process, consider what potential barriers will exist. Data collection is often perceived as simply a rational activity to inform decision-making. Historically, and all too frequently today, data is collected for summative evaluative purposes; school administrators evaluate teacher performance once or twice a year. Because of this reality, the culture of a school can resist the introduction of new data collection efforts.

For example, if the collection of data is not a normal function of the school environment, or if data is only collected to evaluate performance at the end of a year in a summative fashion, faculty members may be intimidated by the introduction of a data collection process. In fact, many staff members may resist a call for data without knowing how this information will be used and why it is being collected. Recall your own experiences and feelings when school policies and practices changed without notice, or when you were asked to provide personal information without justification. Moreover, data collection initially requires building trust. In an effort to facilitate support, be open with all participants. I call this process "transparency." Rationales and objectives should be transparent for those who will invest valuable professional time in answering survey questions, interview questions, or focus group activities.

Transparency must be reinforced with careful ethical consideration of the privacy and use of data. In other words, how you will use the information you gather must be an ethical consideration embedded in your reflection on your institution’s norms and on potential resistance by faculty members. As you weigh these ethical considerations, ask the following questions:

  • Who should gather the data?
  • Who will see the data?
  • How will the data be utilized?
  • How frequently do you collect data?

These questions are critical to the entire process and must be considered far in advance, with key stakeholders involved in the discussion. But, more importantly, share this information openly with participants. This will lead to increased professional trust.
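
One concrete way to keep these commitments is to strip identifying information from responses before anyone analyzes or shares them. The sketch below is a minimal illustration under assumed file and column names (raw_responses.csv with a teacher_name column, neither of which comes from this article): it replaces each name with a salted one-way code so a teacher’s pre- and post-surveys can still be linked without revealing who answered.

    import csv
    import hashlib

    # Illustrative only: assumes a raw survey export with a "teacher_name" column.
    # A salted one-way hash replaces names so the analyst can link a teacher's
    # surveys over time without ever seeing who they are.
    SALT = "replace-with-a-locally-kept-secret"

    def anonymize(in_path="raw_responses.csv", out_path="anonymous_responses.csv"):
        with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            fields = ["respondent_code"] + [f for f in reader.fieldnames if f != "teacher_name"]
            writer = csv.DictWriter(dst, fieldnames=fields)
            writer.writeheader()
            for row in reader:
                name = row.pop("teacher_name")
                row["respondent_code"] = hashlib.sha256((SALT + name).encode()).hexdigest()[:10]
                writer.writerow(row)

    if __name__ == "__main__":
        anonymize()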

Analyzing: Data-Driven Decision Making

Prior to identifying a data collection tool or combination of tools, it is prudent to develop hypotheses and consider analysis techniques. By thinking about the analysis stage before adapting a tool, you will choose one that aligns more closely with your needs.

Let’s return to our hypothetical technology director. During this phase of analysis, she wants to test assumptions and measure the impact of programs. From research on innovation and technology professional development, she anticipates that teachers will move through a five-level growth process. For each level, she has planned specific types of professional development activities and support mechanisms.

Educational Technology Use and Implementation Growth Process

  • Level I. Participant perspective (target): Personal Productivity. Role of teacher in PD process: Learner. Focus of PD: Individual Growth. PD products: Mentor, Modeling, On-line Just-in-Time (JIT), Physical & Virtual Help Desk, Workshops, Lunch Bytes.
  • Level II. Participant perspective (target): Professional Productivity. Role of teacher in PD process: Adopter. Focus of PD: Support Instruction. PD products: Mentor, Modeling, On-line JIT, Workshops, Help Desk, Resource Search Engine, Lunch Bytes.
  • Level III. Participant perspective (target): Professional Enrichment. Role of teacher in PD process: Adapter. Focus of PD: Enrich Instruction. PD products: Mentor, Modeling, On-line JIT, Workshops, Resource Search Engine, Lunch Bytes.
  • Level IV. Participant perspective (target): Paradigm Shift. Role of teacher in PD process: Refiner. Focus of PD: Support Assessment. PD products: Virtual Mentor, Mentor Modeling, On-line JIT, Workshops, Resource Search Engine.
  • Level V. Participant perspective (target): Vision. Role of teacher in PD process: Leader. Focus of PD: Student-Centered Learning. PD products: Virtual Collegial Collaboration, Incentives, Facilitation Opportunities, Showcase, Resource Search Engine.

Adapted from Sherry et al. (2000); Rogers (2000); Rogers (1983); Hall & Hord (1987)

At each level of professional growth, she has conceptualized appropriate themes and topics for that level. Her assumption is that these issues, skills, and knowledge are aligned with the level at which they are meant to operate under this model. These hypotheses have been generated from past experience and research, but to verify them she will need to collect data for each activity at each level. Once data is collected, she will need to analyze this information and determine how closely her assumptions were met. If incongruence exists, she should conduct follow-up interviews and administer surveys to determine the reasons staff members perceive for the gap.
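
To make that verification concrete, the short sketch below shows one way to check congruence between the model and actual staff requests. The topic-to-level mapping echoes the tables in this section, but the request data and the two-level follow-up threshold are hypothetical stand-ins for Sally’s real records.

    # Illustrative sketch: checks how well actual workshop requests line up with
    # the level each topic was planned for. The request data below is invented.
    planned_level = {
        "Utilizing E-mail": 1,
        "Building & Using Web Quests": 3,
        "Project Based Learning": 4,
        "Conducting Action Research": 5,
    }

    # (teacher's assessed growth level, requested topic)
    requests = [
        (1, "Utilizing E-mail"),
        (2, "Project Based Learning"),
        (3, "Building & Using Web Quests"),
        (4, "Project Based Learning"),
    ]

    matches = sum(1 for level, topic in requests if planned_level.get(topic) == level)
    print(f"{matches} of {len(requests)} requests match the planned level "
          f"({100 * matches / len(requests):.0f}% congruence)")

    # Flag topics requested at least two levels away from where the model placed them
    for level, topic in requests:
        expected = planned_level.get(topic)
        if expected is not None and abs(expected - level) >= 2:
            print(f"Follow up: '{topic}' (planned for Level {expected}) "
                  f"was requested by a Level {level} teacher")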

Professional Development Topics/Themes by Growth Level

  • Personal Productivity: Utilizing E-mail; Utilizing the WWW; Word Processing Basics; Spread Sheet Basics; Using a Search Engine; Form Letters; Web Resources for Personal Use
  • Professional Productivity: Content Specific Web Sites; Power Point for Learning; Professional Listserv; Free Professional Web Resources; Advanced Web Search Engine; Building a Course Web Site
  • Professional Enrichment: Building & Using Web Quests; Curriculum Mapping; Ethics & Privacy; Gender Equity; Ergonomics; Learning Contests on the WWW; Technology Supported Assessment; Asynchronous Learning Support; Advanced Course Web Site
  • Paradigm Shift: Project Based Learning; Performance Based Assessment; Case Models (e.g., ICON); Student Centered Instruction; Advanced Tech Based Assessment; Advanced Asynchronous Learning Support
  • Vision: Conducting Action Research; Electronic Collaboration; Web Based Learning

After reflecting on these topics and the anticipated growth, she has also articulated assumptions about what she expects as an outcome of the professional development activities. Since personal and professional productivity emerge together, these two categories have been combined. In preparation for measuring the impact of professional development offerings, she will use an initial needs assessment to capture technology use and perceptions about the value of technology in education. This needs assessment has two fundamental purposes: (1) to collect data on staff technology use; and (2) to gather data on faculty perceptions of technology. The data can also be used to gauge the constructivist orientation of teachers; several recent studies have confirmed the connection between constructivism and technology access and use (see Becker, 1998; Rockman, 1998). Her initial objective was to research what is known about technology use, beliefs, and practices in order to establish initial assumptions and hypotheses. With the needs assessment data, she can test those assumptions as well as generate new ones. She will then compare the pre-assessment with a post-assessment at the end of the semester and again at the end of the year, while also collecting frequent data for formative decision-making.

Expected Minimum Product & Service Completion by Cohort Group

  • Level I/II. Participant perspective (target): Personal & Professional Productivity. Role of teacher: Learner & Adopter. Focus of products: Support Individual & Instruction. Minimum production (objective): use of web based resources to support instruction; basic course web site; increased use of personal IT tools.
  • Level III. Participant perspective (target): Professional Enrichment. Role of teacher: Adapter. Focus of products: Enrich Instruction. Minimum production (objective): participate in web contest; participate in content collaboration with a school or organization; advanced web site.
  • Level IV. Participant perspective (target): Paradigm Shift. Role of teacher: Refiner. Focus of products: Support Assessment. Minimum production (objective): introduction of performance based assessment (PBA); introduction of project based learning (PBL); increased use of asynchronous learning support.
  • Level V. Participant perspective (target): Vision. Role of teacher: Leader. Focus of products: Student-Centered Learning. Minimum production (objective): increased use of electronic collaboration; action research project; enhanced use of web based learning; virtual collaboration; embedded use of asynchronous learning support, PBL, and PBA.

By using a pre/post design, our hypothetical technology director can assess the overall impact of her program for the year.
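
A minimal sketch of that pre/post comparison appears below. It assumes two hypothetical exports (pre_assessment.csv and post_assessment.csv) that share a respondent_code column and 1-5 ratings for the same survey items; neither the file names nor the layout comes from this article.

    import csv

    # Assumed layout: each file has a respondent_code column plus a 1-5 rating
    # for every survey item; only teachers present in both files are compared.
    def load(path):
        with open(path, newline="") as f:
            return {row["respondent_code"]: row for row in csv.DictReader(f)}

    def average_change(pre_path="pre_assessment.csv", post_path="post_assessment.csv"):
        pre, post = load(pre_path), load(post_path)
        shared = sorted(set(pre) & set(post))
        if not shared:
            print("No respondents appear in both assessments.")
            return
        items = [k for k in next(iter(pre.values())) if k != "respondent_code"]
        for item in items:
            diffs = [int(post[r][item]) - int(pre[r][item]) for r in shared]
            print(f"{item}: mean change {sum(diffs) / len(diffs):+.2f} (n={len(diffs)})")

    if __name__ == "__main__":
        average_change()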

Gathering Data: Free Online Data Collection Tools

At this stage of the cycle, you have articulated a purpose and identified what information is essential to collect. You have also characterized how the information will be used and shared the reasons for collecting it with relevant participants. In addition, you have considered analysis techniques and hypothesis generation. Now, and only now, are you ready to seek out appropriate educational technology tools.

Many web-based tools have emerged in the past several years to support school district efforts to manage data smog. The following tools are available for free and offer schools a variety of options for collecting and analyzing staff opinions and current practices.

A variety of tools are now available to teachers and administrators to improve the systemic collection and analysis of data. I offer a few here in the hope that you can utilize these to support your good questions. As you review these tools, consider how you could apply these at various stages in the process.

Tech Builder
http://compaq.edmin.com/
 
Profiler: Online Collaboration Tool
http://profiler.scrtec.org/
 
School Staff Education Technology Needs Assessment
http://chadwick-k12.com/ssetna
 
Milken Education Technology Discrepancy Tool
http://www.mff.org/edtech/discrepancy/
 
Learning Profile Tool
http://www.ncrtec.org/capacity/profile/profile.htm
 
Technology in Education Snapshot Survey
http://www4.snapshotsurvey.org/index.html

Conclusion

Managing information and data requires a reflective and strategic process - one that becomes a continual part of the cultural norm of an educational institution. We, as education professionals, can no longer operate in data-free zones; rather, we must engage in ongoing, reflective data collection and analysis. I have always maintained that professionalism requires being comfortable with change, because perpetual self-reflection requires evolving and growing as an individual and as an organization. Becoming more comfortable with change and managing data requires a systemic process:

  • Begin with questions and assumptions based on research
  • Consider cultural norms (local area research)
  • Consider analysis & data tools according to hypotheses and culture
  • Conduct needs assessment and analyze for gaps
  • Collect formative on-going data and verify/modify assumptions
  • Revise policies and practices
  • Begin again — continual process

Engage in this process and you will begin to manage the data smog more effectively and strategically.

References

Becker, H. J. (1998). Teaching, Learning, and Computing: 1998. Irvine, CA: Center for Research on Information Technology and Organizations. [On-line]. Available at: http://www.crito.uci.edu/tlc/html/findings.html

Hall, G.E. & Hord, S.M. (1987). Change in schools: Facilitating the process. Albany, NY: State University of New York.

Kongshem, L. (1999, September). Smart Data: Mining the School District Data Warehouse. Electronic School. [On-line]. Available at: http://www.electronic-school.com/199909/0999f1.html

McKenzie, J. (1998). Emerging from the Smog: Making Technology Assessment Work for Schools. From Now On, 7 (5). [On-line]. Available at: http://www.fno.org/feb98/cov98feb.html

Rockman et al. (1998). The Laptop Program. San Francisco, CA: Author. [On-line]. Available at: http://rockman.com/projects/laptop/

Rogers, D. L. (2000). A paradigm shift: Technology integration for higher education in the new millennium. Education Technology Review, 13, 19-27.

Rogers, E. (1996). The diffusion of innovations (4th ed.). New York: Free Press.

Sherry, L., Billig, S., Tavalin, F. & Gibson, D. (2000, February). New insights on technology adoption in schools. The Journal. [On-line]. Available at: http://www.thejournal.com/magazine/vault/A2640.cfm

Slowinski, J. (2000, September/October). The gap between preparation and reality in training teachers to use technology. Technology Horizon. [On-line]. Available at: http://horizon.unc.edu/TS/commentary/2000-09.asp

Slowinski, J. (1999). Internet in America's Schools: Potential Catalysts for Policy Makers. First Monday, 4 (1). [On-line]. Available at: http://www.firstmonday.dk/issues/issue4_1/slowinski/index.html

Williams, C. (2000). Internet Access in U.S. Public Schools and Classrooms: 1994-99 (NCES 2000-086). U.S. Department of Education. Washington, DC: National Center for Education Statistics. [Online]. Available at: http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2000086


Credits: The photographs were shot by Jamie McKenzie.

Copyright Policy: Materials published in From Now On may be duplicated in hard copy format if unchanged in format and content for educational, nonprofit school district and university use only and may also be sent from person to person by e-mail. This copyright statement must be included. All other uses, transmissions and duplications are prohibited unless permission is granted expressly.
