21ST CENTURY SKILLS DEVELOPMENT: LEARNING IN DIGITAL
COMMUNITIES: TECHNOLOGY AND COLLABORATION
by
BARBARA J. SHORT
A DISSERTATION
Presented to the Department of Educational Methodology,
Policy, and Leadership
and the Graduate School of the University of Oregon
in partial fulfillment of the requirements
for the degree of
Doctor of Education
June 2012
DISSERTATION APPROVAL PAGE
Student: Barbara J. Short
Title: 21st Century Skills Development: Learning in Digital Communities: Technology
and Collaboration
This dissertation has been accepted and approved in partial fulfillment of the
requirements for the Doctor of Education degree in the Department of Educational
Methodology, Policy, and Leadership by:
Dr. Kathleen Scalise Chairperson
Dr. Gerald Tindal Member
Dr. Lynne Anderson-Inman Member
Dr. Kathie Carpenter Outside Member
and
Kimberly Andrews Espy Vice President for Research & Innovation/Dean of the
Graduate School
Original approval signatures are on file with the University of Oregon Graduate School.
Degree awarded June 2012
© 2012 Barbara J. Short
This work is licensed under a Creative Commons
Attribution-NonCommercial-NoDerivs (United States) License.
DISSERTATION ABSTRACT
Barbara J. Short
Doctor of Education
Department of Educational Methodology, Policy, and Leadership
June 2012
Title: 21st Century Skills Development: Learning in Digital Communities: Technology
and Collaboration
This study examines some aspects of student performance in the 21st century
skills of Information and Communication Technology (ICT) Literacy and collaboration. In this
project, extant data from the Assessment and Teaching for 21st Century Skills project
(ATC21S) are examined. ATC21S is a collaborative effort among educational
agencies in six countries, universities, educational research groups, high tech innovators
and the multinational corporations Cisco, Intel and Microsoft. ATC21S demonstration
tasks explore the use of digital literacy and collaborative problem solving constructs in
educational assessment. My research investigates evidence from cognitive laboratories
and pilots administered in one of the ATC21S demonstration scenarios, a collaborative
mathematics/science task called “Global Collaboration Contest: Arctic Trek.” Using
both quantitative and qualitative methods, I analyze student work samples. Specifically, I
(i) develop a rubric as a measurement tool to evaluate the student assessment artifact
“Arctic Trek Notebook” for (a) generalized patterns and (b) trends that may indicate skill
development in collaborative learning in a digital environment and (ii) conduct
descriptive studies among the variables of student age and student notebook
characteristics. Results are intended to inform instructional leaders on estimates of
student ability in virtual collaboration and to make suggestions for instructional design
and professional development for online collaborative learning assessment tasks in K-12
education.
CURRICULUM VITAE
NAME OF AUTHOR: Barbara J. Short
GRADUATE AND UNDERGRADUATE SCHOOLS ATTENDED:
University of Oregon, Eugene
Prescott College, Prescott, AZ
Humboldt State University, Arcata, CA
DEGREES AWARDED:
Doctor of Education, Educational Leadership, 2012, University of Oregon
Master of Arts, Counseling Psychology, 2000, Prescott College
Bachelor of Arts, Child Development, 1990, Humboldt State University
AREAS OF SPECIAL INTEREST:
Social-Emotional Learning
Inquiry-Based Instruction
American Indian Education
Teacher Preparation
PROFESSIONAL EXPERIENCE:
Academic Advisor, University of Oregon, 2009-2012
School Counselor, Happy Camp High School, Happy Camp, CA, 2005-2009
Associate Faculty, Early Childhood Education, College of the Redwoods, Eureka, CA, 2004-2006
Teacher, Klamath Trinity Unified School District, Hoopa, CA, 2001-2004
Youth and Family Therapist, Hualapai Wellness Court and Hualapai Health Department, Peach Springs, AZ, 1998-2001
Teacher, Peach Springs Elementary, Peach Springs, AZ, 1996-1997
Teacher, Forks of Salmon Elementary, Forks of Salmon, CA, 1994-1996
Teacher, Happy Camp Elementary, Happy Camp, CA, 1991-1994
GRANTS, AWARDS, AND HONORS:
Robert and Ruth Sylwester Scholarship, University of Oregon, 2011-2012
Jerome Colonna Leadership Award, University of Oregon, 2010-2011
Woman of the Year, Humboldt State University, 1990
ACKNOWLEDGMENTS
I wish to acknowledge my Committee members for their service to academia and
assistance in the development of my research and presentation skills: Dr. Jerry Tindal,
Dr. Lynne Anderson-Inman, Dr. Kathie Carpenter, Dr. Karen Sprague, and chair Dr.
Kathleen Scalise; thank you so much for your dedication and patience.
I am deeply appreciative of Dr. Scalise for her mentorship throughout my doctoral
studies. Dr. Scalise has encouraged academic exploration, synthesis of seemingly diverse
concepts in education, and application of innovative thought and methods to practice.
She accepted me as an additional advisee, provided the data for my study, and offered much
assistance throughout the dissertation process; I consider myself extremely fortunate to
have the opportunity to work under her guidance and benefit from her expertise.
The Office of Academic Advising provided me with three years of fellowship to
support my studies, and a level of flexibility and consideration for my academic and
family needs that is unparalleled. My father and daughters provided unconditional love
in spite of my near constant academic deadline-driven insanity; their belief in me is
humbling and sustaining. A strong family and good friends make all the difference.
The many blessings I enjoy make so much possible.
For the students, who come to us open and trusting; much of their development and self-
concept is influenced by the time they spend in school. May their experiences in school
be meaningful, interesting, exciting, thought provoking, and inspiring; help them develop
their unique strengths and vision; and reinforce their belief in themselves as important
and capable people with much to offer the world. May they love learning and see endless
possibilities. This effort is one small step toward that vision.
TABLE OF CONTENTS
Chapter Page
I. INTRODUCTION.......................................................................................................... 1
21st Century Skills as Educational Goals ............................................................... 3
Common Core State Standards and 21st Century Skills ................................. 5
The ATC21S Project ...................................................................................... 7
Purpose of Study and Research Questions ..................................................... 8
Literature Review ................................................................................................ 11
Search Terms and Systems ........................................................................... 11
Search Results............................................................................................... 12
Criteria for Inclusion .................................................................................... 14
An Environment of Global Social and Economic Change .................................. 14
New Workforce Requirements Beyond Basic Skills.................................... 15
Socio-Cultural Change and Educational Technology Policy .............................. 17
Technological Enhancements and Digital Literacy...................................... 19
Technology Use Among Youth Aged 8-18 .................................................. 21
Youth and New Social Media................................................................ 22
The Education-Technology Gap................................................................... 23
Emergence, Definition and Development of 21st Century Skills......................... 24
New Media Strengths, Skills and Behaviors ................................................ 27
21st Century Skills and Learning Theory...................................................... 28
Collaborative Learning ................................................................................. 29
Benefits of Collaborative Learning ....................................................... 32
Use of Roles to Facilitate Group Learning............................................ 34
Computer-Supported Collaborative Learning .............................................. 35
Implications of Virtual Collaboration .................................................. 36
Evaluation of Digital Collaboration ...................................................... 37
Social-Emotional Learning........................................................................... 39
Professional Development for Collaboration in a Digital Environment.............. 40
Collaborative Learning ................................................................................. 42
Academic Social-Emotional Learning.......................................................... 43
Technology ................................................................................................... 45
NETS Essential Conditions ................................................................... 46
New Models of Professional Development .................................................. 47
Performance Assessments for 21st Century Skills ............................................... 48
Barriers to the Implementation of Performance Assessments...................... 51
II. METHODS ................................................................................................................. 53
Design .................................................................................................................. 54
Cross-Case Analysis ..................................................................................... 55
Overview of Methodology for Case-Oriented Strategies...................... 56
Sample ................................................................................................................. 57
Sampling Procedure...................................................................................... 58
Student Characteristics ................................................................................. 58
Sample Assessment Frame: ATC21S Cognitive Labs and Pilot Trials............... 59
Tools for Working ........................................................................................ 59
Consumer............................................................................................... 59
Producer................................................................................................. 60
Ways of Working.......................................................................................... 60
Social Capital......................................................................................... 60
Intellectual Capital................................................................................. 60
Multiple Opportunities to Demonstrate Skills.............................................. 60
ATC21S Scenario 2: “Global Collaboration Contest: Arctic Trek” ................... 62
Notebook Specifics....................................................................................... 68
Assessment Administration Instructions ...................................................... 71
Implementation of Phases .................................................................................... 72
Phase 1: Review and Coding of Student Work Samples.............................. 72
Body of Work Method........................................................................... 73
Discourse Analysis ................................................................................ 74
Phase 2: Rubric Development ...................................................................... 77
Emerging Rubric Development............................................................. 78
Expert Review ....................................................................................... 78
Range Finding ....................................................................................... 78
Second-Stage Rubric Development....................................................... 78
Expert Review ....................................................................................... 79
Third-Stage Rubric Development.......................................................... 79
Phase 3: Scoring of Student Work Through Within-Case Analysis............. 79
Phase 4: Examination of Trends and Patterns Through Cross-Case
Analysis ........................................................................................................ 80
Phase 5: Examination of Skill Areas for Instructional Design .................... 80
Cognitive Task Analysis........................................................................ 81
Backward Design................................................................................... 83
Phase 6: Investigate Potential Professional Development Strategies ........... 85
Analysis by Phase ................................................................................................ 86
III. RESULTS .................................................................................................................. 88
Case Characteristics ............................................................................................. 88
Results of Phase 1: Review and Data Coding of Student Work Samples ........... 89
Discourse Analysis ....................................................................................... 89
Unit of Discourse Analysis.................................................................... 90
Coded Checklist of Discourse Categories ............................................. 91
Separation of Checklist and Rubric Traits............................................. 97
Results of Phase 2: Rubric Development ............................................................ 97
Traits ............................................................................................................. 99
Expert Review ....................................................................................... 99
Interactive Regulated Learning ........................................................... 100
Preliminary Work Sample Assessment and Range Finding ....................... 102
Preliminary Work Sample Assessment for Significant Traits.................... 102
Six Traits Digital Collaboration Rubric...................................................... 103
Initial Assessment of Student Work, Using the Six Traits Rubric ............. 103
Evaluation of Six Traits Digital Collaboration Rubric Use for
Student Work Assessment .......................................................................... 104
Expert Review of Six Traits Digital Collaboration Rubric ........................ 104
Rubric Revision: 3+3 Six Traits Digital Collaboration Rubric .................. 104
Results of Phase 3: Assess Student Work and Evaluate Rubric ........................ 105
Secondary Assessment of Student Work Using the 3+3 Rubric ................ 107
Assessment of Student Work Descriptive Statistics................................... 108
Scores by Trait............................................................................................ 110
Scores by Group ......................................................................................... 111
Scores by Age............................................................................................. 111
Inter-Rater Reliability................................................................................. 112
Inter-Rater Comparison by Group....................................................... 113
Qualitative Review From Educator Raters ................................................. 116
Rubric Utility....................................................................................... 119
Qualitative Review From Researchers in the Field .................................... 120
Results of Phase 4: Examine Categorical Patterns and Trends in Student
Work .................................................................................................................. 121
Diagnostic Summary of Student Work Samples: Patterns and Trends ...... 122
Did Not Use Collaborative Tool Shared Document............................ 122
Used the Collaborative Tool Shared Document But Did Not
Progress With Task.............................................................................. 123
Used the Collaborative Tool Shared Document and Progressed
With Task ............................................................................................ 125
Results of Phase 5: Instructional Design ........................................................... 129
Technology Skills ....................................................................................... 130
Academic Social-Emotional Skills ............................................................. 131
Cooperative Learning Strategies and Skills................................................ 131
Shared Document Skills ...................................................................... 131
Composite Domains Supporting Collaboration in a Digital
Environment ............................................................................................... 132
Results of Phase 6: Professional Development for Digital Collaboration......... 135
Analysis of Results and Validity Considerations .............................................. 138
Construct Validity....................................................................................... 139
Domain Theory and Structure ............................................................. 140
Selection Bias and History ......................................................................... 142
External Validity......................................................................................... 143
Statistical Conclusion Validity ................................................................... 143
IV. DISCUSSION.......................................................................................................... 145
Research Question Summary............................................................................. 146
Research Question 1 ................................................................................... 146
Research Question 2 ................................................................................... 148
Research Question 3 ................................................................................... 150
Contributions to Research and the Body of Knowledge.................................... 151
Areas for Future Research ................................................................................. 156
Limitations of the Study .................................................................................... 159
Limitations of the Sample........................................................................... 160
Limitations of the Measures ....................................................................... 161
Arctic Trek Content............................................................................. 161
3+3 Six Traits Digital Collaboration Rubric ....................................... 162
Limitations to Internal Validity .................................................................. 163
Limitations to External Validity ................................................................. 164
Implications ....................................................................................................... 164
Implications for Instructional Design ......................................................... 165
Implications for Professional Development ............................................... 166
Curriculum Development for Teaching Digital Collaboration................... 167
Lesson Planning, Refinement and Alignment ..................................... 169
Conclusion ......................................................................................................... 170
APPENDICES
A. BLUEPRINT CONSTRUCT CHECKLISTS............................................... 171
B. SUMMARY TABLE OF ATC21S SCENARIO BLUEPRINT DATA
COLLECTION .............................................................................................. 176
C. KSAVE MODELS ........................................................................................ 178
D. SIX TRAITS DIGITAL COLLABORATION CHECKLIST...................... 183
E. SIX TRAITS DIGITAL COLLABORATION RUBRIC ............................. 185
F. 3+3 SIX TRAITS DIGITAL COLLABORATION RUBRIC ...................... 187
G. ASYNCHRONOUS INTER-RATER DIRECTIONS.................................. 189
H. SAMPLE STUDENT NOTEBOOK: HIGH SCORING.............................. 193
I. SAMPLE STUDENT NOTEBOOK: LOW SCORING ................................ 195
J. ISTE STANDARDS FOR TECHNOLOGY INSTRUCTION...................... 197
K. ISTE ESSENTIAL CONDITIONS FOR TECHNOLOGY IN
EDUCATION................................................................................................ 200
L. TEST ADMINISTRATOR MANUAL FOR ARCTIC TREK
ASSESSMENT.............................................................................................. 202
M. SAMPLES OF CODED NOTEBOOKS...................................................... 215
N. SURVEY OF EDUCATORS ....................................................................... 218
O. ACADEMIC SOCIAL EMOTIONAL LEARNING STRANDS ................ 221
P. TECHNICAL AND INTER-RATER STUDIES .......................................... 223
REFERENCES CITED.................................................................................................. 228
LIST OF FIGURES
Figure Page
1. American Management Association 2010 Critical Skills Survey ......................... 17
2. Opening Screen Shot for Arctic Trek Assessment ................................................ 64
3. Screen Shot From Arctic Trek Assessment: Assigning Roles and Tasks.............. 65
4. Screen Shot of Clue Two for the Arctic Trek Assessment .................................... 66
5. Screen Shot From Arctic Trek: Polar Bear Population Probability Task .............. 67
6. Screen Shot From Arctic Trek: Creating a Population Graph ............................... 68
7. Screen Shot From Arctic Trek: Notebook Link..................................................... 69
8. Notebook Instructions From Arctic Trek Assessment Notebook.......................... 69
9. Screen Shot From Arctic Trek: Notebook Use During Assessment...................... 70
10. Sample Assessment Instructions From Arctic Trek Assessment........................... 72
11. Notebook Sample Illustrating Thread of Discourse Between Students................. 93
12. Two Notebook Samples Display Student Role Conflict or Confusion ................. 94
13. Notebook Sample Displaying Visual Organization............................................... 96
14. Collaborative Learning Stem and Leaf Plots ......................................................... 110
15. Collaborative Process and Product Scores by Group ............................................ 112
16. Asynchronous Inter-rater Comparisons for Case 33.............................................. 114
17. Inter-Rater Comparison of Traits 4 and 6 .............................................................. 115
18. Notebook Sample Displaying Trends in Visual Organization............................... 124
19. Notebook Sample Reflecting Off-Task Behavior.................................................. 125
20. Elements for Teaching Collaborative Learning in a Digital Environment ............ 130
21. Notebook Sample Illustrating Lack of Reciprocal Discourse ............................... 133
22. Curricular Components of Collaborative Learning in a Digital Environment ...... 134
23. Collaboration in a Digital Environment Professional Development Model
With Content and Format. .................................................................................... 168
24. Digital Collaboration Professional Development Process..................................... 169
LIST OF TABLES
Table Page
1. Teen Internet Access and Social Media Use in the U.S. ....................................... 4
2. Literature Search Terms and Systems ................................................................... 13
3. Reported Technology Use Among Children 8 - 18 ............................................... 22
4. 21st Century Skills by Framework ......................................................................... 25
5. Glossary of Terms for Types of Group Work Discussed ...................................... 30
6. Phases of Research................................................................................................. 53
7. Discourse Analysis Sample Array of Categories................................................... 75
8. Analyses by Phase and Research Question............................................................ 86
9. Case Characteristics ............................................................................................... 89
10. Phase 1 Processes and Outcomes........................................................................... 90
11. Phase 2 Processes and Outcomes........................................................................... 97
12. Overall Notebook Scores for Six Traits Digital Collaboration Rubric.................. 103
13. Phase 3 Processes and Outcomes........................................................................... 106
14. Overall Scores for Process and Product Dimensions on 3+3 Six Traits Digital
Collaboration Rubric............................................................................................. 108
15. 3+3 Six Trait Digital Collaboration Rubric Combined Total Scores by Group .... 109
16. Notebook Scores By Age Group ........................................................................... 113
17. Phase 4 Processes and Outcomes........................................................................... 122
18. Phase 5 Processes and Outcomes........................................................................... 129
19. Phase 6 Processes and Outcomes........................................................................... 136
20. Survey Results: Professional Development in Technology and Collaboration ..... 137
CHAPTER I
INTRODUCTION
The late 20th and early 21st centuries have seen unprecedented global sociocultural
change driven by advances in technology. Continuing expansion in the use of technology
for all facets of society is changing the world economy and social structures. Economists
cite the influences of globalization on the U.S. labor market, including an increased need
for workers with expert thinking, metacognition, problem solving, and complex
communication skills (Levy & Murnane, 2007).
Such workplace competencies are driving a growing body of research, sometimes
described as 21st century skill development. The Partnership for 21st Century
Skills (2003) defines this as “knowing how to use knowledge and skills in the context of
modern life” (p. 6). More specific definitions of 21st century skills are emerging from a
variety of sources in business, education and government. In order to address the myriad
of definitions and provide some commonality, the project involved in this dissertation
research, Assessment and Teaching for 21st Century Skills project (ATC21S), arrived at
a model framework for such skills by assembling an international group of experts to
examine and compare curriculum and assessment frameworks for 21st century skills that
have been developed around the world in recent years (Griffin, McGaw, & Care, 2012).
Frameworks examined from more than a dozen different organizations included
the U.S. National Academy of Sciences and the International Society for Technology in
Education (ISTE), as well as the Organization for Economic Cooperation and
Development (OECD) and United Nations. Ten overarching skills that spanned
many frameworks were identified by ATC21S as typifying the skills necessary for the 21st
century. The ten skills were grouped into four areas: ways of thinking, ways of working,
tools for working, and living in the world (Binkley et al., 2012). Demonstrations by the
project are focusing on the assessment and development of skills in two areas, as
examples of what can be done to assess 21st century skill building:
• Ways of working: communication and collaboration (or teamwork).
• Tools for working: information literacy and ICT literacy.
In ATC21S, researchers, cognitive scientists, measurement professionals,
technology leaders and policy scholars have come together to facilitate the integration of
21st century skills in K-12 systems through the creation of evidence-centered design
performance assessments for formative purposes; technology-based tools for scaffolding
metacognition, social networking, collaborative participation, and semantic analysis;
developmental frameworks and progressions; and models for enhancing domain
knowledge through infusion of 21st century skills. The goal is to demonstrate assessment
and learning models that define 21st century skills.
This dissertation project examines student performance from an ATC21S research
project in one area of the new framework, Information and Communications Technology
Literacy, or ICT Literacy. The purpose of the study and research questions will be
introduced in an upcoming section. But first, in order to situate the purpose of the study,
21st century skills as educational goals and the relationship of such skills and abilities
with the new U.S. Common Core standards will be briefly explored in the following two
sections.
21st Century Skills as Educational Goals
The recent infusion of Web 2.0 media, which supports access to creation, production,
and interconnectivity, has further transformed the global cultural ecology by
enhancing global communication structures, reforming authoritative knowledge,
restructuring the economy, and organizing political change through mass participation
supported by social media (Dede, 2009; Ito et al., 2008). 21st century skills such as ICT
Literacy are identified as crucial to a knowledge-based economy and for the innovation
necessary to meet increasing global challenges including climate change, sustainable
food systems, medical advances and economic structures (Balistreri et al., 2011; Wagner,
2008). The benefits of having a society competent in 21st century skills may include
enhancing productivity and global competitiveness, minimizing unemployment,
improving income distribution, supporting social cohesion, and facilitating individual
participation in democratic processes (Organization for Economic Cooperation and
Development, 2005; World Bank, 2003).
Today’s youth, often described as digital natives, were born into a technology-
infused culture, and have spent their formative years with access to social media
(Prensky, 2001). In 2000, 17 million Americans aged 12-17 used the Internet (Lenhart,
Purcell, Smith, & Zickuhr, 2010; Rideout, Foehr, & Roberts, 2010). In 2005 this number
rose to 21 million or 87%; and by 2009, 93% of teens used the Internet (Lenhart et al.,
2010). Table 1 describes current patterns of technology and social media use among
teens.
Table 1
Teen Internet Access and Social Media Use in the U.S.
Technology Behaviors % of Teens in U.S.
Online daily 63%
Use social network sites 73%
Own a computer 68%
Own an mp3 player 79%
Own a portable gaming device 51%
Use handheld device for Internet access 25%
Access the Internet wirelessly 55%
Note. Adapted from “Social media and young adults,” by A. Lenhart, K. Purcell, A. Smith, and
K. Zickuhr, 2010, for the Pew Internet & American Life Project.
In a 2008 survey of schools across all 50 states of the U.S., 100% of schools
examined had one or more instructional computers with Internet access and an average
student-computer ratio of 3:1. About 97% of the schools had one or more instructional
computers with Internet access directly in the classroom, and 58% of the schools had
laptops on mobile carts for shared use (Gray, Thomas, & Lewis, 2010).
While access to computers in schools is rising, formally educating students
within school settings and via standards-based approaches for the development of 21st
century skills depends on many other factors as well. These include the ability to
adequately define the specific and generalized skills and constructs of interest, to identify
areas of the curriculum where such skills could be integrated, and then to create curricular
pathways to teach these skills and approaches to assess them accurately. Involved in all of
this is the knowledge both teachers and school leaders need in order to support student
learning and appropriately advance educational goals in these areas. This will be
discussed in more detail in subsequent sections.
Common Core State Standards and 21st Century Skills
The new Common Core State Standards (CCSS), now adopted by forty-five states
and the District of Columbia, aim to provide shared, clear, and consistent expectations for
learning, defining what students should understand and be able to do at each grade level,
with an emphasis on college and career readiness. While all states had standards prior to
the adoption of the Common Core, those standards were often vastly different from state
to state, sometimes leading to differences in achievement levels across state lines; the
CCSS are one step toward a national model of education. The Common Core State Standards
Initiative was led by the states, supported by the National Governors Association Center
for Best Practices (NGA) and the Council of Chief State School Officers (CCSSO). The
standards were developed in collaboration with teachers, school administrators, content
experts and researchers in education, and the development process included feedback on
draft standards from a variety of K-12 education stakeholders, including teachers,
parents, the business community and civil rights advocates (National Governors
Association Center for Best Practices, Council of Chief State School Officers, 2010).
21st century skills, as described in frameworks listed in Chapter I, are supported
both implicitly and explicitly in the CCSS. The Common Core Standards are organized
by English Language Arts (ELA) and Mathematics. ELA, for instance, includes literacy
standards for history, social sciences, science and technical subjects, addressed through
both reading and writing strands. The 21st century skills of critical thinking, collaboration,
communication, and information literacy are supported, for example, in the following
ELA standards: under the Writing Standards for Literacy in History/Social Studies, Science,
and Technical Subjects 6–8, Standard 8 asks students to:
Gather relevant information from multiple print and digital sources, using
search terms effectively; assess the credibility and accuracy of each source;
and quote or paraphrase the data and conclusions of others while avoiding
plagiarism and following a standard format for citation. (National Governors
Association Center for Best Practices, Council of Chief State School
Officers, 2010, p. 66)
In the Speaking and Listening Standards for K-5, Comprehension and Collaboration,
Standard 1 asks first grade students to:
Participate in collaborative conversations with diverse partners about grade
1 topics and texts with peers and adults in small and larger groups: a. Follow
agreed-upon rules for discussions (e.g., listening to others with care,
speaking one at a time about the topics and texts under discussion). b. Build
on others’ talk in conversations by responding to the comments of others
through multiple exchanges, and c. Ask questions to clear up any confusion
about the topics and texts under discussion. (National Governors
Association Center for Best Practices, Council of Chief State School
Officers, 2010, p. 23)
21st century skills are integrated throughout the CCSS for Mathematics with overarching
Mathematical Practices that cross grade levels. These overarching practices include
asking students to (a) make sense of problems and persevere in solving them; (b) reason
abstractly and quantitatively; and (c) construct viable arguments and critique the
reasoning of others (National Governors Association Center for Best Practices, Council
of Chief State School Officers, 2010). Outside of the CCSS framework, a number of
states have adopted 21st century skills standards that include problem solving,
communication, using technology, working in teams collaboratively, making multi-
disciplinary connections, using media for learning purposes, engaging in lifelong
learning, using complex thinking, ethical thinking, and responsible citizenship (Dede,
2009; P21, 2008).
The ATC21S Project
This dissertation project and its associated research questions are intended to
contribute to the research base on student abilities in the 21st century skills of virtual
collaboration through Information and Communication Technology (ICT) literacy, also
sometimes described as digital literacy in the United States. Potential implications as
such research begins to accumulate include informing instruction and helping to guide
leadership in formulating teacher professional development and student support in
collaborative learning and digital literacy in K-12 education.
Given the wide range of institutions calling for improving student skills described
in 21st century frameworks, the need exists to develop new research-based pedagogical
strategies that support 21st century skills. This includes creating and piloting assessments
aligned with integrating 21st century skills into teaching and learning (Balistreri et al.,
2011; P21, 2003), which is a focus of this project.
This study uses extant data from the Assessment and Teaching for 21st Century
Skills project (ATC21S). ATC21S is a collaborative effort among international
ministries of education in six countries, universities, educational research groups, high
tech innovators and the multinational corporations Cisco, Intel and Microsoft. ATC21S
demonstration tasks explore the use of digital literacy and collaborative problem-solving
constructs in educational assessment. Using data from cognitive laboratories and pilot
assessments administered in 2011, I analyze student work samples from a collaborative
mathematics/science task called “Global Collaboration Contest: Arctic Trek” developed
by ATC21S as a demonstration scenario to assess information and communication
literacy.
Purpose of Study and Research Questions
Teachers in many settings are being encouraged to adopt tools of digital
collaboration and use them in classroom instructional settings, but little research is
available to help teachers understand how to evaluate and assess such work or what
instructional trends to look for as they integrate participatory media tools into their
classrooms. The purpose of this study is to examine student work samples from a
collaborative task in a digital environment and describe patterns or trends of collaborative
skill evident in the body of work to be reviewed. The intent of this study is to contribute
to research that may inform practice for instructional and assessment strategies in this
emerging area of collaboration in a digital environment. This study will further the
understanding of the cognitive and social processes involved in collaborative digital
literacy skills for students at ages 11, 13 and 15. The results of the study may also help
inform instructional leaders on conceptions of student work in virtual collaboration and
guide leadership in formulating instructional design to support collaborative learning in
K-12 education.
In this project, I develop a rubric as a measurement tool to evaluate the student
assessment artifact “Arctic Trek Notebook” for generalized patterns of skill development
and to investigate trends in collaborative learning through a digital environment. I
conduct descriptive studies among the variables of student age and student notebook
characteristics.
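As a concrete illustration of the kind of descriptive study intended here, the brief sketch below summarizes combined notebook rubric totals by age group (11, 13, and 15). The scores, group sizes, and variable names are hypothetical, invented for illustration only, and are not drawn from the ATC21S data; the actual analyses are described in the Methods and Results chapters.

# Illustrative sketch only: hypothetical rubric totals grouped by student age.
# These values are invented for demonstration and are not the ATC21S study data.
from statistics import mean, median, pstdev

hypothetical_scores_by_age = {
    11: [4, 6, 5, 9, 7],
    13: [8, 7, 11, 10, 6],
    15: [12, 9, 13, 11, 14],
}

print("Age   n   Mean  Median    SD  Range")
for age, scores in sorted(hypothetical_scores_by_age.items()):
    # Basic descriptive statistics for each age-related cohort.
    print(f"{age:>3} {len(scores):>3} {mean(scores):>6.2f} "
          f"{median(scores):>7.1f} {pstdev(scores):>5.2f}  "
          f"{min(scores)}-{max(scores)}")

A comparable tabulation of real notebook scores by age group would be one simple way to examine whether levels of notebook use cluster by age, as asked in the research questions that follow.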
My research questions are as follows:
RQ 1. Does the use of the artifact Arctic Trek collaborative Notebook fall into distinct
patterns that reflect levels of skill development or show trends in collaborative learning
through a digital environment?
1a. Can categorical patterns be identified?
1b. Can these patterns be seen as types of performances referenced by
collaboration literature, based on this data set?
RQ 2. Will descriptive analysis show that levels of Notebook use have a relationship with
student age, for this sample?
2a. Do data displays show patterns clustering by age?
2b. If patterns are evident in 2a, are there important trends to be seen in the age-
related patterns, such as whether more advanced digital collaboration patterns will be seen
for younger or older students, in this data set?
RQ 3. Given the results of analysis in RQ 1-2 above, do performance patterns identified
in the digital collaborative work products suggest connections to student instructional
support as examined through an instructional leadership focus?
Regarding RQs 1 and 2, my hypothesis is that I will find patterns that can be
described as trends, and that they will have a relationship with age. There is some
speculation in the field that the youngest age group may show the most advanced digital
collaboration skills because of greater exposure over more years, given the rapid pace of
technology growth. However, the literature to date suggests that students in the
11-year-old group would be expected to have more difficulty with collaboration, even
technology-based collaboration, due to maturity-related issues such as goal orientation;
lack of refined situational awareness; less mature patterns of social orientation; and
broader group orientation. Therefore, I hypothesize that the trend will show more success
with digital collaboration as age increases from the 11-year-old group to the 13-year-old
group, and that the 15-year-olds as a group will show the highest trait scores on the
rubric, though considerable variability within groups may be seen. Note that
these hypotheses are based on cross-sectional data only. Comparisons of patterns are for
select groups of the age-related cohorts. More about the sampling characteristics will be
described in the Methods chapter.
Regarding RQ 3, my approach will be to consider what connections can be made
between the trends from RQs 1-2 and the research literature regarding student
instructional support. RQ 3 subsumes many large questions about 21st century skill
dispositions, each clearly worthy of an entire research project in its own right. My
intention here is to begin documenting RQ 3 concerns associated with trends from
RQs 1-2, based on this data set, in order to establish a landscape for future work. This
will help underscore the instructional leadership concerns that need to be addressed to
support 21st century skill development. RQ 3 also points to the alignment of assessment
and instruction, which is particularly important in domains such as 21st century skills
that are beginning to appear in educational standards frameworks in many countries but
do not yet have an established formal instructional basis in many schools today.
Literature Review
Education is a comprehensive discipline, with a web of strands crossing many
areas. 21st century skills, ICT literacy, and collaboration are three such interdisciplinary
topics, each encompassing multiple skills. Each of these topics will be taken up in turn in
this literature review and discussed in the context of its related history, policy, and
practice.
While 21st century skills and ICT are rather new to the field of education,
collaboration has a long history of various iterations that are likely to inform such
practice in a digital and 21st century context. Moreover, 21st century skills, collaboration
and ICT have all been variously defined and described in the field of education. My
intent in exploring the literature was to cast a wide net with regard to how these topics
may be defined, named, and conceptualized, and to follow strands that presented depth in
developing the combination of collaboration and ICT, both within and outside of a stated
21st century skills context.
Search Terms and Systems
A review of the literature on 21st century skills was conducted June through
December 2011, using the UO library collection and online databases of Academic
Search Premier, Education Resources Information Center (ERIC), EBSCOHOST and
Google Scholar. Keywords searched included broad topics such as 21st century skills,
21st century skills in K-12, 21st century skills in education, 21st century skills for students
in K-12, collaboration skills for students in K-12, cooperative learning skills in K-12, and
performance assessments. Topics with a narrowed focus such as computer-supported
collaborative learning, computer-supported collaborative learning K-12, and ICT and
collaborative learning K-12 were generated through the broader literature, and pursued
further. A search of websites hosted by organizations that promote 21st Century Skills
was also conducted, including research-based sites, university-based sites, school sites,
and professional development sites.
The literature search included education policy and practices related to 21st
century skills, collaboration, and technology. Background information on these topics
was necessary to address current constructs regarding collaboration and technology use
in instructional settings, and how these skills tie to 21st century frameworks for student
learning. Information was also needed on current assessment practices in these areas.
As these curricular areas are relatively new to K-12 education, the search also included
literature on mechanisms of instructional design and professional development for
enhancing teacher efficacy in instructing in these areas. The search was initially inclusive
of educational research at any level before narrowing to K-12, in part because the
structure of most K-12 systems makes educational research difficult, and even findings at
the college level may be applicable to K-12 settings, depending on the content and format
of the study.
Search Results
The literature search revealed a variety of topics and sub-topics to be reviewed in
order to reflect material relevant to the various disciplines that intersect among the
constructs of 21st century skills, collaboration, and technology in education. Table 2
displays the search terms and search systems used to access literature, prior to narrowing
the searches with the criteria for inclusion described in the next section.
A large percentage of the articles retrieved described sets of ICT and 21st century
standards and the rationale for such standards. Many of the articles reviewed related to
curricular frameworks for technology and technology integration. Additional
resources were gleaned from reference lists and citations of articles reviewed through the
search process, as well as from websites promoting the inclusion of 21st century skills in
education.
Table 2
Literature Search Terms and Systems
Search Term Search Engine
21st century skills in education Google Scholar
21st century skills in K-12 Google Scholar
21st century skills and education EBSCOHOST
Collaboration skills for students in K-12 Google Scholar
Cooperative learning skills in K-12 Google Scholar
Computer-supported collaborative learning Google Scholar
Computer-supported collaborative learning Academic Search Premier
Computer-supported collaborative learning in K-12 Google Scholar
ICT skills in K-12 education Google Scholar
ICT and collaboration Google Scholar
Performance Assessments in K-12 Google Scholar
Performance Assessments for 21st century skills Google Scholar
Technology and global economic change Google Scholar
Technology use among youth in the United States Google Scholar
Criteria for Inclusion
Criteria for inclusion following the search described above included retaining
articles appearing in peer-reviewed journals specific to the description of 21st century
skills and associated instructional models; technology integration for 21st century skills;
collaboration and computer-supported collaborative learning; and assessment of 21st
century skills. Policy papers and reports authored by foundations and government
agencies were also reviewed. Another set of literature included book chapters from
theoretical, methodological, and professional development books and reference
handbooks.
Articles specific to outcomes of 21st century skills were not reviewed, although
sometimes outcomes were mentioned as a portion of the discussion in the citations
retained. Literature published from 1980 to 2012 was included. Relevant citations prior
to 1980 were sparse and less informative than the more current literature on 21st century
skill development, and so were not included. However, it should be noted that some of the
collaboration literature extends considerably earlier than this search period. Where
appropriate, major collaboration research contributions are mentioned in various portions
of this dissertation work regardless of this specific date framework.
An Environment of Global Social and Economic Change
The literature identified frequently addressed how both the economy and
workplace have changed in recent years, and cited impacts on workforce training and
education (Levy & Murnane, 2007; OECD, 2005). Global competition, the pace of
change, new organizational structures and the nature of how work is accomplished have
necessitated a workforce of flexible, collaborative, continuous learners with complex
cognitive skills (American Management Association, 2010). These skills are described
as necessary for workforce preparedness and business success in what is termed the
knowledge age, as producers attempt to hold a lead in innovation in world markets
by making knowledge the center of economic production. The knowledge economy
requires new skills, education and training, as the demand for highly skilled, digitally
literate workers increases and the demand for less skilled workers is reduced (World
Bank, 2003).
Globalization of the economy brings opportunities for expansion as well as
pressures from global competition. Team-based workplaces with flatter or decentralized
organization are increasingly dependent on personnel networks of cross-functional teams
and technology-related or technology-inclusive job descriptions (Stuart & Dahm, 1999).
Other trends impacting the workforce are smaller work units, knowledge networks, and
shorter product cycles that increase the need for innovation, resulting in the need for
workers to take more personal responsibility for their work (Huitt, 1999; World Bank,
2003).
New Workforce Requirements Beyond Basic Skills
Previous “industrial age” skills for success in the workplace were characterized
by punctuality and routine: following instructions; recognizing the authority of the
supervisor; using routine functions that remained constant over time; and working on
monotonous tasks for extended periods (Huitt, 1999; Secretary's Commission on
Achieving Necessary Skills, 1991; World Bank, 2003). Some scholars have noted that
the public school system and other institutions in our society prepared students under
those conditions (Huitt, 1999).
However, in the knowledge age, businesses need adaptable employees who are
lifelong learners and who can update or learn new skills independently; communicate
effectively; work independently; use critical thinking, problem-solving, and decision-
making skills; work in teams to manage information; and produce new knowledge (P21,
2008). Workers must show such flexibility in order to respond to the changing
knowledge and skill requirements of the workplace. Many jobs require abilities in multi-
tasking, project work, and self-management, with strong interpersonal skills and the
ability to negotiate and influence (P21, 2008; Stuart & Dahm, 1999).
As early as 1991, the U.S. Department of Labor report What Work Requires of Schools
(SCANS, 1991), outlined new thinking skills, personal qualities, and competencies for
schools to address beyond the foundational basic skills in order to prepare students for
new workplace skills in the 21st century. The Secretary's Commission on Achieving
Necessary Skills (SCANS) identified Thinking Skills, Personal Qualities and
Competencies necessary for success in the future economy as follows: creativity,
decision-making, problem-solving, and knowing how to learn; responsibility, sociability,
self-management and integrity; and information skills, interpersonal communication and
teamwork, systems thinking, and technology proficiency (SCANS, 1991).
From an organizational perspective, corporations and industry have also
participated in conversations on the need for new skill development. The American
Management Association (AMA) surveyed over 2,000 managers and executives in 2010
regarding workforce preparation and the nature of skills required for success in today’s
economy. Participants came most heavily from business, financial services, and
manufacturing, and one quarter of those surveyed represented companies with 10,000 or
more employees. Eighty percent agreed or strongly agreed that students would be better
prepared to enter the workforce with strong skills in critical thinking, communication,
collaboration and creativity. Seventy-five percent of respondents expected these skills to
become even more important in the future and stated that employees were both screened
for and evaluated on their abilities in these skills (American Management Association,
2010).
The AMA 2010 Critical Skills Survey defined the skills as follows:
• Critical thinking and problem solving: including the ability to make
decisions, solve problems, and take action as appropriate.
• Effective communication: the ability to synthesize and transmit your ideas both
in written and oral formats.
• Collaboration and team building: the ability to work effectively with others,
including those from diverse groups and with opposing points of view.
• Creativity and innovation: the ability to see what’s not there and make
something happen.
AMA 2010 Critical Skills Survey (AMA, 2010).
Figure 1. American Management Association 2010 Critical Skills Survey
Socio-Cultural Change and Educational Technology Policy
In 1983 the influential report A Nation at Risk identified computer science as a
basic requirement for high school graduation and recommended that students understand
and be able to use computers for information and communication purposes in work and
personal capacities (National Commission on Excellence in Education, 1983). Rapid
advances in technology over the past thirty years have led to the transformation of socio-
cultural ecology worldwide, and even early primary students now master basic computer use.
U.S. Department of Education reports discuss the transformative potential of
technology in re-configuring teaching and learning to support the development of skill
sets emerging as important for participation in future economies. Other important
priorities for technology use were described as supporting rich applications of teaching
and learning as described by the emerging field of cognitive science, and enhancing
learning accommodations (Web-based Education Commission, 2000; U.S. Department of
Education, 1996; U.S. Department of Education, 1997).
However, this emerging transformation with technology also coincided in the
U.S. with the movement for basic skills competency as exemplified by the federal No
Child Left Behind (NCLB) Act of 2001. An emphasis under NCLB on high stakes
testing of core content as measured by standardized, multiple-choice tests included little
focus on such skills as described above (Klieman, 2004; Pecheone & Kahl, 2010). For
instance, while NCLB recommended technological literacy as a benchmark for 8th grade,
assessments were not put in place to measure such literacy system-wide. Manifestations
of the basic skills accountability system under NCLB were described by some scholars as
narrowing the curriculum as teachers taught to the test in the few areas being assessed or
used test-prep materials in order to increase test fluency and raise scores in basic skills
(Baker, 2008; Darling-Hammond & Adamson, 2010; Herman, 2008).
This is changing as research has highlighted the need for the development of
higher order thinking skills as necessary precursors to college and career readiness
(Conley, 2010; Pecheone & Kahl, 2010). Internationally benchmarked assessments such
as the Programme for International Student Assessment (PISA) illustrated the relative
weaknesses of students in the United States in higher order thinking skills as compared to
students in other countries (Baker, 2008; Balistreri et al., 2011). In the U.S., the new
common core standards adopted across most states push beyond basic skills to enhance
college and career readiness and promote 21st century skills as integrated
aspects of learning across numerous areas (National Governors Association Center for
Best Practices, Council of Chief State School Officers, 2010).
Technological Enhancements and Digital Literacy
In an analysis of twenty years of educational technology policies, Culp, Honey
and Mandinach (2003) outline policies planning the use of technology to extend teaching
and learning processes through student use of data collection and analysis; increased use
and critical review of diverse information resources; integration of higher order thinking
and communication skills; and use of technology and multimedia to assess more
“complex dimensions of learning” through performance and portfolio based assessment
(Culp et al., 2003, p. 5). These policies are aligned with 21st century learning frameworks that suggest using educational technology and new media not as an end in itself but as part of a learning culture infused throughout the disciplines.
Specifically in digital literacy, research and development teams note the tension evident throughout plans and policies between using educational technology as an addition to the conventional curriculum and pedagogy in place since the industrial age, and using it for reform appropriate to the knowledge age, with transformative pedagogical change made possible by the new tools and changing patterns of access and response to information (Culp et al., 2003; Harris, Mishra, & Koehler, 2009). For instance, according
to the National Science Foundation sponsored Teaching, Learning and Computing
Survey, 12% of high school social studies teachers, 17% of science teachers, and 24% of
English teachers reported using computers in the classroom on more than 20 occasions
over a 30-week period. The most frequently reported uses were factual information
gathering, typing, and drill and practice for skill mastery (Becker, 2001; Smeets, 2005).
With information gathering cited as a main use of technology, one pathway for building toward more integrated 21st century skills in some classrooms, where appropriate, is through a research and evidence-gathering process. While there are many small-scale and innovative research-based applications of such activities for school use, inquiry-oriented instruction that includes collaboration and the infusion of educational technology is still used by relatively small numbers of teachers nationwide,
and appears to be supported by teacher training in student-centered pedagogies (Inan,
Lowther, Ross, & Stahl, 2010; Smeets, 2005).
More transformative pedagogical change through the use of technology is often aligned with constructivist methods such as problem-based, project-based, or inquiry-based learning, exemplified by collaborative, student-directed learning guided by a teacher-facilitator and the use of technology-supported curriculum, which may be open source. This potential shift in pedagogy has implications for the reform of teaching
and learning as well as the structure of K-12 systems. However, researchers have
reported that in studies of wide-spread educational computer use, technology in U.S.
schools is rarely used for such 21st century skills as problem solving, creating products,
and communication to share perspectives with others (Inan et al., 2010).
Educational priorities as outlined by current Elementary and Secondary Education
Act (ESEA) policy Blueprint for Reform include raising test scores in Math and English
Language Arts both overall and for disaggregated subgroups of students, and increasing
graduation rates from high school (U.S. Department of Education, 2010). Technology is
used as an administrative tool in education for data collection and management to support
student achievement through computer-based assessments and dissemination of the data,
tracking of attendance, behavior, and Response to Intervention (RTI) data, as well as Internet use for information
access (Culp et al., 2003; Harris et al., 2009). Researchers describe how the political
pressure to increase scores in narrowly defined domains has had a limiting effect on other
content areas and types of skill development in K-12 systems, while in social and business realms Web 2.0 media have led to an explosion in the use of technology by school-age populations in informal contexts outside of formal education systems (Dede, 2010).
Technology Use Among Youth Aged 8-18
Internet use among youth rose steadily throughout the first decade of the 21st century, and youth ages 8-18 use the Internet in an increasing variety of ways (see Table 3 for changes in technology use among youth between 2004 and 2009). A study by Lenhart et al. (2010) reported that 76% of teens got news online, a share that doubled between 2000 and 2009; 55% of teens in 2009 obtained health-related information online; 71% used the Internet to make purchases; and only 13% of teens stated that they did not use the Internet, largely due to limited access associated with low income.
Among U.S. teens, 11 to 14 year olds logged more media use than 8-11 or 15-18
year olds, and Black and Hispanic youth logged more media time per day on average than
White youth in the Kaiser study, though the Pew study found that Black youth had less access to the Internet than other groups and primarily accessed the Internet
over mobile wireless devices (Lenhart et al., 2010; Rideout et al., 2010).
Table 3
Reported Technology Use Among Children 8-18

Media Use                                      2004     2009
Play Games Online                              52%      81%
Read News Online                               38%      76%
Own Laptop                                     12%      29%
Own iPod or MP3 Player                         18%      76%
Own Cell Phone                                 39%      66%
Total Daily Media Exposure (hours:minutes)     8:33     10:45
Multi-tasking Proportion                       26%      29%
Total Daily Media Use (hours:minutes)          6:21     7:38
Note. Adapted from “Generation M2: Media in the lives of 8 to 18 year olds” by V. J. Rideout,
U.G. Foehr, and D.F. Roberts, 2010, Kaiser Family Foundation.
Youth and new social media. Researchers of social media and ICT among teens
found that youth use social media to extend friendships, connect and network with
interest-driven groups, and engage in self-directed, peer-based learning; they create,
express, and distribute their work and achieve outcomes through exploration more than
they pursue predefined goals (Agosto and Abbas, 2010; Ito, et al., 2008; Rideout et al.,
2010). The influence of social media and the youth focus on peer-based, exploratory
learning has implications for the traditional authoritative role of adults in education; out
of school, teens have increasingly become self-directed learners. Teachers and parents are often less technologically literate than the youth, and youth are engaging in modes of learning not supported by traditional learning structures (Ito et al., 2008).
Ito et al. (2008) use the term “networked publics” to describe public culture
supported by online networks that bridge mass media and online communication with
active participation in distributed social networks to produce and circulate culture and
knowledge. Networked publics increasingly serve as the point of access to participation in both local and distributed communities, among friendship-driven and interest-driven groups as well as political entities. The ability to participate fully in our society today is somewhat
dependent on the ability to navigate new media as both a savvy consumer and producer
(Ito et al., 2008; OECD, 2005; World Bank, 2003).
The Education-Technology Gap
Student experiences of technology within and outside of the classroom are
disparate: out-of-school technology use tends to be fluid, flexible, social, and creative, while in-school use tends to be structured, limited to drill and practice, information search in restricted modalities, and defined demonstration of knowledge such as typing a
paper (Buckingham, 2006; Kleiman, 2004; Ito et al., 2008).
Digital literacy in schools, when defined operationally, is typically given a functional definition: how to operate hardware or use software with basic skills for certain operations, a focus on Internet searches, and safety or security issues (Buckingham, 2006;
Dede, 2005; Balistreri, et al., 2011; Harris et al., 2009). Information and Communication
Technology (ICT) researchers suggest rethinking the definitions of ICT or digital literacy
and education—that new media have become more than tools; they are infused with
emerging cultural norms, and modes of expression for both private and public
engagement (Buckingham, 2006; Dede, 2009; Jenkins, Clinton, Purushotma, Robinson,
& Weigel, 2006; Smeets, 2005).
Newer frameworks call for increasing use of critical evaluation of online content,
while researchers cite this critical evaluation as a skill lacking in many students. In addition, Web 2.0 social media support diverse forms of ICT literacy, with different skill sets and manners of communication required depending on the content and connotation of the media use, such as friendship-driven or interest-driven participation. Students benefit from understanding these differing social expectations in order to develop cultural and communicative competence in diverse media environments (Buckingham, 2006; Ito et
al., 2008).
Emergence, Definition and Development of 21st Century Skills
21st century skills were developed in tandem with changes in the workplace from the industrial age to the information age and what is now, post-millennium, referred to as the knowledge age. Based on recommendations for 21st century workforce skills,
stakeholders inclusive of businesses, higher education and government agencies
simultaneously developed, defined and refined conceptual frameworks for 21st century
skills. Though initially more focused on technology integration and ICT literacy,
frameworks have matured to include such topics as environmental and health literacy,
descriptions of inquiry-based learning, promotion of second languages, and the use of
performance assessments.
Integrating 21st century skills into the K-12 education system is difficult due to a
number of systemic issues, including that 21st century skills are not necessarily content-
driven and require an element of dynamic emergence that is not typically accommodated
in current school curricular, assessment, or organizational structures. 21st century skills
are contextual and collaborative, in contrast to an education system often designed
for linear individual work in more separated domains (Pecheone & Kahl, 2010).
However, the inclusion of 21st century skills and pedagogy is gaining momentum and
systemic support. The 21st Century Readiness Act has been introduced to allow the use
of ESEA funds to develop, enhance and expand teaching of 21st century skills defined as
(a) critical thinking and problem solving; (b) communication; (c) collaboration; and (d)
creativity and innovation. The bill seeks to support 21st century readiness initiatives that
combine 21st century skills with core academic subjects (Govtrack, 2011).
There are several independently developed conceptual frameworks outlining 21st
century skills with general overlap in terminology, varying degrees of operationalization
of skills and competency, and some specialization with regards to the infusion of skills
with values and work habits. Some of the major frameworks are outlined in Table 4.
Table 4
21st Century Skills by Framework

EnGauge Framework from Metiri/NCREL (2003)
Competencies: Digital-Age Literacy; Inventive Thinking; Interactive Communication; High Productivity
Relevance: Teaming, Collaboration, and Interpersonal Skills; Interactive Communication; Effective Use of Real-World Tools

Organization for Economic Cooperation and Development (2005)
Competencies: Using Tools Interactively; Interacting in Heterogeneous Groups; Acting Autonomously
Relevance: Use knowledge and information interactively; Use technology interactively; Cooperate, work in teams

International Society for Technology in Education ICT Skills (ISTE) (2008)
Competencies: Creativity and Innovation; Communication and Collaboration; Research and Information Fluency; Critical Thinking, Problem Solving, and Decision Making; Digital Citizenship; Technology Operations and Concepts
Relevance: Use digital media and environments to communicate and work collaboratively; Students apply digital tools to gather, evaluate, and use information; Exhibit a positive attitude toward using technology that supports collaboration, learning, and productivity

Partnership for 21st Century Skills (P21) (2003/2009)
Competencies: Core subjects and 21st century themes; Learning and Innovation skills; Information, Media and Technology Skills; Life and Career skills
Relevance: Communication and collaboration; ICT literacy

Assessment and Teaching of 21st Century Skills (ATC21S) Project (2010)
Competencies: Ways of thinking; Ways of working; Tools for working; Living in the world
Relevance: Digital Learning Communities; Communication and collaboration; ICT and information literacy

College Board Global Education Framework (2011)
Competencies: Empirically Based Knowledge and Skills; Higher-Order Cognitive, Metacognitive and Interpersonal Skills; Global dispositions, perspectives, and attitudes
Relevance: Information literacy; Communication and collaboration
New Media Strengths, Skills and Behaviors
New media skills, strategies, and learning strengths, such as the use of audio, video, and animation, are embedded in some but not all frameworks. Dede (2005) outlines these learning strengths and styles as fluency in multiple media and valuation of each media type for the different communication options it promotes; active learning based on collectively seeking, sieving, and synthesizing media experiences rather than using a single information source; expression through both non-linear associational webs and linear media; and learning experiences co-designed by teachers and students for individualization.
Jenkins et al. (2006) describe skills and behaviors related to rich use of new media, including:
• Play: experimentation as a form of problem solving
• Performance: the ability to adopt alternative identities for the purpose of
improvisation and discovery
• Simulation: the ability to interpret and construct dynamic models of real-world
processes
• Appropriation: the ability to meaningfully sample and remix media content
• Collective intelligence: the ability to pool knowledge and compare notes with
others toward a common goal
• Transmedia navigation: the ability to follow the flow of stories and information
across multiple modalities
• Negotiation: the ability to travel across diverse communities, discerning and
respecting multiple perspectives, and grasping and following alternative norms
Dede (2009) created a Web 2.0 Use framework as follows:
• Sharing: communal bookmarking; photo/video sharing; social networking; and
writer workshops
• Thinking: blogs; podcasts; and online discussion forums
• Co-creating: wikis/collaborative file creation; mash-ups/collective media creation
• Collaborative social change communities
21st Century Skills and Learning Theory
This wide range of elements can quickly become unworkable for instructional
leadership, so an important consideration for this study is how views of digital literacy
and the affordances of technology relate to learning theory. Cognitive scientists posit that
learning occurs in context through accessing and constructing with prior knowledge, and is active, social, and reflective, with learners utilizing metacognition to support self-
direction, set learning goals and monitor progress (Barron & Darling-Hammond, 2008;
Driscoll, 2002). To best support learning, some learning experts believe instruction
should be learner centered, contextual, authentic, and supported by assessment (Donovan,
Bransford & Pellegrino, 1999; Driscoll, 2002). Technology supports can be helpful in
some of these areas, and the majority of 21st century skills frameworks include most of
these components, with an emphasis on learner-centered, social, inquiry-based learning
experiences in an authentic context (P21, 2003; Balistreri et al., 2011; Harris et al., 2009;
Metiri, 2003). 21st century skills as typically defined support some of the best practices
in cognitive research on learning: learning by doing, analyzing, communicating,
processing and problem solving, and using transfer to different situations to support long-
lasting and long-ranging educational efficacy.
Lifelong learning, called for in most 21st century frameworks, is described as a
learner-centered, constructivist activity, with people learning in groups and from one
another, with the teacher as a guide for resources and facilitator for individualized
learning plans (World Bank, 2003). 21st century learning, lifelong learning, and global education share many traits describing a constructivist methodology, such as the call for creation and application of knowledge using diverse sources, and the application of learner-centered and competency-driven models in a flexible, decentralized manner with multiple learning options, modalities, and settings.
Some researchers have described how traditional education models do not always support research-based learning theory well or facilitate 21st century
competencies (Barron & Darling-Hammond, 2008; Dede, 2010). Students in these
studies were found to primarily work alone and were often seen as passive recipients of
knowledge from the teacher using a curriculum driven, acquisition and repetition model
of learning. Some scholars have described how the emergence of new technologies, combined with the new economic challenges of the knowledge age, requires transformation for future success in the new global society, and how such transformation can also help support educational best practices (Balistreri et al., 2011; Harris et al., 2009; P21, 2008;
World Bank, 2003).
Collaborative Learning
Collaboration as an instructional strategy or process to support student centered
learning is central to most 21st century frameworks and mimics the team-based structures
common to the workplace in the 21st century model (AMA, 2010; Dede, 2010; SCANS,
1991). Literature on collaboration refers to cooperative learning, collaborative learning,
learning communities, distributed cognition, computer-supported collaborative learning,
and joint or co-construction of knowledge, and these terms represent variations on the
theme of students working together to maximize both their own and each other’s learning
in a small group situation. Table 5 presents a glossary of terms related to collaboration
used throughout this study and referenced literature.
Table 5
Glossary of Terms for Types of Group Work Discussed

Collaboration: Knowledge generation emphasized through shared meaning yet individual interpretation; meaning-making in the context of group interaction; interdependence highlighted; advances in collective knowledge prized (Scardamalia, Bransford, Kozma, & Quellmalz, 2012; Stahl, Koschmann, & Suthers, 2006).

Co-construction of knowledge: Knowledge is interactively achieved in discourse and may not be attributed as originating from any particular individual (Stahl, Koschmann, & Suthers, 2006).

Computer-Supported Collaborative Learning (CSCL): Group engagement in a group knowledge-building space, with channels of interaction between social and personal systems; may be asynchronous (Stahl, Koschmann, & Suthers, 2006).

Cooperation: Both individual and group accountability are often present; may have distinct roles and division of labor (Smith, Sheppard, Johnson, & Johnson, 2005; Strijbos, Martens, & Jochems, 2004a).

Cooperative Learning: Students work in teams to accomplish a shared goal with positive interdependence; both individual and group accountability are often present; may have distinct roles and division of labor; typically has structured interactions (Johnson & Johnson, 2009; Smith, Sheppard, Johnson, & Johnson, 2005; Strijbos, Martens, & Jochems, 2004a).

Group learning: Learning by groups, not learning in groups or individual learning through social processes (Scardamalia, Bransford, Kozma, & Quellmalz, 2012).
Student collaboration on learning tasks is not an invention of the 21st century;
cooperative learning strategies were documented in ancient Rome and China and have
been practiced in European and American schools since the late 1700’s. Francis Parker promoted cooperative learning in American schools in the late 1800’s, as did John Dewey in the 1920’s and 1930’s. After a period of decline in favor of individualized, competitive instructional strategies, cooperative learning re-emerged in the 1960’s and became widespread during the 1990’s along with an emphasis on constructivist pedagogy
(Smith, Sheppard, Johnson, & Johnson, 2005).
Johnson and Johnson (2009) initially described cooperative learning as students
working together to accomplish shared learning goals. Smith et al. (2005) describe
cooperative learning as students working in teams to accomplish a shared goal with
positive interdependence, meaning that the performance of individual group members is
dependent upon the performance of all other group members. The process often includes
individual and group accountability; teamwork skills; and group processing. Stahl (2009)
describes collaboration as incorporating the contributions of individuals into a group
discourse and involving those individuals in maintaining and directing group processes;
this is congruent with cooperative learning strategies. Stahl further illustrates
collaboration as a spiraling cycle of individual-to-group enhancement in which individuals contribute to the group and advance group cognition, stimulating further individual thought processes, which are then contributed back to the group in the shared problem space as the cycle continues.
In comparing cooperative and collaborative learning, Smith et al. (2005) suggest
that while both modalities use peer group interaction to promote engagement and
optimize learning, cooperative learning includes individual accountability while
collaborative learning does not. This is not always the distinction that others use.
Strijbos, Martens, and Jochems (2004a), for instance, synthesize distinctions such that
cooperative learning is more structured with distinct roles and division of labor
procedures, while collaborative learning is less structured though implying equality of
contribution to the group effort. They note that cooperative learning and collaborative
learning share more similarities than differences, and that the distinction may not be
necessary in many contexts, where the terms could be used interchangeably.
In this paper I will use the terms cooperative learning and collaborative learning
or collaboration interchangeably in the background and for discussion of instructional
design and professional development, where cooperative learning is the more familiar
and widely discussed concept in teaching practice; and use collaboration to describe my
research, as the ATC21S Arctic Trek Notebook performance task is a group task that
does not include individual accountability or the structured interactions typical of cooperative learning strategies. I will use the terms digital collaboration and
collaborative learning in a digital environment interchangeably to describe the act of
collaborating through a technological medium.
Benefits of collaborative learning. Collaboration or cooperative learning aligns
with cognitive science learning theory by providing opportunities for transfer of
knowledge and skills through social interaction, problem solving and the metacognitive
skills used to facilitate and reflect on group processes (Smeets, 2005). Collaboration
among students promotes increased engagement by interactively working with materials
and concepts, creating shared meaning, and using metacognitive skills to process learning
and performance.
Research on cooperative learning dating as far back as 1924 documents that this modality can promote higher individual achievement; retention and transfer of content
and skills; creativity in problem solving; metacognition; persistence; increased social
skills; higher self esteem; positive interpersonal relationships including trust and cohesion
among students; and mutual positive regard across diverse groups of students (Smith et
al., 2005).
However, there are difficulties related to the implementation of cooperative
learning including developing norms and structures within groups that facilitate students
working successfully together; choosing meaningful tasks that fit the cooperative work
structure, such as open-ended, multi-faceted tasks requiring a variety of skills; and
developing strategies for discussion and interaction with materials that support rich
learning of discipline-specific content (Barron & Darling-Hammond, 2008).
Cooperative learning has been found to be more effective when teachers made small
groups of three to four students, structured individual accountability combined with
positive interdependence, scaffolded group interaction, and adapted instructional
materials and methods to small group instruction (Lou, Abrami, & d’Apollonia, 2001).
These findings translate to increased teacher preparation and classroom management
activity; not surprisingly, cooperative learning has been found most effective when
teachers had extensive training and practice using this method (Barron & Darling-
Hammond, 2008; Lou et al., 2001).
Researchers studying instructional practices for technology found that students
working in small groups for technology instruction performed better than students
working individually. Optimal performance of small groups was positively related to a
social context including a difficult task, group size of 3-5 students, and little to no
feedback or assistance available from the instructor (Lou et al., 2001). For example,
students in pairs researching complex information on the Internet to compose and support ideas were more effective than individuals, finding more information in less time with a greater range of search strategies, and showed greater proficiency in monitoring and evaluating their search behaviors (Lazonder, 2005).
Use of roles to facilitate group learning. Roles can promote group cohesion and
responsibility through increasing group awareness, organizing group interaction, and
directing individual efforts, leading to both positive interdependence and individual
accountability (De Wever, Van Keer, Schellens, & Valcke, 2009). Roles can be assigned
or self-selected, and are sometimes categorized as content roles, task roles, and
maintenance roles. The use of roles is assumed to support functionality in collaborative
learning, and was central to cooperative learning strategies as implemented in primary
and secondary school settings.
In a study of roles in Computer Mediated Communication (CMC) among college
students, Strijbos, Martens, Jochems, and Broers (2004b) found that the use of roles
increased task-focused discourse and perceived group efficiency, but not overall
performance as measured by grades. Other researchers found that scaffolded role
assignment, where role structures were introduced early in the group process and then
allowed to fade, had greater value for group performance (De Wever et al., 2009).
Group learning in face-to-face (FtF) modalities in primary and secondary settings typically involves the use of cooperative learning strategies for task distribution through role assignment. Theory holds that roles assigned to the group will facilitate interaction and full participation among group members by giving each an assigned role or purpose within the group, leading to better group efficiency, engagement, and outcomes (Johnson, Johnson, & Stanne, 1986). Role assignment has been posited to increase positive interdependence and thus group cohesion (Johnson et al., 1986). Roles may be either content or process
oriented. Many strategies for facilitation and implementation of roles were designed for
FtF settings, such as numbered heads together, jigsaw, prompting or timekeeping. These
roles were typically developed with purposive instruction and guided practice, and are a
customary part of in-person classroom practice in U.S. schools.
Computer-Supported Collaborative Learning
Computer-supported collaborative learning (CSCL) is a relatively new research
discipline that is also referred to as remote-located collaboration or computer supported
group based learning. It is more widely researched internationally than in the United
States, and often is associated more with post-secondary instruction than schooling for
younger children. However, it is a growing phenomenon as schools at all levels of
instruction are adding more online or blended instructional venues each year.
Computer supported group based learning (CSGBL) mimics new 21st century
workplace structures of remote-located teams collaborating on problem-solving and joint
construction of knowledge in both synchronous and asynchronous modes. CSGBL is
implemented much the same as FtF group work, and as the field is still emergent, there is
a lack of continuity among institutions and instructors as to approaches for CSGBL
programming and evaluation (Strijbos et al., 2004b).
Implications of virtual collaboration. Research studies have begun to investigate whether the effectiveness of students working collaboratively in virtual environments is hindered or enhanced by the reduction of in-person, face-to-face (FtF) contact. Effects could result from reduction in the amount of social cuing that
can occur in a non-visual interaction space, or alternatively from new interactions that
may be possible with online tools, such as simultaneous text chat and audio signal
available to all group members, supporting multiple channels of expression or reducing
the anxiety of social regulation for teens by inserting the distancing abstraction of
technology. In one study, students collaborating face-to-face showed more frequent and higher levels of communication than the online control group, although a social presence could be created and maintained in the digital environment given social media tools (Lowry,
Roberts, Romano, Cheney, & Hightower, 2006). Mutual construction of meaning may be
hindered in virtual environments due to the lack of visual and physical cues, which can
reduce social relatedness (Rienties, Tempelaar, Van den Bossche, Gijselaers, & Segers,
2008).
Large group size often is negatively correlated with quality two-way
communication, due not simply to logistics but also to participant apprehension of
evaluation, which tends to be higher in FtF and lower in virtual environments. The
virtual environment appears to offset group size effects, such that a large group online
will have greater participation and quality of communication than a large group in an FtF environment (Lowry et al., 2006).
Temporal, relational and content dimensions are necessary to construct and
support remote-located group interactions (Stahl, 2009). Social presence can be established without FtF interaction, or through virtual face-to-face interaction with Web 2.0 social
media tools. The additional features of a digital environment promoting social presence
include parallelism, the ability for group members to contribute simultaneously; group
memory; self-scribing; and group awareness. Also, the shared interface of a collaborative
writing tool can offer a supported text environment that can lead to greater productivity, document quality, relationship building, and communication than static, non-interactive writing forums (Lowry & Nunamaker, 2003). The use of shared writing tools has allowed
struggling learners to engage in collaborative note taking with more able peers or tutors,
and has promoted increased participation among students collaborating in groups using
collaborative writing tools for class discussion (Anderson-Inman, Knox-Quinn, &
Tromba, 1996).
Evaluation of digital collaboration. Whether face to face or conducted
virtually, communication is composed of several sub-constructs, including quality,
appropriateness, richness, openness, and accuracy of receptive and expressive modalities.
A major issue facing the research community for assessments in digital collaboration is
the lack of continuity in evaluating virtual collaborative efforts. There is a lack of
consistency around what is being evaluated and how it is being measured.
A variety of instruments in use show widely differing characteristics in theoretical
orientations, units of analysis, levels of detail, categories of analysis, and discrimination
of content (De Wever, Schellens, Valcke, & Van Keer, 2006; Strijbos, Martens, Prins, &
Jochems, 2006). The different instruments or methodologies may or may not address
contextual issues such as group composition, task features or task complexity, and
whether or not roles were explicit or role orientation occurred. Researchers may evaluate
collaboration based on density of the social network, numbers of messages, quality of
communications variously defined, or group processes. Units of analysis and instruments
for content analysis do not tend to be overtly discussed or justified within many of the
studies, which often involve limited attention to formal measurement characteristics.
This is not unexpected in an emerging field of measurement such as this, but does
deserve additional exploration as the research area moves forward.
De Wever et al. (2006) analyzed 15 content analysis instruments currently in use and found that most did not report inter-rater reliability, and only 33% explained their theoretical background. Strijbos et al. (2006) describe the methodology
of research and evaluation in the field as lacking debate and critical reflection. Hence,
the validity and reliability of generalized methods is not as yet fully established, and
results of research must be screened for the evaluative framework before findings can be
generalized across studies and situations. The emphases on the types of content to be evaluated, the unit of analysis, and the theoretical grounding vary; prescriptions are vague; and the evaluative context may change between settings in order to be
appropriately matched to the technology that is used in each setting (De Wever, et al.,
2006; Strijbos et al., 2006). Rather than interpreting this as a barrier to work in this area, this dissertation study sees it as an opportunity to contribute to the field in an area highly relevant to educational settings and interests, as described in the introduction to this chapter.
Content analysis. The primary ways that virtual collaboration is evaluated are
through content analysis or overall group performance. Content analysis is defined here
as the examination of communication elements and processes to determine trends or
phenomena present in the communication and the meaning or purpose served. Content
may be analyzed quantitatively with the unit of analysis coded, summarized and
frequency or percentages of coded types calculated; a qualitative analysis may involve
case studies or participant observation to infer trends without computing frequencies
(Strijbos et al., 2006).
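To make the quantitative approach concrete, the brief Python sketch below tallies a set of coded units and reports the frequency and percentage of each code. It is an illustration only, not part of the ATC21S procedures; the code labels and messages shown are hypothetical.

    import collections

    # Hypothetical coded units: each chat message (the unit of analysis) has
    # already been assigned one code from an illustrative coding scheme.
    coded_messages = [
        "task-coordination", "social", "content-contribution", "task-coordination",
        "content-contribution", "content-contribution", "off-task", "social",
    ]

    # Tally each code and report its frequency and percentage of all coded units.
    counts = collections.Counter(coded_messages)
    total = sum(counts.values())
    for code, count in counts.most_common():
        print(f"{code}: {count} ({count / total:.0%})")

The percentages produced in this way are the kind of summary that would populate a frequency table of coded message types in a quantitative content analysis.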
Unit of analysis. Methods for content analysis differ with regard to the unit of
analysis used. The unit can be messages, threads of successive joined messages, thematic
units, complete discussions, paragraphs, illocutions or utterances, all of which are used by
different researchers (Strijbos et al., 2006). According to Schellens, Van Keer, and Valcke (2005), the most widely used unit of analysis is the complete message, in no small part because the author of the message intended what they wrote to be a complete unit.
Discourse analysis will be discussed more thoroughly in Chapter II.
Social-Emotional Learning
While outside the scope of this dissertation project, it should be mentioned here
that many of the foundational skills necessary for successful collaboration among
students often can be categorized as social-emotional skills. In a study by Rienties et al. (2008), intrinsic motivation rated high for virtual collaborative work, and intrinsically motivated students also ranked high in social relatedness and perceived
competency. Learning is a social process, and social-emotional skills not only contribute
to but also are critical for enhancing academic success, with prosocial behavior of
students linked to increased engagement and higher outcomes on achievement tests (Zins, Bloodworth, Weissberg, & Walberg, 2006). The Collaborative for Academic, Social
and Emotional Learning (CASEL) (2003) describes a set of social emotional learning
constructs that are key to developing abilities in collaboration and other 21st century
skills; they are listed in a table in Appendix O.
Professional Development for Collaboration in a Digital Environment
The intent of this study is to contribute to research that may inform practice for
instructional and assessment strategies in this emerging area of collaboration in a digital
environment. Links may be found between the trends identified in some of the digital student work and the research literature on supporting student learning and teacher professional development in this emerging area. Although work product trends will be identified in the study (see Chapters II and III), this literature survey now introduces some of the research base regarding professional development approaches for preparing teachers for 21st century skills challenges. Guiding leadership in formulating instructional design and ensuring teachers are well prepared to address instructional needs will be an important part of supporting digital collaborative learning in K-12 education.
Teacher professional development and continuing support concentrated on
pedagogical content is necessary to facilitate student-centered learning with a focus on
higher order skills as characterized by 21st century frameworks (Inan et al., 2010).
Incorporating 21st century skills such as computer-supported collaboration in K-12
education has been described as needing concerted effort on the part of systemic
leadership. For instance, chief among tasks will be providing professional development
opportunities for teachers in the field as well as ensuring pre-service instruction from university partners. Rotherham and Willingham (2009) point out that methods for
teaching 21st century skills such as self-direction, collaboration, creativity, and innovation
are not yet well known or fully understood, but should be taught in the context of practice
with feedback, and strategies for improving practice with benchmarks for achievement.
As the methods for teaching many 21st century skills are not yet known, uncovering the
implicit domains involved and discerning sub-skills that can be taught to support 21st
century skills would be a significant contribution and could lead to targeted
professional development opportunities for educators to prepare for teaching such skills.
In order to provide feedback for student growth, educators must themselves be proficient
in these skills.
The need for teacher training and curriculum design in the areas of collaboration, technology, and social emotional learning has been discussed widely in the literature,
both within the framework of 21st century skills as well as in these instructional areas
outside of a 21st century focus (Barron & Darling-Hammond, 2008; Dede, 2005;
Donovan et al., 1999; Harris et al., 2009; Inan et al., 2010; Lou et al., 2001; Rotherham &
Willingham, 2009). As these general areas relate to the specific topic of this dissertation,
technology and collaboration, literature on professional development and implications for
instructional design in these areas is discussed in the following sections as a frame for
Research Question 3 and possible implications from the results of this study.
Collaborative Learning
Collaborative learning has been identified as central to most 21st century
frameworks (P21, 2008; Dede, 2010). Researchers cite teacher training and practice in
cooperative learning strategies as essential to the success of the strategy in promoting
student learning due to the challenges inherent in implementation of collaborative
learning (Barron & Darling-Hammond, 2008; Lou et al., 2001). Aside from training in
the pedagogy and techniques that support collaborative learning experiences, teachers
must be able to model collaborative processes (Barron & Darling-Hammond, 2008).
Moreover, instructional design must be attentive to scaffolding group processes and
interactions, task development specific to collaborative group work, motivational
supports, and formative feedback cycles (Barron & Darling-Hammond, 2008).
In a meta-analysis of cooperative learning research, Johnson, Johnson and Stanne
(2000) present the last major models of cooperative learning as developed in the 1980’s,
and show research on use and effectiveness of cooperative learning models decreasing by
50% between the 1980’s and the 1990’s. Today’s younger teachers may have experienced cooperative learning as students, but they may not have received training in cooperative learning during their teacher preparation programs. Johnson, Johnson and
Stanne cite the incorporation of many cooperative learning techniques throughout
packaged curriculum, making the use of cooperative learning techniques fairly
widespread, though the extent to which teachers are able to unpack scripted curriculum
and use elements such as cooperative learning strategies in new or different curricular
areas has not been assessed.
K-12 teaching is typically performed in a classroom environment with a high
number of students paired to one teacher; a 32:1 ratio for middle school and high school
is relatively standard, and in the current low budget climate, many secondary schools
have a much higher ratio. Elementary schools vary from 30:1 in many early primary
classrooms in Oregon, to 20:1 in California, which has class-size reduction funding for
K-3rd grades. As such, teachers often work in isolation from peers, and have many
responsibilities to meet in a short amount of time. Therefore, some teachers may lack
opportunities to practice collaboration with other educators. The most common form of
educator collaboration may be between regular education teachers and education
specialists as they collaborate regarding individual student needs.
Academic Social-Emotional Learning
The social emotional learning (SEL) generalized core skills consist of self-
awareness; social awareness; relationship skills; self-management; and responsible
decision-making (CASEL, 2003). SEL skills are recognized as promoting both long and
short-term positive outcomes for academic and personal success. These skills are aligned
with collaborative learning skills such as personal responsibility; communication; group
processing skills; decision-making; and conflict management and resolution (Barron &
Darling-Hammond, 2008). For students to be successful in collaborative working
environments, their social-emotional skills must be operative, and for students to be
operative in social-emotional skills, their teachers must be teaching these skills as well as
integrating the practice of these skills throughout the curriculum and school program.
Children higher in social-emotional skills are better able to manage emotion,
establish healthy relationships, meet personal and social needs and make responsible and
ethical decisions (Zins et al., 2006). Cohen (2006) describes the ability to listen to self
and others, and be critical and reflective as precursors to communicating and
collaborating; these skills are described as underlying responsible and caring participation
in a democracy. Educating children in social emotional skills increases their relatedness
in society and their ability as lifelong learners (CASEL, 2003; Cohen, 2006). Social
emotional skills are often taught in isolation in developmental settings, such as listening
skills for kindergarteners, but these skills can become integrated into curriculum and
classroom processes as students progress in grade levels (Cohen, 2006). Just as social emotional learning offers precursory skill sets for greater success in collaborative learning, so does collaborative learning offer continued practice and refinement of those skills so important for success as lifelong learners and for participation in societal processes (Ragozzino, Resnik, Utne-O’Brien, & Weissberg, 2003).
Several states have adopted standards for social-emotional learning, and a bill has been introduced in Congress. The Academic, Social, and Emotional Learning Act of
2011, HR 2437, is intended to increase support for teaching SEL skills in schools.
Professional development is necessary to provide teacher expertise in new
pedagogies that support learning based on cognitive science (Pecheone & Kahl, 2010).
Social-emotional learning, widely viewed as integral to cognitive based learning theories,
is another domain that may not be widely taught to teachers, who must master core
academic instructional skills such as reading and/or math concepts as a priority. School
counselors, who once presented social-emotional skills to classrooms or small groups of
students, have been downsized in recent difficult economic times, leaving the promotion
of social-emotional skills to teachers over-burdened with concerns for achievement as
measured by standardized test scores and curriculum driven by data on core skills such as
reading and math.
Technology
The incorporation of technology across disciplines in K-12 education is central to
the inclusion of 21st century skills in the curriculum. For teachers to incorporate technology in the curriculum, they must be prepared to do so and have some degree of fluency in technology themselves. Studies in K-12 education describe teacher
applications of technology for instruction as lacking in breadth, depth, and variety as well
as lacking integration with curriculum (Harris et al., 2009). Kleiman (2004) states that
success in 21st century skills and technology integration depends on the preparation and
support of teachers and appropriate curriculum design. Harris, Mishra, and Koehler (2009) insist
that teachers must know the appropriate pedagogical strategies, including cognitive,
social and developmental theories of learning, for various ages of students and content
area instruction for meaningful integration of technology in K-12 programming.
The International Society for Technology in Education National Educational
Technology Standards for Teachers (ISTE NETS*T) (ISTE, 2008) outlines five
comprehensive standards with performance indicators for the incorporation of technology
in schools:
• Facilitate and Inspire Student Learning and Creativity in both face-to-face and
virtual environments
• Design and Develop Digital Age Learning Experiences and Assessments
including digital tools and resources
• Model Digital Age Work and Learning demonstrating fluency in
technological tools for collaboration, communication and creative work
• Promote and Model Digital Citizenship and Responsibility
• Engage in Professional Growth and Leadership to continuously improve
professional practices
For the full, comprehensively discussed ISTE NETS*T standards, see Appendix I.
NETS essential conditions. ISTE (2008) outlines several conditions that must be present at the school site in order for technology-infused educational experiences to work smoothly, and which require the support of instructional leadership and resource allocation. These conditions include but are not limited to:
• Implementation planning to infuse student learning with ICT and digital
resources
• Equitable access to current and emerging technologies
• Skilled personnel able to select and effectively use ICT resources
• Ongoing professional learning and practice opportunities in technology
• Technical support
• Curriculum frameworks that support digital age learning and work
• Student-centered learning to best meet student needs and abilities
• Assessment and evaluation of teaching, learning, the use of ICT and digital
resources
See Appendix J for the full description of NETS Essential Conditions.
New Models of Professional Development
Education leaders cite curriculum, teacher expertise and assessment as the main
challenges for the integration of 21st century skills in the schools, and suggest a long-term iterative process of planning, implementation, reflection, and continued planning
with implications for teacher training (Rotherham & Willingham, 2009). The effects of
professional learning experiences that are intense and focused on the work of teaching
appear to support the new paradigm of professional development (Wei, Darling-
Hammond, Andree, Richardson, & Orphanos, 2009). The new model of professional
development involves content area expertise, sustained over time, with application to
learning and a focus on student learning and achievement. Teachers need time to develop knowledge in the content areas in order to teach that content to students effectively (Gersten, Dimino, Jayanthi, Kim, & Santoro, 2010). For professional development to be
truly effective, teachers must be able to participate collectively in ongoing learning that
allows in-depth discussion of strategies and an opportunity to practice and receive
feedback (Education Northwest, 2010). Professional learning communities fit these
effectiveness standards, creating a culture of collaboration by being site or district-based,
practice-oriented and having a sustained focus over time.
When professional learning communities are focused on student learning and
achievement, students benefit through improved achievement scores over time. Thirty to 100 hours of professional development spread out over six to 12 months have a positive effect on student achievement, whereas limited professional development of five to 14 hours total has no statistically significant effect on student achievement; an average of 49
hours of professional development in a year can boost student achievement by
approximately 21 percentile points (Wei et al., 2009). The use of research, data
interpretation, and application to student learning is imperative (Saunders, Goldenburg, &
Gallimore, 2009; Monroe-Baillargeon & Shema, 2010; Vescio, Ross, & Adams, 2008).
In a controlled study on professional development, teachers who participated in a Teacher Study Group received support in examining research on reading, debriefing previous applications of the research, reviewing reading lessons in which the research was applied, and collaboratively planning lessons with their group. These teachers' knowledge of the content and their ability to deliver the content were significantly greater than those of the control group. Student knowledge of the content, based on test scores, also differed significantly (Gersten et al., 2010). Students of teachers who
participated in a professional development program experienced significantly larger gains
in learning as measured by assessments than those students in the control group (Johnson
& Fargo, 2010). Designing and promoting sustained high quality professional
development opportunities for teaching collaboration in a digital environment will be
central to the inclusion of this learning modality in the curriculum.
Performance Assessments for 21st Century Skills
Assessment and accountability-driven education systems call for valid and
reliable assessments for standards and skills taught; schools need a way to assess student
ability and measure growth in order to effectively plan, deliver and monitor instructional
programs. Adequate measures have not been widely or uniformly developed to measure
21st century skills, which as discussed previously are cross-cutting in many of the higher
order thinking skills in the new U.S. common core standards, and the standardized
multiple-choice assessments currently in use for measuring student performance in
content areas are not designed to consider 21st century skill sets (Darling-Hammond &
Adamson, 2010). Traditional assessments can measure factual recall, vocabulary, basic
reading comprehension and algorithmic procedures, but are often not adapted for
assessing applied higher order thinking and synthesized skills (Baker, 2008). Assessment
of 21st century skills must be sufficiently performance-based to capture analysis,
reflection, collaboration, and using technology to respond to essential questions
(McTighe & Seif, 2010). Through the development and validation of performance
assessments for 21st century skills, the schools may be better able to include such skills as
part of the curriculum and measure student and school progress in those areas.
Performance Assessments are generally defined as opportunities to construct an
answer, produce, or perform; or to apply knowledge and skills without pre-determined
options. A performance assessment can be a collection of performance tasks, defined as
“a structured situation in which stimulus materials and a request for information or action
are presented to an individual, who generates a response that can be rated for quality
using explicit standards. The standards may apply to the final product or to the process
of creating it” (Stecher, 2010, p.3).
The structured piece of this definition can help accommodate the need for
standardization and replication, as without standardization an assessment can be less
useful for comparison between students or schools, thereby rendering it ineffective for
accountability purposes. While assessment experts and researchers are still working
toward an entirely agreed upon definition of Performance Assessment, it is often defined
in terms of what it is not—multiple choice exams containing solely factual or procedural
level questions, not embedded in a context or activity (Pecheone & Kahl, 2010; Stecher,
2010). The more complex arenas in which performance assessments tend to take place have therefore generated structured requirements such as those the Stecher definition discusses, when the intended use includes replication or comparison.
Performance assessments may take many forms, ranging from portfolios, which are more difficult to replicate, to writing tasks scored by rubric, conducting or analyzing experiments, or synthesizing information from various sources to construct a response to a query in any discipline. It is typical for performance assessments to have a defined task
with stimulus and outcomes that may be described as: relatively simple/relatively
constrained; relatively simple/relatively open; relatively complex/relatively constrained;
and relatively complex/relatively open (Stecher, 2010).
Kane, Crooks, and Cohen (1999) consider performance assessments to be more
authentic and valid when they replicate the conditions under which adults would perform
the same tasks. Computerized performance assessments would replicate many adult-
oriented work environments and include authentic 21st century tasks. Multi-user virtual
environments offer promising possibilities for assessing 21st century skills and may be
cost effective as well as allowing tracking of interaction and collaboration, but they have
not yet been scaled for use with large populations outside of the gaming industry (Silva,
2008).
Performance assessments across areas that are construct-referenced to the same 21st century skills may be considered a good form of measurement for these abilities, because the tests focus on the construct being measured rather than only on the specific domain of knowledge and skills supporting it, when constructs fall across domains (Messick, 1984).
Some achievement constructs measure declarative and procedural knowledge,
with a student score showing their status in that domain (Baker, 2008). In contrast, a
performance assessment would be more likely to include opportunities for students to
demonstrate strategic and schematic knowledge as well as declarative and procedural
skills. An achievement construct of cognitive ability often represents a domain of
complex tasks, referred to as fluid, developing or learned abilities; these cognitive
abilities involve contextualized mental models and complex performance with multiple
ways to be represented (Haladyna & Downing, 2004).
Barriers to the Implementation of Performance Assessments
Implementing performance assessments can require significant investments in
financial expenditures, timeframes for administration and evaluation, coordination of
organizational processes for administration and scoring, and the training of staff system
wide (Baker, 2008; Linn, 2008). Traditional multiple-choice assessments are conveniently
uniform and scored automatically; they have been reported as costing about $1-10 per
student in 2003, compared to performance assessments such as the College and Work
Readiness Assessment (CWRA) at over $40 per student plus an estimated $8,000 in staff
training per student enrolled (Silva, 2008).
Designing performance assessments is at times more complicated than designing
multiple-choice measures, although it should be kept in mind that the quality control, item
bank development and psychometric processes involved with selected-response measures
can be quite expensive as well. Performance assessment calls for task creation, including
alignment of complex tasks to standards, along with decisions about scoring options, the
design of scorable products, and the creation of a system for scoring accuracy; together
these require intensive, coordinated development over time, as such systems are often not
yet readily in place (Pecheone & Kahl, 2010). Other barriers include reliability, equating,
and uniform scaling, which are associated with a reduced emphasis on simplified total-score
quantitative outcomes and require more time-consuming scale-oriented validation
(Balistreri et al., 2011; Silva, 2008).
The next chapter introduces the methodology for this study. It describes a series
of phases of discovery and analysis intended to address the research questions introduced
in this chapter.
CHAPTER II
METHODS
This study of student performance on a computer-based assessment of digital
literacy had six phases of research that inter-relate or build upon each other; Table 6
outlines the phases and their relationship with the research questions in Chapter I. Each
phase has specific iterative processes that will be discussed in detail in ensuing sections.
Table 6
Phases of Research

Research Question | Phase | Purpose
RQ1a | Phase 1: Review and initial coding of student work samples | Develop a taxonomy reflecting student work patterns seen through an initial review of student work
RQ1a | Phase 2: Develop initial rubric | Structure the taxonomy into a scoring rubric
RQ1a, RQ2 | Phase 3: Assess student work and evaluate rubric | Explore inter-rater use of the rubric and teacher reflection regarding the rubric
RQ1b, RQ2 | Phase 4: Examine in-depth categorical patterns and trends in student work | Explore in-depth attributes of collaboration and non-collaboration displayed in student work, using the rubric and scored work to explore trends
RQ3 | Phase 5: Examine skill areas for instructional design | Identify sub-skills through Phases 1-4 that could be instructed to improve student collaboration skills on task
Not associated with a RQ | Phase 6: Explore professional development needs | From an instructional leadership stance, explore implications of identified sub-skills relative to teacher professional development needs for supporting 21st century skill development in this area
Design
A descriptive, cross-case analysis design that integrated mixed methods was used
to evaluate student performance to address the research questions, through the six phases.
Greene, Kreider, and Mayer (2005) described mixed methods as, “…approaches to social
inquiry [involving] the planned use of two or more different kinds of data gathering and
analysis techniques, and more rarely different kinds of inquiry designs within the same
study or project” (p. 274).
The basic assumption of mixed designs is that there are multiple legitimate
approaches to research in the social sciences, and that the complex diversity of using
multiple lenses to examine research questions can offer a deeper understanding. Greene
(2007) identified five purposes for mixing methods—triangulation, complementarity,
development, initiation, and expansion. Triangulation, for instance, can increase
reliability and validity, serve to control bias, and offer multiple perceptions about a single
reality (Golafshani, 2003).
Caracelli and Greene (1997) describe mixed methods research designs as falling
into two categories: component designs or integrated designs. Component designs are
methodologically discrete, where the methodologies are not mixed but combined only at
the level of interpretation; integrated designs integrate the methods and elements of the
different paradigms.
An integrated design is used here, extracting both qualitative and
some quantitative information from the same work products. For this project, analysis of
digital notebooks for Research Question 1 — which addresses whether the use of the
artifact Arctic Trek collaborative Notebook falls into distinct patterns — involved
qualitative methods through the Body of Work method (see below). Once trends were
identified, Research Question 2 — the relationship of Notebook patterns with age —
involved quantitative reasoning through frequency counts, summary numbers and
displays grouping rubric-related traits into results based on the age variable. Cognitive
Task Analysis and Backward Design were employed to examine skill areas for
instructional design to address Research Question 3.
The qualitative methodology to address the first research question is explored first
in the next section. Qualitative approaches can be useful when the research seeks to
describe an aspect of the participant or participant’s work, and generate information
about participants’ traits, experiences, attitudes or beliefs. Qualitative methods are
sometimes favored over quantitative methods when relevant variables have yet to be
identified and an exploratory phase of research is undertaken (Marshall & Rossman,
1995). Qualitative analysis involves interplay between the data and developing
conceptualizations, from which theory may emerge. The researcher suspends tacit theory
and keeps focus questions as broad as possible so that data can be examined with an
openness that allows themes to emerge that may not have been previously conceptualized
by the researcher. Savenye and Robinson (1995) suggest the use of qualitative methods
for researching educational technology because such methods can examine what occurs as
students use a new technology.
Cross-Case Analysis
This study used case-oriented strategies for qualitative cross-case analysis, with
each Notebook treated as a case. Case-oriented strategies for cross-case analysis can be
suited to answering the research questions about the identification of categorical patterns
of skill development. Here patterns will focus on possible types of performance in
collaboration in a digital environment among student groups.
Overview of methodology for case-oriented strategies. In general, Cross-Case
Analysis is a method used to support qualitative research in complex settings through the
examination of events, traits or processes across a number of cases in order to increase
generalizability through seeing “…processes and outcomes across many cases, to
understand how they are qualified by local conditions and thus to develop more
sophisticated descriptions and more powerful explanations” (Miles & Huberman, 1994,
p.172). Cross-case analysis allows a researcher to discern whether the findings within a
specific case make sense beyond that case, and examine similarities and differences
across cases to discover a broader understanding of or explanation for the phenomena
observed and the conditions that may support the phenomena.
Qualitative cross-case analysis can be either variable-oriented or case-oriented.
This study uses case orientation for the qualitative phase, followed by a variable
orientation to order the cases in the more quantitative phase. This is followed by phases
of examining the evidence for instructional design and instructional leadership
implications, based on the trends identified for the data set. It should be noted this is a
limited data set and a small number of “case” related digital artifacts, thus the trend
findings are exploratory and intended to lead to further studies in this emerging area of
21st century skill development.
The case-oriented approach used here is iterative and requires first examining
each case as a unit, looking at arrangements and relationships within the case before
investigating similarities across cases. The Notebooks were analyzed using two different
case-oriented strategies: synthesis of multiple exemplars, and forming clustered types or
families. Miles and Huberman (1994) describe synthesis of multiple exemplars as
collecting multiple cases reflecting a phenomenon, in this instance a computer-supported
collaboration task performed by same-aged student groups, and examining the cases for
essential elements, which are then used to illustrate the generalized findings across the
group of cases.
Identifying commonality across cases involves looking for patterns of the
phenomena among cases and then sorting the cases into groups sharing displayed patterns
or configurations of the phenomena (Miles & Huberman, 1994), which may then be
considered representative types for classification. Cases can be sorted by type, or ordered
by the presence of an element. As the cases in this study were all structured similarly and
clearly bounded by the parameters of the assessment task, as described in upcoming
sections, they seemed well suited to case-oriented cross-case analysis through the
identification of types, especially if the typology would then allow exploration for
possible approaches to instructional intervention and professional development through
connections with the research literature.
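To make this sorting step concrete, the following sketch illustrates one way cases could be grouped into types by the configuration of elements they display. The case labels, element names, and grouping rule are hypothetical illustrations and are not drawn from the study data.

# Hypothetical sketch: grouping cases (Notebooks) into types by the
# configuration of elements each case displays (illustrative values only).
from collections import defaultdict

cases = {
    "Notebook_A": {"introductions", "role_assignment", "answer_sharing"},
    "Notebook_B": {"introductions", "role_assignment", "answer_sharing"},
    "Notebook_C": {"answer_sharing"},
    "Notebook_D": set(),
}

# Cases sharing an identical configuration of elements form one type.
types = defaultdict(list)
for case, elements in cases.items():
    types[frozenset(elements)].append(case)

for config, members in types.items():
    label = ", ".join(sorted(config)) or "no elements displayed"
    print(f"Type [{label}]: {members}")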
Sample
The sample for this study included students of ages 11, 13, and 15 years from
schools in three different countries, the United States, Singapore and Australia. Teachers
were recruited by ministries of education and National Project Managers in each country,
with human subjects permissions as specified within each country. Data sets were
provided de-identified for secondary data analysis. The IRB approval by the University
of Oregon for secondary data analysis of de-identified data specified the use of code numbers
for students, without names or a key. Participation was voluntary for both students and
teachers.
Sampling Procedure
Schools were identified by their national ministries or departments of education
through professional education networks in the participating countries. The samples were
not representative country samples and were not intended to reflect an indicator of
national performance. Rather, the intent of the data collection was to see demonstrations
of performance to interpret how the instruments performed and what overall patterns
might be seen. Ministries of education selected schools or districts that
were deemed capable of meeting the technical requirements listed in the task
descriptions, such as sufficient computers and Internet capacity. ATC21S also requested
that countries recruit a range of students expected to be lower performing, middle-range
performing and higher performing in digital literacy, to offer a range of results.
Student Characteristics
Thirty-three notebooks (cases) were provided in the data set analyzed here, with
approximately 100 students using the notebooks. Descriptions of the sample appear in the
Results chapter. Students were grouped by age into teams of three or four
persons for the Arctic Trek assessment, which will be described in detail below. Teams
of four were specified for the task but classroom configurations influenced the creation of
smaller teams of three in some schools. The work was synchronous, with each student
working from a different computer station but with teammates working together across
multiple computers at the same time. Students were teamed with others from within their
same classroom or school site for this task.
Sample Assessment Frame: ATC21S Cognitive Labs and Pilot Trials
The Arctic Trek Assessment trials were designed to measure ICT literacy and
collaborative problem solving, with the use of social networking media to collaborate in a
problem space. The full Arctic Trek task, only one part of which is analyzed here,
consisted of several work products generated by students over a 45-minute period, one of
which was the Notebook product described below, which is the focus of this dissertation
research.
The full task, including the Notebook as only one part, was designed along with two
other 21st century assessment scenarios to assess the development of skills from two of
the ATC21S framework areas:
• Ways of Working: communication and collaboration (or teamwork).
• Tools for Working: information literacy and ICT literacy
Overall, the ATC21S model groups ten 21st century skills into four areas: ways of
thinking, ways of working, tools for working, and living in the world. Two of these areas
that are incorporated in the Arctic Trek task are described in more detail below.
Tools for Working
As described by ATC21S, Tools for Working within the Notebook task
encompass two separate but related skills areas and roles or manners of functioning in
digital environments: Consumer and Producer. These functional roles mimic authentic
roles for career- and college-readiness, in the workplace, and in digital environments in
society at large.
Consumer. Functioning as a Consumer involves obtaining, managing and
utilizing information or knowledge from shared digital resources and experts.
Producer. Functioning as a Producer involves creating, developing, and
organizing information or knowledge in order to contribute to shared digital resources.
Functioning as a Consumer of digital resources builds to functioning as a Producer of the
digital resources.
Ways of Working
The ATC21S project defines the 21st century area Ways of Working involved in
the Notebook task as the skill sets of Social Capital and Intellectual Capital. These skill
areas include participating, contributing and eventually initiating and taking a leadership
role in facilitating social networks, as well as working collectively with others to create
shared intellectual knowledge.
Social Capital. Developing and sustaining Social Capital (SC) in this context
involves using, developing, moderating, leading and brokering the connections within
and between social groups in order to facilitate collaborative action for learning.
Intellectual Capital. Developing and sustaining Intellectual Capital (IC) through
social networks in this context involves understanding how tools, media and social
networks operate; and using these tools, techniques and resources to build collective
intelligence and integrate new insights into personal understandings. Intellectual Capital
is a culminating construct that reflects the use of skills from the Consumer, Producer and
Social Capital constructs.
Multiple Opportunities to Demonstrate Skills
The intention of the full set of three ICT literacy demonstration tasks for ATC21S
was to embed skills described above into three different learning contexts: a math/science
context through the Arctic Trek task, an English language arts task through a digital
graphic organizer literature analysis task, and a second language acquisition task through
a chat tool with students co-constructing knowledge in a second language.
Methodologically, the full set of tasks, which are beyond the scope of this
dissertation, build an argument for validity and reliability through a sampling design of
student performance over the three contexts, and students are given opportunities to
participate in multiple groups or teams. Together, the three scenarios and multiple team
placements are intended to paint a picture of student proficiency as learners in digital
communities (Wilson et al., 2012).
The other two contexts are out of the scope of this dissertation, as are all the other
work products and student assessments embedded in the Arctic Trek task. Therefore this
dissertation focuses only on the collaborative notebook work product in the Arctic Trek
task, which is insufficient on its own for inferences about individual student work.
However it should be noted that the full assessment process itself ranged for students
over several scenarios and team placements. In this way the assessments are a form of
“saturation evaluation,” where the intention of each additional task and team placement is
to “saturate” the information available on the individual student as a learner in digital
networks, looking for the replication of patterns typical for that student in digital
interactions over multiple contexts and teams.
This would be similar to a student using integrated technology over multiple
courses or periods of the school day, or across several subject matter areas, none of which
focused exclusively on learning about technology but all of which may have a signature
of learning with technology.
ATC21S Scenario 2: “Global Collaboration Contest: Arctic Trek”
This section describes the full Arctic Trek task, to give the reader sufficient
information to understand the role of the Notebook artifact, which will be explored in the
Results chapter.
The full Arctic Trek is a 45-minute computer-based performance assessment,
administered to students in a team format. Students engage in an interactive web
search/web quest exercise of seeking information, or “information foraging” online, to
solve clues and answer questions in order to demonstrate their ability with technology
and collaboration. The information foraging activities draw on a set of real scientific
documents from a science expedition to the polar region (http://polarhusky.com, 2005).
The background on the development of “Global Collaboration Contest: Arctic
Trek” assessment scenario is discussed by Wilson and Scalise (2012), contributors to the
assessment design:
For the ATC21S project, the Berkeley Evaluation and Assessment
Research (BEAR) Center at UC Berkeley and the University of Oregon
developed three scenarios in which to place tasks and questions that could
be used as items to indicate where a student might be placed in the
collaborative digital literacy construct. Each scenario was designed to
address more than one strand, but there were different emphases in how
the strands are represented among the scenarios. Where possible, we
took advantage of existing web-based tools for instructional development.
The Arctic Trek task is described below.
Arctic Trek. One potential mechanism for the assessment of
student ability in the learning network aspect of ICT literacy is to model
assessment practice through a set of exemplary classroom materials. The
module that has been developed is based on the Go North/Polar Husky
information website (www.polarhusky.com) run by the University of
Minnesota. The Go North website is an online adventure learning project
based around arctic environmental expeditions. The website is a learning
hub with a broad range of information and many different mechanisms to
support networking with students, teachers and experts. ICT literacy
resources developed relating to this module focus mainly on the
Functioning as a Consumer in Networks strand. The tour through the site
for the ATC21S demonstration scenario is conceived as a "collaboration
contest," or virtual treasure hunt. The Arctic Trek scenario views social
networks through ICT as an aggregation of different tools, resources and
people that together build community in areas of interest. In this task,
students in small teams ponder tools and approaches to unravel clues
through the Go North site, via touring scientific and mathematics
expeditions of actual scientists. The task helps model for teachers how to
integrate technology across different subjects. It also shows how the Go
North site focuses on space to represent oneself, and can be combined
with tools that utilize texting, chat and dialogue as forms of ICT literacy.
(Wilson & Scalise, 2012, p. 5)
The goal of the assessment was for students to work together with their team in a contest-
like format, searching to find the answers to all of the clues they encountered on the
“journey through the Arctic” with the Notebook being the collaborative workspace.
Skills used in the Arctic Trek assessment context included basic math and science
skills such as reading graphs and charts; performing simple calculations and map reading;
and reading comprehension skills for analysis of content. Recall that a broad range of
such skills were intended to be embedded over the three ATC21S scenarios, as a
sampling design for the context of the assessment to represent typical situations
encountered in schools. The actual constructs of interest for measurement were such
skills as computer use including online tools; web navigation; and collaboration within a
range of such digital environments, or in other words, digital skill and collaboration
emplaced in a range of knowledge-rich and team settings.
When each student logged into their computer for the Arctic Trek task, they were
assigned a number (ex: 144), and the only means of communication available to
collaboratively solve the clues and enter their team response information was through the
Notebook to be examined here, which was a shared document. It is described fully in the
section “Shared Document Notebook” below, with example Notebooks included in
Appendices H and I.
In order to access the collaborative tools, students had to find the link to the
shared document Notebook, and enter the document space to share the information they
found, seek or receive help, and to decide on team answers. Student performance in all
these activities was tracked. The opening screen of Arctic Trek is shown in Figure 2.
When students entered the Arctic Trek, after having an opportunity to set up their
Notebook, they had an opportunity to work with their team to assign roles and tasks by
sorting task and role cards on the screen through mouse manipulation, as shown in the
screen shot shown in Figure 3.
Figure 2. Opening Screen Shot for Arctic Trek Assessment.
Figure 3. Screen Shot from Arctic Trek Assessment: Assigning Roles and Tasks.
Students were to use the Notebook to decide which team members should be Persons 1,
2, 3 and 4, and which role and tasks would be assigned to whom. Then, as students
carried out their tasks, they were to report to each other through the Notebook, compare
answers, seek help, or track group progress.
Each clue had an associated web page made available in the task, with assorted
links that might lead to further information to assist students in solving the questions
associated with the clue. The clues often involved reading background information
related to the clue, or doing interactive exercises to help answer the clue.
The following page of the assessment, as shown in Figure 4, illustrates the prompt
for Clue 2 and scaffolding for the information search. The prompt reads “The first
sentence of the clue helps you select a webpage from the list at the right. Which page is
about what land animals eat? Click on that link and search for the answer to that
question.” Here, the student is prompted to choose the link titled “Land Animal Food”
and would read further information at the site to help them answer the clue. They could
check their ideas or share their information with their teammates through the team
Notebook; this screen shot also shows the Notebook open in the window behind the clue,
with student use of the Notebook. Note that the screenshot is in low resolution as an
actual screen image of student work from the trials, where screen recording was
occurring throughout the task.
Figure 4. Screen Shot of Clue Two for the Arctic Trek Assessment. This student
view shows the structure of a clue with prompts below, and live links on the right.
As students navigated through the clues on their Arctic journey, they performed a
variety of math/science tasks. Answering one clue includes an exercise that involves
transferring polar bear cub and mother population ratio data from a graph to a probability
spinner, as shown in the screen shot below. The student working in the screen shot in
Figure 5 has their Notebook open in the window behind the spinner window, perhaps
ready to report the colors and names they used to make their spinner. This is an
assessment of creating a digital tool.
Figure 5. Screen Shot from Arctic Trek: Polar Bear Population Probability Task.
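As a concrete illustration of the arithmetic underlying this clue, the brief sketch below converts hypothetical counts read from a graph into proportional spinner sectors. The counts are invented for illustration and are not taken from the assessment materials.

# Hypothetical sketch: converting counts read from a graph into probability
# spinner sectors, sized in degrees (counts are illustrative only).
counts = {"mother_bears": 20, "cubs": 30}      # assumed values read from a graph
total = sum(counts.values())

for group, n in counts.items():
    probability = n / total                     # relative frequency
    degrees = 360 * probability                 # sector size on the spinner
    print(f"{group}: p = {probability:.2f}, sector = {degrees:.0f} degrees")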
Another clue involves interpreting polar bear population data and manipulating an
interactive graph to create a line for the graph that best matches the data. Students were
to answer two questions regarding the process they used and the product they created in
this exercise, as shown in the screen shot in Figure 6.
Figure 6. Screen Shot from Arctic Trek: Creating a Population Graph
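For readers unfamiliar with the idea of a line that best matches the data, the following sketch fits a least-squares line to invented population points. The values, and the use of a least-squares fit, are illustrative assumptions rather than a description of how the task was scored.

# Hypothetical sketch: a least-squares line through invented population data.
import numpy as np

years = np.array([2000, 2001, 2002, 2003, 2004])
population = np.array([1800, 1750, 1725, 1690, 1660])    # illustrative values

slope, intercept = np.polyfit(years, population, deg=1)  # linear (degree-1) fit
print(f"best-fit line: population = {slope:.1f} * year + {intercept:.1f}")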
For the full Arctic Trek task, Web 2.0 tools utilized in the assessment included
additional tools besides the Notebook, which was a Google document. These other tools
are outside the scope of this dissertation.
Notebook Specifics
The shared document in the assessment was called a Notebook and was a Google
document set up for student use. As described previously, the purpose of the Notebook
was to provide a collaborative space for students to identify themselves; organize their
work processes by giving them a space to collaborate on choosing roles, assigning tasks,
and tracking progress; sharing content material and resources; and negotiating clue
answers so the team members could discern accurate responses to prompts and each have
the information to enter in their separate answer spaces within the assessment.
The student work product “Notebook” was accessed by students from an opening
page that prompted them to access the Notebook to “share ideas and coordinate using
your team Notebook” (see the screen shot featured in Figure 7).
Figure 7. Screen Shot from Arctic Trek: Notebook Link
Once students opened their Notebook, they were prompted by the Notebook instructions shown in the
screen shot in Figure 8:
Figure 8. Notebook instructions from Arctic Trek assessment Notebook. This is what the
team sees when they open their Notebook. Each Team has a number and a secret code.
Students used their Notebook to report on the clues they answered, dialogue about
possible answers, or ask for help, among other discourse. An example of a Notebook in
progress is shown in Figure 9. The text students have written reads as follows:
Ok so im waiting for the map to load wt about you guys? me two it taking to long to
load to hard to find im looking at all the stuff and its hard to look. a lot of stuff to look at
I know and ive been looking at the same thing over and over again but it says nothing
about white bears let me know if you find the answer i don’t know the answer ether and
going to keep reading
Found it it was the Laptev Sea and the next was Arctic Fox
My answer was 5 colors. where did u get tht answer from never mind I know how u got
that answer jaime here is anybody else there
You need help how do you rate ur team
So you put the rate first and then you go up and put the other answer after
i don’t get it it is really hard to do this this is jose
jaime do you get it this is jose ya wat do you need help on
ay do get what are tring to do here
waagan in onwho is jamie that’s not how u spell my name wagaan is the way jose
spelled it
Dominic is Anonymous user 1881
Yea so any way we need to fin this what slide you guys on and 8118 stop fooling around
were nt fooling around whoever this is
Figure 9. Screen Shot from Arctic Trek: Notebook Use During Assessment.
In the example shown in Figure 9, the Notebook is open in one window and a
population graph task is open behind. The student writing in the Notebook is asking if
the prior contributor needs help. Students could go back and forth between the Notebook
and the various assessment tasks, with the Notebook being the continuous connection
between group members. Two samples of the Notebook product are appended to this
paper: one high-scoring sample and one lower-scoring sample (see Appendices H and I).
Further discussion of student work in the Notebook appears in the Results
chapter. Examples are given here only for clarity regarding the task methodology.
Assessment Administration Instructions
Administration of the assessments was conducted by classroom teachers. They
engaged in advance in a training session provided by their national project
managers regarding the assessment purpose and process of the full range of tasks, which
is beyond the scope of this dissertation. A standardized assessment delivery booklet, the
Pilot Test Administration Booklet, was provided to the teachers, a section of which is
shown below, and teachers were guided through instructions on how to use it.
As with most assessments, teachers were present primarily to proctor and not
provide content or process support. Teachers were instructed not to give help
immediately even if the students had difficulty accessing the shared document or links as
these were part of the digital literacy assessment. However the administration
instructions did allow teachers to intervene and assist a student if the student had
exhausted the three available resources for students (see the instructions excerpted from the
Pilot Test Administration Booklet in Figure 10). In this case, teachers had a pop-
up screen available for each student where any assistance provided could be described
and included in the assessment record. Teacher assistance provided was then included in
the assessment evidence collected for the student. The full Pilot Test Booklet with
assessment delivery instructions is shown in Appendix L.
In about 5 MINUTES, give students "ASK THREE THEN ME" directions. Every student
is expected to explore three sources of information before asking instructor or test
administrator help. These three are: (1) task directions and resources on each screen, (2)
questions online of team members to get and give help, and (3) access internet for
information PRIOR to requesting help. Instructor help is to be RARELY given (see
below for instructions on how), and students are to explore and do their best with the
information and team members available. Instruct students that collaborating and using
the Internet is expected and is NOT cheating for this assessment.
SAY:
“I will provide you with ASK THREE THEN ME directions. Every student is
expected to use three sources of information before asking for help. First, you
are expected to use task directions and resources on each screen. Second, work
with your team members to get and give help. Third, use the internet for
information. PLEASE KEEP IN MIND THAT THIS IS NOT CHEATING.
Otherwise, you should explore the tasks and do the best you can with the
information and team members provided. You are being assessed on YOUR
ABILITY to work with tools and people online.”
Figure 10. Sample Assessment Instructions from Arctic Trek Assessment.
Implementation of Phases
Each of the six phases of the dissertation analysis represents categorical
components of the active investigative process associated with the research questions.
The phases are iterative, as the data are analyzed in multiple ways and stages to address
the research questions. The phases are described here, and summarized in Table 10.
Phase 1: Review and Coding of Student Work Samples
Phase 1 of the research in this dissertation project involved a concentrated
qualitative analysis of Notebook cases utilizing a Body of Work method and Discourse
Analysis, explained below, in order to address Research Question 1a. These approaches
examined the data and exposed as many as possible of the components of student ability
and behavior displayed through the work product examined.
Body of work method. The Body of Work method was introduced in research
for the purpose of setting performance standards on complex assessments such as
constructed response, work samples, portfolios and combination formats (Cizek &
Bunch, 2007). The Body of Work method has been used extensively by state
departments of education to develop rubrics and set performance standards. The full
method involves range finding, pinpointing important patterns and trends, and analysis
with logistic regression. This study used the method iteratively as the qualitative
technique for examination of the Notebooks, and employed the range finding and
pinpointing processes.
Using the Body of Work method, the Notebooks were arranged from low to high
for display of collaboration leading to task completion, and coded for where they fell in a
proficiency category. Components of student behaviors generated in the body of work
through display in the samples were listed on a working checklist along with elements
that were expected to be present as per student instructions for the assessment task. The
checklist was used as a means to track elements displayed both within and across
Notebook samples, and used as one basis for generating the rubric. Student behaviors
listed included both structural and content-based behaviors, with structural referring to such
items as introductions, role and task assignment, or visual organization of the Notebook
space, and content-based referring to sharing information to help answer clues, reporting
progress, seeking help, or evaluating a team member's answers or ideas. Results are
discussed in the next chapter.
Discourse analysis. In order to determine the types of student abilities displayed
in collaboration, it was necessary to understand the content that students created in the
Notebook product, with the ultimate goal of analyzing their content for collaborative
elements. Discourse analysis as a process looks at all discourse acts and makes no
preconceived judgment about the value. Once discourse acts are coded, then the process
is to review coded elements to look for categorical patterns, response or thread
development, and develop a picture of the group discursive interaction.
Content analysis often involves collecting qualitative data about levels of
participation as well as uncovering the variance among groups and situations so as to
solidify instructional and programmatic practices to (a) enhance virtual collaboration as a
tool for education, innovation and problem solving; and (b) better understand the psycho-
social processes occurring in the problem solving space or with joint construction of
knowledge.
Communication can be categorized as reactive, when responses occur in separate
episodes but do not build on previous messages; or reciprocal, when messages co-occur
across episodes and do build on previous messages (Strijbos et al., 2004a). Content
analysis of discourse has been defined as including, but not limited to, the following
categories displayed in Table 7.
Table 7
Discourse Analysis Sample Array of Categories

Categories | Study
Proposals or bids; questioning; building common ground; maintaining a joint problem space; establishing intersubjective meanings; positioning actors in roles; and constructing knowledge collaboratively to solve problems together | Stahl (2009)
New facts; students’ own experiences and opinions; theoretical ideas; explication and evaluation | Schellens & Valcke (2005)
Theory: theory, new point or question, experience, suggestion, and comments; Discussions: higher-level, progressive and lower level | Jarvela & Hakkinen (2002)
Number of members; density and intensity; responsiveness; and attentiveness of members | Fahy (2001)
Planning, technical, social, nonsense or unrelated | Veerman & Veldhuis-Diermanse (2001)
Affective; interactive; and cohesive | Rourke, Anderson, Garrison, & Archer (1999)
Participative; social; interactive; cognitive and metacognitive | Henri (1992)
Other methods or categories used for analyzing discourse include processes of
knowledge construction and interactional dynamics through the study of purpose of
discourse. Gunawardena, Lowe and Anderson (1997) used a grounded theory approach
to develop a scheme for content analysis involving five phases of knowledge
construction:
1. Sharing or comparing information
2. Dissonance or inconsistency
3. Negotiating agreements or co-construction
4. Testing tentative constructions
5. Statement or application of newly constructed knowledge
Ideally groups will attain the higher phases of communication. A meta-analysis of
discourse using Gunawardena’s content analysis approach showed that Phases 1 and 3 of
knowledge construction tend to be dominant, while Phases 2, 4, and 5 of knowledge
construction occurred less often in the discourse (Schellens & Valcke, 2005).
Evans, Feenstra, Ryon, and McNeill (2011) have created a multimodal analysis
with a theoretical framework to study interactional dynamics using co-references to track
focus, dominance and coalition building. References are categorized using three levels of
discourse: object-, para-, and meta-level co-references. They look for co-referential
chains, or topics, to emerge. The unit of analysis is at the level of the utterance. They note
periods of high productivity and look for evident patterns of leadership, power, experience or
confidence.
Researchers have used different units for analysis of discourse including
messages, threads of joined messages, paragraphs, or utterances, with messages being
perhaps the most widely used (Schellens & Valcke, 2005; Strijbos et al., 2006). Coding
strategies vary with the unit and the theoretical orientation. With message units, compound
sentences can be divided by content; that level of analysis affects the
number of units coded and may lead to unit overlap. Problems with coding
messages in complete sentence form as a single unit of analysis include that some
students will submit two messages within one sentence, such as a bid for new action and
a reaction to a previous bid or comment from another student (Schellens & Valcke,
2005).
In this study, the unit of discourse analysis was fragment or utterance, and
included any discourse act, even non-verbal acts such as a lone exclamation point or
emoticon. A response to an utterance was coded as a single reply, or as a
thread if there were ensuing co-references to a topic. Some utterances were non-verbal,
such as emoticons, but were expressive nonetheless and were thus retained in the discourse
analysis. All fragments or utterances were coded, whether they were on task or off task.
See sample coding on Notebooks 2 and 8 in Appendix M, which will be discussed in
detail in the next chapter.
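A minimal sketch of how coded utterances of this kind could be represented is given below. The author labels, utterance text, codes, and the rule for flagging a thread are hypothetical illustrations of the unit of analysis described above, not the actual coding scheme used in this study.

# Hypothetical sketch: representing coded discourse units (utterances) and
# distinguishing a single response from a thread of co-references.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Utterance:
    author: str
    text: str
    code: str              # e.g. "seek_help", "share_answer", "off_task"
    topic: Optional[str]   # co-reference target, if any

utterances = [
    Utterance("1440", "what slide are you guys on?", "track_progress", "progress"),
    Utterance("1881", "im on clue 2", "report_progress", "progress"),
    Utterance("1440", ":)", "social_nonverbal", None),
    Utterance("1881", "still stuck on clue 2", "seek_help", "progress"),
]

# A topic with two or more follow-up co-references is treated here as a thread.
replies = sum(1 for u in utterances[1:] if u.topic == "progress")
print("thread" if replies >= 2 else "single response")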
The combined methods of Body of Work, Discourse Analysis and multiple
exemplars for cross-case analysis provided the techniques of data reduction from the data
set to use for exploration and categorization of elements. Traits were coded and a
checklist was developed of traits and qualities displayed to track occurrences within and
across cases to quantify how often traits, qualities or type of discourse appears throughout
total work samples. The checklist is described in the Results chapter, as it was a result or
outcome of the methodology here. A frequency count helped to describe the amount of
evidence per code or trait, and helped to assess commonality of displayed traits across
cases. Traits were analyzed and defined for the purpose of creating categorical groupings
in Phase 1 of this study.
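The frequency count itself is a simple tally within and across cases; a hypothetical sketch is shown below. The trait codes and notebook contents are invented for illustration only.

# Hypothetical sketch: tallying coded traits within and across Notebook cases.
from collections import Counter

coded_notebooks = {   # case -> list of trait codes observed (illustrative only)
    "Notebook_1": ["introductions", "share_answer", "share_answer", "seek_help"],
    "Notebook_2": ["share_answer", "off_task"],
}

within_case = {case: Counter(codes) for case, codes in coded_notebooks.items()}
across_cases = Counter(code for codes in coded_notebooks.values() for code in codes)

print(within_case)    # amount of evidence per trait within each case
print(across_cases)   # commonality of traits across the full sample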
Phase 2: Rubric Development
The coded traits, qualities and behaviors developed through the Body of Work,
discourse analysis, and cross-case analysis in Phase 1 were used in the development of a
rubric that was intended to capture student ability in computer-supported collaboration
such as this. This effort utilized iterative review and revision of the initial rubric. The
rubric developed through this process was used to address Research Questions 1a and 1b.
Emerging rubric development. A second stage of the Body of Work method
was used to separate coded Checklist traits designated explicitly as assessment tasks in
Arctic Trek instructions (ex: “choose a role”) from other traits or qualities identified in
discourse analysis and cross-case analysis. A trait category was developed from the
constellation of displayed activity that supported or facilitated performing the explicit
assessment tasks.
Expert review. An educator from the field and the author used an emerging Six
Traits Rubric with two student work samples to determine the adequacy of capturing
student ability in collaboration in a digital environment. Adjustments were made to
better reflect coded traits discovered through qualitative analysis.
Range finding. Student work sample Notebooks were sorted by level of
assessment task completion, and the range finding of evidence of assessment task
completion was initially established. Checklist qualities were assessed against task
completion on each student work sample; qualities that did not appear related to task
completion were left on the checklist and not added to the rubric.
Second-Stage rubric development. The results of the above iterations of Phase
Two were used to identify six traits that represent components of the assessment task and
four skill levels of displayed evidence. Task definition originated with the ATC21S
Arctic Trek performance assessment authors, using 21st Century skills frameworks
researched from the field. The initial Six Trait Digital Collaboration Rubric, described in
the Results section as an outcome of the work, was developed for use.
Expert review. The initial Six Traits rubric was reviewed by an educator in the
field and used by the author and educator in separate assessment of student work samples
utilizing this initial version of the rubric.
Third-Stage rubric development. Following the expert review, the rubric was
revised to become more sensitive to domains of collaborative work in order to better
capture student ability in collaboration in a digital environment. The resulting revised
rubric split the six traits into two dimensions: collaborative processes and collaborative
products, and was finally named the 3+3 Six Traits Digital Collaboration rubric, hereafter
referred to as the 3+3 Digital rubric. See Appendix P for the technical,
inter-rater, and initial utility studies for this final rubric. That appendix also includes
descriptive statistics and displays used to compare inter-rater scoring by group and
trait.
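As one simple example of the kind of inter-rater comparison such studies can include, the sketch below computes exact percent agreement between two raters on a single trait. The scores are invented, and the inter-rater analyses reported in Appendix P may rely on different statistics.

# Hypothetical sketch: exact percent agreement between two raters on one trait.
rater_a = [2, 3, 1, 4, 2, 3]   # illustrative trait scores for six notebooks
rater_b = [2, 3, 2, 4, 2, 3]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
print(f"exact agreement: {matches / len(rater_a):.0%}")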
Phase 3: Scoring of Student Work Through Within-Case Analysis
This is the specific Phase where student work was scored in order to address RQ 1
and 2, leading to discoveries pertinent to RQ3. Scoring of the data set of 33 Notebooks
occurred following the development of the final 3+3 Digital rubric. The rubric was used
to score student sample Notebooks and determine a proficiency level per case, or student
collaborative group. The scoring yielded information to address Research Questions 1
and 2. The relative rubric responsiveness to student work samples was also evaluated in
this scoring stage. Descriptive statistics and displays were prepared in this phase for the
work sample.
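A minimal sketch of this descriptive step, grouping rubric scores by the age variable, is shown below. The ages and scores are hypothetical and serve only to illustrate the kind of summary prepared, not the study's results.

# Hypothetical sketch: summarizing total rubric scores by age group.
from statistics import mean

cases = [                 # (age group, total rubric score) -- illustrative only
    (11, 8), (11, 10), (13, 12), (15, 14), (15, 15), (15, 11),
]

for age in sorted({age for age, _ in cases}):
    scores = [score for a, score in cases if a == age]
    print(f"age {age}: n = {len(scores)}, mean score = {mean(scores):.1f}")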
Phase 4: Examination of Trends and Patterns Through Cross-Case Analysis
In this phase the final step of the Notebook interpretation for Research Question
1b and Research Question 2 was to create a typology of patterns of skill development that
match construct definitions, if possible, or to better explicate the construct if the data
support an alternate view of student use patterns and trends. Using information coded
directly from student work sample Notebooks, scored on the rubric, and quantified with a
coding checklist, trends in student use of the shared documents Notebook were explored
for organized patterns of student use, and any pattern sub-categories that may describe
patterns of use in greater detail.
Student Notebooks were reviewed using Body of Work and Cross-case analysis
of scores and traits displayed to determine broader group patterns of Notebook use.
Diagnostic analysis of individual group Notebooks for a deeper analysis of group patterns
of sub-skills and behaviors provided information to be used in a cross-case analysis of
these sub-skills and behaviors to discern if behaviors may contribute to the broader group
patterns of displayed collaborative skill. As suggested by the literature, notice was taken
both of what patterns and behaviors were present as well as what was absent, and patterns
discovered are discussed in Chapter IV.
Phase 5: Examination of Skill Areas for Instructional Design
The Body of Work method, Cognitive Task Analysis, and Backward Design
principles were used to analyze the assessment findings to categorize skills per domain to
develop instructional categories that might aid in planning and resource allocation for
instruction in digital collaboration.
Cognitive task analysis. CTA is described by Stanton (2006) as a set of methods
to identify the cognitive skills needed to perform a task efficiently, through the breakdown
and study of individual elements of the task. Steps identified in CTA include a)
determine task-specific processes; b) identify a strategy for performing the task; c) check
the model against a set of representative task instances to assess performance on the task;
and d) evaluate the model. Kieras and Meyer (2000) explain that identifying a task strategy is
an intuitive process when a system has yet to be developed; the predicted
strategy is then tested by the success of performance when the model developed for that
strategy is used. The human factor confounds cognitive task analysis: there may be
more than one task strategy that facilitates task production, or people may follow
alternative task strategies that were not predicted, with productive or non-productive
outcomes.
Hierarchical Task Analysis (HTA) consists of a system of goals and sub-goals,
with goal-directed behavior involving the use of sub-goals related to an overall plan or
strategy with sub-operations in the hierarchy to achieve the overarching goal or task
(Stanton, 2006). The basic method for application of HTA is described by Stanton as: a)
define the purpose of the analysis; b) define the boundaries of the system description; c)
gather information about the system to be analyzed from a variety of sources; d) describe
the system goals and sub-goals; e) link goals to sub-goals and describe conditions where
sub-goals are triggered; f) verify analysis with subject matter experts; and g) be prepared
to revise the analysis.
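To illustrate what such a goal hierarchy might look like for the digital collaboration task studied here, the nested structure below is offered as a hypothetical sketch; the goals and sub-goals are assumptions made for illustration, not the hierarchy produced in this study.

# Hypothetical sketch: a goal/sub-goal hierarchy for collaborating in the
# shared Notebook, expressed as nested dictionaries (illustrative only).
hta = {
    "goal": "Produce agreed team answers in the Notebook",
    "sub_goals": [
        {"goal": "Set up the team",
         "sub_goals": [{"goal": "Introduce self"}, {"goal": "Assign roles and tasks"}]},
        {"goal": "Gather information",
         "sub_goals": [{"goal": "Open clue links"}, {"goal": "Read and extract facts"}]},
        {"goal": "Negotiate answers",
         "sub_goals": [{"goal": "Share findings"}, {"goal": "Resolve disagreements"}]},
    ],
}

def print_goals(node, depth=0):
    # Walk the hierarchy and print each goal indented under its parent.
    print("  " * depth + node["goal"])
    for child in node.get("sub_goals", []):
        print_goals(child, depth + 1)

print_goals(hta)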
HTA can be useful to critically assess aspects of a defined type of work, or to
clarify aspects of training or procedures (Militello & Hutton, 1998). HTA helps
instructional designers understand the nature of the domain, scope and organization of the
work; perform error analysis and prediction; identify performance standards and
conditions; and assess the presence of environmental or situational task stresses (Stanton,
2006).
Applied Cognitive Task Analysis (ACTA) is an approach useful for systems and
instructional design that involves creating a task diagram, a knowledge audit, and a
simulation interview (Militello & Hutton, 1998). The task diagram is a broad overview
of the task that identifies difficult cognitive elements, and breaks the task into sub-tasks.
The knowledge audit seeks to describe the skills of an expert at the task in order to
describe appropriate examples and to bring forward knowledge categories that
characterize expertise, and may involve components related to task performance such as
situational awareness, prediction, diagnosis, cuing, metacognition including self
monitoring, recognizing anomalies, and improvising. The simulation interview with
Subject Matter Experts (SME) analyzes performance within a contextualized scenario
and addresses items like events, actions, assessment, critical cues, and potential errors to
uncover potential novice versus expert comprehension and decision-making differences.
In this study, the task of collaboration in a digital environment was analyzed
consistent with CTA, for procedural knowledge and the production rules required for task
performance. Concepts from HTA were employed with respect to the subsystems of
collaboration and ICT use, as well as the sub-operations implicit to the operational goals
in each system. For example, communication of content to team members is necessary to
work towards a collaborative team response. This also aided in determining an error
taxonomy framed as skill deficits. ACTA practices such as the task diagram and
knowledge audit helped to clarify levels of performance ranging from novice (Emerging
on Digital Collaboration Rubric) to expert (Capable distinction on the rubric). The
ACTA model simulation offered a contextualized manner with which to walk through the
analysis of the overall constellation of tasks and sub-tasks.
Backward design. Rather than deciding what to teach and giving an assessment
at the end of the unit to see what students learned, Backward Design principles start with
deciding what students should learn, and how that would be measured or assessed, before
going through the process of deciding how and what to teach to help students achieve the
stated desired learning outcome. Wiggins and McTighe (2001) describe backwards
design as a “purposeful task analysis” that calls for outlining goals for assessment
evidence of learning outcomes before planning the instruction towards those outcomes, in
order to create more clearly defined teaching and learning targets (p. 8).
Wiggins and McTighe (2001) frame the design process as a multi-phased process
across three stages. The phases include addressing the key design question for each
stage; followed by design considerations and design criteria in order to ascertain what the
final design accomplishes; with these phases applied to each of the three stages. The first
stage seeks to determine what is worth learning and what is required of understanding;
the second stage asks what is evidence of that understanding; and the third stage
determines what learning experiences and instruction will promote understanding,
interest and excellence with respect to the learning goals.
Following this design process should result in a coherent instructional sequence
that will promote targeted teaching and learning towards explicit conceptual
understanding with the acquisition of essential enabling knowledge and skills, as
evidenced by a continuum of valid and reliable assessments. Student needs also shape
instructional design, and student ability in prerequisite knowledge and skills must be
identified. Childre, Sands, and Tanner Pope (2009) suggest identifying both classroom
and individual learner needs as important steps in differentiating instruction and
incorporating the accommodations into the backward design process. It is important to
analyze multiple sources of data for both student needs and evidence of desired results, to
determine appropriate action plans.
The operationalized skills and sub-skills, and the behaviors displayed by students
in the Notebooks were examined using CTA, HTA and ACTA and categorized according
to general content domain. Expected tasks as defined by the constructs in the assessment
were also categorized by general domain. Domains were considered regarding the basic
prerequisite enabling skills necessary for being operational in that digital collaboration,
consistent with Backwards Design principles (Wiggins & McTighe, 2001). The resulting
described domain areas and composite skills are listed in Chapter IV and discussed
within the frame of Instructional Design that can support the inclusion of digital
collaboration and the many sub-skills in the K-12 curriculum.
It should be noted here again that the digital notebook work product as a single
artifact is expected only to begin to explore some of the intellectual demands of such new
collaborative tasks online. It was hoped that, as teachers are widely encouraged to
include such approaches in their instruction through technology integration, the
work product samples would help to shed some light on the sub-skills that might be
supported with instructional interventions. However, it is acknowledged here and
discussed in the limitations section that this is only a small part of the necessary research
work in this area, although it is hopefully illuminating across the cases available.
Phase 6: Investigate Potential Professional Development Strategies
Phase 6 explores the leadership aspect of Instructional Design, addressing what
practices of professional development might help teachers act on the findings from Phase
5. For Phase 6, which again involved only a small exploratory sample, eight educators
serving as raters for technical studies in this project (see Appendix P) also were given an
exploratory survey about their training in the different sub-skill domains involved in
collaborative learning: technology, cooperative learning and social-emotional learning,
and their use of these areas in the classroom.
The survey, shown in Appendix N, inquires about the type of professional
development experience they received, such as pre-service, in-service, or training
educators sought on their own. It also asked for information about use of technology and
collaboration in the classroom and district.
In addition, comments made by educators were gathered from discussion and
correspondence during the inter-rater moderation sessions (see Appendix P), and
categorized by type of support needed.
The survey results and comments from educators were analyzed to discover
common experiences and to generate information regarding task-specific educator needs
towards implementing digital collaboration in their classrooms. The results of the survey
and educator feedback are described in Chapter IV.
Analysis by Phase
The analysis for this study was largely iterative and comprised many separate
analyses, with much of the analysis per phase affecting the direction, development and
then analysis of subsequent phases. Analyses are associated with the corresponding
numbered Phases of research that are described in detail in the above sections. The
analyses are listed by Phase and Research Question with a brief description of the
function or purpose in Table 8.
Table 8
Analyses by Phase and Research Question

Research Question | Phase | Analysis Methods | Purpose
RQ1a | I | Body of Work, Discourse Analysis, and case-oriented Cross-case analysis | Examine Notebooks to perform qualitative data reduction. Iterative data management activities: code, compare, aggregate, contrast, sort, and order data; look for patterns, links and relationships.
RQ1a | II | Body of Work | Sort Notebooks by level of assessment task completion, and assess coded checklist qualities against task completion on each work sample.
RQ1a | IIIa | Score with 3+3 Digital rubric; Cross-case analysis | Have a uniform score to use in comparing Notebooks; test use of the rubric; evaluate scoring differences on the first two iterations of the rubric.
RQ2 | IIIb | Descriptive statistics | Determine the existence, strength and direction of the relationships between student age and Notebook patterns.
RQ1b | IV | Body of Work method and case-oriented Cross-case analysis | Diagnostically examine individual group Notebooks to determine categorical patterns and trends in collaborative skills, sub-skills, and behaviors that may contribute to group patterns of Notebook use.
RQ3 | V | Cognitive Task Analysis and Backward Design | Categorize sub-skills for collaboration in a digital environment; determine the composite skills and domains necessary to plan instruction.
No RQ associated | VI | Survey, qualitative feedback | Needs assessment for professional development.
CHAPTER III
RESULTS
As described in Chapter II, the results of this study are related to six phases of
research that inter-relate and build upon each other. Results from each Phase will be
discussed in turn in this chapter.
Case Characteristics
Table 9 outlines the sample Notebook cases by age group and country. Note that
demographic data other than age and country are restricted to country-level use and were
not available in this dataset. What immediately becomes apparent in Table 9 is that the
33 cases available in this data set are strongly skewed, with the majority consisting of
age 15 team notebooks from the U.S. Numbers of notebooks available at other ages and
from other countries are limited in this sample.
Currently, larger field trials are taking place in these countries, with a study design
that will provide a more fully representative sample. However, for the purposes of this
exploratory dissertation, which draws on the currently available pilot study notebooks,
the sample is more limited; this will be discussed in more detail in upcoming sections. As
a descriptive cross-case study of a small number of cases, the sample of 33 cases described
here is too small for full representation across the multiple age groups and countries.
Full representation is not the goal of the study; however, it should be noted that, due to
the sample characteristics, more of the information captured in the patterns identified
through the upcoming phases will represent age 15 student work in the U.S. Exploration
of a more fully representative sample will be discussed in Chapter IV, through the
implications for future work.
Table 9
Case Characteristics

Total number of cases: age 11 team products, N = 10; age 13 team products, N = 5; age 15 team products, N = 18.
USA: age 11, n = 7; age 13, n = 4; age 15, n = 15.
Singapore: age 11, n = 2; age 13, n = 1; age 15, n = 2.
Australia: age 11, n = 1; age 13, n = 0; age 15, n = 1.
Results of Phase 1: Review and Data Coding of Student Work Samples
Phase 1 of the research involved the review and coding of student work samples
to develop a taxonomy reflecting the scope of student work, such that the collaborative
skills displayed in the student work samples could be broadly interpreted. See Table 10
for an outline of the data coding process.
Discourse Analysis
In order to determine the level of student ability in collaboration, it was necessary
to understand the content that students created and the possible purpose of the discussion
they generated, with the ultimate goal of analyzing their content for collaborative
elements. Discourse analysis as a process looks at all discourse acts and makes no
preconceived judgment about their value. Once discourse acts are coded, the process is to
review the coded elements to look for categorical patterns and response or thread
development, and to develop a picture of the group's discursive interaction.
Unit of discourse analysis. The unit of discourse analysis was the fragment or
utterance and included any discourse act, even non-verbal acts such as a lone
exclamation point or emoticon. A response to an utterance was coded as a response if
there was just one reply, or as a thread if there were ensuing co-references to a topic.
Some utterances were non-verbal, such as emoticons, but were expressive nonetheless. All
fragments or utterances were coded, whether they were on or off task. See sample coding
on Notebooks 8 and 2 in Appendix N.
Table 10
Phase 1 Processes and Outcomes

Phase 1.0, RQ1a. Method: Body of Work method, qualitative research. Process: Examine student work samples. Results/Outcome: Develop familiarity with range of ability and elements displayed.
Phase 1.2, RQ1a. Method: Discourse analysis. Process: Code discourse in student work samples for type and purpose. Results/Outcome: Generate list of traits and qualities represented in student work.
RQ1a. Process: Quantify how often traits, qualities, or types of discourse appear throughout the total work samples. Results/Outcome: Frequency count for amount of evidence per code or trait to assess commonality of displayed traits.
Phase 1.3, RQ1a. Method: Define and categorize coded elements. Process: Analyze traits for creating categories. Results/Outcome: Develop checklist of traits and qualities displayed.
Coded checklist of discourse categories. On the basis of the coded student
work, an initial task-based Rubric and a coding Checklist were developed to quantify use
of categorical discourse and discern general patterns of discourse content or purpose.
The initial patterns for category findings from the within-case study were as follows:
Identification. The participant says who they are, and may declare an
identifying color or font. The assumed purpose is to facilitate discussion via social norms.
An example of identification from the Notebook samples is “Dominic is anonymous user
1882.”
Role assignment. This is defined as claiming or assigning stated assessment roles
such as captain, recorder, scout, decoder. The assumed purpose is to fulfill assessment
task directions and facilitate task completion. Examples of role assignment from the
Notebooks are “I will be person 1” and “Yu Hao be recorder” or “I want to be decoder.”
Task assignment. This is defined as claiming or assigning tasks such as finding
clues. The assumed purpose is to fulfill assessment task directions and facilitate task
completion. Examples of this from the Notebooks include “We need to decide who will
do what task” and “i would like to do the coloring task.”
Report content. This is defined as stating what information or answers
participants found while working on the task. The assumed purpose is to fulfill
assessment task directions and facilitate task completion. Examples of this from
Notebooks include the entries “They use red for declining populations” and “Page 8
answer is artic fox.”
Seek help. This is defined as asking for assistance. The assumed purpose is to
facilitate task completion. Examples for the Notebooks include “Please help me I am
stuck on page 9” and “hey, anyone, clue 1 practice?”
Give support. This is defined as providing help or assistance. The assumed
purpose is to facilitate task completion. An example of this from the Notebooks is “Click
on the toolbar that has the search thingy and type in what you want to find. JingYing.”
Direct process. This is defined as providing unsolicited direction for task
orientation to facilitate task completion. An example from the Notebooks is “hey!!!state
name n say something!! We not commounicating at all!!”
Clarify process. This is defined as correcting or redirecting processes for the task.
The assumed purpose is to facilitate task completion. An example from the Notebooks is
“We have to make sure we don’t end up with the same task.”
Time management. This is defined as awareness of time constraints, and
conserving or planning effort in regard to that awareness. The
purpose is to facilitate task completion. Examples from the Notebooks include “Quick!”
and “Can we begin now? It took too much time for us to begin.”
Goal setting. This is defined as deciding on a task or benchmark to achieve as an
end point. The purpose is to facilitate task completion. Examples from the Notebooks are “we
ALL HAVE TO DO TWO CARDS!!!!!!” and “role recorder, work on clue one—fly over
map—card 2 and 5.”
Develop threads of discourse. A thread is defined as two or more replies to the
same initial utterance, such that successive utterance replies co-reference the initial
utterance. An example of this from Notebook 11 is shown in Figure 11.
Figure 11. Notebook sample illustrating thread of discourse between students.
Social discourse. This is defined as discursive acts that are social in nature,
typically used to connect with others, such as saying “hello”, or show respect for others
through using culturally appropriate discursive manners, such as prefacing a request with
“please” or following compliance with “thank you”. The assumed purpose of social
discourse is to promote a sense of congeniality or collegiality among group members as a
way of developing group cohesion, possibly towards task completion. Examples of
social discourse from the Notebooks include “hi”; “thank you sir”; “lol”; “Sup”; “pls
reply”; “@_@”, and other social text speak.
Text speak and emoticons. This is defined as text shorthand and character use as
discursive acts. The assumed purpose is to establish and maintain a social-emotional
connection through text-speak or emoticons, perhaps to establish social comfort in lieu of
face-to-face contact. Non-sample high school students assisted in evaluating these
symbols regarding current use meanings, and the high school student coding was fact-
checked on text speak sites such as NetLingo and Talktalk, among other sites.
The most common socially related text speak content displayed in the student
work samples were facial representations, such as the wide-eyed, shocked, freaked-out,
and questioning faces. The next most commonly symbolized expression was the smile,
in various formats, such as squinty-eyed, and big grin. These symbols were perhaps used
to establish a friendly connection so as not to appear too business-like in a remote-located
collaborative situation where the social mood or connection and transmittal of emotions
cannot be established by body language.
Role conflict or confusion. This is defined as discursive acts relating to
confusion or disagreement over who will take what role. The assumed purpose is to
clarify or change role assignment or the process of role assignment. Two examples of
this from the Notebooks are illustrated in Figure 12.
Example two:
Figure 12. Two Notebook samples display student role conflict or confusion.
Task conflict or confusion. This is defined as discursive acts relating to
confusion or disagreement over who will work on what task. The assumed purpose is to
clarify or change task assignment or the process of task assignment. Examples of this
from the Notebooks are “Hillary: I cant answer 5 and 6 until you have solved 3 + 4 Ale,
what should I do in the meantime?” and “Can I pick what I want to do?”
Affective statements. This is defined as discursive acts that only relate emotion.
The assumed purpose is to express emotion or possibly to connect with the group socially
or seek emotional support. While emotion may be implied within discursive acts that
contain other categorical content, they are not coded as affective statements. An example
of this would be “how do you set this up? don’t get it!!!” as the “don’t get it!!!” with
three exclamation points could imply frustration, which is affective. However, as it is not
overtly affective, it was coded as Seeking Help. In another example “WHOS ERASING
IT??????????????????????????” implies anger, but again, as it was not overtly affective it
was coded as task direction. Examples of affective statements include “IM confused,”
“this is damn difficult,” and “this is making me …crazy!!” Confusion was the
most commonly expressed affect.
Off-task behaviors or topics. Off-task behaviors or topics are defined as any
discursive act that is not related to the process or product of the task, and has not been
coded differently. For example, this category does not include social discourse, which
can facilitate task through group cohesion, nor does it include affective statements, or
role and task conflict or confusion. Off-task topics or behavior may have the possible
purpose of sabotaging task completion, but likely just reflect student boredom with or
frustration by the assessment task, or some other issues not related to either the group or
the task, including issues that are totally unrelated to the school environment. Examples
of off-task discursive acts from the Notebooks include “;llo0ouio90” and
“wghdfdetbjghftdrtfgdchkahfauygivsawwer” or “i very cold.”
Visual organization. This is defined as any attempt at visual adjustment to the
Notebook pages, such as the use of color or font for participant identification, outline
formats, threads of discourse separated by space as “chunks,” numbers, underlining or
other visual formats. These are embedded in discursive acts and not typically overtly
discursive on their own. The assumed purpose is to provide an organization to the digital
environment that facilitates collaboration, with an overarching purpose of task completion.
This example reflects one of the few overtly discursive acts of visual organization,
although her act is also tied to Identification: “I’m Jenny :) this colour.” The example in
Figure 13 is from Notebook 32:
Figure 13. Notebook sample displaying visual organization.
Total entries. This is a count of the total number of discursive acts in a given
Notebook. They were counted to assess whether the number of entries had any
relationship with task completion.
Total questions. This is a count of the total number of discursive acts that could
be coded as questions in a given Notebook. They were counted to assess whether the
number of questions had any relationship to task completion.
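To make the coding and frequency-count step concrete, the short Python sketch below illustrates how coded discourse acts of the kinds defined above could be tallied per Notebook. The category labels come from the checklist described in this section, while the data structures, function name, and example entries are illustrative assumptions rather than the actual coding apparatus used in this study.

from collections import Counter

# Checklist categories described in this section (names only; the structure is illustrative).
CHECKLIST_CATEGORIES = [
    "identification", "role_assignment", "task_assignment", "report_content",
    "seek_help", "give_support", "direct_process", "clarify_process",
    "time_management", "goal_setting", "social_discourse", "text_speak",
    "role_conflict", "task_conflict", "affective_statement", "off_task",
    "visual_organization",
]

# Each coded discourse act: (notebook_id, utterance, category). Entries are examples only.
coded_acts = [
    (8, "Dominic is anonymous user 1882.", "identification"),
    (8, "I will be person 1", "role_assignment"),
    (2, "Please help me I am stuck on page 9", "seek_help"),
    (2, ":)", "text_speak"),
]
assert all(category in CHECKLIST_CATEGORIES for _, _, category in coded_acts)

def tally_by_notebook(acts):
    """Count how often each checklist category appears in each Notebook."""
    tallies = {}
    for notebook_id, _utterance, category in acts:
        tallies.setdefault(notebook_id, Counter())[category] += 1
    return tallies

for notebook_id, counts in tally_by_notebook(coded_acts).items():
    total_entries = sum(counts.values())  # analogous to the "total entries" count above
    print(notebook_id, total_entries, dict(counts))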
Separation of checklist and rubric traits. Categorical codes that were
considered to be specified by the assessment were put on an initial Rubric (see Phase 2
below), and the remaining content categories coded through discourse analysis were
placed on a Checklist to be used for further qualitative analysis of student work after
assessing for task ability. The final Six Traits Checklist of coded behaviors that were not
assessment-based Traits is located in Appendix E.
Results of Phase 2: Rubric Development
Phase 2 reviewed student work samples to define commonalities as coded traits
and qualities leading to the development of an initial rubric. See Table 11 for explication
of this iterative process for initial development of the rubric.
Table 11
Phase 2 Processes and Outcomes

Phase 2.0, RQ1a. Method: Rubric development. Process: Separate coded Checklist traits designated explicitly as assessment tasks in Arctic Trek instructions (e.g., “choose a role”). Results/Outcome: Develop trait categories from the constellation of displayed activity that supported or facilitated performing assessment tasks.
Phase 2.1, RQ1a. Method: Expert review. Process: Another educator and I used Rubric v.1 with two student work samples. Results/Outcome: The trait “Sharing or Checking Progress” does not adequately capture student discourse to support tasks; replace with Interactive Regulated Learning.
Phase 2.2, RQ1a. Method: Body of Work method. Process: Sort student work sample Notebooks by level of assessment task completion. Results/Outcome: Find range of evidence of assessment task completion.
Phase 2.2, RQ1a. Process: Assess checklist qualities against task completion on each student work sample. Results/Outcome: Some qualities do not appear related to task completion, so these are left on the Checklist and not added into the Rubric (e.g., number of entries in a Notebook).
Phase 2.3, RQ1a. Method: Rubric development. Process: Use the above to choose six traits that represent components of the assessment task and four skill levels of displayed evidence. Results/Outcome: Six Trait Computer-Supported Collaborative Learning (CSCL) Rubric developed for use.
Traits
Results from the discourse analysis were used as grounding traits for the Rubric.
In the task itself, students needed to choose roles; assign responsibilities; seek and offer
help; investigate various clues; and report answers to their teammates to co-construct
knowledge and create a shared team answer.
These specific tasks were stated as Traits 1, 2, 3, 4, 5 and 6; the six traits and their
defining characteristics are listed below.
• Trait 1: Identification & Role Assignment: Participants identify themselves
and take roles.
• Trait 2: Task Assignment: Who is responsible for what tasks?
• Trait 3: Interactive Regulated Learning: Evidence of seeking or offering help;
reporting progress; clarifying process; self or group evaluation; time
management; task orientation; goal setting; mediation; or appreciation.
• Trait 4: Sharing Content and Resources: Resources, answers or responses to
tasks are posted.
• Trait 5: Collaboration: Participants add to, evaluate or offer an alternative
response to the shared content.
• Trait 6: Co-Construction of Knowledge: Participants use shared and
evaluated content to construct final answers or responses or complete a task.
Expert review. An educator from the field and I independently rated two
Notebooks to see how the Rubric would capture the discursive elements displayed in
work samples. The initial six-trait rubric had a problematic trait identified in this review.
The Trait “Sharing or Checking Progress” did not adequately capture student discourse to
support tasks. It was coded as a Trait due to its centrality in many work samples, yet
many higher scoring groups did not engage in overt or visible progress reports or checks,
and their lack of demonstrating this trait led to a lower overall score, although they
completed a high quality product. Initial review from the field reflected my discomfort
with how this trait fit the construct of collaboration. Educators in the field review agreed
that the trait Reporting Progress was problematic in that it did not reflect or capture the
range of discourse regarding group work behaviors.
There were a variety of coded categories on student work that did not fall under
“Sharing or Checking Progress” such as Seeking Help, Providing Support, Directing or
Clarifying Process, or Task Orientation, Goal setting, Time Management, or Mediating
Conflict. I decided that these behaviors did facilitate task accomplishment and should be
reflected on the rubric, even if there was variation in how often or in what
combinations they occurred. This led to the development of the Trait 3 Interactive
Regulated Learning. Interactive Regulated Learning is a more comprehensive trait and is
better able to capture group-oriented metacognitive behaviors than the previous trait
narrowly defined around explicit reporting or checking on progress.
Interactive regulated learning. Interactive Regulated Learning is self-regulated
learning with a group focus, such that the processes are evidenced individually and/or
mediated collectively through group communication, participation, or facilitation of those
processes within the group. This revision of Trait 3 was developed based on the coding
and content categorizing from the discourse analysis in Phase 1. A large number of
discourse fragments and threads identified were devoted to seemingly isolated traits, yet
most fell into the category of self-regulated learning.
Self-regulated learning. Self-regulated learning is a social cognitive construct
defined as a constellation of metacognitive, behavioral and motivational strategies or
processes that allow or support a learner to mediate his or her learning (Zimmerman,
2000). Specific behaviors involved include goal setting, self-monitoring, self-evaluation,
task orientation, time management, and help seeking. Goal setting can be defined as
setting specific outcomes including both performance and process outcomes
(Zimmerman, 2000). Self-monitoring is defined as directing one’s attention to one’s
own learning processes with an eye towards directing efforts to task and evaluating
progress in the effort (Dabbagh & Kinsantas, 2005). Task orientation is attention to task,
with the use of strategies, tools and processes, including organization and planning, that
the learner believes will enable task accomplishment. Time management supports the
other self-regulatory processes and help seeking is a self-regulatory process whereby the
learner has used self-evaluation skills to identify when he or she needs assistance to
complete a task.
Group mediation of self-regulated learning skills. As this assessment task took
place in a collaborative group environment, the self-regulatory processes were often
directed at others. Examples of this include: “Hurry, find your answers we are running
out of time” and “stop doen that dang lets do our work! okay number five.” Other
examples were in support of others “uhhhh 4 is supposed to be u just assign it to any one
ok?” and “ya wat do you need help on?” Still other examples facilitated the process of
others such as “How many colors do you see team?” I decided to call Trait 3 Interactive
Regulated Learning because the group was engaged in self-regulated learning processes
together; individual processes were connected to those of the other group members.
There is only circumstantial evidence that one group member’s overt explicit
demonstration of self-regulated learning could promote that in another group member,
but nevertheless as the discourse was often reciprocal, I thought this Trait reflected group
mediation.
Preliminary Work Sample Assessment and Range Finding
Using the initial revised Rubric and the Checklist, student work sample
Notebooks were scored and sorted by level of assessment task completion. The goal was
to find a range of evidence of assessment task completion to help describe the scoring
categories. The categories that emerged were Non-collaborative, Emerging, Developing,
and Capable. The majority of student work samples reflected the Non-collaborative
category due to non-use of the Notebook. Of the groups that did access and choose to use
the Notebooks, most were in the Emerging to Developing categories.
Preliminary Work Sample Assessment for Significant Traits
As the main purpose of the Rubric is to assess student ability to perform a
collaborative group task, specifically the Arctic Trek task, I wanted traits appearing on
the Rubric to be significant, meaning tied to task completion. In order to assess the initial
revised Rubric for relevancy to the task, I assessed checklist qualities, that is, elements
coded from student work that were not initially assigned to the Rubric, against task
completion on each student work sample. Some qualities did not appear related to task
completion, for example the number of entries or questions in a Notebook, so these were
left on the Checklist and not added into the Rubric.
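The relationship between checklist qualities and task completion was judged qualitatively in this study. Purely as an illustration of how such a check could be made quantitative, the hypothetical Python sketch below computes a rank correlation between one checklist count and a task-completion level; the variable names and values are invented for the example and are not data from this study.

from scipy.stats import spearmanr

# Hypothetical values: a checklist count per Notebook and a task-completion level (0 = none ... 3 = complete).
total_entries = [0, 3, 12, 25, 40, 7, 18, 2]
completion_level = [0, 1, 1, 2, 3, 2, 3, 0]

rho, p_value = spearmanr(total_entries, completion_level)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")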
Six Traits Digital Collaboration Rubric
The initial Rubric, the Six Traits Digital Collaboration Rubric, had a format
similar to well-established curricular rubrics that are familiar to teachers in most content
areas. Traits are described with a guiding question and four levels of proficiency
evidenced by student work examples. The Six Traits Digital Collaboration Rubric is
shown in Appendix D. This rubric was used to initially score the student work samples,
and subsequently underwent further revision after examining the level at which the
Rubric captured student ability.
Initial Assessment of Student Work, Using the Six Traits Rubric
Using the first iteration of the rubric, the Six-Traits Digital Collaboration Rubric,
the mean score was 5.4 and the median score was 4, from a total possible of 18 across the
0-3 score levels of the six traits. Eleven groups out of the thirty-three total scored at 50%
or better on the rubric, and five scored at 67% or better. Seven groups did not use the
document at all and scored a zero, and four additional groups used the Notebook very
sparsely, leading to a score of one; together these groups accounted for one-third of the sample.
See scores from evaluation with the first iteration of the Rubric, the Six Traits Digital
Collaboration Rubric, in Table 12.
Table 12
Overall Notebook Scores for Six Traits Digital Collaboration Rubric

Sample: N = 33. Mean: 5.4. Median: 4. Mode: 0.
Note. The total overall score possible = 18.
Evaluation of Six Traits Digital Collaboration Rubric Use for Student Work
Assessment
Upon analyzing student work samples against scores generated from the Six
Traits Digital Collaboration Rubric, it was noted that similar overall scores on the Rubric
did not always relate to similar quality of collaborative work. For example, using the Six
Traits CSCL Rubric, Notebooks could score 12/18 overall, which is a high score for this
sample group, but not exhibit much beyond Emerging for actually being able to create a
collaborative product or complete the assigned assessment task. The rubric as composed
therefore did not have sufficient sensitivity to capture student skill within this
scoring frame.
Expert review of Six Traits Digital Collaboration Rubric. An educator from
the field used the Six Traits Digital Collaboration Rubric to score two work samples and
independently offered feedback that matched my observation above, that the overall score
on the Rubric didn’t readily tell her, as a teacher, how student skills were distributed with
regard to the structure or function of collaborative work: whether they had basic
collaborative skills that needed harnessing, or whether they could create a product
without substantial evidence of working together on the product. This review confirmed
my observations and led to a redesign of the rubric, as described below.
Rubric revision: 3+3 Six Traits Digital Collaboration Rubric. Based on the
concerns above, the original one-dimensional six-trait rubric was split into two
dimensions to give more specific information on student performance, each with a
component score; see the 3+3 Six Traits Digital Collaboration Rubric in Appendix E.
The 3+3 Six Traits Digital Collaboration Rubric measures:
• Collaborative Learning Processes with Traits 1, 2 and 3
• Collaborative Learning Products with Traits 4, 5, and 6
Each trait carries a possibility of 3 points, for total sub-component scores of 9 each and a
combined potential Digital Collaboration score of 18.
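As a minimal sketch of the scoring arithmetic just described, the following Python fragment rolls hypothetical trait ratings up into the two dimension scores and the combined score; the trait keys, the function, and the example ratings are illustrative assumptions, not scores from the study sample.

# Trait names follow the rubric described above; the function and ratings are illustrative.
PROCESS_TRAITS = ("identification_role", "task_assignment", "interactive_regulated_learning")
PRODUCT_TRAITS = ("sharing_content", "collaboration", "co_construction")

def score_notebook(ratings):
    """ratings maps trait name -> 0..3; returns (process, product, combined)."""
    process = sum(ratings[t] for t in PROCESS_TRAITS)   # Collaborative Learning Processes, out of 9
    product = sum(ratings[t] for t in PRODUCT_TRAITS)   # Collaborative Learning Products, out of 9
    return process, product, process + product          # combined Digital Collaboration score, out of 18

example_ratings = {
    "identification_role": 3, "task_assignment": 2, "interactive_regulated_learning": 2,
    "sharing_content": 2, "collaboration": 1, "co_construction": 0,
}
print(score_notebook(example_ratings))  # (7, 3, 10)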
Becoming a two-dimensional rubric allowed scores to reflect relative strengths
and weaknesses across the dimensions, rather than being averaged together. Aside from
becoming a two-dimensional rubric, there were only minor adjustments made in the
language of the trait descriptions; see Appendix D and E for versions two and three of the
rubric.
The rubric was also adjusted to reduce confusing language. Specifically, in Trait
4, associated with Collaborative Learning Products, “Shared Content” was rewritten to be
“Sharing Content and Resources” in order to clarify that providing resources for group
members to consider may be just as or more useful than simply posting a content-based
“answer” to a task related question.
Results of Phase 3: Assess Student Work and Evaluate Rubric
In this highly iterative phase, student work samples were scored using three
iterations of the developed rubric, each with a total of 18 points possible. First, the initial
Six Traits Digital Collaboration Rubric was used to assess student work samples, and
then the scored work samples were used reflectively to evaluate the rubric. Such
evaluation, along with reflection and expert review, led to the revision of the Rubric to a
two dimensional model, the 3+3 Six Trait Digital Collaboration Rubric, to better capture
student ability in multiple aspects of collaboration. The 3+3 Six Trait Digital
Collaboration Rubric was then employed to assess the student work sample Notebooks
again, performing better than the initial Six Traits Digital Collaboration Rubric in
capturing student abilities. The group scores from this rubric were analyzed descriptively
and the Rubric was used by other raters to assess Notebook samples to evaluate the
Rubric for reliability; see Table 13 for an overview.
Table 13
Phase 3 Processes and Outcomes

Phase 3.0, RQ1a. Method: Assessment of student work using Rubric. Process: Use Rubric version 1 to score student work samples. Results/Outcome: Quantified range of skill displayed in Notebook use.
Method: Evaluation of Rubric use for student work. Process: Analyzed Notebook scores against overall task completion and other sample Notebooks. Results/Outcome: Discovered that rubric scores on process traits generated an overall lower score even when a group created a product.
Phase 3.1. Method: Expert review. Process: Educator feedback regarding Rubric version 1 against three sample Notebooks. Results/Outcome: Educator in the field gave feedback reflecting the above observation.
Phase 3.2. Method: Rubric revision. Process: Restructure Rubric to two dimensions: process and product. Results/Outcome: Developed 3+3 Six Trait Digital Collaboration Rubric.
Phase 3.3, RQ1a. Method: Assessment of student work. Process: Used 3+3 Rubric to score student sample Notebooks. Results/Outcome: Scores on version 2 of Rubric better reflected evidence of student ability.
Phase 3.4. Method: Assessment of student work, descriptive statistics. Process: Comparison of inter-rater scoring by group and trait. Results/Outcome: Scoring trends of sample Notebooks similar between groups of inter-raters, showing rubric coherence.
Phase 3.4, RQ1b, RQ2. Method: Assessment of student work, descriptive statistics. Process: Examine Notebook rubric scores by age. Results/Outcome: Slight age trend between 11- and 15-year-olds with respect to mean score on rubric.
Phase 3.5. Method: Inter-rater reliability. Process: Educators from the field use Rubric to score 8 sample Notebooks in asynchronous format. Results/Outcome: Report Traits 1 and 2 difficult to score due to lack of context for educators and confusion about assessment tasks.
Phase 3.5. Process: Moderated session for educator use of Rubric to score 8 sample Notebooks. Results/Outcome: Similar feedback regarding Trait 1; similar confusion regarding assessment context.
Secondary Assessment of Student Work Using the 3+3 Rubric
Next, a second assessment of student work was completed using the new 3+3
Rubric to score the student sample Notebooks. Scores on version 2 of the rubric better
reflected evidence of student ability. While the overall scores remained basically
unchanged (see Tables 14 and 15), the two-dimensional rubric captured differences in the
types of ability displayed in student Notebook work samples. Scoring work samples with
the second iteration of the rubric, the mean score on the Process dimension was 2.97 and
the median was 3. The mean score on the Products dimension was 2.73 and the median
was 2.5.
The scoring indicates that using collaborative processes was somewhat easier for
students than creating a collaborative product (see Table 14). This confirms some of the
thinking in the ATC21S framework, outside the scope of this dissertation, that suggests
the Producer construct is shifted higher in difficulty for digital collaborative learning as
compared to the Consumer construct.
Table 14
Overall Scores for Process and Product Dimensions on 3+3 Six Traits Digital
Collaboration Rubric

Collaborative Processes: Mean 2.97, Median 3, Mode 0.
Collaborative Products: Mean 2.73, Median 2.5, Mode 0.
Note. Total score per dimension = 9.
Assessment of Student Work Descriptive Statistics
All of the student work sample Notebooks were rated for a final time, using the
3+3 Six Traits Digital Collaboration Rubric. Table 15 displays the total composite scores
of combined Product and Process dimensions for all 33 cases.
Table 15
3+3 Six Traits Digital Collaboration Rubric Combined Total Scores by Group

Score | Frequency | Cumulative frequency
0 | 7 | 7
1 | 4 | 11
2 | 1 | 12
3 | 2 | 14
4 | 2 | 16
5 | 2 | 18
6 | 0 | 18
7 | 2 | 20
8 | 2 | 22
9 | 2 | 24
10 | 2 | 26
11 | 2 | 28
12 | 1 | 29
13 | 0 | 29
14 | 4 | 33
15 | 0 | 33
16 | 0 | 33
17 | 0 | 33
18 | 0 | 33
Total | 33 |
Note. Total possible score = 18; N = 33.
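For readers who wish to produce a display like Figure 14 below, the following is a minimal Python sketch of how a stem-and-leaf plot can be generated from a list of composite scores on the 0-18 scale; the scores shown are hypothetical, not the study data.

from collections import defaultdict

def stem_and_leaf(scores):
    """Group scores by tens digit (stem); the ones digits become the leaves."""
    plot = defaultdict(list)
    for score in sorted(scores):
        plot[score // 10].append(score % 10)
    return dict(sorted(plot.items()))

hypothetical_scores = [0, 0, 1, 3, 4, 5, 7, 8, 9, 10, 11, 12, 14, 14]  # not the study data
for stem, leaves in stem_and_leaf(hypothetical_scores).items():
    print(stem, " ".join(str(leaf) for leaf in leaves))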
The Processes, Products, and Combined stem-and-leaf plots featured in Figure 14
show a breakdown of the scores for the two dimensions and for the composite scores.

Collaborative Learning Processes Stem and Leaf Plot
Stem | Scores out of nine possible points
0 | 0 0 0 0 0 0 0 0 0 1 1 1 2 2 2 3 3 3 3 4 4 4 4 4 5 5 5 5 6 7 7 8 9

Collaborative Learning Products Stem and Leaf Plot
Stem | Scores out of nine possible points
0 | 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 2 2 3 3 3 3 4 5 5 5 5 6 6 7 7 7 7 8

Combined Collaborative Composite Stem and Leaf Plot
Stem | Scores out of 18 possible points
0 | 0 0 0 0 0 0 0 1 1 1 1 2 3 3 4 4 5 5 7 7 8 8 9 9
1 | 0 0 1 1 2 4 4 4 4

Figure 14. Collaborative learning stem and leaf plots. These displays show scores from
the two dimensions and the composite scores.

Scores by Trait
The strongest traits displayed, based on the overall scoring, were identification or
role assignment, interactive regulated learning (asking for and offering help, reporting
progress, clarifying processes, time management, task orientation, goal setting, self or
group evaluation), and sharing content. These are tasks that some students have
experience with or exposure to, although sharing content with other students is not a
norm in some classroom situations unless sharing publicly when answering a question
in class. However, even in these simpler tasks, many cases did not display much mastery
of traits, such as collaboration or even task assignment.
The least typically seen of the six traits rated on the rubric was co-construction of
knowledge. This arguably requires mastery of some of the other traits. Also the task may
not have elicited this behavior sufficiently from students without pre-instruction on how
to co-construct digitally. Therefore it is not surprising that this trait was seen less often.
However, since this is a skill that teachers would like to see happening in the classroom
when undertaking digital collaboration, this finding points to a need for possible
instructional supports, to be addressed in a later research question.
Scores by Group
The scores by case (team or group) were varied, with high, low and mid-range
cases. The two dimensional scoring rubric reflected the relative difficulty of scoring
higher on the Product traits as compared to the Process traits. The two charts in Figure
15 also demonstrate that a group could score high in one area but not the other, such as
Group 5, Group 11 and Group 28.
Scores by Age
The mean and median per age are presented in Table 16. As noted previously, the
numbers of students by age in the sample were uneven, and small for some age groups,
with 10 cases for 11-year-olds, 5 cases for 13-year-olds, and 18 cases for 15-year-olds.
Descriptively, the age 15 students have a slightly higher mean overall and have fewer
notebooks scoring zero. Additionally, as can be seen in the samples in the appendices,
11-year-olds showed less social discourse and were the only group to display role conflict.
However, overall, there was not as dramatic a difference by age as might be expected
based on the literature review in Chapter I. This will be discussed more in Chapter IV.
Figure 15. Collaborative Process and Product Scores by Group.
Inter-Rater Reliability
A major component of the study is the development of the rubric as an assessment
tool for computer supported collaborative learning in a 21st century skills curriculum. As
such, it is important that the rubric be reliable by performing with some degree of
consistency across raters. Eight student Notebook samples were purposively chosen to
be representative of differing student ability as reflected by the scoring of the researcher
from low, medium, and high scoring groups. Professional educators currently practicing
in the field analyzed these eight student samples. The raters provided scores for the eight
groups as well as qualitative feedback on the instrument, the process of inter-rating, and
the assessment process that generated the student work. Of the eight inter-raters, three
were asynchronous and five were part of a moderated group.

Table 16
Notebook Scores by Age Group

All sample ages: N = 33, Mean 5.7, Median 4.5, Mode 0.
11-year-olds: n = 10, Mean 5.7, Median 5.5, Mode 0.
13-year-olds: n = 5, Mean 3.8, Median 0, Mode 0.
15-year-olds: n = 18, Mean 6.2, Median 5, Mode 1 and 14.
Note. Total possible score = 18.
Inter-Rater comparison by group. Group scores were compared between the
different raters. Groups with lower overall scores showed the most consistent ratings,
likely due to the low level of complexity in determining skill when little skill is
evidenced. Groups with higher scores varied as to the cohesiveness of the ratings. One
factor may be the qualitative issues involved such as not recognizing traits due to
misspelling, non-standard English, text speak, lack of clarity about the structure of entries
and so on. Raters reported confusion about role assignments versus task assignment and
stated that they often had difficulty distinguishing between those traits. An inter-rater
sample is shown in Figure 16. It shows a series of four raters (1 rater for each series)
each independently rating a single case (Case 33) across the six traits shown on the
horizontal axis, and at the levels of performance shown on the vertical axis. For this
case, no rater differed by more than one level on any trait rating; many of the trait ratings
were at the same level. This was true of most cases. In only a few situations did a case
generate a trait score differing by more than one level, and even then the outlying rater
was a single rater among the full set of raters, who clumped more closely within one level
in their ratings.
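The rater comparison here was descriptive rather than statistical. As an illustration only, the short Python sketch below shows how exact and within-one-level agreement could be tallied for a single case; the rater labels and ratings are hypothetical, not the study's inter-rater data.

from itertools import combinations

# Hypothetical trait ratings (0-3) for one case, one list per rater across the six traits.
ratings_by_rater = {
    "Rater 1": [3, 2, 2, 2, 1, 0],
    "Rater 2": [3, 2, 1, 2, 1, 0],
    "Rater 3": [3, 3, 2, 2, 1, 1],
    "Rater 4": [3, 2, 2, 1, 1, 0],
}

exact = adjacent = total = 0
for (_, ratings_a), (_, ratings_b) in combinations(ratings_by_rater.items(), 2):
    for rating_a, rating_b in zip(ratings_a, ratings_b):
        total += 1
        exact += rating_a == rating_b
        adjacent += abs(rating_a - rating_b) <= 1   # agreement within one score level
print(f"exact agreement: {exact / total:.0%}, within one level: {adjacent / total:.0%}")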
Figure 16. Asynchronous Inter-Rater Comparisons for Case 33. A series of four raters (1
rater for each series) each independently rating a single case (Case 33) across the six
traits shown on the horizontal axis, and at the levels of performance shown on the vertical
axis.
Descriptively comparing raters across cases and traits, Figure 17 displays the
work of four raters evaluating eight cases on two different traits, Trait 4 of Shared
Content and Trait 6 of Co-construction of Knowledge. It exemplifies how results for a
trait level analysis show that there is some variability between raters on exact point value
assignment to student work samples, but that overall raters share a fair degree of
consistency. The upper graph in Figure 17, Trait 4 is an example of the occasional
outlier, with raters within one score level of each other across the 8 cases shown with the
exception of Case 1, where three raters agree within one score level but one rater differs.
Trait 4: Shared Content
Trait 6: Co-Construction of Knowledge
Figure 17. (top and bottom). Inter-Rater Comparison of Traits 4 and 6.
Overall, Trait 1, identification and role assignment, showed the strongest
agreement among raters. Trait 2, task assignment, had somewhat more of the one-level
variation, probably due to confusion about a distinction between role assignment and task
assignment subsequently mentioned by raters and noted in future refinements for the
rubric. Trait 3, interactive regulated learning is the only trait where the value direction of
the raters is not entirely consistent, meaning scores between raters do not move up and
down together, although in six of the eight groups at least two or three raters agree with
each other. This trait describes group mediated metacognitive skills, many of which
could be expressed or evidenced with just one or two words. Trait 4, shared content, is a
straightforward descriptor, but there is still variance, with the researcher being more sensitive
and with the possibility that non-standard communication by students makes it difficult for
raters to see the evidence of this trait. Trait 5, collaboration, and Trait 6, co-construction
of knowledge, were both dependent on group success with Trait 4, shared content, and
reflect increased depth in content sharing. The further distinction for Trait 6 includes
using collaborative comments to jointly revise and agree upon the response or new ideas
regarding content.
Following the use of the rubric by asynchronous inter-raters, a face-to-face,
moderated group of educators practice-rated using Notebooks 32 and 33. These raters
individually rated the same eight samples the other inter-raters used. The results of the
moderated ratings displayed somewhat improved agreement among raters, with fewer
one-level differences, and more consensus in the rater comments that the rubric criteria
were understood.
Qualitative Review From Educator Raters
Educators serving as inter-raters offered feedback on the student work or student
assessment process, the inter-rater process, and the rubric itself. Regarding student
assessment factors, while raters were given rubric administration instructions that
included an overview of the student assessment, there was still a lack of understanding
regarding how much content the assessment contained, making it difficult for them to rate how
well students addressed it. Another concern was whether roles such as decoder included tasks,
and why the students didn’t have a better grasp of what they were being assessed on.
Educators voiced confusion about the assessment task and how explicitly the students had
been instructed or guided to identify themselves and take a role, as well as either take a
task or assign a task to another. There was confusion over a role not automatically being
connected to a task as is often typical in the field.
One educator pointed out that the directions given the students were three step
directions and that “working with people” was sandwiched between the more familiar
concepts to students of “use tools” and “find answers.” The educator further pointed out
that the students may not have fully understood they were being assessed on their
collaboration skills through the documentation of collaborative activity.
Possible effects of student ability in non-assessed content areas were also
discussed. There was mention that students may have difficulty with written expression,
and being assessed through written documentation of both applied and abstract skills and
content makes it difficult to tell what interaction or interference those elements may have
with collaboration. While for the full task set, there were many opportunities for
expression in other formats, for this particular work artifact, the mode of communication
was written text shared between students.
Possible effects of teacher/rater ability in assessed domains were also discussed by
the educator raters. There was acknowledgement that many teachers have never engaged
in computer-supported collaborative activities themselves, and may also experience very
little face-to-face collaboration, and this could interfere with both teacher abilities to
support these skills in students, and to perform ratings and judgments of student work.
Educators did consider the juxtaposition between finding the correct answers and
working collaboratively, and considered subtle facets of a collaborative environment.
One educator commented, “All answers were completed, but they were not acknowledged
by each other. They completed the task well but seemed to work independently;
therefore, I had a difficult time deciding whether to give a 0 or a 3 for “Trait 6 Co-
construction”. Can absence of a correction be considered agreement?” A wider range
of tasks, exhibited by scoring not a single artifact but more portions of tasks and more
tasks, would give a more complete view of such considerations.
Amid discussion about the distinctions between Trait 5 collaboration and Trait 6
co-construction of knowledge, there was a debate on whether the single assessment work
sample alone provided sufficient opportunity for displaying Trait 6. There was a feeling
among the rater group that because students were instructed to find clues to determine
pre-defined answers, the task did not support enough knowledge creation to show the
reaches of collaboration. While it was true that even on the simpler task few students
showed much mastery of digital collaboration, as one rater remarked, “Creation of
knowledge requires a deep task, with possibilities for synthesis, and such.” One
perception was that students did not have opportunities to co-construct knowledge or
bring in new thought or meaning, therefore the rubric was not able to capture the student
work sample potential to its fullest extent. Another consideration was whether at 45
minutes for the entire task there was enough time to draw out the full co-construction of
knowledge possible.
The asynchronous educator-raters expressed that they would have preferred a
moderated training as is held typically for rubric introduction, such that each member of
the team rates a couple of samples using the rubric, with a discussion to review the scores
given and the rationale, and come to agreement on what the appropriate point assignment
would be per trait. Then raters typically discuss each trait with respect to student samples.
Unfortunately, due to time and location issues, no such training was available for the
asynchronous raters. As one educator pointed out, using rubrics to rate student samples
for assessment is typically a collaborative process itself. This was addressed in the
collaborative moderation conducted subsequently.
Raters who had a moderated session said that the process was familiar to them, as
many had used the Six Traits Writing Rubric. They were surprised that negotiating for
consensus about what student evidence from work samples fit which trait level was a goal
of the moderation. As one teacher remarked, “When using the Writing Rubric it is
acceptable to have 1 degree of difference—not consensus.” Teachers did dissect the
descriptions under each level of each trait, and, with the exception of being bothered by
Trait 1 and temporarily confused by Trait 6, the participants generally agreed that they
understood the process.
Rubric utility. One purpose of this study was to contribute to the guidance of
instructional design and professional development for collaborative learning and problem
solving in K-12 education, as expressed in Research Question 3. The educators thought
that the rubric was useful as a teaching grid for presenting and assessing collaboration in
a digital environment. All raters approached this rubric favorably, noting that they had
not seen any document prior to this rubric that could guide instruction in digital
collaboration or that described skills regarding the use of Google docs, CSCL, or even
collaboration in non-digital environments.
One educator noted that the high schools in her district are slated to begin
teaching digital collaboration using Google docs during the 2012-2013 school year, but
no one has of yet provided any instructional guidance. She commented, “I think this
rubric is the first step in developing interactive group projects.” The rubric could be
generalized to other assignments, such as the Tumalo Community School assignment for
the 4th grade class to decide, in collaborative groups, on behavior guidelines and
strategies using Google docs as a medium. Implications of this will be further discussed
in Chapter IV.
Qualitative Review from Researchers in the Field
The rubric was reviewed by Dr. Gerry Stahl, Associate Professor in the College of
Information Science and Technology at Drexel University, and a widely published
scholar in the field of CSCL. Stahl stated that the rubric, operating within the educational
paradigm of classroom instruction, is “likely to be better comprehended by teachers than
anything he might propose” (Stahl, April 8, 2012, personal communication). He
noted that although he had no prior experience with a classroom rubric such as this,
the rubric did capture many of the elements of collaboration. He was
unsure how much it could capture group cognition, defined as the emergence of ideas
through group discourse such that ideas are built on to produce new knowledge that is co-
constructed by the group process, going beyond the original ideas or beliefs of any
individuals. He was also uncertain that the concept of co-construction was adequately
conveyed to teachers and acknowledged that the language used in the rubric, such as
“sharing content” could be open to various interpretations not consistent with co-
construction of knowledge.
The review from the research perspective illuminates the gap between researchers
and practitioners in the field, whereby constructs are explored in great detail, yet
the results often are not communicated to practitioners at the K-12 level and, when they are, rarely
in a format useful to practice. Conversely, the direct needs of educators in K-12 practice
may not often be the focus of research in the field. More on this will be discussed in
Chapter IV.
Results of Phase 4: Examine Categorical Patterns and Trends in Student Work
A main objective of this project was to examine patterns and trends in student
ability in collaborative work in a digital environment, as outlined in RQ1b. An
examination of displayed student ability in the sample Notebooks by age as evaluated by
the 3+3 Six Traits Digital Collaboration Rubric, and discussed in the Scores by Age section
above, showed some but not substantial differences across the age groups of 11, 13, and 15
year olds. However, there are clear patterns and trends related to Notebook use in this
sample, regardless of age groups. The student work, scored using the 3+3 Six Traits
Digital Collaboration Rubric, reflected a general lack of skill in using collaborative
documents in a digital environment, with most groups placed in the Emerging skill
category. Analysis of the student work discussed in this section looks at the skills
displayed on a diagnostic level. Table 17 outlines the processes used to diagnostically
examine the sample Notebooks for patterns and trends.
The discourse displayed in student Notebook work samples was in general sparse,
with only nine groups having more than 20 entries in their Notebook, where an entry was
counted down to the utterance or emoticon level. The average number of entries for the nine
more heavily used Notebooks was 49.5. Six out of thirty-three groups did not access
their collaborative Notebook at all. Two or three groups appeared to have erased all signs
of their collaborative work, and left just the neatly numbered clues with answers and a list
of group member identification numbers.
Diagnostic Summary of Student Work Samples: Patterns and Trends
Using information coded directly from student work sample Notebooks, scored on
the rubric, and quantified with a coding checklist, trends in student use of the shared-document
Notebooks could be organized into three main patterns of use, with sub-categories that
describe these patterns in greater detail.
Did not use collaborative tool shared document. The groups evidencing this
pattern fall into two categories: either they did not access the document at all, or they
accessed it but abandoned use of the document. Potential reasons that they did not access
it include not having the technology skills to recognize the resource or to know how to
open it. Another potential reason is that they opened it but did not know how to “start”
the use, which could be a technology-based reason or an organizational issue reflecting a
lack of collaborative skills.

Table 17
Phase 4 Processes and Outcomes

Phase 4.0, RQ1b. Method: Body of Work. Process: Review student Notebook use for scores and traits displayed. Results/Outcome: Determine broader group patterns of Notebook use.
RQ1b. Method: Body of Work. Process: Diagnostic analysis of individual group Notebooks. Results/Outcome: Analysis of group patterns for sub-skills and behaviors.
RQ1b. Process: Examination of sub-skills and individual group behaviors. Results/Outcome: Discern behaviors that may contribute to group patterns of Notebook use.
Used the collaborative tool shared document but did not progress with task.
Groups evidencing this pattern of use were able to access and use the shared document on
a technological level, but did not use the tool to complete their assigned task. There
appeared to be several patterns of behavior that led groups to fall into this category,
including lack of group organization; role or task conflict or confusion; off-task behaviors
or topics; lack of visual organization; and poor relational skills.
Lack of group organization. The participants in these groups appeared and
disappeared on the Notebook randomly. Some team members might post that they would
take on a certain task and never appear on the document again to report their outcome.
Someone else might post just a number or utterance, not seemingly connected to any
other post or task. Participants may insert comments at the beginning, in the middle, or at
the end of other existing comments, making it difficult to track group processes.
Lack of visual organization. Visual organization may or may not co-occur with
group organization skills and evidence. Many groups were clearly engaged and posting
ideas, questions, and resources, but the posts and discussion threads were clumped and
intermingled, stream-of-consciousness style, making it hard to discern how threads were
connected, who posted what, and how responses connected to queries. A clear visual
format would likely have facilitated communication processes and task progress by
conserving the energy necessary to wade through unrelated posts to track a thread.
The following excerpt is from one of three pages in a sample Notebook. Out of nine
posts in a clump, four different clues are discussed or queried, along with two process
questions and a prompt for participants to stay with their own identified font color. It
must be noted that the color scheme per participant in the sample below reflects some
advanced skill and thinking regarding visual organization.
Figure 18. Notebook sample displaying trends in visual organization. The sample here
does show emerging use of visual organization with the use of color for participants, but
as described above, the participants did not all stick to a unique color, nor separate
threads regarding differing clues or topics.
Role or task conflict or confusion. This pattern of behavior occurred primarily
among the 11-year-old groups, and is characterized by participants attempting to organize
the group by identifying and taking or assigning roles to group members. In these cases,
groups used most of the document space on the conflict, without a real resolution.
Evidence showed that the topic of who would have what role was very important to some
members, to the extent that they could not engage in the actual task itself.
Poor relational skills. The student work samples did not reflect standard face-to-
face norms regarding relational skills. A typical face-to-face group would include
introductions and negotiation about roles, task assignments and workspaces. This could
be transferred to a digital environment without too much difficulty once an instructor has
planted the idea or formulated the structure. Some groups, comprised mostly of the more
experienced 15 year olds, did display a transfer of relational skills by introducing
themselves and claiming a “workspace” in the form of a font color or the use of initials
and managing a task. That way, they could be tracked by others and responded to
consistently with their posts. The group could track whether posted queries had been
addressed or whether needs were persistent. They could also track who was participating
and in what ways. There was some displayed need to preserve claimed identity when a
participant said to another one who posted in his/her color “get off my colour.”
Off-task behaviors. Some patterns of off-task behaviors or topics occurred in
four groups, or approximately 12 percent of the sample. Off-task behaviors included
digressing into affective topics such as how hard or difficult or frustrating the task was,
but also included “messing around” as evidenced by typing random characters or
engaging in back and forth off task comments such as seen in Notebook 4, shown in
Figure 19.
Figure 19. Notebook sample reflecting off-task behavior.
Used the collaborative tool shared document and progressed with task. This
diagnostic category is difficult to completely assess by the student work samples as some
of the better organized and complete work samples did not reflect stages of collaborative
behavior. While there were vestiges of collaborative behaviors, the higher functioning
groups appeared to “clean up” their shared document so as to present the participants and
the co-constructed or collaborative answers. As one teacher-rater remarked “They didn’t
appear to realize that they were being graded on their collaboration—they were focused
on answer generation and erased their collaborative evidence.” The tracking itself could
be better addressed in future assessments through the structure of the collaborative space,
or the intermediate recording of work products, or both.
Some groups displayed what looked like collaborative activity between two or
more participants, while other participants did not appear to show successful engagement
in the task, thus making the group product less truly collaborative. While collaborative
group members may have had cooperative learning strategies to employ, time constraints
may have inhibited sustained efforts at group organization. One group had a participant
who attempted to coach co-participants through a task, but gave up and responded to
persistent queries for assistance from a group member with “never mind, it takes too
long, I have made an answer for you, its easier” reflecting her frustration with using the
system to help someone locate, access and use a tool in a remote-located situation.
However, this also can indicate less skill development in the purposes and approaches to
collaboration, where building shared understanding has the potential to improve the
individual answer.
Interactive regulated learning and relation to collaboration. One pattern displayed was a relatively high degree of interactive regulated learning (evidence of seeking or offering help, reporting progress, clarifying process, self or group evaluation, time management, task orientation, goal setting, mediation, or appreciation), paired with sharing resources and content, but without progressing to collaboration or co-construction of knowledge.
These groups engaged in the process of collaborative work together but did not harness collective efforts to complete the task. This could have been due to an initial lack of role or task assignment or even of visual structural organization, as many Notebooks displaying group activity in collaborative processes lacked essential organization, such as who was participating and which task each would do. It also could be due to the fact that collaboration and co-construction of knowledge require a) relational skills combined with b) task orientation and c) specific reciprocal interaction. This three-fold skill set may be developmentally challenging for students who have not had explicit instruction and substantial practice in this area.
Collaboration. The traits collaboration and co-construction of knowledge were the least displayed in the student sample shared documents. Only nine out of 33 sample Notebooks had ratings of "developing" skill for collaboration, and only two sample Notebooks had ratings of "capable" for collaboration. Yet the essential question that guides the analysis of the collaboration trait is simply stated: "Did participants add to, evaluate, or offer an alternative response to the shared resources or content?" The requirement is to read a post by a co-participant/team member and add to it; disagree and state why; offer an alternative, preferably with a rationale; or acknowledge the contribution with agreement. These are not inherently difficult tasks; even first grade students could practice such an exercise verbally, supported with concrete prompts. The performance levels of the sample student groups suggest that they were unaware of the protocol for collaborative learning and may perform in more productive ways if this skill is taught.
Co-construction of knowledge. Co-construction of knowledge is more complicated. The essential question defining the trait co-construction of knowledge is "Did participants use shared and evaluated content to construct final answers or responses or complete a task?" The difference between collaboration and co-construction of knowledge, thus defined, is negotiated agreement on a shared response (or idea or conceptual understanding, depending on the task) that is built from the collaborative input of group members, making new meaning or knowledge. Co-construction of knowledge involves the step of initiating a call for consensus or a joint frame: "We have several ideas and various perspectives about the effects of global warming on polar bears; how can we take this input and frame an answer?" The group must then work with the input and perspectives to decide what they can agree on to submit as a group response. Developmentally, this is advanced, and the sample Notebook scores show it; only four Notebooks showed "developing" status for the trait co-construction of knowledge, nine showed "emerging" status, and the other 20 samples were non-collaborative.
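For readers who wish to trace the simple descriptive arithmetic behind these counts, the short Python sketch below converts the rating tallies reported above into proportions of the 33 sample Notebooks. It is purely illustrative; the counts are taken from the text, and tallies not stated explicitly (for example, the non-collaborative remainder for the collaboration trait) are intentionally omitted rather than guessed.

# Illustrative only: rating tallies as reported in the text (n = 33 Notebooks).
# Counts not stated in the text are omitted rather than guessed.
TOTAL_NOTEBOOKS = 33

trait_counts = {
    "collaboration": {"capable": 2, "developing": 9},
    "co-construction of knowledge": {"developing": 4, "emerging": 9,
                                     "non-collaborative": 20},
}

for trait, counts in trait_counts.items():
    print(trait)
    for level, n in counts.items():
        # Proportion of the 33 sample Notebooks at this rating level.
        print(f"  {level:18s} {n:2d}/{TOTAL_NOTEBOOKS}  ({n / TOTAL_NOTEBOOKS:.0%})")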
Skills necessary to perform co-construction of knowledge include an awareness of
the skill and the steps involved; receptive and expressive communication skills; the
ability to hold multiple perspectives simultaneously; conflict resolution/negotiation
mediation skills; and open-mindedness to outcome or the ability to suspend ego-based
attachment to collaborative contributions. Developmental psychologists would place the sub-skills necessary to engage in co-construction at late adolescent or young adult age, allowing for individual developmental differences. Nonetheless, Vygotsky (1978) holds
that socially mediated scaffolding of these skills would permit younger students, perhaps
late elementary or middle school, to engage in co-construction in a highly guided context.
Results of Phase 5: Instructional Design
Instructional design elements were synthesized from the evaluation of student work samples and the qualitative responses of educators, using Cognitive Task Analysis and Backward Design principles. This involves categorizing sub-skills into instructional domains or curricular content areas; Table 18 outlines the process.
The analysis of composite skills for collaborative learning in a digital environment, situated in a global education context for 21st century skills, shows that three distinct skill areas, academic social-emotional skills, cooperative learning strategies, and technology, could allow for success in digital collaboration. To meet best practices for 21st century skills and global education, the tasks lend themselves to embedding in an authentic, real-world context to increase student engagement and facilitate transfer of skills. The model is illustrated in Figure 20.
Table 18

Phase 5 Processes and Outcomes

Phase 5: Examination of skill areas for Instructional Design
  Phase 5.0 (RQ3)
    Method: Cognitive Task Analysis and Backward Design
    Process: Categorize skills per domain
    Results/Outcome: Develop instructional categories to aid in planning and resource allocation
Figure 20. Elements for Teaching Collaborative Learning in a Digital Environment.
The skills underlying the collaborative traits used on the student shared document Notebook fall into different domains: the use of the technology tools and the understanding of the digital environment; academic social-emotional skills; content area skills such as decoding, reading comprehension, interpretation of charts and graphs, and the ability to estimate or perform basic calculations; and a grasp of basic cooperative learning principles, with the practice necessary to organize task orientation and facilitate the metacognitive group processing that helps the group stay focused on the task with all participants engaged.
Technology Skills
Technologically, some students may not have understood the links, may not have
known how to open the document, or how to write on it. A Google document saves
automatically, so knowing how to save work is not immediately necessary, but students
had editing functions, so any student could have erased any other student's work, by accident or intentionally. While there is a function to recover former iterations of a document, students may not have known how to access or use that function. Several Notebook entries referenced not knowing how to access a link, not understanding how to use a function, having just found the Notebook document after much confusion, admonitions to stop erasing from the Notebook, or trouble loading pages that held information for the Arctic Trek clues.
Academic Social-Emotional Skills
This assessment scenario involves functional academic social-emotional skills
such as turn taking, communication, self-regulation, negotiated or interactive regulation,
and perspective taking. Other related skills such as empathy, self-awareness, dealing
effectively with conflict and decision-making skills would also be necessary for
successful negotiation of this collaborative task.
Cooperative Learning Strategies and Skills
Students who have frequent classroom practice in cooperative learning strategies
may have some automated responses when groups are formed to facilitate completion of
an assignment. Typical automated responses from adequate training in cooperative
learning techniques might be organizational processes such as role assignment and task
assignment. Familiar cooperative learning roles that promote group-mediated or interactive group regulation include timekeeper, someone who will serve as recorder, someone who will ask clarifying questions, and someone who will ask process questions.
Shared document skills. As the use of shared online documents is relatively new, it is not unexpected that this skill would be emerging for most groups. The most successful groups would likely transfer their skills in academic social-emotional learning and cooperative learning to this technological medium. Adaptation would be necessary to overcome the dependency on face-to-face interactions, and strength in social-emotional learning skills would be one way to accommodate the lack of vocal, facial, and postural cuing. One method students used to adapt their social-emotional skills in sample Notebooks was the use of emoticons and social text-speak.
The ability to communicate effectively and work collaboratively in a remote-located shared document space is an important 21st century educational and workplace skill. Many universities use shared document work in online or hybrid courses, and businesses use this venue to facilitate work across time zones and locations, saving both human energy and finite resources by allowing individuals to contribute no matter where they are relative to the project home. The project home may be online or cloud-based, with project ownership shared among many participants; this assessment task attempts to recreate the authenticity of real-world adult skills.
Composite Domains Supporting Collaboration in a Digital Environment
Academic social-emotional skills, cooperative learning skills, and skills in using technological tools are necessary for success in a collaborative learning task in a digital environment. Each of the skill areas is multi-dimensional, itself a composite of many sub-skills; these are shown in Figure 22. Academic social-emotional skills are the essential building block because these basic skills support successful use of cooperative learning strategies. Technology or ICT skills could be taught or learned in an individualistic manner, but without training and practice in the academic social-emotional area, it may take much time and effort for obstacles to be cleared in a collaborative context. Overlaid on these curricular component areas are basic skills in core content areas. If a student has all the requisite academic social-emotional skills and cooperative learning training and knows how to use Web 2.0 tools but has poor reading skills, his or her participation in remote-located collaborative environments will be challenging.
In the student work samples evaluated for this assessment, many groups did not use the skill of identification or set up a system for coding responses per participant. In general, they did not make task assignments, though some groups attempted this, and in other groups individuals volunteered to get started on something specific toward the shared goal. There was a general lack of discursive reciprocal follow-through in most groups. A participant may make a request or ask a question but not receive a response; this may be followed by a completely different request or response, reflecting discontinuity within the group, as excerpted from Notebook 2 and shown in Figure 21.
Figure 21. Notebook sample illustrating lack of reciprocal discourse.
It is impossible to know whether this behavior would hold true for the same group in a face-to-face setting, but it is not uncommon for school-age students to not fully listen to one another or address each other's concerns, which links to fluency in academic social-emotional learning skills.
Each component area of collaborative learning in a digital environment has skill sets that necessitate teaching, practice, and environmental supports; see Figure 22 for an elaboration of the domains related to digital collaboration and their relative component sub-skills.
Figure 22. Curricular Components of Collaborative Learning in a Digital Environment. [The figure lists sub-skills for each component: Social-Emotional Learning (self-awareness, social awareness, relationship skills, self-management, responsible decision-making); Cooperative Learning (positive interdependence and individual accountability; teamwork skills such as shared leadership, communication, conflict resolution, and decision-making; group processing); and Technology (teachers learn how to use tools themselves, learn how to teach and manage student use of tools, and develop strategies or work-arounds for technology problems).]
Curricular progressions exist for these components, as explicated by frameworks for
these areas. Many states have adopted standards for both social-emotional learning and
technology/ICT literacy. The curricular components do not need to be sequentially
taught, though social-emotional skills are the foundation for success in cooperative learning. It is possible for these skills to be presented in an integrated manner and associated with authentic tasks. One example was designed by a 4th-grade teacher who used technology to create shared documents in cooperative learning groups around the task of creating classroom norms and rules. This teacher prompted students to structure their shared document space with color-coded identification to share their ideas within a group. The groups posted their results in a shared digital space and used the collection of shared documents to find commonalities between groups and as discussion points in developing class rules for the year.
Results of Phase 6: Professional Development for Digital Collaboration
Professional development needs are described from the instructional design
elements and from the qualitative responses of educators regarding the student work
samples, rubric use, and perceived preparation to teach the sub-skills necessary for
success in the overall ICT literacy task. Educators were surveyed regarding professional
development in the areas of collaboration and technology. Table 19 provides a
description of phase six activities.
Qualitative feedback on professional development needs was obtained from teachers in the rubric rating sessions. The teachers reviewing the rubric and evaluating Notebook samples represented a diverse group of educators, trained in a variety of disciplines, including Mathematics, Biology, Music, English Language Arts, Elementary Education, Economics, and Psychology. They had varying time frames for their own teacher preparation programs, with three teachers earning credentials in the last ten years and the rest having worked in the field between 20 and 33 years. All use the Internet at home and at school, conduct business through email, and engage in some sort of social media use. Only three of the eight had limited exposure to Google docs or other online collaboration tools, and no one considered himself or herself proficient.
Educators discussed their need for professional development both in learning the technological skills necessary to engage in digital collaboration themselves and in the instruction of these skills for their students. Additional professional development needs were cited regarding how to use these skills in a classroom setting, how they could be integrated with existing curricular demands, and whether they would be better taught in isolation. There was some concern about site-based support for maintaining the technology necessary for instruction in this area.
Some raters observed that many newer teachers are no longer trained in cooperative learning methods, as had been popular in the 1980s and 1990s, and that teachers instead are trained in data-based decision-making and assessment, so experience teaching cooperative or collaborative behaviors may be lacking among many educators.
Table 19

Phase 6 Processes and Outcomes

Phase 6: Investigate potential professional development strategies
  Phase 6.0
    Method: Needs Assessment
    Process: Survey educators re professional development
    Results/Outcome: Discover common experiences and needs

    Method: Qualitative feedback
    Process: Educators reflect on use of technology and collaboration
    Results/Outcome: Generate information re task-specific needs
Educators serving as inter-raters also took a short demographic survey regarding their years of training and grade level experience; professional development in technology, cooperative learning, and social-emotional learning; their personal and professional use of both technology and collaboration; and their exposure to 21st century skills, specifically whether they had seen or could identify a 21st century skills framework. The survey can be seen in Appendix N, and the results of the survey of raters in Table 20.
Table 20

Survey Results: Professional Development in Technology and Collaboration

Years of   PD in        PD in         PD in Social-  Familiar with         Personal Experience in
Teaching   Technology   Cooperative   Emotional      21st Century Skills   Collaboration with
                        Learning      Learning       Frameworks            Technology
 7         No           No            No             No                    Yes
 8         No           No            No             No                    No
10         Yes          Yes           Yes            No                    No
11         Yes          Yes           No             No                    Yes
19         Yes          Yes           Yes            Yes                   No
20         Yes          Yes           Yes            Yes                   Yes
21         No           Yes           No             No                    No
33         No           No            No             No                    No
When asked about the use of collaborative work in a technological setting with their students, six out of eight educators replied that they lacked sufficient technology, while two said they felt their students were too young. Teachers also reported that they lacked time due to other curricular needs and that they did not feel proficient or have the confidence to teach these skills. The raters who reported using collaborative work in a digital setting professionally primarily referenced wikis or blogs, and everyone added the caveat "not much" to their level or frequency of professional use of technology.
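As a minimal illustration of how the Table 20 responses could be summarized, the Python sketch below tallies the "Yes" responses per survey item for the eight raters. The rows are transcribed directly from Table 20; the shortened item labels are mine and carry no meaning beyond this example.

# Each row: (years teaching, PD in technology, PD in cooperative learning,
#            PD in social-emotional learning, familiar with 21st century
#            frameworks, personal experience collaborating with technology),
# transcribed from Table 20.
raters = [
    (7,  "No",  "No",  "No",  "No",  "Yes"),
    (8,  "No",  "No",  "No",  "No",  "No"),
    (10, "Yes", "Yes", "Yes", "No",  "No"),
    (11, "Yes", "Yes", "No",  "No",  "Yes"),
    (19, "Yes", "Yes", "Yes", "Yes", "No"),
    (20, "Yes", "Yes", "Yes", "Yes", "Yes"),
    (21, "No",  "Yes", "No",  "No",  "No"),
    (33, "No",  "No",  "No",  "No",  "No"),
]

items = ["PD in technology", "PD in cooperative learning",
         "PD in social-emotional learning",
         "familiar with 21st century frameworks",
         "personal experience collaborating with technology"]

for idx, item in enumerate(items, start=1):
    yes = sum(1 for row in raters if row[idx] == "Yes")
    print(f"{item}: {yes} of {len(raters)} raters")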
As a related comparison, most of the 40 participants in a UO graduate-level course in Information Technology in the summer of 2011 were introduced to Google docs and other technological collaboration tools such as wikis or social bookmarking sites for the first time, though they were seasoned educators and administrators enrolled in a master's or doctoral program in Educational Leadership. This lack of experience and training in technology was not specific to teachers living near the University of Oregon; about half of the class was from Canada, attending a master of education program. Such information helps to underscore the need for technology preparation for teachers.
Analysis of Results and Validity Considerations
Validity is used here as an evaluative summary of the evidence for, and consequences of, score interpretation: the degree to which empirical evidence and theoretical rationale support the interpretations and the use of the assessments or implications for action (AERA, 1999; Messick, 1995). Rather than being a property of the instrument, validity concerns the meaning of the scores as a function of the items, the people taking the assessment, and the context of the assessment. Kane (1992) suggests a unified argument-based approach to validity, conceptualizing validity as an argument in which the interpretation of test scores, supported by evidence, is evaluated against competing interpretations and potential counterarguments until the latter are refuted. Messick (1995) describes validation as a continuing process and suggests constructing evidence supporting the intended purpose of the assessment as the first step in creating a valid measure.
A series of validity considerations were addressed in this study through the
following approaches:
Construct Validity
One possible threat to construct validity in this study was construct confound, including construct-irrelevant variance (CIV) (Haladyna & Downing, 2004; Shadish, Cook, & Campbell, 2002). Messick (1996) considers richly contextualized performance assessments and authentic simulations of real-world tasks to be at risk for CIV due to contextual clues, but regards that risk as ameliorated by the construct relevance of the clues. Messick also delineates the difference between a construct confound type of CIV and the use of higher order constructs, where complexity is required to subsume or organize multiple processes and various constructs are operationally required at the same time. Determining whether a source of variance is a relevant part of a focal construct or simply a confound is key to avoiding CIV and maintaining construct validity in this situation.
Construct-irrelevant variance increases in highly contextualized tasks such as the Arctic Trek performance assessment due to the aforementioned use of higher order constructs composed of many sub-skills, which in the Arctic Trek assessment task included technology skills, academic social-emotional learning skills, and collaborative learning skills, as well as content-based skills such as math and reading. ATC21S assessment developers took great care to succinctly define and operationalize constructs and match content to grade level standards, but it remains likely that the reading, numeracy, and other skills of individual students interfered with their ability to be adequately measured on the collaborative process and product constructs embedded in the assessment.
For ATC21S, a sampling design was used over a set of tasks, work products, and group settings in order to address this concern and establish validity for the ATC21S intended inferences of digital literacy assessment over a set of uses for collaborative digital literacy in learning situations. However, it must be noted that this dissertation study selected a single work product within a single task, so exploring the wider implications of the sampling design is outside the scope of this research. This restriction to a small segment of the data is therefore a limitation of the study.
Domain theory and structure. Domain theory is the primary basis for
specifying the boundaries and structure of a construct for use in the development and
scoring of performance tasks and can be accomplished through task analysis or
curriculum analysis (Messick, 1996). Specifying boundaries includes determining the knowledge, skills, attitudes, motives, and values, such as ATC21S has outlined in the KSAVE model that is used for both task delineation and scoring on the Arctic Trek assessment. For each element of the 21st century skills framework discussed in Chapter I (Ways of Thinking, Ways of Working, Tools for Working, Living in the World, and Digital Learning Communities), the KSAVE model describes sub-skills categorically divided among the (1) Knowledge, (2) Skills, and (3) Attitudes, Values, and Ethics needed to master the framework element.
The sub-skills are elucidated with detailed, measurable descriptions. Refer to Appendix C for the tables for Tools for Working and Ways of Working, the two ATC21S framework areas addressed in this dissertation. Functional importance, or ecological sampling, increases construct relevance and validity by considering both what people do authentically in the performance domain and what characterizes and differentiates expertise in the domain (Messick, 1995). The ATC21S group has richly simulated actual networking performance, assigned specific skills to each construct, and established a three- to four-level range for expertise across representations, as shown in the blueprinting process in Appendices A and B.
Process models and engagement. Process modeling assists with the
identification of domain processes to be revealed in the assessment tasks (Messick,
1995). The tasks must provide appropriate sampling of the domain processes while
covering domain content as well as evidence of participants engaging in task performance
in order to capture performance consistencies demonstrating domain processes. Sources of process-based evidence might be think-aloud or self-talk protocols, computer modeling of task processes, or correlation patterns among part scores (Messick, 1995, p. 745).
The Arctic Trek assessment incorporates rich representative sampling of domain
processes allowing ample opportunities for students to demonstrate their performance.
The assessment also engaged in extensive Cognitive Laboratory process evidence, not
described here as outside the scope of this dissertation data set, with extensive in-person
observational protocols in each country, as well as video showing students taking the assessment, screenshots showing keystroke choices and cursor navigation, and think-aloud verbalizations; the self-talk component was introduced to students as part of their preparation for the assessment during the cognitive laboratory process. Teacher training materials and the Assessment Delivery Booklet used by assessment administrators are also available, showing the degree of standardization in the process of assessment delivery.
Scoring models and correlations with external variables. The cross scoring of
assessments with diverse models can highlight the assumptions and values of each
method (Moss, 1996). Messick (1995) suggests looking at assessment score relationships
with other measures and even non-assessment behaviors to check the interactive relations
within the construct. Finding evidence of a link between assessment scores and criterion
measures validates the scores for providing meaningful information about the construct.
Criterion validity efforts included the comparison of student scores across numerous
tasks and work products, and in different team arrangements. This data set is outside of
the scope of this dissertation as well, but is mentioned here regarding the assessment
development process more generally.
Selection Bias and History
Some threats to validity in this study were selection bias and history (Shadish et
al., 2002). The participating students were not randomly sampled, and though there were
attempts to include students representing different regions, socioeconomic status and
ages, there were unknowns about the variance in experience with the constructs measured
across the student sample. There was uniformity in both teacher training for preparation
and administration and in the administration procedures during the assessment, with
direct observation for fidelity to the model. However, the very small sample size and
generative nature of the task development requires caution in interpretation of the results,
as does the cross-case analysis on a limited number of comparison cases.
External Validity
Threats to external validity include the generalizability of the findings to other
students and school settings. The degree to which the results will be generalizable to
others is affected by a variety of factors. Sample selection bias could be introduced by
using samples based on convenience, clustering, and self-selection (Alreck & Settle,
1995). The degree of variation in the sample and small sample size could also jeopardize
external validity. Due to the timing of the collection of pilot data for the ATC21s project,
a small sample was used. Within-sample variance on school site characteristics, grade level, content exposure, and opportunities for skill development could introduce sampling error; according to Alreck and Settle (1995), the more variance that exists in the sample population, the greater the possible sampling error. Sampling bias or poor representation could lead to sampling error and thus weaken external validity.
Context-dependent mediation is another threat to external validity, as student
performance may be affected by novel situations and may not accurately reflect their true
performance estimate (Shadish et al., 2002).
The data set used here was part of a small pilot study of the tasks. Larger field
trials are currently ongoing in several countries, and will help address some of these
issues, but are outside of the scope of this dissertation.
Statistical Conclusion Validity
According to the APA Standards for Tests and Measurement (2002), analyses for
some of the questions in this study are appropriate for descriptive statistics, but the small,
non-random sample would be predisposed to Type 1 and Type 2 errors for inferential statistics, which therefore are not used here at this emergent stage of the work on these assessment tasks (Shadish et al., 2002). A small sample was used due to time and cost constraints in piloting new assessments and due to the descriptive nature of the research questions, which employed the cross-case qualitative Body of Work method, a "thick description" technique that focuses on patterns and themes in a smaller number of samples rather than
inferential aggregation over a large data set. The use of inter-raters in reviewing student
Notebooks helps guard against Type 1 error in the descriptive comparisons, as evidence
will be cross-coded and independently categorized, and the multiple lenses of the
iteration with both qualitative and quantitative data will assist in interpretation of results.
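The study does not report a specific agreement statistic, but as a minimal sketch of how simple percent agreement between two raters on categorical rubric ratings could be computed, the Python snippet below may be useful. The rating lists are hypothetical placeholders, not data from this study, and percent agreement is only one of several possible agreement indices.

# Hypothetical rubric ratings from two raters for the same set of Notebooks;
# these values are placeholders, not data from the study.
rater_a = ["non-collaborative", "emerging", "developing", "emerging", "capable"]
rater_b = ["non-collaborative", "emerging", "emerging",   "emerging", "capable"]

def percent_agreement(a, b):
    """Proportion of cases on which two raters assign the same category."""
    if len(a) != len(b):
        raise ValueError("Both raters must rate the same cases.")
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

print(f"Percent agreement: {percent_agreement(rater_a, rater_b):.0%}")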
CHAPTER IV
DISCUSSION
The purpose of this study was to examine student work samples from a
collaborative performance task in a digital environment and describe any patterns or
trends of collaborative skill evident in the body of work. The intent of this study was to
contribute to research that may inform practice for instructional and assessment strategies
in this emerging area of collaboration in a digital environment. This study hoped to
contribute to the research on digital collaboration in education in four key areas: a)
further the understanding of the cognitive and social processes involved in collaborative
digital literacy skills for students at ages 11, 13 and 15; b) help inform instructional
leaders on conceptions of student work in virtual collaboration; c) contribute to the
dialogue of instructional design to support collaborative learning in K-12 education; and
d) offer considerations for formulating professional development. This study had an
additional intent of contributing to research methods by providing an example of a
mixed-methods, multi-dimensional, multi-phase iterative design to organize qualitative
data for analysis and interpretation such that this type of data, exemplified by the student
work samples, can be adequately transformed to information useful for data-based
decision making in K-12 systems.
This chapter is divided into six sections. The first section presents summary
conclusions with respect to the research questions and hypotheses. The second discusses
the contributions this study makes to the body of knowledge for research and practice in
the field. The third outlines areas for future research. The fourth discusses the
limitations of the study and threats to internal and external validity. The fifth section
addresses the implications of the study and results; and the final section offers
conclusions.
Research Question Summary
This study examined collaboration in a digital environment during a performance
assessment of 21st century skills among 11, 13 and 15 year old students. The results of
this study appear to fill a unique niche that bridges research in technology and
collaboration with concerns of practice situated in K-12 settings. There are three
outcomes related to this study: a) an examination of trends and patterns in student
collaborative ability; b) the development of a rubric that can be used as a prototype or
guide towards measuring collaborative learning in a digital environment in K-12 settings;
and c) an analysis of sub-skills needed to support collaborative learning and the curricular
domains in which they are housed to inform instructional design. This study was
designed around three research questions, which are described below with the results of
the analyses.
Research Question 1:
1. Does the use of the artifact Arctic Trek collaborative Notebook fall into distinct
patterns that reflect levels of skill development or show trends in collaborative learning
through a digital environment?
1a. Can categorical patterns be identified?
1b. Can these patterns be seen as types of performances referenced by
collaboration literature?
The results of this study suggest that student use of the collaborative document Notebook does reflect levels of skill development and that categorical patterns of collaborative skill can be identified. The patterns can be categorized by the following trends: a) slightly higher skill with collaborative processes than with generating collaborative products, and b) skill development reflective of non-collaborative or emergent levels of skill.
The overall trend of displaying non-collaborative behaviors is affected by the number of groups who either did not access their shared document Notebook or abandoned it after limited use, scoring a zero on the 3+3 Six Traits Digital Collaboration Rubric, which was generated from the body of evidence in the samples and then used to
measure student work samples. The study lacks sufficient background information to
ascertain whether students lacked technological skill to access or use the Notebook, did
not understand assessment directions, or simply made a choice not to work with their
group.
Of groups who accessed their Notebook and displayed emerging skill in
collaboration, the easiest skills or traits to display were identification or role assignment,
interactive regulated learning (asking for and offering help, reporting progress, clarifying
processes, time management, task orientation, goal setting, and self or group evaluation)
and sharing or reporting content. These would seem to be familiar and relatively easily
transferable skills from classroom or even non-instructional situations. The more
difficult traits to display were co-construction of knowledge and collaboration; these are
skills that the students may not have had prior exposure to or experience with.
Some of the patterns identified in this study are similar to those referenced in the
literature on collaboration. Literature referenced for this component of the study was primarily that of Computer-Supported Collaborative Learning (CSCL), which has varied and divergent focal points and methodologies and largely examines university-level students, so studies that closely match this project are few (DeWever et al., 2006). The type of content discovered through discourse analysis in this study supports the research of Gunawardena, Lowe, and Anderson (1997), who described five phases of knowledge construction:
1. Sharing or comparing information
2. Dissonance or inconsistency
3. Negotiating agreements or co-construction
4. Testing tentative constructions
5. Statement or application of newly constructed knowledge
They found that phases 1 and 3 were dominant and phases 4 and 5 occurred less often.
Through both task structure and qualitative analysis of the sample Notebooks, this study
showed all of those elements to be present in student discourse, and that sharing or
comparing information was one of the traits displayed frequently in student work
Notebook samples.
Research Question 2:
2. Will descriptive analysis show that levels of Notebook use have a relationship with
student age, for this sample?
2a. Can data displays show if and how patterns may cluster by age?
2b. Are there important trends to be seen in the age-related patterns, such as will
more advanced digital collaboration patterns be seen for younger or older students, in this
data set?
My hypothesis was that I would find patterns that formed trends and that these would have a relationship with age. The results of this study show only a slight trend of increased digital collaborative ability with age. Due to the small sample size with uneven age distribution, generalizations cannot be made, but overall in this sample the 15-year-olds presented a slightly higher mean score on the 3+3 Six Traits Digital Collaboration Rubric at 6.2, while the 11-year-olds presented a mean of 5.7; with a total possible score of 18 points, no age group demonstrated, on average, beyond emerging collaborative skill when measured by the rubric.
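A minimal sketch of the form of this descriptive comparison is given below. The individual group scores are hypothetical placeholders (the study reports only the group means of 6.2 and 5.7 on an 18-point scale), so the code illustrates how such means and zero-score counts could be computed rather than reproducing the actual data.

from statistics import mean

# Hypothetical 3+3 Six Traits Digital Collaboration Rubric totals (0-18) by
# age group; placeholders only, since the study reports just the group means.
scores_by_age = {
    11: [0, 3, 5, 6, 7, 8, 9, 10, 5, 4],
    15: [0, 4, 6, 7, 8, 9, 10, 5, 7, 6],
}

for age, scores in scores_by_age.items():
    zeros = scores.count(0)  # groups that never used (or abandoned) the Notebook
    print(f"Age {age}: mean = {mean(scores):.1f} of 18 possible, "
          f"{zeros} group(s) scored 0, n = {len(scores)}")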
This assessment of collaboration in a digital environment is un-instructed and therefore more formative in nature, not necessarily reflecting an opportunity to learn.
Older students have naturally had more experience in school, such that the higher the
grade level the more overall access students have had to technology, content skill, and
teamwork situations. The four years of schooling experience and societal exposure that
15-year-olds have beyond 11-year-olds can likely explain the slight gain in mean score.
Additionally, age 15 showed fewer cases with a score of 0, perhaps reflecting more
refinement at carefully following instructions or showing responsibility in classroom
situations. The 11-year-olds showed a lower volume of social discourse in their Notebooks per the qualitative analysis and were the only age group to engage in substantial amounts of role conflict, which perhaps reflects some psychosocial differences between the two age extremes of this study.
Research Question 3:
3. Given the results of analysis in RQ 1-2 above, do performance patterns identified in
the digital collaborative work products suggest connections to student instructional
support, as examined through an instructional leadership focus?
The results of this study identified performance patterns in the digital
collaborative work products that suggest instruction in the components of digital
collaboration may enhance student performance. As there is not a widely available
curriculum for such instruction in digital collaboration as a subject, component areas
could be detailed and used for instruction to strengthen sub-skill areas until
comprehensive curricula exist specific to the processes involved in digital collaboration.
A program of instruction needs to be designed to teach the array of concepts and
skills needed for students to be equipped to engage in collaboration in a digital
environment. Effective instructional design will need to take into consideration
necessary developmental frameworks that are aligned with readiness and ability across
physical, cognitive, and social-emotional domains, and aligned both vertically across the
developmental spectrum and horizontally to tie into other key areas of conceptual and
skill development. Such frameworks would need vetting and trials within the applied
instructional setting to determine their accuracy and value.
As 21st century skills are not necessarily content-specific but span traditional
domains, creating models for multi-disciplinary integration or inter-disciplinary
opportunities may be essential for the infusion of these skills within K-12 settings
(Klieman, 2004; Pecheone & Kahl, 2010). Educators, especially in middle and high
school settings where students may be more developmentally ready to engage in
collaboration through digital environments, are largely compartmentalized, and have
content-based frameworks and standards to address along with high stakes testing
accountability measures (Harris et al., 2009). Providing easily implemented activities or
exercises that apply 21st century skills towards learning discipline-specific content can
facilitate inclusion of such skills as digital collaboration (Inan et al., 2010). The new U.S.
common core standards allow for substantial integration of cross-cutting and higher order
skills into the traditional domains (National Governors Association Center for Best
Practices, Council of Chief State School Officers, 2010).
Development and testing of assessments for collaborative ability will be important
for the support of including this type of instruction in K-12 settings, which are
accountability-driven (Hew & Brush, 2007). The ATC21S performance assessment
Arctic Trek could be redesigned to include an ill-structured element to promote
possibilities for collaboration on shared meaning and co-construction of knowledge. The
assessment could also more explicitly state the purpose as assessment of collaboration,
provide a model of such, perhaps a video or think-aloud as part of the assessment
instructions, or provide some type of format that structures or mediates the group
interaction. The samples of student work from that revised assessment could be
examined to determine whether students showed an increase in collaborative skill over
these initial trials.
Contributions to Research and the Body of Knowledge
Despite the limitations of the study, this work makes potentially significant contributions to the field of CSCL, or more broadly to digital collaboration and ICT literacy, in a sub-category of the field that is not widely studied: practical applications to K-12 educational settings. This section discusses previous studies in the
literature and compares their suggestions for further research with the results generated
by this study. Contributions can be outlined as follows:
This study
• Addresses issues specifically called for in the research community
• Examines digital collaboration with this age group, unmediated, in a
performance assessment task, currently a unique sample in the field
• Describes patterns and trends of student skill displayed in a digital
collaboration task
• Analyzes sub-skills necessary for success in digital collaboration
• Explores Instructional Design for implementation of digital collaboration
• Considers Professional Development needs for instructional preparation
• Charts a methodology for organizing and analyzing collaborative student
work samples
• Provides an example of a multi-phase iterative methodology to bring
information from student work through task analysis and into instructional
design.
This study is somewhat unique in that it is one of very few studies of digital collaboration among K-12 students in which students of these ages are remotely located and using a collaborative document. Other studies of digital collaboration in a K-
12 setting include a scaffolded, mediated digital collaboration with small groups of
middle school students solving math problems (Stahl, 2006); scaffolded instruction in
collaboration with middle and high school students in face-to-face classroom
environments where groups were supported and mediated with a digital collaborative
script (Nussbaum et al., 2009); and face-to-face cooperative learning groups studied for
performance differences in mediated versus unmediated groups (Gillies, 2004). This
study adds to this body of work.
Rotherham and Willingham (2009) discuss curriculum, teacher expertise and
assessment as the main challenges for the integration of 21st century skills in the schools.
The researchers advocate for a long-term iterative process of planning, implementation,
reflection, and continued planning, with implications for teacher training and curriculum
development. This study addresses these challenges, exemplifies the iterative model, and
offers an iterative organizer for a model of Professional Development, as illustrated in Figure 24 in the 'Implications for Professional Development' section of this chapter.
Researchers in the field suggest examination of diverse groups and situations
using CSCL to help develop instructional practices that enhance virtual collaboration as
an educational tool and increase the understanding of the psychosocial processes in the
problem solving space (Strijbos et al., 2004a). This study contributes to the body of
knowledge with regards to students ages 11 through 15 and helps to develop instructional
practices for virtual collaboration through the analysis of domains and sub-skills involved
in virtual collaboration, as well as the initial attempt at designing a rubric to guide
instruction and measure student progress.
Hew and Brush (2007) identify current knowledge gaps as including teachers’
content and pedagogical knowledge for integrating technology in relationship to a
curriculum, specifically strategies for integrating technology into various subject areas.
This project analyzed a demonstration of collaborative student work where students used
a shared document as a collaborative tool while searching for content to use in a problem-
solving space. The format and structure of this task could be utilized across disciplines,
and adjusted to become more open-ended or ill-structured as suggested by literature to
best promote collaboration.
Hew and Brush (2007) also suggest research examining cooperative group work
in a technological medium to 1) identify how a teacher would structure the task, and 2)
illuminate the obstacles involved in such strategies, leading to guidelines for instructional
design so teachers and instructional leaders can make informed decisions about how to
employ these strategies. This study examined that type of work and provides the
following to advance the body of knowledge: 1) explicates trends and patterns of student
behavior in a technology-based collaborative task in such as way as to inform teachers of
instructional issues and obstacles; 2) provides an analysis of sub-skills needed for
increased student success in such tasks; 3) offers a rubric to guide the organization of
instruction, and 4) suggests elements to consider for instructional design for both student
collaborative work and professional development for teachers.
Rotherham and Willingham (2009) describe uncovering the implicit domains
involved and discerning sub-skills that can be taught to support 21st century skills as a
significant contribution to methods for teaching 21st century skills. They suggest that this
could lead to targeted professional development for educators to become proficient in and
prepare for teaching such skills. This study addresses specifically the areas of implicit
domains and sub-skill analysis and suggests areas for targeted professional development
based on surveyed needs, as well as a professional development model based on the
literature for best practices in professional development for school improvement.
Through a blend of what Stahl et al. (2006) define as experimental and descriptive CSCL methodologies, this study sought patterns in the data to uncover behaviors and to understand in very broad terms how the general practices work. The researchers explain that descriptive examination offers the opportunity to discover both how groups accomplish effective collaborative learning and also how they fail to do so.
Finding cases where the interactional accomplishment of learning is absent, and seeking to determine what aspects were missing or contributed to this lack of collaborative learning, is an important research effort, provided the reviewer remains open-minded about what else of value the participants might have accomplished in lieu of collaborative learning as student work is reviewed (Stahl et al., 2006). This study found the following
behaviors that appeared to interfere with collaborative work: poor virtual relational skills
including inefficiency establishing participant identity and role or task confusion or
conflict; frustration with the technological medium; lack of group organization; lack of
visual organization; and concern regarding time constraints. Additional behaviors that
appeared to interfere with performing collaborative work include students not fully
understanding the concept of collaboration, or perhaps not having sufficient requisite sub-
skills such as perspective taking, negotiation, or decision-making, most likely in the
absence of specific instruction on many of these skills.
Areas for Future Research
This study illuminates the need for additional research to further explore the
performance of rubric components across task types; aspects of communication among
students in this age range; the effects of changes in venue or format from digital to face-
to-face settings on collaborative skill; the variables associated with digital collaboration
identified through this study; social-emotional based perceptions of collaborative work;
digitally embedded metrics for ongoing assessment of student work and increased utility
of shared document spaces; and professional development and school site infrastructure
needs to support instruction in digital collaboration.
Further research on the 3+3 Six Traits Digital Collaboration Rubric could include
investigating the utility of the rubric with other tasks, specifically gathering data to
describe the skill areas that could be optional in different tasks, such as role planning.
This would increase the usefulness of the rubric and allow for wider use of the rubric
with collaborative tasks across content areas and grade levels. The rubric could also be
tested on a similar task with a greater sample of teachers to better ascertain how it
performs as an assessment tool and instructional guide.
Another area of study might be the lack of thread development and reciprocal
communication among students of this age range to determine whether the digital
environment contributes to this phenomenon, or whether there may be generalized
difficulty executing reciprocal communication across environments. Comparison studies
could examine the use of scaffolding or mediated communication on thread development
and reciprocal communication among students, with extensions to whether sustained
scaffolding results in transfer of the skills.
A format comparison study of a digital collaborative task versus face-to-face collaboration on the same task could describe where, how, and whether transfer of face-to-face skills occurs in a digital environment, and could examine which, if any, of the face-to-face skills are either unnecessary for, or inhibit, collaboration in a digital environment.
Changes in the process of the digital collaboration task could be studied for a
broader understanding of digital collaboration, including the effects of structuring or
scaffolding a comparable digital collaboration task among similar aged students to see if
the mediation would promote a greater display of skill; the results of a similar task in
digital collaboration after students had received instruction and practice in this area; and
the effects on performance in a parallel digital collaboration task for similar aged students
that was ill-structured and thus more conducive to co-construction of knowledge.
Research to advance the understanding of development of collaborative skill
could identify variables for the rubric that might affect collaboration skills, such as i) group facilitation by a teacher or a more able peer (such as an appointed or reciprocal leader), as compared to no facilitation; ii) scaffolding of collaborative tasks; and iii) group
size. A variable analysis could be done next from the trait information realized from the
case analysis, using either the traits or the two dimensions of process and product as
variables.
The construct Interactive Regulatory Learning could be examined for possible
optimal ranges of activity that assist the process and facilitate a product, with notice to
possible interactions between the amount of IRL activity and productivity; for example,
as regulatory work goes up to assist process, is there a point at which the product work
stalls or decreases?
A study of social-emotional processes might look at how students perceive the
collaborative process: do they have a preference for collaborating with others or working
independently, and what factors do they identify for their preferences. Would students
who had instruction and practice in collaboration have different perceptions from
students with less experience or from uninstructed settings?
Educational researchers could collaborate with software developers to investigate the possibility of embedded analytics within the shared document structure. Items from the scoring rubric could be built into the digital environment in an automated way to increase the ease of monitoring student progress in skill development, and use patterns of the shared document features could be examined with a view toward enhancing use through format modifications.
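As a rough illustration of what such embedded analytics might look like, the Python sketch below tallies two rubric-aligned indicators (posts per participant and replies that build on another participant's post) from a hypothetical shared-document event log. The log format, field names, and indicator definitions are assumptions made for this example, not features of any existing shared-document platform or of the ATC21S tasks.

from collections import Counter

# Hypothetical event log exported from a shared document: each entry records
# who posted and, if the post responded to someone else's entry, whom.
event_log = [
    {"author": "student_a", "reply_to": None,        "text": "I'll take clue 1."},
    {"author": "student_b", "reply_to": "student_a", "text": "OK, I'll check the graph."},
    {"author": "student_c", "reply_to": None,        "text": "found the polar bear data"},
    {"author": "student_a", "reply_to": "student_c", "text": "Can you paste the numbers?"},
]

posts_per_participant = Counter(event["author"] for event in event_log)

# Replies to another participant serve as a crude proxy for the rubric's
# "add to, evaluate, or offer an alternative to shared content" indicator.
replies_to_others = Counter(
    event["author"] for event in event_log
    if event["reply_to"] and event["reply_to"] != event["author"]
)

for participant, posts in posts_per_participant.items():
    print(f"{participant}: {posts} posts, "
          f"{replies_to_others.get(participant, 0)} replies building on others")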
Areas warranting further examination regarding professional development include a needs assessment of a large sample of educators and of the settings within which instruction is situated. A needs assessment of teacher ability to support collaboration in a digital environment would need to include, at minimum, technology proficiency; knowledge of appropriate pedagogical strategies for teaching technology use to students of different ages and abilities; instructional capacity in the domains and sub-skills that support the development of collaborative skills in students; and knowledge of and ability with instructional design principles to adequately integrate digital collaboration in multi-disciplinary contexts.
Educators prepared to teach such 21st century skills as collaboration in a digital environment cannot do so without the necessary site capacity regarding equipment and connectivity, of course. While such aspects of the digital divide are outside the scope of this research, it should be noted that technology infrastructure and planning analysis at the site or district level is warranted. Within the current economic climate, technology sustainability is a challenging goal for schools, though this may change substantially if digital devices such as digital books replace paper-and-pencil technologies in schools as new materials are adopted that are less expensive in digital format. Schools should look for opportunities to bring open-source and ubiquitous technologies into the classroom as appropriate, and to pursue cost savings that expand what is possible with the materials available in schools.
Results of the study indicate that teachers may not have the preparation necessary to teach the component skills of social-emotional learning, collaboration, and technology and to provide essential instructional support for students in these areas. These findings point to further research not only on the development of instructional design to incorporate these skills in K-12 programming, but also on professional development for both pre-service and in-service educators.
Limitations of the Study
This study has numerous limitations related to sampling and the measures, as well
as internal and external validity, as discussed subsequently in this section. A detailed
analysis of validity concerns is addressed in Chapter III beginning on page 141. Despite
limitations referred to here, the results of this study provide useful information for an
initial examination of the components of this study towards practical application in K-12
settings. The key to working with the limitations is to keep in mind the preliminary, exploratory nature of this study and not to attempt to generalize the findings widely beyond what is warranted.
Limitations of the Sample
The sample is limited by its size and the sampling procedure. The sample size of
33 groups may be sufficient for a small formative survey of skill development of
collaboration in a digital environment, but is too small to make broad generalizations or
examine subgroups for behavioral patterns. The sampling procedure was not random. It
involved researcher invitation extended to schools with personnel known to the
researchers, which could create bias. Effort was made to obtain a purposive sample, with lower- and higher-performing students in digital literacy across the countries; however, selecting schools on the basis of their providing sufficient technology may also have limited this range.
There was also sampling across countries for the cases, which introduced differences, though not uniformly throughout the sample. Different and inconsistent conditions across school samples with regard to technology access and practice, SES, nationality, setting, or configuration also contribute to this lack of sample uniformity, and such international characteristics were not entered into the trend analysis due to insufficient and non-representative data sets.
U.S. educators from the field were employed as inter-raters, and were also
surveyed regarding professional development and training in domains related to
collaboration in a digital environment. While their contributions were essential to the
study, they also exhibited sample limitations regarding sample size and procedure. The
sample was composed of eight educators: a convenience sample drawn from the local school district plus purposively sampled, remotely located rural educators and one inner-city educator. The educators had a range of years in the field from 7 to 33, and only one was male. Nevertheless, when taken as a snapshot of possible professional development needs and educator assessment of student work, the educator input provides direction to the query of whether this rubric will behave similarly across raters and what training or experiences educators need to be prepared to teach 21st century skills.
Limitations of the Measures
The two measurement instruments used in this study, the Arctic Trek Performance
Assessment task and the 3+3 Six Traits Digital Collaboration Rubric, have limitations
associated with both their content and processes.
Arctic Trek content. Educational activities can be designed to encourage and
structure effective collaborative learning by presenting open-ended or ill-structured
problems requiring shared deep understanding (Stahl, 2009). Arctic Trek did not entirely
provide this ill-structured cognitive environment and hence was not truly conducive to
collaborative learning as defined by the research community of CSCL. Instead, the tasks
were well structured with clues leading to a set of pre-defined answers, thus rendering co-
construction of knowledge unlikely as students were comparing answers extrapolated
from pre-defined content instead of generating original content. However, the
information foraging, creation of digital tools and other activities in the broader tasks did
involve considerable knowledge construction.
Complexity of skill interaction and student ability. It is possible that limitations in non-assessed content areas, such as reading/decoding, reading comprehension, or math and science skills like interpreting graphs and charts, interfered with students' ability to participate in collaboration. Similarly, students who were not fluent with technology, or who had never accessed or used a collaborative document prior to this assessment, may have had their opportunities to participate collaboratively curtailed by this lack of fluency.
As discussed in the literature review, complex sets of academic social-emotional
skills interact to culminate in multi-faceted collaborative problem solving and co-
construction of knowledge. Sub-skills necessary to enter the process of developing social
capital, for instance, include receptive and expressive communication, empathy,
perspective taking, self-awareness, social cognition/other awareness, and the ability to
lead in a facilitative manner. For digital tasks, these demands are further compounded by the necessity of exercising perspective taking, empathy, and social cognition in a remotely located environment where face-to-face contact is not available for social cuing through voice quality, facial expressions, or body posture.
3+3 Six Traits Digital Collaboration Rubric. The rubric, developed from
material generated by the qualitative analysis of student work samples through the body
of work method, discourse analysis, and cross case analysis, was constrained by the
structure of the assessment task implicit in student work, and so may not fully reflect
digital collaboration in a different context. To some extent, its conceptualization of
collaborative processes and products is construct-dependent on the Arctic Trek
Performance Assessment task, and may not adequately fit a different CSCL task.
One limitation noted by raters and reviewers from the field is the subjectivity of the language in the 3+3 Six Traits Digital Collaboration Rubric. One researcher noted that the word "shared," as in "shared content or resources," could be construed by educators to mean sharing a belief, implying that another collaborator could simply accept or reject this belief, which would not lead to co-construction of knowledge. "Share" was nonetheless used in this context in lieu of "reported content or resources" because a prior review suggested that the word "report" conveyed a sense of closed content, which seemed less aligned with collaboratively co-constructing new knowledge.
One difficulty in coding and describing traits is deciding how much something matters, or should matter, in performing a CSCL task. For example, many groups engaged in discourse around progress, so this was added to the first version of the rubric. However, reporting or tracking progress did not impact scores, and scoring high on the progress trait did not relate directly to an overall high score. Interactive regulated learning traits were present in nearly every sample and seem intuitive to any sort of collaborative exercise, yet this trait, although far more encompassing of behaviors than "reporting progress," still did not relate directly to an overall higher score on the rubric. The distinction made between the dimensions of collaborative processes and collaborative products helps to address this by identifying where groups fall in their collaborative development; that is, groups may be going through the processes that support collaboration but have not yet mastered the final steps.
Limitations to Internal Validity
As discussed in Chapter III, Messick (1995) describes validity in terms of how the results of a study are interpreted and used in a social context. In that sense, the results of this study hold validity insofar as the findings are treated as a preliminary exploration of researched topics in a new setting, offering next steps and direction for future research or potential application to practice.
Ecological sampling increases construct relevance and validity by considering authentic tasks and elements that differentiate between novice and expert performance on the task. Both the Arctic Trek performance task and the 3+3 Digital Collaboration Rubric were bounded by task analysis and ecological sampling, thus supporting internal validity.
Limitations to External Validity
Student ability in collaboration in a digital environment as displayed in the rubric-
based assessment of student work samples in this study is not explicitly generalizable to
other students, collaborative tasks, or educational settings due to sample size, sampling
procedure, and assessment process and content concerns. The patterns and trends seen
may be somewhat comparable in other groups or settings, and with other products, but
this is not known at this time. The results of this study can act as a guide for further exploration of the topic.
Implications
Findings from this study suggest that students may lack experience with the
concept and practice of collaboration, and the sub-skills necessary to collaborate
successfully within a technological framework.
The results of this study point toward a need for comprehensive development in the instructional, professional development, research, and leadership areas of K-12 education in order to support the integration of 21st century skills such as collaboration and ICT Literacy into the K-12 system.
Implications for action in the field are several. Given the distance between the research and practice strands in the field of education, the impetus for integrating 21st century skills, along with the new pedagogical strategies that will best accommodate those skills, will likely require direct action in the field of practice. Exploratory research such as this study can provide direction for that action, and the action can in turn be evaluated and refined to improve teaching and learning, with the end goal of student achievement that supports lifelong participation in and contribution to society.
Implications for Instructional Design
A comprehensive instructional design model will need to highlight overarching
constructs and instructional goals; encompass the domains that support development of
the overarching constructs; include a focus on skills and sub-skills pertinent to fluency or
mastery in those domains; consider developmental implications as well as vertical
alignment within domain and horizontal alignment between domains; plan for a variety of
instructional modalities and practice applications; generate supports and accommodations
to maximize learning; and map assessment options.
Clarifying constructs and instructional goals will be important. Clearly defined
constructs and instructional goals will facilitate the instructional design process, direct
intentions, and allow for specificity in backwards planning. Educators will need to
specify domains and align sub-skills. Diagnostic analysis of overarching constructs and explicit instructional goals allows for greater clarity in outlining essential sub-skills and in working toward alignment across developmental levels and between domains, so that skills are introduced when students are ready to learn them and when the skills can be supported and enhanced by similarly located skills in related domains.
In order to teach a greater number of students more thoroughly and successfully, with respect for individual variation in how students receive instruction, it is necessary to have a wealth of ideas for practicing the application of emerging and newly acquired skills, so that students have multiple and varied opportunities to work toward mastery. These practice applications should reflect a diversity of instructional modalities so that individual learning styles and preferences are met. This requires the review and acquisition of resources that are both conceptual and practical.
Supports and accommodations will need to be in place so that instruction and the level of support for learning are congruent with the needs of individual learners and groups of learners, producing an instructional climate in which, most of the time, learners are working at an appropriate rate and level of skill development.
Assessments, which will be key to the process, are built into the initial
instructional design phase to organize the backwards-planning process with the end
measure in mind. Knowing how the instructional goals will be assessed allows for
alignment of efforts throughout the instructional design process. Assessment for 21st
century skills such as digital collaboration can be performance-based and can be tied to
practice applications using a classroom-based performance assessment approach that
provides continuous feedback and opportunities for growth throughout the instructional
program.
Implications for Professional Development
Educators who are competent, confident, and able to seamlessly integrate a variety of skill sets through curricular content are key to solid educational planning and practices, and their expertise translates to better instructional support for students. The findings of this study highlight a potential need for professional development in the areas of technology and collaboration, as well as in the domains and sub-skills that support collaboration. Results indicate that educators may not be well prepared to teach digital collaboration and may lack the requisite training in the domains and sub-skills that contribute to the development of collaborative skill, and therefore may not feel confident providing instruction. Moreover, findings imply that educators themselves may lack experience collaborating in digital environments, further complicating their ability to integrate such skills into the classroom.
Districts and region-level leadership units can use a needs assessment to
determine the types of training experiences and levels of training to offer educators in
order to prepare them to model and instruct in technology-based collaboration and other
21st century skills. Training in instructional design and curriculum development can assist educators in creating ways to integrate 21st century skills with content areas, thus shifting the pedagogical ground away from standard didactic instructional methods.
Figure 23 outlines both the content and the format for needs-based teacher training in collaborative learning in a digital environment, based on Cognitive Task Analysis of the enabling skills in technology and collaboration, feedback from educators in the field, Backward Design principles, and the literature on best practices in professional development.
Curriculum Development for Teaching Digital Collaboration
Because collaboration in a digital environment is a relatively undefined curricular area, teachers may not have a clear idea of its utility, how to teach it, or how to measure student achievement in it. Once a desired outcome has been established, along with a way to measure success, teachers can work backwards to plan learning opportunities with scaffolds that enhance success, using the Backward Design strategy for curriculum development and instructional design described by Wiggins and McTighe (2001).
Figure 23. Collaboration in a Digital Environment professional development model with content and format.
Several curricular frameworks exist for 21st century skills, but few are integrated
into instructional programs in the classroom. Consequently, teachers wishing to teach
21st century skills must find a way to integrate these skills into existing curriculum, or
create a 21st century skills curriculum, including the use of collaborative learning in a
digital environment, that can be folded into core content areas, such as science, social
studies or literature.
The use of Professional Learning Communities and a site-based approach to 21st
century skills could enhance efforts by having a school-wide focus on aligned SEL,
NETS, and Cooperative Learning skills within a supportive atmosphere of continuous
improvement. See Figure 24 for a possible Professional Development model created in
response to reported teacher needs and student work sample Notebooks.
[Figure 23 components: Site-based Professional Learning Community Model; Training in Technological Tools; Training in Social-Emotional Learning; Training in Cooperative Learning.]
Please Note: Assessment of student work is a two-way, dual assessment: assessed by process and by product, and by teacher and by group.
Figure 24. Digital Collaboration Professional Development Process.
Lesson planning, refinement, and alignment. The use of Professional Learning Communities working as site- or district-based teams to develop 21st century skills curriculum in technology and collaboration could lend itself to continuous and reflective planning of lessons, assessments, and outcomes for student growth in this area, in alignment with the research-based recommendations discussed in Chapter I. As teachers examine student progress, observed student needs for skill development will drive planning efforts and alignment between skills and grade levels in the sub-domain areas of cooperative learning, technology, and social-emotional learning. As student needs become clear, educators can focus their professional development efforts on the domain areas as well as on the development of their 21st century skills curriculum with collaboration in a digital environment.
[Figure 24 process steps: Practice strategies among the PLC group; Map curriculum and develop lessons for students; Assess student work; Re-teach or take to the next level; Share successes and challenges.]
Conclusion
This exploratory study addresses the relatively new research area of skill development in collaboration in a digital environment. Drawing on a sample of 11-, 13-, and 15-year-olds, the study is intended to connect the research with direct systemic implications for application in practice. As such, this study attempts to anticipate and attend to the various needs of students and practitioners in the field in order to facilitate the integration and instruction of digital collaboration and the requisite supportive skills within the K-12 educational setting.
The future of K-12 education can be positively influenced by the inclusion of 21st
century skills. Viewed broadly as a set of guidelines for complex thinking and
application of abilities, these skills can enrich instruction to help students create deeper
meaning at all levels of learning and become ever more proficient in their capacity for
meaningful participation in a global society.
APPENDIX A
BLUEPRINT CONSTRUCT CHECKLISTS
Consumer in Social Networks
(ATC21S, 2010)
These snippets show constructs and example loadings of a few item scores in the
scenario.
[Checklist tables not reproducible here: Consumer in Social Networks and Producer in Social Networks constructs, with example indicators at low, middle, and high levels (e.g., discriminating and conscious consumer; emerging, functional, and creative producer) and example item loadings.]
Developer of Social Capital
(ATC21S, 2010)
[Checklist table not reproducible here: Developer of Social Capital construct, with example indicators for the emerging, functional, proficient, and visionary connector levels and example item loadings.]
Participator in Intellectual Capital (Collective Intelligence)
(ATC21S, 2010)
[Checklist table not reproducible here: Participator in Intellectual Capital (Collective Intelligence) construct, with example indicators for the emerging, functional, proficient, and visionary builder levels and example item loadings.]
APPENDIX B
SUMMARY TABLE OF ATC21S SCENARIO
BLUEPRINT DATA COLLECTION
Assessment Blueprint ATC21S Demonstration Tasks: ICT Literacy
(ATC21S, 2010)
[Summary table not reproducible here: ICT Literacy—Learning in Digital Communities constructs (Consumer, Producer, Social Capital, Intellectual Capital) mapped to learning outcome levels and the demonstration scenarios (Webspiration poetry, Arctic Trek, and second-language chat). A note indicates that some constructed-response (CR) items are scored up through the listed level, the listed level being the top score.]
APPENDIX C
KSAVE MODELS
Ways of Working: Communication
(Binkley et al., 2012)
Ways of Working: Collaboration and Teamwork
(Binkley et al., 2012)
Tools for Working: Information Literacy
(Binkley et al., 2012)
Tools for Working: Information Communication Technology Literacy
(Binkley et al., 2012)
APPENDIX D
SIX TRAITS DIGITAL COLLABORATION CHECKLIST
Six Trait Digital Collaboration Checklist

Each trait is tallied by the number of incidents observed: Not Present (0), 1-3, 4-6, 7-9, or more than 9.

Seek Help or Give Support
Direct Process (someone directs)
Clarify Process (someone clarifies)
Time Management (awareness of time constraints, conserving efforts)
Goal Setting (deciding on a task or benchmark to achieve)
Develop Threads of Discourse (that may or may not be task related)
Visual Organization of Document (numbers, chunks, colors, outline form, underlines, etc.)
Social Discourse (hi, please, thanks, lol, or other social text speak)
# of Entries
# of Questions
Role Conflict or Confusion
Task Conflict or Confusion
Affective Statements (including about how difficult the task is)
Off-task Behaviors or Topics (anything not related to working with group, content, or process)

Comments:
APPENDIX E
SIX TRAITS DIGITAL COLLABORATION RUBRIC
Six Trait Digital Collaboration Rubric

Group:
Rater:

Each trait is scored as Non-collaborative (0), Emerging (1), Developing (2), or Capable (3).

Identification & Role Assignment ("Who are the participants and what are their roles?")
Non-collaborative (0): No id of participants or discussion of roles
Emerging (1): Id or roles mentioned or queried; may be role conflict
Developing (2): Some participants id or chose roles; or role conflict
Capable (3): All participants id and all roles are assigned

Task Assignment ("Who is responsible for what tasks?")
Non-collaborative (0): No discussion of task assignment
Emerging (1): Task assignment mentioned or queried; or task conflict
Developing (2): Some tasks are assigned; may be task conflict
Capable (3): All tasks assigned; no task conflict

Report/Share Content ("Have answers or responses to tasks been posted?")
Non-collaborative (0): No content is reported or shared
Emerging (1): Some content is mentioned or discussed; rationale not given
Developing (2): Some content is shared; some rationale given
Capable (3): Most content is shared, with some rationale; may not reflect all participants

Report/Share or Check Progress ("How is everyone progressing? Is anyone stuck or needing help?")
Non-collaborative (0): No mention of progress in task
Emerging (1): Progress status mentioned, but no explanation or help seeking
Developing (2): Progress status mentioned and explained; and/or check progress of others; and/or help sought or offered
Capable (3): Participants skilled and task complete without needing check-in

Collaboration ("Did participants add to, evaluate or offer an alternative response to the shared content?")
Non-collaborative (0): No shared content or shared but not acknowledged
Emerging (1): Shared content acknowledged; no evaluative or alternative comments offered
Developing (2): Some shared content is acknowledged, corrected, added to or evaluated
Capable (3): Each piece of shared content is acknowledged, corrected, added to or evaluated

Co-construction of Knowledge ("Did participants use shared and evaluated content to construct final answers or responses or complete a task?")
Non-collaborative (0): No content shared, or shared but no response, or has response but no evaluation
Emerging (1): Agreement with or alternative response to evaluated shared content is mentioned, requested or discussed but not resolved
Developing (2): Agreement on one or more answers or responses through evaluated, corrected or enhanced content
Capable (3): Agreement on each answer or response through evaluated, corrected or enhanced content

Total Score:
Comments:
APPENDIX F
3+3 SIX TRAITS DIGITAL COLLABORATION RUBRIC
3+3 Six Trait Digital Collaboration Rubric

Group:
Rater:

Each trait is scored as Non-collaborative (0), Emerging (1), Developing (2), or Capable (3), for a trait score of ___/3. Traits are grouped into two dimensions: Collaborative Learning Processes and Collaborative Learning Products.

Dimension: Collaborative Learning Processes

Identification & Role Assignment ("Who are the participants and what are their roles?")
Non-collaborative (0): No id of participants or discussion of roles
Emerging (1): Id or roles mentioned or queried; may have role conflict
Developing (2): Some participants id or take roles; or role conflict
Capable (3): All participants id and all roles are assigned; any role conflict is resolved

Task Assignment ("Who is responsible for what tasks?")
Non-collaborative (0): No discussion of task assignment
Emerging (1): Task assignment mentioned or queried; may have task conflict
Developing (2): Some tasks are assigned; may have task conflict
Capable (3): All tasks assigned; no task conflict

Interactive Regulated Learning ("Is there evidence of seeking or offering help, reporting progress, clarifying process, self or group evaluation, time management, task orientation, goal setting, mediation, or appreciation?")
Non-collaborative (0): No evidence of seeking or offering help; reporting progress; clarifying process; self or group evaluation; time management; task orientation; goal setting; mediation; or appreciation
Emerging (1): Evidence of any Interactive Regulated Learning behaviors as described above, but participants do not acknowledge or respond
Developing (2): Interactive Regulated Learning behaviors are acknowledged or responded to, but the response does not resolve the need or issue raised
Capable (3): Interactive Regulated Learning behaviors elicit acknowledgement or response through the resolution of the need or issue, if necessary

Total score on Collaborative Learning Processes dimension: ___/9

Dimension: Collaborative Learning Products

Shared Content ("Have answers or responses to tasks been posted?")
Non-collaborative (0): No content is shared
Emerging (1): Some content is mentioned or discussed; rationale not given
Developing (2): Some content is shared; some rationale given
Capable (3): Most content is shared, with some rationale; may not reflect all participants

Collaboration ("Did participants add to, evaluate or offer an alternative response to the shared content?")
Non-collaborative (0): No shared content, or content shared but not acknowledged; requested but not submitted
Emerging (1): Some shared content acknowledged; no evaluative or alternative comments offered
Developing (2): Some shared content is acknowledged, corrected, added to or evaluated
Capable (3): Each piece of shared content is acknowledged, corrected, added to or evaluated

Co-construction of Knowledge ("Did participants use shared and evaluated content to construct final answers or responses or complete a task?")
Non-collaborative (0): No content shared, or content shared but no response, or has response but no evaluation
Emerging (1): Agreement with or alternative response to evaluated shared content is mentioned, requested or discussed but not resolved
Developing (2): Agreement on one or more answers or responses through evaluated, corrected or enhanced content
Capable (3): Agreement on each answer or response through evaluated, corrected or enhanced content

Total score on Collaborative Learning Products dimension: ___/9
APPENDIX G
ASYNCHRONOUS INTER-RATER DIRECTIONS
Thank you so much for helping me out!
You are serving as a "purposively sampled inter-rater" meaning that we needed teachers
from different levels of education to rate the same student samples using the rubric, to see
how much variance there is in the scores. If the rubric is reliable, it will perform
consistently across raters and provide the same or near-same score for each sample no
matter who (within the profession) uses the tool.
There are actually 8 student work samples (I realize I said 7), labeled “Notebook #”
There are eleven Attachments to the email:
1) This orientation/set of instructions
2) Notebooks 1, 2, 6, 8, 14, 22, 30 and 33.
3) The 3 + 3 Six Trait Digital Collaboration Rubric.
4) A Scoring sheet that is just a word doc so you can add to it and send it back to me. I
created a format for reporting your scores.
Rubric:
The rubric has two dimensions: collaborative learning processes and collaborative
learning products. Therefore, each group will get a score per each trait, and then a “total
score” for each dimension.
The Rubric could be used in a formative manner to guide instruction, as well as in a
summative manner to assess learning and progress in this skill area.
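(The following illustration is an addition to this appendix and was not part of the original packet sent to raters. It sketches, in Python, how the per-trait scores could be rolled up into the two dimension totals described above; the trait labels and the example scores are hypothetical.)

    # Minimal sketch (hypothetical labels and scores): rolling up 3+3 rubric
    # trait scores (0-3 each) into the two dimension totals (0-9 each).
    PROCESS_TRAITS = ["identification_and_roles", "task_assignment",
                      "interactive_regulated_learning"]
    PRODUCT_TRAITS = ["shared_content", "collaboration",
                      "co_construction_of_knowledge"]

    def dimension_totals(trait_scores):
        """Sum the trait scores for each of the two rubric dimensions."""
        return {
            "collaborative_learning_processes": sum(trait_scores[t] for t in PROCESS_TRAITS),
            "collaborative_learning_products": sum(trait_scores[t] for t in PRODUCT_TRAITS),
        }

    # Example for one group's Notebook (scores are invented for illustration):
    example = {"identification_and_roles": 2, "task_assignment": 3,
               "interactive_regulated_learning": 1, "shared_content": 2,
               "collaboration": 1, "co_construction_of_knowledge": 0}
    print(dimension_totals(example))
    # -> {'collaborative_learning_processes': 6, 'collaborative_learning_products': 3}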
To assess the student work samples using the rubric, please follow this procedure:
1. Read thru the student work sample. You may make notations on the sample to
highlight or code information if you desire.
2. You may want to read through the sample again for clarity.
3. Read through the traits on the Rubric.
4. With the work sample and rubric side by side, match the evidence from student
work to the elements in the descriptors on the rubric.
5. Refer to both the descriptors and student work sample as much as you need to in
order to make a thorough evaluation of the work.
6. Circle the appropriate descriptor box for each trait.
7. Add the scores per trait, as outlined per descriptor box, into the score column.
8. Send me your results by Sunday evening, February 19th.
Please note these aspects about the student work:
Spelling, grammar, and writing conventions were not specified as integral to the task
and should not be considered in evaluating this work.
Also, the students were of different ages (11, 13, and 15) and are from different
countries (US, Australia and Singapore). The data is de-identified, so we do not know
which students belong to which documents, and the sample size will be too small to
analyze on a country level.
Assessment Background:
Students were given a 45-minute computer-based performance assessment called
Arctic Trek in a team format. Teams of 3-4 students did an interactive web
search/web quest type exercise to demonstrate their ability with technology and
collaboration, among other skills. The team members were co-located on separate
computers and were instructed not to talk to one another if they were near enough to
each other in the classroom. They were not told beforehand who was on their team,
and each team member was assigned a number (ex: 144). Their only means of
communication to collaboratively solve the clues and enter their information was
through the Notebook, which is a Google doc.
Students had to find the link to the shared document/Notebook, and enter it to share
the information they found, get help, or other such behaviors. The main
administration instructions for teachers are copied below. Teachers were to be very
hands-off as with most assessments, and not give help even if the students had
difficulty accessing the doc or links or the computer.
These samples are from cognitive lab and pilot data, so there was some variance
across classrooms as the instrument was adjusted slightly.
Students were told in the Trek to find the answers to questions by searching the clues,
and to access their team members on the Notebook, choose roles, and share answers.
Test Administration instructions (For teachers administering Arctic Trek)
In about 5 MINUTES, give students "ASK THREE THEN ME" directions. Every student
is expected to explore three sources of information before asking instructor or test
administrator help. These three are: (1) task directions and resources on each screen, (2)
questions online of team members to get and give help, and (3) access internet for
information PRIOR to requesting help. Instructor help is to be RARELY given (see
below for instructions on how), and students are to explore and do their best with the
information and team members available. Instruct students that collaborating and using
the Internet is expected and is NOT cheating for this assessment.
SAY:
“I will provide you with ASK THREE THEN ME directions. Every student is
expected to use three sources of information before asking for help. First, you
are expected to use task directions and resources on each screen. Second, work
with your team members to get and give help. Third, use the internet for
information. PLEASE KEEP IN MIND THAT THIS IS NOT CHEATING.
Otherwise, you should explore the tasks and do the best you can with the
information and team members provided. You are being assessed on YOUR
ABILITY to work with tools and people online.”
Again, thank you so very much, and please don't hesitate to email me with any questions you may have.
Barbara
APPENDIX H
SAMPLE STUDENT NOTEBOOK: HIGH SCORING
APPENDIX I
SAMPLE STUDENT NOTEBOOK: LOW SCORING
APPENDIX J
ISTE STANDARDS FOR TECHNOLOGY INSTRUCTION
(ISTE, 2008)
(ISTE, 2008)
APPENDIX K
ISTE ESSENTIAL CONDITIONS FOR TECHNOLOGY IN EDUCATION
(ISTE, 2008)
APPENDIX L
TEST ADMINISTRATOR MANUAL FOR ARCTIC TREK ASSESSMENT
Berkeley Evaluation and Assessment Research Center
University of California, Berkeley
ATC21S
ASSESSMENT & TEACHING OF 21st CENTURY SKILLS
PILOT TESTING
Test Administrator Manual
Checklist for Test Proctors
Note: This checklist is provided as a summary only. It is essential that you read
this entire guide in order to ensure the proper administration of the test.
Before the testing
• Read the Test Administrator Manual in its entirety.
• Print this manual if you are reading an electronic copy of the manual and think you might need a paper copy during the administration of the test.
• Communicate with the Test Coordinator (Project Administrator) of your country to review the testing schedule and to arrange for the students who require accommodations. Also review procedures in the Test Administrator Manual.
• Check if technology requirements are met on your student computers (see Technical Requirements section).
• Receive your student logins and passwords, and online access to instructor preview scenarios (contact Test Coordinator for student logins and passwords).
• Access online preview scenarios to become familiar with them.
• Decide if Kodu is to be installed or not (optional but engaging for students).
• Ensure that students are provided with the necessary student ID and passwords. If you are planning to distribute login and password forms, make sure that you have forms available printed in advance.
• Have a timer available.
• Ensure administrator knows how to correctly answer all parts of the scenario.
• Ensure administrator has access to a computer workstation for every student.
• Ensure computers meet requirements and have access to Internet, tasks and links (see Technical Requirements section).
During the testing
• Post a “Testing—Please Do Not Disturb” sign on the room where testing is conducted.
• Ensure all students have comfortable and adequate workspaces, and that students on the same team are seated at least two to three workstations apart, to effectively encourage interactions to be online.
• Monitor students to ensure they are working in the correct sections of the test.
• Monitor students’ handling of computer hardware to keep it in proper condition.
• If you are administering accommodations, make sure that the accommodations are provided as determined prior to testing and according to the regulations of the region in which the test is being administered.
• Take notes during the test of any testing irregularities and notify the test coordinator of your country after the testing. Be as specific as possible. If you notice any technical issues or issues with the computer testing system, please record the issue in the Teach Aid text box for the computer on which the problem was found.
After the testing
• Verify that all login and password forms have been collected.
• Verify that all computer hardware used by students during testing is left in proper condition.
• Verify that any testing irregularities are reported to the testing coordinator.
Guidelines for a Suitable Testing Environment
• The testing room should be appropriately heated or cooled, adequately ventilated,
and free from distractions.
• Lighting and screen brightness should enable all examinees to read the computer
screen in comfort. It should not produce shadows or glare on the computer screen
or writing surface.
• The testing room should comfortably accommodate the number of testing stations
placed in it.
• Position the computer monitor, keyboard, and mouse properly for ease of use
without strain.
• Testing room must be quiet throughout all test administrations. When testing is
scheduled, or is in progress, other activities that would disrupt the testing
environment should not be conducted.
• Depending on the regulation of the state and country of the testing, the building,
testing rooms, and restrooms should be accessible to people with disabilities,
including wheelchair access.
• Cell phones that might distract students from the test should be turned off.
ATC21S Directions for Administering "Learning in Digital
Networks" Assessments
Note: This guide assumes 50 minutes scheduled for administering EACH scenario.
This will consist of a 5-minute instruction period, and a 45-minute test period.
BEFORE ADMINISTERING, you MUST verify the technical
requirements at http://bearcenter.berkeley.edu/test/test.html for each of the
student computers. To do this, login to the link from the student computers
and answer all the questions. The answers will be specific to each computer,
so if you do not have a standard computer setup, each computer will need to
be checked.
Test Administration instructions
In about 5 MINUTES, give students "ASK THREE THEN ME" directions. Every student
is expected to explore three sources of information before asking instructor or test
administrator help. These three are: (1) task directions and resources on each screen, (2)
questions online of team members to get and give help, and (3) access internet for
information PRIOR to requesting help. Instructor help is to be RARELY given (see
below for instructions on how), and students are to explore and do their best with the
information and team members available. Instruct students that collaborating and using
the Internet is expected and is NOT cheating for this assessment.
SAY: “I will provide you with ASK THREE THEN ME directions. Every
student is expected to use three sources of information before asking for help.
First, you are expected to use task directions and resources on each screen.
Second, work with your team members to get and give help. Third, use the
internet for information. PLEASE KEEP IN MIND THAT THIS IS NOT
CHEATING. Otherwise, you should explore the tasks and do the best you can
with the information and team members provided. You are being assessed on
YOUR ABILITY to work with tools and people online.”
Provide each student with their correct login and password for FADS (the delivery
system).
Write down http://bearcenter.berkeley.edu/atc21s-americas/ on the board or provide on
the paper.
SAY: “In the paper handed to you, you will find the login ID and password
you need in order to login to the system from the website written on the board (or
provided on the paper) (Give students the name of the practice test to which they
are assigned, see the sampling matrix provided by your country representative).
“Now you will login to the system. You will select the task and start the test. (Give
students the name of the instrument being delivered. Tell them to select this
name on the screen). If you have a SERIOUS technical problem with either the
test or the computer, please raise your hand and I will help you. You have 45
minutes. Please pace your time appropriately and do not spend too much time on a particular task.”
If students are taking Global Human Legacy Task 2011 (Webspiration
poetry), say:
SAY: “The average time you have for each screen is about 5 minutes. Note that once in Webspiration (Global Human Legacy Task 2011, poetry), you should try to leave the document by selecting Document > Sign Out. Otherwise, next time the orange box with the link to your document might not appear. Then you will need to find your document under the Recently Opened menu that you will see. If you encounter this problem, ask me for help.”
SET TIME FOR 45 MINUTES. Starting time: __________ Ending time:
__________
(Write the “Starting time” and “Ending time” on the board if necessary.)
Note: In RARE cases, if student needs help and CANNOT PROCEED AT ALL
during the assessment, administrator may provide assistance. To do so, FIRST record
information in TeachAid screen available by clicking “T” icon in lower right of student
screen, THEN provide help to student face-to-face. This is primarily for special needs
students or to record unusual technical problems that do not occur for most students so
that they can be addressed in future versions.
When the 45-minute testing period is complete:
SAY: ”Please stop working, logout from the system and turn off
computers.”
Note: Collect all login and password forms distributed to students earlier. Make sure
that all computer hardware used by students during testing is left in proper condition. Do
not forget to report any technical issues and testing irregularities to the testing
coordinator of your country.
Technical Requirements
BEFORE ADMINISTERING, you MUST verify the technical
requirements at http://bearcenter.berkeley.edu/test/test.html for each of the
student computers. To do this, login to the link from the student computers
and answer all the questions. The answers will be specific to each computer,
so if you do not have a standard computer setup, each computer will need to
be checked.
Task Access:
Web address (for U.S. administration only):
http://bearcenter.berkeley.edu/atc21s-americas/
login and password: see assigned list or contact test coordinator of your
country.
Once logged in, select the desired assessment from the list. Note that
ATC21S cognitive laboratory passwords are preset to access only one
scenario each:
1. Global Human Legacy Task 2011 (poetry)
2. Global Collaboration Contest 2011 (Arctic trek)
3A. Global 2nd Language Chat: Native Speaker
3B. Global 2nd Language Chat: Language Learner
If you are using demo accounts to preview the tasks, make sure you are
using the right age level demo accounts.
Technical details:
• devices supported - PC or Mac
• headphones for students and color monitor required
• browsers - PC: IE 7.0+, FireFox 3.0+; Mac: Safari 4.0+, FireFox 3.0+
• browser settings - javascript and pop-up windows must be enabled
• plugin - Adobe Flash 10.3+
• internet connectivity - broadband suggested (1.5Mbit/s or higher)
• screen size/resolution - 1024x768 or higher recommended, works at less
• access to external websites in the tasks
• microphone may be required for some scenarios
• permissions to download files from a browser.
• empty browser caches prior to test administration
• test audio for playing podcasts in advance
• ensure no auto-update software will launch to impede the use of the computer in a timely manner
• ensure that the network performance is adequate:
1. Direct your browser to "http://www.speakeasy.net/speedtest/"
2. Click on "Dallas, TX"
3. Note the Download Speed and Upload Speed. A speed below 1.0 Mbps or 0.7 Mbps indicates inadequate performance.
Technical Assistance
For ATC21S technical assistance, contact bearit@berkeley.edu. Note that
technical assistance will be provided within two business days, with business
days/times 10 am-5 pm Monday-Friday U.S. Pacific Standard Time.
HIGHLY CONFIDENTIAL
ANSWER KEY for ARCTIC TREK CLUES
(answers shown below are confidential)
This answer key is provided for teachers who are previewing the Arctic
Trek scenario and would like to check clue answers as they preview the
task.
Age 11:
Clue 1: Arctic Basin is expected - Link: Polar Bear Map.
Clue 2: Arctic Fox is expected - Link: Land Animal Food.
Clue 3: Answer may be 3 (1 point), 5 (2 points), or 6 (3 points), any other
number is no credit. Link: Polar Bear Population.
Clue 4: Correct answer might look like the following: “In most places the
polar bear population is dropping, so that could be a problem for polar
bears” Link: Polar Bear Population.
For the line graph, a number of lines and sliders can be used. We want to see a reasonable trend that approximates the data, and whether
students can explain why they used what they used. This task measures ICT
Literacy with some quantitative reasoning representations.
For the spinners, 5 spinner sections are ideal, each section being roughly
proportional to the corresponding bars on the graph.
Kodu: Whether or not Kodu is installed is an assessment question. Students
should attempt to check and answer themselves. Their response will be
compared to the information received from the corresponding country.
Countries for which Kodu is installed can then continue with the screen.
Clue 5: Answer might be similar to: “No, the web screen does not give
information to answer this question.” For the question about whether it is
possible to estimate, students should be able to say they cannot estimate by
using the Finnish page supplied, but might be able to estimate by using other
information resources online, for instance. Their reasoning argument for
how to estimate using digital resources will be examined. Link: Finnish
Arctic club.
Age 13:
Clue 1: Barents sea - Link: Polar Bear Map.
Clue 2: Any two of the following: Arctic Fox, Alopex lagopus, White Fox,
Snow Fox - Link: Land Animal Food.
Clue 3: Answer may be 3 (1 point), 5 (2 points), or 6 (3 points), any other
number is no credit. Link: Polar Bear Population.
Clue 4: Correct answer might look like the following: “In most places the
polar bear population is dropping, so that could be a problem for polar
bears” Link: Polar Bear Population.
For the line graph, a number of lines and sliders can be used. We want to see a reasonable trend that approximates the data, and whether
students can explain why they used what they used. This task measures ICT
Literacy with some quantitative reasoning representations.
For the spinners, 5 spinner sections are ideal, each section being roughly
proportional to the corresponding bars on the graph.
Kodu: Whether or not Kodu is installed is an assessment question. Students
should attempt to check and answer themselves. Their response will be
compared to the information received from the corresponding country.
Countries for which Kodu is installed can then continue with the screen.
Clue 5: Answer might be similar to: “No, the web screen does not give
information to answer this question.” For the question about whether it is
possible to estimate, students should be able to say they cannot estimate by
using the Finnish page supplied, but might be able to estimate by using other
information resources online, for instance. Their reasoning argument for
how to estimate using digital resources will be examined. Link: Finnish
Arctic club.
Age 15:
Clue 1: Barents sea - Link: Polar Bear Map.
Clue 2: Snow - Link: Land Animal Food.
Clue 3: Answer may be 3 (1 point), 5 (2 points), or 6 (3 points), any other
number is no credit. Link: Polar Bear Population.
Clue 4: Correct answer might look like the following: “In most places the
polar bear population is dropping, so that could be a problem for polar
bears” Link: Polar Bear Population.
For the line graph, a number of lines and sliders can be used. We want to see a reasonable trend that approximates the data, and whether
students can explain why they used what they used. This task measures ICT
Literacy with some quantitative reasoning representations.
For the spinners, 5 spinner sections are ideal, each section being roughly
proportional to the corresponding bars on the graph.
Kodu: Whether or not Kodu is installed is an assessment question. Students
should attempt to check and answer themselves. Their response will be
compared to the information received from the corresponding country.
Countries for which Kodu is installed can then continue with the screen.
Clue 5: Answer might be similar to: “No, the web screen does not give
information to answer this question.” For the question about whether it is
possible to estimate, students should be able to say they cannot estimate by
using the Finnish page supplied, but might be able to estimate by using other
information resources online, for instance. Their reasoning argument for
how to estimate using digital resources will be examined. Link: Finnish
Arctic club.
Some Screen Shot Examples from tasks:
First screens from an example scenario, for your reference. Please see the
online preview scenarios you will receive, referenced above, in order to
obtain preview access to your practice and assessment screens.
APPENDIX M
SAMPLES OF CODED NOTEBOOKS
Sample of Notebooks Coded Through Qualitative Analysis
Notebook 2
Sample of Notebooks Coded Through Qualitative Analysis
Notebook 8
APPENDIX N
SURVEY OF EDUCATORS
Survey: Teacher Technology Use and Professional Development for CSCL
Experience: How many years teaching: grade levels:
Technology (Circle all that apply)
I was trained in Information Communication Technology:
pre-service in-service through district sought training on my own no training in ICT
Do you use technology at home?
Computer laptop handheld device other Frequency?
Do you use technology at school?
Computer laptop handheld device other Frequency?
Do you use technology with your students?
Computer laptop handheld device other Frequency?
How do you currently evaluate your student tech work, if applicable?
Cooperative Learning: (circle all that apply)
I was trained in Cooperative Learning:
pre-service in-service through district sought training on my own no training
I use cooperative learning components in my classroom instruction
Yes No Frequency:
Collaboration:
I use collaborative working arrangements with my students
Yes No Frequency:
I do collaborative work in a technological setting in my personal life
No Yes Google docs wikis blogs prezis animoto other tech tool/program
Please state other:
I do collaborative work in a technological setting in my professional/school site setting
No Yes Google docs wikis blogs prezis animoto or other tech tool
Please state other:
I use collaborative work in a technological setting with my students
No Yes Google docs wikis blogs prezis animoto or other tech tool/program
Please state other:
If not, why not?
Lack of technology lack of time (other curricular needs) age of students
I don’t feel proficient/confident to teach these skills other (Please describe)
Social-Emotional Learning:
I was trained in SEL:
pre-service in-service thru district sought training on my own no training in SEL
I teach SEL skills to my students: As needed Regularly Frequency:
I feel confident teaching social-emotional skills to my students Yes No
I have a curriculum for SEL (please name)
21st Century Skills:
I have seen or could identify a framework for 21st century skills Yes No
Professional Learning Communities (PLC’s)
I participate in a / some PLC’s through:
school site district a professional organization
APPENDIX O
ACADEMIC SOCIAL EMOTIONAL LEARNING STRANDS
Academic Social Emotional Learning Strands (Casel, 2003)
Self awareness: Recognizing one's emotions and values as well as one's strengths and limitations; self-efficacy.
Self management: Managing emotions and behaviors to achieve one's goals; impulse control and stress management; self-motivation and discipline; goal setting and organizational skills.
Social awareness: Perspective taking; showing and understanding empathy for others; appreciating diversity; having respect for others.
Relationship skills: Communication, social engagement, building relationships, and working cooperatively; negotiation, refusal, and conflict management; help seeking and providing; forming positive relationships, working in teams, and dealing effectively with conflict.
Responsible decision-making: Problem identification and situation analysis; problem solving; evaluation and reflection; personal, moral, and ethical behavior; making ethical, constructive choices about personal and social behavior.
APPENDIX P
TECHNICAL AND INTER-RATER STUDIES
Rubric inter-rater exploration. The rubric used to score the Notebooks was developed
using the Body of Work Method and Discourse Analysis as described above. In order to
increase the content validity and technical adequacy, a draft rubric was sent to
professionals in the field for review of the components. The professionals selected included at least one expert from research in communication, collaboration, or digital collaboration and one expert from practice in middle level through high school teaching. Correspondence theory and matching were used to sample the evidence of student work in the Notebooks documenting student use of the ATC21S construct components, along with coherence theory, matching the evidence to the emergent theory and relevant literature for theory testing (Shadish, Cook, & Campbell, 2002).
Rubric technical adequacy. Referring to the American Psychological Association
Standards for Testing (2002), several methods were used to estimate the evidence quality
of the rubric.
Criterion validity. Criterion validity of the rubric was established through work in Phases 1, 2, and 3, with iterative review of student work sample characteristics against the described Traits on the rubric, including review by the inter-raters.
Inter-rater reliability. The Body of Work scoring and range finding process was
replicated with eight inter-raters purposively chosen from the field in order to check for
alternative ideas about proficiency; this is discussed further in the following sections.
Performance levels were then narrowed in the pinpointing phase, designating levels of
proficiency as per completion of stated assessment tasks. Purposive sampling was used to
select Notebooks and raters for the inter-rater comparison. Approximately 25% of the
Notebooks (eight) were purposively sampled for cross-rater analysis. Notebooks were
selected as a teachable purposive sample reflecting differentiated patterns of skill development in order to represent different instructional levels. Notebook patterns of skill development were based on initial placement from low to high collaborative skill as determined through the Body of Work method. Saturation evaluation analysis based on Discount Usability Engineering, or Heuristic Evaluation Quality Scoring (HEQS), was used to add raters from an initial three to a maximum of eight, depending on when the information function began to stabilize. Research on Discount Usability Engineering and HEQS holds that after four raters, a substantial amount of additional new information is rarely gathered (Kirmani, 2008; Nielsen & Landauer, 1993). In this study, eight raters were involved in the inter-rater process, three in an asynchronous format and five in a moderated session; the eight Notebook samples were each reviewed by all eight raters. Descriptions of the rating session processes appear below.
Purposive raters. The sample frame of raters was chosen to reflect the different experience-based perspectives available in education and the number of raters that provides the highest level of new information. The crossover perspective, blending years of practice with doctoral-level study, can be represented by the author of this study. Other perspectives include the research and higher education perspective and the K-12 professional practice perspective. At least one rater with a background in research and higher education and one rater with experience in practice teaching at the middle to high school level were included. Descriptive statistics were used to look at the trends between raters.
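(As an added illustration of the kind of descriptive comparison referred to here, the short Python sketch below computes, for each Notebook, the mean rating, the range across raters, and the share of raters agreeing on the most common score. The Notebook labels and scores are hypothetical placeholders, not data from this study.)

    # Minimal sketch (hypothetical data): simple descriptive statistics for
    # comparing rubric scores assigned to the same Notebooks by several raters.
    from statistics import mean

    # ratings[notebook] -> one total score per rater (invented values)
    ratings = {
        "Notebook A": [12, 11, 12, 13],
        "Notebook B": [5, 6, 5, 5],
        "Notebook C": [16, 15, 16, 16],
    }

    for notebook, scores in ratings.items():
        spread = max(scores) - min(scores)                 # range across raters
        modal = max(set(scores), key=scores.count)         # most common score
        agreement = scores.count(modal) / len(scores)      # share agreeing on it
        print(f"{notebook}: mean={mean(scores):.1f}, range={spread}, "
              f"agreement={agreement:.0%}")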
Administration procedures. In addition to statistical reliability estimates, administration
of the rubric included administration instructions that constrained the respondent’s frame-
of-reference regarding the student work considered when completing the rating.
Notebooks remained de-identified as to age and regional source so as not to bias the review.
Differentiated rating situations. Raters were recruited for both remote, asynchronous
rating and a face-to-face rating session. The three asynchronous raters were sent a
packet with rubric administration instructions that included background information on
the assessment, along with the eight samples and a sheet to record their results. They
also took a survey, shown in Appendix N, regarding their professional development and
classroom use of strategies in the areas of collaboration, technology, and
computer-supported collaboration. Asynchronous raters were not in contact with other
raters and did not have a practice-rating sample with feedback.
Moderated rating session. Following the use of the rubric by educators serving as
asynchronous inter-raters, a trial was conducted with a face-to-face, moderated group of
educators gathered in Eugene, Oregon. Five educators met after school to review the
purpose and structure of the assessment task, review the rubric, and practice-rate two
samples with moderation. Face-to-face raters received the same administration
instructions and background information on the assessment as the asynchronous raters.
Following the practice rating, teachers discussed the scores they gave with the other
teachers and came to agreement about which student sample elements met rubric traits.
Questioning and clarification of traits occurred using the samples rated (Notebooks 32
and 33). Educators then individually rated the same eight samples the other inter-raters
used.
REFERENCES CITED
Agosto, D. E., & Abbas, J. (2010). High school seniors' social network and other ICT use
preferences and concerns. Proceedings of the American Society for Information
Science and Technology, 47, 1-10. doi: 10.1002/meet.14504701025
Alreck, P., & Settle, R. (1995). The survey research handbook. New York, NY: McGraw-Hill.
American Educational Research Association, American Psychological Association &
National Council on Measurement in Education. (1999). Standards for
educational and psychological testing. Washington, DC: Author.
American Management Association. (2010). AMA 2010 Critical Skills Survey. Retrieved
from http://www.amanet.org/news/AMA-2010-critcal-skills-survey.aspx
American Psychological Association. (2002). Standards for educational and
psychological testing. Washington, D.C.: Author.
Anderson-Inman, L., Knox-Quinn, C., & Tromba, P. (1996). Synchronous writing
environments: Real-time interaction in cyberspace. Journal of Adolescent & Adult
Literacy, 40(2), 134-138. Retrieved from http://www.jstor.org/stable/40016751
Baker, E.L. (2008). Learning and assessment in an accountability context. In K.E. Ryan
and L. A. Shepard (Eds.) The future of test-based educational accountability (pp.
277-287). New York, NY: Routledge.
Balistreri, S., Di Giacomo, F. T., Noisette, I., & Ptak, T. (2011). Global education:
Connections, concepts and careers. Retrieved from College Board website
http://professionals.collegeboard.com/data-reports-research/cb/Global_Education-
Connections_Concepts_Careers
Barron, B., & Darling-Hammond, L. (2008). How can we teach for meaningful learning?
In L. Darling-Hammond, B. Barron, G.N. Cervetti, P.D. Pearson, A.H.
Schoenfeld, E. K. Stage, J.T. Tilson, & T.D. Zimmerman, (Eds.) Powerful
learning: What we know about teaching for understanding (pp. 11-70). San
Francisco, CA: Jossey-Bass.
Becker, H.J. (2001, April). How are teachers using computers in instruction? Paper
presented at the 2001 Meeting of the American Educational Research Association,
Seattle, WA. Available online at http://www.crito.uci.edu/tlc/FINDINGS/special3/
Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble,
M. (2012). Defining 21st century skills. In P. Griffin, B. McGaw, & E. Care (Eds.),
Assessment and teaching of 21st century skills (pp. 17-66). New York, NY: Springer.
Buckingham, D. (2006). Defining digital literacy: What do young people need to know
about digital media? Nordic Journal of Digital Literacy, 4, 263-276. ISSN Online:
1891-943X
Caracelli, V. J., & Greene, J. C. (1997). Crafting mixed-method evaluation designs. In
J.C. Greene & V. J. Caracelli (Eds.), Advances in mixed- method evaluation: The
challenges and benefits of integrating diverse paradigms (pp. 19-32). San
Francisco, CA: Jossey-Bass.
Childre, A., Sands, J. R., & Tanner Pope, S. (2009). Designing challenging curriculum:
Backward design. Teaching Exceptional Children, 14 (5), 6-14.
Cizek, G. J., & Bunch, M. B. (2007). Standard setting: A guide to establishing and
evaluating performance standards on tests. Thousand Oaks, CA: Sage.
Cohen, J.E. (2006). Social, emotional, ethical, and academic education: Creating a
climate for learning, participation in democracy, and well-being. Harvard
Educational Review 76 (2), 201-238.
Collaborative for Academic, Social, and Emotional Learning (CASEL). (2003). Safe and
sound: An educational leader’s guide to social and emotional learning programs.
Retrieved from www.casel.org.
Conley, D. T. (2010). College and career ready: Helping all students succeed beyond
high school. San Francisco, CA: Jossey-Bass.
Culp, K. M., Honey, M., & Mandinach, E. (2003). A retrospective on twenty years of
education technology policy. Retrieved from U.S. Department of Education,
Office of Educational Technology.
http://www.nationaledtechplan.org/participate/20years.pdf
Dabbagh, N., & Kisantas, A. (2005). Using web-based pedagogical tools as scaffolds for
self-regulated learning. Instructional Science, 33, 513-540. doi: 10.1007/s11251-
005-1278-3
Darling-Hammond, L., & Adamson, F. (2010). Beyond basic skills: The role of
performance assessment in achieving 21st century standards of learning.
Retrieved from Stanford University, Stanford Center for Opportunity Policy in
Education website http://www.edpolicy.stanford.edu/.
Dede, C. (2010). Comparing frameworks for 21st century skills. In J. Bellanca & R.
Brandt (Eds.), 21st century skills: Rethinking how students learn (pp. 51-75).
Bloomington, IN: Solution Tree Press.
Dede, C. (2009). Technologies that facilitate generating knowledge and possibly wisdom:
A response to Web 2.0 and classroom research. Educational Researcher 38 (4),
60-63.
Dede, C. (2005). Planning for “neomillennial” learning styles: Implications for
investments in technology and faculty. In J. Oblinger and D. Oblinger (Eds.),
Educating the net generation (pp. 226-247). Boulder, CO: Educause Publishers.
De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2009). Structuring
asynchronous discussion groups: The impact of role support and self-assessment
on students' levels of knowledge construction through social negotiation. Journal
of Computer Assisted Learning, 25, 177-188.
De Wever, B., Schellens T., Valcke, M., & Van Keer H. (2006). Content analysis
schemes to analyze transcripts of online asynchronous discussion groups: A
review. Computers and Education, 46, 6-28.
Donovan, M. S., Bransford, J. D., & Pellegrino, J. W. (Eds.). (1999). How people learn:
Bridging research and practice. Committee on Learning Research and
Educational Practice, National Research Council. Retrieved from the National
Academy Press website: www.nap.edu/catalog/9457.
Driscoll, M. P. (2002). How people learn (and what technology might have to do with
it). ERIC Digest (ED470032). Syracuse, NY: ERIC Clearinghouse on Information
and Technology.
Education Northwest. (2010). The strongest link: Supporting highly effective teachers.
Education Northwest Magazine 15(3), 18-22. Retrieved from
http://educationnorthwest.org/resource/1118
Evans, M. A., Feenstra, E., Ryon, E., & McNeill, D. (2011). A multimodal approach to
coding discourse: Collaboration, distributed cognition, and geometric reasoning.
Computer-Supported Collaborative Learning, 6, 253-278. doi: 10.1007/s11412-
011-9113-0
Fahy, P. J. (2001). Addressing some common problems in transcript analysis.
International Review of Research in Open and Distance Learning, 1(2). Retrieved
from http://www.irrodl.org/index.php/irrodl/article/view/321/530.
Gersten, R., Dimino, J., Jayanthi, M., Kim, J. S., & Santoro, L. E. (2010). Teacher study
group: Impact of the professional development model on reading instruction and
student outcomes in first grade classrooms. American Educational Research
Journal, 47, 694-739. doi: 10.3102/0002831209361208
Gillies, R.M. (2004). The effects of cooperative learning on junior high school students
during small group learning. Learning and Instruction, 14(2), 197-213.
Govtrack. (2011). 21st Century Skills Readiness Act. Retrieved from
http://www.govtrack.us/congress/bill.xpd?bill=s112-1175.
Gray, L., Thomas, N., & Lewis, L. (2010). Educational Technology in U.S. Public
Schools: Fall 2008 (NCES Publication No. 2010-034). Retrieved from U.S.
Department of Education, National Center for Education Statistics website:
http://nces.ed.gov/pubsearch/.
Greene, J. C. (2007). Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.
Greene, J. C., Kreider, H., & Mayer, E. (2005). Combining qualitative and quantitative
methods in social inquiry. In B. Somekh & C. Lewin (Eds.), Methods in the social
sciences (pp. 274-281). London, UK: Sage.
Griffin, P., McGaw, B., & Care, E. (2012). The changing role of education and schools.
In P. Griffin, B. McGaw & E. Care (Eds.), Assessment and teaching of 21st
century skills (pp. 1-16). New York, NY: Springer.
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online
debate and the development of an interaction analysis model for examining social
construction of knowledge in computer conferencing. Journal of Educational
Computing Research, 17(4), 395-429.
Haladyna, T.M., & Downing, S.M. (2004). Construct-irrelevant variance in high stakes
testing. Educational Measurement: Issues and Practice, 23(1), 17-27.
Harris, J., Mishra, P., & Koehler, M. (2009). Teachers’ technological pedagogical content
knowledge and learning activity types: Curriculum-based technology integration
reframed. Journal of Research on Technology in Education, 41(4), 393-416.
Henri, F. (1992). Computer conferencing and content analysis. In A. Kaye (Ed.)
Collaborative Learning Through Computer Conferencing: The Najaden papers
(pp. 117-136). Berlin: Springer-Verlag.
Herman, J. L. (2008). Accountability and assessment: Is public interest in K-12 education
being served? In K. E. Ryan & L. A. Shepard (Eds.), The future of test-based
educational accountability (pp. 211-232). New York, NY: Routledge.
Huitt, W. (1999, October). The SCANS report revisited. Paper delivered at the Fifth
Annual Gulf South Business and Vocational Education Conference, Valdosta
State University, Valdosta, GA, April 18, 1997. Retrieved from
http://www.edpsycinteractive.org/papers/scanspap.pdf
Inan, F. A., Lowther, D. L., Ross, S.M., & Strahl, D. (2010). Pattern of classroom
activities during students’ use of computers: Relations between instructional
strategies and computer applications. Teaching and Teacher Education 26(3),
540-546.
International Society for Technology in Education (2008). The national educational
technology standards and performance indicators for students. Retrieved from the
ISTE website http://www.iste.org.
Ito, M., Horst, H., Bittanti, M., Boyd, D., Herr-Stephenson, B., Lange, P. G., Pascoe, C.
J., & Robinson, L. (2008). Living and learning with new media: Summary of
findings from the Digital Youth Project. John D. and Catherine T. MacArthur
Foundation Reports on Digital Media and Learning. Retrieved from
http://www.macfound.org
Jarvela, S., & Hakkinen, P. (2002). Web-based cases in teaching and learning – the
quality of discussions and a stage of perspective taking in asynchronous
communication. Interactive Learning Environments. 10(1), 1-22.
Jenkins, H., Clinton, K., Purushotma, R., Robinson, A. J., & Weigel, M. (2006).
Confronting the challenges of participatory culture: Media education for the 21st
century. Retrieved from The MacArthur Foundation website www.macfound.org
Johnson, C. C., & Fargo, J. D. (2010). Urban school reform enabled by transformative
professional development: Impact on teacher change and student learning of
science. Urban Education, 45, 4-29. doi:10.1177/0042085909352073
Johnson, R. T., Johnson, D. W., & Stanne, M. B. (1986). Comparison of computer-
assisted cooperative, competitive, and individualistic learning. American
Educational Research Journal, 23, 382-392.
Johnson, R. T., Johnson, D. W., & Stanne, M. B. (2000). Cooperative learning methods:
A meta-analysis. Methods 1, 1-33. Retrieved from www.mendeley.com
Johnson, D. W., & Johnson, R. T. (2009). An educational psychology success story:
Social Interdependence theory and cooperative learning. Educational Researcher,
38, 365-379. doi:10.3102/0013189X09339057
Kane, M. T., Crooks, T. J., & Cohen, A. S. (1999). Validating measures of performance.
Educational Measurement: Issues and Practice, 18(2), 5-17.
Kieras, D.E., & Meyer, D.E. (2000). The role of cognitive task analysis in the application
of predictive models of human performance. In J. M. Schraagen, S. F. Chipman,
& V. L. Shalin (Eds.), Cognitive Task Analysis, (pp. 237-260). Mahwah, NJ:
Lawrence Erlbaum Associates.
Kirmani, S. (2008). Heuristic evaluation quality score (HEQS): Defining heuristic
expertise. Journal of Usability Studies 4(1), 49-59.
Kleiman, G.M. (2004). Myths and realities about technology in k-12 schools: Five years
later. Contemporary Issues in Technology and Teacher Education, 4(2), 248-253.
Lazonder, A. W. (2005). Do two heads search better than one? British Journal of
Educational Technology,36(3), 465-475. Retrieved from
http://dx.doi.org/10.1111/j.1467-8535.2005.00478.x
Lenhart, A., Purcell, K., Smith, A., & Zickuhr, K. (2010). Social media and young adults.
Retrieved from the Pew Internet & the American Life Project website
http://pewinternet.org/Reports/2010/Social-Media-and-Young-Adults.aspx
Levy, F., & Murnane, R. (2007). How computerized work and globalization shape
human skill demands. In M. Suárez-Orozco (Ed.), Learning in the global era (pp.
158-174). Berkeley, CA: University of California Press.
Linn, R. L. (2008). Educational accountability systems. In K. E. Ryan & L. A. Shepard
(Eds.), The future of test-based educational accountability (pp. 3-24). New York,
NY: Routledge.
Lou, Y., Abrami, P. C., & d’Apollonia, S. (2001). Small group and individual learning
with technology: A meta analysis. Review of Educational Research, 71(3), 449-
521.
Lowry, P. B., & Nunamaker, J. F., Jr. (2003). Using Internet-based, distributed
collaborative writing tools to improve coordination and group awareness in
writing teams. IEEE Transactions on Professional Communication, 46 (4), 277-
297.
Lowry, P. B., Roberts, T. L., Romano, N. C., Jr., Cheney, P. D., & Hightower, R. T.
(2006). The impact of group size and social presence on small- group
communication: Does computer-mediated communication make a difference?
Small Group Research, 37 (6), 631-661.
Marshall, C., & Rossman, G. B. (1995). Designing qualitative research (2nd ed.).
Thousand Oaks, CA: Sage Publications.
McTighe, J., & Seif, E. (2010). An implementation framework to support 21st century
skills. In J. Bellanca and R. Brandt (Eds.), 21st century skills: Rethinking how
students learn (pp.33-49). Bloomington, IN: Solution Tree Press.
Messick, S. (1995). The Validity of Psychological Assessment. American Psychologist,
50 (9), 741-749.
Messick, S. (1996). Validity of performance assessments. In G. Phillips (Ed.), Technical
issues in large-scale performance assessment (pp. 1-18). Washington, DC:
National Center for Education Statistics.
Metiri Group & North Central Regional Education Laboratory (NCREL). (2003).
EnGauge 21st century skills: Literacy in the digital age. Chicago, IL: NCREL.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded
sourcebook, 2nd ed. Thousand Oaks, CA: Sage.
Militello L. G., & Hutton, R. J. B. (1998). Applied cognitive task analysis (ACTA): A
practitioner’s toolkit for understanding cognitive task demands. Ergonomics, 41
(11), 1618-1641.
Monroe-Baillargeon, A., & Shema, A. L. (2010). Time to talk: An urban school’s use of
literature circles to create a professional learning community. Education and
Urban Society, 42, 651-673. doi: 10.1177/0013124510370942
Moss, P. A. (1996). Enlarging the dialogue in educational measurement: Voices from
interpretive research traditions. Educational Researcher, 25(1). 20-29. doi:
10.3102/0013189X025001020
National Commission on Excellence in Education. (1983, April). A nation at risk.
Available online at http://www.ed.gov/pubs/NatAtRisk/risk.html
No Child Left Behind Act of 2001, 20 U.S.C. § 6319 (2002). Retrieved from
http://www.ed.gov/policy/elsec/leg/esea02/index.
Nussbaum, M., Alvarez, C., McFarlane, A., Gomez, F., Claro, S., & Radovic, D. (2009).
Technology as small group face-to-face collaborative scaffolding. Computers &
Education 52, 147-153.
Organization for Economic Cooperation and Development. (2005). The definition and
selection of key competencies: Executive summary. Paris, France: OECD.
Retrieved from http://www.oecd.org/document/17/
Partnership for 21st Century Skills. (2003). Learning for the 21st century. Retrieved from
the Partnership for 21st Century Skills website: http://www.21stcenturyskills.org
Partnership for 21st Century Skills. (2008). 21st Century Skills Education &
Competitiveness: A Resource and Policy Guide. Retrieved from Partnership for
21st Century Skills website
http://www.p21.org/documents/21st_century_skills_education_and_competitiven
ess_guide.pdf
Partnership for 21st Century Skills. (2009). Framework for 21st century skills. Retrieved
from the Partnership for 21st Century Skills website http://www.p21.org/index.php
Pecheone, R., Kahl, S., Hamma, J., & Jaquith, A. (2010). Through a looking glass:
Lessons learned and future directions for performance assessment. Retrieved
from Stanford University, Stanford Center for Opportunity Policy in Education
website: http://www.edpolicy.stanford.edu/
Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1-6.
Raggozino, K., Resnick, H., Utne-O’Brien, M., & Weissberg, R.P. (2003). Promoting
academic achievement through social and emotional learning. Educational
Horizons, 81, 169-171.
Rideout, V.J., Foehr, U.G., & Roberts, D.F. (2010). Generation M2: Media in the lives of
8 to 18 year olds. Retrieved from the Kaiser Family Foundation website
http://www.kff.org
Rienties, B., Tempelaar, D., van den Bossche, P., Gijselaers, W., & Segers, M. (2008).
Students’ motivations and their contributions to virtual learning. Presented at the
meeting of International Conference of the Learning Sciences, June 2008,
Utrecht, the Netherlands.
Rotherham, A. J., & Willingham, D. (2009). 21st century skills: The challenges ahead.
Teaching for the 21st Century, 67(1), 16-21.
Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (1999). Assessing social
presence in asynchronous text-based computer conferencing. Journal of Distance
Education, 14, 51-70.
Saunders, W. M., Goldenburg, C. N., & Gallimore, R. (2009). Increasing achievement by
focusing grade-level teams on improving classroom learning: A prospective,
quasi-experimental study of title I schools. American Educational Research
Journal, 46, 1006-1033. doi:10.3102/0002831209333185
Savenye, W. C., & Robinson, R.S. (2004). Qualitative research issues and methods: An
introduction for educational technologies. In D.H. Jonassen, & P. Harris, (Eds.),
Handbook of research for educational communications and technology (pp. 1045-
1071). Mahwah, NJ: Lawrence Erlbaum Associates.
Scardamalia, M., Bransford, J., Kozma, B., & Quellmalz, E. (2012). New assessments
and environments for knowledge building. In P. Griffin, B. McGaw, & E. Care
(Eds.), Assessment and teaching of 21st century skills (pp. 67-142). New York,
NY: Springer.
Schellens, T., & Valcke, M. (2005). Collaborative learning in asynchronous discussion
groups: What about the impact on cognitive processing? Computers in Human
Behavior, 21, 957-975.
Schellens, T., Van Keer, H., & Valcke, M. (2005). The impact of role assignment on
knowledge construction in asynchronous discussion groups: A multilevel
analysis. Small Group Research, 36, 704-745.
The Secretary's Commission on Achieving Necessary Skills. (1991). What work requires
of schools: A SCANS report for America 2000. Washington DC: U.S. Department
of Labor. Retrieved from http://wdr.doleta.gov/SCANS/whatwork/
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-
experimental designs for generalized causal inference. Boston, MA: Houghton
Mifflin and Company.
Shepard, L. A. (2008). A brief history of accountability testing, 1965-2007. In K. E. Ryan
& L. A. Shepard (Eds.), The future of test-based educational accountability (pp.
25-46). New York, NY: Routledge.
Silva, E. (2008). Measuring skills for the 21st century. Washington, DC: Education
Board.
Smeets, E. (2005). Does ICT contribute to powerful learning environments in primary
education? Computers & Education, 44, 343-355.
Smith, K. A., Shepherd, S. D., Johnson, D. W., & Johnson, R. T., (2005). Pedagogies of
engagement: Classroom-based practices. Journal of Engineering Education, 94, 1-15.
Stahl, G. (2009). How to study group cognition: Analyzing interactions. Retrieved from
citeseerx.ist.psu.edu/viewdoc/download
Stahl, G., Koschmann, T., & Suthers, D. (2006). Computer-supported collaborative
learning: A historical perspective. In R. K. Sawyer (Ed.), Cambridge handbook of
the learning sciences (pp. 406-427). Cambridge, UK: Cambridge University
Press.
Stanton, N.A. (2006). Hierarchical task analysis: Developments, applications, and
extensions. Applied Ergonomics, 37 (1), 55-79. Retrieved from
http://dx.doi.org/10.1016/j.apergo.2005.06.003
Stecher, B. (2010). Performance assessment in an era of standards-based educational
accountability. Retrieved from Stanford University, Stanford
Center for Opportunity Policy in Education website
http://www.edpolicy.stanford.edu/
Strijbos, J. W., Martens, R. L., Prins, F. J., & Jochems, W. M. G. (2006). Content
analysis: What are they talking about? Computers & Education 46 (1), 26-48.
Strijbos, J. W., Martens, R. L., & Jochems, W. M. G. (2004a). Designing for interaction:
Six steps to designing computer-supported group-based learning. Computers &
Education, 42, 403-424.
Strijbos, J. W., Martens, R. L., Jochems, W. M. G., & Broers, N. J. (2004b). The effect of
functional roles on group efficiency: Using multilevel modeling and content
analysis to investigate computer- supported collaboration in small groups. Small
Group Research, 35, 195-229.
Stuart, L., & Dahm, E. (1999). 21st Century Skills for 21st Century Jobs. Federal
Publications, Paper 151. http://digitalcommons.ilr.cornell.edu/key_workplace/151
U.S. Department of Education (1996). Getting America’s students ready for the 21st
century: Meeting the technology literacy challenge: A report to the nation on
technology and education. Retrieved from
http://www.ed.gov/about/offices/list/os/technology/plan/national/index.html
U.S. Department of Education (1997). Overview of technology and education reform.
Retrieved online from
http://www.ed.gov/pubs/EdReformStudies/EdTech/overview.html
U.S. Department of Education, Office of Planning, Evaluation and Policy Development,
(2010). ESEA Blueprint for Reform. Washington, D.C. Retrieved from
www2.ed.gov/policy/elsec/leg/blueprint/blueprint.pdf
Veerman, A. L., & Veldhuis-Diermanse, E. (2001). Collaborative learning through
computer-mediated communication in academic education. In P. Dillenbourg, A.
Eurelings & K. Hakkarainen (Eds.), European perspectives on computer-
supported collaborative learning: Proceedings of the 1st European conference on
computer-supported collaborative learning (pp. 625-632). Maastricht: University
of Maastricht. Retrieved from http://www.gsic.uva.es/public.php?lang=en&list_
Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of
professional learning communities on teaching practice and student learning.
Teaching and Teacher Education, 24, 80-91. doi:10.1016/j.tate.2007.01.004
Vygotsky, L.S. (1978). Mind in society: The development of higher psychological
processes. Cambridge, MA: Harvard University Press.
Wagner, T. (2008). The global achievement gap: Why even our best schools don't teach
the new survival skills our children need-and what we can do about it. New York,
NY: Basic Books.
Web-based Education Commission (2000). The power of the Internet for learning:
Moving from promise to practice. Retrieved from
http://www.ed.gov/offices/AC/WBEC/FinalReport/WBECReport.pdf
Wei, R. C., Darling-Hammond, L., Andree, A., Richardson, N., & Orphanos, S. (2009).
Professional learning in the learning profession: A status report on teacher
development in the U.S. and abroad. Retrieved from the National Staff
Development Council website http://www.srnleads.org/resources/publications/
pdf/nsdc_profdev_tech_report.pdf
Wiggins, G., & McTighe, J. (2001). What is backward design? In G. Wiggins & J.
McTighe (Eds.), Understanding by design (1st ed., pp. 7-19). Upper Saddle
River, NJ: Merrill Prentice Hall.
Wilson, M., Beja, I., Scalise, K., Templin, J., Wiliam, D., & Torres Irribara, D. (2012).
Perspectives on methodological issues. In P. Griffin, B. McGaw, & E. Care
(Eds.), Assessment and teaching of 21st century skills (pp. 67-142). New York,
NY: Springer.
Wilson, M., & Scalise, K. (2012). Measuring Collaborative Digital Literacy. Paper
presented at the Invitational Research Symposium on Technology Enhanced
Assessments, Session on Measuring Problem Solving, Creativity,
Communication, and Other Cross-Curricular 21st Century Skills within the
Common Core State Standards, Washington, D.C.
http://www.k12center.org/events/research_meetings/tea.html.
World Bank. (2003). Lifelong learning in the global knowledge economy: Challenges for
developing countries. Washington, DC: World Bank. Retrieved from
http://www.wds.worldbank.org/external/default
Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M.
Boekaerts, P. Pintrich, & M. Seidner (Eds.), Self-regulation: Theory, research and
applications (pp. 1-39). Orlando, FL: Academic Press.
Zins, J.E., Bloodworth, M. R., Weissberg, R. P., & Walberg, H.J. (2006). The scientific
base linking social and emotional learning to school success. Journal of
Educational and Psychological Consultation, 17, 191-210.