A Continuum for Assessing Science Process Knowledge in Grades K-6
by
Michael E. Beeth, The Ohio State University
Linda Cross, Highland Park Elementary, Grove City, OH
Christy Pearl, Highland Park Elementary, Grove City, OH
Janice Pirro, Highland Park Elementary, Grove City, OH
Kara Yagnesak, Highland Park Elementary, Grove City, OH
Janette Kennedy, Richard Avenue Elementary, Grove City, OH
Assessing Science Process Knowledge
Assessing elementary school students' science process knowledge (their knowledge about doing science) is essential for several reasons. First, if students are to increase the depth of their content knowledge as they participate in opportunities to learn science, teachers need more accurate information regarding what students have learned about scientific processes. Second, more accurate assessment of content and process knowledge at all grade levels would provide teachers, policy makers, and administrators with better information on which to base instructional decisions. Third, many constructivist approaches to learning assume that valid knowledge of students' existing ideas must be elicited and communicated to others as a condition for learning. While the National Science Education Standards (NRC, 1998) call for teaching science content in greater depth at each grade level, they also recognize that students need to learn the inquiry processes associated with doing science at all grade levels. More accurate assessments of science process knowledge would provide teachers, parents, and district administrators with information on the effectiveness of the enacted curricula, and policy makers with information about the effectiveness of the inquiry-based instructional approaches recommended for science teaching and learning. This article reports on the development and implementation of a rubric for assessing science process knowledge in grades K-6. Excerpts from interviews conducted with teachers applying this rubric to assess students' science process knowledge are presented throughout to substantiate claims about its usefulness.
Although curricular materials and assessment tools exist for science content and concepts at all grade levels, there is very little information on how to assess the process knowledge students are expected to learn as a result of science instruction. Our efforts are based on previous experiences with the development and use of continua for assessing reading and writing literacy (see Appendix A). It is important to note that the science assessment continuum we developed is not designed to evaluate everything a student should know about science. Knowledge of specific science concepts, for example, is not something this continuum can assess. Instruments that document learning of specific science content knowledge are generally prepared to reflect a course of study adopted by a local school district (e.g., the Southwestern City Schools Science Course of Study, n.d.). In addition, many states have developed or are developing competency-based examinations in science that evaluate unifying concepts such as systems, constancy and change, and form and function (see Science: Ohio's Model Competency-Based Program, 1994). In Ohio, state-sanctioned instruments that assess this type of information are required of all students in grades four, six, and nine. Frequently, the feedback provided by these types of instruments is "high-stakes" in that it is used as the primary source of data for evaluating student learning and, in some cases, promotion to the next grade. We, on the other hand, intentionally wrote our continuum to assess science process knowledge in order to capture a different aspect of learning - the growth of an individual student's knowledge about processes that we believe are essential to learning science. The continuum we developed is therefore capable of assessing changes in a student's knowledge over time rather than assessing learning at just one moment in time.
The assessment rubric that arose from our efforts to document changes in learning to learn science is presented in Appendix B. Before describing the development of this continuum, however, it is necessary that we provide the reader with some of our assumptions about the ability of elementary school teachers to assess science process knowledge. Given that elementary school teachers who would use this assessment rubric had used a similar continuum for assessing reading and writing in the past, we assumed they would be able to observe and record information regarding science process skills once they understood the assessment items on our continuum. A second assumption was that repeated use of the continuum by a teacher would capture change in a student's science process knowledge over time. Our intent here was to have an assessment rubric that could document change in knowledge of science processes that may be incremental and not easily captured by one administration of a 'paper and pencil' instrument. Finally, we assumed that any instrument we developed would require few changes to the existing science course of study for an elementary school or teacher. The point here is that we did not want teachers to think that their curriculum must change to address the items on our continuum. Rather, we wanted teachers using the continuum to decide if the items on our rubric caused any changes in their instruction. Before presenting the science continuum we will describe some of the context that contributed to producing the continuum for assessing science process knowledge in Appendix B.
The context for assessing learning
Construction of the assessment continuum took place at Highland Park Elementary (Southwestern City Schools, Grove City, OH) through the efforts of four Highland Park teachers, one teacher from Richard Avenue Elementary, and a university science educator. Highland Park is one of seventeen elementary schools in the Southwestern City School District, with approximately 500 students in kindergarten through grade five. Most students live in the surrounding neighborhoods, although some attend by special request of their parent(s). The children come from a wide range of socio-economic backgrounds. Most of the teachers at Highland Park have been teaching at this school for more than five years, and most hold a master's degree in education. The staff at this child-centered school shares a developmental philosophy of learning that is not linked strictly to a student's chronological age. The Highland Park view of learning includes the notion that children are motivated and capable learners from their first enrollment at the school. Students experience elementary school as only one point on a learning continuum that begins with their preschool experiences. Students at all grade levels follow their own interests through a curriculum that focuses on a limited number of thematic units of instruction covered in great depth. During instruction, teachers work with individual students and collaborative groups to ensure that all areas of the curriculum have been covered.
Through their shared teaching experiences at Highland Park, these teachers have found that many children progress through stages of development that reflect increasingly complex ways of representing what they are learning. They have settled on describing this progressive development as emerging, beginning, developing, advancing, or consolidating with respect to how students represent their thinking on a topic (see the Literacy and Writing sections of the Student Progress Reports in Appendix A). The teacher's task when using our rubric is to assess a student's stage of intellectual development and then expand upon that student's knowledge and abilities so that he or she develops competency in specific intellectual abilities as well as practical skills. In doing so, these teachers explicitly recognize that children learn at different rates and in different ways. They plan their instruction, both individual and whole class, in response to feedback they receive from applying assessment rubrics like the ones for reading and writing. Although the methods of instruction at Highland Park differ significantly from those at other schools in the Southwestern City School District, students in the school are expected to follow the same course of study as other students in the district.
Applying this philosophy of learning, the Highland Park staff regularly seeks out professional development activities that suit their needs as teachers. In 1991, the College of Education at Ohio State University selected Highland Park Elementary as a professional development school (PDS). The model of a PDS at Ohio State is designed to "connect colleges of education with schools; to establish working partnerships among university faculty, practicing teachers, and administrators that are designed around systematic improvement in practice; and to serve as settings for teaching professionals to test different instructional arrangements, for novice teachers and researchers to work under the guidance of gifted practitioners, for the exchange of professional knowledge between university faculty and practitioners, and for the development of new structures designed around the demand of a new profession." (Kirschner, 1995).
Highland Park's involvement in a PDS allowed staff members to initiate and design experiences that contributed to their professional growth while earning graduate credit from The Ohio State University. In 1991, the Highland Park staff sought out Dr. Becky Kirschner, an Ohio State University professor, to assist with the coordination of their professional development interests. Among these interests was an action research project involving two teachers in redesigning the school's Student Progress Report (Howlett & Kerstetter, 1995). The intent of this research was to make the assessment of reading and writing consistent with the Highland Park philosophy of learning. In brief, these teachers wanted to change "the way they assessed children, both for ongoing instruction and for reporting to parent purposes" (Dickinson, Kirschner, & Rogers, 1995, p. 43). In light of these interests, they wanted to develop an assessment instrument that would communicate developmental aspects of learning to read and write in addition to answering the question parents ask most often - "Is my child reading/writing at grade level?" An assessment capable of this would also offer teachers feedback on their instruction, feedback that could be used when planning future instruction. In the end, these teachers developed a system of documenting student progress that included portfolios of student work to document growth as learners, a revised assessment instrument that could contribute to their ongoing instruction, and a revised reporting mechanism for parents (Dickinson, Kirschner, & Rogers, 1995). Appendix A contains samples of the Primary and Intermediate Student Progress Reports that the Highland Park faculty developed for assessing reading and writing.
Developing a continuum for assessing Science Process Knowledge
The Student Progress Report developed in 1995 left subjects such as social studies, science, and health lumped together under the heading of "Integrated Curriculum" (see Appendix A). In an effort to continue developing the Student Progress Report, teachers at Highland Park (the co-authors of this paper) contacted a science educator (Dr. Michael Beeth) to construct a continuum for assessing science that was similar to those already in use for reading and writing. Our joint work began in 1996 with each of us sharing ideas about what it might mean to be scientifically literate in grades K-6. Next we discussed processes of science we believed were applicable when learning a wide variety of science content. Among the processes we identified as necessary for K-6 students were observing physical properties, asking questions, naming and classifying natural objects, applying science vocabulary to describe details, familiarity with using science equipment, using print, electronic, and human resources, rational thinking, and integrating other disciplines with science. From this list we developed a rationale for why we thought each component was an important aspect of science process knowledge (see Table 1). We did not specify any particular science content since the teachers using this rubric would need to comply with their District Course of Study.
We then illustrated each component of science with assessment items - things we felt a student might say or activities they might engage in that would indicate competence with a particular item at each developmental level (see Table 2).
Table 1
Processes of Science

Observing
Rationale: Scientific questions usually begin with observations of the natural world. Scientists observe objects, properties of those objects, and phenomena that objects undergo.

Asking questions
Rationale: Scientists ask questions about objects found in the natural world and the phenomena they undergo.

Naming and classifying natural objects
Rationale: Fundamental to all scientific investigations is communicating about the objects, parts of objects, and phenomena that occur in the natural world. Scientists give names to objects and phenomena so they can be precise when talking about the object or phenomenon they are interested in studying.

Attending to details
Rationale: Scientists keep careful records of their observations. All scientists collect, organize, and analyze data in many forms to help answer their questions.

Familiarity with equipment
Rationale: Scientists use equipment to help them make more precise observations. They must be comfortable with the technology used in their investigations.

Using resources
Rationale: Scientists use existing resources to help them think about their current questions. Some of the resources they consult include people (colleagues or experts in the field), reference books, tables, printed reports of past research, and the Internet.

Rational thinking
Rationale: Scientific thinking involves reasoning about data and drawing conclusions. This reasoning may be either inductive (inferring general conclusions from specific instances) or deductive (deriving conclusions that follow from general principles).

Integrating science
Rationale: Science includes using (and sometimes learning) mathematics, writing, thinking, reading, and working with others. More often than not, science involves teams of researchers, with each member of the team contributing different strengths to the combined efforts of all. Scientists report the results of their investigations in several ways: orally at conferences involving their peers and through written media such as journals and the Internet.
Developmental Levels and Science Process Knowledge

[Excerpt from the revised Student Progress Report: "Science: Your child's learning progression as of ___," a grid on which a developmental level (Emerging through Advancing) and a mark for EFFORT are recorded at each reporting period (Nov, Jan, Mar, Jun).]
Revising the continuum
In spite of our efforts to bring the continuum to this stage of development, our first attempt to use it revealed several significant problems. First, teachers found that some assessment items could not be assessed from a single interaction with a student. To address this problem we marked the assessment items we felt could be evaluated from a single interaction with a plus (+) and those that required multiple interactions with an underscore (_). Second, we found that the most frequently recorded information from students consisted of verbal and written statements. This problem was addressed by adding columns to the right of each assessment item for observational and pictorial information easily available to a teacher. Information placed in any of these columns could come from a variety of sources, including observation of a student's behavior, verbal statements, written text, or illustrations. The intent of these columns was to help teachers become aware of multiple ways that students might represent their understanding of science process knowledge other than through traditional forms of literacy (e.g., reading and writing).
I could see where some kids were based on how they described things. One child could say, "This is wet", and another could elaborate on why it is wet, another child could write about it. Some kids were more descriptive than other kids. (Second grade teacher)
Another finding from our first use of the continuum was that several teachers noticed that the students they rated highest on the science continuum were not necessarily the same students who placed highest on continua for reading and writing.
My placement of students in science and literacy didn't match. A lot of kids were strong on both but there were those kids who placed higher in science - a strong interest, a strong knowledge, yet they weren't necessarily highest in literacy. I also had some kids higher in writing who weren't high in science. (Third grade teacher)
The kind of kid who was hard for me to place was someone who was verbally strong. They had scientific language, actions and behaviors but couldn't reproduce it. So I had to really try to separate [assessment of literacy from assessment of science]. One student couldn't write. When he would turn in an observation it might look like an Emergent writer [on the literacy continuum] but I knew him better and I knew if I talked to him he would easily fall into Beginning or Developing [on the science continuum] because of his verbal comments. (Second grade teacher)
This seemed odd at first because of an assumed expectation that an accomplished reader and writer should be accomplished in all subjects. However, as the teachers began to talk about the individual students they placed highest on the science continuum, it became clear that the evidence cited for competency in science was not limited to competency with reading and writing. We believe that assessing across the four categories of observation, verbal comments, written text, and pictures offers teachers more contexts within which to assess learning, resulting in more accurate assessment of a student's knowledge of the science processes we identified.
One [who placed higher on the science continuum] was actually a non-reader, still in the Emerging level on the reading and writing continuum. I had another first grader that was actually up to the Developing level in science. He was at the Beginning level in reading and writing but at Developing in science. (First and Second grade teacher)
Kids that were just curious about life, always asking questions, just jumped right to the top [of a developmental category]. They were the ones seeking out their own information. They were the ones using science tools properly. (First and Second grade teacher)
I had to keep telling myself this is not reading and writing. For a lot of kids I ended up going more with observation and verbal characteristics. I would sit back and think about what a student did other than produce something [written]. The kids who were higher readers and writers might have fallen in the middle of the science continuum. I also had some really low, like Kindergarten ability students, who were placed pretty high on the science continuum. (Second grade teacher)
To further address the problem we recognized when first using the continuum - what counts as evidence - we continued to add examples of student work and anecdotal comments from teachers using the continuum. These "data" were placed on a large roll of paper as exemplars of student work meeting a particular assessment item. In hindsight, these examples of student work served two critical purposes for the teachers who actually used the rubric to assess learning. First, multiple examples of student work placed on the rolled paper allowed teachers to compare the overall placement of the student they were assessing, say at the beginning of Emerging, to students placed at the same point by other teachers. In effect, this display of student work helped teachers validate their developmental placement of a student by comparison to the data and observations made by their colleagues, regardless of grade level. These exemplars also included multiple forms of evidence (i.e., observations by a teacher or written products from a student) that represented a variety of ways to document achievement of a particular assessment item. This process of standardizing assessment decisions required a considerable amount of time (and is still underway) but helped many teachers when placing students on the continuum for the first time.
I used the big poster we hung in the lounge with actual samples of student work. I need to see things. So I would go look at the samples of work in the Beginning section and say 'does this student do work like this'? I tended to mark them higher if I didn't look at other student's work. But the visual display really helped. (Second grade teacher)
I think that the big poster we developed helped a lot of teachers understand how you place students on the continuum. Without that poster, it would be incomplete. (First and Second grade teacher)
We believe that standardizing our continuum against actual student work in this public way allowed us to capture the developmental aspect of learning we built into the continuum. In essence, teachers using the continuum placed students at appropriate developmental levels as compared to other teachers using the continuum. In particular, because the evidence we accepted for competency in science is demonstrated through multiple modes of representation and is judged in comparison to other students at similar developmental levels, we believe that our continuum is a more accurate means of assessing changes in how a student is learning science processes. In the simplest case, some aspects of learning how to learn science (e.g., uses science equipment safely, appropriately, and effectively) must be demonstrated and cannot be determined by paper and pencil assessments. Alternatively, students can effectively communicate aspects of learning such as "describes the outcome of an investigation" in a variety of ways. We believe our continuum allows students to express their learning in multiple ways and gives teachers a reasonable process for documenting that learning is occurring.
We then tested the continuum a second time and found that it was not only easier to use but also provided more useful information for guiding instruction. An unexpected benefit of using the revised continuum was that teachers began to become aware of how the science activities they planned for students did or did not allow them to assess items on the continuum.
It made me work harder as a teacher. I could see [the rubric] making me put more thought into 'why are we doing this activity'? In conjunction with the course of study - how does this curriculum fit in with that assessment? And to be sure I now give students the opportunities to do things that fit in with the descriptors on the science continuum. I ask myself, what do I have in the classroom that will help make that happen? (Second grade teacher)
It affected my teaching in that it made me focus on pulling all of the science processes together, with the content. Instead of just saying 'here is what you need to learn' it gave me different ways to introduce science processes and to explore them. We want to hit science processes in the District Course of Study really hard. In planning inservice for the teachers [who will use this continuum] a big component is going to be teaching the teachers how to assess science processes. (First and Second grade teacher)
While feedback from using this continuum gave teachers more accurate information regarding each student's development as a learner of science, it also allowed the teachers to reflect on the opportunities for learning science processes in their instruction. The impact of our assessment rubric on instruction was an unexpected outcome, but one that fits well with the National Science Education Standards recommendation that there be a "match between the technical quality of the data collected and the consequences of the actions taken" (NRC, 1998, p. 5).
Conclusions
Parents of students attending Highland Park have yet to receive Student Progress Reports that include this science assessment. However, they have been very receptive in the past to similar information about their child's development with respect to reading and writing.
Last year I never had a parent question me [about placing a student on the literacy continuum]. Not once. The thing is though, you do have to know where you are putting them and why. I mean, when I put kids on a continuum I have to think it through and look at their work and reflect and I'm very, very picky about where I start them on a continuum. (Third grade teacher)
Highland Park teachers have reason to believe that many parents use feedback from the literacy portion of the Student Progress Report to help their children work on specific learning outside of the school setting. For example, if the teacher informed the parents that their child does not recognize sight words, the parents could work with their child on this at home. Information about learning like this has been exchanged between teachers and parents during parent conferences with quite good results - parents like to know which specific skills they can work on at home with their child. We anticipate that feedback from the science continuum will have similar impacts on the parents of these students (e.g., helping a child develop vocabulary that goes beyond 'the gross physical characteristics of an object'). It is also our belief that, in combination with district and state level evaluations of specific science concepts and themes, our continuum provides a more complete picture of the process skills a student learns throughout grades K-6.
Under the old system, if we were studying frogs and the kid knew about frogs he was going to get a good mark. Under the new system, if we were studying how to make good observations or what is an observation or what is the difference between what you see and what you believe to be true, and so on and so forth, that could be a whole different thing. It is kind of like if you are studying addition facts you might do really well but then when you shift to fractions you don't do very well - it looks like your grade is dropping. No your grade isn't dropping, you are at different places on a continuum. We almost need a continuum for math. (Second grade teacher)
References
Dickinson, R., Kirschner, B., & Rogers, T. (1995). Teacher-researcher and the creation of new roles for teachers. Ohio Journal of English Language Arts, 36(1), 40-47.
Howlett, S., & Kerstetter, K. (1995). Bringing practice into line with philosophy: The development of an alternative reporting document. Ohio Journal of English Language Arts, 36(1), 62-68.
Kirschner, B. (1995). Teacher-researcher: New voices and multiple perspectives. Ohio Journal of English Language Arts, 36(1), 5-10.
National Research Council. (1998). National science education standards. Washington, DC: National Academy Press.
Science: Ohio's Model Competency-Based Program. (1994). Columbus, OH: State Board of Education.
Southwestern City School Science Course of Study. (n.d.). Available from Southwestern City Schools, 2975 Kingston Avenue, Grove City, OH, 43123.
Michael E. Beeth is an Associate Professor (Science Education) in the College of Education at The Ohio State University where he has taught science methods courses for six years. His scholarly interests include understanding the application of conceptual change models of learning by classroom teachers and supporting teachers who engage in classroom action research.
Linda Cross has taught elementary school for eleven years. She currently teaches all subjects in a second grade classroom. Her work with the Science Continuum was the basis for an action research project that she submitted as part of her Master of Arts degree at The Ohio State University.

Christy Pearl has taught grades K-5 for eighteen years in the Southwestern City Schools. Her curriculum always includes a significant science component. When not teaching, Christy enjoys collecting rocks and turtles.

Janice Pirro has taught elementary school for twelve years. She currently teaches third grade at Highland Park Elementary. Her interest in learning science firsthand led her to volunteer time at a local archaeological site last summer.

Kara Yagnesak has taught for eight years at Highland Park Elementary. She is currently teaching second grade. She has continued to develop her personal interests in science teaching and learning by working on the Science Continuum since completing her Master of Arts degree in 1997.

Janette Kennedy has been an elementary school teacher for twenty-eight years. Her current teaching assignment is in a third grade classroom. She serves as a mentor for a preservice teacher and is the co-author of a Venture Capital Grant (Ohio Department of Education) that was funded to develop a multiage program of instruction at her school.

Appendix A
Highland Park Student Progress Report
(Note format for assessing Reading and Writing processes)
Appendix B
Continuum for Assessing Science Process Knowledge

Key:
(+) assessed by one observation, relatively easy to assess
(_) assessed by more than one observation, adequately assessed over time

Each assessment item is followed by examples of the evidence recorded in the Observe, Verbal, Text, and Pictures, etc. columns of the continuum.

Assessment item | Observe | Verbal | Text | Pictures, etc.

sorts and classifies objects based on physical characteristics
    Verbal: states an obvious physical characteristic
    Text: writes descriptive comments about an object - the brown bear has big teeth
    Pictures, etc.: draws a picture that resembles an object - colors a bear brown

compares differing volumes
    Observe: combines two or more volumes into one container
    Text: writes about the number of objects (i.e., counting bears) that fit in a container

gives names to objects
    Verbal: this is a kangaroo, a crystal, the root of a plant

(_) asks questions of a factual nature
    Verbal: questions can be answered with an undisputed fact - How many ...? What are the parts of ...?

(_) defers explanation to others/authorities
    Verbal: defers explanation to others - my parent said...
    Text: cites the ideas of others - in the book it said...

Assessment item | Observe | Verbal | Text | Pictures, etc.

(+) asks questions about the characteristics of objects and phenomena
    Verbal: asks questions about the properties of an object - Why is this rock shiny? What makes thunder? Which objects sink/float?
    Text: writes questions as hypotheses - We wanted to know why this rock is shiny. We wanted to know why things sink and float.

(+, _) mentions the interaction of two or more objects
    Verbal: the plant needed sunlight to grow

(+, _) uses science equipment to collect information (rather than as a toy)
    Observe: gradually spends more time working with (than playing with) equipment

(_) understands that phenomena can have names

(_) gives egocentric reasons as an explanation

Assessment item | Observe | Verbal | Text | Pictures, etc.

(+) understands how to collect and organize data
    Text: includes a summary of data as part of a lab report - includes a chart, graph or drawing
    Pictures, etc.: provides a title and labels the axes of a bar graph, pie chart or drawing

(+) uses science equipment safely, appropriately, and effectively
    Observe: uses equipment to extend senses - uses a magnifying glass, balance or eye dropper to make precise observations or measurements
    Text: writes about how equipment helped them extend their senses - we measured exactly five drops of water...

(+) identifies variables that affect an experiment
    Verbal: states which variable(s) they might investigate - we could investigate the effects of water or light or soil
    Text: writes about the variable(s) and control group they plan to investigate - we studied how much water plants need to grow well by...

(+) gives procedures for what was done
    Verbal: states procedures in sequential order - first we... then we...
    Text: writes procedures - Step 1: prepare soil. Step 2: plant seeds just under the soil.

(+, _) explores the research of others
    Observe: consults existing resources - reads books, explores Internet resources
    Verbal: identifies the questions that were important to an investigator - these investigators wanted to know...
    Text: cites information gleaned from more than one source in a lab report - Smith said... and Jones said...

(_) gives increasingly more precise descriptions of common physical objects
    Verbal: knows the names for common parts - the parts of a plant are the root, stem, leaf, and flower
    Text: describes the functions of parts of an object - the root absorbs water, the leaf makes food for the plant
    Pictures, etc.: labels a drawing of an object - labels the parts of a plant accurately

(_) is thinking about objects and physical events from a perspective other than their own
    Verbal: explains how others might see an event - if you were on the sun, the earth would revolve around you
    Pictures, etc.: illustrates objects that are beyond their immediate perception - draws objects seen through a microscope or the planets in our solar system

(_) links explanations for an ...
    Verbal: relates an observation to an explanation - the puddle dried up when the sun came out and made the water evaporate
    Text: explains how they think something happened - the red dye went through the celery and into the leaves

Assessment item | Observe | Verbal | Text | Pictures, etc.

(+) describes common physical objects in precise detail
    Verbal: uses precise terminology - this is the femur
    Text: describes objects in detail - the crystals are clear, triangular, and shiny
    Pictures, etc.: labels precise details in a drawing - labels the filament, anther, stigma, style, and ovary

(_) predicts how an object would behave if you changed the conditions
    Observe: tests a prediction - puts hot and cold water in a freezer to see which turns solid first
    Verbal: states a prediction - if I put hot and cold water...
    Text: writes about the results that came from testing a prediction - we put hot and cold water in the freezer and...

(+) uses science information books or resources in the library
    Observe: locates resources such as atlases, encyclopedias, and field guides
    Verbal: refers to an information resource they used - we found this in...
    Text: uses information from resources in written reports - includes a bibliography in a report
    Pictures, etc.: incorporates illustrations from science information resources in written reports

(_) extracts useful facts or constants from reference materials
    Text: includes new facts in a concept or idea map

(_) recognizes the importance of the data or information collected
    Verbal: reports on data collected - the sample of water we looked at had...
    Text: summarizes data - the data we collected tells us that...

(+) selects appropriate science equipment to use during an investigation
    Observe: chooses equipment to measure precise volume, mass, etc.

(_) links information into a chain/sequence of events that explains some phenomena
    Verbal: states events in a sequence that explains how an event happens - water, warmed by the sun, evaporates into the atmosphere, condenses around a dust particle, precipitates as rain, snow or dew, and runs back to the ocean
    Text: writes a sequence that explains how an event happens

(_) describes the outcome of an investigation
    Verbal: talks about an investigation in several ways - to describe procedures, persuade peers, summarize data, etc.
    Text: writes about the what, when, where, and how of an experiment

Assessment item | Observe | Verbal | Text | Pictures, etc.

(+) uses scientific vocabulary appropriately and accurately

(+, _) is comfortable/confident using science equipment

(_) gives causal explanations for why something happened as it did

(+) beginning to reason about events that could happen hypothetically

(_) completes a series of investigations on one topic

(_) writes about questions they would like to study next

(_) communicates their findings and questions of interest to others