by
Michael Svec,
Ph.D.
Furman University
One of the first topics taught in a traditional introductory high-school or college physics course is motion, including the concepts of position, velocity and acceleration. Graphs of objects in motion are frequently used since they offer a valuable alternative to verbal and algebraic descriptions of motion, giving students another way of manipulating the developing concepts (Arons, 1990). Graphs are the best summary of a functional relationship. Many teachers consider the use of graphs in a laboratory setting to be of critical importance for reinforcing graphing skills and developing an understanding of many topics in physics, especially motion.
If graphs are to be a valuable tool for students, then we must know the level of the students' graphing ability. Studies have identified difficulties with such graphing abilities. Students have difficulties making connections among graphs of different variables, physical concepts and the real world, and they often perceive graphs as just a picture (Linn, Layman, & Nachmias, 1987; McDermott, Rosenquist, & van Zee, 1987).
Laboratory activities which focus on graphing more than traditional labs do are valuable for investigating student use of graphs. Microcomputer-based labs (MBL) provide immediately available, computer-drawn graphs of objects in motion. MBL is centered around a sonic ranger which measures the distance to an object and creates a distance-versus-time line graph of the object's motion in real time. Learners can move and watch the graph on the computer screen respond to their motion. When compared to traditional physics labs, the MBL places much more emphasis on reading and making graphs. The computer labs provide an excellent tool to explore the connection between graphing skills and learning science concepts.
Whether the students are in middle school, high school, or college, MBL has demonstrated the ability to improve their understanding of science concepts and cognitive skills such as observation and prediction (Brasell, 1987a; Thornton & Sokoloff, 1990; Friedler, Nachmias, & Linn, 1990). Students can connect abstract concepts with concrete, kinesthetic experiences. The ability of the computers to display the data graphically is cited as one of the reasons why MBL is so effective.
MBL materials appear to improve students' graphing skills (Linn, Layman & Nachmias, 1987; Mokros & Tinker, 1987; Brasell, 1987b). MBL activities seem to help students overcome difficulties with discrimination of slope and height, changes in slope and height, and matching narrative with graph features. This is particularly true of motion labs which involve the student physically moving and creating a graph.
As cited above, MBL and its use of graphs have been shown to improve content knowledge specific to graphing problems and graphing skills. This investigation further explores the relationship between learning the content and graphing interpretation skills, and whether or not students can apply the new content knowledge to new problems which do not use line graphs. Two types of motion content knowledge were defined: 1) content knowledge restricted to graphing problems, and 2) more general content knowledge including word, math and picture problems. The definition of graphing skills was narrowed to interpretation skills such as calculating and interpreting slopes, and changes in slope, which are required to read motion line graphs.
Methods
Research Questions
Because the use of graphs to help students learn content has important classroom implications, it is important to document what the students are learning when using MBL labs and how they are learning those topics. The purpose of this study was to examine the relative effectiveness of the traditional lab method and the MBL for engendering conceptual change in students and to investigate students' ability to interpret and use graphs to help them better learn the kinematic concepts and to apply this understanding of those concepts to new non-graphic problems.
This study seeks to assess the conceptual change in the students' general graphing interpretation skills, specific kinematic graphing skills and conceptual understanding of motion as indicated by achievement on the Graphing Interpretation Skills Test (GIST) and the Motion Content Test (MCT). In addition, the results from the instruments will be used to investigate students' abilities to interpret and use graphs to better learn the kinematics concepts and to apply this understanding of those concepts to new problems. The research questions to be explored are:
RQ1. Does MBL's graphic presentation improve students' ability to interpret a variety of line graphs?
RQ2. Does MBL's graphic presentation improve students' ability to interpret distance-time graphs, velocity-time graphs and acceleration-time graphs?
RQ3. Does MBL's graphic presentation improve students' conceptual understanding of velocity and acceleration?
Research Question 1 (RQ1) was assessed using scores on the GIST. Questions from the MCT which involve interpreting motion graphs were used to answer RQ2, while the remaining MCT questions helped answer RQ3. Assessment included an analysis of the overall means and item analysis of specifically designed questions containing distractors which identified difficulties. To further frame the original research questions and guide the item analysis, answers were sought to specific hypotheses. By identifying the difficulties, it was possible to provide a richer interpretation of the overall means.
Instruments
The GIST contained 11 multiple-choice items. Three of the items were adapted from the Test of Graphing in Science (TOGS) by McKenzie and Padilla (1986). The remaining items were written by the investigator to match the specific interpretation skills. Distractors were developed based on previously identified misconceptions and difficulties.
The MCT was constructed from several sources and consisted of two types of problems: those which focused on kinematic graphs and those which focused on more traditional non-graphing motion questions. Multiple-choice items on kinematics graphing were adapted from tests by Thornton and Sokoloff (1990) and written by the author in consultation with course instructors. Non-graphing motion items were adapted from the Mechanics Diagnostic Test (Halloun & Hestenes, 1985a) and the Force Concept Inventory (Hestenes, Wells & Swackhamer, 1992). Additional items were developed by the investigator based on the pilot study.
Both the GIST and the MCT were field tested in a pilot study conducted in the Fall of 1993 and reported in Svec (1994). The field study involved 133 students enrolled in a physics course for elementary teachers and 420 students enrolled in a general physics course. The MCT had a KR-20 reliability coefficient of 0.98 as both a pre- and post-test for both populations, and the GIST had a KR-20 reliability of 0.97. In both the pilot study and the present study the instruments had KR-20 reliabilities above 0.90; tests with KR-20 reliabilities greater than 0.70 are generally considered reliable for group measurements.
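For readers unfamiliar with the statistic, the sketch below shows how a KR-20 coefficient can be computed from dichotomously scored item responses. It is offered only as an illustration of the formula; the response matrix is invented and is not data from the study.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """Kuder-Richardson Formula 20 for a set of dichotomously scored (0/1) items.

    responses: array of shape (n_students, n_items).
    """
    k = responses.shape[1]                   # number of items
    p = responses.mean(axis=0)               # proportion answering each item correctly
    q = 1.0 - p                              # proportion answering each item incorrectly
    total_var = responses.sum(axis=1).var()  # variance of students' total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Invented 0/1 scores for 5 students on 4 items, purely for illustration.
scores = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
])
print(round(kr20(scores), 2))
```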
The validity of the instruments was established based on the results. Individual test questions identified the same misconceptions and difficulties documented in other studies that used similar multiple-choice tests and in descriptive studies. Content validity was established by having two physicists and one physics lab instructor review the test and answers. Only one item on the MCT and two items on the GIST were identified as potentially confusing, leaving 95% of the items judged appropriate. In addition, since many of the items were taken from published tests designed for college students, from previous tests given in the courses, or written by the researcher for college students, the appropriateness of the items for the grade level is considered high. The results of the pilot study and the final study were consistent, with the same misconceptions and difficulties documented in approximately the same percentages. The items on the MCT accurately represent the topics covered in both courses.
Design
The nonequivalent control-group design (Campbell & Stanley, 1963) was selected. The students in one undergraduate introductory physics class, Physical Science for Elementary Teachers (Q202), used the MBL laboratories and served as the treatment group. Another undergraduate introductory physics class, General Physics (P201), employed more traditional motion laboratories and functioned as the control group. The two courses are separate from each other and are taken by different student populations. The dependent variables were student achievement on the Motion Content Test and the Graphing Interpretation Skills Test. The content of the lectures and laboratory activities was documented to identify differences in instruction which might affect the variables.
Sample
The samples consist of students enrolled in two general-level undergraduate physics courses. The two courses were offered at two different large Midwestern universities located in the same state. Both courses included lecture and laboratory and were required for the students' majors. A pilot study was conducted in the Fall of 1993 (Svec, Boone, & Olmer, 1995) and the final data were gathered in the Spring semester of 1994.
The treatment course was Physical Science for Elementary Teachers (Q202), which is required for elementary education majors. The course is three credit hours and involves two 50-minute lectures and one three-hour lab each week. No textbook was used.
The control course was General Physics (P201). The course is taken mainly by premed, life science and health science majors. The course focuses on mechanics and waves and is the first half of a one-year sequence. The class consists of one 2 1/2-hour lecture, one discussion section and one two-hour lab, for five credits. The course textbook was Physics: Principles with Applications (Giancoli, 1991).
The treatment group is 83% female while the control group is more evenly mixed, with 53% of the students being female. While there are gender differences between the two groups, they are of the same ethnicity, predominantly Caucasian. Both populations had a similar number of high school credits in science. Q202 students did take slightly more math in high school, and 46% had physics in high school compared to 28% of the P201 students. The P201 students did complete more science courses in college.
Data Analysis
Test means and standard deviations for the GIST and the two subscales of the MCT were determined. The effect size is a statistic for quantitatively describing how well the average student who received the intervention performed relative to the average student who did not. Effect sizes are commonly used in meta-analysis, but their use in educational research is becoming more popular. The effect size is a useful statistic for assessing the practical significance of research results (Borg & Gall, 1989).
Since the two groups come from different populations, the change from pre-test to post-test was used instead of the raw scores themselves to determine whether significant change had occurred. The effect size of the changes for the control group and the treatment group was calculated using the raw score gains, which result from subtracting the pre-test mean score from the post-test mean score. The standard deviation of the control group on the post-test was used as the denominator. The effect size is calculated using the following equation:
E.S. = [(X_treatment,post - X_treatment,pre) - (X_control,post - X_control,pre)] / s_control,post
where X represents the raw score mean and s is the standard deviation.
The effect size will help answer the questions of whether MBL's graphic presentation improves students' ability to interpret a variety of line graphs and whether MBL improves students' conceptual understanding of velocity and acceleration.
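As a concrete check on the equation above, the short sketch below (Python; the function and variable names are mine, not part of the study) applies it to the GIST means reported later in Table 2 and reproduces the 0.78 effect size given in the Results section.

```python
def gain_effect_size(treat_pre, treat_post, ctrl_pre, ctrl_post, s_ctrl_post):
    """Effect size from raw-score gains: the difference between the treatment and
    control gains, divided by the control group's post-test standard deviation."""
    return ((treat_post - treat_pre) - (ctrl_post - ctrl_pre)) / s_ctrl_post

# GIST means from Table 2 (Q202 = treatment, P201 = control).
es_gist = gain_effect_size(treat_pre=7.9, treat_post=9.1,
                           ctrl_pre=8.6, ctrl_post=8.4, s_ctrl_post=1.8)
print(round(es_gist, 2))  # 0.78, the value reported for the GIST in the Results section
```

The same calculation applied to the means in Tables 4 and 7 yields the 1.71 and 0.88 effect sizes reported for the motion-graph and non-graphing portions of the MCT.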
To further answer the three research questions and to provide supporting detail, individual item analysis was employed. The Statistical Analysis System (SAS) was used to construct matrices which displayed the frequency of students' responses on two different questions or on both the pre- and post-test. For each hypothesis, matrices of the appropriate items were constructed.
With the matrix method, it is possible to document whether students' difficulties or misconceptions are consistent across questions and whether there is a relationship between difficulties and misconceptions identified in one question and the way in which students answered another question. For the purposes of evaluating the effect of MBL material, this method of data presentation is superior to the simple presentation of students' percentages selecting a specific multiple-choice answer on the pre- and post-test. The matrix method allows the strength of the students' understanding of a concept to be evaluated.
The results from a matrix were classified into one of three categories: thorough understanding, partial understanding and no evidence for understanding. When comparing two similar questions, the students who answer both questions correctly were classified as having a thorough understanding. Students who answer one of the questions correctly but answer the second question incorrectly were classified as having a partial understanding. Students who miss both questions demonstrate no evidence for understanding.
A sample matrix is illustrated in Figure 1. The columns represent the possible choices for Q19 while the rows represent the responses for Q14. Question 19 involved calculating the slope of a line on a generic graph, and Question 14 asked the students to determine the slope of a specific line on a graph of mass versus volume. The total for a column shows how many students selected that option for the question. Each cell of the matrix contains a frequency: the total number of students who selected that combination of responses to the two questions.
The results of the matrix in Figure 1 are interpreted below as an example. The correct answer is B for Q19 and C for Q14. Sixty-four of the 134 students, or 48%, answered both questions correctly and demonstrated a thorough understanding of the concept tested in the two questions; this corresponds to the frequency of 64 in the matrix. Of the 90 students who answered Q14 correctly, 71% (64 of the 90) also answered Q19 correctly. Eighty-two students answered Q19 correctly; of those, 78% (64 of the 82) also answered Q14 correctly. Students who answered one of the questions correctly but the other incorrectly have a partial understanding. Of the 90 students who answered Q14 correctly, 26 missed Q19. These 26, along with the 18 students who answered Q19 correctly and Q14 incorrectly, comprise the students with a partial understanding; they account for 33% of the students. For the 19% of the students who missed both questions, there is no evidence of an understanding of the concept or ability.
Figure 1: Example of a Matrix Produced by SAS
                             Question 19
Question 14  |   A  |   B  |   C  |   D  |   J  | Row Total
A            |   6  |   8  |   0  |   0  |   9  |     23
B            |   2  |   6  |   2  |   1  |   2  |     13
C            |   9  |  64  |   8  |   4  |   5  |     90
D            |   0  |   4  |   2  |   1  |   0  |      7
J            |   0  |   0  |   0  |   0  |   1  |      1
Column Total |  17  |  82  |  12  |   6  |  17  | N = 134
Cell entries are frequencies. Rows are responses to Question 14; columns are responses to Question 19.
The data are from the Q202 Spring 1994 pre-test; the interpretation of the responses appears in the text. Based on the above matrix and additional matrices for the Q202 post-test, the P201 pre-test and the P201 post-test, Table 1 was constructed; it is characteristic of the tables used in the study.
Table 1: Summary of matrix Q14 and Q19
(in percent)
       |          Pre-test          |          Post-test
Course | thorough | partial | none | thorough | partial | none
Q202   |    48    |   33    |  19  |    65    |   23    |  12
P201   |    58    |   31    |  11  |    76    |   16    |   8
The matrix method helps answer each individual hypothesis. The results from each hypothesis, together with the effect size results, are used in the final analysis to provide detailed answers to the three research questions.
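The cross-tabulations in the study were produced with SAS; the sketch below is a rough Python reconstruction (not the study's code) that classifies each response pair as thorough, partial, or no evidence of understanding, and reproduces the Q202 pre-test row of Table 1 from the Figure 1 frequencies.

```python
from collections import Counter

def classify(resp_q14: str, resp_q19: str, correct=("C", "B")) -> str:
    """Classify a student's understanding from responses to two related items:
    both correct -> thorough, one correct -> partial, neither -> none."""
    n_right = (resp_q14 == correct[0]) + (resp_q19 == correct[1])
    return {2: "thorough", 1: "partial", 0: "none"}[n_right]

# Frequencies from Figure 1: rows are Q14 responses, columns are Q19 responses.
matrix = {
    "A": {"A": 6, "B": 8, "C": 0, "D": 0, "J": 9},
    "B": {"A": 2, "B": 6, "C": 2, "D": 1, "J": 2},
    "C": {"A": 9, "B": 64, "C": 8, "D": 4, "J": 5},
    "D": {"A": 0, "B": 4, "C": 2, "D": 1, "J": 0},
    "J": {"A": 0, "B": 0, "C": 0, "D": 0, "J": 1},
}

counts = Counter()
for q14, row in matrix.items():
    for q19, n_students in row.items():
        counts[classify(q14, q19)] += n_students

total = sum(counts.values())  # 134 students
for level in ("thorough", "partial", "none"):
    print(level, round(100 * counts[level] / total))  # 48, 33, 19 -- Table 1, Q202 pre-test
```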
Results and Analysis
Graphing Interpretation Skills
The calculated effect size using the gains in the raw scores was 0.78, an effect size of medium strength. The control group's initial score was higher than the treatment group's, but after instruction the control group's score did not change. The treatment group's score improved and, on the post-test, exceeded the control group's score. MBL was successful at improving the Q202 students' graphing interpretation abilities; traditional instruction on motion did not improve P201 students' graphing skills. The raw score means, standard deviations (s) and the number of students (N) who took the instruments are summarized in Table 2. To further address research question 1, five hypotheses were developed for item analysis. Results from the GIST were used in the item analysis and the following discussion.
Table 2: Pre- and Post-test results for the GIST
       |      Pre-test      |      Post-test
Course | mean |  s  |   N   | mean |  s  |   N
Q202   |  7.9 | 1.6 |  138  |  9.1 | 1.6 |  136
P201   |  8.6 | 2.0 |   64  |  8.4 | 1.8 |   34
The maximum score on the GIST is 12.
MBL will improve students' ability to read and interpret
curves.
Students in both groups and on both the pre-test and post-test
demonstrated they could successfully read and interpret graphs.
Four questions from GIST were used in the analysis. In both courses,
86-97% of the students answered the four questions correctly.
MBL did not improve the students' ability to read and interpret
curves since they already possessed that ability.
MBL will improve students' ability to calculate the slope
of a curve.
Students demonstrated they could calculate the slope of a line,
but had difficulty with the concept of zero-slope. Three items
assessed students' ability and two matrices were constructed.
Both groups improved slightly on the post-test, but when including
the zero-slope concept, Q202 Physical Science students showed
improvement whereas the P201 General Physics students did not.
The treatment group appeared to have more conceptual change occurring.
In the end, about 50% of students in both groups could calculate the slope and understood the zero-slope concept. Students can perform the calculations successfully without fully understanding what corresponds graphically to a zero slope or what the implications of dividing numbers by zero are.
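A small worked example may make the zero-slope point concrete; the points below are invented for illustration and are not items from the GIST.

```python
def slope(p1, p2):
    """Slope of the line through two (x, y) points: rise over run."""
    (x1, y1), (x2, y2) = p1, p2
    return (y2 - y1) / (x2 - x1)

print(slope((1.0, 2.0), (3.0, 8.0)))  # 3.0 -- a rising line
print(slope((1.0, 5.0), (4.0, 5.0)))  # 0.0 -- a horizontal segment has a slope of exactly zero
```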
MBL will improve students' ability to qualitatively interpret
the slope of a curve.
Two items, used to construct a matrix, addressed the qualitative
interpretation of slopes. The first question required the students
to recognize that the slope of the line on a mass-versus-volume
graph was the density and that the steeper the curve, the denser the metal. The students were told that the density was the
ratio of mass divided by volume. The second question involved
a more complicated curve of the mean temperature versus month.
Approximately half of the students on the pre-test demonstrated
a thorough command of qualitatively interpreting the slope. Post-test
results showed the treatment group improving and the control group staying the same. The results are shown in Table 3. MBL improved students'
ability to qualitatively interpret the slope.
Table 3: Summary of matrix measuring students' ability to interpret slope
(in percent)
       |          Pre-test          |          Post-test
Course | thorough | partial | none | thorough | partial | none
Q202   |    51    |   43    |   6  |    67    |   30    |   3
P201   |    55    |   37    |   8  |    47    |   42    |  11
MBL will improve students' ability to qualitatively interpret
the change in slope of a curve.
Students in both courses performed poorly on both the pre- and
post-test. Two items were used to evaluate student knowledge.
Q202 improved after intervention while P201 did not. Even after
instruction, less than 30% in both courses demonstrated a thorough
understanding of the change in slope. MBL did improve the students'
ability to qualitatively interpret the change in slope with 23%
of the Q202 students improving their understanding to thorough.
MBL will improve students' ability to interrelate the results
of two or more graphs.
One question was used to assess how students related data on two
separate graphs. The ability to interrelate the results of two
graphs again showed P201 students not improving after instruction
and Q202 students making slight improvements. These improvements
brought the Q202 students up to the same level as the P201 students.
The most frequent error made by students was not reading the scales
on the graph; instead they used the relative heights of the curves
to determine magnitudes. MBL only slightly improved the students'
ability to interrelate the results on two graphs.
RQ1. Does MBL's graphic presentation improve students' ability
to interpret a variety of line graphs?
The computation of the effect size demonstrated the treatment
had a practical effect on the students. This was borne out in the
item analysis. The Q202 students typically had initial scores
lower than the P201 students, but MBL labs led to improvements
which brought the Q202 students to the same level of achievement
as the P201 students. P201 students typically did not change in
their understanding. Students using MBL made improvements, but
the change was small. Since graphing interpretation was never
overtly taught in either class, any improvement would be the result
of using the skills in the laboratory. Improvements are the result of students applying what they learned in one context to new situations. MBL did improve the students' ability to interpret
a variety of line graphs.
Motion Graphs
The calculated effect size using the gains in the raw scores
was 1.71, which is a large effect. There were 34 items which assessed
the students' ability to interpret motion graphs. The Q202 students'
pre-test mean was lower than the P201 students by 8%. Post-test
results showed Q202 students making large gains and exceeding the P201 students by almost 20%; the P201 students made only a slight improvement.
MBL was most successful at improving the students' ability to
interpret motion graphs. The raw mean scores are summarized in
Table 4. To address research question 2, six testable hypotheses were developed for item analysis. Results from the graphing portion of the MCT were used in the item analysis and the following discussion.
Table 4: Results for the Motion Content Test; Graphing Items
       |      Pre-test      |      Post-test
Course | mean |  s  |   N   | mean |  s  |   N
Q202   | 11.5 | 3.8 |  138  | 24.4 | 6.1 |  136
P201   | 14.2 | 4.5 |   64  | 17.7 | 5.5 |   34
The maximum score on the graphing portion of the MCT was 34.
MBL will improve students' ability to determine the direction
of motion from a motion graph.
Students started off not understanding how to interpret direction
from motion graphs. Nine questions were used in the assessment
and four matrices were constructed. The control group was more
successful with distance-graphs on the pre-test, but with velocity
and acceleration-time graphs, both groups' performances were nearly
the same. The students had more difficulty with motion toward
the origin than motion away from the origin. A large number of
the students, 20% of the Q202 and 34% of the P201 students, selected
the distance analog on a pre-test matrix. Figure 2 illustrates
one question used in the matrix which revealed the students' alternative conception. Acceleration graphs were the most difficult to interpret.
Post-test results show dramatic improvement in the treatment group.
The treatment group's scores all surpassed the control group's, with
sometimes 30% more of the Q202 students than P201 having a thorough
understanding of the direction. The control group scores did improve,
but the gain was small compared to the treatment group's gains.
MBL significantly improved the students' ability to determine
the direction of motion from a motion graph.
Figure 2: Sample answers to Q52 illustrating the distance-analog
MBL will improve students' ability to determine the magnitude
of velocity from a motion graph.
Responses from five questions were used. Pre-test results
showed 25% of the students used the height of the curve as the criterion
for determining the magnitude of the velocity from a distance-graph.
Both groups improved on the post-test, but Q202 made a more significant
gain and surpassed the P201 students. In both groups about 78%
of the students were able to determine the magnitude of the velocity
by calculating the slope after instruction. The same was true
of determining the magnitude from a velocity-time graph. On the
post-test, most of the P201 students continued to assume the sign
on the graph indicated magnitude and not direction. MBL improved
the students' ability to determine the magnitude of velocity from
a motion graph.
MBL will improve students' ability to determine the magnitude
of acceleration from a motion graph.
Determining the magnitude of the acceleration was a more difficult
task for the students than determining the velocity. Three test
questions were used. Q202 students demonstrated more difficulties
on the pre-test than P201 students. They made sign errors more
often and based their answers on their beliefs and not the available
data. Terminology also played a role: students were more successful
when the phrase "change in velocity" was used instead
of acceleration. The treatment group again made larger gains on
the post-test than the control group. These gains helped approximately
the same percentage of Q202 students as P201 reach a thorough
understanding. MBL improved the students' ability to determine
the magnitude of acceleration from motion graphs.
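To illustrate the slope-based reading of motion graphs discussed above, the sketch below uses invented readings (not MCT items) for an object speeding up at a steady rate: the average velocity is the slope of the distance-time record, and the constant acceleration is the slope of the velocity-time record.

```python
def average_slope(times, values):
    """Average rate of change over the whole record: (last - first) / elapsed time."""
    return (values[-1] - values[0]) / (times[-1] - times[0])

# Invented readings for an object speeding up at a steady rate.
t = [0.0, 1.0, 2.0, 3.0]   # time in seconds
d = [0.0, 1.0, 4.0, 9.0]   # distance in meters (distance-time record)
v = [0.0, 2.0, 4.0, 6.0]   # velocity in m/s (velocity-time record)

print(average_slope(t, d))  # 3.0 -- average velocity in m/s from the distance-time record
print(average_slope(t, v))  # 2.0 -- constant acceleration in m/s^2 from the velocity-time record
```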
MBL will improve the students' ability to qualitatively
interpret distance-time graph curves.
The ability of students to interpret distance-time graphs was
assessed using a series of six questions. P201 students were more
successful in interpreting distance-time graphs on the pre-test.
On the post-test the treatment group surpassed the P201 students,
with typically 20% more students having a thorough understanding
on any one question. Common difficulties on the pre-test included
not being able to determine direction, interpreting velocity,
and interpreting acceleration from distance-time graphs. Post-test
results showed 90% of the Q202 students had a thorough understanding
of distance-time graphs. The results for six of the seven questions
are shown in Table 5. MBL significantly improved the students'
ability to qualitatively interpret distance-time graphs.
Table 5: Percentage of students responding
correctly to distance-time graphing questions
(in percent)
         |   Pre-test    |   Post-test
Question | Q202  | P201  | Q202  | P201
Q31      |  65   |  67   |  88   |  73
Q32      |  36   |  76   |  95   |  76
Q33      |  23   |  51   |  94   |  71
Q34      |  60   |  87   |  95   |  73
Q35      |  52   |  65   |  90   |  84
Q36      |  55   |  81   |  90   |  63
MBL will improve the students' ability to qualitatively
interpret velocity-time graph curves.
Students' ability was evaluated using a series of seven questions.
Initially, both populations of students answered velocity-time
graph questions in similar fashion, with approximately the same percentages having partial and thorough understandings. The most
common error involved using position criteria when interpreting
velocity-time graphs, as shown in Table 6. Post-test results showed
a significant improvement in the Q202 scores with approximately
80% of the students having a thorough understanding of velocity-time
graphs. P201 students also made gains, typically improving by
20% or more. MBL significantly improved the students' ability
to qualitatively interpret velocity-time graphs.
Table 6: Percentage of students responding
with distance analog
(in percent)
         |   Pre-test    |   Post-test
Question | Q202  | P201  | Q202  | P201
Q52      |  46   |  51   |  11   |  23
Q53      |  45   |  46   |   9   |  13
Q54      |  23   |  37   |   3   |  15
Q56      |  33   |  40   |   8   |  34
Q57      |  18   |  29   |  13   |  15
MBL will improve the students' ability to qualitatively
interpret acceleration-time graph curves.
Both populations of students did poorly on acceleration-time graphs
with less than 10% having a thorough understanding on the pre-test.
Five questions were used in the assessment. Students used velocity-graph
criteria to solve acceleration graph questions. Approximately
half of the students selected a velocity analog, and another 20%
selected a distance analog. The errors were further complicated
by a lack of understanding of sign and an inability to interpret
"speeding up at a steady rate." Post-test results demonstrated
little improvement for the control group but more significant
improvement for the treatment. Less than 15% of the P201 students
had a thorough understanding compared to approximately 57% of
the Q202 students. MBL significantly improved the students' ability
to qualitatively interpret acceleration-time graphs.
RQ2. Does MBL's graphic presentation improve students' ability
to interpret distance-time graphs, velocity-time graphs and acceleration-time
graphs?
The effect size shows MBL made the largest impact on students'
ability to interpret motion graphs. This is an expected result
since the treatment group spent much of its lab time working with
motion graphs. The item analysis further reinforces the gains made
by the treatment group. Both groups had similar difficulties,
and the treatment was more effective at addressing the difficulties
and overcoming the problems. MBL significantly improved the students'
ability to interpret distance-time, velocity-time and acceleration-time
graphs.
Conceptual Understanding of Motion
The calculated effect size using the gains in the raw scores
was 0.88. The control group's initial raw score mean was higher
than the treatment group's and did improve slightly on the post-test.
The treatment group's score improved more significantly on the post-test.
MBL was successful at improving the students' conceptual understanding
of motion as measured by non-graphing questions. The raw score
means, standard deviations (s) of the means and number of students
(N) are summarized in Table 7. To further address research question
3, four testable hypotheses were developed for item analysis. Results from the non-graphing portion of the MCT were used in the item analysis and the following discussion.
Table 7: Test Results for the Motion Content Test; Non-graphing
       |      Pre-test      |      Post-test
Course | mean |  s  |   N   | mean |  s  |   N
Q202   |  9.8 | 2.4 |  138  | 13.6 | 3.7 |  136
P201   | 13.4 | 3.9 |   64  | 14.3 | 3.3 |   34
The maximum score on the non-graphing MCT test is 22.
MBL will improve students' ability to differentiate between
position and velocity.
If students' understanding of position and velocity is undifferentiated,
then it should be possible to detect their confusion by analysis
of a matrix of questions 43 and 44. The results are shown in Table 8 and the questions are shown in Figure 3. P201 students did significantly better on the pre-test at differentiating between position and velocity. On the pre-test, 27% of the Q202 students used a distance-analog
to answer both questions. Q202 students did make significant gains
on the post-test, but P201 students still had a more thorough
understanding and fewer of the students used a distance-analog
on both the pre-test and the post-test. The significant improvement
by the Q202 students demonstrated that MBL does improve the students'
ability to differentiate between position and velocity.
Table 8: Summary of matrix of questions 43 and 44
(in percent)
       |              Pre-test               |              Post-test
Course | thorough | partial | none | analog  | thorough | partial | none | analog
Q202   |    42    |   16    |  42  |   27    |    71    |    7    |  22  |   12
P201   |    69    |   18    |  13  |    8    |    76    |    5    |  19  |    4
Figure 3: Questions 43 and 44
The number which characterizes the ball's velocity when:

43. the ball travels downward ____          a. gets larger
44. the ball travels upward ____            b. is constant and equal to zero
                                            c. is constant and not equal to zero
                                            d. gets smaller
MBL will improve students' ability to differentiate between
velocity and acceleration.
Matrices of question 43 by question 45 and of question 49 by question 50 were used to detect students' understanding of velocity and acceleration. The Q202 students were more likely to confuse velocity and acceleration. The difference
might be due to the P201 lecture bias. The instructor's lecture
before the pre-test included some of the topics covered on the
pre-test. Approximately 70% of the Q202 students used velocity
criteria to answer acceleration questions. Post-test results show
no change for P201 students and sizable gains for the Q202 students.
Even with the large gains for the Q202 students on the post-test,
over 20% more of the P201 students had a thorough understanding
of the concepts. Acceleration is the most difficult concept for
the students; even after instruction, less than 60% of the students
in either class had a thorough understanding of the concept.
MBL did improve the students' ability to differentiate between
velocity and acceleration.
MBL will improve students' ability to solve simple quantitative
problems.
Two simple quantitative problems were presented. Q202 students
were more capable than the P201 students of answering simple quantitative questions prior to instruction. After instruction, both groups improved, with P201 improving slightly more, which is to be expected
from a course which emphasized quantitative understanding. Instruction
in both cases improved ability to solve quantitative problems.
MBL improved the students' ability to solve simple quantitative
problems, but similar gains were also achieved with traditional
instruction.
MBL will improve students' ability to solve picture problems.
Questions 60 and 65 and a matrix of questions 59 and 64 were used
in the assessment. Figure 4 is the illustration for questions
59 and 60. Questions 59 and 64 asked the same question, when do the balls have the same speed, for two different pictures. The
students on the pre-test were not successful with picture problems,
with less than 30% in either class demonstrating a thorough understanding.
The post-test results showed approximately equal gains for both
groups. MBL did improve the students' ability to solve picture
problems, but the same gain was also achieved from traditional
instruction.
Figure 4: Example of picture problem, Q59 and Q60.
Two balls A and B move at constant speeds on separate tracks. Positions occupied by the two balls at the same time are indicated in the figure below by identical numbers. The balls are moving to the right. Starting points are not shown.
RQ3. Does MBL's graphic presentation improve students' conceptual
understanding of velocity and acceleration?
The effect size demonstrated the treatment had a practical effect
on the students. The treatment students made gains in their abilities
to differentiate between concepts and in their ability to solve
quantitative and picture problems. The gains they made were typically
larger than the gains made by the control group although more
P201 students started and finished with a thorough understanding
of the topics. The Q202 students learned the content within the
context of motion graphs and were able to apply it to non-graphing
problems. The P201 students were not able to apply their knowledge
to graphing problems. Throughout the MCT and GIST, the Q202 group appeared to make more conceptual changes than the P201 students.
MBL improved the students' conceptual understanding of velocity
and acceleration.
Discussion and Conclusion
This study sought to explore what the students were learning and how well they mastered those concepts. In addition, the study sought to identify what the students knew prior to instruction. Knowing what the students bring to the classroom helps the instructor better address the students' knowledge and elicit conceptual change, bringing the students' beliefs closer to the accepted scientific explanations.
The results on the Graphing Interpretation Skills Test and Motion Content Test indicated significant differences between a traditional laboratory and microcomputer-based laboratory. MBL was more effective at engendering conceptual change in students. The results were determined by computing effect sizes and by item analysis. The multiple-choice instruments had high reliability and the use of gain scores corrects for the initial differences in the two student populations.
The students in both groups entered the study being able to read and interpret a variety of line graphs. Understanding the change of slope remained the most difficult skill for the students, which is not unexpected since they probably have never had to interpret this before. Only about one-half of the students were able to calculate a slope and interpret a zero-slope curve. This is a disappointing result for such a basic graphing skill. The ability to work with the slope is a significant skill for the students to master in order to facilitate interpreting motion graphs. Instructors cannot assume the students will completely understand how to calculate and interpret the slope. MBL provides an excellent medium for helping students develop a more thorough understanding of how to calculate the slope and how to interpret its meaning. Activities which involve slope should be built into the motion labs so that the students gain experience with the skill. Students should be expected to understand how to determine the slope and how to apply it specifically to motion after instruction.
Students will probably never have encountered motion graphs prior to using MBL labs. When they first encounter these graphs, they can rely on either their graphing interpretation skills or their own preconceptions. It appears that students resort to their own preconceptions before they attempt to use their graphing interpretation skills to determine an answer. This might be a result of the multiple-choice format used in the study, a format typically used to measure the students' ability to recall memorized facts. The students might also choose the answer which requires the least amount of effort. On a test where time is a limiting factor, it is more likely the students will resort to memorized facts or their own preconceptions than take the time to figure out an answer. These limitations influence the pre-test motion graph results and would yield scores which are lower than the students' actual abilities.
The biggest difficulty encountered by the students was interpreting the direction of motion from a motion graph. This affects their success in interpreting the results of all motion graphs. Students find motion toward the origin especially difficult. Perhaps motion away from the origin is easier for the students because the reference point is also the location where the motion began; the students do not have to change their reference frame. On a distance-time graph, motion away is an additive process, with increasing distance from the same spot where the motion began. This may be easier for the students than moving toward the origin, where the distance from the origin gets smaller even though the object is moving farther from its starting point. Motion toward the origin requires the students to change their frame of reference, which is a difficult task without prior experience. MBL labs must provide the students with experience predicting and interpreting direction on a variety of motion graphs. As demonstrated by the successful Q202 laboratories, experiences should include motion away from the origin to provide the students with an anchoring concept, something they are familiar and successful with. An understanding of motion toward the origin can then be built upon their understanding of motion away from the origin.
Beyond determining direction, the students often confused types of graphs, perhaps reflecting their confusion of the associated concepts. The students must have an opportunity to explore the differences and the relationships among the various motion graphs. They need to create distance-time, velocity-time and acceleration-time graphs of the same motion and discuss the differences between the graphs. As the students learn to differentiate between the graphs, they will begin to develop more precise definitions of the concepts and will be better able to differentiate the concepts. Numerous examples are needed to provide the students with sufficient experience.
The dominant preconception the students bring to the classroom is undifferentiated understandings of the concepts of position, velocity and acceleration. Scientists have precise definitions of these concepts which they can apply to numerous situations. Students have had a variety of experiences in which they have built up their understanding of the concepts. Since they probably have never had a formal course in physics, their understanding consists of overlapping, misinterpreted pseudo-definitions which may work in a few specific instances but are not generalizable. Formal instruction will hopefully provide the students with a scheme for understanding their experiences and narrowing their definitions of the concepts. MBL provides an excellent environment in which to address the students' preconceptions. The students can test their own theories and the proposed scientific theory with quick, easily understood graphs.
Challenging the students' understanding does not guarantee that conceptual change will occur. When faced with a discrepancy, students can change their own beliefs, rationalize the data away or they can become apathetic. Indeed, in lab, students would ask the instructor if they "should write down the correct answer or what the graph said," thereby attempting to rationalize away the data and maintain their current beliefs. While MBL is a powerful tool for conceptual change, it must be accompanied by other factors which will encourage the students to challenge their beliefs and to begin to change their own concepts. Factors might include motivation, interest and relevancy for the student. Identifying those factors will be a significant research topic to pursue.
The study showed that the Q202 students learned more about graphing interpretation skills, more about motion graphs and more about the conceptual understanding of motion than did the P201 students. That learning was made possible by the effective use of MBL activities. While the P201 students spent less time on motion, it can and should be argued that if devoting more time to the basic concepts would let them gain as much as the Q202 students did, it would be to their advantage to do so. Q202 took advantage of the learning environment and instructional possibilities made possible by MBL. As demonstrated by the Q202 lab materials, MBL activities designed to elicit conceptual change should be incorporated into introductory physics courses to make effective use of that time.
Bibliography
Arons, Arnold B. (1990). A Guide to Introductory Physics Teaching. New York: John Wiley.
Borg, W. R. and M. D. Gall. (1989). Educational Research: An Introduction. (5th ed.). New York: Longman.
Brasell, H. (1987a). Effectiveness of a microcomputer-based laboratory in learning distance and velocity graphs. Dissertation Abstracts International. 48, 2591.
Brasell, H. (1987b). The effects of real-time laboratory graphing on learning graphic representation of distance and velocity. Journal of Research in Science Teaching. 24 (4) 385-395.
Campbell, D. T. and J. C. Stanley. (1963) Experimental and quasi-experimental design for research. Boston: Houghton Mifflin Co.
Clement, J. (1982) Students' preconceptions in introductory mechanics. American Journal of Physics. 50 (1) 66-71.
Friedler Y., Nachmias R., & Linn, M.C. (1990). Learning scientific reasoning skills in microcomputer based laboratories. Journal of Research in Science Teaching. 27 (2) 173-191.
Giancoli, D. C. (1991). Physics: Principles with applications. (3rd ed.). Englewood Cliffs, New Jersey: Prentice Hall.
Halloun I. and D. Hestenes. (1985a). The initial knowledge state of college physics students. American Journal of Physics. 53, 1043-1055.
Halloun I. and D. Hestenes. (1985b). Common-sense concepts about motion. American Journal of Physics. 53, 1056-1065.
Hestenes D., M. Wells and G. Swackhamer. (1992, March). Force concept inventory. The Physics Teacher. 30 (3), 141-158.
Linn, M.C., Layman, J., & Nachmias, R. (1987). Cognitive consequences of microcomputer-based laboratories: Graphing skill development. Contemporary Educational Psychology. 12 (3) 244-253.
McDermott, L.C., M.L. Rosenquist and E.H. van Zee. (1987). Student difficulties in connecting graphs and physics: Examples from kinematics. American Journal of Physics. 55 (6) 503-513.
McKenzie, D. L. and M. J. Padilla. (1986). The construction and validation of the Test of Graphing in Science (TOGS). Journal of Research in Science Teaching. 23, 571-579.
Mokros, J.R., & Tinker, R.F. (1987). The impact of micro-computer based labs on children's ability to interpret graphs. Journal of Research in Science Teaching. 24 (4) 369-383.
Svec, M. T., W. J. Boone, and C. Olmer. (1995). Changes in a Preservice Elementary Teachers Physics Course. Journal of Science Teacher Education. (2) 79-88.
Thornton, R.K. & Sokoloff, D.R. (1990). Learning motion concepts using real-time microcomputer-based laboratory tools. American Journal of Physics. 58 (9) 858-867.
About the author . . .
Michael Svec, Ph.D., is an Assistant Professor of Science Education in the Department of Education at Furman University, 3300 Poinsett Highway, Greenville SC, 29613-1134, FAX (864) 294-3341, michael.svec@furman.edu.