BioLogica Research
Just as BioLogica embodies phenomena of genetics in a multilevel hypermodel, our research seeks to construct and elaborate a multilevel model of learning with hypermodels in science classrooms. We integrate research findings and methodologies from cognitive psychology, science education, instructional technology and educational psychology to create interventions and assessments that instantiate theoretical concepts. Our research requires the support of technology not only for creating the interventions and assessments, but also for collecting and analyzing the data generated when learners use BioLogica.
Conceptual Framework
The importance of models and modeling in scientific research has been widely documented. Models are used both to describe scientific phenomena and to generate testable hypotheses. Today, models and modeling are considered essential components of scientific literacy. With the importance of models and modeling to science education comes the need for a coherent theory of model-based teaching and learning (MBTL).
We have been working on such a theory during the last decade. Our basic hypothesis is that understanding biological phenomena requires learners to construct, elaborate and revise mental models of the phenomena under study.
Figure 1: Model-Based Learning Framework
In response to task demands, learners engaged in model-based learning construct models from prior knowledge and new information. If the model enables them to perform the task successfully, the model is reinforced. If it does not, the model may be rejected and a new one constructed, or it may be revised or elaborated prior to another attempt. Such model-based learning results in knowledge that is integrated, usable, and extensible in the domain. The model-based learning framework shown in Figure 1 represents the cognitive core of the learner level of our model of classroom learning. In addition, learner beliefs influence the effort a learner chooses to invest in the task and in participation in the classroom culture, which presents not only a variety of tasks but also participation structures that affect a learner's interactions with phenomena, resources, tools, and other learners. These factors, along with commonly held beliefs, constitute the classroom culture and are therefore key elements in the classroom level of our developing multilevel model of learning with hypermodels.
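Expressed procedurally, this framework is a construct-use-evaluate-revise loop. The toy sketch below is our own illustrative restatement of Figure 1, not BioLogica code; representing a mental model as a set of propositions is a deliberate, hypothetical oversimplification.

```python
"""Toy restatement of the model-based learning cycle (Figure 1).
All names and the set-of-propositions representation are hypothetical."""

def run_mbl_cycle(task_demands, prior_knowledge, information_source):
    model = set(prior_knowledge)              # construct from prior knowledge
    for attempt in range(1, 10):
        missing = task_demands - model        # use the model on the task
        if not missing:
            print(f"attempt {attempt}: success; model reinforced")
            return model                      # success reinforces the model
        # Unsuccessful: mindfully seek new information and revise.
        new_info = {p for p in information_source if p in missing}
        if new_info:
            print(f"attempt {attempt}: revising model with {sorted(new_info)}")
            model |= new_info                 # revise/elaborate, then retry
        else:
            print(f"attempt {attempt}: model rejected; no usable information")
            return None

# A learner's task demands three ideas; two are prior knowledge,
# one must be found in the resource.
run_mbl_cycle(
    task_demands={"allele", "genotype", "dominance"},
    prior_knowledge={"allele", "genotype"},
    information_source={"dominance", "meiosis"},
)
```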
Buckley (1992) documented and contrasted the learning goals, gains, and strategies of two students in a classroom of 28 who were using an interactive multimedia resource about the circulatory system. Through microanalysis of multiple forms of data, Buckley created a rich description of model-based learning situated in a technology-rich high school biology classroom. The learner's mental models were described as models embedded in models, forming an anatomical hierarchy. This study grounded the development of an initial model of model-based learning. Key elements in the model are prior knowledge; interpretation of the assigned task; mindfully seeking out and integrating necessary information about parts, processes, and mechanisms; and evaluating both the new information and existing models prior to integration. Buckley (2000) extended the initial model to include the role of representations in model-based learning. Using a framework developed by Buckley and Boulter, representations in the interactive multimedia resource were analyzed for their potential contributions to the learner's correct and incorrect conceptions. The interaction between the learner's existing models and the representations was documented from classroom discourse.
Representations, whether discourse, text, diagrams, animations, gestures, or linked multilevel hypermodels, are an essential element of model-based teaching and learning. Representations and the phenomena they re-present are the links between the internal cognitive processes and the external sociocultural processes that enable learning in classrooms. The use of representations is often problematic in that the 'reader' often misses the intent of the 'writer'. Nevertheless, many studies have examined the effectiveness of different types of representations in supporting learning. Gobert and Clement (1999, NSF# 9150002) studied the differential benefits of student-generated models versus summaries, and of student-generated models versus explanations, on rich learning (Gobert, 1997). They found that rich tasks like modeling and explaining, which require inference-making, lead to deeper understanding of the content area than lower-level tasks such as summarizing. Gobert (2000, NSF# 9980600) studied the causal reasoning associated with models of varying levels of causal integration and the visual/spatial inferences afforded by different models. Reasoning from models that were spatially correct in terms of the relative placement of the layers inside the earth afforded visual inferences; spatially "incorrect" models could not support inference-making until remediated.
Prior Research
Technology enables the creation of visuals and models that simulate the structures and behaviors of phenomena, as well as providing the opportunity for learners to interact with the elements of those models on a variety of tasks. For example, Horwitz and White (1988) developed the ThinkerTools software, which helped sixth graders formulate models involving the action of forces in two dimensions. They carefully studied the students' misconceptions prior to instruction and administered a post-test designed to probe those misconceptions. The sixth graders scored significantly higher on the post-test than high school physics students in the same school system.
However, as prior research has demonstrated (Buckley, 1992; Christie, 1999; Horwitz & Christie, 1999), providing learners with representations and models of normally inaccessible phenomena may be a necessary condition for facilitating science learning of this kind, but it is not a sufficient one. In Buckley's (1992) study, only one student in a class of 28 engaged in model-building, though all had access to a rich interactive multimedia resource about the circulatory system. In Horwitz and Christie's (1999) work with GenScope™, the precursor to BioLogica™, which provided high school science students with an interactive, exploratory, constructivist genetics hypermodel, few learners demonstrated the hoped-for gains in understanding and reasoning as measured by statistical differences between pre- and post-tests. However, many GenScope™ students demonstrated an ability to correctly solve difficult genetics problems in situ, as well as increasingly appropriate use of scientific language to explain their reasoning (Christie, 1997).
Follow-on work documented students' beliefs that these problem-solving experiences were very unlike their prior science class learning experiences. Students went so far as to characterize their GenScope™ learning experiences as "somethin' more than learnin'." In particular, GenScope™ experiences gave students "time to think" and "time to talk" (Christie, 1999). Christie's (1999, 2000, 2001) qualitative research on perceptions of technology-supported science learning experiences suggests that certain technologies challenge students' prior beliefs about the meaning of key educational concepts, such as learning, success, and failure. Moreover, it suggests that students organize their thoughts about academic experiences in concert with particular instructional activities (e.g., reading, writing, talking, or doing) and the "achievement resources" (e.g., books, computers) available or required for participation in such instruction (Christie, 1999, 2001; Hickey, Kindfield, Horwitz & Christie, 2000). Therefore, qualitative methods are critical in understanding technology-supported learning processes and subsequent learning or achievement outcomes (Christie, 1997, 2000, 2001).
We attribute these and other such findings (Christie, 2001) in part to the cultural changes that emerge in a classroom setting as a direct result of the presence of technologies that challenge existing models of teaching and learning. This point is critical and worth repeating: Although new technologies do not necessarily produce statistically significant improvements in learning outcomes, their presence alone can shake up commonly held beliefs about what constitutes teaching, learning, and success or failure in the classroom. We refer to these commonly held beliefs and the behaviors that co-exist with them as "classroom culture."
Research Design
From prior research we know that providing access to hypermodels is not sufficient for learning. We have some ideas about what is needed to foster learning and transfer when learners use BioLogica, but we also have many questions about how well these ideas work in real classrooms with real students. The table below presents the research questions we are posing and the data we will use to answer them.
| Research Questions | Data |
| --- | --- |
| Do students using BioLogica learn more than students who don't use BioLogica? | Pre/post content tests of BioLogica vs. control classes |
| Does the scaffolding built into the latest version of BioLogica make a difference in learning? | Pre/post content tests with different versions of BioLogica |
| What is the nature and extent of student learning in genetics when students have access to BioLogica? | Pre/post content tests; selected interviews |
| How do they go about that learning? | Data logs; student questionnaire |
| In what ways do individual student characteristics influence learning with BioLogica? | Pre/post content tests; student questionnaire |
| How well does BioLogica function as an introduction to genetics vs. a review of genetics? | Pre/post content tests; across-classroom comparisons; teacher questionnaire |
To help answer those questions, we design interventions that guide and scaffold a learner's interactions with the hypermodel. When these are implemented in classrooms, we examine the changes in learners' knowledge, but we also investigate the elements of BioLogica and of the classroom that influence learning gains and classroom culture.
We answer these questions by integrating elements of both qualitative and quantitative methods into our disciplined, systematic inquiry. Comparative case studies are situated within quantitative analysis. Some questions will be answered by quantitative measures such as test scores; others require more qualitative techniques, such as content analysis of the data logs. Because our design integrates investigations of model-based learning and technological classroom cultures, we bridge the gap between research on learning and research on educational practice.
Operational Definitions of Concepts
Model-based learning gains are assessed by performance on pre- and post-tests that examine the learner's ability to:
- understand the representations of the elements of the hypermodel and link them to genetics terminology and concepts
- reason from cause to effect and from effect to cause in order to solve genetics problems that involve predicting phenotype from a given genotype or vice versa (see the sketch following this list)
- reason about the parts, processes, and mechanisms that produce variations in a population, such as how mutations occur and are inherited
- reason about novel instances taken from reports of current genetics research
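To make the cause-to-effect and effect-to-cause distinction concrete, the following toy sketch treats a single gene with simple dominance; the allele symbols and trait names are hypothetical examples of ours, not taken from a BioLogica activity.

```python
# Toy illustration of cause-to-effect reasoning (genotype -> phenotype)
# and effect-to-cause reasoning (phenotype -> candidate genotypes) for a
# single gene with simple dominance. Allele symbols are hypothetical.

DOMINANT, RECESSIVE = "T", "t"   # e.g., tall vs. short pea plants

def phenotype(genotype):
    """Cause to effect: a dominant allele anywhere yields the dominant trait."""
    return "tall" if DOMINANT in genotype else "short"

def candidate_genotypes(observed_phenotype):
    """Effect to cause: which genotypes could produce the observed trait?"""
    all_genotypes = ["TT", "Tt", "tt"]
    return [g for g in all_genotypes if phenotype(g) == observed_phenotype]

print(phenotype("Tt"))              # -> tall
print(candidate_genotypes("tall"))  # -> ['TT', 'Tt']
```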
Cognitive scaffolding is implemented as different functional sets of prompts that support:
- using BioLogica and its representations (terminology is part of this)
- model-based learning: a focus on parts, processes, and mechanisms
- problem solving (progressive, domain-specific hints embedded in activities)
- embedded assessment (formative questions interspersed in the activities and a summative, printable assessment at the end)
Tasks are embodied in BioLogica activities. Components of activities are coded to indicate the type of performance required (listed above under model-based learning gains) as well as the parts, processes, and mechanisms of genetics involved; a sketch of such a coding scheme follows.
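As a minimal sketch of what such a coding scheme might look like, the record below pairs a performance code with the genetics content involved; the field names and code values are our own hypothetical illustrations, not the project's actual codebook.

```python
from dataclasses import dataclass

@dataclass
class ActivityComponent:
    activity: str          # hypothetical activity name
    component_id: str      # question or challenge within the activity
    performance: str       # e.g., "cause-to-effect", "representation"
    genetics_content: str  # the part, process, or mechanism involved

components = [
    ActivityComponent("Monohybrid", "Q3", "cause-to-effect", "dominance"),
    ActivityComponent("Meiosis", "Q7", "representation", "chromosome pairing"),
]

# Coding components this way lets logged responses be aggregated by
# the type of performance each component demands.
by_performance = {}
for c in components:
    by_performance.setdefault(c.performance, []).append(c.component_id)
print(by_performance)   # {'cause-to-effect': ['Q3'], 'representation': ['Q7']}
```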
R&D Activities for 2000-2001
Drawing on prior research and on classroom observations during early implementations, we have been building different types of cognitive scaffolding into BioLogica activities. These include:
- orienting tasks and advance organizers to focus the learner's attention on the task at hand
- text and other aids to help students interpret and reason with the representations of DNA, genes, chromosomes, and organisms
- tasks (challenges and puzzles) that require reasoning from cause to effect and from effect to cause, both within a generation and between generations
- embedded multiple choice and essay questions to focus the learner's attention on the parts and processes of genetics and to foster deeper processing by asking learners to re-present their observations and reasoning in their own words
- progressive, context-sensitive hints (see the sketch below)
- opportunities to reflect on their own learning at the end of the activity
These interventions have been implemented in only a small portion of the activities to date.
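For instance, the progressive, context-sensitive hints listed above might work roughly as follows; the task identifier, hint texts, and trigger policy in this sketch are hypothetical.

```python
# Hypothetical sketch: hints grow more specific with each failed attempt.
HINTS = {
    "predict_phenotype": [
        "Look at the two alleles the organism carries for this gene.",
        "Which of those alleles is dominant? Check the chromosome view.",
        "If a dominant allele is present, the dominant trait is expressed.",
    ],
}

def next_hint(task_id, failed_attempts):
    """Return an increasingly specific hint, capped at the last one."""
    hints = HINTS.get(task_id, [])
    if not hints or failed_attempts == 0:
        return None             # no hint before the first failure
    return hints[min(failed_attempts, len(hints)) - 1]

for n in range(4):
    print(n, next_hint("predict_phenotype", n))
```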
At the same time, we have been developing methods of data collection and analysis to help us answer the questions posed. We have modified the pre- and post-tests developed for the GenScope project to better reflect the goals of the BioLogica project. We have retained questions that require reasoning about genetics problems. Questions have been added that focus on students' understanding of the terminology, representations, and models in BioLogica. Questions from the MCAS and NY Regents exams have also been added as a measure of transfer and to gauge potential impact on the few genetics questions found on such exams. The pre- and post-tests are currently paper-and-pencil tests so that they can be administered to control classes and to experimental classes without requiring access to computers. We will explore the possibility of administering an online version.
We have also created survey questionnaires that elicit information about
- teacher practices and prior experiences using computers in science teaching
- student epistemology and perceptions of themselves as science learners
- student and teacher experiences using BioLogica.
We have implemented data logging in most BioLogica activities. The activity scripts generate time-stamped log files that record the user's name, the date, responses to questions and challenges, and what actions the user took and when. For instance, we can examine when and how many times the user changed the alleles or traits of an organism, as well as the user's responses to conceptual questions.
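As an example of the kind of analysis these logs support, the sketch below counts allele changes per user. It assumes, purely hypothetically, a comma-separated log of user, timestamp, action, and detail; the real BioLogica log format may differ.

```python
import csv
from collections import Counter

def count_allele_changes(log_path):
    """Count how many times each user changed an allele (hypothetical
    'change_allele' action code and four-column CSV log assumed)."""
    counts = Counter()
    with open(log_path, newline="") as f:
        for user, timestamp, action, detail in csv.reader(f):
            if action == "change_allele":
                counts[user] += 1
    return counts

# e.g., count_allele_changes("biologica_log.csv")
```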
Field Trials
By the end of the current school year (2000-2001), together with our collaborators at the National Classroom, we will have piloted BioLogica in a total of six public high schools: two urban, two suburban, and two rural. In April we brought together a total of 24 middle school students from three different schools on Martha's Vineyard for a three-day intensive trial during a school vacation week. In addition, teachers in three middle schools and a private high school chose to try it out in their classes. In all, nearly twenty teachers and approximately 700 students will have used various versions of BioLogica by the time school is out.
Data Collection
In all of the sites we have collected the following data:
- Pre- and post-tests of content knowledge. The post-test is divided into three sections, which measure learning gains in general genetics knowledge, understanding of vocabulary, and ability to use models to reason at multiple levels.
- Student questionnaires, one before the BioLogica module and one after it. The first deals with the students' prior experiences in learning science; the second provides feedback on BioLogica itself.
- Teacher questionnaires, one on prior experiences teaching science, the other on reactions to BioLogica.
- Data logs generated by BioLogica Activities.
In addition, classroom activities during the Martha's Vineyard implementation were videotaped, as were interviews with selected students. Informal classroom observations took place when members of the team were present in classrooms to help support the teachers and students in their use of BioLogica.
Data Analysis: A Work in Progress
We are currently analyzing the data we've collected. We must stress that analysis is still preliminary and the bulk of the data remain to be analyzed. We expect to complete data analysis by the end of the summer. Our results will be updated weekly.
Data analysis will proceed in phases. The first step involves scoring all the pre- and post-tests and performing multiple analyses of variance. Statistical analysis will also be done on the students' data logs in order to characterize the learning processes of skilled and unskilled learners. Such analyses will no doubt generate questions about why differences did or did not appear, and a need for more descriptive explanations of the results.
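As a minimal sketch of this first phase, assuming (hypothetically) that scores have already been tabulated by condition, gain scores can be compared with an analysis of variance. The numbers below are illustrative placeholders, not project data; with only two groups the ANOVA reduces to a t-test.

```python
from scipy import stats

# Illustrative placeholder scores per condition (not collected data).
pre  = {"biologica": [42, 55, 48], "control": [44, 51, 47]}
post = {"biologica": [71, 80, 69], "control": [58, 63, 60]}

# Gain score (post minus pre) for each student, per condition.
gains = {cond: [b - a for a, b in zip(pre[cond], post[cond])]
         for cond in pre}

# One-way ANOVA on gains across conditions, a simplification of the
# multiple analyses of variance described above.
f_stat, p_value = stats.f_oneway(gains["biologica"], gains["control"])
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```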
The second phase will involve more qualitative analysis of the data logs and the pre- and post-tests. This will require analyzing the content of the responses that learners typed or wrote and tracing the development of their models over the course of their use of BioLogica. In addition, we will look for different patterns of use among students with different levels of prior knowledge and learning gains. The Martha's Vineyard implementation will be used to construct a case study of different learning trajectories through BioLogica and different outcomes.
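The pattern-finding this involves might look roughly like the sketch below, assuming (hypothetically) per-student records that combine a pretest score with a usage metric extracted from the logs; all values are illustrative, not collected data.

```python
from statistics import mean

# Hypothetical per-student records: pretest score plus a usage metric
# (here, allele changes) extracted from the data logs.
students = [
    {"id": "s1", "pretest": 35, "allele_changes": 22},
    {"id": "s2", "pretest": 70, "allele_changes": 9},
    {"id": "s3", "pretest": 40, "allele_changes": 18},
    {"id": "s4", "pretest": 65, "allele_changes": 12},
]

# Split at the median pretest score into low/high prior knowledge groups.
median_pre = sorted(s["pretest"] for s in students)[len(students) // 2]
low  = [s for s in students if s["pretest"] <  median_pre]
high = [s for s in students if s["pretest"] >= median_pre]

for label, group in (("low prior knowledge", low), ("high prior knowledge", high)):
    print(label, "mean allele changes:",
          mean(s["allele_changes"] for s in group))
```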
The questionnaires and surveys will be analyzed and used to inform our analysis of the learning gains and to refine the classroom level of our model of learning with hypermodels.
Next Steps
From the data analysis we will construct case studies of the different implementations, situated within the context of the overall picture. For instance, from the Martha's Vineyard data we expect to be able to describe what students with little prior exposure to genetics learned and how they used the BioLogica activities to accomplish that learning. We will describe the experiences of students with different levels of learning gains and make cross-case comparisons to elucidate the interactions between learners' prior knowledge and BioLogica's representations and activities that produced the learning.
References
Bempechat, J., London, P., & Dweck, C. (1991). Conceptions of ability in major domains: An interview and experimental study. Child Study Journal, 21, 11-36.
Boulter, C. J., & Buckley, B. C. (2000). Constructing a typology of models for science education. In J. K. Gilbert & C. J. Boulter (Eds.), Developing models in science education (pp. 25-42). Dordrecht, The Netherlands: Kluwer.
Buckley, B. C. (1992). Multimedia, misconceptions and working models of biological phenomena: Learning about the circulatory system. Unpublished doctoral dissertation, Stanford University.
Gobert, J., & Discenna, J. (1997). The relationship between students' epistemologies and model-based reasoning. Paper presented at the annual meeting of the American Educational Research Association, Chicago.
Goldman-Segall, R. (1998). Points of viewing children's thinking: A digital ethnographer's journey. Mahwah, NJ: Lawrence Erlbaum.
Grosslight, L., Unger, C., Jay, E., & Smith, C. L. (1991). Understanding models and their use in science: Conceptions of middle and high school students and experts. Journal of Research in Science Teaching, 28(9), 799-822.
Guba, E. G., & Lincoln, Y. (1981). Effective evaluation. Beverly Hills, CA: Sage Publications.
Hammersley, M., & Atkinson, P. (1996). Ethnography: Principles in practice. London: Routledge.
Hein, G. (1969). The impact of "stuff" in the classroom. Educational Technology, 9(12), 54-58.
Hickey, D. T., Kindfield, A. C. H., Horwitz, P., & Christie, M. (1999). Advancing educational theory by enhancing practice in a technology supported genetics learning environment. Journal of Education, 181(2), 25-55.
Horwitz, P. (1995). Linking models to data: Hypermodels for science education. The High School Journal, 79(2), 148-156.
Horwitz, P., & Barowy, W. (1994). Designing and using open-ended software to promote conceptual change. Journal of Science Education and Technology, 3(3), 161-185.
Horwitz, P., & Christie, M. T. (1999). Hypermodels: Embedding curriculum and assessment in computer-based manipulatives. Journal of Education, 181(2), 1-23.
Lawrence-Lightfoot, S., & Davis, J. H. (1997). The art and science of portraiture. San Francisco: Jossey-Bass Publishers.
Meece, J. L., Blumenfeld, P. C., & Hoyle, R. H. (1988). Students' goal orientations and cognitive engagement in classroom activities. Journal of Educational Psychology, 80(4), 514-523.
Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass Publishers.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). London: Sage Publications.
Nicholls, J. G., Patashnick, M., & Nolen, S. B. (1985). Adolescent theories of education. Journal of Educational Psychology, 77(6), 683-692.
Patrick, H., Ryan, A. M., Anderman, L. H., Middleton, M., Linnenbrink, L., Hruda, L. Z., Edelin, K., Kaplan, A., & Midgley, C. (1997). Observing patterns of adaptive learning: A protocol for classroom observations (OPAL). Ann Arbor, MI: Leadership & Learning Laboratory, University of Michigan.
President's Information Technology Advisory Committee (2001). Using information technology to transform the way we learn. Washington, DC: PITAC.
Seidman, I. (1998). Interviewing as qualitative research: A guide for researchers in education and the social sciences (2nd ed.). New York: Teachers College Press.
Weiner, B., Graham, S., Taylor, S. E., & Meyer, W. (1992). Social cognition in the classroom. Educational Psychologist, 18(2), 109-124.