Notes from Course on Research in Teacher Education

EDPSY 597B

Class 2

Thursday, January 22, 2009

Guest Lecturer: Dr. Richard Duschl

 

Handbook of Research on Teaching

Goals: Understand the term paradigm

 

Joseph Schwab

  • Leader of BSCS reform curriculum in science teaching.
  • Became a reform theorist
  • Came up with the four commonplaces of schooling

    • The learner and learning
    • The teacher and teaching
    • Subject matter
    • The milieu (the classroom learning environment)
  • When you want to understand what is happening in a classroom, you can examine it through one of the four commonplaces of schooling

 

Notes from Today:

  • Behaviorism is currently the received view or the established paradigm
  • Research reports – are available online for free:

    • Knowing What Students Know
    • How People Learn
  • Early 1900s – The Differential Perspective

    • Started by the U.S. Army
    • A sorting-out process based on ability/IQ tests
    • Nature of individual differences – individuals differ in their mental capacities
  • The Behaviorist Perspective

    • Focus on behaviors – must be observed
  • The Cognitive Perspective

    • When we started to ask questions about what is happening in the mind.
    • Focus on reasoning and solving problems
    • How knowledge was encoded, stored, organized, and retrieved
    • It is informed by artificial intelligence and memory studies – how do you recall information?
    • Memory is about chunking information that activates the recall of other information.
    • Popular instructional strategy is concept mapping. The act of constructing the concept map or an outline is a way of chunking information.
    • Also known as information processing.
    • Had a lot to do with computers – Can we get a computer to behave like a human?
    • What was happening in the brain was logical and very calculated
  • The Situative Perspective

    • Emphasis on social and language components
    • Vygotsky & Bruner
    • There is a social or socio-cultural or socio-historical perspective
    • Language matters. The social context matters. Who is doing the talking?
  • We know that children have an innate module for causal reasoning and another innate module for number.
  • What are the beliefs of the teacher? What beliefs are present in the curriculum? What are the beliefs of the school culture? (For example, if the school uses the Madeline Hunter model, then it has a behaviorist learning theory.) To research this, it is important to look at Schwab's four commonplaces and how the first three come together to create the fourth – the milieu.

 

Thomas Kuhn:

  • All disciplines go through periods of scientific revolution. A discipline has achieved a paradigm when its members are in broad agreement.
  • For example, psychology does not have a paradigm; there is no consensus on how people learn. Biology does have a paradigm – DNA; chemistry – the periodic table of elements; geology – plate tectonics.
  • A paradigm is an umbrella under which things sit so that academics can function.
  • When people or disciplines have different paradigms, they can't communicate, according to Kuhn.
  • He shifted the complexity of our thinking, pushing us beyond factual information to theories.

 

 

Three major disciplines have impacted educational research: psychology, anthropology, and economics. Economic models are coming in to dictate how schools should function.

 

 

EDPSY 597B

Class 3

Thursday, January 29, 2009

 

  • Both teacher education and professional development research have traditionally been weak.
  • We will choose some of the readings that we will do.
  • Your questions drive your methods. Given a question, what are the different ways you can address it? What are the trade-offs?

 

Types of Research Methods

  • Stallings Snapshot (Jane Stallings – President of AERA and mentor of Dr. Knight) – a quantitative method; a way to become familiar with systematic observation
  • Stimulated Recall Analysis – you videotape the teacher. You mark the incidents, or the teachers decide where to make the marks. You interview them at the same time to try to get at their decision-making. It's an attempt to see how their beliefs and knowledge impact practice. It can be either quantitative or qualitative; she often takes qualitative data and quantifies it.
  • Interview/Content Analysis – primarily a qualitative approach
  • Focus Group Interviews – looks at barriers, etc.
  • Video Case Analysis – video a teacher, do some sort of intervention, video again, and then compare the two videos.

 

  • Design experiments are a way of implementing and evaluating cycles of an intervention. You can improve the intervention as you go along while still studying it.

 

A classic study by Sleeter – integrating culturally appropriate strategies into mathematics workshops. Survey results said that the teachers believed they had learned the strategies and were implementing them. There was a mismatch between what the teachers believed they were doing and what they were actually doing.

 

To Do:

Rank order the Methods and Applications

Pick 5 and rank order them.

 

 

Review and Preview:

  1. Paradigms

    1. There's often a lag in the paradigms in use. For example, NCLB was using a behavioral process/product approach when the university had shifted to a more socio-cultural and cognitive approach.
  2. Jigsaw – See notes on document
  3. Back to the Future
  4. Because we have to define things all of the time, it slows the field down.
  5. When defining, ask whether there are attributes that would qualify something under a particular term.

 

EDPSY 597B

Class 4

Thursday, February 5, 2009

 

  • Research in teacher education has a bad reputation, partly because of poorly done qualitative research. Some studies have not clearly articulated their methods so that they can be critiqued; others have not recognized their possible bias.

 

The Tennessee Study

  • A longitudinal study in which students were actually randomized into classrooms, which is unheard of in education. They tried different configurations and followed students through different teachers.
  • They found that class size made a difference. Students in classes of fewer than 15 did better, and they continued to do better when they had to go to larger classes.
  • They didn’t implement it because it cost too much.

 

Research isn't the only factor that determines implementation, because education is a social, political, and moral entity.

 

Slavin – Best Evidence Synthesis – a synthesis of quantitative and qualitative studies

 

Cross-case analysis – picking different cases within a study or studies and comparing them.

Cross-study analysis or synthesis – picking different studies to identify similarities and differences

 

For the AERA summary, which parts impact your studies….

 

 

EDPSY 597B

Class 5

Thursday, February 12, 2009

 

The data collection method depends upon the question. They have to be aligned.

Prioritize your research questions. Which one needs to be asked first?

The question drives the method.

 

Replication could mean doing the same study with different characteristics to identify the scope of the generalizability.

Using different methodologies across studies can show whether they produce different findings.

 

There is NO perfect study. There are always trade-offs. It helps to recognize the bias because there will always be some.

 

The chain of reasoning shows the rationale connecting your theoretical framework and your decisions. Your findings need to link back as well.

 

“What is the effect…” usually indicates a causal study but is often used incorrectly in correlational studies.

 

When asking your research questions, use the same rule of thumb that we do with the interns – don’t ask a yes or no question.

 

Question – Capel (1997) did a study about preservice teachers' anxiety about the evaluation of their practical/clinical experiences. My wondering – Is the preservice teachers' fear a result of their linking supervision and evaluation? Are the two linked in the experience? If the two were differentiated, would the anxiety still exist?

 

Woods & Weasmer 2003 did a study about the importance of establishing clear communication between the mentors and interns. In the PDS, we have the Weekly Check-in as an instrument to facilitate this communication. I wonder if we (meaning PDAs) should have weekly check-ins with mentors, too.

 

EDPSY 597B

Class 6

Thursday, February 19, 2009

 

Jane Stallings and Ted Stevens were part of the process-product movement and were advocates for it.

 

Snapshot – Time sampling

  • Captures the kinds of interactions between adults and children
  • The size of groups
  • Kinds of materials (Mode – my term)

 

5-Minute Interaction – Event Sampling

 

 

EDPSY 597B

Class 7

Thursday, February 26, 2009

 

Focus Group Interviews:

  • With FGIs, you need to make sure that there are no power differentials in the group.
  • It’s sometimes best to do it with people who don’t know each other.
  • It can be used to explain quantitative data.
  • It’s great for evaluating curriculum.

 

We could use FGIs with interns who have been in SYP and those who were not. However, it's best if the people don't know each other, which isn't the case for the interns.

 

 

EDPSY 597B

Class 8

Thursday, March 5, 2009

 

Design Experiments:

  • Context is critical.
  • Cyclical in nature
  • It begins with an innovation or intervention grounded in theory. The intervention is implemented and then analyzed; it is then revised, and the new version is implemented again, and the cycle continues. The DEAR Cycle (Design, Enact, Analyze, Redesign) (Stephanie's Cycle) – see the sketch after this list.
  • Assessment is a big part because it is important to understand what is happening.
  • It’s both deductive and inductive.
  • Theory informs/refines/creates practice and practice informs/refines/creates the theory.
  • It's similar to product development in engineering, except that in education the product is the intervention or the innovation.
  • It’s a type of longitudinal study.
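Purely as an illustration, here is a minimal sketch of the DEAR cycle written as a loop. The function names and toy data are hypothetical, not part of any study discussed in class; the point is only the design → enact → analyze → redesign structure in which theory and practice keep informing each other.

```python
# A toy, self-contained sketch of the DEAR cycle (Design, Enact, Analyze, Redesign).
# Every function and all "data" here are hypothetical placeholders, not any real
# study; the point is only the cyclical structure in which each round of analysis
# feeds the next design while the intervention is still being studied.

import random

def design(theory):
    # Design: an intervention grounded in theory (represented here as a simple dict).
    return {"feature": theory["starting_feature"], "intensity": 1}

def enact(intervention):
    # Enact: implement the intervention and collect evidence (toy outcome scores).
    return [random.gauss(intervention["intensity"], 1.0) for _ in range(20)]

def analyze(evidence):
    # Analyze: assess what happened (here, just the mean outcome).
    return sum(evidence) / len(evidence)

def redesign(intervention, finding):
    # Redesign: revise the intervention in light of the analysis.
    revised = dict(intervention)
    if finding < 2.0:
        revised["intensity"] += 1
    return revised

theory = {"starting_feature": "peer discussion"}
intervention = design(theory)
for cycle in range(3):  # the cycle continues: enact, analyze, redesign, re-enact
    finding = analyze(enact(intervention))
    intervention = redesign(intervention, finding)
    print(f"Cycle {cycle + 1}: mean outcome {finding:.2f}, next design {intervention}")
```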

 

Borko and Putnam article in Educational Researcher about how teachers learn, for our Core 4 Study.

 

For Core 4, we need to articulate what outcomes may happen, and they become the dependent variables. We also need to identify the independent variables. What are the characteristics of the experience they are having that are going to relate to the outcomes?

 

Multiple Methods

  • Abbas Tashakkori and Charles Teddlie – quantitative researchers
  • Onwuegbuzie
  • John Creswell has characterized multiple methods studies.
  • How are they used? Are they concurrent? Sequential? Are they confirmatory? Each one must contribute to the knowledge that is gained; it cannot be an add-on.
  • Multiplists – p. 460/461
  • A dialectical approach is a conversation between the data sources. You want to look at discrepant data to inform the analysis. Multiple methods lend themselves to a dialectical approach in which one method talks to the others, with iterations of confirmation and disconfirmation. A multiple-methods study can take a dialectical approach because you are juxtaposing multiple views, comparing and questioning them; it is used to confirm or disconfirm.

 

EDPSY 597B

Class 9

Thursday, March 26, 2009

 

Final Project:

  • Create a 5-7 page document that lists the potential sources and a plan of action to tackle a comps question that deals with teacher education's place.
  • Teacher ed. is not considered a profession because it doesn't have a knowledge base; it is merely a craft. I would be arguing that there is a knowledge base, tracing it through the seminal works.
  • EDPSY Final due May 15th!!!

 

Stimulated recall is used to get at cognitive processes – things that are not visible. It's a way of making them transparent. We try to devise the questions so that participants can't reconstruct what is going on: unstructured in our questions, structured in our intention. When you have a video, you must decide what to focus on. Time needs to be spent deciding whom the video should focus on – who is the object of the video. We also have to make choices about viewing the tape: do we stop it at appropriate times, or should the participant stop it? We do the analysis.

EDPSY 597B

Class 10

Thursday, April 2, 2009

 

Stimulated Recall

  • Be sure to record the conversation. The conversation is then transcribed and coded. It can be quantified using thought units: separate the transcription into thought units, group them into like categories, and then look across categories for themes (see the sketch after this list).
  • One way to combine the methods is to have a longer time segment and tell the participant to stop the video if s/he is making a decision before the time frame has expired. This way the two methods would be combined allowing for the participant to still have choice while also trying to capture the unconscious or the non-verbalized thinking.
  • One downfall is that the reflection is not in real time. How can we get to real time (Twitter?)
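A minimal sketch of the quantification step described above, using invented thought units and category codes (not data from any real interview): segment the transcript, assign codes, count units per category, and then look across categories for themes.

```python
# A toy sketch of quantifying a stimulated-recall transcript with thought units.
# The thought units and category codes below are invented examples; the point is
# the segment -> code -> count -> compare workflow.

from collections import Counter

# Each thought unit has already been segmented out and assigned a category code.
coded_thought_units = [
    ("I slowed down because they looked confused",    "monitoring students"),
    ("I decided to skip the warm-up",                  "pacing decision"),
    ("I was already thinking about the next activity", "pacing decision"),
    ("I noticed two students were off task",           "monitoring students"),
    ("I wanted to connect it to yesterday's lesson",   "content decision"),
]

# Group like units into categories and count them.
category_counts = Counter(code for _, code in coded_thought_units)

for category, count in category_counts.most_common():
    print(f"{category}: {count} thought unit(s)")

# Looking across categories for themes happens on top of these counts,
# e.g., comparing which categories dominate for different teachers or lessons.
```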

 

EDPSY 597B

Class 11

Thursday, April 9, 2009

 

Interview Analysis:

  • Spradley (a foundational work in interview analysis) offers a prescription for interviewing and suggests beginning with a grand tour question.
  • Ethnographic interviews begin with a broad overview and do not narrow the focus until later, as the data begin to unfold.
  • See handout.
  • The number of codes is determined by the complexity of the interviews.
  • You want one master copy that is unaltered that has been coded.

 

Spradley suggests:

  • Record words → determine codes → code units → organize codes into domains to understand the relationships between the codes → organize the domains into themes to understand the relationships between the domains
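To make the nesting concrete, here is a small sketch of that progression with entirely invented codes, domains, and themes; it only illustrates how coded units roll up into domains and domains into themes.

```python
# A toy sketch of Spradley's progression from coded units to domains to themes.
# All codes, domains, and themes here are invented illustrations, not findings.

coded_units = {
    "asks mentor for feedback":              "seeking support",
    "emails supervisor with questions":      "seeking support",
    "worries about observation day":         "evaluation anxiety",
    "rehearses the lesson the night before": "evaluation anxiety",
    "tries a strategy from methods class":   "applying coursework",
}

# Domains group related codes so the relationships between codes become visible.
domains = {
    "relationships with mentors": ["seeking support"],
    "being evaluated":            ["evaluation anxiety"],
    "university coursework":      ["applying coursework"],
}

# Themes group related domains so the relationships between domains become visible.
themes = {
    "learning to teach is mediated by others": ["relationships with mentors", "university coursework"],
    "assessment shapes the intern experience": ["being evaluated"],
}

for theme, theme_domains in themes.items():
    print(theme)
    for domain in theme_domains:
        codes = domains[domain]
        units = [unit for unit, code in coded_units.items() if code in codes]
        print(f"  {domain}: codes {codes}, units {units}")
```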

 

EDPSY 597B

Class 12

Thursday, April 23, 2009

 

Portfolios:

  • There are only two ways to go – qualitative or quantitative rubrics. The National Writing Project has some sample rubrics. There are rubrics out there, so don’t reinvent the wheel unless needed.
  • Devise the rating scale, try it out…see Developing Rubrics PDF.
  • See p.2 in Rating Scales PDF for steps in a rating scale development
  • Ideally the papers should be blind reviewed.
  • Stephanie suggests sorting all into piles before grading.
  • Validity Issue

    • You must pay attention to different kinds of validity. It depends on the nature of the portfolio.
    • You know that you're doing the right thing when you have some correlation (greater than .5) between multiple pieces of evidence.
    • Take 20% of the portfolios and double-code them for reliability. You need some type of coefficient. You can report it as a range with a mean or median reliability.
    • Have inter-rater reliability parties – you only get food and beverage when you finish. (So, if you have a hundred examples, you need enough people to cover the 20%.)
  • Inter-rater reliability
  • Cohen's kappa is more accepted because it takes chance agreement into consideration.
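As a worked example, Cohen's kappa is κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement between two raters and p_e is the agreement expected by chance. Below is a minimal sketch with invented pass/fail ratings (not real portfolio data); the same calculation could be run on the 20% of portfolios that are double-coded.

```python
# A toy worked example of Cohen's kappa for two raters, using invented pass/fail
# ratings rather than real portfolio data. kappa = (p_o - p_e) / (1 - p_e), where
# p_o is observed agreement and p_e is chance agreement, which is why kappa is
# preferred over simple percent agreement.

from collections import Counter

rater_a = ["pass", "pass", "fail", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
rater_b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "fail"]
n = len(rater_a)

# Observed agreement: proportion of cases where the two raters gave the same rating.
p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement: based on how often each rater uses each category.
counts_a, counts_b = Counter(rater_a), Counter(rater_b)
p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in set(counts_a) | set(counts_b))

kappa = (p_o - p_e) / (1 - p_e)
print(f"observed agreement {p_o:.2f}, chance agreement {p_e:.2f}, kappa {kappa:.2f}")
```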