ISEN-ASTC-L Archives

Informal Science Education Network

ISEN-ASTC-L@COMMUNITY.LSOFT.COM

Sender:
Informal Science Education Network <[log in to unmask]>
Date:
Wed, 6 Sep 2006 23:06:47 -0400
Reply-To:
Informal Science Education Network <[log in to unmask]>
Subject:
From:
David Smith <[log in to unmask]>
ISEN-ASTC-L is a service of the Association of Science-Technology Centers
Incorporated, a worldwide network of science museums and related institutions.
*****************************************************************************

Here's another vote for beginning with objectives and then choosing the
evaluation that makes sense.  Below, I have attached the evaluation plan
(slightly edited) from a successful proposal to the PA Dept of Ed as an
example.  Measures are linked directly to the objectives of the
proposal, and were developed in consultation with the external evaluator
(one of the benefits of hiring them is they help you write this sort of
stuff and they know about useful instruments and tools). If your goal is
to increase vocabulary knowledge, then the vocabulary quiz is a measure
that addresses that.  Of course, you also have to worry about whether
kids in a more or less voluntary program are going to have any interest
in a vocabulary quiz.  In the PDE program, we were ultimately required
to add a measure for student achievement.  We refused to add a test for
ethical reasons, and we also refused to do a true experimental design;
we did, however, use a comparison-group (quasi-experimental) design.
We used science notebooks, guided by the work of Ruiz-Primo and
Shavelson at Stanford, and scored them with a rubric.  This approach
works very well for providing evidence of science learning (including,
but definitely not limited to, appropriate use of vocabulary) in kids
as young as first grade.
Evaluation does turn out to be one of the major time and dollar
commitments of the project, but now we know we are having real success
(not just that everybody likes the program, though we're happy about
that, too) and can use that with other funders and even politicians.

It really helps if the goals are outcome-oriented instead of
process-oriented.  For example, "Engage teachers in two weeks of
inquiry-based professional development" is a process goal.  It tells
you about the program, but it doesn't tell you anything about what
happened to the teachers as a result.  "Increase teachers' content
knowledge" is an outcome goal.  It tells you about a change in the
teacher as a result of the program.  

Given your comments, you seem to be most interested in conceptual
knowledge.  The goal for the ferret lesson might be something like
"Children will be able to improve their ability to identify the habitats
of different organisms."  It's only one session, so you are not looking
for perfection, but an increase in ability.  You can do a pre-post
design where you ask kids to do a minute-write or a quick-draw about
their own habitat.  It's not a test; it's part of the program.  It can
also be their ticket into the animal activity and their ticket out to go
home.  Sure, not all will take it seriously, but you do what you can.  A
goal for the program might be "Children will increase their
understanding of the relationship between living organisms and their
environments."  You don't necessarily need a separate assessment for
this goal, you can accumulate the smaller assessments into a body of
evidence, or you can have kids keep a journal/notebook and use it to
show the accumulation of knowledge, or the drawing of connections, or
the increasing detail of observations, or whatever your program is
trying to do.  If you can write down some goals, I'd be happy to help
brainstorm some measures.

Dave


Evaluation Plan

Independent evaluators and project staff will conduct formative and
summative evaluations of teacher learning and teacher practice, linked
to each objective as follows:
1.	Increases in teachers' content knowledge:  Science inquiry
performance assessments, lesson plan audits, and classroom observations
will be administered to a sample of participating and
non-participating teachers each year.  The performance assessment will
evaluate the depth of content knowledge and the ability to apply it to
novel situations, two benchmarks of expertise.
2.	Integration of science, language arts, and math:  Lesson plan
audits and classroom observation, as above, from a sample of project and
non-project teachers will be evaluated for content and appropriateness
of the learning strategies for each content domain.
3.	Increasing teachers' ability to plan and conduct an inquiry
lesson: Same as #2, above.
4.	The ability of teachers to lead professional development:
Observation of professional development study groups and participant
evaluation of professional development sessions will provide direct
formative evaluation.  Items 1-3 above will also provide indirect
evidence of the effectiveness of turn-around training. 
Focus group interviews at the close of the project will provide
additional summative evaluation.
Available student achievement data (such as math, reading, and writing
PSSA scores) will be compared across project and non-project
classrooms as part of the district's ongoing self-evaluation.

David L. Smith, Ph.D.
Director of Professional Development
Da Vinci Discovery Center, Allentown, PA 
http://www.davinci-center.org
"Who will pick up where Leonardo left off?"


> 
> Jonah Cohen wrote:
> 
> >OK, I ask you, my cohorts, for advice, and please tell me if my gut
> >reaction is way off.
> >
> >Here's the deal: we've gotten mucho $$$ to do after-school programs
> >at a number of local Boys & Girls Clubs. We'll be doing ~10 programs
> >at each club. So far so good. Apparently, one of the stipulations of
> >the grant is that we have to assess the impact our programs have on
> >the kids at B&G. Our bigwigs have tentatively decided we should come
> >up with a bunch of vocabulary words, give the kids a quiz before the
> >programs, then give them one after the programs to see if they've
> >learned anything.
> >
> >My gut reaction? This is complete {#$&*-Censored- (!^`#$} Here's why:
> >
> >1)       It takes our programs, designed to cover a wide spectrum of
> >scientific knowledge, and reduces them to rote regurgitation of a
> >couple of words. Yech. Not that vocabulary isn't important or
> >anything, but if, say, one of our live animal programs demonstrates
> >how animals have to be well suited to live in their homes (for
> >example, we show how a ferret's flexible body is useful for
> >maneuvering underground), kids can grok that even if they don't
> >recall the meaning of "adaptation" or "subterranean". I'd prefer
> >that they do, but assuming that the ability to do so is the best way
> >to assess our stuff doesn't make sense to me.
> >
> >2)       Oy, what a logistical nightmare. We've done programs at
> >Boys & Girls Clubs before. It's pretty chaotic. Every week the kids
> >who show up are different. Often their parents are picking them up
> >mid-program. How on earth are we supposed to assess the kids based
> >on who attended which program and was there for specific vocab or
> >other info? Or, for that matter, who shows up for the assessment?
> >
> >3)       Worst of all, a Boys & Girls Club isn't school. Kids go
> >there for fun, not to be tested (again). A big part of what we do
> >involves getting kids to have a positive attitude about science, to
> >enjoy science, and to gain confidence that they can do science. If
> >they see it as (yet another) round of testing, they may simply not
> >show up for our programs and just go to B&G to play basketball or do
> >other things.
> >
> >I'm sure these beefs sound familiar, as the listserv has seen much
> >talk about standardized tests - but the fact that they're now being
> >transferred to non-school environments is an annoying wrinkle. So,
> >now that I've vented/made my case, my questions to you are:
> >
> >1)       Am I nuts in my above complaints?
> >
> >2)       Anyone have any experience assessing the impact of programs
> >like this in a setting as thoroughly informal as B&G? Any advice?
> >Help me. Please.

***********************************************************************
More information about the Informal Science Education Network and the
Association of Science-Technology Centers may be found at http://www.astc.org.
To remove your e-mail address from the ISEN-ASTC-L list, send the
message  SIGNOFF ISEN-ASTC-L in the BODY of a message to
[log in to unmask]
