Assessment of students - Compilation

Ian_Max.Moller at fysbot.lu.se
Wed Apr 30 11:37:58 EST 1997


Dear Plant Ed'ers

On 7 March and again on 18 March I posted the following question to the group:


"I am a lecturer in plant physiology and presently=20
I am attending a course in the=20
evaluation/assessment of student performance - not=20
just exams, but anything that we can use to check=20
how much they have learnt, what they have learnt=20
and, most importantly, to what extent they really=20
understand what they have learnt. Although I have=20
been teaching for a number of years, it is only=20
now dawning on me just how difficult a task this=20
is.

As part of this course I have to do a small=20
project and I have chosen to ask you to help me=20
compile a list of all the various forms of=20
assessing student performance that you have tried.=20
One very common way is a final exam with various=20
types of questions (multiple choice, true/false=20
and essays). But there are many other forms.=20
Please tell me what forms you have tried (applied=20
at any time during the course and used to evaluate=20
any aspect of the course), what type of course it=20
was tried in (subject, number of students,=20
duration of course), how successful you thought=20
the various evaluation forms were and what the=20
student responses were.=20

I very much appreciate your response before the=20
end of March. I intend to post a compilation=20
sometime during April."


Here are the answers I received, roughly in chronological order:


1) From: Brian Scott Dunston <bdunston at whale.st.usm.edu>

Often, I use essay-type questions that require the students to utilize experimentally derived data, develop a conclusion, and provide the basis for designing a set of experiments to expand upon the data provided.

2) From: Todd.Silverstein at plantcell.lu.se (Todd Silverstein)

Here are some of my thoughts on assessing students' performance in class. In my own classes (intro chem, 2 semesters; biochem, 1 semester), I generally stick to pretty traditional assessment instruments: quizzes, hour exams, and final exams. I do incorporate a literature search in the biochem class every semester: the students pick a topic that we've covered that they enjoy, go to the library to find a research article on that topic, read the article, and write a simple 2-3 page summary of it. I grade their summary on their understanding of the article, as well as on their presentation (prose, etc.). This works fairly well - it's easy for students to do, and it exposes them to the primary literature and shows them that they can actually understand research articles.

In my intro chem class I sometimes incorporate a "debate" in which I split the class into two or three teams of 3-6 students each, and I assign them a position on either a pressing social issue or an issue in chemistry (e.g., which is better, the molecular orbital model or the valence bond model of molecular bonding and structure). The teams present their arguments based on the relevant chemistry involved. They then decide amongst themselves who "won" the debate, and they give each other grades as well. This is a very popular exercise, but it does eat up a whole class meeting.

On my hourly exams I often include a bonus question at the end asking students what they studied for but found to be missing from the exam - they write a question on this topic and answer it. I then award bonus points for both the question and the answer.

I have tried two other tactics more rarely. I have given open-book exams two or three times in the upper-level biochem course, generally as final exams. Students get a whole weekend to write up their answers. This worked well once, but the second time I did it I found evidence of cheating among a group of three students, so I have not tried it again. Students enjoy the challenge of open-book exams, but they are challenging to write as well. They must be creative and involve much "synthetic" thinking on the part of the students. Open-book exams are definitely not easy to write.

Another idea, which a colleague of mine uses in the intro chem course, is to assign a pair of students the task of outlining the next chapter in the book, picking out the key concepts, and delineating the key implications of the material and the key questions that must be asked and answered. These outlines are then printed up and distributed to the students in the class. The quality of the outlines is uneven, of course, but the experience is good for the pair of students, and the result can sometimes be used by the students as a study guide, and by the professor as a guide in crafting the exam.

One more thing. Once in my biochem course I had each student write up five questions for the final exam. There were about 10 students in the course. I then edited all 50 questions and compiled them onto a single handout, which I distributed to all the students, telling them that I would draw the final exam questions from these 50, plus a few that I would write myself. I believe I also gave students a few bonus points if I ended up using their questions on the exam. It was an interesting experiment, though not wholly satisfactory (I have never done it again). Writing that exam involved editing the student questions and redrafting them so they were clear, carefully choosing questions to give even coverage of the course material, and writing some of my own questions to fill in gaps. After the exam students complained that they had mainly studied only the questions on the sheet, and were therefore unprepared for the questions that I wrote myself. Grades on that exam were no better or worse than on any other. I do feel that students benefited from taking the time to write exam questions, and that seeing their questions on the exam was good for them. Studying from a list of questions gave the students a bit too much structure, though, as they seemed to neglect studying for the course as a whole.

3) From: "Scott T. Meissner" <smeissne at prairienet.org>

I just did an "assessment" of my botany students.

We have been going through the life cycles of plants and algae. I want them to learn how the sporophyte and gametophyte fit into the lives of various algae and to be able to compare these to the plants. For them to do this they have to master the diplobiontic, haplobiontic diploid, and haplobiontic haploid life cycles. So we have covered this in lecture.

Wishing to know if my students had this down, I assigned them to take one minute in class and, without using their notes, draw out either a diplobiontic life cycle or a haplobiontic haploid life cycle, labeling the names of the multicellular and single-celled stages.

Out of a class of 35 people, fifteen were able to give me an excellent life cycle with all the parts. About ten were confused about where meiosis fits in, and the remainder were quite lost.

I collected the papers, without the students' names on them, and used them in lecture as a basis for review. While I was scanning the papers I had two students, whom I knew to be among my better students, draw out the two life cycles on the board. Then I noted a few of the common problems that seemed to have appeared. This was an effective use of lecture time because I was dealing with student-generated problems, and because it forced me to communicate my expectations clearly to the students.

Last night I marked up the original papers with comments, and I will post them in the hallway so that students can see the good and bad ones and have a chance to reinforce, or improve, their understanding of this material.

In a field botany class I once started off having students write a one-minute definition of "species", and then covered the various definitions used by taxonomists, ecologists, physiologists, etc. At the end of the lecture I had them again write a one-minute definition of species. All the pages were anonymous and turned in, so I had a nice idea of my students' understanding of the concept of a species before and after the lecture. I entered these into a computer, printed out their definitions, and used the start of the next lecture to have them argue for and against the various definitions that they had submitted. This made for a good review of the material.

4) From: Anne Heise <aheise at orchard.washtenaw.cc.mi.us>

We have just begun this type of assessment too. One book that you might want to see is by Angelo and Cross; I don't remember the exact title but can look it up for you if you want. It has lots of suggestions about assessment, mostly quickies you would do during a single class period.

One thing we have tried is intended to assess how well the introductory bio course is working. We ask students in 2nd-level courses how well they feel their 1st-level course prepared them for different topics in the 2nd-level course (e.g., cell structure and function, respiration, etc.). However, we have not made much use of this information yet. A difficulty with this approach is that many of the students in our 2nd-level class did not take the intro course at our school.

5) From: "Janice M. Glime" <jmglime at mtu.edu>

All courses are 10 weeks plus exam week:
1.  Weekly email of 2 essay questions per group of 4, based on lecture or lab, and answered by those submitting. These were subsequently redistributed to the entire class with my questions, and each set of questions was graded. First-term freshman course in general biology with 100 students.
2.  A test in week 2, worth half a regular test and designed by the students. Each group submitted what they thought would be a balanced test on what we had covered. All submitted questions and answers were available in the learning center. First-term freshman course in general biology with 100 students.
3.  Skill points on labs, including proper use of the microscope, use of the balance, measuring pH, and dissection. First-term freshman course in general biology with 100 students.
4.  Scavenger hunt, 40 points (items) required, up to 100 possible, with a list, but prohibiting collection of protected items or large samples - for up to 150 students in sophomore botany. Students like it and get others interested in the course; it helps that it occurs just as the snow is disappearing after 6 months.
5.  Student-designed projects, 100 students, first-term freshman biology.
6.  Take-home exam requiring group discussions but an individual write-up on what they thought the evolutionary tree might be for 5 divisions of plants. Sophomore course in plant morphology, 20 students.
7.  Behavioral observations (distinguishing between observation and interpretation), stressing good observations, using Attenborough films with the sound turned off. First-term freshman course in general biology with 100 students.
8.  Weekly lab quizzes - intro bio, botany, plant morphology. 20-150 students.
9.  Tests three times a term plus a final exam; essay and fill-in-the-blank.
10.  Practical exams (1 or 2 plus a final) - plant morphology, botany, general biology; 20-150 students.

All of these were fairly successful, and the students seem to appreciate the variety of forms of assessment. Those with learning disabilities have stated that they feel they have a better chance.

6) From: Mary Williams <mwilliams at THUBAN.AC.HMC.EDU>

Last year I taught a plant development course for 11 senior biology majors. This was a "seminar" course in which we mostly read papers and discussed them, both through formal presentations by the students and through informal discussions. It was relatively easy to evaluate the students' formal presentations - I told them the two most important criteria were to clearly convey the material to the class and to promote discussion. To evaluate the extent to which each student learned the course material, I asked them to write a "grant proposal" of about 10 pages, with approximately 5 pages of background material and 5 pages of proposed "research". From this writing assignment I could clearly assess the degree to which the students had learned the material: their level of understanding of the body of knowledge we had covered, their understanding of the experimental approaches we had covered, and their ability to analyze and interpret data.

The students LOVED the course. One student wrote, "I appreciated her treating us like responsible adults who were interested in taking the class because we wanted to, not because we wanted to get an A in the class," and another wrote, "I liked giving presentations and being active in the course; I felt that this made me learn more since I was active in the learning process."

I think this form of course works very well for advanced students (seniors or maybe juniors) and for small classes, but I don't imagine it would be as easy with a large class or with more introductory material.

I also teach a larger sophomore-level class in which I give three midterms and a final. I stress their understanding of the important concepts, and as such the test questions are almost entirely short essay - usually presenting the results of an experiment and asking them to interpret them, or asking them to think of a way to differentiate between two hypotheses. I believe that I can assess the students' understanding very well with these questions, and it trains them to study and learn in a more productive way than if I gave them a science vocabulary test, which I think rewards memorization but not real understanding.

I also give the sophomores a small writing assignment, which is to find an article in Science or Nature and write a two-page summary of the research - the experimental approach and the significance of the findings. The purpose of this assignment is mainly to get the students familiar with the scientific literature, and it counts for only a small part of their grade.

7) From: "Doug Jensen"=20
<doug_jensen at smtpgtwy.berea.edu>

I have been teaching for a number of years, but only this year do I have full course responsibility. Previously, I taught with a fair amount of supervision and was never able to do exactly as I wished. Thus, my answers are not really based on years of experience.

I teach introductory botany, dendrology, and field botany at a small college. These are all semester-long (15-week) courses. My large classes are around 16 students; small classes (field courses) are around 10-12.

I use 3-4 written exams throughout the semester to evaluate students. The exams are 1 hour long. I feel that this gives me a better assessment of what students know than true/false or multiple choice. The problem is that they take a while to grade and that the grading is said to be subjective. I argue that my assessment of what the students know from a written exam is more accurate than from a multiple-choice or T/F exam.

In the field classes, I use pop quizzes on plant species nearly every lab period (3 times/week). This is a good assessment of whether students know the plants, plus it gives them immediate feedback.

I also use some exercises that students turn in for grades. I correct all of these, but grade them leniently, their purpose being for the students to learn rather than to be assessed. Doing this also allows me to ask tougher questions on exams, because low exam grades are offset by high grades on exercises. Humanities instructors often do exercises like this; they call them 'low-stakes assignments'. Although they are loosely graded (sometimes just with a check or check-minus), a large number of them can be converted into a significant portion of a grade.

I hope this is helpful to you. I dislike assessment as much as anyone, and I think my methods are sufficient, although not perfect. If possible, I would like to see a summary of your responses when you are done.


8) From: Ross Koning <koning at ecsu.ctstateu.edu>

I have used multiple (gag) choice, essay, short answer, and other kinds of exam questions. Most of my exams are a mixture of many kinds of questions, some more structured, some less so. I also have diagrams to label, diagrams to draw and label, etc. I like to ask open-ended questions whenever I feel creative enough.

I have also used laboratory exercise worksheets, laboratory reports, oral reports, term projects, and computer simulation results to evaluate student performance.

As there are different teaching styles and different student learning styles evident, I try to provide a variety of pathways to evaluation. As class size gets smaller, my efforts toward diversity in evaluation increase. I am thrilled to be at a teaching/learning university where class sizes are strictly limited and I can really bring up some scientists through a developmental pathway. Creative thinking in evaluation can be done when classes are smaller. In this environment I find that I can evaluate the multidimensional qualities needed in a developing scientist.

I know I have not really answered your question about "which form was most useful", but that is on purpose. To me each form of evaluation has its value and usefulness, and so I aim for diversity to test the mettle in all directions. There is no one form that can adequately test the qualities of a scientist, nor the progress of a student, in my opinion. Even those multiple-choice questions I loathe (because they only test concrete learning and short-term memorization skills out of the context of true science) have their use. If you wish to assess current facility with vocabulary, then this tool is appropriate. If you want to test the ability to think, then an expository writing evaluation is one possible way... a project can lead to evaluation of inquiry skills... sketching and diagram labeling can test the visual/cognitive connection for visual learners... oral reports can test the organizational and communication skills of the student. A sustained independent study project over several semesters can be used to evaluate the progress of the student as a true scientist. So I really don't like tests that are all one type of evaluation unless they are supplemented with evaluations of other types in the course.

The structures of three of my courses, including sample exams and other evaluation instruments, are outlined on my WWW pages at the URL in my signature file below.
http://koning.ecsu.ctstateu.edu/



9) From: Gunnar.Fridborg at fysbot.uu.se (Gunnar Fridborg)

We are talking about the evaluation and assessment of students' knowledge. In addition to ordinary written exams I have some experience, nothing unusual, but it might be useful to you, so I would like to contribute:

a) Oral exams with students in groups of two. During a course called Biology 2, with biochemistry, cell biology and organismal physiology (zoological and botanical), we have a written exam after each part of the course (four exams) and end with an oral exam, preferably with two teachers and two students, but sometimes only one teacher because of lack of time. These oral exams have two advantages: 1) they can be done in three days and you do not have to correct 100 written exams; 2) you can give more general questions and test the students' understanding. The students are nervous, but the exams are often quite pleasant dialogues. The teachers are also nervous the first time, since it is impossible to be an expert on all of biochemistry, cell biology and physiology - that is why it is better with two teachers. The marks from the oral exam are combined with those from the written exams for the final grade.

b) A surprise written quiz on the practicals near the end of the course. This was part of a pedagogical experiment, with the purpose of evaluating the value of the practicals, which I hope to be able to present at the SPPS Congress at Ultuna this summer. It covered not so much the technical content of the practicals, but rather their conclusions and the theory. This quiz did not count towards the final mark, but one could just as well have counted it.

c) A retention test ca. six months after the end of the course. I have done several of these, the latest in connection with the above pedagogical experiment, where the students (one test group and a control group) were given the same questions on the practicals as well as some questions on theory. (The test group had had only one lab per group, whereas the control group had had the usual number of practicals. In both groups all the practicals had been reported to all the students.) An earlier retention test, with course-related as well as more general questions, was carried out together with plant physiology courses in Linkoping, Uppsala and Umea. The purpose was to compare what one could call 'quality in teaching'. Unfortunately the investigation was incomplete, but it gave me some quite interesting results. However, surely such retention tests cannot be used for grading the students?

One problem we have had is our often large and short part-courses (80-100 students for four weeks for a basic course in plant physiology; editor's note: a course is full-time, with no parallel courses), which makes it almost impossible to form an impression of every single student's performance during the course: in seminars, practicals, etc. This also means that one has to think carefully before sitting down to write an exam, so that the workload of correcting and evaluating it does not become prohibitive. I think that there are computer programs with self-correcting tests?


10) From: "Christopher T. Cole"=20
<colect at CAA.MRS.UMN.EDU>

This note is to second Ross Koning's thoughtful reply about assessment, particularly his emphasis on using a variety of devices for assessing student learning, and to make a very few suggestions.

First, students respond very positively to having some role in determining how the assessment takes place. What you can do may be limited by the size of the class, etc., but students generally appreciate having a say in how many tests, quizzes, lab reports, term papers, etc. they have; when they are scheduled; how much each should count for; and so on. At the beginning of each term, I ask students to think about these questions; depending on the course and its size, I set some limits on how many tests, term papers, etc. They also take it well when you _seriously_ ask for and listen to their feedback on how methods of assessment or instruction are working. This has to be genuine - they can spot a phony effort before it's even finished, but a genuine request for their opinion (and action on it, too) goes a long way. After all, don't we want to be assessed, too?

Second, I'm going to make a plug for a different kind of multiple-choice question, the kind that does not merely test recall (I usually throw a few of those on a test, too, in part because students usually are comfortable finding some there, and recall of content is part of what we're aiming at) but tests the ability to analyze problems. During the term, I have students work on multiple-choice problems (on problem sets or in-class tests) that present an experimental design and its results, and then ask the student to interpret the results and select one or more answers from a list.

The _simplest_ example of one of these would be to give a picture (photo or diagram) of a DNA agarose gel and have the student select the correct restriction map out of five possibilities. They can get a lot more creative than that.

These problems tend to be quite hard. Their main pitfall is that a student can understand, say, 85% of the material necessary to answer the question but still get it wrong - and do so on a bunch of questions, thereby getting a low score that poorly reflects his or her understanding. Consequently, I use these sparingly on tests (where there is time pressure) and more on take-home quizzes/problem sets, where there is more time and each question is worth fewer points.

Third, I explain to the students that the tests are designed to test both recall and understanding, so they know there will be different kinds of questions. Also, quite importantly, I give lots and lots of "practice" problems of the latter type (e.g. predict the results of an experiment, or interpret results) so that the general idea is familiar to them before tests and they are comfortable with the mix of formats.

Fourth, my courses emphasize (more than most) the "how" of science; that is the reason for all the work predicting and interpreting experimental results. We also discuss experimental methods, design, and results in class more than most courses do. But I also emphasize that science gets done by people working together, and I have several methods that aim to build the skills of teamwork. There's a whole bunch of literature on collaborative learning that I won't summarize, but I would emphasize the importance of (a) mutual interdependence, (b) individual accountability, and (c) structured methods as keys to teaching this way. We start working on this the first day, and students learn some skills of working together. They use these skills through the quarter (e.g. working on specific problems), and have to apply them to lab reports, which are authored by the teams that conducted the exercises in lab (each set of exercises takes a month or more to conduct). The reports are graded by me but also include peer scores, whereby each team member has a fixed number of points that are allocated to team members according to her/his assessment of each member's contribution to the report.

Fifth, I frequently include options: "from the following, select two" types of questions. I think it's more psychological than anything else, but having a bit of choice takes some of the pressure off of students.

Finally, I try to make each test include at least some new material, so that the students learn something new by taking the test - say, some material that connects ideas from class. One example would be a question presenting experimental results on the distribution and function of C3 and C4 plants at different altitudes (etc.) and having the students interpret the results - it brings in the various information they have about water-loss efficiency, etc.

Above all, students want to feel they are treated with respect and fairness. If you operate on that as a gut instinct, you will certainly make mistakes and certainly will learn from them. If not, you will just make mistakes.

'Nuff said. Time to go give an exam. All of these methods have drawbacks, and all are to some degree stolen from others.


11) From: "David J. Oliver" <doliver at iastate.edu>
This year I am teaching freshman biology to 200=20
first and second year students.  There is no good=20
assessment mechanism for this many students.  I=20
give five multiple choice tests and weekly quizes. =20
There is not much else I can do.  I like the class=20
and the students, but I am not comfortable that=20
you can work with this number of students and=20
evaluate their performance in any reasonable=20
manner.
        For the last 15 years I taught the upper=20
level/graduate level biochemistry course of about=20
40 to 50 students.  These students were given=20
three or four written tests each year.  These were=20
tests where the answers ranged from a one=20
paragraph summary or analysis of a concept to=20
calculations and analysis.  This is a much better=20
way of assessing students, but is very demanding=20
of time.  I also had one or two major projects. =20
These were either research proposals or literature=20
reviews where the students needed to prepare a 5=20
page paper on a biochemical topic of their choice.
        I also taught a small (10 to 15 student)=20
plant biochemistry course of graduate students. =20
This course was assessed in a number of ways.  I=20
am convinced that the best thing to do is to have=20
the students become so knowledgeable in a subject=20
that they can teach that subject and this is the=20
approach I used the last time.  I spent the first=20
half of the semester doing an overview of those=20
bits of biochemistry that are unique to plants.=20
The second half of the semester, the students each=20
did two chapters of the Biochemistry Issue of The=20
Plant Cell.  They were required to present a=20
lecture providing a background and then a review=20
of the subject.  The final test was written by the=20
students (they each wrote a question or two on=20
their chapters and then provide the answers).  It=20
did not work as well as I would like, because the=20
quality of the presentations were not as good as I=20
would
like, but the students thought that they learned a=20
lot about their individual subjects.  It was an=20
okay first attempt, but I would spend more time=20
working to get the quality of the student=20
presentations better so that the others in class=20
did not suffer quite so much.

If you come up with any great ideas for working with 200 students, pass them along.


End of the responses


Thank you all for your contributions! I am considering compiling an edited and organized list of student assessment forms in May-June.

Dr. Ian Max Møller
Department of Plant Physiology
Lund University
Box 117
S-221 00 Lund
Sweden

Tel. +46-46-222 7789
Fax  +46-46-222 4113
E-mail  ian_max.moller at fysbot.lu.se


