Poster: To what extent do microscale activities help students develop conceptual understanding?

Discussion questions

(Discuss via Twitter – remember to use #MICER17 – or in the comments below)

  • Do students have a better grasp of chemical concepts having experienced chemical observations / measurements through a microscale practical activity compared with a ‘standard’ practical activity?
  • Is there a novelty effect around microscale activities or is there a sustained benefit for teaching AND learning of chemistry?

7 thoughts on “Poster: To what extent do microscale activities help students develop conceptual understanding?”

  1. So how would we measure this? It might be easier to start with how not to measure it. We definitely don’t want to simply run experiments at microscale, test the kids on the concepts and compare assessment scores. Measuring understanding is notoriously tricky: you have to be completely sure that your assessment items are actually measuring it and cannot be bypassed by memory or by repetition of lower-order skills.

    I wonder if you could turn the question on its head, in the light of recent evidence that teacher direction/knowledge/talk is important in driving rates of progress. What effect does microscale experimentation have on the teacher’s behaviour? How much time do they spend with each pupil, and how many pupils are they able to connect with in a lesson? You could classify the teacher interactions, e.g. behaviour-management interactions, answering student procedural questions, answering student conceptual questions, directing conceptual questioning.

    1. Thanks Kristy. I agree that a simple ‘use microscale experiment, then test understanding’ approach would be unlikely to yield any reliable information. I’m thinking about whether we can tease out longer-term impacts (over a term, say), where some form of assessment can show a link between the use of microscale and conceptual development.

      I like the idea of a focus on teaching interactions. One clear benefit of microscale is the reduction in time-to-results, which frees up more time for discussion in class (peer / group / teacher-student), and that discussion helps with conceptual development. In this sense the question moves away a bit from my original one, in that you are looking at microscale in the wider context of the lesson.

      Perhaps my original first question is falling between two stools: i) the use of microscale to narrow the area of focus, allowing ‘better’ observation / measurement by the student, and ii) the use of microscale to maximise the opportunity for discussion and conceptual development.

  2. I think for a first exploration into this kind of qualitative research it is best to have quite a tight question. If you wanted to stay with the assessment-impact theme, you could tighten your question to evaluating the impact of microscale practical work on students’ achievement in assessments covering the related concepts. That way you’re not claiming to make a judgement on the mythical beast that is understanding.

  3. Thanks for getting the ball rolling. An interesting topic and, as you say, it throws up lots of questions.

    For research, I would be interested in seeing how students explain chemical concepts as a result of a microscale activity. How do they verbalise what they observe, and how do they relate the micro and macro levels? Perhaps you could get them to draw what is happening at a particulate level and describe what they have drawn. It would be interesting to see what level of understanding of chemical concepts is emerging. Despite the reluctance to assess, there are some pretty neat diagnostic assessment instruments out there.

    As an aside, does this belong in “practical work” classroom time? Woolnough argued that practical work should not be ‘subservient’ to teaching concepts but rather be a separate identifiable part. Is microscale “practical” as we know it, in terms of development of psychomotor skills?

  4. Thanks for the comment Michael. Certainly inferring understanding from particulate diagrams, written explanations and so on would form part of my evidence base. I’m all for continual assessment in teaching – not all summative, of course, but that constant assessment of what students are saying and doing, trying to pin down as well as we can the ultimately unknowable learning that’s going on in their heads!

    Do you have particular links to diagnostic assessment instruments that have been validated?

    For me, practical work falls into many categories – I find Peter Main’s characterisation a useful starting point when thinking about why I use practical work (SSR, 2014, 95(352), 46 – https://www.ase.org.uk/journals/school-science-review/2014/03/352/3570/ssr-march-2014-046-052-main.pdf). Certainly we should expect students to develop psychomotor skills. Learning particular apparatus and techniques is important in its own right at KS3/GCSE/A Level (including for assessment purposes), although I expect many of those specific apparatus and techniques will be superseded once students progress to HE, industry and so on. Crucially, however, they are developing the skills of manipulating objects and making accurate observations, something that can really only be developed through hands-on experience and practice. Once they have that ‘base-load’ of competence, learning other apparatus and techniques becomes easier.
