#MICER17 Participant Reflection 4

Have you thoughts to share on #MICER17? If so, please send me a link to your own blog or the text to post it directly onto this website. Reflections are welcome from those who attended on the day and those who followed the conference on Twitter. 

From: http://www.possibilitiesendless.com/

“Methods in Chemistry Education Research 2017 was last week. I didn’t make it down to London for it but did submit a poster and followed along a bit on Twitter.

I’ve been to a few of these events, starting with ‘Getting Started in Pedagogic Research’ in 2012; I missed one in 2013, showed up in 2014, then attended MICER 2016 last year. I’ve not yet made my way fully through the Storify of MICER 2017 or the other reflective posts, but I have a few thoughts on the necessity of such meetings…” Continue reading at

http://www.possibilitiesendless.com/2017/05/micer-2017/

#MICER17 Participant Reflection 3

From: https://handwavingchemistry.wordpress.com/

“On Friday 19th May, it was the second annual Methods In Chemistry Education Research conference, held once more at the Royal Society of Chemistry in London and run by Michael Seery. Again, the conference was a great opportunity to learn about and discuss the tools, methods, and philosophies used when conducting research into chemistry education. It was particularly great to see an increasing number of chemistry education researchers at the postgraduate level, and there will be a specific satellite meeting for these folks at ViCE/PHEC in August!”

Patrick has a series of posts planned – at the moment you can read:

An overview summary (tl;dr): https://handwavingchemistry.wordpress.com/2017/05/22/micer17-conference-reflections/

Reflections on Suzanne Fergus: https://handwavingchemistry.wordpress.com/2017/05/22/micer17-reflection-1-suzanne-fergus/

#MICER17 Participant Reflection 2

From: https://mistrygroup.wordpress.com/

“…The Royal Society of Chemistry launched the Methods in Chemistry Education Research conference last year to discuss the practicalities of performing chemistry education research. I was able to attend this year’s offering, even presenting an online poster. Here’s what I enjoyed and what I’ll take back with me to Leeds…”

Continue reading on: https://mistrygroup.wordpress.com/2017/05/22/methods-in-chemistry-education-research-2017/

#MICER17 Participant Reflection 1

From https://dave2004b.wordpress.com/

“At the sumptuous RSC Library at Burlington House, we gathered for Methods in Chemistry Education Research 2017, a day of lectures, activities and catching up with friends and colleagues. From school teachers to a Professor Emeritus, we gathered with a common purpose – to spend a day thinking about methods in chemical education research…”

[continue reading at: https://dave2004b.wordpress.com/2017/05/20/micer17-review/]

#MICER17 Reflection

My own reflections on MICER17:

Two related themes emerged for me from the Methods in Chemistry Education Research meeting last week: confidence and iteration.

Let’s start where we finished: Georgios Tsaparlis’ presentation gave an overview of his career studying problem solving. This work emerged out of Johnstone’s remarkable findings around working memory and mental demand (M-demand) [1,2]. Johnstone devised a simple formula: if the requirements of a task were within the capability of working memory, students would be able to process the task; if not, they would find it difficult. This proposal was borne out by plots of performance against complexity (demand), which showed a substantial drop at the point where M-demand exceeded working memory capacity, and these findings seeded a remarkable amount of subsequent research. Continue reading at http://michaelseery.com/home/index.php/2017/05/reflections-on-micer17/
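Johnstone’s rule can be sketched informally as a step criterion (the notation here is mine, not Johnstone’s exact formulation), writing Z for the student’s working-memory capacity and M for the M-demand of the task:

```latex
% Informal sketch of Johnstone's working-memory predictor.
% Notation assumed for illustration: Z = working memory capacity, M = task M-demand.
\[
  \text{expected performance} =
  \begin{cases}
    \text{high (task can be processed)} & \text{if } M \le Z,\\[4pt]
    \text{sharp drop} & \text{if } M > Z.
  \end{cases}
\]
```

This is what the performance-versus-demand plots show empirically: performance holds roughly steady until demand crosses the working-memory threshold, then falls substantially.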

Evaluating (and changing) classroom practice

In this pre-reading, Orla Kelly introduces action research, which is often a starting point for educational research. There are a few places for #MICER17 remaining: register here

How effective our classroom practice is can be evaluated in lots of different ways. Traditionally, there are two elements which need to be considered: formative and summative. Formative evaluation is the kind we use day to day and week to week. It informs our teaching and can be described as assessment for learning: it enables us to review our students’ learning and our teaching, as well as providing opportunities for students to reflect on their own learning. Summative evaluation is the kind we use at the end of a block of work, module or programme. It summarises the learning and can be described as assessment of learning: it enables us to state the progress our students have made towards pre-determined learning objectives (either internal or external). Teachers should be doing both to ensure their teaching and the students’ learning are fit for purpose.

However, the traditional emphasis on end-of-module or end-of-year examinations places much more weight on summative evaluation. This can mean students have no sense of how well they are doing, or how well they understand concepts and techniques, until after they receive their examination result (which may of course be too late!). Equally, the teacher may have no sense of how successful (or not!) their teaching was until the module is over, and in the best-case scenario lessons learned will be actioned in time for the next cohort. Fortunately, the growing acceptance of continuous assessment in higher education and the use of digital technologies mean that students now have more opportunity to share their learning and reflect on the content of their modules during the course. This allows both them and the teacher to identify areas that need more attention. This type of assessment should be ongoing and assumed to be part of all teaching and learning.

Sometimes, as a result of formative (or summative) evaluation, teachers see students consistently struggle with a particular concept or technique, or find that while particular elements of the desired learning are being achieved, other areas such as co-operative group work or communication have been neglected. This often leads teachers to take action to change their practice and, hopefully, improve their teaching and the students’ learning. This is where action research comes in.

What is Action Research?

According to Hendricks (2006), “democratic workplaces produce employees that take ownership of their work, which increases both morale and productivity” (p. 6). It was with this vision in mind that social psychologist Kurt Lewin (1946) first proposed the concept of action research: a research methodology where practitioners directly engage in the systematic inquiry of self-identified issues. Known as the father of ‘action research and planned change’ (Schein, as cited in Burnes, 2004, p. 978), Lewin (1946) referred to the process as one where knowledge gained from one inquiry becomes the impetus for additional inquiry; “a spiral of steps, each of which is composed of a circle of planning, action, and fact-finding about the result of the action” (p. 209).

Teacher Action Research

In this session, I will share my own action research story in chemistry education, considering some of the theory that underpins it and the kinds of methods we can use. I have copied below extracts from two research papers: one from a literature review on the use of action research in higher education (Gibbs et al. 2017), and the other an example of a recent action research project in chemistry education (Vogelzang & Admiraal 2017). Both of these extracts link to the story I will share during the session. When reading:

  • Consider if and how you have introduced alternative or new teaching approaches (pedagogies) in your setting.
  • Reflect on what methods or tools you used to evaluate these new approaches, if appropriate.
  • Consider what your motivation was behind introducing these new approaches, if appropriate.

Pedagogies

In a more confined arena, Action Research (AR) is used to evaluate attempts to introduce critical pedagogies in teaching into higher education settings (Baptist and Nassar 2009; Guy Wamba 2011; Humphries-Mardirosian, Irvine Belson, and Lewis 2009; Taylor and Pettit 2007). Whilst many of these studies contribute interesting discussions of innovative teaching and theoretically rooted pedagogies, there is considerably more emphasis on the exploration of critical pedagogy per se and how it links with the choice of AR as an emancipatory research method. How AR is utilised as a research method – how data are collected and analysed, how positionality and bias are negotiated, and how the AR spiral/cycle is enacted, and so on – often goes unexplored, leaving open any questions on rigour and reliability of the findings. AR often appears to be used as a tool to encourage critical reflection rather than to be reflexive (Kinsler 2010), and to increase professional efficacy in such instances rather than to serve as a research method.[Gibbs et al 2017 p. 6]

This study can be understood as classroom action research (CAR) in which the first author studied formative assessments with two of his classes in a chemistry context-based course on lactic acid. CAR can be seen as a combination of practitioner inquiry (Orland-Barak 2009), teacher research (Cochran-Smith and Lytle 1990, 1999; Zeichner 2003) and technical action research (Kemmis 2009), resembling a method of finding out what works best in an individual’s specific context to improve student learning (Mettetal 2001). CAR fits in the centre of a continuum ranging from personal reflection at one end to formal empirical educational research at the other. CAR is more systematic and data-based than personal reflection, but it is more informal and personal than formal educational research (Mettetal 2001). The goal of CAR is to improve a teacher’s teaching in their own classroom (or department or school) (Mettetal 2001). While there is no requirement that the CAR findings be generalized to other situations, as in traditional ‘positivistic’ research, the results of CAR can add to the knowledge base (Mettetal 2001). CAR goes beyond personal reflection to use informal research practices such as a brief literature review, group comparisons and data collection and analysis (Mettetal 2001, Zeni 1998). [Vogelzang & Admiraal 2017 p. 158]

References

Gibbs, P., Cartney, P., Wilkinson, K., Parkinson, J., Cunningham, S., James-Reynolds, C., Zoubir, T., Brown, V., Barter, P., Sumner, P., MacDonald, A., Dayananda, A. & Pitt, A. (2017) Literature review on the use of action research in higher education, Educational Action Research, 25:1, 3-22, DOI: 10.1080/09650792.2015.1124046

Teacher Action Research (No date) http://teacherasresearcher.blogspot.ie/ (accessed 27th April 2017)

Vogelzang, J. & Admiraal, W.F. (2017) Classroom action research on formative assessment in a context-based chemistry course, Educational Action Research, 25:1, 155-166, DOI: 10.1080/09650792.2016.1177564

Poster: Do STEM teachers praise intelligence more than arts/humanities teachers?

As well as general feedback on our methods, we would be interested in discussing two questions.

  1. Would this project benefit from a ‘big data’ approach?
  2. Given the differences between science/maths and tech/engineering at a school level would it be better to further narrow the question?

You can discuss any or all of these points in the discussion below, or contact Kristy and Scott on Twitter (@doc_kristy, @Sci_DrScott). Remember to tag any conversations with #MICER17.

We have written our first question to compare ‘STEM’ teachers with teachers of arts and humanities. Technology and engineering at school level are very different from science and maths (they contain a significant coursework portion and place less emphasis on a final examination). They also have far smaller numbers of pupils, as they are option subjects. Would we be better to look at only science and maths teachers (in this data set there would be reports for biology, chemistry, physics and maths)?