“Assessment drives learning” raises wry smiles (and occasional heckles) whenever it is mentioned. It is, however, unfortunately the case that the ‘encouragement’ to understand and learn comes from the need to demonstrate that new knowledge in some form of test. In postgraduate medical education these tests are specialist exams which are required in order to progress to more senior stages of training. The scope of information needed is huge (and the exams are often split into various sections to make them manageable!). The curriculum describing that knowledge is similarly large, and there is great heartache as you scroll through 50+ pages of bullet points, each starting, “the trainee will be required to know…”.
By necessity curricula are bland, but what has become increasingly apparent to me is the large blocks of ‘grey’ knowledge that appear to be missing from them. I say ‘them’; I can only really speak for paediatrics, but I am given confidence in this assertion by the ever-increasing momentum of the #FOAMed movement. Chris Nickson has previously eloquently explained why #FOAMed itself doesn’t need a curriculum. However, while assessment drives learning, so does experience. It’s quite clear from the huge amount of material posted daily on a variety of clinical topics that there is a thirst for information that is not readily available from common reference sources. I’ll use the #SMACCGold conference to demonstrate this point.
Let’s take the section on statistics in the Royal College of Paediatrics and Child Health (RCPCH) emergency medicine curriculum.
To be fair, this is clear, practical knowledge that is needed by clinicians. But how does this translate into actual practice? How will we use it to provide better patient care? What will convert these concepts into something clinicians can practically deliver? I think the answer comes from Prof. Simon Carley’s two excellent lectures at #SMACCgold: one on what to believe and when to change, and one on risk factors in the ED.
As another example, paediatricians are involved in extremely challenging situations when resuscitating seriously ill children and sometimes need to make decisions about when that resuscitation should end. Is there a curriculum that defines and aids this process? It appears not, but Cliff Reid’s talk at SMACC is certainly a great example of how this topic could be approached.
Finally, I scanned through the College of Emergency Medicine curriculum for “chronic pain” and struggled to find anything. However, with over 1000 views so far, this probably fits the bill:
It would be harsh to say that those writing curricula have been lax in their task; it’s more that some topics are not easily encapsulated. They are often hidden, or camouflaged, between more clear-cut points.
This is where #FOAMed has come to the rescue. A variety of brilliant minds around the world have been able to recognise knowledge they needed but which just wasn’t readily available (a recent talk is entitled #paedstips you won’t find in books!). Ultimately #SMACCgold stands as a testament to revealing this camouflaged curriculum content (and I am sure more will be revealed at #SMACC2015). It may well be the case that assessment drives learning, but perhaps those in charge of medical education may want to use #FOAMed to drive curricula…
Thanks Damian – I think this fits with the ‘community as curriculum’ model of rhizomatic learning proposed by Dave Cormier: http://davecormier.com/edblog/2008/06/03/rhizomatic-education-community-as-curriculum/
If you haven’t read it yet I think you will enjoy it. The conversations that come up in #foamed about assuring quality and the need for a curriculum are quite fascinating.
A lot of us in the community are wrestling with questions which I think Dave and others can help with.
The conclusion of that paper:
“In a sense, the rhizomatic viewpoint returns the concept of knowledge to its earliest roots. Suggesting that a distributed negotiation of knowledge can allow a community of people to legitimize the work they are doing among themselves and for each member of the group, the rhizomatic model dispenses with the need for external validation of knowledge, either by an expert or by a constructed curriculum. Knowledge can again be judged by the old standards of “I can” and “I recognize.” If a given bit of information is recognized as useful to the community or proves itself able to do something, it can be counted as knowledge. The community, then, has the power to create knowledge within a given context and leave that knowledge as a new node connected to the rest of the network.
Indeed, the members themselves will connect the node to the larger network. Most people are members of several communities—acting as core members in some, carrying more weight and engaging more extensively in the discussion, while offering more casual contributions in others, reaping knowledge from more involved members (Cormier 2007). This is the new reality. Knowledge seekers in cutting-edge fields are increasingly finding that ongoing appraisal of new developments is most effectively achieved through the participatory and negotiated experience of rhizomatic community engagement. Through involvement in multiple communities where new information is being assimilated and tested, educators can begin to apprehend the moving target that is knowledge in the modern learning environment.”
As always, Anne-Marie, thanks not only for taking the trouble to read but also for commenting!
I am working on a different approach to evaluating #FOAMed, as I think we are currently being constrained by ‘typical’ constructs. This is certainly one such approach.
While I think the ‘quality assurance’ question is an important one for the community, I am certainly NOT for a FOAM curriculum. I think, as presented, #FOAMed provides a great opportunity to inform curricula but certainly shouldn’t be constrained by one itself.
I think we are on the edge of really demonstrating that delivering medical education is not only a ‘physical’ concept but also a shared network process.
I’m also thinking about quality assurance in terms of curation here in our projects with medical students in Cardiff. My hunch is that learners are best at deciding on ease of access/appropriate level/relevance, but that experts/teachers are best at knowing quickly whether content is accurate (no matter the source). So together they can curate, aka crowdsource, excellent content. Of course, we are talking about #foamed content here.
Wow. This is very compelling stuff. I wonder if FOAM would respond well to a contract grading approach to quality control. I’ve had some luck with it. http://davecormier.pressbooks.com/chapter/avoiding-resistance-to-grading-contracts/
Hmmm – thanks @Dave. But in #foamed who is the contract between? #foamed content isn’t usually produced as the result of a task on an assessed course. It’s made by enthusiastic individuals who want to share what they know and have learnt. So is the contract between that individual and the whole community? Should the producer try to self-assess and say why they think they have done a good job, and the community then respond on those terms? Of course the needs of others in the community might be very different. They might see other strengths and weaknesses in what the producer has made. And that is where I come in with being so interested in the use of curation platforms like Scoop.it – because you are able to say why a given piece of #foamed is of use to you and why others close to/like you might also find it useful. How does that model fit with contract grading?
By the way we have no assessment of our curation project – it is all voluntary.
I think there’s something interesting about a self-surveyed contract based on a community-developed template. It could serve as a valuable onboarding strategy for new members and offer them some tools to potentially explain/describe/validate their learning to those outside the community who might need that.
Damian: Great points! Another parallel element is the tacit knowledge that is required to be a competent physician. Not only would a comprehensive curriculum blueprint of the knowledge required for any specialty be a fool’s errand (knowledge evolves, and practices are typically idiosyncratic), but such an attempt would also fail to articulate the tacit knowledge that grounds competent practice. In the end, a curriculum should articulate key abilities (including the ability to learn and to identify knowledge gaps in practice). The era of a physician being defined by the content they know has, I think, been replaced by a competency-based era, where a physician is defined by their abilities (i.e. outcomes).