
Trending in your community

Curated by 158,000 medical professionals.

Developing Resilience at Work

Don't wait for others to change things for you; do what you can now, especially if that means stopping.
Dr Dee Gray
over 1 year ago

Patients or Patience?

Walking into a cubicle, introducing myself, asking a patient's permission to hear what brought them into hospital, and then examining them. Sounds simple enough, until you're in A&E, the patient is seriously ill and you're the first person to see them. I learned the hard way this week, spending an hour with a patient only to realise that they were so confused (in the medical - not the academically challenged - sense) that the history I had taken was essentially null and void. It was their partner and carer who would provide the history that allowed qualified members of the healthcare profession to attempt to make their loved one better.

Lesson 1: sometimes the patient isn't the best person to tell you what's wrong with them. Lesson 2: sometimes they are.

On the flip side, some patients LOVE a good chinwag! They'll tell you everything about their health, family and day-to-day life without a moment's pause, and it takes some guts to interrupt them mid-flow. Despite the obvious time constraints, these are my favourite interactions. I often wonder how I find myself in such an honourable position. Why do people feel they can share so much of their life story with me? Some laugh, some cry, others just want to vent their frustrations. Either way I'm there, I'm listening, and most importantly I'm learning.

Written by Chantal Cox-George, 3rd Year Med Student at University of Bristol
Chantal Cox-George
over 3 years ago

Make families part of the medical team

Families know more about their loved ones than medical professionals can ever know or have time to learn. Involving families can improve diagnoses, care and outcomes.  
Bonnie Friedman
over 1 year ago

One ‘Buzzword’ at a Time

The US Healthcare System Briefly Explained.  
Catherine Bruce
about 1 year ago

Protein Translation 1

Visit us for health and medicine content or for MCAT...
over 2 years ago

Clinics - Making the most of it

Commencing the first clinical year is a milestone. Things will now be different as your student career steers into the uncharted waters of clinical medicine. New challenges and responsibilities lie ahead, and not just in an academic sense. After all, this is the awaited moment: the start of the apprenticeship you have so desired and laboured for. It won't be long before these clinical years, like the preclinical years before them, seem just as distant and insular, so why not make the most of them?

The first days hold so much excitement and promise, and for many they deliver; however, it would be wise not to be too optimistic. I am afraid your firm head standing abreast the doors in a prophetic splaying of arms is an unlikely sight. In this new clinical environment, it is natural to be a little flummoxed. The quizzical looks of doctors and nurses as you first walk in, a sure sign of your unexpected arrival, are a recurring theme.

If the wards are going to be your new hunting ground, proper introductions with the medical team are in order. This might seem like a task of Herculean proportions, particularly in large teaching hospitals. Everyone is busy: junior doctors scuttle around the ward desks, job lists in hand; the registrar probably won't have noticed you; and, as luck would have it, your consultant firm head is away at a conference. Perseverance during these periods of frustration is a rewarding quality. Winning over the junior doctors with some keenness will help you no end. What I mean to say is that their role in our learning as students extends further than the security of sign-off signatures a week before the end of the rotation. They will give you opportunities. Take them! Although it never feels like it at the time, being a medical student does afford some privileges.
The student badge clipped to your new clinic clothes is a licence to learn: to embark on undying streaks of false answers, to fail as many skills and clerkings as is required, and to do so unabashed. Unfortunately, the junior doctors are not there purely for your benefit; they cannot always spare the time to directly observe a history taking or an examination, so instead you must report back. With practice this becomes more of a tick-box exercise: gleaning as much information as possible and reconfiguring it into a structured presentation. However, the performance itself goes unseen and unheard, and I do not need to reiterate the inherent dangers of this practice. Possible solutions? Immediate feedback is more obtainable on GP visits or at outpatient clinics, which provide many opportunities to test your questioning style and bedside manner; performing under scrutiny also recreates OSCE conditions. Under time pressure, and no doubt with the diagnostic cogs running overtime, it is fatefully easy to miss emotional cues or derail a conversation in a way that would be deemed insensitive. Often this occurs subconsciously, so take full advantage of a GP's or a fellow firm mate's presence when taking a history.

Self-directed learning will take on new meaning. The expanse of clinical knowledge has a vertiginous effect. No longer is there a structured timetable of lectures as a guide; for the most part you are alone. Teaching will become a valued commodity, so no matter how sincere the promises, do not rest until the calendars are out and a mutually agreed time is settled. I would not encourage ambushing staff, but taking the initiative to arrange dedicated tutorial time with your superiors is best started early. Consigning oneself to the library and ploughing through books might appear the obvious remedy; it has proven effective for the last two or three years, after all. But unfortunately it cannot all be learnt from bookwork.
Whether it is taking a psychiatric history, venepuncture or reading a chest X-ray, these are perishable skills, and only repeated and refined practice will make them second nature. Balancing studying with time on the wards is a challenge. Unsurprisingly, after a day spent on your feet, there is wavering incentive to so much as open a book. Keeping it varied will prevent staleness taking hold: attending a different clinic, brushing up on some pathology at a post-mortem or joining group study sessions adds flavour to the daily routine. During the heated weeks before OSCEs, group study becomes very attractive. While it does cement clinical skills, do not be fooled: your colleagues tend not to share the examination findings you would encounter on an oncology ward, nor the measured responses of professional patient actors.

So ward time is important, but little of this clinical information will be absorbed by assuming a merely watchful presence. Attending every ward round, while a laudable achievement, will not secure the knowledge. Senior members of the team operate on another plane. It is a dazzling display of speed whenever a monster list of patients comes gushing out of the printer: before you have even registered each patient's problems, the management plan has been dictated and written down. There is little else to do but feed off scraps of information drawn from the junior doctors on the journey to the next bed. Of course there will be lulls, when the pace falls off and there is ample time to digest a history. Although it is comforting to have the medical notes to check your findings against once the round is over, it does diminish any element of mystery. The moment a patient enters the hospital is the best time to cross paths: at this point all the work is before the medical team, and your initial guesses might be as good as anyone else's. Visiting A&E of your own accord, or as part of your medical team's on-call rota, is well worth the effort.
Being handed the initial A&E clerking and gingerly drawing back the curtain incurs a chilling sense of responsibility. Embrace it: it will solidify not only your clerking skills but also put into practice the explaining of investigations, results and treatment options. If you are feeling keen, you could present to the consultant on the post-take round. Experiences like this become etched in your memory because of their proactive nature; you begin to remember conditions by the patient cases you have seen rather than by their corresponding pages in the Oxford handbook. And there is something about the small thank you from the F1, or perhaps finding your name alongside theirs on the new patient list the following morning, which rekindles your enthusiasm. To be considered part of the medical team is the ideal position and a comforting thought. Good luck.

This blog post is a reproduction of an article published in the Medical Student Newspaper, Freshers 2013 issue.
James Wong
over 3 years ago

The Nosology of Descriptive Psychopathology from a Philosophical Perspective

In initial interviews with patients who suffer psychotic symptoms, it can be striking that the terminology of descriptive psychopathology rests on an arbitration of 'truth': terms like delusions and hallucinations are defined as false beliefs and false perceptions (Casey & Kelly 2007). These terms can annihilate the value of the patient's experience, which may place an initial strain on the egalitarian patient-doctor relationship. In an era where deference to experts is dead, it might be worth agreeing on the effect of these experiences before labelling them. Delusions cannot be objectively detected and described, because they evolve and exist within subjective and interpersonal dimensions. Severe psychopathological symptoms share the fact that they are statistically deviant, and thus can be labelled 'unshared'. Symptoms may be perceived as 'distressing' and may be 'disabling' to the patient, and the behaviour that raises concern can be labelled 'dysfunctional' (Adams & Sutker 2004). Jaspers challenged the notion of defining 'delusion' as a false belief; he considered the lack of understandability of how the patient reached the conclusion to be the defining feature of a delusional idea. Sims gives the example of a man who believed his wife was unfaithful to him because the fifth lamp-post on the left was unlit: what makes it a delusion is the methodology, not the conclusion, which may even be right (Sims 1991). Some delusions are mundane in their content; others may not be falsifiable. Dereistic thinking is based not on logic but on feelings. It is possible to find ways to evade falsification, and ad hoc hypotheses may also be part of the presentation. Fish stated that delusional elaboration may follow delusion and/or hallucination, which converges with the concept of the ad hoc hypothesis.
Absence of verification from the patient's side does not amount to deductive falsification (Casey & Kelly 2007). Otherwise, the doctor-patient relationship carries the risk of transforming into a detective-suspect relationship, in which the latter may perceive the need to present evidence of innocence. Mental health professionals are usually consulted by people who suffer to various degrees, or who make others suffer, not by people presenting various degrees of conviction. The primary role of the therapist is to alleviate the suffering of others rather than to correct their beliefs. Communicating with patients in terms of how functional their belief is, rather than its truth, may prove more egalitarian and more clinically tuned. This may provide some middle ground in communication, without having to labour over the differences between what is 'true' and what is 'real'. The criterion of demarcation between what is real and what is pathological may be different within the patient-doctor relationship, and the clinician's assertion of the falsity of a belief or experience carries the risk of dogmatism. The statistical deviance of symptoms, their distressing nature, their disabling consequences, the resultant dysfunctional behaviour and the apparent leap from evidence to conclusion may be more agreeable surrogate starting points. This might be more in line with the essence of medicine, 'ars medicina' (the art of healing). Concordance with patients on their suffering may serve as an egalitarian platform prior to naming the symptoms. The term delusion, commonly defined as a fixed false belief, does not merely name a symptom when used by a psychiatrist; it puts the interviewer in the position of an all-knowing judge. After all, a service-user may ask how a doctor who has never encountered or experienced any aspect of the problem, such as being persecuted at work and home, can dismiss it as plainly false.
Then, does the psychiatrist know the truth? From the service-user's point of view, what he or she experiences is real, which is not necessarily the same as true. The same applies to people who lead an average life and go to work bearing their superstitions and beliefs about ghosts, luck, horoscopes, zodiacs, or various revered beliefs. The term risks creating a temporary crack in the mutual sense of equality between the therapist and the service-user, because one side has labelled a certain dysfunctional belief as unreal. It has the potential to subtly shift the relationship, with the mental health professional placing himself or herself in an omniscient position; this contrasts with the essence of medical practice, in which practitioners assume the truth of what patients say about subjective symptoms such as headache. Subsequent labels such as 'bizarre delusions' or 'systematised delusions' further shift the role of the professional therapist towards that of an investigator in the domain and architecture of 'Truth'. Furthermore, the relationship may be strained when the therapist, on the basis of sceptical enquiry, starts explaining away such symptoms. For example, if the service-user believes that Martians have abducted him, implanted a device in his brain and sent him back to earth, and the response communicated back is the label 'delusional', the service-user could argue that a therapist who has never seen a Martian or a brain device has dismissed the whole story as 'delusion' with no intention of checking on the existence of the Martians or the device. In other words, the healer has become the arbiter of truth where both parties lack evidence for or against the whole thing; one member of the relationship has stepped into power on the basis of a subjective view of plausibility, or the lack thereof.
In the case of hallucinations, the clinician's labelling of the patient's experience can impose a fundamental dilemma on the patient. For example, a patient may hear a voice that says everything is unreal apart from the voice, while the clinician says the voice is the one thing that is unreal; neither offers evidence for their 'truth' beyond their statement. The clinician's existence within the patient's subjective reality is distorted by the patient's multiple realities, and arguing on the basis of mere existence that the 'voice' is the one that is 'false' gives the patient no methodology for discerning between the two in future, since perception itself is deceived and/or distorted. In this case, another tool of the mind can be employed to address the patient. The same applies to a concept like 'over-valued ideas', where the clinician decides that a particular idea is 'over-valued', or 'over-valued' in a pathological way. The value put on these ideas is not the patient's valuation but the clinician's evaluation of 'value' and 'pathology', and the cut-off point between 'value' and 'over-value' seems subjective from the clinician's perspective. Likewise, 'derailment' presupposes an expected direction of talk, and concepts such as 'grooming' and 'eye contact' implicitly refer to socio-cultural normative values. Deviation from the normative value is thus reflected back to the patient as pathology, which is an ambiguous definition in comparison with the clarity of organic pathology. Terms like 'dysfunctional unshared belief' or 'distressing auditory perception', or other terms that address the secondary effects of a pathological experience, may help to engage the patient and may be more logically plausible and philosophically coherent, though they require empirical validation of their benefit.
Taylor and Vaidya mention that it is often helpful to normalise, though not to minimise or be dismissive of a patient's delusional beliefs (Taylor & Vaidya 2009). The concept can be extended to cover other terms, such as 'autistic thinking', 'apathy', 'blunting of affect', 'poor grooming' and 'over-valued ideas', which can then be communicated to service-users with minimal deviation from the therapeutic relationship. The limitations of these terms lie in special circumstances such as folie à deux, where a dysfunctional belief appears to be shared with others, and in symptoms such as Charles Bonnet syndrome, which usually has no negative consequences. The proposed terms are not intended to replace well-carved descriptive psychopathological terms; terms like 'delusion' and 'hallucination' remain valuable in teaching psychopathology. In practice, however, meaningful egalitarian communication may require some skill in selecting suitable terms, which is more than simplifying jargon. The proposed terms also carry the burden of adding to psychiatric terminology, with the subsequent effort of learning them, and they can be viewed as 'euphemism' or 'tautology'. However, this has been the case from 'hysteria' to 'medically unexplained symptoms', which seems to match the zeitgeist of an era whose mantra is 'Evidence Based Medicine', regardless of advances in treatment. Accuracy of terminology may be necessary to match the essence of scientific enquiry: systematic observation and accurate taxonomy. The author does not expect such a proposal to be an easy answer to the difficulties of communication in practice. This article may open a discussion on the most effective and appropriate terms to use when communicating with patients. It might also be more in line with an egalitarian approach to seek the opinion of service-users and of the professional bodies that represent them.
Empirical validation and subjection of the concept to testing are necessary; patient care should not be based on logic alone but on evidence. Despite the limitations of this proposal with regard to completeness, it is hoped that the terms introduced here may serve the main purpose of any classification or labelling: accurate, egalitarian communication.

DISCLAIMER: This blog is adapted from the BMJ doc2doc clinical blog 'Philosophical Streamlining of Psychopathology and its Clinical Implications'. The blog is based on an article named 'Towards a More Egalitarian Approach to Communicating Psychopathology', published in the Journal of Ethics in Mental Health, 2013.

Bibliography

Adams, H. E., Sutker, P. B. (2004). Comprehensive Handbook of Psychopathology. New York: Springer Science.
Casey, P., Kelly, B. (2007). Fish's Clinical Psychopathology: Signs and Symptoms in Psychiatry. Glasgow: Bell & Bain Limited.
Kingdon, D., Turkington, D. (2002). The Case Study Guide to Cognitive Behaviour Therapy of Psychosis. Wiley.
Kiran, C., Chaudhury, S. (2009). Understanding delusions. Indian Journal of Psychiatry.
Maddux, J., Winstead, B. (2005). Psychopathology: Foundations for a Contemporary Understanding. Lawrence Erlbaum Associates Inc.
Popper, K. (2005). The Logic of Scientific Discovery. Routledge, United Kingdom.
Sidhom, E. (2013). Towards a More Egalitarian Approach to Communicating Psychopathology. Journal of Ethics in Mental Health, 8(1) (ISSN: 1916-2405).
Sims, A. (1991). Symptoms in the Mind: An Introduction to Descriptive Psychopathology. Baillière Tindall.
Taylor, M., Vaidya, N. (2009). Descriptive Psychopathology: The Signs and Symptoms of Behavioral Disorders. Cambridge University Press.
Dr Emad Sidhom
over 3 years ago

Male Postnatal Depression - a sign of equality or a load of nonsense?

Storylines on popular TV dramas are a great way of raising the public's awareness of a disease. They're almost as effective as a celebrity contracting an illness. For example, when Wiggles member Greg Page quit the group because of postural orthostatic tachycardia syndrome, I had a spate of patients, mostly young and female, coming in with self-diagnosed "Wiggles Disease". A 30% increase in the number of mammograms in the under-40s was attributed to Kylie Minogue's breast cancer diagnosis. The list goes on.

Thanks to a storyline on the TV drama Desperate Housewives, I received questions about male postnatal depression from local housewives desperate for information:

"Does it really exist?"

"I thought postnatal depression was to do with hormones, so how can males get it?"

"First it's male menopause, now it's male postnatal depression. Why can't they keep their grubby mitts off our conditions?"

"It's like that politically correct crap about a 'couple' being pregnant. 'We' weren't pregnant, 'I' was. His contribution was five seconds of ecstasy and I was landed with nine months of morning sickness, tiredness, stretch marks and sore boobs!"

One of my patients, a retired hospital matron now in her 90s, had quite a few words to say on the subject.

"Male postnatal depression -- what rot! The women's liberation movement started insisting on equality and now the men are getting their revenge. You know, dear, it all began going downhill for women when they started letting fathers into the labour wards. How can a man look at his wife in the same way if he has seen a blood-and-muck-covered baby come out of her … you know? Men don't really want to be there. They just think they should -- it's a modern expectation. Poor things have no real choice."

Before I had the chance to express my paucity of empathy she continued to pontificate.

"Modern women just don't understand men. They are going about it the wrong way.
Take young couples who live with each other out of wedlock and share all kinds of intimacies. I'm not talking about sex; no, things more intimate than that, like bathroom activities, make-up removal, shaving, and so on." Her voice dropped to a horrified whisper. "And I'm told that some young women don't even shut the door when they're toileting. No wonder they can't get their de facto boyfriends to marry them. Foolish girls. Men need some mystery. Even when you're married, toileting should definitely be kept private."

I have mixed feelings about male postnatal depression. I have no doubt that males can develop depression after the arrival of a newborn into the household; however, labelling it "postnatal depression" doesn't sit all that comfortably with me. I'm all for equality, but the simple fact of the matter is that males and females are biologically different, especially in the reproductive arena, and no amount of political correctness or male sharing-and-caring can alter that. Depressed fathers need to be identified, supported and treated, that goes without saying, but how about we leave the "postnatal" tag to the ladies? As one of my female patients said: "We are the ones who go through the 'natal'. When the boys start giving birth, then they can be prenatal, postnatal or any kind of natal they want!"

(This blog post has been adapted from a column first published in Australian Doctor.)
Dr Genevieve Yates
over 3 years ago

April 2015 DAS intubation guidelines draft for comment

Kudos to the work of the Difficult Airway Society in the UK. They are updating their intubation guidelines and have released an April 2015 draft for open comment and feedback. Check it out and help them produce a worthy publication for us all! My thoughts on the April draft: cricoid pressure to be removed…
about 2 years ago

Student Credits for Contributing to Online Content

I have some very exciting news to share with you today: the University of California, San Francisco (UCSF) will become the first medical school to give academic credit to students for editing content on Wikipedia.

Wikipedia has had a tempestuous history in academia. It was originally considered a very unreliable source until a 2005 study found it to be comparable in accuracy to the Encyclopaedia Britannica. Since then it has been gaining recognition among both students and academics as a reliable and important part of the research phase. Wikipedia acts as a base upon which further research can be built; its strong focus and policies surrounding citation mean that it's easy to dig deeper into the information it provides. It's brilliant to see that institutions are now recognising not only the value of using Wikipedia, but also the importance of contributing back to it, and the value the service provides to both the student and the reader.

Like me, you're probably wondering how this will work. Well, students will be given the opportunity to improve commonly used but lower-quality Wikipedia articles. Professors will then give credits based on the quality of each student's contribution to the article. Not only will this enhance the quality of online medical resources, it will also encourage collaborative working which will, in turn, lead to innovative thinking and advances in medicine.

This progress is such great news for the future of medical education. I can't wait for the day Meducation authors are rewarded with credits for the amazing content they give us! As a quote often attributed to Charles Darwin puts it: "In the long history of humankind (and animal kind, too) those who learned to collaborate and improvise most effectively have prevailed."
Nicole Chalmers
over 3 years ago

Cardiac Embryology - Embryology

almost 3 years ago

Increasing length of health news stories on local television may benefit public health

Previous research has shown that the most popular way Americans get their health news is by watching local television broadcasts.
about 2 years ago

MedEd Technology: Twitter

Recently, I made a short video about where I see the use of technology in MedEd (take a look), and when asked to write for Meducation, I thought it would be great to get people thinking about technology and its uses in medical school. Social media: easy to use on smartphones, with instant access to resources and thousands of like-minded people. Seems like a good place for medical education?

Uses Now

Students are often at the forefront of technology. We've grown up with it, so many staff and lecturers within medical schools will be lagging behind. This makes it difficult to integrate technology into the curriculum, especially before the technology has moved on. This is potentially why the use of Twitter remains informal, and that may be its charm. By remaining informal, students can ask questions and get involved with hashtags without the constraints of marks and tests. Revision questions, mnemonics, diagrams and pictures are all over Twitter, if you know where you're looking. Here's my current list of useful people in medical education to follow, and the hashtags I'm following.

Advantages

• Easy and quick to set up an account
• Thousands of medics around the world: ask questions, network and share resources
• Can get involved as little or as much as you like

Disadvantages

• Mixing social life and education: medicine can already take over your life, so do we really want to be thinking about it in our spare time? And do you want your lecturer to follow you?
• Privacy: you can only make full use of Twitter with an open account
• Getting students involved: many students don't want Twitter, so if it were to be used formally in education, there would have to be incentives

GMC advice on the use of social media can be found here.

People to follow: @knowmedge, @meducation, @twitfrg, @MedEdNcl, @MedFinalsRev (content), @patientuk (content)

Hashtags to follow: #quclms, #twitfrig, #FOAMed, #MedEd

I'll update the list as it changes; leave a comment if you find anything good!
Future Uses

But can it be used in medical school? In my university, some lecturers put up a Twitter feed, using the course name as a hashtag, so that students can ask questions without shouting out. The hashtag can be used after that to ask questions and share relevant resources. I like this idea, but could it work in medical education? Maybe in the early years it could be used in the same way, but once students are on placement it gets harder. While everyone is in different hospitals, it could be a good way to integrate learning, check students are meeting objectives and ask questions throughout to check understanding. Maybe its only use is announcements: "placement letters must be handed in by the 21st Jan".

The other question is: how long can Twitter last? We're already seeing a gradual decline in Facebook, so it may not be worth medical schools investing time and money in social media. Are you on Twitter? Do you keep it purely social, or do you mix in medicine? Would you like to see your lecturers on board and tweeting you questions? At the moment, I'm not too sure. I keep my Twitter for medicine, answering the questions from @knowmedge and saving the mnemonics from @medfinalsrev, but I'm not sure how much I would get involved if my medical school used it officially…

Written by Anna Willis. Anna is a Medical Student at Sheffield University and a Resident Blogger for Meducation. Follow Anna on Twitter: @AnnaPeerMedEd
Anna W
over 3 years ago

Commitment Issues

I recently read a question posted on Meducation around a year ago, the gist of which was: "As a medical student, is it too early to start developing commitment to a specialty?" In other words: "Even though I haven't graduated yet, should I start building a portfolio of experience and evidence to show that specialty X is what I really want to do?"

MMC revolutionised (for better or worse) the medical career structure, forcing new graduates to decide on a career path much earlier. Many have appreciated the clear delineation of their career pathway. Others have found the 15-month period between leaving university and applying for specialty training too short to make an informed decision (just ask the 10% of FY2s who took a career break last year (i)). Whether right or wrong, there is now less time to rotate round 'SHO' jobs, decide on a career and build a CV capable of winning over an interview panel.

You'll probably find you're in one of two camps at university:

• Those who are absolutely 110% certain there is nothing they want to do, ever, other than specialty X, or
• Those who really like specialty X, but also like specialties W, Y and Z and haven't made up their minds.

(A few people feel they don't want to be part of any medical career, but that's for another post.)

Students identifying with the first statement are usually concerned that they will not get enough general experience, or that they will be stuck with their decision if they change their minds later on. Those leaning towards the second may not build as strong a body of evidence for any one specialty; however, it's possible to get involved in activities relevant to a few career options, or in several specialty-specific activities, and subsequently edit the CV for a specific interview.
The key message is that whether you think you have your career mapped out or not, medical school is the perfect time to start collecting evidence that you’re interested in a career in a particular specialty: time for extra-curricular activities only becomes scarcer once you have a full-time job complete with long days, nights and weekends. Your experiences at medical school can then be supplemented with taster weeks, teaching and judicious use of your study budget for training days and conferences; bear in mind that all specialties allow at least 3 years* following FY2 before starting specialty training, which can be used for gaining further experience (but be prepared to justify and defend your choices).

It’s also important to consider how individual specialties require such commitment to be demonstrated. In general terms, the more niche and/or competitive the specialty, the more they will want you to demonstrate that you a) really know what the job entails and b) have made a concerted effort to further your knowledge of the subject. To get a job in neurosurgery, for example, which is not only niche but had a competition ratio of 4.9 in 2013 (ii), you’ll need to have gone on courses relevant to neurosurgery and have achievements related to the specialty, such as a neurosurgical elective, attachment or taster experience (iii). Some specialties assess commitment in a variety of situations: the radiology interview this year, for instance, had stations on the general overview and future of radiology as a career and a CV-based demonstration of commitment to specialty, as well as a station requiring the interpretation of images. General Practice, on the other hand, which by its very nature is broad, at no point allocates marks specifically for commitment to specialty (or anything else on a CV, for that matter), as selection depends entirely on an exam (SJTs and clinical questions) and skill-based stations at a selection centre.
The person specification* details what is expected and desirable as demonstration of commitment in each specialty. So, how do you actually show you’re committed to a specialty? It may be pretty obvious, but try to build a consistent and well-rounded CV. Consider:

• Joining a student committee or group for your specialty. If there isn't one at your university, find some like-minded people and start one
• Asking the firms you work for if you can help with an audit or research, even if data collection doesn’t sound very interesting
• Finding a research project (e.g. as part of a related intercalated or higher degree)
• Prizes and examinations relevant to the specialty
• Developing a relevant teaching programme
• Selecting your selected study modules/components, elective and dissertation with your chosen specialty in mind
• Going to teaching or study days aimed at students at the relevant Royal College

Remember it’s not just what you’ve done but also what you’ve learnt from it; get into the habit of reflecting on what each activity has helped you achieve or understand. This is where most people who appear to have the perfect CV come unstuck: there will always be someone with more presentations and publications, but don’t assume that makes them a dead cert for the job. Whatever you do, make sure you have EVIDENCE that you’ve done it. Become a bit obsessive. Trust me, you forget a lot, and nothing counts if you can’t prove it.

Assessing commitment to specialty aims to highlight who really understands and wants a career in that specialty. From my own recent experience, however, identifying only those experiences explicitly related to a specific specialty ignores the transferable and clinically, professionally and personally important skills that would make someone a successful trainee.
I’d be very interested in your views on ‘commitment to specialty’: for example, do you think that someone with 20 papers in a given specialty is necessarily the best person for the job? Or are you planning to take a year out post-FY2 to build on your CV and gain more experience? Let us know!

References
* See person specifications for specialty-specific details at
i.
ii.
iii.
Dr Lydia Spurr
over 3 years ago


It is Julian Baggini's book, 'The Virtues of the Table: How to Eat and Think', that is fuelling me to write. I am particularly drawn to the chapter 'Willpower and Weight-loss' and would like to dedicate my first blog entry to the ‘F***-it button’; the antithesis of willpower and thus of weight loss.

The ‘F***-it button’ is a useful concept for any visually minded [medical] student who, in an attempt to lead the perfect [medic] lifestyle of ‘work hard, play hard' (while focusing on eating that all-important 'balanced diet' and exercising regularly)... ends up ‘working hard but playing harder', finds themselves eating the leftover cake in the nurses’ room (in order to gain that much-needed energy to sit in a clinic), and ditches the gym in favour of Facebook.

In sum, the ‘F***-it button’ is that imaginary ‘button’ (definitely red in colour) that is available for any individual to press at any time, BUT is only pressed on occasion, for which there is usually a 'trigger' (an emotion or event that 'forces' you to do it). Once pressed, willpower and self-control are fully deactivated. The individual is free to do as they will.

As you can imagine, the ‘F***-it button’ can be applied to anything... However, for me, this button is most often activated when trying to be 'healthy' and lead that balanced lifestyle. In this context, pressing the ‘F***-it button' does not mean the breakfast where, while portioning out a bowl of cereal, a handful of flakes is ‘allowed’ to reach the mouth before the bowl... We are talking about total and utter blowout. Examples include:

Having THE healthiest day EVER, planned to perfection, and then rewarding oneself with a night in a cocktail bar or a ‘few’ beers to celebrate the ‘day of healthiness’. (Usually on a Sunday.)

Having finished the week’s worth of ‘healthy’ food you stocked up on for the week, you get a little peckish at about 5pm, around the time Sainsbury’s closes, and lo and behold, you find yourself entering a housemate’s (sorry!)
cupboard and demolishing their chocolate biscuits… the ‘walk of shame’, in the form of a ‘replacement trip’ to Sainsbury’s, follows the next day (or, if Sainsbury’s is open, promptly).

That day when you are absolutely ravenous at 10.30am: lunch is too far away but you will not survive the morning… the only option is a chocolate bar or other high-calorie snack… lunchtime comes but you are not hungry thanks to that cheeky snack, BUT by mid-afternoon, maybe just as you are about to clerk a patient, you are ravenous, and a trip to the nurses’ room for a cake ‘to keep you going’ is inevitable.

In all three cases, by this time you may as well give up completely, throw off all restrictions and eat whatever you like for at least the next week… you will eat everything and anything that comes to mind, and it will involve numerous visits to the ‘chocolate shop’ (provided the cravings come before 11pm).* The ‘F***-it button’ has been pressed.

In a way, I do love the ‘F***-it button’ concept; it allows a period of freedom to indulge, in a mood-dependent fashion, in what you want, when you want to. However, it is very much a love-hate relationship… eating the nurses’ cake is undoubtedly essential for those times when you need a ‘pick me up’, but regularly doing so leaves you unpopular, with tight trousers and intense feelings of guilt. The ‘hate’ part of the relationship is the reason I want to consider why the ‘F***-it button’ exists.

I have been informed by various dieting websites that certain triggers, for example fatigue, will set off certain impulse responses, for example the ‘F***-it button’. This is a situation most of us can recognise… you really want to lose weight, but faced with the temptation of the nurses’ cake on a day you feel tired, your desire for the cake only needs to overrule your weight-loss resolution for a minute (or so) and all your good intentions are crushed as the cake finds its way to your mouth.
How is it that some of my friends (you know the type; my housemate) can have one cookie from a packet of fifteen, and the other fourteen remain on their desk (or in their cupboard, for that matter), next to where they are working, in full view, where they can smell, see and think about them, for at least half a year before they decide to have Biscuit Number Two, and ONLY Biscuit Number Two? This I just do not understand.

My mind is put at ease, to an extent, by the knowledge gained from my new ‘bible’: ‘experiments with toddlers show that even at a very young age, some people are able to resist or defer gratification better than others’. I was definitely NOT one of those toddlers… Baggini goes on to say that this matters for more than just weight loss: good self-control is a very strong predictor of academic and career success… Hmm, I think we will leave it at that.

*‘Chocolate shop’: defined as the nearest newsagent or supermarket.
Catherine Bruce
over 3 years ago

Biohacking - The Brighter Side of Health

2014 is already more than a month old (if you can believe it) and with each passing day, the world we live in is speeding towards breakthroughs in every sphere of life. We're running full tilt, wanting to be bigger and better than we were the day or the hour before. Every passing day reinvents the 'cutting edge' of technology, including medical progress and advancement. Gone are the medieval days when doctors were considered all-knowing deities and medicine consisted of leeches being used to drain 'bad blood'. Nowadays, health isn't just about waiting around until you pick up an infection and then going to your local GP to get treated; in today's world it's all about sustaining your wellbeing. And for that, the new kid on the block is biohacking.

Biohacking is the art and science of maximizing your biological potential. Just as a hacker aims to gain complete control of the system he's trying to infiltrate, be it social or technological, a biohacker aims to obtain full control of his own biology. Simply put, a biohacker looks for techniques to improve himself and his way of life. Before you let your imagination run away with you and start thinking of genetic experiments gone wrong, let me assure you that a biohack is really just about any activity you can do to increase your capabilities or advance your wellbeing. Exercising daily can be a biohack. So can doing the crossword or solving maths problems, if it raises your IQ by a few points or improves your general knowledge. What characterizes biohacking is the end goal and the consequent modification of activities to achieve that goal.

So what kind of goals would a biohacker have? World domination? Not quite. Adding more productive hours to the day and more productivity to those hours? Check. Eliminating stress and its causes from their lives? Check. Improving mood, memory, recall and general happiness? You bet. So the question arises: aren't we all biohackers of sorts?
After all, the objectives mentioned above are what everyone aspires to achieve at one point or another. Unfortunately for all the lazy people out there (including yours truly), biohacking involves being just a tad more proactive than scribbling down a list of such goals as New Year's resolutions!

There are two main approaches to selecting a biohack that works for you: the biggest aim and the biggest gain. The biggest aim means targeting those capabilities whose improvement would benefit you most. This could be as specific as improving your public speaking skills or as general as working on your diet so you feel more fit and alert. In today's competitive, cut-throat world, even the slightest edge can ensure that you reach the finish line first. The biggest gain means choosing a technique that is low cost; in other words, one that is beneficial yet doesn't burn a hole through your pocket!

It isn't possible to give a detailed description of all the methods pioneering biohackers have initiated, but here are some general areas of your life that you can try to upgrade:

Hack your diet: They say you are what you eat. Your energy levels are related to what you eat, when you take your meals and the quantity you consume, and your mood and mental wellbeing are greatly affected by your diet. I could go on and on, but this point is self-explanatory. You need to hack your diet! Eat healthier and live longer.

Hack your brain: Our minds are capable of incredible things when they're trained to function productively. Had this not been the case, you and I would still be sitting in our respective caves, shivering and waiting for someone to think long enough to discover fire. You don't have to be a neuroscientist to improve your mental performance; studies show that simply knowing you have the power to improve your intelligence is the first step to doing it.
Hack your abilities: Your mindset often determines your capacity to rise to a challenge and your ability to achieve. For instance, if you're told that you can't achieve a certain goal because you're a woman, or because you're black, or too fat, or too short, you're bound to restrict yourself in a mental prison of your own shortcomings. But it's a brave new world, so push yourself further. Try something new, be that tacking an extra lap onto your daily exercise routine or squeezing out the extra time to do some volunteer work. Your talents should keep growing right along with you.

Hack your age: You might not be able to do much about those birthday candles that just keep adding up... but you can certainly hack how 'old' you feel. Instead of buying into the notion that you decline as you grow older, look around you. Even simple things such as breathing and stamina-building exercises can change the way you age.

We have a responsibility to ourselves and to those around us to live our lives to the fullest. So maximise your potential, push against your boundaries and build the learning curve as you go along. After all, health isn't just the absence of disease but complete physical, mental and social wellbeing, and biohacking seems to be the Yellow Brick Road leading right to it!
Huda Qadir
over 3 years ago

Do medics need to get over themselves?

Lucas Brammar
over 3 years ago

Growing violence against doctors alarms medical profession

The World Medical Association has expressed grave concern about the growing incidence of violence against doctors across the globe.
about 2 years ago

Early Retirement and Career Change

This is my first blog on Meducation, so I decided to tell the reader a bit about myself so that future blogs will make sense.

At age 48, while I was in an active and successful academic practice of OB & GYN, my best friend died from a complication of cardiac surgery. This tragic event made my wife and me consider other things in life than just work, so at age 55 I decided to retire from my academic position and start working as a locum in many different cultural settings. The plan was to work somewhere in an area of need for six months and alternate this with travel for six months. It did not work out exactly that way, but close enough. I worked in Japan, then Pakistan, Tasmania, Australia, New Zealand, Alaska, St Lucia, and Chiapas in Mexico. Much earlier I had had a two-year experience in Africa. It was a very satisfying experience and my wife and I have never looked back.

Many of my friends and colleagues kept urging me to write a book about our experiences and how we accomplished them. For a long time I resisted, probably because I felt that no one would be interested, or because I was lazy, or most likely both. I finally gave in, started writing and published an e-book. The title is "Crosscultural Doctoring. On and Off the Beaten Path." The book can be downloaded for free from Smashwords at:

The book is meant for medical as well as non-medical people. It is written as a series of loosely connected anecdotes, some medical, some non-medical, some funny, some not so funny. It describes the immense satisfaction my wife and I experienced from our decision, and I hope that reading it might inspire others, medical or non-medical, who might be thinking about a career change or early retirement to jump off the beaten path. It might also inspire others with similar experiences to write about them. I would love to receive some comments.

William J. LeMaire
June 2014
To learn more about me, please visit my website at:
DR William LeMaire
about 3 years ago