2nd year medical student Ajay Shah writes about the anxiety surrounding lab research and publications – something all medical students can relate to!
“Have you done any research?”
“Yeah actually I have a publication in the Canadian Journal of Obstetrics, I spent my summer doing a meta-analysis for Mount Sinai”
“Oh, so do you want to do obstetrics?” “No, I just managed to get the position through a friend. I don’t actually care about it, research is kind of boring”
I’ve had this conversation with so many students, at St. Andrews and otherwise. It seems to be hammered into us from an early age. “Get published, get hired” is the prime directive for stethoscope-wielding youth. For Canadian students, the pressure is multiplied, as the prestigious residency positions will only accept the most credentialed, cited and publication-producing students.
So, we slave. I am no exception to this rule; I write from my own experience. In high school, I spent most of a year doing a research project at Princess Margaret Hospital. Today, if you saw me standing with my poster at conferences, you’d be smitten by my passionate descriptions of MTS assays and in silico analysis. If you saw me back then, in that stuffy lab from 5-7 PM every day, micropipetting row upon row, redoing protocols due to calculation errors, waiting for colonies to incubate, you’d feel nothing but pity.
‘you’d feel nothing but pity’
And that is the reality of lab research. It is boring. You will fail. Even when you do everything right, you will get negative results, contamination, and false positives. My project looked at the role of microRNAs in human breast cancer cells, and we received negative results with the first two microRNAs we tested. After 8 months in the lab, we concluded that microRNA-449 levels were correlated with tumour viability, then found 4 genes that potentially contribute to tumourigenesis. Peanuts, in the big picture – but another tree planted in the forest of cancer research.
-‘Even when you do everything right, you will get negative results, contamination, and false positives’-
I am not trivializing lab research, nor am I insinuating that incremental accomplishments are meaningless. I can imagine nothing more fulfilling than conducting scientific inquiry into one’s passions. I am simply trying to relate a feeling I had to one that many of you may experience: the feeling of purposeless disappointment. The infinite thought loop of “What exactly have I accomplished?” to “Was that a worthwhile use of my time?” to “What did I gain from that?” – rinse and repeat.
I believe that feeling arose because I was doing research for all the wrong reasons. I was doing that project to get my name on a paper to improve a university application to increase my odds of getting a slightly better education which would, theoretically, ultimately help me achieve the abstract notion of “success”. The idea of success instilled in us by our superiors, mentors and role models, a concept I could hardly visualize, let alone materialize. My search for material scientific findings had an ulterior purpose, and my findings therefore seemed purposeless and immaterial.
-‘that feeling arose because I was doing research for all the wrong reasons’-
So to all those go-getters doing research this summer: I beg you, be wary. If you lack passion for your topic and are simply doing research to get published, think twice. Perhaps an unpaid shadowing position may help you learn more, think more, and gain a sense of purpose. Or perhaps this mental purgatory is a necessity that we medical students must pass through on our journey – jumping through hoops, dotting I’s and crossing T’s, going through the motions, as we wait for our careers to begin. Whatever it may be, I suppose it can’t hurt to ask – “Hey, do you know anyone hiring students to work in a research lab?”
3rd year Caroline Cristofaro investigates restless leg syndrome, a relatively unknown condition that can significantly impact a patient’s quality of life.
At first, restless leg syndrome (RLS) may seem like an innocent condition; however, it can be very debilitating for the patient. In addition to the obvious symptom of an uncontrollable desire to move the legs, patients often suffer from insomnia, paraesthesias and various uncomfortable sensations in the lower limbs (Facheris, M., et al., 2010). The goal of this article is to shed light on the symptoms and aetiology of this little-known syndrome and the medications given to treat it.
RLS has been found to have a strong genetic link and most cases are diagnosed as Familial RLS. Secondary RLS can be experienced during pregnancy and renal failure, both of which are associated with iron deficiency (Facheris, M., et al., 2010). Opioid-dependent patients can also experience RLS after stopping opioids as a withdrawal symptom (Ghosh, A. et al., 2014).
Restless leg syndrome is categorized as both a sensorimotor and a sleep disorder, with the sensorimotor symptoms being the main complaint in most patients. The sensation is described using a myriad of terms – itching, burning, crawling, tingling – but is relieved, albeit only temporarily, by movement of the legs (Facheris, M., et al., 2010). Depending on the severity of the patient’s RLS, relief may come from simply stretching out the affected leg(s), or the patient may have to get up and walk around. Although it is fairly obvious why RLS would delay the onset of sleep, the explanation as to why it causes disturbed and fragmented sleep is somewhat less evident. Periodic Limb Movements of Sleep (PLMS), as the name indicates, are defined as “semirhythmic movements of the limbs which last a few seconds and occur at regular intervals” (Cotter, P. and O’Keeffe, S., 2006). These limb movements are nearly always present in people suffering from RLS; they are typically exaggerated and cause patients to awaken, making them aware of the sensorimotor symptoms, which, of course, makes the task of falling asleep more arduous (Cotter, P. and O’Keeffe, S., 2006).
Whilst the exact cause of RLS has not yet been established, two factors have been shown to heavily influence symptoms: iron and dopamine levels within the substantia nigra.
Dopamine is a neurotransmitter which plays an important role in regulating movement. The two main dopamine receptors in the brain are the D1 and D2 receptors which, each via a specific pathway, increase or decrease movement, respectively. In RLS, Positron Emission Tomography (PET) scans have consistently shown decreased activity of the D2 receptor, which normally decreases movement, leading to increased movement (Ruottinen, H.M., 2000). In support of these findings, Connor, J.R., et al. (2009) demonstrated a 30% reduction in D2 receptors in the basal ganglia in autopsies of patients who had been affected by severe RLS. As RLS occurs in the lower limbs, it is thought that the dopaminergic A11 cell group is mainly affected, because it is the only group of neurones providing dopaminergic axons to the spinal cord. This hypothesis was strengthened by Clemens, S., et al. (2006), who demonstrated increased movement in mice with a lesioned A11 dopaminergic cell group.
Iron acts as a cofactor for tyrosine hydroxylase (an enzyme essential in dopamine synthesis) and is also related to monoamine oxidase (an enzyme which degrades dopamine) (Facheris, M., et al., 2010). Due to this close relationship with dopamine, the concentration of iron in the brain influences the numbers of dopamine transporters and dopamine receptors (specifically D2 and D4); decreased iron concentration in the CSF leads to decreased densities of dopamine transporters and receptors, which has been put forward as a cause of RLS symptoms (Oner, P., et al., 2007). However, a recent study by Connor, J.R., et al. (2009) compared the post-mortem tyrosine hydroxylase concentrations of patients who had suffered from RLS with those of a control group and found no significant difference. Due to this conflicting evidence, the role of dopamine in RLS described above is still considered a hypothesis and, as a result, is referred to as the “Dopamine Hypothesis”.
A critical review conducted by the International Restless Legs Syndrome Study Group in 2013 investigated the efficacy of various drugs in treating long-term RLS. For first-line treatment it is recommended to give the patient dopamine-receptor agonists or α2δ calcium channel ligands (pregabalin or gabapentin enacarbil) (Garcia-Borreguero, D., et al., 2013). The dopamine agonists recommended are either ergot-based (pergolide and cabergoline) or non-ergot-based (pramipexole, ropinirole, rotigotine); the dopamine precursor levodopa is also used. The calcium channel ligands are used mainly when the patient complains of the physical symptoms, and act to stop excessive muscle contraction. Iron supplements are also recommended when the patient’s serum ferritin levels are low (Garcia-Borreguero, D., et al., 2013). Together, these treatments decrease muscle contraction, help stabilize the dopaminergic pathways in the brain, and increase the patient’s serum iron levels, further stabilizing dopamine concentrations.
Although RLS may seem a relatively benign condition at first thought, its constant, intrusive symptoms encroach on nearly every aspect of a patient’s life, making it very difficult to live with. Both its pathophysiology and, consequently, its treatment options bear similarities to Parkinson’s disease, which should also alert you to the seriousness of this syndrome. Unfortunately, none of the drugs used to treat the condition has been proven effective for more than one year (Garcia-Borreguero, D., et al., 2013), which means that we have yet to find a drug that can effectively treat RLS when the syndrome persists beyond a year. Hopefully future studies will be able to find a concrete link between dopamine and RLS to explain the efficacy of dopamine agonists in treating the condition, or otherwise determine the true aetiology of the syndrome and so prompt studies of new drugs able to effectively cure RLS.
Clemens, S., Rye, D., & Hochman, S. (2006). Restless Legs Syndrome: Revisiting the Dopamine Hypothesis from the Spinal Cord Perspective. Neurology, 67(1), 125–130.
Cotter, P. E., & O’Keeffe, S. T. (2006). Restless Leg Syndrome: Is it a Real Problem? Therapeutics And Clinical Risk Management, 2(4), 465–475.
Garcia-Borreguero, D., Kohnen, R., Silber, M. H., Winkelman, J. W., Earley, C. J., Högl, B., … Allen, R. P. (2013). The long-term treatment of restless legs syndrome/Willis–Ekbom disease: evidence-based guidelines and clinical consensus best practice guidance: a report from the International Restless Legs Syndrome Study Group. Sleep Medicine, 14(7), 675–684.
Ghosh, A., & Basu, D. (2014). Restless Legs Syndrome in Opioid Dependent Patients. Indian Journal Of Psychological Medicine, 36(1), 85–87.
Oner, P., Dirik, E. B., Taner, Y., Caykoylu, A., & Anlar, O. (2007). Association Between Low Serum Ferritin and Restless Legs Syndrome in Patients with Attention Deficit Hyperactivity Disorder. The Tohoku Journal Of Experimental Medicine, 213(3), 269–276.
Pramstaller, P. (2010). Update on the Management of Restless Legs Syndrome: Existing and Emerging Treatment Options. Nature And Science of Sleep, 2, 199–212.
Rios Romenets, S., & Postuma, R. B. (2013). Restless Legs Syndrome. Current Treatment Options In Neurology, 15(4), 396–409.
Winkelman, J. W. (2006). Considering the Causes of RLS. European Journal Of Neurology, 13, 8–14.
In our first article of 2016, Ajay Shah writes about his experience with the news headlines’ exaggeration of research paper results and discovering how to interpret them.
On October 26th, 2015, the results of a new study rippled across social media and news outlets. Sensationalist Twitter hashtags and Facebook posts lamenting its results dominated the cyber landscape. Sales at supermarkets and grocery stores fell by millions of pounds in the following weeks. Thousands of people swore off their favourite foods in the panic, hoping to bargain for a few extra years of life. What could this study have discovered to cause such a viral reaction throughout society? Well, of course, it showed that bacon causes cancer!
Hold up. What? One of the most universally loved foods in the world (Americans consume 18 pounds per capita annually), cast aside by a suddenly health-conscious population, just because of one study’s results? How could this have happened? As is often the case, the culprit is not those responsible for creating the study (they just did the science), nor the general population (we just read the articles). The culprit is far more insidious, something capable of needlessly striking fear into millions of innocent people. That culprit was The Headlines.
I woke up that morning to see a short BBC notification on my phone declaring that red meat causes cancer. Confused, I called my girlfriend as I skimmed the article.
“WHAT??” said my girlfriend when I read the headline to her, likely picturing morsels of steak morphing into tumours as they passed through her colon.
“Yup, the WHO has declared that red meats cause colon cancer. It’s true – even the BBC is reporting it,” I replied.
After some quick research, and reading headlines and tweets such as:
“OMG! Bacon causes cancer!” (New York Post)
“Processed meats rank alongside smoking as cancer causes — W.H.O. U.N. health body says bacon, sausages and ham among most carcinogenic substances along with cigarettes, alcohol, asbestos and arsenic.” (The Guardian)
“Bacon, hot dogs and processed meats cause cancer/are as dangerous as smoking, says @WHO.” (PBS)
the alarm bells began to ring. My girlfriend quickly swore off processed meats for life, and limited red meats to one meal per week. I, an avid meat-eater myself, decided not to let The Headlines fool me, and began to do some research of my own.
As I had suspected, the initial reaction to the study was quite overblown. In fact, the first paragraph of the WHO report itself says that “red meat is probably carcinogenic to humans based on limited evidence”. “Limited evidence” means that, while a positive correlation is present between the exposure (meat) and the outcome (cancer), other explanations (confounding factors, bias, chance) cannot be ruled out.
The findings are based on an assessment of more than 800 studies, which gave more weight to the studies with larger sample sizes, fewer confounding factors and better experimental design. Approximately two thirds of the case-control/cohort studies showed a positive correlation between eating processed meats and colorectal cancer incidence. It was found that eating 100g of red meat per day raises colorectal cancer incidence by 17%, while eating 50g of processed meat daily raises incidence by 18%. Both types of meat consumption were also linked to pancreatic, prostate and stomach cancer.
Personally, I think that the study was very well conducted and is a fantastic example of “Good Science”. Importantly, the researchers identified a mechanism of action, pointing to known carcinogens produced during the cooking of meat and postulating that these compounds lead to cancer formation. They also examined a wide range of studies under specific criteria, minimizing the influence of confounding factors and bias. The different types of meat were classified appropriately, and suitable recommendations were made.
Upon reading the WHO report, I felt quite upset at the media for their treatment of the story. The WHO researchers were quick to reassure, suggesting that a reduction of processed meat consumption would be a preferable alternative to widespread vegetarianism. Indeed, they identified the valuable nutrients and compounds in red meat, and after contrasting these with the increased cancer risks, officially only recommended cutting down red meat consumption to 70g daily. This is a stark contrast to the doom-and-gloom picture portrayed by the media.
An increase in cancer incidence of 17-18% may seem like a lot, and with the media’s handling of this report, any meat eater would do a double-take before chowing into that cheeseburger. However, this “relative risk” looks a lot different when put into perspective. It is estimated that, among 1,000 people who eat little to no processed meat, 56 would develop bowel cancer during their lives. Comparatively, among 1,000 people who eat the most processed meat, 66 are expected to develop bowel cancer during their lives. Indeed, the 17% increase seems minuscule when compared to smoking (2,400% increased risk) or drinking alcohol (500% increase). And this ignores the fact that red meat offers many healthy nutrients, which may outweigh the carcinogenic effect in some people. Regardless, red and processed meat now join an increasingly cluttered list of carcinogens, including aloe vera (Class 2B), the Sun (Class 1) and air (Class 1) – some of the most dangerous things known to man.
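The gap between the scary headline number and the modest real-world change comes down to the difference between relative and absolute risk. A minimal sketch of the arithmetic, using only the figures quoted above (56 cases per 1,000 at baseline, an 18% relative increase for processed meat):

```python
# Relative vs absolute risk, using the bowel cancer figures quoted above.
baseline_per_1000 = 56       # lifetime cases among 1,000 people who eat little processed meat
relative_increase = 0.18     # the headline "18% increased risk" (a relative figure)

heavy_eaters_per_1000 = baseline_per_1000 * (1 + relative_increase)
absolute_increase = heavy_eaters_per_1000 - baseline_per_1000

print(f"Heavy eaters: ~{heavy_eaters_per_1000:.0f} cases per 1,000")      # ~66
print(f"Absolute lifetime risk increase: {absolute_increase / 1000:.1%}")  # about 1 percentage point
```

An 18% relative increase on a 5.6% baseline is only about one extra case per hundred people – which is why the same data can honestly support both a frightening headline and a shrug.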
So, throughout the journey from “my bowels are full of carcinogen-induced tumours” to “eating meat probably, might, slightly increase the risk of developing a tumour in my large intestine”, I learned a lot about clinical studies, the media, and The Headlines. I learned that “carcinogen” is a broad term, and a subatomic particle that may be essential to the fabric of our reality (neutrons) is also a known human killer. I learned that clinical studies completed under similar conditions could have polar opposite results. But most importantly, I learned that in a world dominated by clickbait headlines and diminishing attention spans, the onus is on us to do our research, and Rewrite the Headlines.
Jackie Liu discusses prosopagnosia, a cognitive disorder that leaves sufferers struggling to recognise the faces and identities of those around them.
Prosopagnosia is behind that awkward and embarrassing encounter where you are greeted by someone you supposedly know, but whose face you simply cannot recognise. It’s September in St. Andrews; you meet hundreds of new students, but a few days later in the street you have no recollection of meeting an individual because you cannot recognise their face. For some, this is a way of life, a daily threat, a constant unease. Imagine being in a crowd at a football match, lost and alone, not recognising anyone: not a single identifiable face. Prosopagnosics face this possibility every day. This condition can lead to severe consequences.
What is prosopagnosia?
Prosopagnosia, or ‘Face Blindness’, is defined as an inability to, or difficulty with, recognising faces: a cognitive disorder of facial perception. Some cannot perceive faces at all; others cannot relate a face to an identity. For most people this is a simple and at least partially unconscious process developed soon after birth, but in some this cognitive process is impaired or missing. The German psychologist Joachim Bodamer first used the term prosopagnosia in 1947 in his paper ‘Die Prosop-Agnosie’, deriving it from the classical Greek “πρόσωπον” (prósōpon), meaning ‘face’, and “αγνωσία” (agnōsía), ‘non-knowledge’ (Bodamer, 1947).
How does facial recognition work?
To understand prosopagnosia, it is necessary to understand the development of facial recognition. The process is complicated, and seems to begin soon after birth: newborn infants have been observed to show interest in, and track, basic sketched faces, an innate interest that declines after the first month. This early development is crucial to facial recognition, as shown by Grand et al. (2001). The study examined patients deprived of normal vision by bilateral congenital cataracts, which were surgically corrected at an average of 118 days after birth. This delay caused insufficient development of facial processing ability. The patients were compared to a control group with no previous visual impairment, and the results indicated that deprivation of facial visual input at birth led to permanent deficits in configural facial processing – the spacing of facial features. Yet the visually deprived patients had normal featural processing – the shaping of features. So while these patients can process and recognise the shapes of features like the eyes or nose, their deprivation of visual input at birth may have led to a permanent difficulty in recognising differences in the spacing of features, and hence an inability to connect an identity with a specific set of facial features (Grand et al., 2001). Prosopagnosia is a defect specifically of the holistic, or configural, processing of faces (Busigny et al., 2010).
Hence, prosopagnosia may reflect the dissociation of two functional processes: the recognition of a face through its features, and through their spacing. Spacing is a function separate from featural processing, but assists in identifying the owner of the face, and has been discussed as a secondary process by Schiltz (2005). That paper suggests that there are indeed two levels of processing – face detection, followed by individual identification – and that these levels can be dissociated (Schiltz, 2005). In a person without prosopagnosia, these processes integrate, working largely unconsciously to enable facial recognition.
Anatomically, there is debate (Halgren et al., 2000) with regards to the location of facial recognition processes, but the general consensus is that it functionally localises to the occipitotemporal or fusiform gyrus.
Within this gyrus is an area called the ‘Fusiform Face Area’, which is suspected to function in facial recognition (Kanwisher & Yovel, 2006). Functional MRI (fMRI) reveals an increase in blood flow in the Fusiform Face Area when a subject looks at faces. The evidence shows that the area demonstrates functional specificity for faces rather than objects, as well as areal specificity, suggested by response profiles that differ from those of other face-selective regions (Kanwisher & Yovel, 2006). However, some argue (Goldstein, 2009) that the area cannot be the sole anatomical basis for facial recognition, as it is not fully developed until adolescence, yet babies can differentiate faces as early as 3 months old. It has also been argued that cognitive functions like facial recognition are not limited to domain-specific mechanisms (Kanwisher & Yovel, 2006).
Is there a difference between the two hemispheres of the brain?
Studies have suggested that facial recognition is lateralised to the right hemisphere (Meadows, 1974). Patients with acquired prosopagnosia develop the condition after traumatic brain injury or damage, and Meadows (1974) describes such patients as almost always having a right occipitotemporal lesion. A study by Schiltz (2006) also provides supporting evidence for this theory, indicating a correlation between lesions of the fusiform gyrus in the right hemisphere and difficulty identifying faces (Schiltz, 2006).
What are the different types of prosopagnosia?
Prosopagnosia is not a unitary disorder: there are different types and levels of impairment. The condition can be categorised into two main groups: acquired and developmental. Acquired has two sub-groups:
Apperceptive prosopagnosia is impairment to the earliest processes in facial recognition, usually caused by damage to the right occipitotemporal region. These patients may be unable to recognise faces at all, whether familiar or not. These patients are then dependent on other factors, like voice and clothing, for recognition (Gainotti & Marra, 2011).
Associative prosopagnosia is impairment to other early face recognition processes and also their links with memory, usually caused by damage to the right anterior temporal regions. These patients can determine faces and what the face may show about the person (age and sex), but cannot associate the face with anyone they know; there is no identification (Gainotti & Marra, 2011).
Developmental prosopagnosia is defined as being present from birth, manifesting in early childhood (Jones & Tranel, 2001). Inheritance is possible: there are families with multiple members affected by prosopagnosia. What actually causes this condition is still unknown, but it is likely to be of genetic origin.
So how can this affect people?
Acquired prosopagnosia is rare because of the specific location of the damage that impairs facial recognition. Developmental prosopagnosia, however, is relatively common: the estimated figures lie between 2% and 2.9% of the general population (Duchaine & Nakayama, 2006; Kennerknecht et al., 2006; Bowles et al., 2009). This congenital form of prosopagnosia can severely affect children in many ways.
There are safety issues notably created by a parent’s inability to recognise their own children or family. The condition can also create anxiety for children, especially in busy and crowded areas. Social issues are also created. A rich and adequate social life is a key part of childhood development, and prosopagnosia makes this much more difficult to maintain. Children are able to meet new people and make friends but, subsequently, fail to recognise them. This alienates prosopagnosics; they are falsely perceived as being superficial or antisocial. These children also then have difficulty in team-based activities or sports.
A case study describes these problems in more detail (Diaz, 2008). The paper describes the lives of a 13-year old boy and his mother, both living with hereditary prosopagnosia. Their condition limits their social interactions and circles. This was exacerbated when the boy entered middle school, where he struggled to adapt to the increased number of people. Alone and alienated, with a reputation for ‘weirdness’, he suffered from depression, and became suicidal. The school nurse then developed an individualised healthcare plan, raising awareness and understanding of prosopagnosia amongst staff, enabling them to provide educated assistance. The boy underwent psychological counselling and took medication. His mental wellbeing improved, but despite this assistance, he remained isolated, and concentrated on his studies. This case study indicates that developmental prosopagnosia can lead both to difficulty maintaining a patient’s safety, and deterioration of their mental wellbeing (Diaz, 2008). The difficulties prosopagnosia presents can affect every aspect of an individual’s life, resulting from the loss of a basic cognitive function that most people have. In an attempt to counter this, coping strategies are adopted by the patient: they rely on other characteristics for recognition, such as an individual’s gait, voice, or hairstyle. This coping strategy can achieve improvements socially, but ultimately, may be undermined by a simple haircut or even a sore throat.
Prosopagnosia is a relatively unknown condition that affects a surprising number of people: allowing for a 2% prevalence, the United Kingdom alone has around 1.3 million prosopagnosics. There are no known cures or standardised treatments as yet; patients are dependent on their own individual management and coping strategies. Even the diagnostic tests, the Cambridge Face Memory Test and Cambridge Face Perception Test – whilst fairly reliable as clinical indicators – are limited by factors such as age and ethnicity (faces of races other than one’s own are recognised less well) (Bowles et al., 2009). The inability to recognise faces reaches beyond social issues and impacts one’s quality of life. This article aims to raise awareness and understanding of the condition.
Bodamer, J. (1947) ‘Die Prosop-Agnosie’,Archiv für Psychiatrie und Nervenkrankheiten Vereinigt mit Zeitschrift für die Gesamte Neurologie und Psychiatrie. Springer, 179(1-2), pp. 6–53. doi: 10.1007/BF00352849.
Bowles, D., McKone, E., Dawel, A., Duchaine, B., Palermo, R., Schmalzl, L., Rivolta, D., Wilson, E. and Yovel, G. (2009) ‘Diagnosing prosopagnosia: Effects of ageing, sex, and participant–stimulus ethnic match on the Cambridge Face Memory Test and Cambridge Face Perception Test’,Cognitive Neuropsychology, 26(5), pp. 423–455. doi: 10.1080/02643290903343149.
Busigny, Joubert, Felician, Ceccaldi and Rossion (2010) ‘Holistic perception of the individual face is specific and necessary: evidence from an extensive case study of acquired prosopagnosia.’,Neuropsychologia. Busigny T , et al., 48(14), pp. 4057–92. doi: 10.1016/j.neuropsychologia.2010.09.017.
Diaz, A. (2008) ‘Do I Know You? A Case Study of Prosopagnosia (Face Blindness)’,The Journal of School Nursing, 24(5), pp. 284–289. doi: 10.1177/1059840508322381.
Duchaine, B. and Nakayama, K. (2006) ‘Developmental prosopagnosia: a window to content-specific face processing’,Current Opinion in Neurobiology, 16(2), pp. 166–173. doi: 10.1016/j.conb.2006.03.003.
Gainotti, G. and Marra, C. (2011) ‘Differential Contribution of Right and Left Temporo-Occipital and Anterior Temporal Lesions to Face Recognition Disorders’,Frontiers in Human Neuroscience. Frontiers Media SA, 5, p. 55. doi: 10.3389/fnhum.2011.00055.
Goldstein, B. (2009)Sensation and Perception [With Virtual Lab Manual] – 8th Edition. 8th edn. United States: Wadsworth, Cengage Learning.
Grand, R. L., Mondloch, C., Maurer, D. and Brent, H. (2001) ‘Neuroperception: Early visual experience and face processing’,Nature, 410(6831), pp. 890–890. doi: 10.1038/35073749.
Halgren, Raij, Marinkovic, Jousmäki and Hari (2000) ‘Cognitive response profile of the human fusiform face area as determined by MEG.’,Cerebral cortex (New York, N.Y. : 1991). Halgren E , et al., 10(1), pp. 69–81. Available at: http://www.ncbi.nlm.nih.gov/pubmed/10639397 (Accessed: 17 January 2015).
Jones, R. and Tranel (2001) ‘Severe developmental prosopagnosia in a child with superior intellect.’,Journal of clinical and experimental neuropsychology. Jones RD and Tranel D, 23(3), pp. 265–73. Available at: http://www.ncbi.nlm.nih.gov/pubmed/11404805 (Accessed: 17 January 2015).
Kanwisher and Yovel (2006) ‘The fusiform face area: a cortical region specialized for the perception of faces’,Philosophical Transactions of the Royal Society B: Biological Sciences, 361(1476), pp. 2109–2128. doi: 10.1098/rstb.2006.1934.
Kennerknecht, I., Grueter, T., Welling, B., Wentzek, S., Horst, J., Edwards, S. and Grueter, M. (2006) ‘First report of prevalence of non-syndromic hereditary prosopagnosia (HPA)’,American Journal of Medical Genetics Part A, 140A(15), pp. 1617–1622. doi: 10.1002/ajmg.a.31343.
Meadows, J. (1974) ‘The anatomical basis of prosopagnosia’,Journal of Neurology, Neurosurgery & Psychiatry, 37(5), pp. 489–501. doi: 10.1136/jnnp.37.5.489.
Meng, Cherian, Singal and Sinha (2010) ‘Functional lateralization of face processing’, Journal of Vision, 10(7), pp. 562–562. doi: 10.1167/10.7.562.
Schiltz (2005) ‘Impaired Face Discrimination in Acquired Prosopagnosia Is Associated with Abnormal Response to Individual Faces in the Right Middle Fusiform Gyrus’,Cerebral Cortex, 16(4), pp. 574–586. doi: 10.1093/cercor/bhj005.
Jessie Lee, second year, investigates the role of mindfulness in the field of medicine.
With the rosy tint of hindsight, doing A-levels was like swimming in the baby pool at the local branch of Total Fitness. Medical school, from its high expectations to its fast-paced spiral curriculum, is more akin to one of those highly exciting, racetrack-shaped pools. You swim around and around, rather frantically at first, and are fine as long as you dodge the obstacles appropriately and go with the flow. However, steer off course, attempt to fight the current, or resist the efforts of the helpful lifeguards, and issues start to arise.
The Stresses of Medicine: So how to stay on track?
Medicine is renowned for its stresses, yet medical students and doctors frequently ignore their own health (Brooks et al., 2011).
Why might mindfulness be of some use to you?
The Oxford Mindfulness Centre, an international centre of excellence that resides within the Department of Psychiatry, offers the following reassurance – people who have learned mindfulness discover positive changes in well-being, experience less stress, are less likely to get stuck in depression and exhaustion, and are better able to control addictive behaviour.
Great, but what is Mindfulness?
Put simply, it is an awareness of the present. Nice and vague, but in essence this just represents an enhanced perspective, and an acceptance, of situations. It comes with the realisation that happiness is not found externally (or else someone would have found it by now) but rather internally, and that ultimately it is only our own mind that can affect our own happiness. So, that late bus will only affect our happiness if we let it.
Mindfulness is an active determination to take control of your own emotions and not allow external factors to affect your mental wellbeing. To give an example: someone who is subject to a stressor but doesn't practise mindfulness might see the situation as "just their luck" and let it make them unhappy, souring the rest of their day. Someone who is mindful would simply accept the situation and not let it bother them.
Okay, but how can I become more mindful?
Typically, mindfulness is achieved through meditation or contemplation. One starting point is the question a horribly wise man once put to me: 'If there is something you can do to change a situation, then why worry? And if there is nothing you can do to change the situation, then why worry?' Pausing and asking oneself this, when faced with those everyday stressors, can have a large effect on one's happiness. And simply taking ten minutes out of your day to relax and spend some time on yourself doesn't really sound too bad, does it?
Where to look if this is something that interests you?
Brooks, S. K., Gerada, C., & Chalder, T. (2011). Review of literature on the mental health of doctors: are specialist services needed? Journal of Mental Health (Abingdon, England), 20(2), 146–156.
Carmody, J., & Baer, R. A. (2008). Relationships between mindfulness practice and levels of mindfulness, medical and psychological symptoms and well-being in a mindfulness-based stress reduction program. Journal of Behavioral Medicine, 31(1), 23–33.
De Vibe, M., Solhaug, I., Tyssen, R., Friborg, O., Rosenvinge, J. H., Sørlie, T., & Bjørndal, A. (2013). Mindfulness training for stress management: a randomised controlled study of medical and psychology students. BMC Medical Education, 13, 107. doi:10.1186/1472-6920-13-107
Phang, C. K., Mukhtar, F., Ibrahim, N., Keng, S.-L., & Mohd. Sidik, S. (2015). Effects of a brief mindfulness-based intervention program for stress management among medical students: the Mindful-Gym randomized controlled study. Advances in Health Sciences Education. doi:10.1007/s10459-015-9591-3
Stress Reduction- University of Massachusetts Medical School’s Center for Mindfulness. Retrieved 18/03/2015 from http://www.umassmed.edu/cfm/
Caroline Cristofaro investigates olfactory ensheathing transplantation in the treatment of paraplegic patients who have suffered from complete severing of the spinal nerves.
For many years, patients who suffered a spinal cord injury with complete severing of the spinal nerves, leading to paraplegia, had very limited treatment options. These were typically restricted to physiotherapy or the more novel approach of nerve cell stimulation, which aims to stimulate the nerves responsible for muscle contraction (mostly used in patients with incomplete severing of the spinal nerves) (MedGadget, 2014). Both of these treatments gave patients more independence and enabled them to better cope with their disability. In other words, they helped with the management of the disability, but they did not help repair the injury (Féron et al., 2005). Patients thus had very little hope of ever being able to walk again. That is, until neurosurgeons and neurologists focused their attention on stem cell-like therapies. After about a decade, we now have published papers detailing human trials of olfactory ensheathing cell transplantation actually improving both motor and sensory function (Tabakow et al., 2013; Lima et al., 2005). With incredible results like these reproduced in several different countries, walking again may eventually become a reality for patients with complete severing of the spinal nerves.
CSF (Cerebrospinal Fluid)
CNS (Central Nervous System)
OEC (Olfactory Ensheathing Cells)
ONF (Olfactory Nerve Fibroblasts)
Stem cell-like therapy itself seems like a logical treatment option for patients looking to regenerate nerve cells within the spinal cord, but why are olfactory ensheathing cells (OECs) specifically used? Firstly, these cells are the only part of the nervous system capable of lifelong regeneration, and they can develop into supporting cells or mature neurons (Lima et al., 2005).
Secondly, OECs won't activate the body's immune defences, as the CNS is already in constant contact with these cells: CSF bathes the olfactory mucosa via the olfactory route of CSF drainage (Lima et al., 2005; Féron et al., 2005). These cells are classified as stem-like progenitor cells, rather than stem cells, owing to their stem cell-like properties (Lima et al., 2005). There are two main methods of transplanting them. The first is an olfactory mucosa autograft in the spinal cord (Lima et al., 2005), and the second is OEC and olfactory nerve fibroblast (ONF) transplantation (see footnotes) (Féron et al., 2005; MacKay-Sim et al., 2008; Tabakow et al., 2013).
During olfactory mucosa autograft transplantation, the scar tissue surrounding the damaged area is removed to uncover the viable nervous tissue (Lima et al, 2005). During the scar tissue removal, vasoconstrictors are administered into the olfactory mucosa before the mucosa is harvested transnasally with an endoscope (Lima et al, 2005). Lastly, the olfactory mucosa is cut into small pieces and transplanted to close the gap between the two ends of the severed spinal cord (Lima et al, 2005).
During OEC/ONF transplantation, instead of grafting the olfactory mucosa in the spinal cord gap, OECs and ONFs are isolated from the olfactory mucosa and then injected via over one hundred microinjections (Tabakow et al, 2013). These are done at various injection sites in the damaged portion of the spinal cord with extremely complex technology designed for impeccable precision with the utmost sterility (Tabakow et al, 2013).
Both the injection of OECs/ONFs and autologous olfactory mucosa grafts have been found to be safe procedures, with risks similar to any major surgery (e.g. post-operative infection) (Lima et al., 2005). It must be noted that the studies published so far have focused more on assessing the safety of these procedures in humans than on the benefit that might be gained from the transplants (Féron et al., 2005; MacKay-Sim et al., 2008; Tabakow et al., 2013; Lima et al., 2005). That said, these studies have also reported the improvements seen in patients' sensory and motor function.
Darek Fidyka, a 38-year-old man who was stabbed in the back in 2010, received the OEC injections, and after just six months he was able to walk with the help of a frame (cbc.ca, 2014). Fidyka has been the most successful transplant patient by far, and his case has driven the recent publicity around this innovative treatment. However, other patients receiving the same OEC/ONF treatment did not show as significant an improvement (Tabakow et al., 2013). Most importantly, although Fidyka was classified as a patient with completely severed spinal nerves, a very small number of nerves remained connected at the time of his trauma, which greatly influenced his outcome after treatment (cbc.ca, 2014). As this is a very new treatment, the trials investigating it have published conflicting results (Ekberg et al., 2014).
With the information published so far, OEC/ONF transplantation seems to be a safer treatment option than olfactory mucosa autografts. A paper by Ekberg et al. found that around 15% of patients undergoing an olfactory mucosa autograft developed myelomalacia or syringomyelia (see footnotes), although there are far fewer studies addressing this method than OEC transplantation (Ekberg et al., 2014). Many patients have now received OEC transplantations, partly because papers have analysed the different types of injections available, which vary in the purity of the OECs/ONFs and in the types of cells transplanted from the nasal mucosa (Ekberg et al., 2014). Results published by Tabakow et al. stated that patients who received the OEC transplant showed regained continuity in some white matter tracts in the injured spinal cord segment and restored afferent and efferent long white matter tracts (Tabakow et al., 2013). Other papers have suggested that olfactory cell transplants may specifically promote axonal regeneration (Ekberg et al., 2014). These benefits, alongside intensive neurorehabilitation, led to "modest" functional improvement (Tabakow et al., 2013). However, compared with the very minimal improvement expected from neurorehabilitation alone, olfactory cell transplantation is the first treatment to produce such significant and positive results, especially in patients with complete severing of the spinal nerves.
Although Fidyka's case has perhaps led the public to believe that a cure for paraplegia in patients with completely severed spinal nerves has been found, this is still only the very beginning. Most studies investigating these treatments are, as previously mentioned, focused on safety, and accordingly have strict inclusion and exclusion criteria that leave fewer than 10 patients per study (Féron et al., 2005; MacKay-Sim et al., 2008; Tabakow et al., 2013; Lima et al., 2005). It will therefore be several years before OEC/ONF injections become a mainstream treatment. In the future, neurologists and neurosurgeons will focus on developing this treatment so that more patients can enter studies, leading to more concrete results being published. In addition, the role of neurorehabilitation must be closely examined to maximise patients' benefits and functional improvements, as rigorous physiotherapy after the treatment is vital (Tabakow et al., 2013). Therefore, even though the transplantation of olfactory ensheathing cells has been found to be safe and has produced some promising results so far (Féron et al., 2005; MacKay-Sim et al., 2008; Tabakow et al., 2013; Lima et al., 2005), much research and treatment refinement must be done before paraplegic patients with complete severing of the spinal nerves are able to walk again.
ONFs and OECs are the two main cell types found in the olfactory basement membrane (Tabakow et al, 2013)
Myelomalacia: the softening of the spinal cord primarily caused by bleeding (Ekberg et al, 2014)
Syringomyelia: a cyst, called a syrinx, forms in the spinal cord and elongates over time, leading to destruction of the spinal cord (Ekberg et al., 2014)
Angela Hu, third year, looks at some of the issues surrounding the new technology of egg freezing.
“As soon as I woke up in the recovery room, I no longer felt as though I were watching my window to have a baby close by the month. My future seemed full of possibility again.” – Sarah Elizabeth Richards, on why she chose to freeze her eggs at 36 years old (Richards, 2014).
Oocyte cryopreservation, or “egg freezing”, as it’s colloquially known, has been at the forefront of news lately, with Apple and Facebook announcing they would pay £12,000 towards the cost of egg freezing for their female employees, allowing them to focus on their careers (Barnett, 2014). And while egg freezing technology has been available for more than a decade, in the fall of 2012 the American Society for Reproductive Medicine officially removed its experimental label (Schubert, 2014), paving the way for more clinics to offer this service.
On the surface of it, egg freezing technology seems like a dream come true. To be able to preserve healthy eggs before our natural fertility starts declining, and to wait to use them until we have found a suitable partner and are ready to have children; it seems like the pinnacle of reproductive technology. Especially for women in medicine, a demanding field often requiring more than a decade of training to become a fully qualified consultant, there is a delicate line between waiting to have children and waiting too long, at which point pregnancy is no longer biologically viable. While egg freezing certainly is a field full of potential, there are still many ethical and physical implications that we must consider.