Is Subjective Cognitive Impairment a Harbinger of Dementia Risk?

Michael Gordon, MD, MSc, FRCPC

ABSTRACT: There has been a flurry of activity in the medical literature in the past few years around the possibility of early diagnosis of cognitive impairment. There is now a suggestion of stretching the diagnostic criteria of cognitive decline to include not just a preclinical state but a high-risk state, which may be based on a collection of positive biomarkers and, more recently, the results of a complex computer-generated cognitive test. In addition to the quest for earlier identification of those with objective evidence of mild cognitive impairment, a new category has been suggested—so-called subjective cognitive impairment—which, if accepted into clinical practice, may have a profound impact on the diagnostic approach to dementia, from the earliest clinical stages to a more nebulous stage that may or may not prove to be of value.


Our medical understanding of dementia has changed dramatically over the past few decades. Modern medical technologies have raised the hope that a greater understanding of the mechanisms of various types of dementia could spawn new interventions to alleviate the overwhelming clinical, social, demographic, and financial challenges faced by Western societies with increasingly aging populations.1-3 Despite the wealth of insight we have gained, there is still much to be learned. For the primary care physician who may see an increase in the number of older patients reporting nonspecific cognitive issues, the questions will be when and how to intervene.

New Challenges in Dementia Diagnosis and Treatments

It was not that long ago in the history of medicine that the Alzheimer’s type of dementia was merely deemed a natural consequence of aging. Many physicians in practice today may recall “senility” as the primary diagnostic label applied to older people with evidence of serious cognitive impairment. It was not until 1906 that German physician Alois Alzheimer identified in one of his patients a collection of brain cell abnormalities characterizing the disease that would later bear his name.4 The patient was a woman in her 50s who exhibited severe cognitive disorders pertaining to memory, language, and social interaction. After the patient passed away, Dr Alzheimer performed a brain autopsy and discovered dense deposits surrounding the nerve cells, which were later called neuritic plaques, and twisted bands of fibers inside the nerve cells, which were later called neurofibrillary tangles.5 Since this discovery, scientists have made great strides in understanding these 2 hallmark features of Alzheimer’s disease, and medications have been developed to help treat its cognitive symptoms.

We know that there is a spectrum of stages in dementia, from a preclinical stage to advanced dementia. Preclinical, or pre-symptomatic, Alzheimer’s disease is a newly defined stage of the disease in which biomarkers or radiological markers may indicate a high likelihood of dementia, but patients rarely present with any subjective or objective clinical symptoms. Mild cognitive impairment can be diagnosed based on subjective and objective evidence of cognitive lapses, but these lapses are not severe enough to interfere with the person’s normal functioning, as is required for a diagnosis of dementia due to Alzheimer’s disease. It is well known that some people with mild cognitive impairment can live for years without progressing to a more serious state of cognitive decline.6

However, we have learned that the mere ability to predict the likelihood of a person acquiring a certain disease, such as dementia, does not necessarily mean that he or she wants to know about that possibility. For example, in the medical literature on Huntington’s chorea, not every family member wants to know whether his or her genetic makeup will lead to the disease.7,8 Their rationale is often, “If there is nothing that can be done to alter the course of the disease, why is it worth knowing?” A physician’s answer is often, “To make proper plans or decide on what kind of family commitments to make.” Some patients who are told that they have a genetic susceptibility or other non-modifiable factors putting them at risk of dementia later in life may respond by planning for the worst and hoping for the best; for others, the ambiguity and uncertainty of living with risk may cause unnecessary stress or social stigma.

This situation presents a conundrum: What should physicians and other healthcare providers do with individuals who are high-functioning but claim “something is not right” with their memories, and who have gone through in-depth evaluations of cognitive status but do not have any evidence of cognitive impairment?

A Case Study

About 30 years ago, an older patient was referred to me with a degree of frustration and puzzlement on his part and on the part of his primary care physician. The patient was a professor of mathematics with a special area of expertise in an esoteric domain of the subject. He had experienced a small embolic stroke from previously undetected paroxysmal atrial fibrillation. He was treated with anticoagulants and made what was considered a complete recovery, with no obvious physical or sensory clinical findings. Magnetic resonance imaging (MRI) had only just begun to be introduced into Canadian healthcare facilities, so only a cranial computed tomography (CT) scan was performed. The CT scan showed a very small residual lacunar infarct from his stroke, but no neurological abnormalities were demonstrated on careful clinical examinations by several competent neurologists. Moreover, neither the then-standard cognitive tests, including the Mini-Mental State Examination, nor more in-depth neuropsychological testing demonstrated a discernible deficit in cognition.

The physicians who had seen him, as well as a psychiatrist, attributed his perceived deficits to a depressive response to his stroke, and he was prescribed a number of antidepressant medications, none of which had any impact on his perceived deficits. A year later, the patient had returned to work and appeared to be functioning normally, but he described himself as unable to return to his “sharp and sophisticated abilities and calculations and mathematical concepts.”

At the end of a long, staged assessment, he was still able to “pass” the usual investigations, including cranial CT and neuropsychological testing. The best I could tell him was that he was probably correct in his self-awareness that he “lost something” during the neurological event, but that contemporary methods of investigation were not precise or subtle enough to define it. I assured him that he was not crazy, anxious, or clinically depressed, and that he was not becoming a “worried well” neurotic person. I used the term silent lacunar stroke, but I did not follow him over the years, so I do not know his outcome. Looking back, I believe subjective cognitive impairment may have been at play.

Current Progression of Ideas About Dementia

Cases similar to this one continue to arise today, whether or not these patients have had a defined cerebral event. A recent New York Times article described the case of a patient, followed longitudinally, whose reported memory impairment could not be documented by any of the specialists he saw.3 “He insisted that things were changing, but he aced all of our tests,” said Rebecca Amariglio, PhD, a neuropsychologist at Brigham and Women’s Hospital in Boston. But about 7 years later, he began showing symptoms of dementia. Amariglio now believes the patient had recognized a cognitive change so subtle that “he was the only one who could identify it.” Patients like this have long been called “the worried well,” said Creighton Phelps, PhD, acting chief of the dementias of aging branch of the National Institute on Aging.3 “People would complain, and we didn’t really think it was very valid to take that into account.”

Now, scientists are finding that some people with these reports may in fact be detecting early harbingers of Alzheimer’s disease, and the medical community is starting to give these “worried well” greater credence. “The whole field now is moving to this area, and saying ‘Hey, maybe there is something to this, and maybe we should pay attention to these people,’” said Ronald C. Petersen, MD, PhD, chairman of the advisory panel to the federal government’s new National Alzheimer’s Project.3 Five studies presented at last year’s Alzheimer’s Association International Conference focused on subjective cognitive decline. In one of these studies, Amariglio and colleagues9 found a significant relationship between self-reported cognitive concerns and evidence of beta-amyloid, a protein implicated in Alzheimer’s disease progression, in more than 100 people who were otherwise clinically normal and had no history of neurological or psychiatric illness. Interestingly, the association was stronger in people with higher levels of education and occupational attainment, which may reflect a greater sensitivity to changes in their own cognitive prioritizing and organizing abilities, Amariglio noted.

Experts emphasize an important proviso to the hypothesis of subjective cognitive impairment: many people with such complaints will not develop dementia. Some memory decline, such as forgetting the names of unfamiliar people, is normal in aging, and some of these patients’ concerns may reflect heightened psychological angst associated with the “doom and gloom” metaphors of dementia spread by the media. Subjective cognitive impairment becomes a cause for concern in people who forget important details of recent events, get lost in familiar places, or cannot follow the plot of a television show.3

To Screen or Not to Screen?

At the heart of the public health interest and research focus on dementia in all its phases, especially this new concept of subjective cognitive impairment, is the question: What can one do to forestall the onset of the clinical dementia state? Experts do not recommend that clinicians routinely screen people for subjective cognitive decline, because much more research is needed and no effective treatment currently exists. As Richard Caselli, MD, a neurology professor at the Mayo Clinic in Arizona, said, “If we do a scan and say, ‘Hey, we found some amyloid in your brain,’ there’s really nothing you can do.”3 These findings may be useful for research, but the question is whether we should expose these individuals to such tests. Amariglio argues that identifying these patients with subjective cognitive impairment would be worthwhile for future clinical trials in Alzheimer’s prevention, but the findings have no clear clinical implications at this time.

Moreover, the ethical implications and eligibility for healthcare coverage need to be considered.2 In a letter to the editor of The New York Times, American geriatrician and ethicist Daniel Brauner, MD, wrote: “Left to its own devices, the dementia industry will have everyone either at risk or at an early stage or deep in the grips of dementia. The sad truth is that we all forget, and it becomes harder to lay down new memories as we get older. When or if this becomes pathological, especially when it is still mild, is a question that is highly dependent on cultural, social, economic and political factors. Except for feeding the burgeoning dementia research community and placating those who feel better having a diagnosis, there seems little advantage at this time to pursuing an early diagnosis of a disease that is still without decent therapy when the patient is only destined to get worse.”10

Born out of this so-called “dementia industry” is a trend of brain/mind games, or “mind gyms.” In a recent article in The New Yorker, the author describes her journey—rather, an odyssey—through a wide range of mind-gym organizations and products that purport to enhance brain fitness and resilience and protect against age-related cognitive decline.11 The author followed one program according to its instructions and concluded at the end of her experience, “I’m not sure I noticed my newfound cognitive abilities in everyday life. It’s hard to be both scientist and lab rat. On the positive side, I am slightly less troubled about the size of my hippocampus. On the negative side, why did I sprinkle Nutra-Sweet® on my broiled salmon last night?”

In the same vein, a recent article in Scientific American reviewed the current evidence on the purported benefits of brain games on function and cognition and concluded the following: “One thing remains clear: there is no serious harm to brain training other than the effect on your wallet (and the risk of some egg on your face if your 7-year old can play them better). And evidence is accumulating that they not only improve the skills they are designed to help, but likely generalize to other cognitive abilities and have some long-lasting benefits.”12

Another confounding factor is that the prevalence of cognitive impairment and dementia increases sharply with age.13 This means that if enough people experience subjective cognitive impairment, a percentage of them will develop objective cognitive impairment whether or not they sensed something years before. Perhaps all of those who develop the condition were aware of a prodromal period only in retrospect, but it would be very difficult to identify those who are aware of some impairment of function and to determine definitively their likelihood of future cognitive problems.

A recent study suggests a new collection of neuro-biomarkers that might be more predictive, but the report was the first of its kind and remains to be replicated before it can be determined what benefit, if any, it provides to those who sense a problem or, more often, are part of the “worried well.”14 Biological markers and imaging might prove to be of value as part of the total assessment of those with subjective cognitive impairment, but until there is something meaningful one can do about it, other than lifestyle modifications or brain games that one might undertake in any event, being labeled with the condition may be a bit like living under the Sword of Damocles,15 and the negative impact on work and insurability could be substantial and unpredictable.

In the future, the Subjective Cognitive Decline Initiative workgroup, which formed in November 2012 and coined the term subjective cognitive decline, will work toward developing standardized subjective cognitive tests so that when treatments become available, physicians can eventually use them as “cheap and noninvasive” ways to help identify people at greater risk, explains Petersen in The New York Times. “We can’t do MRIs and scanning on everyone,” he says. The goal will be to develop a test that can identify which subjective concerns are cause for concern, since not all are.3

Take-Home Message

Are some individuals more sensitive than others to loss of cognitive sharpness years before the possible development of clinical cognitive impairment? This question might be answerable over time with greater accuracy than at present. Other than for the purposes of research, it is not clear whether providing individuals with such knowledge offers any great benefit to their current and future healthcare status, preventive activities, or future life planning. All of these actions, like taking universal precautions to avoid infection, eating a healthy diet, and exercising, should become part of the normal fabric of life. On one hand, perhaps there are individuals who, without the push and fear of future impairment, might otherwise put off important planning processes with their families or at work, or might neglect preventive activities. On the other hand, the cognitive Sword of Damocles might be a significant burden to carry for the rest of one’s life when there are so many issues about one’s later years that have to be considered and incorporated into life-long planning processes. ■


Michael Gordon, MD, MSc, FRCPC, is a geriatrician at Baycrest Health Sciences, medical program director of the palliative care program, co-head of the clinical ethics program, and professor of medicine at the University of Toronto, Canada.


1. Gordon M. Identification of potential or preclinical cognitive impairment and the implications of sophisticated screening with biomarkers and cognitive testing: does it really matter? Biomed Res Int. 2013. doi:10.1155/2013/976130

2. Gauthier S, Patterson C, Gordon M, Soucy JP, Schubert F, Leuzy A. Commentary on ‘Recommendations from the National Institute on Aging Alzheimer’s Association workgroups on diagnostic guidelines for Alzheimer’s disease.’ A Canadian perspective. Alzheimers Dement. 2011;7(3):330-332.

3. Belluck P. Dementia’s signs may come early. New York Times. July 17, 2013:A1. Accessed February 27, 2014.

4. A history of Alzheimer’s disease. BrightFocus Foundation website. Updated August 21, 2013. Accessed February 27, 2014.

5. Reger B. Alzheimer’s disease: a brief history and avenues for current research. Journal of Young Investigators. 2002;2. Accessed February 27, 2014.

6. Eckerström C, Olsson E, Bjerke M, et al. A combination of neuropsychological, neuroimaging, and cerebrospinal fluid markers predicts conversion from mild cognitive impairment to dementia. J Alzheimers Dis. 2013;36(3):421-431.

7. Huntington’s disease: hope through research. National Institute of Neurological Disorders and Stroke website. Updated December 5, 2013. Accessed February 27, 2014.

8. Schols L, Marckmann G. Quality of life and prenatal decisions. Virtual Mentor. 2005;7(2). Accessed February 27, 2014.

9. Subjective cognitive decline may be the earliest clinical indicator of Alzheimer’s disease [news release]. Alzheimer’s Association. Published July 17, 2013. Accessed February 27, 2014.

10. Brauner DJ. A dementia omen, or normal aging? New York Times. July 18, 2013:A18. Accessed February 27, 2014.

11. Marx P. Mentally fit: workouts at the brain gym. New Yorker. July 29, 2013:24. Accessed February 27, 2014.

12. Doraiswamy PM, Agronin ME. Brain games: do they really work? Scientific American. Published April 28, 2009. Accessed February 28, 2014.

13. Unverzagt FW, Gao S, Baiyewu O, et al. Prevalence of cognitive impairment: data from the Indianapolis Study of Health and Aging. Neurology. 2001;57(9):1655-1662.

14. Mapstone M, Cheema AK, Fiandaca MS, et al. Plasma phospholipids identify antecedent memory impairment in older adults. Nat Med. Published online March 9, 2014. Accessed March 18, 2014.

15. Sword of Damocles. Merriam-Webster website. Accessed February 27, 2014.