April 2018





Medicine is getting personal

In the future, progress in the field of genetics will allow more targeted and accurate prescribing



That’s right, your genes. The medication you take in 10 years’ time may not look different from what you are taking today, but it will be tailored to your genes. Many doctors expect that genetics will soon pervade medical treatment. The study of how genes affect a person’s response to drugs is called pharmacogenomics. This new field combines pharmacology (the science of drugs) and genomics (the study of genes and their functions) to develop effective, safe medications and doses tailored to a person’s genetic makeup. This is the essence of personalised medicine.

Many drugs that are currently available are “one size fits all”, but they may not work the same way for everyone. Your doctor figures out what drug and how much to give you based on your age, weight and general health factors. However, depending on your genetic makeup, some drugs may work more or less effectively for you than they do in other people. Similarly, some drugs may produce more or fewer negative side effects (called adverse drug reactions) in you than in someone else. Adverse drug reactions are a significant cause of hospitalisations and deaths.

Pharmacogenomics may also help save you time and money. By using information about your genetic makeup, doctors may be able to avoid the trial-and-error approach of prescribing medications. The “best-fit” drug for you can be chosen from the beginning.

Imagine this scenario…

Your doctor orders a pharmacogenomic test to decide how best to treat your high cholesterol. He might find a variant indicating that a commonly used drug would have little effect on reducing your cholesterol levels while increasing your risk of a heart attack or stroke. Armed with this genomic information, he would steer clear of that commonly used drug and instead prescribe another that lowers your cholesterol without those negative side effects.

This is not a far-fetched scenario. Pharmacogenomic information is already being used today for a few health conditions. Take, for instance, abacavir, a commonly prescribed drug therapy for HIV, the virus that causes Aids. When first used, some patients developed severe rashes, fatigue, and diarrhoea – symptoms of a possible immune system reaction. Scientists looked at genomic variants associated with the immune system and finally identified one – called HLA-B*5701 – that causes the overreaction. Now doctors routinely test for the variant to find out who should avoid the drug.

Trastuzumab is used to treat metastatic (spread) breast cancer. It is effective against tumours that overexpress the HER2 gene. If a patient’s breast cancer does not have high levels of the HER2 protein, trastuzumab will not be effective. The drug is expensive and has known side effects on the heart, so using pharmacogenomic testing to select the patients most likely to benefit from trastuzumab is important.

Genetic testing

The US Food and Drug Administration (FDA) now recommends genetic testing before giving the chemotherapy drug mercaptopurine to patients with acute lymphoblastic leukaemia. Some people have a genetic variant that interferes with their ability to process the drug. This processing problem can cause severe side effects and increase the risk of infections, unless the standard dose is adjusted according to the patient’s genetic makeup.

Given the field’s rapid growth, pharmacogenomics is soon expected to lead to better ways of using drugs to manage heart disease, cancer, asthma, depression and many other common diseases. A useful illustration in the field of heart disease is the common blood-thinning drug warfarin. Most commonly prescribed for patients with atrial fibrillation, it is also used to prevent clotting events in patients with mechanical heart valves or deep vein thrombosis. It has excellent efficacy but a narrow therapeutic window that requires fine titration of dosing. Underdosing puts the patient at risk of clotting complications, while overdosing results in bleeding events. Variation in the CYP2C9 gene can reduce the breakdown of warfarin in the body, putting patients at risk of bleeding during treatment.

In 2007, the FDA revised the label on warfarin to explain that a person’s genetic makeup might influence response to the drug.

The FDA is also considering genetic testing for another blood-thinner, clopidogrel, which is used to reduce the rate of secondary heart attacks and strokes after a first event. Approximately 27 per cent of the population carry a variant of the CYP2C19 gene that makes the drug ineffective, putting them at risk of a second heart attack or stroke while on medication. If this information is available to the doctor, an alternative blood-thinner can be considered from the outset, reducing that risk.

Pharmacogenomics may also help to quickly identify the best drugs to treat people with certain psychiatric disorders. For example, while some patients with depression respond to the first drug they are given, many do not, and doctors have to try another drug. Because each drug takes weeks to reach its full effect, a patient’s depression may grow worse during the time spent searching for a drug that helps.

Atomoxetine is a medication used to treat attention deficit hyperactivity disorder (ADHD). Patients with certain variants of the CYP2D6 gene have been found to metabolise the drug poorly, putting them at risk of adverse drug reactions – including increased suicidal tendency – from overly high concentrations of the drug in the body.

The field of pain medication is another area where pharmacogenomic testing has potential usefulness. Codeine is a commonly prescribed medication for pain control. It is largely a prodrug, and its activity is primarily dependent on its conversion to morphine. Patients who have little activity of CYP2D6, the enzyme that converts codeine to morphine, are likely to have little response to codeine. The number of people with low CYP2C6 activity is substantial, making the drug ineffective for their pain control. The more dangerous situation, however, occurs when ultrarapid metabolisers take codeine. They may develop severe adverse effects from excessive morphine concentrations in the blood. In one tragic case, a healthy breast-feeding newborn infant developed fatal morphine toxicity; his mother was an ultrarapid metaboliser who was taking codeine, and her milk contained toxic amounts of morphine. Pharmacogenomic testing could potentially guide the dosing of codeine or a switch to a more appropriate alternative.
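In essence, the genotype-guided prescribing described above boils down to a lookup from metaboliser phenotype to a prescribing action. The following is a minimal sketch in Python; the phenotype names and recommendation wording are hypothetical illustrations paraphrasing the codeine scenarios in this article, not an official clinical guideline:

```python
# Hypothetical sketch: mapping a patient's CYP2D6 metaboliser phenotype
# to a codeine prescribing note, paraphrasing the scenarios above.
CODEINE_GUIDANCE = {
    "poor metaboliser": "Avoid codeine: little conversion to morphine, poor pain relief.",
    "normal metaboliser": "Standard codeine dosing is expected to work.",
    "ultrarapid metaboliser": "Avoid codeine: risk of toxic morphine concentrations.",
}

def codeine_recommendation(phenotype: str) -> str:
    """Return the prescribing note for a CYP2D6 phenotype, or flag unknowns."""
    return CODEINE_GUIDANCE.get(phenotype.lower(), "Phenotype unknown: consider testing.")

print(codeine_recommendation("Ultrarapid Metaboliser"))
```

A real implementation would draw its recommendations from curated guideline databases such as those maintained by CPIC or the DPWG, discussed below, rather than a hand-written table.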

With the complete mapping of the human genome by the Human Genome Project (HGP) in 2003, anticipation was high that genetic information would radically improve medicine – that side effects would become more predictable and that patients could be screened for likely drug responses. Thus far, progress has been much slower than the initial excitement anticipated.

Gradual progress

One barrier to widespread clinical implementation of pharmacogenomic testing is the lack of clear, curated, peer-reviewed guidelines that translate laboratory test results into actionable prescribing decisions for specific drugs. Several international consortia have formed to develop such guidelines, of which the Clinical Pharmacogenetics Implementation Consortium (CPIC) and the Dutch Pharmacogenetics Working Group (DPWG) are the most widely recognised. To date, these consortia have evaluated over 100 gene-drug interactions.

The FDA also recognises that pharmacogenomics can play an important role in identifying medication responders and non-responders. It mandates that pharmacogenomic information appear on the labels of more than a hundred medications currently on the market. Before the Human Genome Project started, only four drugs carried such a label.

As DNA sequencing costs continue to decline and our knowledge increases, tailoring drugs to your genomic profile will become more common in medical practice. The eventual goal is to have pre-emptive broad-based pharmacogenomic testing for everyone early in life. The records will be made available through individual electronic medical records accessible by whichever doctor you choose to see, guiding therapy.

That will be when doctors can finally offer with certainty the right dose of the right drug to the right patient. I will be looking forward to that day.


Blood pressure drug in a gel heals wounds faster


A topical gel made from a common type of blood pressure pill may offer a way to speed up healing of chronic skin wounds.

The findings in a study with mice and pigs may lead to use of the gel on treatment-resistant skin wounds among diabetics and others, particularly older adults.

“The FDA has not issued any new drug approval for wound healing in the past 10 years,” says Peter Abadir, associate professor of medicine at Johns Hopkins University and first author of the paper in the Journal of Investigative Dermatology.

“Using medicines that have been available for more than two decades, we think we have shown that this class of medicines holds great promise in effectively healing chronic wounds that are prevalent in diabetic and aged patients.”

Chronic wounds, defined as skin injuries that fail to heal in a timely manner and increase the risk of infection and tissue breakdown, accounted for more than 100 million visits to US hospitals in 2008, Abadir says.

In recent years, scientists have been studying the skin’s renin-angiotensin system (RAS), which is involved in inflammatory response, collagen deposition, and signaling necessary for healing of skin wounds. Studies show that the RAS is abnormal in diabetic and older adults.

Closing the wounds

Researchers experimented with gel formulations of angiotensin II receptor antagonists, or blockers, a class of drugs prescribed to treat high blood pressure that includes losartan and valsartan. The drugs block the RAS and increase wound blood flow, and the goal was to apply the gels directly to wounds to promote faster healing.

Working with mice, Abadir and colleagues showed that a valsartan gel was more effective in accelerating wound healing than losartan. Overall, a formulation with 1 percent valsartan had the greatest impact on total wound closure. Half of all mice that received 1 percent valsartan achieved complete wound healing, compared with only 10 percent of the mice given a placebo.

The researchers then tested 1 percent valsartan gel’s effects on wounds in aged diabetic pigs. Compared to mouse skin, pig skin is more similar to human skin.

Compared with a placebo group, pig wounds treated with 1 percent valsartan healed much more quickly. All treated wounds were closed by day 50, while none of the placebo-treated wounds were, the researchers say.

Blood tests suggest that the drug in gel form and applied to the skin acts locally on the tissues where it’s absorbed, rather than spreading into the bloodstream and potentially affecting the entire body. That’s important to avoid side effects, researchers say.

The scientists also wanted to determine not just the speed but also the quality of 1 percent valsartan’s biological effects on wound repair. They examined collagen content and tensile strength in the pigs’ skin. Pigs treated with valsartan had a thicker epidermal layer (the outermost layer of the skin) and dermal collagen layer, as well as a more organized collagen fiber arrangement. All those findings indicate 1 percent valsartan application leads to stronger healing skin, Abadir says.

“Our strategy for specifically targeting the biology that underlies chronic wounds in diabetics and older adults differs greatly from other approaches to wound care thus far,” says senior author Jeremy Walston, a professor of medicine. “The topical gel likely enables a cascade of positive biological effects that facilitates and accelerates chronic wound healing.”

Wrinkles, too?

Scientists plan to begin testing in humans and hope the gel medication will become “available for public use in a few years, if further research bears out our results,” Walston says. The team believes the medication may one day treat scars, wrinkles, and other skin problems.


Twenty-nine million Americans have diabetes and 1.7 million are diagnosed each year. About 900,000 will develop diabetic foot ulcers annually. With an aging population and incidence of diabetes increasing rapidly across the globe, Abadir estimates the total number of diabetic foot ulcers to be more than 20 million a year, with an estimated cost of $25 billion annually in the United States alone.

Other authors of the study are from Johns Hopkins and Vanderbilt University. The National Institutes of Health, a Nathan Shock in Aging Scholarship Award, the Wound Healing Society Foundation, and a Maryland Technology Development Grant funded the work.

Abadir, Walston, and a colleague have filed an international patent application involving wound-healing topical RAS blocker treatment. After this study, the intellectual property was licensed to Gemstone Biotherapeutics LLC, a Maryland start-up company.

Source: Johns Hopkins University

Microbiopsy device samples skin cells for cancer without scarring

Skin cancer is the most common form of the disease, and while it might make its presence known in the form of moles or strange spots, it usually takes a biopsy to confirm whether the mark is malignant or not. Now, an Australian team has developed a microbiopsy device that’s far less invasive, basically painless, and won’t leave a scar.

A biopsy is a fairly routine procedure, but collecting large enough samples of skin cells can be unpleasantly invasive. The process can take several forms: Doctors might shave a thin layer off the top of a suspicious section of skin, use a hole punch-like device to remove a circular section or cut a larger segment out using a scalpel. Whichever method is used, the resulting wound can be a few millimeters wide and up to 5 mm deep, which may require stitches and usually leaves a small scar. That’s not ideal for highly-visible parts of the body, like the face.

With that in mind, Tarl Prow, currently a Research Professor at the University of South Australia’s Future Industries Institute, set out to design a device that could take skin samples less invasively. The microbiopsy needle is half a millimeter wide, and the piece of skin that it takes is about 0.15 mm wide and 0.4 mm deep. By comparison, Prow says the finger prick needles used by diabetics go 3 to 4 mm deep.

In tests so far, that tiny puncture mark has been found to completely heal within a week, and leave no scar whatsoever. The procedure is less painful than a regular biopsy too, so local anesthesia could be reduced or done away with entirely and it can be performed on kids. The microbiopsy device also makes the procedure faster, allowing more of the tests to be run and the health of a patient monitored better over time.

“Many of us, when we reach a certain age, have a lot of these pink spots on sun-exposed areas, and you just can’t go in and biopsy all of those with conventional techniques,” Prow tells New Atlas. “So the idea with the microbiopsy is we can go in on the face, on the head, where you don’t want to have a surgical procedure, take a small sample and see whether or not it’s malignant.”


With each prick of the microbiopsy, the device collects about 200 skin cells, which is enough to identify certain biomarkers for common types of skin cancers. While it can pick up the presence of basal cell and squamous cell carcinomas, the team says that melanomas are a different story, since their biomarkers are harder to pin down.

Although it will primarily be used to detect skin cancers, the microbiopsy device could be – and has been – used for other diseases as well. Prow says that colleagues at the Technical University of Munich are conducting a trial of the devices to identify rashes in infants, a team from Hebrew University has tested for diseases in developing African countries, and researchers in Brazil are currently using it to look for parasite infections in dogs.

A researcher uses the microbiopsy device on a field test in Africa

Manufacturing of the microbiopsy devices will be handled by Trajan Scientific and Medical, and Prow and his team are currently preparing for a large clinical trial that may start as soon as next year. After that, the team hopes to have an approved diagnostic test available around 2023.

“For us, the challenge is really to scale up the manufacture, and develop the kind of pathology kits that we need to support different diseases, starting with skin cancer,” Prow tells us.

Sources: University of South Australia, US Patent Office

Chronic Fatigue Syndrome Might Have a Crucial Hormonal Link

Finally, some answers.


21 MAR 2018

For years, a line has divided millions of patients from their doctors, separating those who experience the debilitating effects of chronic fatigue syndrome from a medical establishment that has traditionally refused to acknowledge or agree upon the condition.

Now, finally, that barrier is beginning to crumble.

In recent times, a series of studies has identified evidence of biological mechanisms that could contribute to the disorder – and now new research from the Netherlands is being hailed as an important advance in our understanding of the illness.

Researchers at the University Medical Centre Groningen have discovered a link between chronic fatigue syndrome (CFS) – aka myalgic encephalomyelitis (ME) – and lower thyroid hormone levels.

If the findings can be confirmed by additional research, it could be a first step toward finding a treatment for this maddening, mysterious disease.

Part of the problem with exploring what’s behind CFS is recognising it in the first place.

Often, it’s diagnosed by ruling out any other underlying medical conditions, using a process of deduction to eliminate viral, bacterial, and other medical explanations that we have established tests for.

The condition, which has no definitively known cause, is marked by long-term fatigue, post-exertional malaise, sleep problems, difficulty in thinking clearly, and a host of other varied physical symptoms, characterised by overall discomfort, aches, and pains (sometimes extreme).

The severity and prevalence of the condition – estimated to affect more than 1 million Americans and 2.6 percent of the global population – have made CFS one of the world’s most controversial medical disorders, with patients and researchers lamenting the inadequacy of our existing understanding and ‘treatments’ of the disease.

The new study, led by biochemist Begoña Ruiz-Núñez, compared thyroid function and markers of inflammation between 98 CFS patients and 99 healthy control participants.

What they found was that CFS patients had lower serum levels of two key thyroid hormones – called triiodothyronine (T3) and thyroxine (T4) – but normal levels of a thyroid-stimulating hormone that’s usually present at higher levels in hypothyroidism – the better understood condition that also displays low thyroid hormone production.

To the extent they can characterise it so far, the researchers hypothesise that CFS is caused by low activity of thyroid hormones in the absence of thyroidal disease, since the patients in the study had regular amounts of the thyroid-stimulating hormone, called thyrotropin.

In addition, the CFS patients demonstrated low-grade inflammation generally, plus higher levels of another thyroid hormone called “reverse T3” (rT3), which is thought to contribute to the overall reduction in T3 hormones.

“One of the key elements of our study is that our observations persisted in the face of two sensitivity analyses to check the strength of the association between CFS and thyroid parameters and low-grade inflammation,” says Ruiz-Núñez.

“This strengthens our test results considerably.”

While we don’t yet understand how these altered hormone levels are related to the myriad symptoms of CFS, isolating this imbalance in the thyroid could be a major step forward in learning more about what’s triggering this strange illness – bringing us hopefully closer to more targeted trials, and one day, treatment.

“This new research into thyroid gland hormones in ME/CFS represents an important advance in our understanding of hormonal abnormalities in this illness,” explains physician Charles Shepherd, a medical adviser to the UK’s ME Association, who wasn’t involved in the study.

“If these findings can be replicated by other independent research groups, it suggests that the cautious use of thyroid hormone treatment needs to be assessed in a clinical trial – as it could be an effective form of treatment for at least a subgroup of people with ME/CFS.”

The findings are reported in Frontiers in Endocrinology.


Doctors Have Restored The Sight of Two People in a Monumental World First

This could end one of the most common forms of blindness.

21 MAR 2018

British physicians have successfully used stem cells to repair the degenerating tissue at the back of two patients’ eyes in a world first, effectively reversing their diminishing eyesight.

It’s now hoped that an affordable form of the therapy could be made available in the UK within the next five years, opening the way for more than half a million people in the UK and millions more around the globe to have their impaired vision restored.

The two elderly patients reported on in a recently published case study suffered from a condition called macular degeneration – an age-related condition of the retina that is responsible for roughly half of all cases of blindness.

In simple terms, the disease involves a breakdown of the layer of cells behind the light-sensitive rods and cones forming the eye’s retina.

This layer of tissue, called the retinal pigment epithelium, helps transport nutrients into the retina’s outer layer and remove waste; its loss leads to a build-up of materials that slowly kill the surrounding cells.

As time passes, this steady degeneration can gradually widen into a blind spot that interferes with a person’s vision.

The fundamental causes of the cell layer’s failure aren’t all that clear, but the risk of developing the condition increases significantly over the age of 50.

Although small, the blind spots fall on a tiny zone called the macula – an area of tissue that captures most of the detail of whatever it is we’re focussing on.

That rules out reading, watching television, or even recognising faces.

For 86-year-old Douglas Waters, one of the therapy’s recipients, the condition meant losing half of his field of sight.

“In the months before the operation my sight was really poor and I couldn’t see anything out of my right eye,” Waters told BBC health and science correspondent, James Gallagher.

Treatments for the most severe forms of macular degeneration exist, but can involve frequent injections into the eye, which we can all agree isn’t exactly a pleasant concept.

Waters was one of two patients to undergo surgery a year ago, in which a patch of specially engineered embryonic stem cells – just 40 microns thick and 4 by 6 millimetres wide – was inserted into the retina.

These cells were not only grown to replicate the diverse cells in the retinal pigment epithelium, they were coated with a synthetic compound that helped them stick in place.

A 12-month follow-up case study on the patients’ progress reported that both showed significant improvements.

While the transplanted cells weren’t a perfect replacement, with some small signs of rejection causing an uneven spread of cells, they appeared to remain relatively healthy.

Both patients also reported improvements in their vision, which is really where it matters.

“It’s brilliant what the team have done and I feel so lucky to have been given my sight back,” Waters said to the BBC, claiming he could now read the newspaper.

Further monitoring against rejection and cancerous changes in the cells will also ensure the procedure is as safe and effective as possible.

The research team has permission in this stage of clinical trials to test the procedure on a further eight recipients.

If all continues to go well, the procedure could soon be made more widely available.

“We hope this will lead to an affordable ‘off-the-shelf’ therapy that could be made available to NHS patients within the next five years,” ophthalmologist Pete Coffey from the University College London’s Institute of Ophthalmology told the BBC.

Time will tell what this means for the some 100 million people worldwide facing a future with age-related macular degeneration.

Another promising therapy tested last year used an engineered virus injected into the eye to slow and even reverse the effects of the condition, but it appeared to be impeded in some patients by their immune response.

While the procedure is more invasive, stem cells could be the way to go.

We can only hope that Doug Waters is the first of many to get a new lease on vision.

This research was published in Nature Biotechnology.


The enemy within: Gut bacteria drive autoimmune disease

March 8, 2018
Yale University
Orange dots represent the gut bacterium E. gallinarum in liver tissue.
Credit: Image courtesy of Yale University

Bacteria found in the small intestines of mice and humans can travel to other organs and trigger an autoimmune response, according to a new Yale study. The researchers also found that the autoimmune reaction can be suppressed with an antibiotic or vaccine designed to target the bacteria, they said.

The findings, published in Science, suggest promising new approaches for treating chronic autoimmune conditions, including systemic lupus and autoimmune liver disease, the researchers said.

Gut bacteria have been linked to a range of diseases, including autoimmune conditions characterized by immune system attack of healthy tissue. To shed light on this link, a Yale research team focused on Enterococcus gallinarum, a bacterium they discovered is able to spontaneously “translocate” outside of the gut to lymph nodes, the liver, and spleen.

In models of genetically susceptible mice, the researchers observed that in tissues outside the gut, E. gallinarum initiated the production of auto-antibodies and inflammation — hallmarks of the autoimmune response. They confirmed the same mechanism of inflammation in cultured liver cells of healthy people, and the presence of this bacterium in livers of patients with autoimmune disease.

Through further experiments, the research team found that they could suppress autoimmunity in mice with an antibiotic or a vaccine aimed at E. gallinarum. With either approach, the researchers were able to suppress growth of the bacterium in the tissues and blunt its effects on the immune system.

“When we blocked the pathway leading to inflammation, we could reverse the effect of this bug on autoimmunity,” said senior author Martin Kriegel, M.D.

“The vaccine against E. gallinarum was a specific approach, as vaccinations against other bacteria we investigated did not prevent mortality and autoimmunity,” he noted. The vaccine was delivered through injection in muscle to avoid targeting other bacteria that reside in the gut.

While Kriegel and his colleagues plan further research on E. gallinarum and its mechanisms, the findings have relevance for systemic lupus and autoimmune liver disease, they said.

“Treatment with an antibiotic and other approaches such as vaccination are promising ways to improve the lives of patients with autoimmune disease,” he said.

Story Source:

Materials provided by Yale University. Original written by Ziba Kashef.

COMPASS: Prognosis ‘Dire’ in PAD With Major Limb Events

March 16, 2018

A new analysis from the peripheral arterial disease (PAD) subgroup of the large-scale COMPASS trial shows that development of a major adverse limb event — the primary endpoint of the study — is associated with a dire longer-term prognosis, including a 7-fold increased risk for hospitalization, a roughly 200-fold increased risk for amputation, and a 3-fold increased risk for death over the following year.

“Our results highlight the utmost importance of preventing major adverse limb events in patients with PAD,” COMPASS investigator, Sonia Anand, MD, Population Health Research Institute, Hamilton, Ontario, Canada, told theheart.org | Medscape Cardiology.

The main results of the PAD subgroup in the COMPASS trial, reported last year, showed a 46% reduction in major adverse limb events (defined as severe limb ischemia necessitating hospitalization for an intervention such as bypass, angioplasty, amputation, or thrombolysis) in patients receiving low-dose rivaroxaban (2.5 mg twice daily) plus aspirin compared with those receiving aspirin alone.

In the current analysis, the COMPASS researchers looked at the longer-term prognosis of the 128 patients who developed a major adverse limb event in the trial, the baseline factors that predicted the development of a major adverse limb event, and the impact of the low-dose rivaroxaban and aspirin combination on long-term outcomes.

This new analysis was presented by Anand earlier this week at the American College of Cardiology (ACC) 2018 Annual Scientific Session. It was also simultaneously published online in the Journal of the American College of Cardiology.

Anand reported that the independent predictors of a major adverse limb event included severe ischemia symptoms at baseline (Fontaine classification 3 or 4), prior limb or foot amputation at baseline, and history of peripheral revascularization surgery or angioplasty.

Over the next year, the 128 (2%) patients who experienced a major adverse limb event in the trial had a 95.4% risk for a subsequent hospitalization, a 22.9% risk for a vascular amputation, and an 8.7% risk for death.

Compared with patients with PAD who had not experienced a major adverse limb event, those who did had hazard ratios for subsequent hospitalization of 7.21, for vascular amputations of 197.5, and for death of 3.23.

In terms of treatment, compared with those receiving aspirin alone, patients who had been randomly assigned to low-dose rivaroxaban plus aspirin had a 58% reduction in total amputations, a 67% reduction in major amputations, and a 24% reduction in vascular interventions.

“The combination of rivaroxaban 2.5 mg twice daily and aspirin significantly lowers the incidence of major adverse limb events and their related complications and should be considered as an important therapy for patients with PAD,” Anand concluded.

She added: “We did not know much about the prognosis for these patients suffering a major adverse limb event before. These results show that having such an event is a very important marker of adverse prognosis and anything that reduces these events is likely to have a major effect on longer-term outcomes.”

Experts in the field asked to comment for theheart.org | Medscape Cardiology were encouraged by these latest results.

Leslie Cooper, MD, Mayo Clinic, Jacksonville, Florida, pointed out that although only a small minority of patients with PAD and claudication progress to critical limb ischemia, those who do have a high rate of amputation at 1 year.

“The combination of rivaroxaban 2.5 mg twice daily and aspirin is a major new advance in the management of PAD because it demonstrates that inhibiting a new pathway, factor Xa inhibition, can impact clinically meaningful outcomes with acceptable risk in this population,” Cooper said. “As with any substudy, caution is required in interpretation, but the large sample size and rigorous methodology of the parent trial add credibility to these data.”

Lars Wallentin, MD, PhD, Uppsala Clinical Research Center, Sweden, added that these new data further emphasize the importance of PAD as a risk factor for both limb events and cardiac events in patients with vascular disease.

“The new results further strengthen the previously published findings that the combination of rivaroxaban at a very low dose of 2.5 mg per day in combination with low-dose aspirin provides a clinically very meaningful protection against both new limb events and coronary events,” Wallentin said. “As the treatment is associated with only a modest increase in major bleeding, and no difference in severe bleeding, this treatment strategy is an appealing alternative for many patients with PAD.”

Joseph Ladapo, MD, PhD, associate professor of medicine at David Geffen School of Medicine at the University of California Los Angeles and a health policy researcher, called this an “important study because it highlights a condition — PAD with limb complications — that can cause a tremendous amount of suffering for patients.”

“Physicians know that outcomes are worse for patients with limb ischemia, but these authors quantify their risk in a large cohort, which I have not seen before,” Ladapo said. The hazard ratio for subsequent amputation is about 200 among patients with a prior major adverse limb event, he noted.

“We all knew it was a risk factor, but it’s beneficial to see how much of a profound risk factor it is,” he said. “The other major takeaway is that aspirin plus rivaroxaban is effective, but we have shown previously that patients with PAD are undertreated. There are evidence-based therapies that improve outcomes in these patients, but they’re just not being prescribed.”

Despite describing the reduction in major adverse limb events with rivaroxaban plus aspirin as “eye popping,” Ladapo said that translating this into a reduction in outcomes in the real world would depend on addressing the inertia that physicians currently demonstrate.

“There is slow adoption of evidence-based therapy in this population,” he stressed.

The COMPASS trial was funded by Bayer. Anand has received honoraria and consulting fees from Bayer and Novartis.

American College of Cardiology (ACC) 2018 Annual Scientific Session. Presentation 407-16. Presented March 11, 2018.

J Am Coll Cardiol. Published online March 11, 2018.


High Cholesterol Tied to Lower Cognitive Decline Risk in Oldest Old

Batya Swift Yasgur, MA, LSW

March 14, 2018

Elevated cholesterol levels in individuals older than 85 years have been linked to a reduced risk for marked cognitive decline, compared with persons 10 years younger whose cholesterol levels were similarly elevated, new research shows.

Investigators found that cognitively intact people between the ages of 85 and 94 whose total cholesterol had increased from midlife had a 32% reduced risk for marked cognitive decline during the next decade, compared with individuals aged 75 to 84, who had a 50% increased risk.

“These findings do not imply that the cholesterol itself had a protective effect or that increasing cholesterol consumption will confer a benefit on people of this age,” lead author Jeremy Silverman, PhD, professor of psychiatry, Icahn School of Medicine at Mount Sinai, New York City, told Medscape Medical News.

Instead, “cholesterol may be a marker for some other protective factor present in these people who are making it to age 85 and maintaining their good cognition at that age,” he said.

The study was published online March 5 in Alzheimer’s & Dementia.

Successful Cognitive Aging

Elevated cholesterol levels in midlife have been associated with cognitive decline, dementia, and Alzheimer’s disease (AD), but in most studies, the mean outcome age at the time of cognitive assessment in follow-up is the mid-70s, the authors note.

Longitudinal studies of adults with older outcome ages have yielded “inconsistent” findings, they write.

Additionally, studies of associations between cholesterol and negative cognitive outcomes have focused primarily on differences with respect to midlife vs late-life cholesterol measurement, rather than outcome age.

To investigate the relationship between cholesterol levels and cognitive function in people of very advanced age, the researchers used data from the Framingham Heart Study, which “provides extensive cholesterol measure and cognition information, enabling survival analyses that include changes in association by outcome age,” the authors note.

Although an earlier study of the original Framingham cohort found no significant association between cholesterol and AD, the aim of the current study “was to determine whether specific cholesterol measures had different associations with marked cognitive decline at different outcome ages.”

The current study also differed from the earlier study in participant eligibility, cognitive outcome, cholesterol predictors, and the survival analysis model, which defines “survival” as “successful cognitive aging — having intact cognition, while living to oldest-old age, 85 years, and above.”

To investigate the question, the researchers used datasets of biannual longitudinal examinations from 1948-1953 and 2012-2014, drawn from the original Framingham cohort.

“Intact cognition” was defined as a Mini–Mental State Examination (MMSE) score of ≥25.

The “threshold age” was defined as the age at last intact cognition, and “marked cognitive decline” was defined as “deterioration from intact cognition at the threshold age to the first dementia diagnosis, or having MMSE ≤20.”
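The MMSE cutoffs above can be summarized in a short sketch. This is illustrative only: the study’s actual outcome definition also requires prior intact cognition at the threshold age, which a single-score classifier cannot capture.

```python
# A minimal sketch encoding the study's cognitive-status cutoffs
# (MMSE >= 25 -> intact; MMSE <= 20 or a dementia diagnosis -> marked decline).
# Simplified: the study's full definition also requires prior intact cognition.

def cognitive_status(mmse, dementia_diagnosed=False):
    """Classify a single MMSE observation under the study's cutoffs."""
    if dementia_diagnosed or mmse <= 20:
        return "marked decline"
    if mmse >= 25:
        return "intact"
    return "intermediate"  # scores of 21-24 fall between the two definitions

print(cognitive_status(27))        # intact
print(cognitive_status(19))        # marked decline
print(cognitive_status(30, True))  # marked decline (dementia diagnosis)
```

Note that scores from 21 to 24 are classified by neither definition, which is why the study tracks deterioration from a prior intact state rather than labeling every score.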

Inclusion requirements were having intact cognition at some examination, reported years of education, and at least three cholesterol measurements through the threshold age.

The researchers used two subsets of cholesterol predictors.

The first subset consisted of first cholesterol observation (obtained at midlife) and the late-life last observation through the threshold age (called “last cholesterol”).

Each of these two predictors was dichotomized between “normal” (<200 mg/dL) and at least “borderline high” cholesterol (≥200 mg/dL, which the investigators referred to as “high”).

The second subset consisted of three predictors using all cholesterol measurements through the threshold age — mean, linear slope (ie, the angle of the fitted line for cholesterol measurements), and the quadratic components of the cholesterol trajectory.

Covariates included outcome age, sex, and education, with survival analyses using outcome age as the “time” variable, rather than a separate covariate.

Additional covariates were first cholesterol measurement age (called “entry age”) on or near the age of Framingham study entry, and whether the individuals had ever used statin drugs.

The five cholesterol predictors (ie, high first cholesterol, high last cholesterol, mean cholesterol, rising linear slope, and quadratic slope) were also used as time-dependent coefficients.

Framingham Cohort

Of the 5079 participants for whom any data from the original Framingham cohort were available, only 1897 met all the inclusion criteria. These patients composed the full sample for the primary survival analysis (mean age ± SD = 40.2 ± 6.8 years; 747 men, 1150 women).

Of these, 316 eventually experienced marked cognitive decline (114 with diagnosed dementia, 202 with MMSE ≤20).

In the sample, there were 1041 participants aged 75 to 84 years (denoted as “{75,84}”), and 391 participants aged 85 to 94 years (denoted as “{85,94}”).

The researchers found that in all analyses, “every significant time-dependent coefficient reduced the association of a significant predictor for a marked cognitive decline as the outcome age increased.”

For the {75,84} age interval, the 36.7% of the sample with a rising slope had significantly increased risk for marked cognitive decline (χ2 = 4.196, df = 1, P = 0.041; hazard ratio = 1.497, 95% confidence interval [CI], 1.016 – 2.206), compared with individuals who had a falling slope.

In contrast, for the {85,94} age interval, the 23.3% of the sample with a rising slope had a significantly lower risk (χ2 = 4.228, df = 1, P = 0.040; HR = 0.676, 95% CI, 0.457 – 0.999) of cognitive decline, compared with those who had a falling slope.

A significant association was found in the full sample between a rising linear slope in cholesterol measures from midlife, while cognitively intact, and a subsequent marked cognitive decline.

However, the time-dependent coefficient was significant and demonstrated diminishing strength of this association of risk for cognitive decline with rising linear slope as the outcome age increased.
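As a rough reading aid (not from the paper itself), the percentage risk changes quoted earlier, a roughly 50% increase and a 32% reduction, follow directly from the reported hazard ratios. A minimal sketch of that conversion:

```python
# Converting a hazard ratio (HR) into the percent change in risk quoted in the text.
# HR > 1 -> percent increase; HR < 1 -> percent decrease (returned as negative).

def hr_to_percent_change(hr):
    """Express a hazard ratio as a signed percent change in risk."""
    return (hr - 1) * 100

# HRs reported in the study:
print(hr_to_percent_change(1.497))  # ~ +50%: increased risk, {75,84} group
print(hr_to_percent_change(0.676))  # ~ -32%: reduced risk, {85,94} group
```

This is only the conventional shorthand for reading hazard ratios; the confidence intervals reported above still determine whether each effect is statistically distinguishable from no change (HR = 1).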

Protected Survival Model

In contrast to previous studies with earlier outcome ages, high first cholesterol was significantly associated with reduced risk in the {85,94} sample, with average outcome age of 90.9 years.

Moreover, fewer years of education, not using statins, and earlier entry age (all of which are significantly associated with marked cognitive decline) were significantly weakened as outcome age increased. The crossover ages — all in the 90s — indicate when this weakening would reverse the direction of the association.

The crossover age for cholesterol linear slope in the full sample was 98 years. However, the {85,94} age interval sample showed an earlier reversed relationship.

These findings illustrate a concept called the “protected survival model” that “posits a minority subpopulation with protection against mortality and cognitive decline associated with cognitive risk factors,” the authors state.

“The take-home message for me as an investigator is that the study suggests that there is a phenotype out there that we can use and take advantage of, people who you think should be demented or dead because of risk factors and yet are still doing well,” said Silverman.

“Those are the people most likely to be carrying some type of protective factor, so if we can identify that group, we might be in a better position to identify genetic factors protective against cognitive decline as well as mortality,” he added.

“Ironic” Findings

Commenting on the findings for Medscape Medical News, Bernard G. Schreurs, PhD, professor, Department of Physiology, Pharmacology, and Neuroscience, West Virginia University, Morgantown, who was not involved in the study, called the findings “consistent with what is known in the field — that total cholesterol declines in old age and low cholesterol in old age is associated with poor health, multimorbidity, and deteriorating cognition.”

He noted that this is “supported by evidence that there is a lower level of synthesis and absorption of cholesterol as a person ages.”

He called the finding “ironic” because, “during middle age, the opposite is true — elevated cholesterol levels are associated with poor cognition that becomes more evident in old age.”

Silverman added that he is interested in further exploring “both genetic and environmental factors that may be protective in this population.”

The authors received salary support from the National Institutes of Health (NIH)–Fogarty International Center and the US Department of Veterans Affairs. Dr Silverman was supported by the NIH-NIA. Dr Schreurs has disclosed no relevant financial relationships.

Alzheimers Dement. Published online March 5, 2018.


Top 6 Robotic Applications in Medicine

According to a recent report by Credence Research, the global medical robotics market was valued at $7.24 billion in 2015 and is expected to grow to $20 billion by 2023. A key driver for this growth is demand for using robots in minimally invasive surgeries, especially for neurologic, orthopedic, and laparoscopic procedures.

As a result, a wide range of robots are being developed to serve in a variety of roles within the medical environment. Robots specializing in human treatment include surgical robots and rehabilitation robots. The field of assistive and therapeutic robotic devices is also expanding rapidly. These include robots that help patients rehabilitate from serious conditions like strokes, empathic robots that assist in the care of older or physically/mentally challenged individuals, and industrial robots that take on a variety of routine tasks, such as sterilizing rooms and delivering medical supplies and equipment, including medications.

Below are six top uses for robots in the field of medicine today.

1. Telepresence
Physicians use robots to help them examine and treat patients in rural or remote locations, giving them a “telepresence” in the room. “Specialists can be on call, via the robot, to answer questions and guide therapy from remote locations,” writes Dr. Bernadette Keefe, a Chapel Hill, NC-based healthcare and medicine consultant.  “The key features of these robotic devices include navigation capability within the ER, and sophisticated cameras for the physical examination.”

2. Surgical Assistants
These remote-controlled robots assist surgeons with performing operations, typically minimally invasive procedures. “The ability to manipulate a highly sophisticated robotic arm by operating controls, seated at a workstation out of the operating room, is the hallmark of surgical robots,” says Keefe. Additional applications for these surgical-assistant robots are continually being developed, as more advanced 3DHD technology gives surgeons the spatial references needed for highly complex surgery, including more enhanced natural stereo visualization, combined with augmented reality.

By Mark Crawford | ASME



Carri’s Corner: Will be back next month  


www.Vital-Profits.com     https://www.facebook.com/VitalProfits