Friday, March 16, 2018

Antimicrobial Stewardship

The first antimicrobial stewardship programs were introduced in hospitals more than 30 years ago to address inappropriate antibiotic prescribing and increasing antibiotic resistance. Since then a large body of evidence on the effectiveness and safety of this approach has accumulated.

The purpose of antimicrobial stewardship is to promote the prudent use of antibiotics in order to optimize patient outcomes, while at the same time minimizing the probability of adverse effects, including toxicity and the selection of pathogenic organisms, and the emergence and spread of antibiotic resistance.

The previous Cochrane Review demonstrated that interventions to reduce excessive antibiotic prescribing were successful, with persuasive and restrictive interventions being equally effective in reducing prescribing after six months. The recent update demonstrates that enabling and restrictive interventions are associated with a 15% increase in compliance with desired practice, a 1.95-day decrease in duration of antibiotic treatment, and a 1.12-day decrease in inpatient length of stay, without compromising patient safety.

Initiatives for implementing or strengthening antimicrobial stewardship were primarily developed in response to increasing antibiotic resistance. Increasing antibiotic use results in increasing antibiotic resistance rates. But does improving antibiotic prescribing reverse antibiotic resistance rates? The updated Cochrane Review does not provide an answer; only 9% of the randomized controlled trials and 19% of the interrupted time series studies reported microbial outcome data. However, a reduction in the rate of Clostridium difficile infections was consistently demonstrated in the included studies.

Despite the extensive evidence base, antimicrobial stewardship programs are not a requirement in all hospitals. Antimicrobial resistance requires global action, which in turn requires political commitment and resources, suggesting a role for continued advocacy by public health and specialist professionals and organizations. One significant characteristic of the evidence base is that 183 of the 221 studies in the updated Cochrane Review were performed in Europe or North America.

Wednesday, March 14, 2018

Wisdom Wednesday: U.S. Cancer Treatment Guidelines ‘Often Based on Weak Evidence’

Cancer treatment guidelines produced by the US National Comprehensive Cancer Network (NCCN) are often based on low quality evidence or no evidence at all, finds a study published by The BMJ today.

The researchers, led by Dr. Vinay Prasad at Oregon Health & Science University, say their findings “raise concern that the NCCN justifies the coverage of costly, toxic cancer drugs based on weak evidence.”

These recommendations are used by US private health insurers and social insurance schemes to make coverage decisions, and guide global cancer practice, but it is not clear how the evidence is gathered or reviewed.

In the US, the Food and Drug Administration (FDA) approves all new drugs and grants new indications for drugs already on the market. The NCCN makes recommendations both within and outside of FDA approvals, but patterns of NCCN recommendations beyond FDA approvals have not been analyzed.

So Dr. Prasad and his team compared FDA approvals of cancer drugs with NCCN recommendations in March 2016 for a contemporary sample of drugs. When the NCCN made recommendations beyond the FDA’s approvals, the evidence used to support those recommendations was evaluated.

A total of 47 new cancer drugs were approved by the FDA for 69 indications over the study period, whereas the NCCN recommended these drugs for 113 indications, of which 69 (62%) overlapped with the 69 FDA approved indications and 44 (39%) were additional recommendations.
Only 10 (23%) of these additional recommendations were based on evidence from randomized controlled trials, and seven (16%) were based on evidence from phase III studies. Most relied on small, uncontrolled studies or case reports, or offered no evidence at all.

And almost two years after their analysis, the researchers found that only six (14%) of the additional recommendations by the NCCN had received FDA approval.
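As a quick check of the proportions above (a minimal sketch; the counts come from the figures reported in the study, and simple rounding to whole percentages is an assumption):

```python
# Counts reported in the BMJ analysis: 69 FDA-approved indications versus
# 113 NCCN-recommended indications for the same sample of drugs.
fda_indications = 69
nccn_indications = 113
additional = nccn_indications - fda_indications

def pct(part, whole):
    """Percentage of `whole` represented by `part`, rounded to a whole number."""
    return round(100 * part / whole)

print(additional)                          # 44 recommendations beyond FDA approvals
print(pct(additional, nccn_indications))   # 39 (% of all NCCN recommendations)
print(pct(10, additional))                 # 23 (% backed by randomized controlled trials)
print(pct(7, additional))                  # 16 (% backed by phase III studies)
print(pct(6, additional))                  # 14 (% that later received FDA approval)
```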

Monday, March 12, 2018

Type 2 Diabetes: New Guidelines Lower Blood Sugar Control Levels

The American College of Physicians has now published new guidelines on desired blood sugar control levels for people with type 2 diabetes. The recommendations aim to change current therapeutic practice, advising doctors to aim for a moderate level of blood sugar control when treating their patients.

According to the most recent estimates, almost 30 million people in the United States have type 2 diabetes, which amounts to over 9% of the entire U.S. population.

Once diagnosed with type 2 diabetes, patients are often advised to take what is known as a glycated hemoglobin (HbA1c) test in order to monitor blood sugar control. The test reflects a person’s average blood sugar levels over the past 2 to 3 months, with an HbA1c score of 6.5% or higher indicating diabetes.
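The relationship between an HbA1c score and everyday glucose readings can be made concrete with the widely used ADAG regression formula for estimated average glucose (eAG); this sketch is added here for illustration and is not part of the guidelines discussed in this post:

```python
def estimated_average_glucose(hba1c_percent):
    """Estimated average glucose in mg/dL from HbA1c (%), via the ADAG
    regression: eAG = 28.7 * HbA1c - 46.7."""
    return 28.7 * hba1c_percent - 46.7

# An HbA1c of 6.5% (the diabetes threshold) corresponds to roughly 140 mg/dL,
# and 7.0% to roughly 154 mg/dL of average blood glucose.
print(round(estimated_average_glucose(6.5)))  # 140
print(round(estimated_average_glucose(7.0)))  # 154
```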

But some studies have pointed out that the HbA1c test may currently be overused in the U.S., and they have suggested that such over-testing may lead to over-treating patients with hypoglycemic drugs. These drugs often have a range of side effects, such as gastrointestinal problems, excessively low blood sugar, weight gain, and even congestive heart failure. Additionally, as some researchers have pointed out, “Excessive testing contributes to the growing problem of waste in healthcare and increased patient burden in diabetes management.”

Existing recommendations hold that an HbA1c score of 6.5%, or at least below 7%, decreases the risk of microvascular complications over time. However, the American College of Physicians (ACP) found that the evidence for such a reduction is “inconsistent.”

Friday, March 9, 2018

Could Vitamin D Lower Cardiovascular Death Risk?

People who have cardiovascular disease can reduce their risk of death by almost a third simply by maintaining normal vitamin D levels. This is the finding of a new study recently published in The Journal of Clinical Endocrinology & Metabolism.

Cardiovascular disease (CVD) is the number 1 killer in the United States. Heart disease alone is responsible for around 610,000 deaths in the country every year. Previous research suggests that vitamin D status may play an important role in cardiovascular health. A study in 2016, for example, associated low vitamin D levels with greater risk of stroke, heart failure, heart attack, and cardiovascular death.

The new study – led by Prof. Jutta Dierkes, of the Department of Clinical Medicine at the University of Bergen in Norway – further investigated the role that vitamin D levels play in the risk of death from CVD.

Prof. Dierkes and colleagues analyzed the blood samples of 4,114 adults with suspected angina pectoris, which is chest pain resulting from coronary heart disease. Subjects were, on average, 62 years old at study baseline, and they were followed up for an average of 12 years.

The team assessed the subjects’ blood samples for levels of 25-hydroxyvitamin D, or 25(OH)D, which is the primary circulating form of vitamin D. During follow-up, there were a total of 895 deaths. Of these, 407 were related to CVD.

According to the National Institutes of Health (NIH), a 25(OH)D level of 50-125 nanomoles per liter (nmol/l) is “generally considered adequate for bone and overall health in healthy individuals.”
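Vitamin D lab results are reported in two different units depending on the laboratory, which is a common source of confusion when comparing studies; the rough conversion below (the factor 2.5 is an approximation of the exact 2.496) relates the NIH range to results reported in ng/ml:

```python
NMOL_PER_NG_ML = 2.5  # approximate: 1 ng/ml of 25(OH)D is about 2.5 nmol/l

def nmol_l_to_ng_ml(nmol_l):
    """Convert a 25(OH)D level from nmol/l to ng/ml (approximate)."""
    return nmol_l / NMOL_PER_NG_ML

# The NIH "adequate" range of 50-125 nmol/l corresponds to about 20-50 ng/ml.
print(nmol_l_to_ng_ml(50))   # 20.0
print(nmol_l_to_ng_ml(125))  # 50.0
```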

Wednesday, March 7, 2018

Wisdom Wednesday: A Short History of Quinine

As a follow-up to Monday’s blog, I thought you might enjoy some history about quinine. Legend has it that the bark of the fever tree was first used by the Spanish in the early 1630s when it was given to the Countess of Chinchon, who had contracted malaria (known colloquially as the ‘fever’) while living in Peru. The Countess recovered, and news of the healing properties of the fever tree was passed to Europe.
It was imported to Europe under the name ‘Jesuits’ Powder’, which proved a very poor selling strategy in Protestant England. Even when Charles II was cured of the ‘fever’ in 1679, its popularity was not assured, as its use remained the secret of his physician, Robert Talbor.

However, the healing power of this remarkable tree only became world-renowned in the 1820s, when officers of the British Army in India, in an attempt to ward off malaria, mixed quinine (the extract from the bark of the fever tree) with sugar and water, creating the first Indian Tonic Water.

It was made more palatable by the simple expedient of adding a little gin to the mixture. The original gin and tonic was thus born, and it soon became the archetypal drink of the British Empire, its origins firmly planted in the fever tree.
Medical historians claim that the British lost India primarily because the British forces were too drunk from the daily ritual of drinking gin and tonic to effectively fight.

Colonialism produced a huge demand for the bark of the fever tree. In the 1850s the East India Company alone spent 100,000 pounds annually on the bark, but it still brought in nowhere near enough to keep the colonists healthy. The answer was to try to cultivate fever trees in the colonies. This initiative inspired intrepid plant hunters across Europe to risk all and travel to South America to harvest these most valuable seeds. The Englishman Richard Spruce brought back seeds from Ecuador, which were subsequently grown in India and Ceylon, but they turned out to be of a species relatively poor in quinine. Using the wrong species of herb, or the wrong part of the plant, remains a significant issue in herbology today, with up to 80% of commercial products being ineffective as a result of this practice.

Monday, March 5, 2018

Quinine for Muscle Cramps

Muscle cramps can occur anywhere and for many reasons. Quinine has been used to treat cramps of all causes. However, controversy continues about its efficacy and safety. This review was first published in 2010 and searches were updated in 2014.

Three review authors independently selected trials for inclusion, assessed risk of bias, and extracted data, and study authors were contacted for additional information. The review identified 23 trials with a total of 1,586 participants; 58% of these participants were from five unpublished studies. Quinine was compared to placebo (20 trials), vitamin E (4 trials), or a quinine-vitamin E combination (3 trials). The most commonly used quinine dosage was 300 mg/day (range 200 to 500 mg).

The risk of bias in the trials varied considerably. All 23 trials claimed to be randomized, but only a minority described randomization and allocation concealment adequately.

Compared to placebo, quinine significantly reduced cramp number over two weeks by 28%, cramp intensity by 10%, and cramp days by 20%. Cramp duration was not significantly affected.

A significantly greater number of people suffered minor adverse events on quinine than placebo, mainly gastrointestinal symptoms. Overdoses of quinine have been reported elsewhere to cause potentially fatal adverse effects, but in the included trials there was no significant difference in major adverse events compared with placebo.

There is low-quality evidence that quinine (200 mg to 400 mg daily) significantly reduces cramp number and cramp days, and moderate-quality evidence that quinine reduces cramp intensity. There is moderate-quality evidence that, with use up to 60 days, the incidence of serious adverse events is not significantly greater than with placebo in the identified trials; however, because serious adverse events, though rare, can be fatal, prescription of quinine is severely restricted in some countries.

Friday, March 2, 2018

How Fasting Boosts Exercise’s Effects on Endurance

Intermittent fasting, such as eating only on alternate days, might enhance the ability of aerobic exercise to increase endurance because the body switches to using fats and ketones as a source of fuel for muscles instead of carbohydrates.

This was the conclusion that researchers came to after studying the effects of such a regimen in mice for a limited period of time. Their study is to be published in the FASEB Journal.

The findings suggest that eating three meals per day plus snacks may not be the only eating pattern that allows people who engage in endurance sports to reach peak performance and maintain good health.

“Emerging evidence,” explains senior study author Dr. Mark Mattson, from the Laboratory of Neurosciences in the National Institute on Aging in Baltimore, MD, “suggests that [intermittent dietary energy restriction] might improve overall health and reduce risk factors for diabetes and cardiovascular disease in humans.”

He and his team say that their findings propose that a similar pattern of eating and fasting may boost the beneficial effect of moderate aerobic exercise on endurance, and that it should be studied further.

For the study, the team put mice into four groups and observed them for 2 months as they went through the following exercise and eating patterns:

The control (CTRL) mice did not exercise at all and could eat as much food as they wanted every day.

The exercise (EX) mice could eat as much daily food as they wanted, but they also ran on a treadmill for 45 minutes each day.

The “alternate day food deprivation” (ADF) mice were fed only a fixed amount every other day and did not exercise at all.

The EXADF mice were restricted to the ADF eating pattern but also exercised every day on a treadmill for 45 minutes.