Friday, March 30, 2018

Folate Deficiency Associated with Gene Modulation

Folate deficiency has been associated with the onset of varied metabolic abnormalities, “including insulin resistance, by altering epigenetic processes on key regulatory genes,” such as the calcium/calmodulin-dependent protein kinase kinase 2 (CAMKK2). CAMKK2 is part of the calcium-triggered signaling cascade and influences obesity and glucose metabolism.

This study found that subjects with a total folate intake lower than 300 µg/d had more “fat mass (especially trunk fat), as well as statistically higher levels of glucose, insulin, homeostatic model assessment-insulin resistance (HOMA-IR) index, cortisol, and plasminogen activator inhibitor-1” compared to those consuming greater than 300 µg/d of folate. They determined that “folate deficiency was related to lower CAMKK2 methylation.”
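For readers unfamiliar with the HOMA-IR index mentioned above: it estimates insulin resistance from fasting glucose and fasting insulin. The standard clinical formula (general background, not quoted from this study) is:

$$\text{HOMA-IR} = \frac{\text{fasting glucose (mg/dL)} \times \text{fasting insulin (µU/mL)}}{405}$$

Values above roughly 2.5 are often taken to suggest insulin resistance, though cutoffs vary by lab and population.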

In conclusion, the study proposed “associations between low folate intakes, lower CAMKK2 gene methylation, and insulin resistance in obese individuals.”

My Take:
I know this reads as pretty technical, but it is the “dumbed down” version. So, let me define some terms for you.

Epigenetics is the study of gene expression rather than gene composition. The term literally translates as “above genetics.”

Folate is the food form of folic acid, one of the B vitamins involved in hundreds of metabolic pathways in the human body. However, folate cannot be used by the body in its food form. It must have a ‘methyl’ group attached to it through a process called methylation.

I have written frequently about variations in genetic snippets that impair methylation of folate. However, this study shows impaired gene expression from a dietary deficiency of folate that contributes to metabolic syndrome.

Folate, whose name shares a Latin root with ‘foliage,’ is found in all dark green leafy vegetables. Most Americans are deficient in their daily intake because they just don’t eat any vegetables. I recommend five servings of veggies daily and count a big salad as two servings.

The Bottom Line:
This study illuminates some of the biochemistry, including genetic expression, that links poor dietary habits to diabetes and heart disease. While supplementation of bioavailable folic acid is a common occurrence in my practice, it does not replace a healthy diet.

Source: Nutrition Research, February 2018

Wednesday, March 28, 2018

Wisdom Wednesday: Essential Oils May Disrupt Normal Hormonal Activity


New research suggests that the chemicals contained in essential oils such as lavender oil and tea tree oil may disrupt the normal functioning of hormones, leading to a condition called male gynecomastia in prepubescent boys.

Male gynecomastia is a condition in which boys develop noticeable breasts as a result of having abnormally high levels of estrogen, the female sex hormone. Research has previously linked the condition to essential oils such as lavender and tea tree oil. Such oils are regularly used in personal hygiene and cosmetic products, as well as in laundry detergents and aromatherapy candles and devices.

A study from 2007 found that the gynecomastia coincided with the use of essential oil-based products, and that the symptoms disappeared when the products were discontinued. The same study also found that lavender and tea tree oil had estrogen-boosting and anti-androgenic effects on human cells.

The new study, presented at the annual meeting of the Endocrine Society, tested the impact of eight components that are commonly found in tea tree and lavender oil on human cancer cells to study their effect on hormonal activity.

Essential oils contain hundreds of chemicals. Researchers picked out eucalyptol, 4-terpineol, dipentene/limonene, alpha-terpineol, linalyl acetate, linalool, alpha-terpinene, and gamma-terpinene for study. The first four components are common to both tea tree oil and lavender oil. The research revealed that all of the chemicals tested had an endocrine-disrupting activity to a certain extent.

Monday, March 26, 2018

Low-Calorie Sweeteners May Promote Metabolic Syndrome

New data presented at the annual meeting of the Endocrine Society, held in Chicago, suggests that consuming low-calorie sweeteners could put people at risk of metabolic syndrome.

Around 34% of adults in the United States have metabolic syndrome, the umbrella term for a cluster of conditions: high blood pressure, high blood sugar, high cholesterol levels, and excess abdominal fat.
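To make the “umbrella term” concrete, here is a minimal sketch of how clinicians commonly screen for it, using the widely cited NCEP ATP III rule of three or more of five findings. The thresholds are the standard published cutoffs, but the function and the example patient are purely illustrative, not from this article:

```python
# Minimal sketch of a metabolic syndrome screen (NCEP ATP III criteria).
# Thresholds are the commonly cited cutoffs; this is illustrative only.

def metabolic_syndrome_screen(waist_cm, triglycerides_mgdl, hdl_mgdl,
                              systolic_bp, diastolic_bp,
                              fasting_glucose_mgdl, male):
    criteria = [
        waist_cm > (102 if male else 88),          # abdominal obesity
        triglycerides_mgdl >= 150,                 # high triglycerides
        hdl_mgdl < (40 if male else 50),           # low HDL cholesterol
        systolic_bp >= 130 or diastolic_bp >= 85,  # elevated blood pressure
        fasting_glucose_mgdl >= 100,               # elevated fasting glucose
    ]
    # Three or more findings meet the definition of metabolic syndrome.
    return sum(criteria) >= 3

# Hypothetical example: abdominal obesity, high triglycerides, low HDL,
# elevated systolic pressure, and impaired fasting glucose -> True
print(metabolic_syndrome_screen(105, 180, 38, 135, 82, 104, male=True))
```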
We know that metabolic syndrome doubles the risk of heart and blood vessel disease, putting individuals at risk of heart attack and stroke. People with metabolic syndrome are also 3 to 5 times more likely to develop type 2 diabetes.

In this new study, researchers from George Washington University examined the effects of a low-calorie sweetener called sucralose on human stem cells from fat tissue. These were experimented on in petri dishes that simulated an obesity-promoting environment.

The scientists mimicked the typical concentration of sucralose in the blood of people who consume high quantities of low-calorie sweeteners. When this was administered to the stem cells, the team noticed increased expression of genes linked with fat production and inflammation.
The authors followed this up with a separate experiment involving biopsy samples of abdominal fat from people who were regular consumers of low-calorie sweeteners. In fat samples from people who were at a healthy weight, they did not find a significant increase in gene expression, but in the fat samples from overweight or obese participants, there was significant overexpression of fat-producing and inflammation-inducing genes.

Study co-author Sabyasachi Sen, who is an associate professor of medicine at George Washington University, describes the results. “Our stem cell-based studies indicate that low-calorie sweeteners promote additional fat accumulation within the cells compared with cells not exposed to these substances, in a dose-dependent fashion – meaning that as the dose of sucralose is increased, more cells showed increased fat droplet accumulation.”

Friday, March 23, 2018

Who is Shaping the Food Choices of the Future?

A conference was held by the British Nutrition Foundation (BNF) in October 2017 to explore future trends in agriculture, manufacturing and retailing, and the relationship between these trends and food choice. I picked a few of the presented topics for your review.
Professor Judith Buttriss, Director General of the BNF, described the grand challenge of securing a sustainable food supply for the world’s growing and more prosperous population in the face of climate change, which will increasingly affect what can be grown and where; the likelihood of the need for trade-offs in relation to the food supply and ecosystem; and public health trends and the associated need to encourage consumer behavior change.

Professor Robert Edwards, Head of the School of Agriculture, Food and Rural Development at Newcastle University, presented a case study on innovation in crop protection. He described how for the last 60 years farming has been heavily dependent on the use of pesticides and herbicides to protect crops. Currently 40% of the global food supply is dependent on agrochemical intervention. Professor Edwards described promising solutions such as new crop breeding technologies, biological treatments that promote plant vigor and innate defense, and precision agriculture technologies.

James Walton, Chief Economist at Institute of Grocery Distribution, discussed results from recent market research that explored the current state of shopper thinking about healthy eating. Currently, 89% of consumers report taking personal responsibility for their diet, with most reporting that they are doing something to make an improvement, such as reducing alcohol or meat intake. The most popular dietary goal was to eat more fruit and vegetables. Mr. Walton ended by describing how online grocery shopping is becoming more popular and can aid healthier choices.

Judith Batchelar OBE highlighted that legislation, food scares and influential chefs have all shaped consumer food choice over the years, and explained how supermarkets have a huge opportunity to educate and influence consumers about healthy eating.

Wednesday, March 21, 2018

Wisdom Wednesday: Holistic Properties of Foods: A Changing Paradigm in Human Nutrition


Traditionally, the study of nutrition has been based on a reductionist approach, reducing a food down to constituent nutrients and then investigating the effects of these nutrients, either singly or together, on metabolism and metabolic outcomes. However, nutrients per se are not consumed by a person, but rather are consumed in the form of foods. Because of this, the complex food matrix itself influences nutritional outcomes, which often cannot be fully explained on the basis of the effects of “the sum of the nutrients” alone. The paper discusses nutrient additivity effects, nutrient interactions, the effects of food components other than the classical nutrients, and the effects of the food matrix, for both single foods and combinations of foods consumed as meals, on the kinetics of nutrient digestion and subsequent metabolism and metabolic outcomes. It concludes that a paradigm shift in human nutrition is needed, with more consideration given to the holistic effects of specific foods and of the mixtures of foods that constitute meals and diets.

My Take:
Unfortunately, only the abstract is available because it is such a recent publication. However, that gives me a little more space to play.

This holistic concept is not new to nutrition; looking at the whole food was all we understood for thousands of years. It wasn’t until the invention of the microscope in the 1600s, and the subsequent development of biochemistry as a scientific discipline two centuries later, that this reductionist view of life took hold.

By the 1920s, a dentist named Dr. Royal Lee was decrying the decline of our food quality. He purchased land in Wisconsin and began growing crops to produce a food supplement. In 1929, he produced and marketed ‘Catalyn’, a whole food supplement made from locally grown crops. His ingredients were alfalfa, barley grass, beets, Brussels sprouts, buckwheat, kale, kidney beans, oats, pea vine and Spanish black radish.

Monday, March 19, 2018

Have Scientists Found an Answer to Chronic Pain?

Using computer modeling, researchers have designed a new compound that may help to treat neuropathic pain. In animal trials, it produced immediate, long-lasting therapeutic effects.

Neuropathic pain is a chronic condition wherein people have a heightened sensitivity to pain, or hyperalgesia, and feel pain following stimuli that would not usually cause pain, or allodynia. For some individuals, the pain can come and go, seemingly at random. For others, however, it can be continuous. The condition affects up to 10% of the population of the United States, and there are currently no specific treatments that significantly relieve the discomfort and pain.

As it stands, antidepressants and antiepileptics are most commonly used to treat neuropathic pain, but less than 50% of people report a significant reduction in their pain. A range of conditions and situations can lead to neuropathic pain, including diabetes, spinal cord injury, herpes zoster infection, toxins, trauma, and chemotherapy. But although certain risk factors are known, there are still many gaps in our knowledge.

It is thought that peripheral neuropathic pain is caused by lesions in nerves. These lesions disrupt the blood-nerve barrier, allowing blood and the immune cells that it carries to contact the nerves. However, exactly how and why this produces neuropathic pain is not understood. The molecular interactions and chemical pathways involved are still being investigated.

Researchers from the Institute for Neurosciences of Montpellier found that immune cells, which flood the damaged nerves at the site of the lesion, produce a cytokine called FL, which binds to and activates FLT3 receptors. FLT3 receptors are produced by hematopoietic stem cells, the cell type that gives rise to blood cells.

Once the two molecules lock together, a chain reaction is activated that impacts the sensory system, producing pain and allowing it to persist. This is known as chronification. The findings are published this week in the journal Nature Communications.

Once the team understood the role of FLT3 in generating neuropathic pain, they analyzed 3 million potential molecular configurations, eventually finding an anti-FLT3 molecule, which they dubbed BDT001.

Friday, March 16, 2018

Antimicrobial Stewardship

The first antimicrobial stewardship programs were introduced in hospitals more than 30 years ago to address inappropriate antibiotic prescribing and increasing antibiotic resistance. Since then a large body of evidence on the effectiveness and safety of this approach has accumulated.

The purpose of antimicrobial stewardship is to promote the prudent use of antibiotics in order to optimize patient outcomes, while at the same time minimizing the probability of adverse effects, including toxicity and the selection of pathogenic organisms, and the emergence and spread of antibiotic resistance.

The previous Cochrane Review demonstrated that interventions to reduce excessive antibiotic prescribing were successful, with persuasive and restrictive interventions being equally effective in reducing prescribing after six months. The recent update demonstrates that enabling and restrictive interventions are associated with a 15% increase in compliance with desired practice, a 1.95-day decrease in duration of antibiotic treatment, and a 1.12-day decrease in inpatient length of stay, without compromising patient safety.

Initiatives for implementing or strengthening antimicrobial stewardship were primarily developed as a response to increasing antibiotic resistance. Increasing antibiotic use results in increasing antibiotic resistance rates. But does improving antibiotic prescribing reverse antibiotic resistance rates? The updated Cochrane Review does not provide an answer; only 9% of the randomized controlled trials and 19% of the interrupted time series studies reported microbial outcome data. However, a reduction in the rate of Clostridium difficile infections was consistently demonstrated across the included studies.

Despite the extensive evidence base, antimicrobial stewardship programs are not a requirement in all hospitals. Antimicrobial resistance requires global action, which in turn requires political commitment and resources, suggesting a role for continued advocacy by public health and specialist professionals and organizations. One significant characteristic of the evidence base is that 183 of the 221 studies in the updated Cochrane Review were performed in Europe or North America.

Wednesday, March 14, 2018

Wisdom Wednesday: U.S. Cancer Treatment Guidelines ‘Often Based on Weak Evidence’


Cancer treatment guidelines produced by the US National Comprehensive Cancer Network (NCCN) are often based on low quality evidence or no evidence at all, finds a study published by The BMJ today.

The researchers, led by Dr. Vinay Prasad at Oregon Health & Science University, say their findings “raise concern that the NCCN justifies the coverage of costly, toxic cancer drugs based on weak evidence.”

These recommendations are used by US private health insurers and social insurance schemes to make coverage decisions, and guide global cancer practice, but it is not clear how the evidence is gathered or reviewed.

In the US, the Food and Drug Administration (FDA) approves all new drugs and grants new indications for drugs already on the market. The NCCN makes recommendations both within and outside of FDA approvals, but patterns of NCCN recommendations beyond FDA approvals have not been analyzed.

So Dr. Prasad and his team compared FDA approvals of cancer drugs with NCCN recommendations as of March 2016 for a contemporary sample of drugs. When the NCCN made recommendations beyond the FDA’s approvals, the evidence used to support those recommendations was evaluated.

A total of 47 new cancer drugs were approved by the FDA for 69 indications over the study period, whereas the NCCN recommended these drugs for 113 indications, of which 69 (62%) overlapped with the 69 FDA approved indications and 44 (39%) were additional recommendations.
Only 10 (23%) of these additional recommendations were based on evidence from randomized controlled trials, and seven (16%) were based on evidence from phase III studies. Most relied on small, uncontrolled studies or case reports, or offered no evidence at all.

And almost two years after their analysis, the researchers found that only six (14%) of the additional recommendations by the NCCN had received FDA approval.

Monday, March 12, 2018

Type 2 Diabetes: New Guidelines Lower Blood Sugar Control Levels

The American College of Physicians has now published its new guidelines regarding the desired blood sugar control levels for people with type 2 diabetes. The recommendations aim to change current therapeutic practice: doctors should aim for a moderate level of blood sugar when treating their patients.

According to the most recent estimates, almost 30 million people in the United States have type 2 diabetes, which amounts to over 9% of the entire U.S. population.

Once diagnosed with type 2 diabetes, patients are often advised to take what is known as a glycated hemoglobin (HbA1c) test in order to keep blood sugar levels under control. The test reflects a person’s average blood sugar levels over the past 2 or 3 months, with an HbA1c score of 6.5% indicating diabetes.
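For context on what an HbA1c percentage means in everyday blood sugar terms, the score can be converted to an estimated average glucose (eAG) using a widely cited linear formula from the ADAG study (standard clinical background, not something taken from the new guidelines):

$$\text{eAG (mg/dL)} = 28.7 \times \text{HbA1c}(\%) - 46.7$$

At the 6.5% diabetes threshold, that works out to roughly 28.7 × 6.5 − 46.7 ≈ 140 mg/dL of average blood glucose.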

But some studies have pointed out that the HbA1c test may currently be overused in the U.S., and they have suggested that such over-testing may lead to over-treating patients with hypoglycemic drugs. These drugs often have a range of side effects, such as gastrointestinal problems, excessively low blood sugar, weight gain, and even congestive heart failure. Additionally, as some researchers have pointed out, “Excessive testing contributes to the growing problem of waste in healthcare and increased patient burden in diabetes management.”

The existing recommendations hold that a score of 6.5% (or at least below 7%) decreases the risk of microvascular complications over time. However, the American College of Physicians (ACP) found that the evidence for such a reduction is “inconsistent.”

Friday, March 9, 2018

Could Vitamin D Lower Cardiovascular Death Risk?

People who have cardiovascular disease can reduce their risk of death by almost a third simply by maintaining normal vitamin D levels. This is the finding of a new study recently published in The Journal of Clinical Endocrinology & Metabolism.

Cardiovascular disease (CVD) is the number 1 killer in the United States. Heart disease alone is responsible for around 610,000 deaths in the country every year. Previous research suggests that vitamin D status may play an important role in cardiovascular health. A study in 2016, for example, associated low vitamin D levels with greater risk of stroke, heart failure, heart attack, and cardiovascular death.

The new study – led by Prof. Jutta Dierkes, of the Department of Clinical Medicine at the University of Bergen in Norway – further investigated the role that vitamin D levels play in the risk of death from CVD.

Prof. Dierkes and colleagues analyzed the blood samples of 4,114 adults who had suspected angina pectoris, which is chest pain resulting from coronary heart disease. Subjects were an average age of 62 at study baseline, and they were followed up for an average of 12 years.

The team assessed the subjects’ blood samples for levels of 25-hydroxyvitamin D, or 25(OH)D, which is the primary circulating form of vitamin D. During follow-up, there were a total of 895 deaths. Of these, 407 were related to CVD.

According to the National Institutes of Health (NIH), a 25(OH)D level of 50-125 nanomoles per liter (nmol/l) is “generally considered adequate for bone and overall health in healthy individuals.”

Wednesday, March 7, 2018

Wisdom Wednesday: A Short History of Quinine


As a follow-up to Monday’s blog, I thought you might enjoy some history about quinine. Legend has it that the bark of the fever tree was first used by the Spanish in the early 1630s, when it was given to the Countess of Chinchon, who had contracted malaria (known colloquially as the ‘fever’) while living in Peru. The Countess recovered, and the healing properties of the fever tree were passed to Europe.
It was imported to Europe under the name ‘Jesuits’ Powder’, which proved a very poor selling strategy in Protestant England. Even when Charles II was cured of the ‘fever’ in 1679, its popularity was not assured, as its use remained the secret of his physician, Robert Talbor.

However, the healing power of this remarkable tree only became world renowned in the 1820s, when officers of the British Army in India, in an attempt to ward off malaria, mixed quinine (the extract from the bark of the fever tree) with sugar and water, creating the first Indian Tonic Water.

It was made more palatable when they added the expedient of a little gin to the mixture. The original gin and tonic was thus born, and it soon became the archetypal drink of the British Empire, the origins of which were firmly planted in the fever tree.
Medical historians claim that the British lost India primarily because the British forces were too drunk from the daily ritual of drinking gin and tonic to effectively fight.

Colonialism produced a huge demand for the bark of the fever tree. In the 1850s the East India Company alone spent 100,000 pounds annually on the bark, but that still brought in nowhere near enough to keep the colonists healthy. The answer was to try to cultivate fever trees in the colonies. This initiative inspired intrepid plant hunters across Europe to risk all and travel to South America to harvest these most valuable seeds. The Englishman Richard Spruce brought back seeds from Ecuador, which were subsequently grown in India and Ceylon, but they turned out to be of a species that was relatively poor in quinine. Using the wrong species of herb, or the wrong part of the plant, remains a significant issue in herbology today, with up to 80% of commercial products being ineffective as a result of this practice.

Monday, March 5, 2018

Quinine for Muscle Cramps

Muscle cramps can occur anywhere and for many reasons. Quinine has been used to treat cramps of all causes. However, controversy continues about its efficacy and safety. This review was first published in 2010 and searches were updated in 2014.

Three review authors independently selected trials for inclusion, assessed risk of bias and extracted data, contacting study authors for additional information. They identified 23 trials with a total of 1,586 participants. Fifty-eight percent of these participants were from five unpublished studies. Quinine was compared to placebo (20 trials), vitamin E (4 trials), and a quinine-vitamin E combination (3 trials). The most commonly used quinine dosage was 300 mg/day (range 200 to 500 mg).

The risk of bias in the trials varied considerably. All 23 trials claimed to be randomized, but only a minority described randomization and allocation concealment adequately.

Compared to placebo, quinine significantly reduced cramp number over two weeks by 28%, cramp intensity by 10%, and cramp days by 20%. Cramp duration was not significantly affected.

A significantly greater number of people suffered minor adverse events on quinine than placebo, mainly gastrointestinal symptoms. Overdoses of quinine have been reported elsewhere to cause potentially fatal adverse effects, but in the included trials there was no significant difference in major adverse events compared with placebo.

There is low-quality evidence that quinine (200 mg to 400 mg daily) significantly reduces cramp number and cramp days, and moderate-quality evidence that quinine reduces cramp intensity. There is moderate-quality evidence that, with use up to 60 days, the incidence of serious adverse events is not significantly greater than for placebo in the identified trials; but because serious adverse events can on rare occasions be fatal, prescription of quinine is severely restricted in some countries.

Friday, March 2, 2018

How Fasting Boosts Exercise’s Effects on Endurance

Intermittent fasting, such as eating only on alternate days, might enhance the ability of aerobic exercise to increase endurance because the body switches to using fats and ketones as a source of fuel for muscles instead of carbohydrates.

This was the conclusion that researchers came to after studying the effect of such a regimen in mice over a limited period of time. Their study is to be published in the FASEB Journal.

The findings suggest that three meals per day plus snacking may not be the only eating pattern that allows people who engage in endurance sports to reach peak performance and maintain good health.

“Emerging evidence,” explains senior study author Dr. Mark Mattson, from the Laboratory of Neurosciences in the National Institute on Aging in Baltimore, MD, “suggests that [intermittent dietary energy restriction] might improve overall health and reduce risk factors for diabetes and cardiovascular disease in humans.”

He and his team say that their findings suggest that a similar pattern of eating and fasting may boost the beneficial effect of moderate aerobic exercise on endurance, and that it should be studied further.

For the study, the team put mice into four groups and observed them for 2 months as they went through the following exercise and eating patterns:

The control (CTRL) mice did not exercise at all and could eat as much food as they wanted every day.

The exercise (EX) mice could eat as much daily food as they wanted, but they also ran on a treadmill for 45 minutes each day.

The “alternate day food deprivation” (ADF) mice were fed only a fixed amount of food every other day and did not exercise at all.

The EXADF mice were restricted to the ADF eating pattern but also exercised every day on a treadmill for 45 minutes.