USE ASTROLOGY TO TREAT PATIENTS

GPs COULD USE ASTROLOGY TO TREAT PATIENTS MORE EFFICIENTLY, SAYS MP

From the News Desk of Jeanne Hambleton

Posted 28 July 2014 | By Sofia Lind, Pulse Daily

GPs may benefit from using astrology to make health services more efficient and reduce public spend, a member of the House of Commons Health Committee has told Pulse.

David Tredinnick, Conservative MP for Bosworth, said he believes that the ancient practice – used in western, eastern and native American cultures for thousands of years – could ‘certainly’ be useful to GPs and also help reduce the cost of the NHS as a whole.

The debate was important, he added, to stop the overuse of antibiotics and reduce pressure on doctors, by guiding patients in understanding what type of ailments they may be prone to, based on the position of the planets at the time they were born.

Asked whether it could also help GPs in treating patients, Mr Tredinnick said: ‘Certainly. Particularly a lot of GPs from the Indian subcontinent would be aware of the Indian astrology and probably apply it. I mean there are doctors here who do astrology. It has been around for so long that I think it is time to stop saying “it has no evidence”, it has been used for 3,000 years in all these cultures and we need to be a bit more broadminded.’

‘[Astrology] does have a part to play and I’ve studied it over 20 years, but it doesn’t work on the basis of double-blind placebo controlled trials. It works on the basis of observation and to a degree intuition, and this is something that we have lost in the health service. We rely too much on evidence and we should listen more to patients’ experience which is what we always used to do.’

For example, astrology may help GPs and their patients understand which pending health issues they should be on the lookout for, he added.

Mr Tredinnick said: ‘The signs of the Zodiac have been associated with different ailments. For example Capricorns are associated with the knees, I am a Capricorn and I’ve always had to watch my knees, Aries is the head, Pisces is the feet and you have the others sort of going in order from top to bottom. That’s a fairly simple way of looking at it but some people because of their astrological make up would be more susceptible to some ailments than others.’

He added that the ‘bigger point’, though, was how turning to complementary medicine could help achieve the ‘Nicholson challenge’ of cutting spending on the NHS by increasing efficiency, and also help avert the ‘drugs crisis’ linked to antimicrobial resistance.

He said: ‘We have to look at ways of reducing that demand. Traditional disciplines, such as herbal medicine, acupuncture, which is very widely used in China, homeopathic medicine – 90% of pregnant women in France use homeopathic medicines – we need to try and have a better understanding of these options to reduce demand, so that it is not just about increasing supply. We have to produce other alternatives to reduce the pressure on existing services in the system and take the pressure off doctors.’

The MP clarified however that he was not suggesting the NHS pays for patients to have their astrological charts done.

‘I have not said that this should be on the health service. I have been quite clear about this. This is something that can be looked at by people, but I am not advocating that the health service pays for this service. To have a chart done, or a map done, an astrological breakdown of someone’s personality and likely behaviour costs about £30. You can go online, there are lots of people doing it, and you can buy it as a computer programme I use and I’ve done it for MPs in the past. It is very, very helpful and based on where the emphasis in your chart is you get some idea of where you are likely to be affected and where you are not.’

Pulse spoke to Mr Tredinnick in light of comments he made in the House of Commons last week that he hoped Government would stop ‘looking just at increasing the supply of drugs and consider the way that complementary and alternative medicine can reduce the demand for drugs, reduce pressures on the health service, increase patient satisfaction, and make everyone in this country happier’.

Readers’ Comments

One reader wrote: “Is it April 1st?”

An anonymous reader wrote: “OMG, a lunatic is really in charge of the asylum. How does someone with views like that get to be on committees dealing with medicine and with science? What next, witch doctors on the NHS? Shall we kill a chicken and read its entrails? Just how much sheer stupidity can the NHS take before blowing up?”

Another nameless reader wrote: “If he were a doctor, letters would right now be winging their way to the GMC. But since he’s a politician he is allowed to spout drivel and draw a salary for it!”

FIBROMYALGIA – LOW THYROID AND FALLING THROUGH THE CRACKS

What Your Doctor Doesn’t Know About Low Thyroid Is Keeping You Sick and Tired

From the News Desk of Jeanne Hambleton

Posted 28 July 2014 | Dr Rodger Murphree’s blog, The Fibro Doctor

When we look at low thyroid function, it really mirrors the symptoms we see in fibromyalgia and chronic fatigue syndrome. When you have a low-functioning thyroid, you start to have fatigue, mental fibro fog, headaches, weight gain, cold hands and feet, poor memory, hair loss, hoarseness of the voice, anxiety and depression, and increased joint and muscle pain.

Along with that, you might get tingling or numbing sensations in your hands and feet, and certainly poor circulation. Typically there is constipation, and you may get high blood pressure, elevated cholesterol, menstrual irregularities, PMS, infertility issues, fibrocystic breast disease, polycystic ovary syndrome, reactive low blood sugar (hypoglycemia), psoriasis, and low immune function.

All these things show up with low thyroid. The reason is that thyroid hormones control every function of every cell in the body. When thyroid hormones are depleted or thyroid function is compromised, all of your cellular energy processes slow down, the metabolism that gives a cell the energy to make something happen biochemically. Since thyroid hormones control every cell, it is no wonder that once they become compromised, you start getting all these different symptoms. That makes the thyroid a very important thing to address if it is an issue. As I said, I find that about 60-70% of my fibromyalgia and chronic fatigue syndrome patients have a low thyroid that has either been misdiagnosed or improperly treated.

The doctor will say, “Let’s check your thyroid. Sounds like your energy is down.” They do the thyroid test, it comes back normal, and they tell you everything is okay. The problem is that many doctors then just start treating your symptoms. If you’re tired, they may put you on a stimulant such as Ritalin or Concerta, a drug that would be illegal if you sold it on the street. These medications wreak havoc on your biochemistry and are a big source of the problems I see in patients who take them, because they make your adrenal glands, your stress-coping glands, even more compromised, especially long term.

They may put you on something for your high cholesterol, even though your cholesterol is elevated because you have a low thyroid. They may put you on something like Paxil for your low moods, so you gain 30-40 pounds on an antidepressant, even though the low moods are due to the low thyroid.

The other thing I’ve seen happen is patients get on prescription medication for their thyroid and they get checked about twice a year. The doctor says everything looks fine, even though they feel terrible because the thyroid medication’s either not the right dose, it’s too weak a dose, or it’s the wrong thyroid medication. That often happens as well.

HEALING EFFECTS OF BATH SALTS

From the News Desk of Jeanne Hambleton

By Dr Rodger Murphree The Fibro Doctor

For over 2,000 years, the Dead Sea has been renowned for its therapeutic effects on bathers. Recently, studies have been done on the health benefits of bathing in Dead Sea salts:

One such study was conducted by Dr. I. Machtey on 103 patients suffering from osteoarthritis and tendinitis…. Improvement was found after as little as one week of treatment for those treated with 7.5% or 2% salt baths. By the study’s end, 80% of the patients reported less pain; 70% experienced improved mobility and 60% were able to decrease their use of analgesics.

All bath salts are sea salts (with the exception of Epsom salt and Himalayan salt), and are obtained naturally from evaporating seawater. Research shows that adding a little sea salt to your bath can increase circulation, ease muscle cramps, relieve arthritis or back pain, and soothe achy, overworked legs and feet. Plus sea salt helps cleanse and detoxify your skin, the largest organ in the body.

I often recommend that my patients use Epsom salt baths to reduce achy, tight muscles.

Did you know? Epsom salt isn’t actually salt. It is a pure mineral compound, magnesium sulfate, in crystal form, so it looks an awful lot like salt, but it contains no sodium chloride, the compound that table salt is made of.

Epsom salt is made up of naturally occurring minerals magnesium and sulfate, which can help improve health in numerous ways. A lack of magnesium—which helps regulate the activity of more than 300 enzymes in the body—can contribute to high blood pressure, hyperactivity, heart problems, constipation, depression, fatigue, restless leg syndrome, and other health issues. Magnesium is one of the most important stress coping chemicals and is quickly depleted in times of stress. The more stress we are under the more magnesium is needed. In today’s fast paced culture, most Americans are suffering from deficient magnesium levels.

More than 80 percent of Americans are deficient in magnesium, which helps the body regulate heart muscles and control high blood pressure.

Along with taking additional magnesium on a daily basis, we can use Epsom salt baths to optimize our magnesium levels.

Sulfate is essential for many biological processes, helping to flush toxins and helping form proteins in joints, brain tissue and mucin proteins.

Next time you’re feeling some aches and pains, try taking a warm 20-minute bath with one to two cups of Epsom salt. Trust me, you’ll become addicted.

I like to mix one cup of Epsom salt and one cup of regular sea bath salts with several drops of eucalyptus oil for a soothing, health-promoting bath that cleanses the skin, relaxes the muscles and opens up the sinuses.

See you tomorrow, Jeanne


NASA’S HUBBLE TO BEGIN SEARCH BEYOND PLUTO FOR A NEW HORIZONS MISSION TARGET
From the News Desk of Jeanne Hambleton
Embargo expired: 16-Jun-2014 11:00 AM EDT
Source Newsroom: Space Telescope Science Institute (STScI)

Newswise — After careful consideration and analysis, the Hubble Space Telescope Time Allocation Committee has recommended using Hubble to search for an object the Pluto-bound NASA New Horizons mission could visit after its flyby of Pluto in July 2015.

The planned search will involve targeting a small area of sky in search of a Kuiper Belt object (KBO) for the outbound spacecraft to visit. The Kuiper Belt is a vast debris field of icy bodies left over from the solar system’s formation 4.6 billion years ago.

[Caption from an accompanying artist’s impression: a Kuiper Belt object, a city-sized icy relic left over from the birth of our solar system. The Sun, more than 4.1 billion miles (6.7 billion kilometers) away, shines as a bright star embedded in the glow of the zodiacal dust cloud, with Jupiter and Neptune visible as orange and blue “stars” to its right.]

A KBO has never been seen up close because the belt is so far from the Sun, stretching out to a distance of 5 billion miles into a never-before-visited frontier of the solar system.

“I am pleased that our science peer-review process arrived at a consensus as to how to effectively use Hubble’s unique capabilities to support the science goals of the New Horizons mission,” said Matt Mountain, director of the Space Telescope Science Institute (STScI) in Baltimore, Maryland.

The full execution of the KBO search is contingent upon the results from a pilot observation using Hubble observations provided by Mountain’s director’s discretionary time.

The space telescope will scan an area of sky in the direction of the constellation Sagittarius to try and identify any objects orbiting within the Kuiper Belt. To discriminate between a foreground KBO and the clutter of background stars in Sagittarius, the telescope will turn at the predicted rate that KBOs are moving against the background stars. In the resulting images, the stars will be streaked, but any KBOs should appear as pinpoint objects.

If the test observation identifies at least two KBOs of a specified brightness, it will demonstrate statistically that Hubble has a chance of finding an appropriate KBO for New Horizons to visit. At that point, an additional allotment of observing time will continue the search across a field of view roughly the angular size of the full Moon.

Astronomers around the world apply for observing time on the Hubble Space Telescope. Competition for time on the telescope is extremely intense and the requested observing time significantly exceeds the observing time available in a given year.

Proposals must address significant astronomical questions that can only be addressed with Hubble’s unique capabilities and are beyond the capabilities of ground-based telescopes. The proposals are peer reviewed annually by an expert committee, which looks for the best possible science that can be conducted by Hubble and recommends to the STScI director a balanced program of small, medium, and large investigations.

Though Hubble is powerful enough to see galaxies near the horizon of the universe, finding a KBO is a challenging needle-in-a-haystack search. A typical KBO along New Horizons’ trajectory may be no larger than Manhattan Island and as black as charcoal.

Since even before the launch of New Horizons in 2006, Hubble has provided consistent support for this edge-of-the-solar-system mission. Hubble was used to discover four small moons orbiting Pluto and its binary companion object Charon, providing new targets to enhance the mission’s scientific return.

And Hubble has provided the most sensitive search yet for potentially hazardous dust rings around Pluto. Hubble also has made a detailed map of the dwarf planet’s surface, which astronomers are using to plan New Horizons’ close-up reconnaissance photos.

In addition to Pluto exploration, recent Hubble solar system observations have discovered a new satellite around Neptune, probed the magnetospheres of the gas-giant planets, found circumstantial evidence for oceans on Europa, and uncovered several bizarre cases of asteroids disintegrating before our eyes.

Hubble has supported numerous NASA Mars missions by monitoring the Red Planet’s seasonal atmospheric changes. Hubble has made complementary observations in support of the Dawn asteroid mission, and comet flybys. Nearly 20 years ago, in July 1994, Hubble documented the never-before-seen string of comet collisions with Jupiter that resulted from the tidal breakup of comet Shoemaker-Levy 9.

“The planned search for a suitable target for New Horizons further demonstrates how Hubble is effectively being used to support humankind’s initial reconnaissance of the solar system,” said Mountain.

“Likewise, it is also a preview of how the powerful capabilities of the upcoming James Webb Space Telescope will further bolster planetary science. We are excited by the potential of both observatories for ongoing solar system exploration and discovery.”

For images and more information about Hubble, visit: http://hubblesite.org/news/2014/29 and http://www.nasa.gov/hubble

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington, D.C.

RESEARCHER SHOWS HOW STRESS HORMONES PROMOTE BRAIN’S BUILDING OF NEGATIVE MEMORIES

Important clinical implications for understanding PTSD and memory in women

From the News Desk of Jeanne Hambleton
Released: 23-Jul-2014 12:00 PM EDT
Source Newsroom: Arizona State University College of Liberal Arts and Sciences – Citations Neuroscience

Newswise — When a person experiences a devastating loss or tragic event, why does every detail seem burned into memory, while a host of positive experiences simply fades away?

It’s a bit more complicated than scientists originally thought, according to a study recently published in the journal Neuroscience by Arizona State University researcher Sabrina Segal.

When people experience a traumatic event, the body releases two major stress hormones: norepinephrine and cortisol. Norepinephrine boosts heart rate and controls the fight-or-flight response, commonly rising when individuals feel threatened or experience highly emotional reactions. It is chemically similar to the hormone epinephrine – better known as adrenaline.

In the brain, norepinephrine in turn functions as a powerful neurotransmitter or chemical messenger that can enhance memory.

Research on cortisol has demonstrated that this hormone can also have a powerful effect on strengthening memories. However, studies in humans up until now have been inconclusive – with cortisol sometimes enhancing memory while at other times having no effect.

A key factor in whether cortisol has an effect on strengthening certain memories may rely on activation of norepinephrine during learning, a finding previously reported in studies with rats.

In her study, Segal, an assistant research professor at the Institute for Interdisciplinary Salivary Bioscience Research (IISBR) at ASU, and her colleagues at the University of California- Irvine showed that human memory enhancement functions in a similar way.

Conducted in the laboratory of Larry Cahill at U.C. Irvine, Segal’s study included 39 women who viewed 144 images from the International Affective Picture Set, a standardized picture set used by researchers to elicit a range of responses, from neutral to strong emotional reactions, upon viewing.

Segal and her colleagues gave each of the study’s subjects either a dose of hydrocortisone – to simulate stress – or a placebo just prior to viewing the picture set. Each woman then rated her feelings at the time she was viewing the image, in addition to giving saliva samples before and after. One week later, a surprise recall test was administered.

What Segal’s team found was that “negative experiences are more readily remembered when an event is traumatic enough to release cortisol after the event, and only if norepinephrine is released during or shortly after the event.”

“This study provides a key component to better understanding how traumatic memories may be strengthened in women,” Segal added, “because it suggests that if we can lower norepinephrine levels immediately following a traumatic event, we may be able to prevent this memory-enhancing mechanism from occurring, regardless of how much cortisol is released following a traumatic event.”

Further studies are needed to explore to what extent the relationship between these two stress hormones differs between males and females, particularly because women are twice as likely to develop disorders from stress and trauma that affect memory, such as posttraumatic stress disorder (PTSD).

In the meantime, the team’s findings are a first step toward a better understanding of neurobiological mechanisms that underlie traumatic disorders, such as PTSD.


HUBBLE DETECTS GAS STREAMER ECLIPSING SUPERMASSIVE BLACK HOLE

From the FMS Global News Desk of Jeanne Hambleton

Embargo expired: 19-Jun-2014 2:00 PM EDT
Source Newsroom: Space Telescope Science Institute (STScI)

Citations Science Express, Jun-2014

Newswise — An international team of astronomers, using data from several NASA and European Space Agency (ESA) space observatories, has discovered unexpected behavior from the supermassive black hole at the heart of the galaxy NGC 5548, located 244.6 million light-years from Earth. This behavior may provide new insights into how supermassive black holes interact with their host galaxies.

Immediately after NASA’s Hubble Space Telescope observed NGC 5548 in June 2013, this international research team discovered unexpected features in the data. They detected a stream of gas flowing rapidly outward from the galaxy’s supermassive black hole, blocking 90 percent of its emitted X-rays.

“The data represented dramatic changes since the last observation with Hubble in 2011,” said Gerard Kriss of the Space Telescope Science Institute (STScI) in Baltimore, Maryland. “I saw signatures of much colder gas than was present before, indicating that the wind had cooled down due to a significant decrease in X-ray radiation from the galaxy’s nucleus.”

The discovery was made during an intensive observing campaign that also included data from NASA’s Swift spacecraft, Nuclear Spectroscopic Telescope Array (NuSTAR), and Chandra X-ray Observatory, as well as ESA’s X-ray Multi-Mirror Mission (XMM-Newton) and Integral gamma-ray observatory (INTEGRAL).

After combining and analyzing data from all six sources, the team was able to put together the pieces of the puzzle. Supermassive black holes in the nuclei of active galaxies, such as NGC 5548, expel large amounts of matter through powerful winds of ionized gas. For instance, the persistent wind of NGC 5548 reaches velocities exceeding 621 miles (approximately 1,000 kilometers) a second. But now a new wind has arisen, much stronger and faster than the persistent wind.

“These new winds reach speeds of up to 3,107 miles (5,000 kilometers) per second, but they arise much closer to the nucleus than the persistent wind,” said lead scientist Jelle Kaastra of the SRON Netherlands Institute for Space Research.

“The new gas outflow blocks 90 percent of the low-energy X-rays that come from very close to the black hole, and it obscures up to a third of the region that emits the ultraviolet radiation at a few light-days distance from the black hole.”

The newly discovered gas stream in NGC 5548 — one of the best-studied of the type of galaxy known as a Type I Seyfert — provides the first direct evidence of a shielding process that accelerates the powerful gas streams, or winds, to high speeds. These winds only occur if their starting point is shielded from X-rays.

It appears the shielding in NGC 5548 has been going on for at least three years, but the stream only recently began crossing our line of sight.

“There are other galaxies with similar streams of gas flowing outward from the direction of its central black hole, but we’ve never before found evidence that the stream of gas changed its position as dramatically as this one has,” said Kriss. “This is the first time we’ve seen a stream like this move into our line of sight. We got lucky.”

Researchers also deduced that in more luminous quasars, the winds may be strong enough to blow off gas that otherwise would have become “food” for the black hole, thereby regulating both the growth of the black hole and that of its host galaxy.

These results are being published online in the Thursday issue of Science Express.

For images and more information about Hubble, visit:

http://hubblesite.org/news/2014/30 — http://www.nasa.gov/hubble — http://www.spacetelescope.org/news/heic1413

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. STScI conducts Hubble science operations and is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington, D.C.

NEW BRAIN PATHWAYS FOR TYPE 2 DIABETES AND OBESITY

UT SOUTHWESTERN RESEARCHERS UNCOVER NEURAL PATHWAYS

From the FMS Global News Desk of Jeanne Hambleton

Released: 25-Jul-2014 2:00 PM EDT
Source Newsroom: UT Southwestern Medical Center

Citations Nature Neuroscience 17, 911–913 (2014)

Newswise — DALLAS – July 25, 2014 – Researchers at UT Southwestern Medical Center have identified neural pathways that increase understanding of how the brain regulates body weight, energy expenditure, and blood glucose levels – a discovery that can lead to new therapies for treating Type 2 diabetes and obesity.

The study, published in Nature Neuroscience, found that melanocortin 4 receptors (MC4Rs) expressed by neurons that control the autonomic nervous system are key in regulating glucose metabolism and energy expenditure, said senior author Dr. Joel Elmquist, Director of the Division of Hypothalamic Research, and Professor of Internal Medicine, Pharmacology, and Psychiatry.

“A number of previous studies have demonstrated that MC4Rs are key regulators of energy expenditure and glucose homeostasis, but the key neurons required to regulate these responses were unclear,” said Dr. Elmquist, who holds the Carl H. Westcott Distinguished Chair in Medical Research, and the Maclin Family Distinguished Professorship in Medical Science, in Honor of Dr. Roy A. Brinkley.

“In the current study, we found that expression of these receptors by neurons that control the sympathetic nervous system, seem to be key regulators of metabolism. In particular, these cells regulate blood glucose levels and the ability of white fat to become ‘brown or beige’ fat.”

Using mouse models, the team of researchers, including co-first authors Dr. Eric Berglund, Assistant Professor in the Advanced Imaging Research Center and Pharmacology, and Dr. Tiemin Liu, a postdoctoral research fellow in Internal Medicine, deleted MC4Rs in neurons controlling the sympathetic nervous system.

This manipulation lowered energy expenditure and subsequently caused obesity and diabetes in the mice. The finding demonstrates that MC4Rs are required to regulate glucose metabolism, energy expenditure, and body weight, including thermogenic responses to diet and exposure to cold. Understanding this pathway in greater detail may be a key to identifying the exact processes in which type 2 diabetes and obesity are developed independently of each other.

In 2006, Dr. Elmquist collaborated with Dr. Brad Lowell and his team at Harvard Medical School to discover that MC4Rs in other brain regions control food intake but not energy expenditure.

The American Diabetes Association lists Type 2 diabetes as the most common form of diabetes. The disease is characterized by high blood glucose levels caused by the body’s lack of insulin or inability to use insulin efficiently, and obesity is one of the most common causes.

Future studies by Dr. Elmquist’s team will examine how melanocortin receptors may lead to the “beiging” of white adipose tissue, a process that converts white adipose to energy-burning brown adipose tissue.

Other UT Southwestern researchers involved in the study include Dr. Philipp Scherer, Director of the Touchstone Center for Diabetes Research, Professor of Internal Medicine and Cell Biology, and holder of the Gifford O. Touchstone, Jr. and Randolph G. Touchstone Distinguished Chair in Diabetes Research; Dr. Kevin Williams, Assistant Professor of Internal Medicine; Dr. Syann Lee, Instructor of Internal Medicine; Dr. Jong-Woo Sohn, postdoctoral research fellow; and Charlotte Lee, senior research scientist.

The study was supported by the National Institutes of Health, the American Diabetes Association, and the American Heart Association.

About UT Southwestern Medical Center
UT Southwestern, one of the premier academic medical centers in the nation, integrates pioneering biomedical research with exceptional clinical care and education. The institution’s faculty includes many distinguished members, including six who have been awarded Nobel Prizes since 1985. Numbering more than 2,700, the faculty is responsible for groundbreaking medical advances and is committed to translating science-driven research quickly to new clinical treatments. UT Southwestern physicians provide medical care in 40 specialties to nearly 91,000 hospitalized patients and oversee more than 2 million outpatient visits a year.

NEW CLUE HELPS EXPLAIN HOW BROWN FAT BURNS ENERGY

Investigators identify a major transcription factor that drives brown fat’s thermogenic process

From the FMS Global News Desk of Jeanne Hambleton

Embargo expired: 3-Jul-2014 12:00 PM EDT
Source Newsroom: Beth Israel Deaconess Medical Center

Newswise — BOSTON – The body contains two types of fat cells, easily distinguished by color: white and brown. While white fat serves to store excess calories until they are needed by the body, brown adipocytes actually burn fat by turning it into heat.

Ever since it was discovered that adult humans harbor appreciable amounts of brown fat, investigators have been working to better understand its thermogenic fat-burning properties with the ultimate goal of developing novel therapies to combat obesity and diabetes.

Now, research led by investigators at Beth Israel Deaconess Medical Center (BIDMC) adds another piece to the puzzle, demonstrating that the transcription factor IRF4 (interferon regulatory factor 4) plays a key role in brown fat’s thermogenic process, regulating energy expenditure and cold tolerance. The findings appear in the July 3 issue of the journal Cell.

“The discovery several years ago that brown fat plays an active role in metabolism suggested that if we could manipulate the number or activity of these fat cells, we could force our bodies to burn extra calories,” explains the study’s senior author Evan Rosen, MD, PhD, an investigator in the Division of Endocrinology, Diabetes and Metabolism at BIDMC and Associate Professor of Medicine at Harvard Medical School.

“Now that we have identified a major factor driving this process, we can look for new approaches to exploit this for therapeutic benefit.”

Turned on by cold temperatures and by certain hormones and drugs, including epinephrine, brown fat generates heat through the actions of a group of genes collectively termed the thermogenic gene expression program, the best known of which encodes uncoupling protein 1 (UCP1). UCP1 dissipates, or wastes, energy in the mitochondria of brown fat cells, causing heat generation as a byproduct.

“There has been intense interest in how the UCP1 gene is regulated, with most attention focused on a molecule called PGC-1 alpha,” explains Rosen. “PGC-1 alpha was discovered 15 years ago in the lab of coauthor Bruce Spiegelman, and is a transcriptional co-factor, which means that it indirectly drives the transcription of genes like UCP1 because it lacks the ability to bind to DNA itself. This suggested that there must be a bona fide transcription factor, or DNA-binding protein, that was mediating the effects of PGC-1 alpha, but despite years of work and several promising candidates, no clear thermogenic partner for PGC-1 alpha had been discovered. It turns out that IRF4 is that partner.”

Interferon regulatory factors (IRFs) play important roles in the regulation of the immune system. Rosen’s group had previously identified IRF4 as a key element in adipocyte development and lipid handling, having discovered that IRF4 expression is induced by fasting in fat and that animals that lack IRF4 in adipose tissue are obese, insulin resistant and cold intolerant.

In this new work, led by first author Xingxing Kong, PhD, a postdoctoral fellow in the Rosen lab, the scientists hypothesized that in addition to serving as a key regulator of lipolysis, IRF4 might also play a direct thermogenic role in brown fat.

Experiments in mouse models confirmed their hypothesis, demonstrating that IRF4 is induced by cold and cAMP in adipocytes and is sufficient to promote increased thermogenic gene expression, energy expenditure and cold tolerance. Conversely, loss of IRF4 in brown fat resulted in reduced thermogenic gene expression and energy expenditure, obesity and cold intolerance. Finally, the researchers showed that IRF4 physically interacts with PGC-1 alpha to promote UCP1 expression and thermogenesis.

“We’ve known a lot about how these genes are turned on by cold or when stimulated by catecholamine drugs such as epinephrine,” explains Rosen.

“But we did not know what was turning on this gene program at the molecular level. With this new discovery of IRF4’s key transcriptional role, perhaps we can identify new drug targets that directly affect this pathway, which might be more specific than simply giving epinephrine-like drugs, which drive up heart rate and blood pressure.”

In addition to Rosen and Kong, coauthors include BIDMC investigators Tiemin Liu (now at the University of Texas Southwestern Medical Center), Songtao Yu (now at Northwestern University Feinberg School of Medicine), Xun Wang and Sona Kang; Alexander Banks, Lawrence Kazak, Rajesh R. Rao, Paul Cohen, James C. Lo, Sandra Kleiner and Bruce M. Spiegelman of Dana-Farber Cancer Institute; and Yu-Hua Tseng, Aaron M. Cypess and Ruidan Xue of Joslin Diabetes Center.

This study was funded, in part, by National Institutes of Health grants R01 DK31405 and R01 DK085171 and an American Heart Association postdoctoral fellowship to Xingxing Kong.

Beth Israel Deaconess Medical Center is a patient care, teaching and research affiliate of Harvard Medical School, and currently ranks third in National Institutes of Health funding among independent hospitals nationwide.

The BIDMC health care team includes Beth Israel Deaconess Hospital-Milton, Beth Israel Deaconess Hospital-Needham, Beth Israel Deaconess Hospital-Plymouth, Anna Jaques Hospital, Cambridge Health Alliance, Lawrence General Hospital, Signature Health Care, Commonwealth Hematology-Oncology, Beth Israel Deaconess HealthCare, Community Care Alliance, and Atrius Health. BIDMC is also clinically affiliated with the Joslin Diabetes Center and Hebrew Senior Life and is a research partner of Dana-Farber/Harvard Cancer Center. BIDMC is the official hospital of the Boston Red Sox. For more information, visit http://www.bidmc.org.

Back tomorrow with more news. Jeanne


EATING MEAT CONTRIBUTES TO CLIMATE CHANGE

From the FMS Global News Desk of Jeanne Hambleton
Posted on July 21, 2014 by Stone Hearth News

Stanford, CA

Eating meat contributes to climate change, due to greenhouse gases emitted by livestock. New research finds that livestock emissions are on the rise and that beef cattle are responsible for far more greenhouse gas emissions than other types of animals. The work is published in the journal Climatic Change.

Carbon dioxide is the most prevalent gas when it comes to climate change. It is released by vehicles, industry, and forest removal, and comprises the greatest portion of greenhouse gas totals. But methane and nitrous oxide are also greenhouse gases and account for approximately 28 percent of global warming activity.

Methane and nitrous oxide are released, in part, by livestock. Animals release methane as a result of microorganisms involved in their digestive processes, and nitrous oxide from decomposing manure. Livestock-related releases of these two gases account for a quarter of non-carbon dioxide gas emissions and 9 percent of total greenhouse gas emissions overall.

The research team, including Dario Caro, formerly of Carnegie and now at the University of Siena in Italy, and Carnegie’s Ken Caldeira, estimated the greenhouse gas emissions related to livestock in 237 countries over nearly half a century and found that livestock emissions increased by 51 percent over this period.

They found a stark difference between livestock-related emissions in the developing world, which accounts for most of this increase, and those released by developed countries. Developing-world emissions are expected to increase further, as demand for meat, dairy products, and eggs is predicted by some scientists to double by 2050. By contrast, developed countries reached maximum livestock emissions in the 1970s and have been in decline since that time.

“The developing world is getting better at reducing greenhouse emissions caused by each animal, but this improvement is not keeping up with the increasing demand for meat,” said Caro. “As a result, greenhouse gas emissions from livestock keep going up and up in much of the developing world.”

Breaking it down by animal, beef and dairy cattle accounted for 74 percent of livestock-related greenhouse gas emissions, with 54 percent coming from beef cattle and 17 percent from dairy cattle. Part of this is due to the abundance of cows, but it is also because cattle emit greater quantities of methane and nitrous oxide than other animals. Sheep accounted for 9 percent, buffalo 7 percent, pigs 5 percent, and goats 4 percent.

“That tasty hamburger is the real culprit,” Caldeira said. “It might be better for the environment if we all became vegetarians, but a lot of improvement could come from eating pork or chicken instead of beef.”

The Carnegie Institution for Science is a private, nonprofit organization headquartered in Washington, D.C., with six research departments throughout the U.S. Since its founding in 1902, the Carnegie Institution has been a pioneering force in basic scientific research. Carnegie scientists are leaders in plant biology, developmental biology, astronomy, materials science, global ecology, and Earth and planetary science.

THE REAL PRICE OF STEAK: NEW RESEARCH REVEALS THE COMPARATIVE ENVIRONMENTAL COSTS OF ANIMAL-BASED FOODS

From the FMS Global News Desk of Jeanne Hambleton
Released: 23-Jul-2014 3:00 PM EDT
Source Newsroom: Weizmann Institute of Science
Citations Proceedings of the National Academy of Sciences, June 2014

Newswise — We have heard that eating beef is bad for the environment, but do we know its real cost? Are other animal or animal-derived foods better or worse? New research at the Weizmann Institute of Science, conducted in collaboration with scientists in the U.S., compared the environmental costs of various foods and came up with some surprisingly clear results.

The findings, which appear in the Proceedings of the National Academy of Sciences (PNAS), will hopefully inform not only individual dietary choices but also those of governmental agencies that set agricultural and marketing policies.

Dr. Ron Milo of the Institute’s Department of Plant Sciences, together with his research student Alon Shepon and in collaboration with Tamar Makov of Yale University and Dr. Gidon Eshel in New York, asked which types of animal-based food one should consume, environmentally speaking. Though many studies have addressed parts of this issue, none have done such a thorough, comparative study that gives a multi-perspective picture of the environmental costs of food derived from animals.

The team looked at the five main sources of protein in the American diet: dairy, beef, poultry, pork, and eggs. Their idea was to calculate the environmental inputs – the costs – per nutritional unit: a calorie or gram of protein. The main challenge the team faced was to devise accurate, faithful input values. For example, cattle grazing on arid land in the western half of the U.S. use enormous amounts of land, but relatively little irrigation water. Cattle in feedlots, on the other hand, eat mostly corn, which requires less land, but much more irrigation and nitrogen fertilizer.

The researchers needed to account for these differences, but determine aggregate figures that reflect current practices and thus approximate the true environmental cost for each food item.

The inputs the researchers employed came from the U.S. Department of Agriculture databases, among other resources. Using the U.S. for this study is ideal, says Dr. Milo, because much of the data quality is high, enabling them to include, for example, figures on import-export imbalances that add to the cost.

The environmental inputs the team considered included land use, irrigation water, greenhouse gas emissions, and nitrogen fertilizer use. Each of these costs is a complex environmental system. For example, land use, in addition to tying up this valuable resource in agriculture, is the main cause of biodiversity loss. And nitrogen fertilizer creates pollution in natural waters.

When the numbers were in, including those for the environmental costs of different kinds of feed (pasture, roughage such as hay, and concentrates such as corn), the team developed equations that yielded values for the environmental cost – per calorie, and then per unit of protein – for each food.
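The per-unit-cost approach described above can be sketched in a few lines of code. To be clear, all food names and input values below are hypothetical placeholders invented for illustration, not the study’s data or equations; the sketch only shows the shape of the calculation, converting environmental inputs per quantity produced into costs per calorie and relative multiples.

```python
# Illustrative sketch only: the study's actual input data are NOT reproduced
# here. All numbers below are hypothetical placeholders.

# Hypothetical environmental inputs per 1,000 kcal produced:
# land in m^2, irrigation water in liters, GHG in kg CO2-eq, nitrogen in g.
FOODS = {
    "beef":    {"land": 150.0, "water": 500.0, "ghg": 10.0, "nitrogen": 300.0},
    "poultry": {"land":   6.0, "water":  50.0, "ghg":  2.0, "nitrogen":  50.0},
}

def cost_per_calorie(food: str, resource: str) -> float:
    """Environmental input per single kcal for the given food and resource."""
    return FOODS[food][resource] / 1000.0

def relative_cost(resource: str) -> float:
    """How many times more of a resource beef uses than poultry, per kcal."""
    return cost_per_calorie("beef", resource) / cost_per_calorie("poultry", resource)

if __name__ == "__main__":
    for r in ("land", "water", "ghg", "nitrogen"):
        print(f"beef vs. poultry, {r}: {relative_cost(r):.1f}x")
```

The same division could be repeated per gram of protein instead of per calorie, which is the second normalization the team reports.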

The calculations showed that the biggest culprit is beef. That was no surprise, say Dr. Milo and Mr. Shepon. The surprise was the size of the gap: in total, eating beef is about 10 times more costly to the environment, on average, than other animal-derived foods, including pork and poultry. Cattle require on average 28 times more land and 11 times more irrigation water, are responsible for releasing 5 times more greenhouse gases, and consume 6 times as much nitrogen as egg or poultry production.

Poultry, pork, eggs, and dairy all came out fairly similarly, which was also surprising, because dairy production is often thought to be relatively environmentally benign. But the research shows that the price of irrigating and fertilizing the crops fed to milk cows – as well as the relative inefficiency of cows in comparison to other livestock – jacks up the cost significantly.

Dr. Milo believes that this study could have a number of implications. In addition to helping individuals make better choices about their diet, it should also help inform agricultural policy. And the tool the team has created for analyzing the environmental costs of agriculture can be expanded and refined for application to other areas, such as understanding the relative cost of plant-based diets, or the diets of other nations. In addition to calculating comparisons, it can point to areas that might be improved.

Models based on this study can help policy makers decide how to better ensure food security through sustainable practices.

Dr. Ron Milo’s research is supported by the Mary and Tom Beck – Canadian Center for Alternative Energy Research; the Lerner Family Plant Science Research Endowment Fund; the European Research Council; the Leona M. and Harry B. Helmsley Charitable Trust; Dana and Yossie Hollander, Israel; the Jacob and Charlotte Lehrman Foundation; the Larson Charitable Foundation; the Wolfson Family Charitable Trust; Charles Rothschild, Brazil; Selmo Nissenbaum, Brazil; and the estate of David Arthur Barton. Dr. Milo is the incumbent of the Anna and Maurice Boukstein Career Development Chair in Perpetuity.

The Weizmann Institute of Science in Rehovot, Israel, is one of the world’s top-ranking multidisciplinary research institutions. The Institute’s 2,700-strong scientific community engages in research addressing crucial problems in medicine and health, energy, technology, agriculture, and the environment. Outstanding young scientists from around the world pursue advanced degrees at the Weizmann Institute’s Feinberg Graduate School. The discoveries and theories of Weizmann Institute scientists have had a major impact on the wider scientific community, as well as on the quality of life of millions of people worldwide.


HUBBLE FINDS THREE SURPRISINGLY DRY EXOPLANETS

From the News Desk of Jeanne Hambleton
Released: 24-Jul-2014 8:00 AM EDT
Source Newsroom: Space Telescope Science Institute (STScI)
Citations The Astrophysical Journal Letters, July-2014

Newswise — Astronomers using NASA’s Hubble Space Telescope have gone looking for water vapor in the atmospheres of three planets orbiting stars similar to the Sun — and have come up nearly dry.

The three planets, known as HD 189733b, HD 209458b, and WASP-12b, are between 60 and 900 light-years away from Earth and were thought to be ideal candidates for detecting water vapor in their atmospheres because of their high temperatures, at which water turns into a measurable vapor.

These so-called “hot Jupiters” are so close to their stars that they reach temperatures between 1,500 and 4,000 degrees Fahrenheit. Even so, the planets were found to have only one-tenth to one one-thousandth the amount of water predicted by standard planet-formation theories.

“Our water measurement in one of the planets, HD 209458b, is the highest-precision measurement of any chemical compound in a planet outside our solar system, and we can now say with much greater certainty than ever before that we have found water in an exoplanet,” said Nikku Madhusudhan of the Institute of Astronomy at the University of Cambridge, England.

“However, the low water abundance we have found so far is quite astonishing.”

Madhusudhan, who led the research, said that this finding presents a major challenge to exoplanet theory.

“It basically opens a whole can of worms in planet formation. We expected all these planets to have lots of water in them. We have to revisit planet formation and migration models of giant planets, especially “hot Jupiters,” and investigate how they are formed.”

He emphasizes that these results may have major implications in the search for water in potentially habitable Earth-sized exoplanets.

Instruments on future space telescopes may need to be designed with a higher sensitivity if target planets are drier than predicted.

“We should be prepared for much lower water abundances than predicted when looking at super-Earths (rocky planets that are several times the mass of Earth),” Madhusudhan said.

Using near-infrared spectra of the planets observed with Hubble, Madhusudhan and his collaborators estimated the amount of water vapor in each of the planetary atmospheres that explains the data.

The planets were selected because they orbit relatively bright stars that provide enough radiation for an infrared-light spectrum to be taken.

Absorption features from the water vapor in the planet’s atmosphere are detected because they are superimposed on the small amount of starlight that glances through the planet’s atmosphere.

Detecting water is almost impossible for transiting planets from the ground because Earth’s atmosphere has a lot of water in it, which contaminates the observation.

“We really need the Hubble Space Telescope to make such observations,” said Nicolas Crouzet of the Dunlap Institute at the University of Toronto and co-author of the study.

The currently accepted theory on how giant planets in our solar system formed, known as core accretion, states a planet is formed around the young star in a protoplanetary disk made primarily of hydrogen, helium, and particles of ices and dust composed of other chemical elements. The dust particles stick to each other, eventually forming larger and larger grains.

The gravitational forces of the disk draw in these grains and larger particles until a solid core forms. This then leads to runaway accretion of both solids and gas to eventually form a giant planet.

This theory predicts that the proportions of the different elements in the planet are enhanced relative to those in its star, especially oxygen, which is supposed to be the most enhanced.

Once the giant planet forms, its atmospheric oxygen is expected to be largely encompassed within water molecules. The very low levels of water vapor found by this research raise a number of questions about the chemical ingredients that lead to planet formation.

“There are so many things we still do not know about exoplanets, so this opens up a new chapter in understanding how planets and solar systems form,” said Drake Deming of the University of Maryland, College Park, who led one of the precursor studies.

“The problem is that we are assuming the water to be as abundant as in our own solar system. What our study has shown is that water features could be a lot weaker than our expectations.”

The findings are published July 24 in The Astrophysical Journal Letters.

For images and more information about Hubble, visit: http://hubblesite.org/news/2014/36 and http://www.nasa.gov/hubble

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington.

FLY-INSPIRED SOUND DETECTOR

New Device Based on a Fly’s Freakishly Acute Hearing May Find Applications in Futuristic Hearing Aids and Military Technology

From the News Desk of Jeanne Hambleton
Embargo expired: 22-Jul-2014 11:00 AM EDT
Source Newsroom: American Institute of Physics (AIP)
Citations Applied Physics Letters

Newswise — WASHINGTON D.C., July 22, 2014 – Even within a phylum so full of mean little creatures, the yellow-colored Ormia ochracea fly is distinguished among other arthropods for its cruelty — at least to crickets.

Native to the southeastern U.S. states and Central America, the fly is a most predatory sort of parasite. It swoops onto the back of a singing male cricket, deposits a smear of larvae, and leaves its wicked brood to invade, kill and consume the cricket from inside out.

None of this would be possible without the fly’s ability to find a cricket – the cornerstone of its parasitic lifestyle.

The fly can pinpoint the location of a chirping cricket with remarkable accuracy because of its freakishly acute hearing, which relies upon a sophisticated sound processing mechanism that really sets it apart from all other known insects.

Now a team of researchers at the University of Texas at Austin has developed a tiny prototype device that mimics the parasitic fly’s hearing mechanism, which may be useful for a new generation of hypersensitive hearing aids.

Described in the journal Applied Physics Letters, from AIP Publishing, the 2-millimeter-wide device uses piezoelectric materials, which turn mechanical strain into electric signals. The use of these materials means that the device requires very little power.

“Synthesizing the special mechanism with piezoelectric readout is a big step forward towards commercialization of the technology,” said Neal Hall, an assistant professor in the Cockrell School of Engineering at UT Austin. “Minimizing power consumption is always an important consideration in hearing-aid device technology.”

There are military and defense applications as well, and Hall’s work was funded by the Defense Advanced Research Projects Agency (DARPA). In dark environments, for instance, where visual cues are not available, localizing events using sound may be critical.

Super Evolved Hearing

Humans and other mammals have the ability to pinpoint sound sources because of the finite speed of sound combined with the separation between our ears.

The spacing of several centimeters or more creates a slight difference in the time it takes sound waves to hit our ears, which the brain processes perceptually so that we can always experience our settings in surround sound.

Insects generally lack this ability because their bodies are so small that sound waves essentially hit both sides simultaneously. Many insects do detect sound vibrations, but they may rely instead on visual or chemical sensing to find their way through the fights, flights and forages of daily life.

O. ochracea is a notable exception. It can locate the direction of a cricket’s chirp even though its ears are less than 2 mm apart — a separation so slight that the time of arrival difference between its ears is only about four millionths of a second (0.000004 sec).
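The arrival-time figures above follow from simple geometry: the maximum interaural time difference is just the ear separation divided by the speed of sound. This back-of-the-envelope sketch (the human 20 cm spacing is an assumed round number for illustration) reproduces the scale of the numbers in the article.

```python
# Back-of-the-envelope interaural time difference (ITD) calculation.
# The maximum ITD occurs when sound arrives along the axis through both ears.
SPEED_OF_SOUND = 343.0  # m/s, dry air at roughly 20 degrees C

def max_itd(ear_separation_m: float) -> float:
    """Maximum time difference between sound arriving at each ear, in seconds."""
    return ear_separation_m / SPEED_OF_SOUND

human_itd = max_itd(0.20)    # ears ~20 cm apart: several hundred microseconds
fly_itd = max_itd(0.0015)    # Ormia's ~1.5 mm spacing: roughly 4 microseconds
```

At 1.5 mm the delay works out to about 4.4 millionths of a second, consistent with the "about four millionths of a second" quoted above, and two orders of magnitude smaller than the human case.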

But the fly has evolved an unusual physiological mechanism to make the most of that tiny difference. In the four millionths of a second between the sound’s arrival at one ear and its arrival at the other, the sound’s phase shifts slightly.

The fly’s ear has a structure that resembles a tiny teeter-totter about 1.5 mm long.

Teeter-totters, by their very nature, vibrate such that their opposing ends move with a 180-degree phase difference, so even a very small phase difference in the incident pressure waves forces one end into a mechanical motion that is 180 degrees out of phase with the other. This effectively amplifies the four-millionths-of-a-second time delay and allows the fly to locate its cricket prey with remarkable accuracy.

Such an ability is almost the equivalent of a human feeling an earthquake and being able to discern the direction of the epicenter by virtue of the difference in time between when the right and left foot first felt the tremor — except the fly’s hearing is even more sensitive than that, said Hall.

Mimicking the Mechanism

The pioneering work in discovering the fly’s unusual hearing mechanism was done by Ronald Miles at Binghamton University and colleagues Ronald Hoy and Daniel Robert, who first described the phase amplification mechanism the fly uses to achieve its directional hearing some 20 years ago.

In 2013, Miles and his colleagues presented a microphone inspired by the fly’s ears. (See related release: http://newswise.com/articles/researchers-design-sensitive-new-microphone-modeled-on-fly-ear).

Inspired by Miles’s prior work, Hall and his graduate students Michael Kuntzman and Donghwan Kim built a miniature pressure-sensitive teeter-totter in silicon that has a flexible beam and integrated piezoelectric materials.

The use of piezoelectric materials was their original innovation and it allowed them to simultaneously measure the flexing and the rotation of the teeter-totter beam.

Simultaneously measuring these two vibration modes allowed them to replicate the fly’s special ability to detect sound direction in a device essentially the same size as the fly’s physiology.

This technology may be a boon for many people in the future, since 2 percent of Americans wear hearing aids, but perhaps 10 percent of the population could benefit from wearing one, Hall said.

“Many believe that the major reason for this gap is patient dissatisfaction,” he added.

“Turning up the gain to hear someone across from you also amplifies all of the surrounding background noise – resembling the sound of a cocktail party.”

The new technology could enable a generation of hearing aids that have intelligent microphones that adaptively focus only on those conversations or sounds that are of interest to the wearer. But before the devices become part of the next generation of hearing aids or smartphones, more design and testing is needed.

“The delicate mechanism must be protected from consumer handling with surrounding packaging,” Hall said, “something the fly need not worry too much about.”

The article, “Sound source localization inspired by the ears of the Ormia ochracea,” is authored by Michael L. Kuntzman and Neal Hall. It will be published in the journal Applied Physics Letters on July 22, 2014.

ABOUT THE JOURNAL

Applied Physics Letters features concise, rapid reports on significant new findings in applied physics. The journal covers new experimental and theoretical research on applications of physics phenomena related to all branches of science, engineering, and modern technology.

Now I know why a fly disappears as I approach him with a fly swatter. He has heard me coming.


GENETIC DISORDER MAY HELP FIND A WAY TO REDUCE HEART DISEASE

From the News Desk of Jeanne Hambleton
Posted July 2014
Source: Clinical Center, National Institutes of Health

New insights into a rare genetic disorder affecting the immune system may lead to treatments for atherosclerosis, the arterial plaques or blockages that are one cause of heart disease. Dr. John I. Gallin, director of the Clinical Center, presented on the advancements at a recent Contemporary Clinical Medicine: Great Teachers Grand Rounds Lecture. Dr. David Bluemke, director of Radiology and Imaging Sciences at the hospital, also presented on the use of new medical imaging techniques to detect atherosclerosis.

Gallin gave a brief history of the genetic immune disorder, chronic granulomatous disease, and explained the important role it may play in protecting a person from the consequences of atherosclerosis. Bluemke highlighted the advances medical imaging has provided for early diagnosis of atherosclerosis.

Chronic granulomatous disease affects roughly 1 in 200,000 people, said Gallin. The disorder prevents white blood cells known as phagocytes from producing chemicals that eliminate harmful pathogens in the body.

The disease is characterized by recurrent infections and granulomas, or small areas of inflammation, on the skin and inside the body that can lead to severe lung, skin, bone and other organ damage.

Gallin said the cause of the disease can be traced to five mutations in an enzyme complex called NADPH oxidase. The condition’s severity varies depending on which mutation a person has.

Gallin and his colleagues have used what they learned about each mutation to develop individualized therapeutic approaches for different types of chronic granulomatous disease.

Over the years, Gallin noticed that people with the disease developed atherosclerosis at a slower rate than those without it. Soon, other researchers began publishing articles with similar conclusions.

A protocol was developed at the Clinical Center to assess the prevalence of atherosclerosis in patients with immune system disorders compared to those without. The results of the protocol confirmed Gallin’s hypothesis.

“What we’ve shown is that chronic granulomatous disease patients are protected from carotid artery thickening,” Gallin said.

Now, his team is working with the NIH’s National Center for Advancing Translational Sciences to identify potential drug targets. Meanwhile, Bluemke highlighted that researchers are using new medical imaging techniques such as CT, MRI and PET to detect atherosclerosis as early as possible.

The research on these disorders, which is still ongoing, could have profound impacts on cardiovascular disease, the leading cause of death in the U.S., which costs health care more than $400 billion annually, according to the American Heart Association.

Bluemke noted that doctors need noninvasive imaging methods to detect early atherosclerosis. Traditionally, doctors look at atherosclerosis risk factors, such as age, gender, blood pressure, family history and cholesterol levels to determine risk. These risk factors work well for large populations, but are less helpful for each individual patient.

At the Clinical Center, new medical imaging techniques are allowing researchers to identify early signs of atherosclerosis before a patient shows any symptoms using CT and MRI of the blood vessels in the heart and neck.

The next step is figuring out how to treat it. Clinical trials are underway at the hospital to compare image-guided approaches to lowering atherosclerosis risk factors with standard care guided by traditional risk factors.

Preliminary evidence suggests that atherosclerosis is a reversible condition, according to Bluemke. In one trial, patients were treated with statins, which are a cholesterol-lowering medication, for a year and a half. The statins decreased the amount of plaque in the arteries.

“A disease which we didn’t really envision as being reversible has a possibility of being regressed in this manner,” said Bluemke.

An additional benefit is that the image-guided approaches to treating atherosclerosis are non-invasive and the risk of radiation exposure is low.


STOP ASPIRIN TRY ANTICOAGULANTS, SAYS NICE

From the News Desk of Jeanne Hambleton
Source Pulse magazine
Posted 22 July 2014

NICE’s guidance to stop using aspirin in AF patients (atrial fibrillation) is clinically sound, but comes with little sign of the necessary additional resources, finds Caroline Price

Practices will face a major drive to review all patients with atrial fibrillation who are on aspirin and switch most of them to an oral anticoagulant, after NICE published its long-awaited updated guidance on management of the arrhythmia.

All parties agree on the evidence behind the main recommendation of the guideline – that anticoagulation therapy should be the only option for stroke prevention – but it has sparked concerns among practices over how they will implement the recommendations systematically without additional resources being made available.

It could even leave GPs with little option other than to refer patients on to secondary care, the GPC warns.

The numbers of patients affected will differ by practice, but various studies suggest 20-40% of those with atrial fibrillation across the UK – anywhere between 200,000 and 350,000 patients – are still on aspirin, 90% of whom are eligible for anticoagulation.1-3

The guidance also adds some additional elements of complexity for GPs, advising them to adopt the CHA2DS2-VASc risk score instead of the older CHADS2 score. They should also use the unfamiliar HAS-BLED score to weigh up and then modify patients’ risk of bleeding.
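For readers unfamiliar with it, the CHA2DS2-VASc score mentioned above is a simple additive tally. The sketch below follows the standard published component weights; it is a demonstration of how the score is assembled, not a clinical tool.

```python
# Illustrative tally of the CHA2DS2-VASc stroke-risk score (standard weights).
# Demonstration only; not a clinical decision aid.

def cha2ds2_vasc(age: int, female: bool, chf: bool, hypertension: bool,
                 diabetes: bool, prior_stroke_tia: bool,
                 vascular_disease: bool) -> int:
    score = 0
    score += 1 if chf else 0                  # C: congestive heart failure
    score += 1 if hypertension else 0         # H: hypertension
    score += 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)  # A2 / A
    score += 1 if diabetes else 0             # D: diabetes mellitus
    score += 2 if prior_stroke_tia else 0     # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0     # V: vascular disease
    score += 1 if female else 0               # Sc: sex category (female)
    return score

# Example: a 78-year-old woman with hypertension tallies 2 (age) + 1 + 1 = 4.
```

The maximum possible score is 9; higher scores indicate greater stroke risk and, under the updated guidance, strengthen the case for anticoagulation over aspirin.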

Up to date

The updated guidance has been warmly received by cardiology experts, mainly because it brings the NICE recommendations up to date and in line with European Society of Cardiology guidelines, published in 2010.

In particular, it draws on overwhelming evidence that anticoagulation is much more effective than antiplatelet treatment in terms of reducing stroke and all-cause mortality, while aspirin treatment is no longer considered beneficial because any effect on the risk of ischaemic stroke is offset by the harms of bleeding.

For practices, though, the workload issues are significant and will require extra resources. CCG leads estimate practices will have an average of 140 patients with atrial fibrillation on their list, of whom around 45 will be on aspirin and therefore need review and then several follow-up consultations to manage the transition to anticoagulation.

Dr Peter Scott, a GP in Solihull and GPC representative for the West Midlands, says: ‘It’s not going to happen unless it’s resourced and incentivised as part of a DES or LES, or through the QOF. Until then, I don’t think a systematic approach to this will happen.’

The GPC, while welcoming the guidance, in particular the clarification of the role of aspirin, has also warned the updated recommendations will be too complex for GPs to manage within normal routine care and has called on CCGs to develop enhanced services to support the process.

Dr Andrew Green, chair of the GPC clinical and prescribing subcommittee, says unless GPs are given more resources and support, they may feel they need to refer patients to secondary care.

Dr Green says: ‘I would expect GPs as part of their normal work to consider whether [atrial fibrillation] patients not on anticoagulation should be, in the light of the new guidance.

‘If they should be, then the choice is between anticoagulation with warfarin or one of the newer agents, and if GPs do not feel they have the expertise or resources to do this properly, they have a duty to refer to someone who can.’

He adds: ‘Commissioners need to predict this activity and may want to commission a service specifically for this, which is more cost-effective than a traditional outpatient referral.’

Dr Matthew Fay, a GP in Shipley, Yorkshire, and a member of the NICE AF guidelines development group, concedes practices may need extra help.

He says: ‘If GPs feel uncomfortable with [managing anticoagulation] then they should be approaching the CCG executive to say, “we need a service to provide expert support for this”. The CCG may choose to come up with an enhanced service.’

What the new AF guidance says

• GPs should use the CHA2DS2-VASc score to assess stroke risk in patients with atrial fibrillation and offer anticoagulation to men with a score above 0, and women with a score above 1.

• Offer anticoagulation therapy with warfarin, another vitamin K antagonist or one of the new oral anticoagulants – dabigatran, rivaroxaban or apixaban.

• Aspirin should no longer be offered solely for stroke prevention. Only continue aspirin if the patient takes it for another reason, such as occlusive vascular disease; if they are at low risk of stroke and do not want to start anticoagulation; or if a cardiologist advises continuing aspirin alongside anticoagulation – for example, as part of dual or triple antithrombotic therapy following coronary stenting.

• GPs should also use the HAS-BLED score to assess patients’ bleeding risk and then monitor and correct risk factors for bleeding, including high blood pressure, poor INR control and harmful alcohol consumption.

• Patients on warfarin should now have their time in therapeutic range calculated at each visit, with dosing adjusted accordingly. If poor anticoagulation control cannot be improved, discuss an alternative anticoagulant with the patient.

Source: NICE 2014. The management of atrial fibrillation. CG180. guidance.nice.org.uk/CG180
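Both scores named in the guidance are simple additive checklists. The sketch below is purely illustrative (the parameter names are ours, and this is not a clinical tool), but it shows how each score is tallied:

```python
# Illustrative sketch of the CHA2DS2-VASc and HAS-BLED scores referred to
# in the NICE guidance. Field names are our own; not for clinical use.

def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_tia, vascular_disease):
    """CHA2DS2-VASc stroke-risk score (0-9)."""
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if chf else 0            # congestive heart failure
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if stroke_tia else 0     # prior stroke / TIA / thromboembolism
    score += 1 if vascular_disease else 0
    return score

def has_bled(hypertension, abnormal_renal, abnormal_liver, stroke,
             bleeding_history, labile_inr, age_over_65, drugs, alcohol):
    """HAS-BLED bleeding-risk score (0-9); each risk factor scores 1."""
    return sum([hypertension, abnormal_renal, abnormal_liver, stroke,
                bleeding_history, labile_inr, age_over_65, drugs, alcohol])

# A 72-year-old woman with hypertension: 1 (age 65-74) + 1 (female)
# + 1 (hypertension) = 3, above the thresholds in the guidance.
print(cha2ds2_vasc(72, True, False, True, False, False, False))  # 3
```

The HAS-BLED result is used differently from the stroke score: as the guidance says, it flags modifiable bleeding risk factors to correct rather than giving a cut-off for withholding anticoagulation.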

Opportunistic review

CCG leads acknowledge they may see more referrals into secondary care, but have said GPs should be able to manage the transition with the support of networks in primary care.

Dr Chris Arden, NHS West Hampshire CCG’s lead on cardiology and a GP in Southampton, says: ‘I do think GPs may refer on to other colleagues in the community or secondary care, which is going to be something we will need to try to manage.

‘It is a real concern if everyone reacts quickly on it and you could see patients pushed into secondary care – but I don’t think that’s necessarily appropriate. I think there are colleagues within the CCG and within practices who can advise.’

Dr Fay adds that GPs can look at reviewing patients opportunistically.

He says: ‘I think that’s perfectly acceptable. A lot of these patients who are at risk in this situation we will be reviewing because of their hypertension and other comorbidities, and those patients on aspirin should have that discussed at the next presentation.’

Others insist CCGs need to take a more proactive approach if the work is to be prioritised. Dr John Robson, a GP in Tower Hamlets and University College London Partners primary care lead for cardiovascular disease, recently led a programme in Tower Hamlets, east London, that saw practices raise the proportion of atrial fibrillation patients on anticoagulation by 10% over two years.

Education

The programme provided practices with software to identify patients on aspirin who would be suitable for anticoagulation, arranged educational sessions with local cardiologists and haematologists, and published individual practices’ performance to encourage benchmarking against peers.

Although the improvements in Tower Hamlets were made without financial incentives, the CCG has since introduced an enhanced service and Dr Robson says CCGs do need to set aside resources to identify patients proactively and educate GPs about the new recommendations, including the use of the newer oral anticoagulants.

Dr Robson says: ‘It does require some resource, because somebody has to organise the educational meetings and put the software tools onto practices’ computer systems. We need to make it easy for practices to do it.’

In the meantime, GPs without such support will need to take responsibility for reviewing and updating patients.

According to medicolegal advisors, practices should take steps to ensure patients are identified and reviewed, whether opportunistically or more proactively, or risk being in breach of their duty of care.

Dr Pallavi Bradshaw, a medicolegal advisor at the Medical Protection Society, says: ‘If it was found such steps had not been taken, and a patient slipped through the system, the practice could be criticised if the person went on to have a stroke. You might be able to establish there was a breach in the duty of care. That could also be the case if an individual doctor had been informed the guidance had changed and perhaps forgot to change it.’

Dr Bradshaw adds that GPs should document any decisions that depart from the NICE guidance, particularly as discussions with patients who have been on aspirin for a long time may be tricky.

She says: ‘Patients may not understand or even be scared if they are suddenly going from aspirin to an anticoagulant. GPs might need to get a second opinion from a specialist for the patient, or have a discussion with the consultant.’

How we moved most of our patients with atrial fibrillation onto anticoagulation:

1. Stroke risk assessment

We took a pragmatic approach and continued to use the CHADS2 tool for routine assessment, as a score of 1 or greater is equivalent to a CHA2DS2-VASc score of 2 or greater. Only in cases where the patient had a CHADS2 score of 0 did we consider using the CHA2DS2-VASc score, to determine if the patient was truly low risk and did not require anticoagulation.

2. Bleeding risk

To assess bleeding risk, we used the SPARC tool (www.sparctool.com), or the HAS-BLED score, the use of which is now recommended in the updated NICE guidance. Bleeding risk was a great concern for GPs, and to a lesser extent patients, but even if the risk on HAS-BLED is high (a score greater than 3), if the CHA2DS2-VASc score is above 1 then the net clinical benefit is always in favour of an anticoagulant.

3. Choice of intervention

We used material produced by the Atrial Fibrillation Association extensively to assess the risks and benefits of intervention with an oral anticoagulant or no intervention, and referred patients to the association for further information and support.

4. Offering NOACs

Since NICE approved the use of non-vitamin K antagonist oral anticoagulants (NOACs) in 2011, we have widened the discussion with patients being initiated on anticoagulation: as well as offering either clinic-monitored warfarin or self-tested, self-managed warfarin, we also discuss the option of taking a NOAC.

Some patients who have struggled to achieve a stable therapeutic dose of warfarin, resulting in either very frequent visits to the practice-based clinic or poor time in therapeutic range (<65%), have been offered the option of changing to a NOAC.

Clinicians were supported in NOAC dosing by a locally developed prescribing guide, and currently 18% of our anticoagulated patients are on a NOAC.
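Steps 1 and 2 above amount to a simple triage rule. A minimal sketch, assuming only the equivalences stated in those steps (the function name and return format are ours):

```python
# Hypothetical triage sketch combining steps 1 and 2 above: use CHADS2
# first (a score >= 1 implies CHA2DS2-VASc >= 2), fall back to the fuller
# CHA2DS2-VASc score only when CHADS2 = 0, and note that a high HAS-BLED
# score does not by itself rule out anticoagulation.

def recommend_anticoagulation(chads2, cha2ds2_vasc, has_bled):
    """Return (offer_anticoagulant, note) following the approach above."""
    if chads2 >= 1:
        offer = True                  # CHADS2 >= 1 implies CHA2DS2-VASc >= 2
    else:
        offer = cha2ds2_vasc >= 2     # only checked when CHADS2 = 0
    note = ""
    if offer and has_bled > 3:
        # Step 2: net clinical benefit still favours anticoagulation;
        # review and correct modifiable bleeding risk factors instead.
        note = "high bleeding risk - review modifiable factors"
    return offer, note

print(recommend_anticoagulation(1, 2, 4))
# (True, 'high bleeding risk - review modifiable factors')
print(recommend_anticoagulation(0, 1, 0))  # (False, '')
```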

Dr Matthew Fay is a GP in Shipley and NICE advisor on the atrial fibrillation guidelines. He is also a medical advisor to the Arrhythmia Alliance, the Atrial Fibrillation Association and Anticoagulation Europe.

References:

1. Holt TA et al. Risk of stroke and oral anticoagulant use in atrial fibrillation: a cross-sectional survey. Br J Gen Pract 2012; 62: e710-17
2. Murphy NF et al. A national survey of the prevalence, incidence, primary care burden and treatment of atrial fibrillation in Scotland. Heart 2007; 93: 606-612
3. NICE personal communication, based on IMS Disease Analyzer 2012/12 and GRASP-AF database. April 2014
4. Estimates by CCG leads
5. Aguilar MI et al. Oral anticoagulants versus antiplatelet therapy for preventing stroke in patients with non-valvular atrial fibrillation and no history of stroke or transient ischemic attacks. Cochrane Database Syst Rev 2007; 3: CD006186
6. Mant J et al. Warfarin versus aspirin for stroke prevention in an elderly community population with atrial fibrillation (the Birmingham Atrial Fibrillation Treatment of the Aged Study, BAFTA): a randomised controlled trial. Lancet 2007; 370: 493-503
7. Petersen P et al. Placebo-controlled, randomised trial of warfarin and aspirin for prevention of thromboembolic complications in chronic atrial fibrillation. The Copenhagen AFASAK study. Lancet 1989; 1: 175-179
8. Sato H et al. Low-dose aspirin for prevention of stroke in low-risk patients with atrial fibrillation: Japan Atrial Fibrillation Stroke Trial. Stroke 2006; 37: 447-451
9. Robson J et al. Improving anticoagulation in atrial fibrillation: observational study in three primary care trusts. Br J Gen Pract 2014; 64: e275-e281


UEA RESEARCH SHOWS OCEANS VITAL FOR POSSIBILITY FOR ALIEN LIFE

From the News Desk of Jeanne Hambleton
Posted July 20 2014
Source University of East Anglia Stone Hearth News

Researchers at the University of East Anglia have made an important step in the race to discover whether other planets could develop and sustain life.

New research published today in the journal Astrobiology shows the vital role of oceans in moderating climate on Earth-like planets.

Until now, computer simulations of habitable climates on Earth-like planets have focused on their atmospheres. But the presence of oceans is vital for optimal climate stability and habitability.

The research team from UEA’s schools of Maths and Environmental Sciences created a computer-simulated pattern of ocean circulation on a hypothetical ocean-covered Earth-like planet, and looked at how different planetary rotation rates would affect heat transport once the presence of oceans was taken into account.

Prof David Stevens from UEA’s school of Maths said: “The number of planets being discovered outside our solar system is rapidly increasing. This research will help answer whether or not these planets could sustain alien life.

“We know that many planets are completely uninhabitable because they are either too close or too far from their sun. A planet’s habitable zone is based on its distance from the sun and temperatures at which it is possible for the planet to have liquid water.

“But until now, most habitability models have neglected the impact of oceans on climate.

“Oceans have an immense capacity to control climate. They are beneficial because they cause the surface temperature to respond very slowly to seasonal changes in solar heating. And they help ensure that temperature swings across a planet are kept to tolerable levels.

“We found that heat transported by oceans would have a major impact on the temperature distribution across a planet, and would potentially allow a greater area of a planet to be habitable.

“Mars for example is in the sun’s habitable zone, but it has no oceans – causing air temperatures to swing over a range of 100°C. Oceans help to make a planet’s climate more stable so factoring them into climate models is vital for knowing whether the planet could develop and sustain life.

“This new model will help us to understand what the climates of other planets might be like with more accurate detail than ever before.”

‘The Importance of Planetary Rotation Period for Ocean Heat Transport’ is published in the journal Astrobiology on Monday, July 21, 2014. The research was funded by the Engineering and Physical Sciences Research Council (EPSRC).

LOWER SURVIVAL RATES FOR CONSUMERS OF PROCESSED MEAT

From the News Desk of Jeanne Hambleton
Source Stone Hearth News – American Society for Nutrition
Posted on July 20, 2014

Differences in survival associated with processed and with non processed red meat consumption.

First published July 16, 2014, doi:10.3945/ajcn.114.086249 Am J Clin Nutr September 2014 ajcn.086249.
Andrea Bellavia, Susanna C Larsson, Matteo Bottai, Alicja Wolk, and Nicola Orsini.

Author Affiliations
From the Unit of Nutritional Epidemiology (AB, SCL, AW, and NO) and the Unit of Biostatistics (AB, MB, and NO), Institute of Environmental Medicine, Karolinska Institutet, Stockholm, Sweden.

Author Notes
Supported in part by a Young Scholar Award from the Karolinska Institutet’s Strategic Program in Epidemiology and the Swedish Medical Society (SLS-250271) and by the Swedish Research Council.

Correspondence to A Bellavia, Institute of Environmental Medicine, Karolinska Institutet, PO Box, SE-171 77, Stockholm, Sweden. E-mail: andrea.bellavia@ki.se.

Abstract

Background:
High red meat consumption is associated with an increased mortality risk. This association is partly explained by the negative effect of processed meat consumption, which is widely established. The role of non processed meat is unclear.

Objective:
The objective was to examine the combined association of processed and non processed meat consumption with survival in a large Swedish prospective cohort.

Design:
In a population-based cohort of 74,645 Swedish men (40,089) and women (34,556), red meat consumption was assessed through a self-administered questionnaire.

We estimated differences in survival [15th percentile differences (PDs), differences in the time by which the first 15% of the cohort died] according to levels of total red meat and combined levels of processed and non processed red meat consumption.

Results:
During 15 y of follow-up (January 1998 to December 2012), we documented 16,683 deaths (6948 women; 9735 men). Compared with no consumption, consumption of red meat >100 g/d was progressively associated with shorter survival—up to 2 y for participants consuming an average of 300 g/d (15th PD: –21 mo; 95% CI: –31, –10). Compared with no consumption, high consumption of processed red meat (100 g/d) was associated with shorter survival (15th PD: –9 mo; 95% CI: –16, –2). High and moderate intakes of non processed red meat were associated with shorter survival only when accompanied by a high intake of processed red meat.

Conclusions:
We found that high total red meat consumption was associated with progressively shorter survival, largely because of the consumption of processed red meat. Consumption of non processed red meat alone was not associated with shorter survival.
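The survival measure used in this abstract — the 15th percentile difference — can be illustrated with a toy calculation. The follow-up times below are invented, and the sketch ignores censoring, which the published analysis accounts for:

```python
# Toy illustration of a 15th percentile difference (PD): the difference
# between groups in the time by which the first 15% of each group died.
# Data are invented; real analyses must also handle censoring.

def percentile_time(event_times, p=0.15):
    """Time by which the first fraction p of the group has died."""
    times = sorted(event_times)
    k = max(1, int(round(p * len(times))))  # rank of the p-th percentile death
    return times[k - 1]

low_intake = [4, 7, 9, 12, 13, 14, 14, 15, 15, 15]    # years to death
high_intake = [2, 5, 8, 11, 12, 13, 14, 15, 15, 15]

pd_15 = percentile_time(high_intake) - percentile_time(low_intake)
print(pd_15)  # -2: the first 15% of the high-intake group die 2 y earlier
```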

PROCESSED RED MEAT LINKED TO HIGHER RISK OF HEART FAILURE, DEATH IN MEN –
American Heart Association Rapid Access Journal Report


From the News Desk of Jeanne Hambleton
Posted on June 12, 2014
Source Stone Hearth News – American Heart Association

Men who eat moderate amounts of processed red meat may have an increased risk of incidence and death from heart failure, according to a study in Circulation: Heart Failure, an American Heart Association journal.

Processed meats are preserved by smoking, curing, salting or adding preservatives. Examples include cold cuts (ham, salami), sausage, bacon and hot dogs.

“Processed red meat commonly contains sodium, nitrates, phosphates and other food additives, and smoked and grilled meats also contain polycyclic aromatic hydrocarbons, all of which may contribute to the increased heart failure risk,” said Alicja Wolk, D.M.Sc., senior author of the study and professor in the Division of Nutritional Epidemiology at the Institute of Environmental Medicine, Karolinska Institutet in Stockholm, Sweden.

“Unprocessed meat is free from food additives and usually has a lower amount of sodium.”

The Cohort of Swedish Men study — the first to examine the effects of processed red meat separately from unprocessed red meat — included 37,035 men 45-79 years old with no history of heart failure, ischemic heart disease or cancer.

Participants completed a questionnaire on food intake and other lifestyle factors and researchers followed them from 1998 to the date of heart failure diagnosis, death or the end of the study in 2010.

After almost 12 years of follow-up, researchers found:

• Heart failure was diagnosed in 2,891 men and 266 died from heart failure.

• Men who ate the most processed red meat (75 grams per day or more) had a 28 percent higher risk of heart failure compared to men who ate the least (25 grams per day or less) after adjusting for multiple lifestyle variables.

• Men who ate the most processed red meat had more than a 2-fold increased risk of death from heart failure compared to men in the lowest category.

• For each 50 gram (e.g. 1-2 slices of ham) increase in daily consumption of processed meat, the risk of heart failure incidence increased by 8 percent and the risk of death from heart failure by 38 percent.

• The risk of heart failure or death among those who ate unprocessed red meat didn’t increase.
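The 8-percent-per-50 g figure above compounds multiplicatively if, as is usual for such per-increment estimates, the hazard ratio is log-linear in intake. A quick illustrative calculation (our extrapolation, not a result from the study):

```python
# Illustrative arithmetic for the "8 percent per 50 g" figure above,
# assuming the risk ratio scales log-linearly with daily intake.

def relative_risk(grams_per_day, rr_per_50g=1.08):
    """Heart-failure risk relative to zero processed-meat intake."""
    return rr_per_50g ** (grams_per_day / 50)

# Three 50 g increments compound to roughly a 26% higher risk:
print(round(relative_risk(150), 2))  # 1.26
```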

At the beginning of the study, participants completed a 96-item questionnaire about their diet. Processed meat questions focused on consumption of sausages, cold cuts (ham/salami), blood pudding/sausages and liver pate over the last year. Unprocessed meat questions covered pork and beef/veal, including hamburger or ground-minced meat.

Results of the study for total red meat consumption are consistent with findings from the Physicians’ Health Study, in which men who ate the most total red meat had a 24 percent higher risk of heart failure incidence compared to those who ate the least.

“To reduce your risk of heart failure and other cardiovascular diseases, we suggest avoiding processed red meat in your diet, and limiting the amount of unprocessed red meat to one to two servings per week or less,” said Joanna Kaluza, Ph.D., study lead author and assistant professor in the Department of Human Nutrition at Warsaw University of Life Sciences in Poland.

“Instead, eat a diet rich in fruit, vegetables, whole grain products, nuts and increase your servings of fish.”

Researchers said they expect to find similar associations in a current study conducted with women.

Almost 6 million Americans have heart failure and about 50 percent die within five years of diagnosis. The healthcare costs and loss of productivity due to heart failure are an estimated $34 billion each year, researchers said.

The American Heart Association recommends that people eat a dietary pattern that emphasizes fruits, vegetables, whole grains, low-fat dairy products, poultry, fish, and nuts while limiting red meat and sugary foods and beverages.

For people who eat meat, choose lean meats and poultry without skin and eat fish at least twice a week – preferably fish high in omega-3 fatty acids such as salmon, trout, and herring.

The other co-author is Agneta Akesson, Ph.D. Author disclosures are on the manuscript.

The Swedish Research Council/Medicine and the Swedish Research Council/Infrastructure funded the study.
Additional Resources:

•The American Heart Association’s Diet and Lifestyle Recommendations
•Eat More Chicken, Fish and Beans than Red Meat
•Follow AHA/ASA news on Twitter @HeartNews.

Statements and conclusions of study authors published in American Heart Association scientific journals are solely those of the study authors and do not necessarily reflect the association’s policy or position.

The association makes no representation or guarantee as to their accuracy or reliability. The association receives funding primarily from individuals; foundations and corporations (including pharmaceutical, device manufacturers and other companies) also make donations and fund specific association programs and events.
