Monday, January 31, 2011

A day on Alzheimer's disease

On Friday I attended a series of seminars on Alzheimer's disease at the University of British Columbia's Brain Research Centre. I think the idea was to showcase Canadian research in the field of dementia to woo politicians (also in attendance) and ask them for more funding. We heard about all aspects of Alzheimer's disease, from its history to its treatment, and in this post I will fill you in on the latest developments.

Alzheimer's disease is the number one public health problem in the developed world, with approximately 35 million cases worldwide. In Canada it represents a very expensive problem, estimated to cost 50 million dollars a day. In the time it takes you to read this post, there will be two more people diagnosed with Alzheimer's in Canada. As there are currently no approved treatments that affect the disease itself, there is an urgent need to keep our heads down and power through (bonus points for whoever can identify this reference in the comments) to find a cure.

The first "official" patient with Alzheimer's disease was a 51-year-old woman named Auguste Deter. She was examined by Alois Alzheimer in 1901. She suffered from impaired memory, aphasia (a language disorder) and disorientation. Alzheimer kept meticulous records: we have a very detailed description of Auguste's condition, and even a sample of her handwriting (see picture). But even though he described the condition in great detail, Alois Alzheimer did not know what had caused Auguste's disease. Today, as one of the researchers at the seminar pointed out, we still don't know what causes Alzheimer's disease; we just don't know it in far greater detail.

We do know that one of the main culprits in Alzheimer's disease is amyloid beta (Abeta), a protein that everybody's brain makes. In the brain of an Alzheimer's patient, though, too much of this protein is made, and it aggregates into toxic clumps called plaques. The researchers at the seminar predicted that vaccines against these plaques will fail. However, several candidate drugs that could prevent or treat these plaques are in clinical trials right now.

Interestingly, researchers are also studying naturally occurring compounds: one of the speakers talked about his research looking at whether natural extracts can block the formation of plaques in a "petri dish" model of Alzheimer's (brain cells grown in a dish). He finds that ginger, cinnamon, turmeric, cranberry, rhubarb, blueberry, pomegranate and blackberry all help prevent the aggregation of Abeta. However, he warns that at this point, it is not practical to focus on eating these foods because the concentrations used in the lab are just not possible to recreate in a diet.

Beyond the molecular and biological underpinnings of Alzheimer's disease, researchers are also addressing the inevitable changes the world will need to undergo to accommodate a growing prevalence of dementia. For example, one speaker pointed out that many public places, such as airports and even hospitals, are difficult to navigate even for cognitively healthy people: this represents a true disservice to people with Alzheimer's disease. Efforts are also being made to engage the public (so as to avoid more bad news like this one), and to provide resources for caregivers (such as the fantastic First Link initiative).

Overall, I'm disappointed to report that I didn't learn of any magical intervention that will rid us of Alzheimer's disease, but it's comforting to know that there is a big research community out there that is taking this problem very seriously and tackling it from many different angles.

Monday, January 24, 2011

Not-so-subliminal messages

For this week’s post, I had originally intended to kick off the (Vancouver) cycling season with a post about helmets. So I reviewed the recent evidence to see if I could find an interesting paper. Unfortunately, I ran into a problem of the “boring” kind: the evidence out there is pretty much what you think it is: helmets are good, they prevent injuries. While that’s relevant, it doesn’t make for a great post, because, well, you already know this.

Luckily, I stumbled upon a related story that looks at helmet usage amongst… fictional characters. The study, published in the journal Pediatrics, looks at safety practices depicted in movies over time. This may seem like a silly waste of time (or the project of your dreams, if you're a grad student), but we know that children tend to imitate what they see in movies (and that is why my eventual kids will never see the “Jackass” movies). Given that by age 18 the average child has spent two years in front of a screen, we might want to know a little more about the kinds of influences they may be getting from mass media.

The researchers started by identifying the 25 top-grossing G-rated (general audience) and PG-rated (parental guidance suggested) US movies for each year between 2003 and 2007. Of those 125 movies, they excluded any that were animated, fantasy, or documentary, not set in the present day, or not in English. That left them with 67 movies. The researchers then analyzed the safety practices in all the scenes that included characters with speaking roles either walking, driving or riding in a car, driving or riding in a boat, or riding a bike (for a grand total of 958 scenes).

The results show that in movies, just over half (56%) of motor-vehicle passengers wear seat belts, just over a third of pedestrians (35%) use crosswalks, three quarters of boaters (75%) wear personal flotation devices (or lifejackets), and a quarter (25%) of cyclists wear helmets.

Compared with similar studies carried out in the 1990s and early 2000s, there is a significant overall improvement in the depictions of safety practices. However, about half of the scenes still show unsafe practices. What’s more, movie characters rarely suffer the consequences of unsafe behavior. How many times have you seen someone get up after falling off a cliff and thought, “Come on!”? The depictions of unsafe behavior, combined with the absence of consequences for these behaviors, may lead children to minimize dangers in real life. So parents, make sure you point it out when you see characters acting unsafely!

Now the study excluded quite a few movies for simplicity’s sake, and ended up with a fairly small sample, so it would be premature to generalize these results to all movies out there. I would be especially interested in finding out how animated movies fare, since they definitely cater to a younger crowd (Simba sure learned the consequences of *his* unsafe behavior). A later post, perhaps, if such a study exists!
Definitely not the crosswalk!

Reference: Injury-prevention practices as depicted in G- and PG-rated movies, 2003-2007. (2010) Tongren JE et al. Pediatrics 125(2):290-4.

Sunday, January 16, 2011

Dogs have owners, cats have staff, and children have rashes

Eczema, a skin inflammation that affects 15 to 30 percent of children and 2 to 10 percent of adults worldwide, gets its name from an ancient Greek word that literally means "to boil out". I know firsthand why this word was chosen, as I suffer from contact dermatitis, a form of eczema that is caused by an allergic reaction (to nickel). The itch is not unlike that of bug bites, and this is by far the best depiction I've ever seen of what it feels like to be itchy:

(the ant hill is a nice touch)

As I'm sure you can imagine, having a child with this condition is a bucket load of fun. The ointments, the whining, the scratching, the scabs... (and, in my case, the sewing of little patches of fabric behind *every* jeans button... Thanks, mom!). So while eczema is not really a life-threatening condition, researchers are looking into it, because it's very closely tied to parental sanity.

We already know that eczema is not purely genetic: the environment you grow up in can influence your chances of developing the itchy rash. However, what we don't know is what components of the environment play an important role. A recent study attempts to add a piece to this puzzle by researching whether family pets can have an impact on the development of eczema.

Researchers followed over 600 children starting at one year of age. At the start of the study, the parents of each child were asked to fill out a survey about their environment, and researchers took a dust sample from each house to test for allergens and such. Three years later, the researchers evaluated which children had developed eczema and which hadn't, and analyzed what contributing factors might have played a role.

It turns out that owning a dog is not only good for your blood pressure: children who lived in a house with a dog had a significantly lower risk of developing eczema by four years of age. What about cats? The situation was a bit trickier for cats: living with a cat increased a child's risk of developing eczema, but only if the child tested positive on a cat allergen sensitivity test (the skin-prick kind).

So get rid of Mittens and adopt Fido? Not so fast. First, these findings don't hold true for all allergy-related conditions. For example, dogs are thought to contribute to asthma. Second, what this study really does is highlight how complicated these conditions are: several different types of environmental exposures may impact allergies in different ways, so it's very hard to draw clean, straightforward conclusions and guidelines.

That said, I'm still going to blame my eczema on growing up in a dogless home. I wish I had known this tidbit of information way back when I was a kid: it might have helped me in my campaign to get a pet (admittedly, my heart was set on a horse).

A healthy start!

Reference: Opposing effects of cat and dog ownership and allergic sensitization on eczema in an atopic birth cohort (2010) Epstein TG et al. Journal of Pediatrics [Epub ahead of print].

Wednesday, January 12, 2011

New Year's resolutions through delayed gratification

It's that time of the year. That I-will-eat-better-and-exercise-lots time. It's that time when we kick-start New Year's resolutions with the best intentions, the best plans, the most motivation. Unfortunately, and I can tell you this from experience, some of us will fail. A recent article published in the journal Obesity sheds light on one important aspect of keeping some resolutions: delayed gratification.

Delayed gratification, as the name implies, refers to the ability to forgo an immediate reward (for example, delicious Cheesy Poofs) for a benefit that will come later (for example, rocking that little black dress). A lot of research shows that the better you are at delaying gratification, the better you do in life in general (you may have heard of the famous marshmallow study). To see if delayed gratification is linked to obesity, a team of researchers set out to test whether children who have a high body mass index (BMI) are less likely to delay gratification.

The researchers looked at data from an educational obesity intervention program. In this program, attended by obese or overweight children along with their siblings (the healthy-weight control group), children earn a point if they complete their weekly goals. They then have two choices: either spend that point immediately on a small toy prize (like a pencil) or save the point to use later on a larger prize worth more than one point (like a basketball). The measure of points saved versus points spent is thought to be a valid model of delayed gratification. So the researchers looked at the relationship between how many points a child saved and that same child's BMI. The results show that a higher BMI is associated with fewer points saved, meaning the children who were overweight or obese had a harder time delaying gratification.

The strength of this study is that the rewards were not food-related. This allowed the researchers to study delayed gratification as a general behavioral trait, and not specifically as it relates to obesity. However, the sample was fairly small (59 children) and the duration of the study fairly short (12 weeks), so it's difficult to say whether delayed gratification plays a role in weight loss.

Overall, the research is relevant in that it suggests that working on delayed gratification (it's possible to "train" to get better at it) may help in obesity interventions. And for all you out there with eat-less-exercise-more resolutions, all I can say is "eyes on the prize"...

Reference: Ability to delay gratification and BMI in preadolescence. (2010) Bruce AS et al. Obesity [Epub ahead of print].

© 2009 Scientific Chick. All Rights Reserved