Happy Thanksgiving. Let’s talk turkey. Let’s talk tryptophan.
You already know all this. For a long time, people believed it was the tryptophan in the turkey that made everyone sleepy on Thanksgiving. And then folks pointed out that no, it doesn’t work like that: it’s eating all those carbs, being in a warm house, around loved ones, relaxed, after days or even weeks of the usual sleep-deprived lifestyles we lead.
None of that says what, exactly, tryptophan is, or what it does in the brain. Tryptophan is an amino acid, one of the building blocks of protein. Your body can’t make tryptophan itself, so it has to come from food. Among other things, you use tryptophan to make niacin (vitamin B3), which is good for your skin.
Your body also uses tryptophan to make serotonin, and most of the serotonin in your body is in your intestines, where it regulates gut movement. The remainder, in your brain, regulates mood, appetite, and sleep. Thus the source of the turkey = drowsiness myth: serotonin makes you hungry, eating more makes you happy, happiness relaxes you, eating more makes you drowsy, and together the happy/drowsy state puts you to sleep in front of the football game.
From tryptophan to serotonin, and then from serotonin to melatonin. This is the hormone that regulates the circadian rhythm of various parts of your biology. In your brain, this is the cycle that puts you to sleep at the end of the day and wakes you up in the morning—and it’s light-dependent, meaning your pineal gland doesn’t make much when it’s light out.
To show that turkey actually makes you sleepy, then, it would have to do so in the absence of other things that make you sleepy, and in the presence of other things that keep you awake. We sit down to the table and eat a ton of food, sending blood to the stomach, which makes us sleepy. We drink wine, which makes us sleepy. We’re sleep-deprived already. We’re in a safe, warm, comfortable environment. We’re a little bit bored by the football game between two teams we don’t much care about. It’s cloudy outside, the fall weather, a dark gray sky.
Reverse all that. Eat just turkey, and water, after a good month of 8-hours-per-night sleep. Keep the thermostat at 65, bicker with your in-laws about why your college team is better than the ones their children went to, and place a wager on the game: loser does the dishes. Make sure all the lights are on. You will be wide awake, guaranteed.
Nevertheless, your takeaway from all of this should be satisfaction in the knowing. When one of your in-laws tucks in and hauls out the “turkey makes ya sleep” trope: let them. No need to bicker, no need to argue or educate. Keep your cortisol levels low. Enjoy your non-tryptophan-induced nap. You’ve probably earned it.
Thursday, November 28, 2013
Wednesday, November 27, 2013
If Your Brain Had Legs It Would Dance
Can you just imagine it? Your brain on the dance floor, doing the cabbage patch or the running man. Working up a sweat! Getting down with its bad self. Cutting a rug, as it were. Oh, if only. Exercise is supposed to be good for the brain. If only it had legs of its own, to get out there and shake its groove thang.
But wait, it does. You are your brain, and you have legs. When you dance, your brain dances. When you exercise, your brain reaps the benefits. Aerobic exercise makes your heart stronger and your muscles stronger, reduces fat, and at the same time improves cognition, strengthens memory, and boosts mood.
So if you can imagine it, you can do it. If your brain had legs, and it does, it would dance, so it does. So to speak. After all, you have free will, so you can choose to not dance. And if you're so inclined, you might choose to not exercise at all.
You can still improve cognition, strengthen memory, and boost mood. You can eat the right nutrients, engage in the right kinds of mental stimulation, seek out and enjoy social interaction. But what happens when you need to run faster, run further, run longer?
We live in a world where we have the luxury of rarely, if ever, needing to run fast, far, or for long. Your brain can survive without legs. It doesn't have to dance. But it does have to stay young. If it doesn't, as it grows "older," it will eventually not want to do anything.
Thankfully, the flip side is that if you do dance, or run, or swim, or do any kind of aerobic exercise, no matter how old you are, your brain is going to get better. Even if you don’t want it to. But who wouldn’t want it to?
The irony is that, since you can choose to dance, and are not forced to, you can enjoy it for its own sake. You have to eat, have to use your brain on a daily basis, and unless you're a recluse, you have to deal with people. So your brain is going to maintain itself in those activities.
But if you dance, it’s just going to get better.
Tuesday, November 26, 2013
This is Why Addiction is a Brain Problem, Not a Person Problem
This will largely be a semantic argument. If you do not like semantic arguments, you can translate this into a philosophical discussion. If you think philosophy distracts from science, then you can examine how your own experiences reflect on this discussion. If you think anecdotal phenomena are irrelevant, then you can go crawl into a Skinner box and enjoy your life of pre-programmed cause and effect. We guarantee you will enjoy it, as that is how your brain, for you, literally works.
Moving on, a recent post on the Psychology Today blogs discusses the mess that is the DSM V’s treatment of “addiction,” and specifically, the semantic nature of this treatment. What is addiction? Is it nothing more than something that one can’t help but do? Or do we need to add in understandings of consequence, perceived reward, frequency of activity, personal and public usefulness, and so on?
For the most part, the word “addiction” has a negative connotation. It can be used ironically to bolster something that is “so good,” such as an “addiction to chocolate.” The idea is that chocolate is so good, one would rather have it than avoid the negative consequences of eating too much. This is the parallel drawn with, for example, drug addiction: heroin feels so good, one would rather inject it than be healthy, keep a job, maintain personal relationships, and so on.
But there’s, of course, a huge difference between eating too much chocolate and taking drugs. Ostensibly, a person who eats a lot of chocolate, when told he has become diabetic, could wean himself off of candy bars in the hopes of prolonging his life. The drug addict can’t do this, according to the definition of the word. Therefore, the word addiction requires an understanding of a lack of free will.
Take the word addiction out of the discussion, and what do we have left? We have a person who is being forced to do things that he doesn’t want to do. Or, to be precise, does not want to want to do. You could say the person is a slave. How do we emancipate people from being forced to take drugs?
The simple answer is to free them. Give them back their free will. It’s society that responds to heavy drug use by removing junkies from jobs and families, and putting them in jail. The irony is, if the drug user does not have to choose between drugs and society’s approval, then taking drugs is not a choice.
However, heavy drug use is physically damaging, and that’s not something society can simply choose to deny the truth of. But, if addiction is viewed as a medical problem, not a social problem, then access to medical care can treat said physical damage. And in the process, help the addicted choose to be free of drugs.
This blog post is not advocating the legalization of drugs. It is advocating for the abandonment of judgmental attitudes.
Friday, November 22, 2013
The Brain’s Five Kinds of Boredom
One of our writers writes for another blog called “Zombie For Life.” (We think that’s supposed to be an ironic title.) Today he wrote an entry called “5 Reasons People Who Listen to Surf Guitar Will Survive the Zombie Apocalypse.” And since today’s post is about the five kinds of boredom, we thought it would be fun to somehow link the two.
Research performed by a handful of scientists and reported on by Science Daily suggests that boredom can be classified according to levels of arousal and valence. They describe:
- indifferent boredom (relaxed, withdrawn, indifferent)
- calibrating boredom (uncertain, receptive to change/distraction)
- searching boredom (restless, active pursuit of change/distraction)
- reactant boredom (high reactant, motivated to leave a situation for specific alternatives)
- apathetic boredom, an especially unpleasant form that resembles learned helplessness or depression. It is associated with low arousal levels and high levels of aversion.
(The above is directly quoted from Science Daily but reformatted for this blog post.)
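To make the taxonomy concrete, here’s a minimal sketch in Python that treats each type as a point along the arousal and valence dimensions the researchers describe. The level labels and the pairings are our own illustrative reading of the list above, not values from the study:

BOREDOM_TYPES = {
    # (arousal, valence) -> type; the level labels are illustrative guesses
    ("low", "neutral"): "indifferent",
    ("medium", "neutral"): "calibrating",
    ("medium", "negative"): "searching",
    ("high", "negative"): "reactant",
    ("low", "very negative"): "apathetic",
}

def classify_boredom(arousal, valence):
    # Look up the boredom type for a given arousal/valence pair.
    return BOREDOM_TYPES.get((arousal, valence), "unclassified")

print(classify_boredom("low", "very negative"))  # -> apathetic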
In an effort to provide some entertainment today, which is to say, alleviate some boredom, we’d like to suggest that each boredom type, above, can be a means by which to survive a zombie apocalypse:
- Indifferent: this person will maintain a cool head when everyone else is losing control.
- Calibrating: this person will not take for granted the doomed-to-fail survival instructions provided by civic leaders, and will adapt to ever-changing circumstances.
- Searching: this person will not stay idle as things fall apart, becoming a sitting duck for that random zombie who wanders by.
- Reactant: this person will respond immediately to a zombie threat, and will not become complacent or accept a seemingly unavoidable doom.
- Apathetic: this person will be like the undead themselves, blend in, and in doing so will avoid being eaten as the zombies chase after livelier prey.
Yes, we’ve taken a few liberties here, and made some assumptions. But then again, zombies are fictional, so taking liberties and making assumptions is de rigueur for this kind of discussion.
On a serious note, the fifth boredom type, “apathetic,” has only recently been recognized, and its inclusion in the above list reveals some troubling statistics: “Because of the assumed link between boredom and depression, the research group found it alarming that apathetic boredom was reported relatively frequently by 36 percent of the high school students sampled.”
Add to that their determination: “people do not just randomly experience the different boredom types over time… they tend to experience one type” and the 36 percent number is indeed alarming.
We’re hopeful that such studies can help catch signs of depression early so that people can seek proper treatment.
We’re also hopeful that for everyone else, our blog post today alleviates some of their boredom.
Thursday, November 21, 2013
Playing with Brains
Today one of us at TGBR played a game that required memorizing strings of numbers. The game started off with just two digits, but went all the way up to 10 digits. Sometimes the number had to be memorized “backwards,” which made it more challenging.
What was interesting was how even the slightest bit of familiarity aided the process. ‘206’ popped up as part of a longer string, and was easily memorized as a single entity because it’s the area code here in Seattle. Any time 7x7 occurred (737, 747, 727, etc.) that, too, was held in the mind as a single unit. And strings of numbers that started with ‘19’ were easily coded as dates. So, when 2061984737 popped up, instead of feeling like a 10-digit number, it felt like just three chunks.
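For the curious, here’s a toy Python sketch of that chunking idea. The list of “familiar” patterns is a stand-in for whatever a given brain happens to know; it illustrates the mechanism, and is not a model of human memory:

FAMILIAR = ["206", "737", "747", "727"]  # area codes, Boeing jets, etc.

def chunk(digits):
    """Greedily break a digit string into familiar chunks."""
    chunks = []
    i = 0
    while i < len(digits):
        # Prefer a known three-digit pattern...
        if digits[i:i + 3] in FAMILIAR:
            chunks.append(digits[i:i + 3])
            i += 3
        # ...or a four-digit run starting with '19', read as a year...
        elif digits[i:i + 2] == "19" and len(digits) - i >= 4:
            chunks.append(digits[i:i + 4])
            i += 4
        # ...otherwise fall back to a single digit.
        else:
            chunks.append(digits[i])
            i += 1
    return chunks

print(chunk("2061984737"))  # ['206', '1984', '737']: three chunks, not ten digits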
Our brains, it seems, are desperate to attach meaning to things. We prefer character and reason to stark, abstract existence. This is how we, here at TGBR, choose to interpret new findings that show people with so-called “perfect” recall can be tricked into having false memories.
An article at Discover describes how scientists were able to achieve this, and for the sake of time, we won’t rehash the explanation here. Suffice it to say that no one’s memory is invulnerable to manipulation.
We’re insisting it all goes back to a terrible need to consider the question “Why?” Why did that happen? What does it mean? How does that fit into the rest of the universe?
And if the universe is in our own heads, then our memories are their own worst enemies.
Wednesday, November 20, 2013
Giving the Brain the Gift of Giving
Today we celebrate a birthday here at the Great Brain Robbery, as one of our staff is turning 42. Or, as he likes to call it, he’s now “21s.” Last year he was “thirty-eleven.” We think he might have an issue with aging.
And apropos of that we thought we’d look at aging and the brain, but that seems like a bit of a downer. Not that there’s anything wrong with aging per se, and if we look hard enough, we could probably find an article that shows how older brains are better at some things. But instead of that, or all the research on Alzheimer’s and dementia and the like, perhaps that other part of birthdays, gifts, is worthy of examination.
An article in The Washington Post in 2007 discusses a 2006 study which found that altruism lit up parts of the brain associated with experiencing pleasure. Basically, the theory goes that altruism is an evolved tendency, one that exists as a biological component of the brain’s make-up. The specific example cited was that volunteers were asked to imagine giving money away. This was described as “volunteers placed the interests of others above their own.”
Let’s split hairs. Imagining something is one thing; doing it is quite another. Fantasizing about feeling good might be why the volunteers felt good. But, we’re only splitting these hairs to address something else the article brought up again and again: the biology of morality.
The article seemed to lump altruism in with morality, as if giving away money is somehow a moral thing to do. Examples were made of people with certain kinds of brain damage who are better able to make “cold” decisions about selfish (but logical) acts of self-preservation. Other examples included sociopaths who don’t have any sense of morality at all.
The problem we have with this is that it begs the question: is it moral to “do good,” or is it merely moral to “not do bad”? If you were to decide to not hit our birthday boy with your car, even though he was jaywalking, is that moral? Would it make your brain feel good?
Or do you need to send him a huge gift (he’s got an eye on this jacket, for example) before your brain starts smiling?
The article does point out that theologians and philosophers are looking at this research with interest; there’s the question of whether aligning morality with brain chemistry takes away the role of free will in making responsible decisions. We’d like to suggest that what the research actually does is split the hair that needs to be split in order to define what morality actually is.
If altruism merely feels good (our birthday boy’s a size extra-large and likes the jacket in black, by the way) then perhaps morality can be defined by how we feel about it, not what its larger effects are. This reduces the moral question to a more ego-centric approach, but our birthday boy has pointed out that, at least one day a year, being ego-centric is okay.
Friday, November 15, 2013
Trust Your Brain and Your Brain Will Trust You
The Little Engine thought he could, and he made it up the mountain. By force of sheer will, he overcame the challenges before him, and succeeded where before he had failed.
TLE’s now retired, sitting on a sofa, watching TV. Next to him is a bowl of potato chips. He knows he shouldn’t eat them. A few are okay, but not the whole bowl. But there goes his hand, dipping and lifting the crispy treat to his mouth. “I can stop,” he says to himself. “I think I can!” And then he shoves the fistful in, and sighs as he flips channels on the TV.
We’ve all got bad habits. And we’ve all got routines and motions we go through, without even thinking about it. How do we break out of these ruts, and become the people we know we want to be? The key is confidence. It really is all about believing in and trusting yourself.
Those new-agers were right, but science can back up some of the claims by pointing to key areas in the brain. Consider self-control, which occurs in the frontal part of your brain. This is an area with high energy demands, and you can literally run out of the fuel necessary to control yourself (scientists call this “ego depletion”). Now consider the deeper, emotional parts of your brain, the ones that want what they want NOW. You’re fighting temptation all day to the point you’re literally tired of fighting, and since the reward for staying motivated is less insistent than the reward for giving in now, you give in.
You run out of that very will that kept TLE chugging up the mountain.
Trusting yourself can replace that emotional want with a different emotion altogether. This is reminiscent of the way the brain chooses between handing control to the heuristics you use for routines and a more conscious executive decision-making process. Your brain will choose the method in which it has the most confidence—it’s easier to drive a car to work, letting your brain handle the lefts and rights while you mentally prepare the big speech for your boss. On the other hand, you wouldn’t expect to automatically drive through a city you’ve never been in before, or you’ll end up hopelessly lost.
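As a loose analogy (a sketch of the selection rule, not a claim about how neurons implement it), the choice between autopilot and deliberation might look like this in Python, with confidence numbers made up for illustration:

def act(task, habit_confidence):
    # Hand control to the cached routine only when confidence is high;
    # otherwise fall back to slow, deliberate executive processing.
    if habit_confidence.get(task, 0.0) > 0.8:
        return task + ": autopilot (heuristics handle the lefts and rights)"
    return task + ": deliberate mode (full attention required)"

confidence = {"drive the daily commute": 0.95, "drive an unfamiliar city": 0.10}
for task in confidence:
    print(act(task, confidence))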
So, if you’re confident in your decision, then making that decision will be its own reward. Instead of your frontal brain fighting against the pull of your emotions, you harness those emotions to actually push yourself up that hill.
The Little Engine that could should picture himself as a happier engine, sitting in a place where there are no chips. By choosing to concentrate on a positive outcome to his decisions, he can trust himself to execute those proper decisions. Then he can pick up the bowl, take it into the kitchen, and go back to his TV, with the bowl now out of sight.
References:
http://www.psychologytoday.com/blog/addicted-brains/201311/the-key-quitting-self-trust-part-1
http://blogs.scientificamerican.com/mind-guest-blog/2013/11/12/should-habits-or-goals-direct-your-life-it-depends-2/
http://www.psychologicalscience.org/index.php/publications/observer/2013/april-13/why-wait-the-science-behind-procrastination.html
Thursday, November 14, 2013
This is Your Drugs on Brain
The writers at The Great Brain Robbery contribute to other blogs, including one that has, shall we say, a more robust editing process. The topic this week for that blog is nutrition and the brain: what foods are good for the mind, and how does one enjoy those benefits. You know, things like antioxidants from chocolate, B vitamins from fish, folic acid from legumes, vitamin E from nuts and seeds. All of these nutrients make for a healthier brain.
One of the editors for that blog suggested perhaps saying a few things about nootropics, aka “smart drugs.” We’ve heard of those, of course, but we don’t know much about them. Are they for real, do they work, are they safe, is every blog that discusses them required to mention that Bradley Cooper movie?
So we grabbed our grains of salt and went looking. Turns out there’s a healthy, vibrant nootropic culture out there, ready to bring you in if you want to start popping a few pills to make yourself smarter. We are happy to report that our initial searches took us to informative websites (as opposed to websites merely hawking tablets for profit).
The purpose of this blog post is not to be an exhaustive overview of nootropics, nor a basic nootropics 101. Rather, we’d like to point out a few things about what we’ve read so far.
1. Buying prescription drugs from other countries is illegal. We’re not trying to police you, but we do want to point out that the reason why drugs are prescribed is so that dosages can be tailored to the individual, and so that a system of liability can be in place to mitigate unfortunate consequences. Simply put, buying controlled substances online is a gamble, and not worth taking.
2. Over-the-counter supplements are unregulated. This means that a pill in one bottle may not be the same as a pill in another bottle. But this is the case with anything that does not qualify as a “drug,” and so if you trust a certain brand, stick with that brand. Bargain shopping when it comes to supplements can be very tricky.
3. Nootropics are not cure-alls, nor are they meant to be. If you suffer from poor nutrition, bad sleeping habits, undue stress, or significant psychological disorders, popping some piracetam is not going to help you. That nootropics are “supplements” should be your guide—they work best if everything else is working well. They should supplement, not replace, good habits.
The gist of all this is: when in doubt, don’t do it. Our research suggests that the people who can take nootropics with the least risk are those who probably don’t have an overwhelming need for them.
Nevertheless, we are not suggesting that you just say no to nootropics. Rather, we encourage you to do what we did: research. Take your grain of salt, look for counter-arguments to every claim you find, and decide for yourself. And in all things, moderation.
Wednesday, November 13, 2013
When Two Brains Lie
Over at the SciLogs blog, Michael Blume contends: HAD + TOM = Social & Religious Cognitions. Hyper Agency Detection (HAD) is what we call it when we attribute some sort of agency (or “will” if you like) behind a phenomenon. The trees shift in the wind, and we say God is blowing on them. Theory of Mind (TOM) is what we call it when we start to guess why agents do the things they do. God blows on the trees because He wants to remind us He’s there.
That’s a very glib way of putting it, but it should be enough for you to get the point. Blume points out what Darwin pointed out, that assuming there’s some sort of agent behind that noise and then doing something about it is better than ignoring that growling tiger as “just the wind” and getting eaten. In other words, our brains were naturally selected to believe in religious things.
The Great Brain Robbery does not whole-heartedly agree. We don’t think that suggesting our brains are wired for religion dismisses worship as an artifact of survival, nor do we think that worship is promulgated by relics of survival instinct.
We contend that while TOM and HAD can create the fertile ground from which religions are grown, one more essential element is required: the ability to lie.
A study in 2002 found that increasing dopamine in skeptics resulted in their being more inclined to see faces in jumbled images. This study led to another study, which essentially found a gene for believing in God. Again, a glib way to put it, but the point is this: seeing things that aren’t there is just a mild form of schizophrenia. But what if a second person is the source for HAD and TOM? What if this person tells you that the agent is called God and that He wants you to do certain things?
This is how religions are created. Not by seeing things and believing them, but by having those beliefs codified into rules of conduct. Passing false pattern recognition on to others for the purpose of control. And once it takes root, it is very easy to perpetuate: all agency is attributed to God.
But what’s not easy to perpetuate is attributing TOM. Lies require more energy from the brain than telling the truth. This effort requires a reward of some kind to compensate for the extra energy spent. But what is that reward? Is it simply survival? Being in power makes it easier to be less empathetic—are religious leaders merely making it easier for themselves to sacrifice others, thus ensuring their own longevity?
Tuesday, November 12, 2013
Caffeine and Sleep and the Brain, a Terrible Lover’s Triangle
You’re tossing and turning. You just can’t get to sleep, but you’re exhausted. To borrow a phrase from the kids today: double-you tee eff?
Research reported on by Scientific American and My Healthy Daily News suggests that if you’re a night person, you can drink all the coffee you want. Well, not all the coffee you want. But if you’re inclined to have a double-tall in the middle of the afternoon, go ahead.
But if you’re a morning person, best to avoid caffeine later in the day. It all has to do with your so-called chronotype (the fancy word for whether you’re a “day” person or a “night” person). Which begs the question: why do night people need caffeine later in the day at all?
As a nation, we’re all more or less sleep-deprived. While science says about eight hours is best, the average adult only gets six and a half on weekdays. Caffeine helps us get through until the weekend, when we try to pay back some of this sleep debt.
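The arithmetic on that debt is sobering. A quick back-of-the-envelope Python sketch, using the figures above:

IDEAL_HOURS = 8.0
ACTUAL_WEEKDAY_HOURS = 6.5
WEEKDAYS = 5

# Hours of sleep debt accumulated by Friday night
debt = (IDEAL_HOURS - ACTUAL_WEEKDAY_HOURS) * WEEKDAYS
print("Sleep debt by the weekend:", debt, "hours")          # 7.5 hours
# Paying it back over two weekend nights means this much extra sleep per night
print("Extra sleep per weekend night:", debt / 2, "hours")  # 3.75 hours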
But it’s no good if, even during your inadequate six and a half, you’re not getting good sleep.
According to researchers, most people are neither “larks” nor “owls,” and as such should consider careful use of stimulants: a cup of coffee in the morning as part of a meal, not as a quick sleepiness eradicator. Remember: if you’re not awake 30 minutes after getting out of bed, you’re probably sleep-deprived, and caffeine addresses the symptom, not the cause, of your ailment.
When in doubt, just eschew caffeine altogether and try to go to bed an hour early. Boring, yes, but you may find that, with enough sleep, you’re a morning person after all.
Monday, November 11, 2013
This is Your Brain in Love
Remember those commercials from the 90s, the ones where they’d show you an egg, and say “this is your brain,” and then they’d crack it open and start it sizzling in a frying pan, and say “this is your brain on drugs. Any questions?” And the question would of course be “can I get a side of home fries with that?”
The idea was that drugs fry your brain, and so the metaphor was overextended. So, in that vein, let’s look at more ways that your brain can get fried. And since Valentine’s Day is a good three months away, let’s talk about love.
First up, this study, as reported by United Academics, which points out that “Romantic love and partnership is influenced by various substances, such as oxytocin, vasopressin, dopamine, serotonin, cortisol and testosterone.” If you’ve been reading The Great Brain Robbery, you may recognize some of those hormones. Like cortisol, the one that, amongst other things, makes you fat and rips your brain to shreds. Ain’t love great?
And there’s this article at BrainFacts.org, discussing the same and similar research, which says “In male prairie voles, the hormone arginine vasopressin, which is involved in aggression and territorial behavior, also appears to play an important role in pair-bonding.” Did you get that, guys? Love can make you aggressive. Or being aggressive can make you fall in love.
And finally, Discover Magazine’s Seriously Science column reports on a study that shows that, for straight men, just thinking about talking to women can make their brains take a dive.
Add it all up: love makes you fat, aggressive, and stupid.
Well, of course it does.
Of course it doesn’t. Love is too complex to be reduced to a few hormones, and the brain is too complex to be similarly reduced. We encourage you to actually read the above articles and abstracts, as the question of what love does to the brain (or, not to put too fine a point on it, for the brain) is still as mysterious as it is fascinating.
But never mind that. Never mind being rational and cautious about approaching the science of the brain. Never mind reading these scientific articles in the spirit that they’re offered: with a healthy dose of discernment, so as not to take the findings and draw unsubstantiated conclusions. We’ve just read the Neuroskeptic’s guide on how to wave away doubt. And we’re approaching jazz-hands level here.
After all, if love makes us stupid, then being stupid means we deserve to be loved, right?
Labels:
brainfacts,
cortisol,
discover,
drugs,
egg,
love,
metaphor,
united academics
Friday, November 8, 2013
The Irony of Brain Research
Taken as a whole, the very idea of brain research is pretty hilarious.
And this is true no matter what approach you take to studying the mind. Strictly biological, psychological, philosophical, spiritual: each discipline, when looking at the brain itself, is more or less clueless.
Which is not to say remarkable research hasn’t been done. Neuroscience is a very hot subject right now, precisely because of all the fascinating discoveries being made. Psychology has fascinated us for centuries, and since Freud it’s been a rich topic for discussion. And the only thing older than philosophy is religion.
But even Plato figured out that all he could be certain of was that he didn’t know anything. A few thousand years later, Descartes updated that to knowing he exists... and that’s about it. There’s an entire philosophy based on separating the brain itself from the mind. As for anatomy, some of those ancient Greeks did think the brain was the center of thought, until Aristotle came along and switched that job to the heart.
The point is, all of this noodling is being done with the very noodle being noodled. This is ironic, and therefore hilarious. But wait, there’s more: there’s a word for the brain study of laughter: gelotology. Go ahead, laugh at that. Laugh at Jello-tology.
And of course, what little we know about laughter and the brain is… very little. We know that laughter is a distributed emotional response, moving through several parts of the brain. We know it has social uses, as well as therapeutic uses. But next to sleep, laughter may be one of the biggest brain mysteries yet.
The good news is: you don’t have to be a mechanic to drive a car. You don’t have to know how the brain works to have a brilliant mind, and you don’t have to know why a joke is funny to know that it’s funny.
Indeed, sometimes nothing kills a joke like explaining it.
Thursday, November 7, 2013
Don’t Gamble on Brain Health
In one of Farmers’ “Fifteen Seconds of Smart” ads, Professor Nathaniel Burke says “So you want to drive more safely…” And then he says, “Stop eating. Take deep breaths. Avoid bad weather. Get eight hours. Turn it down. And of course, talk to Farmers.”
Although we have no proof of this whatsoever, someone at the ad agency where they make these commercials could have handed the copywriter a list of keys to maintaining a healthy brain, and had the writer distill it down to something that could be used to sell insurance.
The essential elements to maintaining a healthy brain are: Eat smart, exercise, pick your battles, sleep, manage stress, socialize.
There’s a surfeit of research that backs up each of these assertions, and they just make sense. And we’re not even talking about good psychological or emotional health. We’re talking about the very physiology of the brain. No matter what point of view you have on human consciousness, no matter what philosophy you embrace or religion (or lack thereof) you adhere to, there’s no arguing against keeping your physical brain as healthy as possible.
We’ve discussed Maslow’s Hierarchy of Needs before, and it’s interesting how these simple rules of brain health align with the base of that needs pyramid. In our hypermodern world, we live more or less in the top two areas, esteem and self-actualization. And yet many of us, perhaps most of us, merely “check in” with those needs at the bottom. And our brains suffer.
Insurance is, essentially, gambling on disaster. You pay into a system just in case you’re unlucky enough to suffer an accident. Farmers wants you to hedge that bet, in their favor, by taking care of your mind so you don’t increase the odds of becoming unlucky. Take this either way you like—either insurance companies don’t want you injured, or they don’t want to pay you.
Either way, you are better off paying and not needing it than not paying and needing it. The same is true for brain health. Of course, if you’re lucky, you can get away with not paying and not needing it. But we can guarantee you, when it comes to brain health, you should “pay” into maintaining these basics, because you will definitely need your brain.
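To make the gambling metaphor concrete, here’s a toy expected-value sketch in Python. Every number in it is invented for illustration:

premium = 1_000         # hypothetical yearly cost of "paying in"
disaster_cost = 50_000  # hypothetical cost if the bad outcome hits
p_disaster = 0.01       # hypothetical chance of the bad outcome per year

print("Expected yearly cost if you pay in:", premium)
print("Expected yearly cost if you don't: ", p_disaster * disaster_cost)
# With these made-up numbers, skipping the premium "wins" on average; that is
# exactly how insurers stay profitable. The point above is that for brain
# health the probability is effectively 1: you will definitely need your brain.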
Labels:
ads,
balance,
brain health,
commercials,
farmers,
gambling,
maslow,
needs
Wednesday, November 6, 2013
Defending Brain Training
We’re pro brain-training at The Great Brain Robbery, so when we see articles that suggest brain-training doesn’t “work,” we’re a bit biased. We offer this self-description as a form of full disclosure before proceeding to the following discussion.
An article at Science Daily (a website we have much respect for and read every day) reports on a study forthcoming in Psychological Science. This study assigned older adults to one of two groups: one group learned new skills such as photography, while the other was tasked with listening to music and doing puzzles at home. Both groups included subjects who were assigned social tasks, to control for the effects of social interaction. Researchers found that those who were assigned to learn a new skill had better memory improvement.
The article quotes the scientists as saying “It seems it is not enough just to get out and do something -- it is important to get out and do something that is unfamiliar and mentally challenging …. When you are inside your comfort zone you may be outside of the enhancement zone.”
This makes sense, and is very encouraging, because for anyone with a brain, there should be a world of challenges available to keep the mind sharp. And while playing video games is not the same as taking an eight-week course on Taiwanese Dessert Decorating, at least games can still be challenging, and importantly, readily available.
But, we want to point out that this study does not mention a control group of subjects who did neither the skill-seeking activities nor the home puzzle activities. So, yes, while learning basket weaving on a three-week trip through Namibia would probably be very good for the brain, this doesn’t mean that challenging yourself to play brain training games wouldn’t be.
Note also that the article calls listening to music and doing puzzles “non-demanding mental activities.” We especially disagree with this assertion, inasmuch as doing a crossword puzzle can be merely an “exercise” (like, on a Monday) whereas anyone who’s noodled through a Friday New York Times crossword puzzle knows how challenging—and stimulating, not to mention rewarding—it can be to finish one.
So the takeaway we’re going to agree upon with this research is that more research needs to be done. And in the meantime, there is no good reason not to play games. And fun is its own reason.
Tuesday, November 5, 2013
No More Poor Brains
Poverty is a political issue, not a social issue. We have the wealth, we have the means of redistributing it, and we even have the will to do so; we just don’t have the justification. Our country is in the grip of an ideology that puts a primacy on so-called “hard work” and tautologically insists that if you don’t have anything, it must be because you didn’t work hard enough.
This is backwards logic, and literally harmful to those who have to suffer under the yoke of financial oppression. And new research shows that poverty is harmful to the brain. This is no longer a matter of haves versus have nots, or even those who deserve and those who don’t. This is a matter of ignoring the injuries suffered by those who were, against their will, born without privilege.
Define poverty however you like, but noise pollution, overcrowding, and meager access to nutrition can cause the impoverished to suffer reduced brain health. These are the same poor that some politicians believe do not deserve access to good health care. Nevertheless, it’s these same poor who are, through their labor, major contributors to our gross national product. It’s these same poor who, despite not having access to health care, pay for it.
The Great Brain Robbery is walking a thin line here, we realize, taking on a political issue rather than discussing brain science. But science is, at the end of the day, merely a tool that we use to better our lives. The essential word there is “our.” This research, which has found an extremely negative impact of poverty on the physiology of the brain, should be a wake-up call. This is not an issue that can be mitigated by shifting our social paradigms. We’re not going to be able to justify a lopsided distribution of wealth and call it culture. We are allowing the dehumanization of our poor citizens, and it has to stop.
Monday, November 4, 2013
Get the Fat Out of Your Brain
The people behind The Great Brain Robbery follow a lot of blogs and folks on Twitter. Brain-talk is a rich field, with lots of research, many different points of view, and thousands of theories, all coalescing in a miasma of opinion, explanation, and interpretation. Sometimes we see things we disagree with. Sometimes we feel like this casual blogging stuff sends the wrong message.
For example, a blog post we found today asserts: “The more a woman weighs, the worse her memory.” We’re not going to link to this blog, but we are going to take it apart. First, the study that found this correlation between BMI increases and memory-test decreases studied only women ages 65 to 79. That’s not “women” in general, so the above assertion is spurious at best.
Then there’s the use of BMI, a terrible tool that classifies people by two barely related metrics. Take a person’s weight in kilograms and divide it by the square of their height in meters. The taller you are, the more you weigh, and if you are not tall enough to weigh as much as you do, you’re fat. Except that taller people will weigh more than a simple two-metric function predicts, making them “fat,” and BMI, obviously, doesn’t account for what gives a person mass in the first place. Body builders are fat, and people suffering from anorexia are skinny: who’s healthier?
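Since we just described the formula, here it is in a few lines of Python; the two people below are invented for illustration, not drawn from any study.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# A muscular 100 kg bodybuilder at 1.85 m:
print(round(bmi(100, 1.85), 1))  # 29.2 -- "overweight" on the standard 25-30 band
# A 45 kg person at 1.70 m, wasting from illness:
print(round(bmi(45, 1.70), 1))   # 15.6 -- merely "underweight"
# The formula never asks where the mass comes from, which is exactly the complaint.
```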
Who’s dumber? Because when bloggers start throwing around sentences like “the worse her memory,” that’s a judgmental statement. And this blogger later says “Remember, everything the brain does affects memory (and everything affecting memory affects the brain).” Therefore, if you’re a woman who weighs too much, expect to have decreased brain function, according to this blogger.
The blogger doesn’t bother to show a causative relationship between weight gain and memory dysfunction, but does say that another study showed mental improvements in people who had bariatric stomach-bypass surgery. This same blogger has, in the past, talked about the very positive effects happiness has on memory. Who, in this overly critical society, wouldn’t be happier to be, finally, skinny?
Does gaining weight affect memory? Or does not exercising affect memory? The blogger says “Though exercise doesn’t do much to cause weight loss, it has many other benefits… that can directly benefit memory.” Did this study of older women control for the ones who, despite having a higher BMI, were exercising regularly?
How about this: let’s stop using inadequate tools to focus on vaguely correlative measurements and making painfully obvious conclusions.
As you age, good health is important for maintaining brain function. We’ve known that for centuries.
Friday, November 1, 2013
Delegating the Brain’s Delegation
You’re busy. You’ve got things to do. You don’t have time for a lot of disorganization. So when you walk into the reception area of the hotel where you’re presenting a paper on neuroplasticity, you expect to see signs pointing you to the various areas you need to go. Places like the executive center, so you can download a few slides and print them. And the coffee shop, so you can grab a quick cup to fuel your talk. And of course, the restroom.
But what if the sign was garbled? What would you do? You’d ask someone, but what if there’s no one behind the reception desk? You’d ask someone else. Another hotel patron, maybe, who, while not officially designated to give directions, might be able to point out the places you need.
A new paper in the Journal of Neuroscience suggests that the brain may be able to do the same kind of thing.
Your brain has different areas for recognizing faces and landscapes, and another part of your brain assigns attention to whichever area is required for the task at hand. If you want to remember a face, for example, this latter area tells the landscape area to relax while telling the face area to pay attention.
Scientists showed test subjects pictures of faces overlaid on pictures of landscapes. They had them concentrate on only one aspect of the picture to memorize it, all while the scientists used magnets to temporarily knock out the “delegator” portion of the brain. With the delegator knocked out, subjects were still able to remember faces or landscapes as required.
They looked at brain scans while they did this, and found that a different part of the brain was acting as a temporary delegator. Note that this was not a learned response, something that had developed over time. We already know about the brain’s remarkable ability to rewire itself and compensate for damaged parts. Instead, this was an immediate taking-over of delegation activity.
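If it helps to picture it, here’s a loose software analogy, entirely our own sketch and not a model from the paper: a dispatcher with a hot standby that takes over the moment the primary goes down.

```python
# Our own illustrative analogy for the finding -- not the paper's model.

def primary_delegator(task: str) -> str:
    # Simulate the magnets temporarily knocking this area out.
    raise RuntimeError("delegator disrupted by the magnets")

def standby_delegator(task: str) -> str:
    # Another area that can route attention to the right specialist.
    return {"face": "face area attends", "landscape": "landscape area attends"}[task]

def delegate(task: str) -> str:
    try:
        return primary_delegator(task)
    except RuntimeError:
        # Immediate takeover -- not rewiring learned over time.
        return standby_delegator(task)

print(delegate("face"))       # face area attends
print(delegate("landscape"))  # landscape area attends
```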
That’s one take-away, and it’s intriguing. The Neuroskeptic at Discover is a little more, well, skeptical. He points out that the study didn’t show this reassignment for certain, only that another part of the brain spoke up while the magnets failed to keep delegation from happening. It might be that the scientists were wrong about how much delegating the delegator actually does.
(In our hotel analogy above, that might be like the hotel patron drifting over to the sign you’re looking at, watching you read it.)
Nevertheless, it’s enough to warrant further studies, which will give us even more data in the quest to understand our decision-making processes.
You know, free will.