It's about eight in the morning, and I hear the front door of our office open. I hear the click-clack of the derailleur on a co-worker's bicycle, then the bean grinder in the office kitchen, and then the sound of the coffee machine itself warming up. And then a face appears in the doorway of my office. We've got about five minutes before the coffee is ready.
Idle small talk (how was your evening, good, how was yours, good, did you see the game, etc.) for a few seconds until a key word is hit upon. Today, it happens to be "cat." We begin to discuss how cats are, more or less, pointless, and yet so essential to the human experience. That thing about strapping a piece of buttered toast to the back of a cat comes up: the idea that, since cats always land on their feet, and buttered toast always lands butter-side down, you can break the laws of physics by creating a floating, spinning cat-toast machine.
From there we consider Schroedinger's box, the thought experiment in which a cat is placed in a box along with a poison that is triggered by a randomly decaying isotope. So long as the box stays closed and unobserved, that cat is both alive and dead. We, my co-worker and I, decide this is not a silly metaphor, and that if we trap our floating cat-bread machine in such a box, the cat will be both alive and dead, and right-side up and upside down, at the same time.
Then we start talking about USB plugs, and how there should be a fifty-fifty chance of getting one plugged in correctly on the first try, but it always takes three tries. We realize that if we were to attach such a cat to a USB plug, since it would at all times be both right-side up and upside down, it would balance out the three-tries plug phenomenon, and the order of the universe would be restored.
Because things are always in the last place you look for them, simply because once you find them, you stop looking. If you made a practice of still looking for something after you found it, the law of averages would start to swing toward your eventually finding things in the first place you look. You'd have to keep looking, but you wouldn't have so much anxiety.
By hooking up a Schroedinger's cat-bread-box to a USB plug, we'd essentially be tipping the law of averages back toward getting the plug right the first time; the cat would do the spinning we'd otherwise have to do to keep the average on our side.
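For what it's worth, the fifty-fifty claim is easy to check with a throwaway simulation (a sketch of our own, nothing official): if each attempt really were an independent coin flip, the long-run average would come out near two tries, not the proverbial three.

```python
import random

random.seed(42)  # make the run reproducible

def tries_to_plug(p_correct=0.5):
    """One plugging session: keep re-trying until the plug goes in."""
    tries = 1
    while random.random() > p_correct:  # wrong orientation, flip and retry
        tries += 1
    return tries

trials = 100_000
avg = sum(tries_to_plug() for _ in range(trials)) / trials
print(f"average tries with a fair fifty-fifty plug: {avg:.2f}")
```

So the three-tries experience isn't even consistent with fair odds; by the math, half of all plugs should go in on the very first try.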
Then the coffee machine went ding, and we were ready for work. We had an incredibly productive day. Socializing, creativity, caffeine: all are excellent ingredients for a super-productive brain.
Wednesday, December 18, 2013
Tuesday, December 10, 2013
The Once a Week Brain
We freely admit we may not be getting enough sleep. We certainly don't feel as creative as we used to. But then, back when this blog started a few months ago, we were very excited. The excitement has died down. And here's a bit of rationalization: we're glad it's died down. Now we can let dedication to the subject at hand be our prime motivator.
Rationalization, maybe, but it's our story and we're sticking to it. While The Great Brain Robbery was once a twice-a-day blog, and then once a day, and then four times per week or so, we're switching to once per week. We haven't decided what day that is yet (maybe it should be Tuesdays), but we're hopeful it will result in better-written posts. Or at least more thorough ones. More discerning?
In the meantime, here’ what’s we’ve been reading about lately:
- Wake Up: “6 Ways a Poor Night's Sleep Messes with You”
- Meditate: “Why You Should Meditate: Become a Multitasking Master”
- Eat Breakfast: “Eat More of These Four Things For A Stronger, Healthier Brain”
- Get Some Exercise: “35 Year Study Finds Exercise Reduces Risk of Dementia”
- Don’t Think Too Hard: “Mental fatigue impairs physical performance in humans”
We know you’ll forgive for this small change, since forgiveness is good for your brain anyway. Thanks for reading and keep looking forward to more!
Friday, December 6, 2013
Alcohol and the Brain
Alcohol is good for your heart, they say. It’s also good for your brain. Sort of.
All things in moderation. A study performed by a team of Swedish researchers found that small amounts of alcohol fed to mice led to the growth of extra brain cells. They pointed out, however, that these cells may have taught the mice to be dependent on alcohol. After they were given alcohol, the mice preferred it to water. The researchers also suggested that while neurons that grew were normal, alcohol may simply have facilitated their growth through a tranquilizing effect that reduced stress.
A calm mind grows neurons more healthfully than a stressed mind, for the most part, although research has shown that certain high-stress experiences can enhance particular memories. And while alcohol's effect on growing neurons is still being studied, ample evidence shows that excessive alcohol intake can damage existing neurons.
As social creatures, humans have integrated alcohol consumption into a variety of activities. Getting black-out drunk is certainly bad, but even the placebo effect of feeling relaxed with a pint glass in your hand can be beneficial to the brain. This is not to say you should insist everyone must imbibe for their own good. Rather, by drinking in moderation, damage, if any, is minimized, and it's possible that actual physical growth can occur. In the meantime, pleasurable interaction with family and friends is good for both your brain and your heart.
And therefore it is important that drinking, even in moderation, be done in safe environments, with considerable awareness of the effects it can have. If alcohol can be good for the brain, the damage it does in other ways can be mitigated by:
- Making sure you’re well fed before having a drink
- Making sure you stay well hydrated before, during, and after any drinking
Special cases aside, there's nothing wrong with having a drink now and again. Consider the analogy of driving a car: there's always the potential of getting into a crash, but driving the speed limit and wearing your seat belt will decrease the risk to a point low enough to be tolerable, allowing for all the conveniences that driving affords. Go careening around corners at high speed and a wreck is inevitable, and doesn't even afford any real convenience.
Thursday, December 5, 2013
Your Brain Wants You to Eat Dirt
What’s fun about the title to this blog post is the very idea that your brain “wants” you to do anything. As if your brain had a separate will from “your” will. It smack of Karl Pilkingtonism, from which we get the utterly wonderful quote “is my brain in charge of me or am I in charge of my brain?”
Actually, if you Google the phrase "Who's in charge, me or my brain?" you'll get millions of hits, and links to familiar news and science sites. So it's not a stupid question at all. But the very idea of your brain "wanting" something belies a belief that your brain "knows" what's good for it, even if "you" don't.
Ever have a craving for something odd? Could that be your body "knowing" that a key nutrient you lack is in the food you're hungry for? Or is it that your body remembers not lacking in some capacity the last time you ate that food? Is this what we mean when we say your brain wants you to eat dirt?
No, it’s not, this time. We’re not talking about pregnancy and geophagy, where women will sometimes crave eating dirt, either because their bodies are mineral deficient or they want to boost toxin immunity (scientists aren’t exactly sure). We’re not talking about pica, the psychology (and probably pathological) desire to eat non-food items. We’re just talking about mycobacterium vaccae.
It seems that ingesting this bacterium has been shown to have a positive effect on learning and memory. Mice that were given the bacterium showed increased levels of serotonin and were able to navigate mazes faster and with less anxiety. Further tests showed this to be a temporary effect.
However, before you go outside and start adding real ground to your coffee grounds, be aware that the tests cited were not performed on humans, and that there's no guarantee the dirt in your front yard contains the bacterium at all. You'd be better off simply going for a vigorous hike in a natural area and breathing deep the loamy aromas of the wilderness.
And what’s great about using yourself in such an experiment is that even if you don’t inhale any mycobacterium vaccae, or not enough to do anything, you’ll still reap benefits from the hike itself.
While it’s “fun” to say your brains wants you to eat dirt, we can promise your brain really does want you to exercise and immerse yourself in tranquility. So go for the hike and let us know how it makes you fell.
Wednesday, December 4, 2013
When in Doubt, Blame the Brain
On Sunday, December 1st, a commuter train in New York derailed, leaving 67 injured and 4 dead. The train was going 82 mph when it came upon a curve rated for only 30 mph, a curve trains have taken thousands of times without incident.
What happened this time? Why didn’t the train slow down? The engineer who was driving the train claims the brakes failed. If this is the case, the tragedy has to be chalked up to simple dumb luck. All machines fail eventually, which is why we have safety checks—but even those checks can’t catch everything every time. Misunderstood circumstances can lead to machine failure, and the only thing to do is study those circumstances after the fact and add findings to the checklist.
But there’s evidence that the brakes did not fail, that it was the engineer himself who was responsible. He claims he did not fall asleep, that he had a full night’s rest. No drugs or alcohol were detected, either. And safety devices on the train, such as a “dead man’s switch” would have engaged if the engineer had fallen asleep and his hands had fallen from the controls.
So, if it wasn’t the brakes, or sleep, what would keep a man at the controls but not aware of the coming curve? His brain, if course. Enter Automaticity.
You can walk without thinking about it, ride a bicycle, tie your shoes, or perform a hundred other seemingly complicated tasks. Ever drive a car from home to work, lost in thought, and arrive with no memory of the journey whatsoever?
Unfortunately, the engineer, having become so efficient at operating the train, was able to do so without any vigilance at all. He simply zoned out, and his brain did not register the landmarks indicating the curve was approaching. He may have been lost in thought, or in a simple meditative state, both of which would allow him to be "awake" without being aware.
Can anything be done about this? Of course. Add to that checklist measures to keep the engineer engaged in his activities. Driving several hundred people in several hundred tons of steel should never be done automatically. Invest every trip with novelty and purpose, and automaticity can be thwarted.
Tuesday, December 3, 2013
Pink Brains vs. Blue Brains
There’s nature and then there’s nurture. Nature builds you from the ground up and leaves you there. Nurture shapes you. Nature uses blueprints that were evolved over millennia, nurture further molds you to fit your environment. If your environment shapes you in such a way that your blueprints are especially fit for the environment, those blueprints further evolve. But nature can’t make your blueprints so adaptable that nurture can make you into any old shape, nor can nurture be so aggressive that it breaks the original blueprints into something unrecognizable. More or less, no matter what nurture has done to you, you can always look back at where nature started you off.
So, at any given moment, it would be all too easy to confound nature and nurture unless we’re looking at your very beginning. A series of articles popping up all over the world today are claiming that men and women have different brains. But they’re not studying men and women at their very beginning—so is what’s being seen nature or nurture?
On the one hand, it's a moot question. Who cares? If men and women are different, then they're different, and there's no need to ask why. But the problem is, the strictest definition of one's sex is determined by nature, at the moment of conception. At that point it's just a label, devoid of connotations, and useful only to predict how the developing zygote and fetus will shape itself.
But "being a man" and "being a woman" for anyone over the age of a few seconds is full of connotations, and those are connotations everyone holds in their heads. As social creatures, we tend to reinforce those connotations in developing people, i.e., children. And so we, the nurturers, shape and mold those who have been given certain natural characteristics until they fit their environments.
Or, not to put too fine a point on it, we reinforce existing stereotypes by using those stereotypes to identify people with particular characteristics. We dress little girls in pink. We might be able to prove that girls are born with a natural preference for pink, but the point is, when people see pink, they treat the person "like a girl."
Surrounding all of these articles about the differences between male and female brains are even more articles about brain plasticity, that is, the way the brain literally changes shape over time. The brain is plastic because of nature; the brain’s plasticity allows it to change by way of nurture.
Therefore, all of these articles pointing out that "men and women have different brains" are essentially saying "men and women are different." That's about as determinate a statement as "the word 'man' and the word 'woman' are different words."
Thursday, November 28, 2013
Another Tryptophan and the Brain Post
Happy Thanksgiving. Let’s talk turkey. Let’s talk tryptophan.
You know all this already. You know that for a long time people believed it was the tryptophan in the turkey that made everyone sleepy on Thanksgiving. And then folks pointed out that no, it doesn't work like that; it's eating all those carbs, being in a warm house, around loved ones, relaxed, after days or even weeks of the usual sleep-deprived lifestyles we lead.
None of that says what, exactly, tryptophan is, or what it does in the brain. Basically, tryptophan is an amino acid, one of the building blocks of protein. Your body does not make tryptophan itself, and therefore has to get it from food. You use tryptophan to make niacin (vitamin B3), which, among other things, is good for your skin.
Tryptophan is also used to make serotonin, and most of the serotonin in your body is in your intestines, where it regulates movement. The remainder, in your brain, regulates mood, appetite, and sleep. Thus the source of the turkey-equals-drowsiness myth: serotonin makes you hungry, eating more makes you happy, happiness relaxes you, eating more makes you drowsy, and together the happy/drowsy state puts you to sleep in front of the football game.
From tryptophan to serotonin, and then from serotonin to melatonin. This is the hormone that regulates the circadian rhythms of various parts of your biology. In your brain, this is the cycle that puts you to sleep at the end of the day and wakes you up in the morning. It's also light dependent, meaning your pineal gland doesn't make much melatonin when it's light out.
In order to show that turkey actually makes you sleepy, then, it would have to do so in the absence of other things that make you sleepy, and in the presence of other things that keep you awake. We sit down at the table and eat a ton of food, sending blood to the stomach, which makes us sleepy. We drink wine, which makes us sleepy. We're sleep deprived already. We're in a safe, warm, comfortable environment. We're a little bit bored by the football game between two teams we don't much care about. It's cloudy outside, the fall weather, a dark gray sky.
Reverse all that. Eat just turkey, and water, after a good month of eight hours of sleep per night. Keep the thermostat at 65, bicker with your in-laws about why your college team is better than the ones their children went to, and place a wager on the game: loser does the dishes. Make sure all the lights are on. You will be wide awake, guaranteed.
Nevertheless, your takeaway from all of this should be satisfaction in the knowing. When one of your in-laws tucks in and hauls out the "turkey makes ya sleep" trope: let them. No need to bicker, no need to argue or educate. Keep your cortisol levels low. Enjoy your non-tryptophan-induced nap. You've probably earned it.
Wednesday, November 27, 2013
If Your Brain Had Legs It Would Dance
Can you just imagine it? Your brain on the dance floor, doing the cabbage patch or the running man. Working up a sweat! Getting down with its bad self. Cutting a rug, as it were. Oh, if only. Exercise is supposed to be good for the brain. If only it had legs of its own, to get out there and shake its groove thang.
But wait, it does. You are your brain, and you have legs. When you dance, your brain dances. When you exercise, your brain reaps the benefits. Aerobic exercise makes your heart stronger and your muscles stronger, reduces fat, and at the same time improves cognition, strengthens memory, and boosts mood.
So if you can imagine it, you can do it. If your brain had legs, and it does, it would dance, so it does. So to speak. After all, you have free will, so you can choose to not dance. And if you're so inclined, you might choose to not exercise at all.
You can still improve cognition, strengthen memory, and boost mood. You can eat the right nutrients, engage in the right kinds of mental stimulation, seek out and enjoy social interaction. But what happens when you need to run faster, run further, run longer?
We live in a world where we have the luxury of rarely, if ever, needing to run fast, far, or for long. Your brain can survive without legs. It doesn't have to dance. But it does have to stay young. If it doesn't, as it grows "older," it will eventually not want to do anything.
Thankfully, the flip side is that if you do dance, or run, or swim, or do any kind of aerobic exercise, no matter how old you are, your brain is going to get better. Even if you don't want it to. But who wouldn't want it to?
The irony is that, since you can choose to dance, and are not forced to, you can enjoy it for its own sake. You have to eat, have to use your brain on a daily basis, and unless you're a recluse, you have to deal with people. So your brain is going to maintain itself in those activities.
But if you dance, it's just going to get better.
Tuesday, November 26, 2013
This is Why Addiction is a Brain Problem, Not a Person Problem
This will largely be a semantic argument. If you do not like semantic arguments, you can translate this into a philosophical discussion. If you think philosophy distracts from science, then you can examine how your own experiences reflect on this discussion. If you think anecdotal phenomena are irrelevant, then you can go crawl into a Skinner box and enjoy your life of pre-programmed cause and effect. We guarantee you will enjoy it, as that is how your brain, for you, literally works.
Moving on, a recent post on the Psychology Today blogs discusses the mess that is the DSM V's treatment of "addiction," and specifically, the semantic nature of that treatment. What is addiction? Is it nothing more than something one can't help but do? Or do we need to add in understandings of consequence, perceived reward, frequency of activity, personal and public usefulness, and so on?
For the most part, the word "addiction" has a negative connotation. It can be used ironically to bolster something that is "so good," such as an "addiction to chocolate." The idea is that chocolate is so good, one would rather have it than avoid the negative consequences of eating too much of it. This is the parallel drawn with, for example, drug addiction: heroin feels so good, one would rather inject it than be healthy, keep a job, maintain personal relationships, and so on.
But there’s, of course, a huge difference between eating too much chocolate and taking drugs. Ostensibly, a person who eats a lot of chocolate, when told he had become a diabetic, could wean himself off of candy bars in the hopes of prolonging his life. The drug addict can’t do this, according to the definition of the word. Therefore, the word addiction requires an understanding of a lack of free will.
Take the word addiction out of the discussion, and what do we have left? We have a person who is being forced to do things that he doesn't want to do. Or, to be precise, does not want to want to do. You could say the person is a slave. So how do we emancipate people from being forced to take drugs?
The simple answer is to free them. Give them back their free will. It's society that responds to heavy drug use by removing junkies from their jobs and families and putting them in jail. The irony is, if the drug user does not have to choose between drugs and society's approval, then taking drugs is not a choice.
However, heavy drug use is physically damaging, and that's not something society can simply deny. But if addiction is viewed as a medical problem, not a social problem, then access to medical care can treat that physical damage, and in the process help the addicted choose to be free of drugs.
This blog post is not advocating the legalization of drugs. It is advocating for the abandonment of judgmental attitudes.
Let these coincidences reflect on this discussion. If you think anecdotal phenomena are irrelevant, then you can go crawl into a Skinner box and enjoy your life of pre-programmed cause and effect. We guarantee you will enjoy it, as that is literally how your brain works.
Moving on, a recent post on the Psychology Today blogs discusses the mess that is the DSM-5’s treatment of “addiction,” and specifically, the semantic nature of that treatment. What is addiction? Is it nothing more than something one can’t help but do? Or do we need to add in understandings of consequence, perceived reward, frequency of activity, personal and public usefulness, and so on?
For the most part, the word “addiction” has a negative connotation. It can be used ironically to bolster something that is “so good,” such as an “addiction to chocolate.” The idea is that chocolate is so good, one would rather eat it than avoid the negative consequences of eating too much. This is the parallel drawn with, for example, drug addiction: heroin feels so good, one would rather inject it than be healthy, keep a job, maintain personal relationships, and so on.
But there’s, of course, a huge difference between eating too much chocolate and taking drugs. Ostensibly, a person who eats a lot of chocolate, when told he had become a diabetic, could wean himself off of candy bars in the hopes of prolonging his life. The drug addict can’t do this, according to the definition of the word. Therefore, the word addiction requires an understanding of a lack of free will.
Take the word addiction out of the discussion, and what do we have left? We have a person who is being forced to do things that he doesn’t want to do. Or, to be precise, does not want to want to do. You could say the person is a slave. How do we emancipate people from being forced to take drugs?
The simple answer is to free them. Give them back their free will. It’s society that responds to heavy drug use by removing junkies from their jobs and families and putting them in jail. The irony is, if the drug user does not have to choose between drugs and society’s approval, then taking drugs is not a choice.
However, heavy drug use is physically damaging, and that’s not something society can simply choose to deny the truth of. But, if addiction is viewed as a medical problem, not a social problem, then access to medical care can treat said physical damage. And in the process, help the addicted choose to be free of drugs.
This blog post is not advocating the legalization of drugs. It is advocating for the abandonment of judgmental attitudes.
Friday, November 22, 2013
The Brain’s Five Kinds of Boredom
One of our writers writes for another blog called “Zombie For Life.” (We think that’s supposed to be an ironic title.) Today he wrote an entry called “5 Reasons People Who Listen to Surf Guitar Will Survive the Zombie Apocalypse.” And since today’s post is about the five kinds of boredom, we thought it would be fun to somehow link the two.
Research performed by a handful of scientists and reported on by Science Daily suggests that boredom can be classified according to levels of arousal and valence. They describe:
- indifferent boredom (relaxed, withdrawn, indifferent)
- calibrating boredom (uncertain, receptive to change/distraction)
- searching boredom (restless, active pursuit of change/distraction)
- reactant boredom (high reactant, motivated to leave a situation for specific alternatives)
- apathetic boredom, an especially unpleasant form that resembles learned helplessness or depression. It is associated with low arousal levels and high levels of aversion.
(The above is directly quoted from Science Daily but reformatted for this blog post.)
In an effort to provide some entertainment today, which is to say, alleviate some boredom, we’d like to suggest that each boredom type, above, can be a means by which to survive a zombie apocalypse:
- Indifferent: this person will maintain a cool head when everyone else is losing control.
- Calibrating: this person will not take for granted the doomed-to-fail survival instructions provided by civic leaders, and will adapt to ever-changing circumstances.
- Searching: this person will not stay idle as things fall apart, becoming a sitting duck for that random zombie who wanders by.
- Reactant: this person will respond immediately to a zombie threat, and will not become complacent or accept a seemingly unavoidable doom.
- Apathetic: this person will be like the undead themselves, will blend in, and in doing so will avoid being eaten as the zombies chase after livelier prey.
Yes, we’ve taken a few liberties here, and made some assumptions. But then again, zombies are fictional, so taking liberties and making assumptions is de rigueur for this kind of discussion.
On a serious note, the fifth boredom type, “apathetic,” is one that has only recently been recognized, and its inclusion in the above list reveals some troubling statistics: “Because of the assumed link between boredom and depression, the research group found it alarming that apathetic boredom was reported relatively frequently by 36 percent of the high school students sampled.”
Add to that their determination: “people do not just randomly experience the different boredom types over time… they tend to experience one type” and the 36 percent number is indeed alarming.
We’re hopeful that such studies can help catch signs of depression early so that people can seek proper treatment.
We’re also hopeful that for everyone else, our blog post today alleviates some of their boredom.
Thursday, November 21, 2013
Playing with Brains
Today one of us at TGBR played a game that required memorizing strings of numbers. The game started off with just two digits, but went all the way up to 10 digits. Sometimes the number had to be memorized “backwards,” which made it more challenging.
What was interesting was how even the slightest bit of familiarity aided the process. ‘206’ popped up as part of a longer string, and was easily memorized as a single entity because it’s the area code here in Seattle. Any time 7x7 occurred (737, 747, 727, etc.), that, too, was held in the mind as a single chunk. And strings of numbers that started with ‘19’ were easily coded as dates. So when 2061984737 popped up, instead of feeling like a 10-digit number, it felt like three items.
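As a toy illustration of that chunking (our own sketch, not anything from the game or a study; the pattern list is invented for the example), a few lines of Python can show how ten digits collapse into three familiar units:

```python
# Toy illustration of chunking: compress a digit string into familiar units.
# The patterns below (years, 7x7 aircraft numbers, the 206 area code) are
# this example's own assumptions, chosen to match the digits in the post.
import re

def chunk(digits: str) -> list[str]:
    patterns = [
        r"19\d\d",    # four-digit years, e.g. "1984"
        r"7\d7",      # 7x7 numbers, e.g. "737", "747"
        r"206",       # the Seattle area code
    ]
    combined = re.compile("|".join(f"({p})" for p in patterns))
    chunks, i = [], 0
    while i < len(digits):
        m = combined.match(digits, i)
        if m:                          # a familiar pattern becomes one item
            chunks.append(m.group(0))
            i = m.end()
        else:                          # an unfamiliar digit stays a lone item
            chunks.append(digits[i])
            i += 1
    return chunks

print(chunk("2061984737"))  # ['206', '1984', '737'] – ten digits, three items
```

A greedy left-to-right match is obviously a crude model of what a brain does, but it captures the point: the more patterns you recognize, the fewer items you have to hold.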
Our brains, it seems, are desperate to attach meaning to things. We prefer character and reason to stark, abstract existence. This is how we, here at TGBR, choose to interpret new findings that show people with so-called “perfect” recall can be tricked into having false memories.
An article at Discover describes how scientists were able to achieve this, and for the sake of time, we won’t rehash the explanation here. Suffice it to say that no one’s memory is invulnerable to manipulation.
We’re insisting it all goes back to a terrible need to consider the question “why?” Why did that happen? What does it mean? How does that fit into the rest of the universe?
And if the universe is in our own heads, then our memories are their own worst enemies.
Wednesday, November 20, 2013
Giving the Brain the Gift of Giving
Today we celebrate a birthday here at the Great Brain Robbery, as one of our staff is turning 42. Or, as he likes to call it, he’s now “21s.” Last year he was “thirty-eleven.” We think he might have an issue with aging.
And apropos of that, we thought we’d look at aging and the brain, but that seems like a bit of a downer. Not that there’s anything wrong with aging per se, and if we looked hard enough, we could probably find an article that shows how older brains are better at some things. But instead of that, or all the research on Alzheimer’s and dementia and the like, perhaps that other part of birthdays, gifts, is worthy of examination.
An article in The Washington Post in 2007 discusses a 2006 study which found that altruism lit up parts of the brain associated with experiencing pleasure. Basically, the theory goes that altruism is an evolved tendency, one that exists as a biological component of the brain’s make-up. The specific example cited was that volunteers were asked to imagine giving money away. This was described as “volunteers placed the interests of others above their own.”
Let’s split hairs. Imagining something is one thing; doing it is quite another. Fantasizing about feeling good might be why the volunteers felt good. But, we’re only splitting these hairs to address something else the article brought up again and again: the biology of morality.
The article seemed to lump altruism in with morality, as if giving away money is somehow a moral thing to do. Examples were made of people with certain kinds of brain damage who are better able to make “cold” decisions about selfish (but logical) acts of self-preservation. Other examples included sociopaths who don’t have any sense of morality at all.
The problem we have with this is that it begs the question: is it moral to “do good,” or is it merely moral to “not do bad”? If you were to decide not to hit our birthday boy with your car, even though he was jaywalking, is that moral? Would it make your brain feel good?
Or do you need to send him a huge gift (he’s got an eye on this jacket, for example) before your brain starts smiling?
The article does point out that theologians and philosophers are looking at this research with interest; there’s the question of whether aligning morality with brain chemistry takes away the role of free will in making responsible decisions. We’d like to suggest that what the research actually does is split the hair that needs to be split in order to define what morality actually is.
If altruism merely feels good (our birthday boy’s a size extra-large and likes the jacket in black, by the way), then perhaps morality can be defined by how we feel about it, not what its larger effects are. This reduces the moral question to a more ego-centric approach, but our birthday boy has pointed out that, at least one day a year, being ego-centric is okay.
Friday, November 15, 2013
Trust Your Brain and Your Brain Will Trust You
The Little Engine thought he could, and he made it up the mountain. By force of sheer will, he overcame the challenges before him, and succeeded where before he had failed.
TLE’s now retired, sitting on a sofa, watching TV. Next to him is a bowl of potato chips. He knows he shouldn’t eat them. A few are okay, but not the whole bowl. But there goes his hand, dipping and lifting the crispy treat to his mouth. “I can stop,” he says to himself. “I think I can!” And then he shoves the fistful in, and sighs as he flips channels on the TV.
We’ve all got bad habits. And we’ve all got routines and motions we go through, without even thinking about it. How do we break out of these ruts, and become the people we know we want to be? The key is confidence. It really is all about believing in and trusting yourself.
Those new-agers were right, but science can back up some of the claims by pointing to key areas in the brain. Consider self-control, which occurs in the frontal part of your brain. This is an area with high energy demands, and you can literally run out of the fuel necessary to control yourself (scientists call this “ego depletion”). Now consider the deeper, emotional parts of your brain, the ones that want what they want NOW. You fight temptation all day to the point that you’re literally tired of fighting, and since the reward for staying motivated is less insistent than the reward for giving in now, you give in.
You run out of that very will that kept TLE chugging up the mountain.
Trusting yourself can replace that emotional want with a different emotion altogether. This is reminiscent of the way the brain chooses between the heuristics you use for routines and a more conscious executive decision-making process. Your brain will choose the method in which it has the most confidence—it’s easier to drive a car to work, letting your brain handle the lefts and rights while you mentally prepare the big speech for your boss. On the other hand, you wouldn’t expect to drive automatically through a city you’ve never been in before, or you’d end up hopelessly lost.
So, if you’re confident in your decision, then making that decision will be its own reward. Instead of your frontal brain fighting against the pull of your emotions, you harness those emotions to actually push yourself up that hill.
The Little Engine that could should picture himself as a happier engine, sitting in a place where there are no chips. By choosing to concentrate on a positive outcome to his decisions, he can trust himself to execute those proper decisions. Then he can pick up the bowl, take it into the kitchen, and go back to his TV, with the bowl now out of sight.
References:
http://www.psychologytoday.com/blog/addicted-brains/201311/the-key-quitting-self-trust-part-1
http://blogs.scientificamerican.com/mind-guest-blog/2013/11/12/should-habits-or-goals-direct-your-life-it-depends-2/
http://www.psychologicalscience.org/index.php/publications/observer/2013/april-13/why-wait-the-science-behind-procrastination.html
Thursday, November 14, 2013
This is Your Drugs on Brain
The writers at The Great Brain Robbery contribute to other blogs, including one that has, shall we say, a more robust editing process. The topic this week for that blog is nutrition and the brain: what foods are good for the mind, and how does one enjoy those benefits? You know, things like antioxidants from chocolate, B vitamins from fish, folic acid from legumes, vitamin E from nuts and seeds. All of these nutrients make for a healthier brain.
One of the editors for that blog suggested perhaps saying a few things about nootropics, aka “smart drugs.” We’ve heard of those, of course, but we don’t know much about them. Are they for real, do they work, are they safe, is every blog that discusses them required to mention that Bradley Cooper movie?
So we grabbed our grains of salt and went looking. Turns out there’s a healthy, vibrant nootropic culture out there, ready to bring you in if you want to start popping a few pills to make yourself smarter. We are happy to report that our initial searches took us to informative websites (as opposed to websites merely hawking tablets for profit).
The purpose of this blog post is not to be an exhaustive overview of nootropics, nor a basic nootropics 101. Rather, we’d like to point out a few things about what we’ve read so far.
1. Buying prescription drugs from other countries is illegal. We’re not trying to police you, but we do want to point out that the reason why drugs are prescribed is so that dosages can be tailored to the individual, and so that a system of liability can be in place to mitigate unfortunate consequences. Simply put, buying controlled substances online is a gamble, and not worth taking.
2. Over the counter supplements are unregulated. This means that a pill in one bottle may not be the same as a pill in another bottle. But this is the case with anything that does not qualify as a “drug,” and so if you trust a certain brand, stick with that brand. Bargain shopping when it comes to supplements can be very tricky.
3. Nootropics are not cure-alls, nor are they meant to be. If you suffer from poor nutrition, bad sleeping habits, undue stress, or significant psychological disorders, popping some piracetam is not going to help you. That nootropics are “supplements” should be your guide—they work best if everything else is working well. They should supplement, not replace, good habits.
The gist of all this is: when in doubt, don’t do it. Our research suggests that the people who can take nootropics with the least risk are those who probably don’t have an overwhelming need for them.
Nevertheless, we are not suggesting that you just say no to nootropics. Rather, we encourage you to do what we did: research. Take your grain of salt, look for counter-arguments to every claim you find, and decide for yourself. And in all things, moderation.
Wednesday, November 13, 2013
When Two Brains Lie
Over at the SciLogs blog, Michael Blume contends: HAD + TOM = Social & Religious Cognitions. Hyper Agency Detection (HAD) is what we call it when we attribute some sort of agency (or “will,” if you like) behind a phenomenon. The trees shift in the wind, and we say God is blowing on them. Theory of Mind (TOM) is what we call it when we start to guess why agents do the things they do. God blows on the trees because He wants to remind us He’s there.
That’s a very glib way of putting it, but it should be enough for you to get the point. Blume points out what Darwin pointed out: that assuming there’s some sort of agent behind that noise and then doing something about it is better than dismissing that growling tiger as “just the wind” and getting eaten. In other words, our brains were naturally selected to believe in religious things.
The Great Brain Robbery does not whole-heartedly agree. We don’t think that suggesting our brains are wired for religion dismisses worship as an artifact of survival, nor do we think that worship is promulgated by relics of survival instinct.
We contend that while TOM and HAD can create the fertile ground from which religions are grown, one more essential element is required: the ability to lie.
A study in 2002 found that increasing dopamine in skeptics made them more inclined to see faces in jumbled images. That study led to another, which essentially found a gene for believing in God. Again, a glib way to put it, but the point is this: seeing things that aren’t there is just a mild form of schizophrenia. But what if a second person is the source for HAD and TOM? What if this person tells you that the agent is called God and that He wants you to do certain things?
This is how religions are created. Not by seeing things and believing them, but by having those beliefs codified into rules of conduct. Passing false pattern recognition on to others for the purpose of control. And once it takes root, it is very easy to perpetuate: all agency is attributed to God.
But what’s not easy to perpetuate is attributing TOM. Lies require more energy from the brain than telling the truth. This effort requires a reward of some kind to compensate for the extra energy spent. But what is that reward? Is it simply survival? Being in power makes it easier to be less empathetic—are religious leaders merely making it easier for themselves to sacrifice others, thus ensuring their own longevity?
Tuesday, November 12, 2013
Caffeine and Sleep and the Brain, a Terrible Lover’s Triangle
You’re tossing and turning. You just can’t get to sleep, but you’re exhausted. To borrow a phrase from the kids today: double-you tee eff?
Research reported on by Scientific American and My Healthy Daily News suggests that if you’re a night person, you can drink all the coffee you want. Well, not all the coffee you want. But if you’re inclined to have a double-tall in the middle of the afternoon, go ahead.
But if you’re a morning person, best to avoid caffeine later in the day. It all has to do with your so-called chronotype (the fancy word for whether you’re a “day” person or a “night” person). Which begs the question: why do night people need caffeine later in the day at all?
As a nation, we’re all more or less sleep-deprived. While science says about eight hours is best, the average adult gets only six and a half on weekdays. Caffeine helps us get through until the weekend, when we try to pay back some of this sleep debt.
But it’s no good if, even during your inadequate six and a half, you’re not getting good sleep.
According to researchers, most people are neither “larks” nor “owls,” and as such should consider careful use of stimulants: a cup of coffee in the morning as part of a meal, not as a quick sleepiness eradicator. Remember: if you’re not awake 30 minutes after getting out of bed, you’re probably sleep deprived, and caffeine addresses the symptom, not the cause, of your ailment.
When in doubt, just eschew caffeine altogether and try to go to bed an hour early. Boring, yes, but you may find that, with enough sleep, you’re a morning person after all.
Research reported on by Scientific American and My Healthy Daily News suggests that if you’re a night person, you can drink all the coffee you want. Well, not all the coffee you want. But if you’re inclined to have a double-tall in the middle of the afternoon, go ahead.
But if you’re a morning person, best to avoid caffeine later in the day. It all has to do with your so-called chronotype (the fancy word for whether you’re a “day” person or a “night” person). Which begs the question: why do night people need caffeine later in the day at all?
As a nation, we’re all more or less sleep-deprived. While science says about eight hours is best, the average adult only gets six and half on the weekdays. Caffeine helps us get through until the weekend, when we try to pay back some of this sleep debt.
But it’s no good if, even during your inadequate six and half, you’re not getting good sleep.
According to researchers, most people are neither “larks” nor “owls” and as such should consider careful use of stimulants: a cup of coffee in the morning as part of a meal, not as a quick sleepiness eradicator. Remember: if you’re not awake 30 minutes after getting out of bed, you’re probably sleep-deprived, and caffeine addresses the symptom, not the cause, of your ailment.
When in doubt, just eschew caffeine altogether and try to go to bed an hour early. Boring, yes, but you may find that, with enough sleep, you’re a morning person after all.
Monday, November 11, 2013
This is Your Brain in Love
Remember those commercials from the ’90s, the ones where they’d show you an egg and say “this is your brain,” and then they’d crack it open, start it sizzling in a frying pan, and say “this is your brain on drugs. Any questions?” And the question would of course be “can I get a side of home fries with that?”
The idea was that drugs fry your brain, and so the metaphor was overextended. So, in that vein, let’s look at more ways that your brain can get fried. And since Valentine’s Day is a good three months away, let’s talk about love.
First up, this study, as reported by United Academics, which points out that “Romantic love and partnership is influenced by various substances, such as oxytocin, vasopressin, dopamine, serotonin, cortisol and testosterone.” If you’ve been reading The Great Brain Robbery, you may recognize some of those hormones. Like cortisol, the one that, amongst other things, makes you fat and rips your brain to shreds. Ain’t love great?
And there’s this article at BrainFacts.org, discussing the same and similar research, which says “In male prairie voles, the hormone arginine vasopressin, which is involved in aggression and territorial behavior, also appears to play an important role in pair-bonding.” Did you get that, guys? Love can make you aggressive. Or being aggressive can make you fall in love.
And finally, Discover Magazine’s Seriously Science column reports on a study that shows that, for straight men, just thinking about talking to women can make their brains take a dive.
Add it all up: love makes you fat, aggressive, and stupid.
Well, of course it does.
Of course it doesn’t. Love is too complex to be reduced to a few hormones, and the brain is too complex to be similarly reduced. We encourage you to actually read the above articles and abstracts, as the question of what love does to the brain (or, not to put too fine a point on it, for the brain) is still as mysterious as it is fascinating.
But never mind that. Never mind being rational and cautious about approaching the science of the brain. Never mind reading these scientific articles in the spirit in which they’re offered: with a healthy dose of discernment, so as not to take the findings and draw unsubstantiated conclusions. We’ve just read the Neuroskeptic’s guide on how to wave away doubt. And we’re approaching jazz-hands level here.
After all, if love makes us stupid, then being stupid means we deserve to be loved, right?
Friday, November 8, 2013
The Irony of Brain Research
Taken as a whole, the very idea of brain research is pretty hilarious.
And this is true no matter what approach you take to studying the mind. Strictly biological, psychological, philosophical, spiritual: each discipline, when looking at the brain itself, is more or less clueless.
Which is not to say remarkable research hasn’t been done. Neuroscience is a very hot subject right now, precisely because of all the fascinating discoveries being made. Psychology has fascinated us for centuries, and since Freud it’s been a rich topic for discussion. And the only thing older than philosophy is religion.
But even Plato figured out that all he could be certain of was that he didn’t know anything. A few thousand years later, Descartes updated that to knowing he exists... and that's about it. There’s an entire philosophy based on separating the brain itself from the mind. As for anatomy, some of those ancient Greeks did think the brain was the center of thought, until Aristotle came along and switched that job to the heart.
The point is, all of this noodling is being done with the very noodle being noodled. This is ironic, and therefore hilarious. But wait, there’s more: there’s a word for the scientific study of laughter: gelotology. Go ahead, laugh at that. Laugh at Jello-tology.
And of course, what little we know about laughter and the brain is… very little. We know that laughter is a distributed emotional response, moving through several parts of the brain. We know it has social uses, as well as therapeutic uses. But next to sleep, laughter may be one of the biggest brain mysteries yet.
The good news is: you don’t have to be a mechanic to drive a car. You don’t have to know how the brain works to have a brilliant mind, and you don’t have to know why a joke is funny to know that it’s funny.
Indeed, sometimes nothing kills a joke like explaining it.
Thursday, November 7, 2013
Don’t Gamble on Brain Health
In one of Farmers’ “Fifteen Seconds of Smart” ads, Professor Nathaniel Burke says “So you want to drive more safely…” And then he says, “Stop eating. Take deep breaths. Avoid bad weather. Get eight hours. Turn it down. And of course, talk to Farmers.”
Although we have no proof of this whatsoever, someone at the ad agency where they make these commercials could have handed the copywriter a list of keys to maintaining a healthy brain and had the writer distill it down to something that could be used to sell insurance.
The essential elements to maintaining a healthy brain are: Eat smart, exercise, pick your battles, sleep, manage stress, socialize.
There’s a surfeit of research that backs up each of these assertions, and they just make sense. And we’re not even talking about good psychological or emotional health. We’re talking about the very physiology of the brain. No matter what point of view you have on human consciousness, no matter what philosophy you embrace, or religion or lack thereof you adhere to, there’s no arguing against keeping your physical brain as healthy as possible.
We’ve discussed Maslow’s Hierarchy of Needs before, and it’s interesting how these simple rules of brain health align with the base of that needs pyramid. In our hypermodern world, we live more or less in the top two areas, esteem and self-actualization. And yet many of us, perhaps most of us, merely “check in” with those needs at the bottom. And our brains suffer.
Insurance is, essentially, gambling on disaster. You pay into a system just in case you’re unlucky enough to suffer an accident. Farmers wants you to hedge that bet, in their favor, by taking care of your mind so you don’t increase the odds of becoming unlucky. Take this either way you like—either insurance companies don’t want you injured, or they don’t want to pay you.
Either way, you are better off paying and not needing it than not paying and needing it. The same is true for brain health. Of course, if you’re lucky, you can get away with not paying and not needing it. But we can guarantee you, when it comes to brain health, you should “pay” into maintaining these basics, because you will definitely need your brain.
Wednesday, November 6, 2013
Defending Brain Training
We’re pro brain-training at The Great Brain Robbery, so when we see articles that suggest brain-training doesn’t “work,” we’re a bit biased. We offer this self-description as a form of full disclosure before proceeding to the following discussion.
An article at Science Daily (a website we have much respect for and read every day) reports on a study forthcoming in Psychological Science. This study assigned older adults to one of two groups: one group learned new skills such as photography, while the other was tasked with listening to music and doing puzzles at home. Both groups included subjects who were assigned social tasks, to control for the effects of socializing. Researchers found that those who were assigned to learn a new skill showed greater memory improvement.
The article quotes the scientists as saying “It seems it is not enough just to get out and do something -- it is important to get out and do something that is unfamiliar and mentally challenging …. When you are inside your comfort zone you may be outside of the enhancement zone.”
This makes sense, and is very encouraging, because for anyone with a brain, there should be a world of challenges available to keep the mind sharp. And while playing video games is not the same as taking an eight-week course on Taiwanese Dessert Decorating, at least games can still be challenging, and importantly, readily available.
But we want to point out that this study does not mention a control group of subjects who did neither the skill-seeking activities nor the at-home puzzle activities. So while, yes, learning basket weaving on a three-week trip through Namibia would probably be very good for the brain, this doesn’t mean that challenging yourself to play brain-training games wouldn’t be.
Note also that the article calls listening to music and doing puzzles “non-demanding mental activities.” We especially disagree with this assertion, inasmuch as doing a crossword puzzle can be merely an “exercise” (like, on a Monday), whereas anyone who’s noodled through a Friday New York Times crossword puzzle knows how challenging—and stimulating, not to mention rewarding—it can be to finish one.
So the takeaway we’re going to agree upon with this research is that more research needs to be done. And in the meantime, you don’t need a good reason to play games. Fun is its own reason.
Tuesday, November 5, 2013
No More Poor Brains
Poverty is a political issue, not a social issue. We have the wealth, we have the means of redistributing it, and we even have the will to do so; we just don’t have the justification. Our country is in the grip of an idealism that puts a primacy on so-called “hard work” and tautologically insists that if you don’t have anything, it must be because you didn’t work hard enough.
This is backwards logic, and literally harmful to those who have to suffer under the yoke of financial oppression. And new research shows that poverty is harmful to the brain. This is no longer a matter of haves versus have nots, or even those who deserve and those who don’t. This is a matter of ignoring the injuries suffered by those who were, against their will, born without privilege.
Define poverty however you like, but noise pollution, overcrowding, and meager access to nutrition can cause the impoverished to suffer from reduced brain health. These are the same poor that some politicians believe do not deserve access to good health care. Nevertheless, it’s these same poor who are the major contributors, through labor, to our gross national product. It’s these same poor who, despite not having access to health care, pay for it.
The Great Brain Robbery is walking a thin line here, we realize, taking on a political issue and not discussing brain science. But science is, at the end of the day, merely a tool that we use to better our lives. The essential word there is “our.” This research which has found the extremely negative impact of poverty on the physiology of the brain should be a wake-up call. This is not an issue that can be mitigated by shifting our social paradigms. We’re not going to be able to justify a lopsided distribution of wealth and call it culture. We are allowing the dehumanization of our poor citizens, and it has to stop.
Monday, November 4, 2013
Get the Fat Out of Your Brain
The people behind The Great Brain Robbery follow a lot of blogs and folks on Twitter. Brain-talk is a rich field, with lots of research, many different points of view, and thousands of theories all coalescing in a miasma of opinion, explanation, and interpretation. Sometimes we see things we disagree with. Sometimes we feel like this casual blogging stuff sends the wrong message.
For example, a blog post we found today that asserts: “The more a woman weighs, the worse her memory.” We’re not going to link to this blog, but we’re going to take it apart. First, the study that found this correlation between BMI increases and memory test decreases only studied women ages 65 to 79. That’s not “women” in general, so the above assertion is spurious at best.
Then there’s the use of the BMI, a terrible way to classify anything except unrelated metrics. Take a person’s weight in kilograms and divide by the square of their height in meters. The taller you are, the more you weigh, and if you are not tall enough to weigh as much as you do, you’re fat. Except that taller people will weigh more than a simple two-metric function will predict, making them “fat,” and the BMI doesn’t, obviously, account for what gives a person mass in the first place. Bodybuilders are fat, and people who suffer from anorexia are skinny: who’s healthier?
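The formula really is just those two metrics, which is part of the complaint. Here’s a minimal sketch (the weights and heights are invented for illustration) showing how two hypothetical people with very different builds can land on the same number:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by the square of height in meters."""
    return weight_kg / height_m ** 2

# Same BMI, very different bodies:
print(round(bmi(95.0, 1.90), 1))  # tall and muscular -> 26.3
print(round(bmi(75.0, 1.69), 1))  # shorter and sedentary -> 26.3
```

Nothing in the function knows whether those kilograms are muscle or fat, which is exactly the objection.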
Who’s dumber? Because when bloggers start throwing around sentences like “the worse her memory,” that’s a judgmental statement. And this blogger later says “Remember, everything the brain does affects memory (and everything affecting memory affects the brain).” Therefore, if you’re a woman who weighs too much, expect to have decreased brain function, according to this blogger.
The blogger doesn’t bother to show a causative relationship between weight gain and memory dysfunction, but does say that another study showed that people who had bariatric stomach-bypass surgery showed mental improvements. This same blogger, in the past, has talked about the very positive effects happiness has on memory. Who, in this overly critical society, wouldn’t be happier to be, finally, skinny?
Does gaining weight affect memory? Or does not exercising affect memory? The blogger says “Though exercise doesn’t do much to cause weight loss, it has many other benefits… that can directly benefit memory.” Did this study of older women control for the ones who, despite having a higher BMI, were exercising regularly?
How about this: let’s stop using inadequate tools to focus on vaguely correlative measurements and drawing painfully obvious conclusions.
As you age, good health is important for maintaining brain function. We’ve known that for centuries.
Friday, November 1, 2013
Delegating the Brain’s Delegation
You’re busy. You’ve got things to do. You don’t have time for a lot of disorganization. So when you walk into the reception area of the hotel where you’re presenting a paper on neuroplasticity, you expect to see signs pointing you to the various areas you need to go. Places like the executive center, so you can download a few slides and print them. And the coffee shop, so you can grab a quick cup to fuel your talk. And of course, the restroom.
But what if the sign was garbled? What would you do? You’d ask someone. But what if there’s no one behind the reception desk? You’d ask someone else. Another hotel patron, maybe, who, while not officially designated to give directions, might be able to point out the places you need.
A new paper in the Journal of Neuroscience suggests that the brain may be able to do the same kind of thing.
Your brain has different areas for recognizing faces and landscapes, and another part of your brain assigns attention to whichever area is required for the task at hand. If you want to remember a face, for example, this latter area tells the landscape area to relax while telling the face area to pay attention.
Scientists showed test subjects pictures of faces overlaid on top of pictures of landscapes. They had them concentrate on only one aspect of the picture to memorize it, all while the scientists used magnets to temporarily knock out the “delegator” portion of the brain. With the delegator knocked out, subjects were still able to remember faces or landscapes as required.
They looked at brain scans while they did this, and found that a different part of the brain was acting as a temporary delegator. Note that this was not a learned response, something that had developed over time. We already know about the brain’s remarkable ability to rewire itself and compensate for damaged parts. Instead, this was an immediate taking-over of delegation activity.
That’s one takeaway. And it’s intriguing. The Neuroskeptic at Discover is a little more, well, skeptical. He points out that the study didn’t show this reassignment for sure, only that another part of the brain spoke up when the magnets didn’t keep delegation from happening. It might have been the case that the scientists were wrong about how much delegation the delegator actually does.
(In our hotel analogy above, that might be like the hotel patron drifting over to the sign you’re looking at, watching you read it.)
Nevertheless, it’s enough to warrant further studies, which will give us even more data in the quest to understand our decision-making processes.
You know, free will.
Thursday, October 31, 2013
This is Your Brain on Football
A friend of The Great Brain Robbery mentioned reading about ALS in athletes, specifically football players. So we looked it up, and indeed, a study published in Neurology in 2012 reveals some alarming results.
Professional football players in America have a higher risk of dying from ALS and Alzheimer’s disease, according to the study. (Researchers pointed out that they were examining death certificates specifically, and that without post-mortem examinations, it can be difficult to diagnose chronic traumatic encephalopathy as a contributor to an ex-football player’s death.)
And when we think of football players bashing heads, we think of the defensive line and the offensive line crashing their helmets together on every play. Linebackers and running backs, for example, don’t see helmet-to-helmet contact on every down.
But the scientists looking at the data discovered that, in fact, it was the high-speed players, not the linemen, who were even more likely to die from diseases related to damaging brain cells.
Those big hits have an effect that literally lasts for the rest of a player’s life.
What can be done about this? Other than rule changes, not much. In an interview with Freakonomics radio, Dr. Robert Cantu said that helmets built to prevent death may do a worse job of preventing concussions. So, while deaths during football games are down to zero (for NFL players), deaths from having played football are as high as ever.
Fans often refer to football players as gladiators, men who killed one another for the entertainment of bored citizens. The analogy, hyperbolic when it was coined, is becoming increasingly, depressingly apt.
Wednesday, October 30, 2013
Playing Games Changes Your Brain (for the Better)
In front of you, a green field and blue sky.
And mushrooms with clown shoes waddling in your direction. You run around them, picking up large gold coins. This is not a dream. This is Super Mario 64. And while you play, your brain is changing.
Brain researchers at the Max Planck Institute in Berlin studied adults playing the popular 3D platformer and found remarkable changes in gamers’ cerebellum, right prefrontal cortex, and right hippocampus. These are areas typically associated with executive function, spatial orientation, and memory formation.
We have long known about our brain’s plasticity, the way it literally reshapes itself as we learn. But these findings are particularly intriguing, as they show a direct, causal relationship between playing and brain growth. And according to the research team: “The presented video game training could therefore be used to counteract known risk factors for mental disease such as smaller hippocampus and prefrontal cortex volume in, for example, post-traumatic stress disorder, schizophrenia and neurodegenerative disease.”
Even more intriguing is an additional data point: gamers who looked forward to playing showed even more change than those who merely played as requested, “evidence suggesting a predictive role of desire in volume change.”
We’ve discussed this sort of thing before, the idea that people have more control over their brain growth than science used to think: in a study (reported on by Scientific American) on permanent changes to working memory: “[Researchers] found that people who have a growth mindset about intelligence… showed greater improvement on the visuospatial reasoning tests than those who have a fixed mindset about intelligence.”
In other words—if you think you can get smarter, and you want to, and you want to have fun doing it, then playing games will definitely help.
Tuesday, October 29, 2013
Talking Away Those Brain-Stress Blues
A man, middle-aged, reasonably healthy, walks into his living room. He’s cleared an area in front of his gas fireplace, and sits down, crossing his legs. He sits with good posture, eyes closed, arms loose in his lap. He takes a long, deep breath, lets it out slowly. Another deep breath, another soothing exhalation. The man appears relaxed, at ease, content.
Blood flow in his cerebral prefrontal cortices slowly decreases. He takes another breath, opens his mouth, and says “womba.”
He exhales. Breathes in. “womba choo dee latta.”
His posture straightens just slightly. He inhales. “womba choo dee matta fen, fenno chum de ganafed. Arrana sanipe. Arrana sanipe dee matta fen, womba choo dee matta fen ro sanipe, sanipe dreepa, sanipe dreepa saaaaaa….” He inhales, exhales, his lips murmuring more nonsense syllables.
None of the words come from any known language. He does this for an hour.
~~~
Glossolalia is the technical term for speaking in tongues, a phenomenon seemingly reserved for Pentecostal church meetings and Haitian voodoo rituals. Actually, glossolalia occurs all over the world in a diversity of cultures, including Japan, Egypt, and India. Adherents claim to be speaking a divine or lost language, either by channeling a holy spirituality or as the result of a gift from a god or ancestor.
But there’s more to glossolalia than just nonsense syllables and church devotion. In 2006 a group of scientists studied the brains of those who could, at will, speak in tongues, and found that their brain activity differed from a control group’s. Another study, performed in 2010, found that people who spoke in tongues had lower levels of the so-called “stress” hormone cortisol. This evidence suggests glossolalia is not merely a psychological phenomenon, but a biological one.
The scientific case for meditation as a means to reduce stress is well established. But glossolalia is not meditation. Andrew Newberg of the 2006 study, speaking to Dick Hanson and the DANA Foundation for an article on glossolalia, points out that as far as the biology of the brain is concerned, “In some sense, [glossolalia] is the opposite of the concentrative process of meditation.”
While it’s easy to dismiss the subject of speaking in tongues as nothing more than pseudo-scientific gibberish, the results of these studies suggest there’s more here than meets the ear. We live in a hectic, non-stop, information-overloading world, so any technique that one discovers to reduce stress and increase relaxation has to be worth taking the time to explore.
(But maybe try this one alone.)
Monday, October 28, 2013
What Can Your Brain Do While You Drive a Car?
Not very much. And nothing very well.
Your brain can’t really do anything else while you drive a car. Every time you do something else, the emphasis is on the “else.” While you’re checking your phone, you’re not driving anymore. The car is on autopilot.
We’re not including the things it does automatically, like regulate your breathing, heartbeat, and so on. But your brain can do those things when you’re asleep.
But you wouldn’t drive a car while asleep, would you?
Scientists have discovered that so-called “multi-tasking” requires a splitting of the brain. Different parts of your brain will be used to do a variety of things. If you eat popcorn while watching TV, part of your brain moves your hand into the bowl, while another part pays attention to what Daryl Dixon is doing to zombie number 62.
But driving is not just holding the wheel straight and watching the road. Your working memory temporarily memorizes where all the other cars are so you can keep a sense of them as they move around you. Your long-term memory is recalling landmarks and routes for you to follow. Your hippocampus is filtering out the sound of other cars but still listening for horns and sirens, making sure to send those impulses to your frontal cortex, where you can process what to do with them.
So, while you’re checking your phone, you’re checking your phone, and nothing else. You’re wiping your working memory clean (where did the other cars go?), distracting your long-term memory (was that a right here or a left?) and confusing your hippocampus (I hear a horn, but there’s no horn icon in this text message).
And even if nothing bad happens while you’re looking at the phone, when you look up again and disengage autopilot, you’re still not driving—you’re getting a whole new set of data before you’re fully immersed in the act of driving again.
So, go ahead, give yourself an estimate—how long did you glance at that phone? Two seconds? Now double that, to account for how long it takes your brain to switch.
A car traveling 55 mph covers 322 feet in 4 seconds. And since it takes a person about a second to respond to a sudden change in environment, and since most cars take about 120 feet to come to a stop with a sudden application of the brakes from 55 mph, that all adds up to over 520 feet.
Anything that happens less than two football fields away from you when you check your phone is going to be an accident.
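The back-of-the-envelope arithmetic above can be checked with a few lines of Python. This is a sketch using the post’s own ballpark figures; the function name and parameters are ours, not from any traffic-safety formula:

```python
# Rough stopping-distance arithmetic for the 55 mph example above.
# All figures are the post's own ballpark numbers, not precise physics.

MPH_TO_FPS = 5280 / 3600  # 1 mph = 1.4667 feet per second

def distance_before_stopping(speed_mph, glance_s, reaction_s, braking_ft):
    """Feet covered from the moment eyes leave the road until the car stops."""
    fps = speed_mph * MPH_TO_FPS
    # Distance while glancing + re-engaging + reacting, then braking distance.
    return fps * (glance_s + reaction_s) + braking_ft

# A 2-second glance, doubled to 4 seconds for the brain to re-engage,
# plus ~1 second of reaction time and ~120 feet of braking from 55 mph.
total = distance_before_stopping(55, glance_s=4, reaction_s=1, braking_ft=120)
print(round(total))  # 523
```

Plug in your own glance time and the number climbs fast—at highway speed, every extra second is another 80 feet of blind travel.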
A Happy Brain’s a Healthy Brain
Listening to recordings from the summit requires a paid membership to the site, but the slide decks are free to view. Topics include “What are scalable best practices to spread smart health?” and “How can Big Data help upgrade brain care?” I’ve been looking at “The Future of Personal Brain Health.”
This is a slide deck that accompanied a talk, so it amounts to visual notes—I don’t have a transcript of what was said. But I found several points very intriguing. For example, Kaiser Permanente has an ad campaign that ties mental wellness to physical wellness.
Hugs = Healthy
Happy People are 50% healthier.
I’ve discussed happiness and memory before but what about overall health?
I decided to look up the science behind this, and apparently it’s related to cortisol and serotonin. Cortisol is a steroid your adrenal glands make in response to stress. Amongst other (necessary) things, cortisol suppresses immune function. And then there’s serotonin, which is “made” in the raphe nuclei of the brain, is used to regulate intestinal movement, and can affect mood. Although this is not a causal relationship, cortisol is usually higher when serotonin is lower, and vice versa.
So if you get stressed, your immune system is suppressed, you are not happy, and you get sick. If you’re not stressed, your healthy body produces serotonin, and you feel happy.
(Note that this is a one-way circle: this isn’t to say that if you make yourself happy, you will make yourself healthy.)
But back to the mind: most interesting to me is how cortisol and serotonin play a part in the formation of memory. Cortisol mixes with adrenaline to encode highly detailed short-term memories: a high-stress situation can etch a memory into a person’s mind. However, long-term stress can allow cortisol to impede memory function in the hippocampus. On the happier side of the coin, serotonin impacts learning and memory—until we get older, when changing serotonin levels alter associative memory.
So, in a nutshell, happiness is good for us, which seems obvious, but also helps us learn—but when we’re not happy, we can still learn short-term solutions to problems, to reduce stress, and get back to long-term behaviors to maintain happiness and health.
Friday, October 25, 2013
Your Brain Doesn’t Like Your Sweet Tooth
Ignorance is bliss. Bliss is sweet.
And we’ve got the science to transform those ideas into literal realities.
We’ve long known the importance of a healthy diet to keeping a healthy brain. We know that some foods in particular seem to boost brain function, and we know some foods can have a debilitating effect. And now scientists in Germany have found a direct correlation between elevated blood sugar and poor recall.
The experiment (which the researchers themselves note had a small sample) included memory tests and brain scans, and, as a preliminary step, suggests a need for more study on how glucose directly impacts memory function.
We’ve all had listless days, feeling low, light-headed, and knowing it’s because our blood sugar is too low. But what about those days when we’re pepped up, maybe on overdrive? Getting that “sugar high” and careening around in a whirlwind of jittery limbs? Are we shaking memories right out of our heads?
No need to worry about that for now—more tests are needed. Just add this as yet another reason to practice moderation. Eat healthfully and stay active, and your brain will be thankful. The better your brain functions the more likely you are to remember to eat well.
After all, to misquote a certain supermodel: Nothing tastes as delicious as having a healthy mind feels.
This is Your Brain on Melatonin
“Your brain’s in a fog. You’re staring at the computer screen, trying to remember something—but you can’t even remember why you’re trying to remember it. You’re hunched, slack-jawed. What’s going on? Is this some kind of dementia? Alzheimer’s?”
Sound familiar?
That is, have you read the above paragraph before? It was posted here a week ago, on an entry about sleep. If you read it then and can remember it now, chances are you’ve slept a few nights since that first reading. And the same chemical that made you sleep probably helped you remember what you’d read.
Sleep and memory are inexorably linked, something scientists have theorized since the early nineteenth century, and tested as early as the 1920s. A study in 2006 found that the brain replays past events during sleep: rats were run through a maze, and brain scans during the run and while sleeping showed duplicate brain patterns. These patterns were occurring between the brain’s visual centers and the hippocampus; while the rats slept, visual memories were being moved from short-term to long-term memory (and, interestingly, the memories were played backwards).
Melatonin is essential to sleep, but the connection between melatonin and memory isn’t spurious. Your pineal gland produces melatonin, which causes drowsiness. While you sleep, melatonin acts as a powerful antioxidant, “washing away” free radicals (including toxins associated with Alzheimer’s). This action is part of the process that allows for that “dialogue” between the visual cortex and the hippocampus.
Nutshell: As your eyes are exposed to less and less blue light at the end of the day, the pineal gland responds by producing melatonin, which makes you drowsy; and as you sleep, melatonin removes free radicals and facilitates the movement of experiences into long-term memory.
(And one theory holds that we experience this memory storage as dreams—and if the mechanism is not interrupted, you tend to wake up without any memory of dreaming, since the experience of dreaming is a separate conscious observation from the experiences that you were dreaming about—and there’s no “reason” to remember those temporary observations).
If you don’t remember the first paragraph, above, don’t worry: your brain encounters millions of pieces of data every day, and you can only process so much.
But if you want to make sure you do remember for later, why not sleep on it?
Thursday, October 24, 2013
All Brains are the Same
But, all brains are definitely not the same, right?
For the most part, scientists don’t really know much about the brain. While a lot of research has been done, a lot of hypotheses tested and a lot of theories proven to be true, the best science can determine is that there’s way more that we don’t know about the brain than what we do know.
For every brain rule we find, there tend to exist brains which break those rules. This part of the brain is for speech, this part for math, this part the sex drive, this part makes your arms and legs work… but what about a man who was discovered to have none of those areas in his brain? A civil servant and father of two children, he was shown to have a brain that was mostly fluid—estimates say he had as much as 75% less brain tissue than someone who is “normal.”
And then there’s Einstein’s brain, cut out of his head (after he died, of course) before the rest of him was cremated. Sliced into thin sheets, imaged, and studied for decades. What did they find? Not much. A more or less normal brain, except for a missing Sylvian fissure, which may have resulted in a slightly wider parietal lobe, and may have meant some of his brain matter was more tightly packed together.
But they’ve also shown that, in general, women, who are physically smaller than men, nevertheless have the same amount of brain matter, just more tightly packed together. Theories suggest that this tightness of neurons results in faster neural communication.
More or less, any one healthy brain is the same as any other healthy brain. Even though we know this isn’t true, even though the very structure of our individual brains causes and is caused by our own distinct individuality.
But until we know more about the brain, and how microscopic differences make us different, one brain is as good as another.
It’s what you do with it that counts.
Education and our Malnourished Brains
Are we getting smarter? A bunch of smart people got together and found that, no, we’re not.
For a very specific and compelling definition of “we,” that is. The gap between the academic performance of the haves and have-nots is growing, at least in the US. And while we, as a nation, certainly like to point at the outliers, evoke the notion of rugged individualism, and then, ironically, say “look how great we are,” the truth is that the majority of people in this country are not performing at their full potential.
A report by Jonathan Plucker, Jacob Hardesty, and Nathan Burroughs, called “Talent on the Sidelines,” says that we are losing the minds of our school children. They studied a variety of education assessments, and found that while there has been an increase in the scores of white and well-off students, minority and poor students are, at best, performing at the same levels as 15 years ago, and in some states, they’re doing worse.
Plucker et al. say that this means we have a “permanent talent underclass.” And it is difficult, in the face of this, not to consider the political environment that has led to this condition. Underfunding of education. The downsizing and removal of school nutrition programs. Even something like the cost of healthcare can keep parents out of the home, working overtime to pay for a meager lifestyle, instead of being able to spend time at home, participating in their children’s education.
And it’s a vicious cycle—underperforming school children become underpaid adult workers, who have children that enter the same broken education environment.
What can we do? According to Plucker et al., one thing we can do is start making excellence a focus. Rather than spend billions on assessing and cultivating achievement minimums, forcing educators to teach to an arbitrary standardized test, we should start working on ways to move underachievers ahead of the curve.
BUT GREAT BRAIN ROBBERY, WHAT DOES THIS HAVE TO DO WITH BRAINS?
Uh… well, one of the things we champion here at TGBR is brain training. You can strive for excellence by making your own brain excellent. Contribute to the culture!
Wednesday, October 23, 2013
Bashing the Furniture in my Brain Attic
I was struggling to find something to say about brains today, so on a whim I googled “Sherlock’s Brain.” I got this: Mastermind: How to Think Like Sherlock Holmes by Maria Konnikova – digested read. I skimmed it, decided I liked the phrase “obliquity of the ecliptic,” then got distracted by something.
I went back to read the whole thing later, and found myself a bit put off by the writer’s tone. Who does she think she is, this Konnikova?
So I looked for a different link from my original Google search. I clicked on “Sherlock Holmes and the infamous brain attic - Boing Boing” and got an article written by Maria Konnikova.
Grrrr.
But I read it anyway. And the tone was very different. And informative! Sherlock, via Sir Arthur Conan Doyle, of course, writing in the late nineteenth century, thought of the human brain as an attic, to be filled with furniture as appropriate, to be cluttered up or only furnished with useful pieces. Konnikova points out that Doyle’s analogy is a good one, given the way our brains interpret, process, store, and retrieve experiences. Memory is indeed like an attic. So I felt better for having read this second piece.
But not a single use of the phrase “obliquity of the ecliptic.” So what had I read before at the Guardian?
I went back and realized my error. The Guardian piece was written by John Crace. It’s a (satirical) digest of Maria Konnikova’s book. The article on Boing Boing IS by Konnikova, and is a promotion of her book.
So what I had initially read was a fake digest, or a faux facsimile, or a sham of a simulacrum. And what I read second was an introductory promotion, or an appetizing distillation, or a tantalizing opportunity for further investigation.
In other words, as I was reading these articles, storing furniture in my attic, I was trying to stack a baby grand onto a spindly ottoman. With enough balance, I might have achieved it—but when it came time to recall what I’d read, who knows what would have crashed through.
Probably something about fairies. Apparently, Doyle and/or Konnikova believed in them at one point. I’m not sure what that has to do with any of this, but unlike Sherlock, my own attic is already pretty crammed as it is.