
Wednesday, February 5, 2014

When being mean is actually being nice ... and when it is just being mean

It's a harsh world in academia. We all already know that academic science is not a carebear teaparty, and apparently now things are worse than ever as far as funding and potential jobs for Ph.D.s go.

A brief interruption, for an ad:
Use Grammarly's plagiarism checker online because it's better to have a computer criticize you than a person.
 And now back to your regularly scheduled blogramming.

Scicurious has a great post up at Neurotic Physiology about what it is like to be out in the 'real world' and out of academia. She has some fascinating points about how academia has skewed her perspective on things, but one in particular jumped out at me: That now she has to re-learn how to take criticism.

Scicurious says:
"I remember a time when I took criticism well. I did a lot of theater and music, it was something you HAD to take well. I took it, I improved, worked harder, fixed things, and did better. Sometime during grad school, however, criticism began to paralyze me. Every critique felt like a critique of me, as a scientist. Since a scientist was what I WAS, all criticism began to feel like criticism of me, as a person. Sometimes it was indeed phrased that way. You are careless. You are not smart enough, why don't you get this?! You are not focused."
This got me thinking because, honestly, I feel exactly the opposite. I think I learned how to take criticism in grad school partly by learning how to give it.

It's for your own good! (source)

When I am editing a paper or grant for someone, I am trying to help them: the more critical I am, the better their paper or grant will be. The paper is headed to peer review, which will determine whether it gets published, and the grant is headed to a study section, which decides whether it gets funded. Both are grueling and rigorous examinations of quality and scientific merit. These review processes are so important because published papers and funded grants are 'science currency' and will determine your future. In some cases the funding status of a grant can determine whether a lab stays open or a PI gets tenure.

If there is a paragraph that doesn't make sense, or (gasp) a typo, it is obviously better for me to catch it than for someone important to see it and get confused or frustrated.

Understanding this concept, that constructive criticism is the nicest thing a scientist can do for another scientist, taught me to take criticism much better than I had previously. I was one of those students who was always enough better than the other students that teachers rarely bothered to push me toward true excellence. So I was really not used to criticism, and the initial slings and arrows in graduate school did sting. At some point, however, it really sunk in that these criticisms were making me better... better at everything: writing, presenting, scientific thinking.

So that is when being mean is actually being nice.

That said, no one ever told me I wasn't 'smart enough,' as Scicurious describes. Just because a constructive and thorough critique can sound mean but actually be nice doesn't mean that there is no such thing as 'meanness' in academia.

Sometimes being mean is really just being mean. A criticism that does not help me improve in any way is just mean. 'You are not smart enough to be a scientist' does not help anyone become a better scientist. It is a completely different kind of criticism from 'you really need to read more about X because you don't understand how X works.' Both are directed at 'you' personally, but one says you can't do it, while the other says you can do it and even suggests how.


© TheCellularScale

Thursday, September 12, 2013

Use Imposter Syndrome to become an excellent grad student

Let's talk about Aristotle for a minute.

The School of Athens: Aristotle is the one in blue.

Many people mis-attribute this quote to him:

"We are what we repeatedly do. Excellence therefore is not an act, but a habit." -Will Durant
But really this quote is from someone summarizing Aristotle. It's a great summary and it seems to say what Aristotle means, just more concisely.

Aristotle does say:
"For these reasons the virtues are not capacities either; for we are neither called good nor called bad, nor are we praised or blamed, insofar as we are simply capable of feelings. Further, while we have capacities by nature, we do not become good or bad by nature." Nicomachean Ethics Book II 5.5
Ok, so what does this have to do with grad school?

Well, lots of people are starting grad school right now with lots of potential. Tons of potential, probably; it's what got them into grad school in the first place.

But here's the thing, your potential doesn't mean anything unless you live up to it (or at least come close). Basically Aristotle says that your feelings and intentions and capabilities do not make you excellent, your actions do.

The real lesson here is that you ARE what you DO. If you want to be a good person, think 'what would a good person do in this situation?' and then do that thing. Simple, really. In grad school, this translates to:

Make Imposter Syndrome work in your favor.

Imposter Syndrome is when someone thinks 'I'm not good enough to be where I am, and I'm just minutes away from the moment my colleagues find out.' It is apparently a plague of many grad students, and there are plenty of blog posts around on how to combat it.

But guess what? Playing dress-up can make you smarter. People wearing a white coat described as a lab coat did better on attention tasks than people wearing the same white coat described as a painter's coat (Adam and Galinsky 2012). These are the same researchers who did the perspective-taking experiments showing that when you pretend to be something, you become more like it. (See item number 4 on this post.)

Pretending to be what you want to be is actually a completely valid and useful way to become what you want to be. This doesn't mean go into class and pretend you are the professor (that's not a good idea). It means go into class and pretend you are the BEST student in that class. 

So go put on those 'smart person clothes' and make believe that you are the best student that school has ever seen. If you run into a dilemma think to yourself 'what would an excellent grad student do in this situation?' or better yet think 'what would an excellent scientist do in this situation?' and then do that thing.

© TheCellularScale

ResearchBlogging.org
Adam and Galinsky (2012). Enclothed Cognition. Journal of Experimental Social Psychology. DOI: 10.1016/j.jesp.2012.02.008

Tuesday, July 30, 2013

Treatise on the Diseases of Females: Pregnancy in the 1800s

While looking through some seriously old books, I came across a medical treatise from 1853. Now this would be fascinating on its own, but even better, it's a treatise specifically about the "diseases of females" written by William P. Dewees, M.D.

William Dewees (from Wikipedia)
Having recently been pregnant, I was particularly interested in the 1800s recommendations for pregnancy.

Dewees starts out his chapter on pregnancy by explaining why it is important to scientifically determine whether a woman is pregnant or not. The reasons are essentially as follows:

1. So if the woman needs to be treated for some other disease, she doesn't get prescribed something that would hurt her or the baby if pregnant.
2. Because if she is under trial or awaiting execution, pregnancy might forestall it.
3. If the predicted date of birth might influence the 'character or property' of someone else.

So yes, clearly it is important to know if a woman is pregnant.

So how do you tell in the 1800s, when no pee-sticks with plus signs were available? Not surprisingly, the first way is 'she doesn't have her period.' However, there was clearly some debate in the field at the time.

Other things can 'suppress the menses' and sometimes a woman can bleed while pregnant.

Dewees spends excessive words and semi-colons defending his position on the subject:

"In declaring that women may menstruate after impregnation, I have no favourite hypothesis to support; nor am I influenced by any affectation or vanity to differ from others; neither do I believe I am more than ordinarily prone to be captivated or misled by the marvellous; for I soberly and honestly believe what I say, and pledge myself for the fidelity of the relation of the cases I adduce in support of my position." *

So you need some signs of pregnancy other than just not menstruating. Next up: nausea and vomiting. Though "far from certain" as a sign of pregnancy, in conjunction with other signs it is 'added proof.'

Another sign is the enlargement of the sebaceous glands (which are on the areolae around the nipples), and the formation of milk. But milk coming in is also not a certain sign:

"I once new a considerable quantity of milk form in the breasts of a lady, who though she had been married a number of years had never been pregnant; but who at this time had been two years separated from her husband. She mentioned the fact of her having milk to a female friend, who from an impression that it augured pregnancy, told it to another friend, as a great secret; and thus, after having enlisted fifteen or twenty to help them keep the secret, it got to the ears of the lady's brother. Her surprise was only equaled by his rage; and, in a paroxysm, he accused his sister, in the most violent and indelicate terms, of incontinency, and menaced her with the most direful vengeance." *

It turns out the lady was not pregnant, but was sick with 'phthisis pulmonalis.'

So finally, the surest signs of pregnancy are the enlargement of the uterus and abdomen, and feeling the baby move ("quickening").

(also mentioned are the 'pouting of the navel' and the 'spitting of frothy saliva')

*All quotes from Treatise on the Diseases of Females by William P. Dewees

© TheCellularScale

For more on historical pregnancy medicine, see some great posts from Tea in a Teacup

Sunday, July 7, 2013

Male DNA in the Female Brain

When you are pregnant, people like to tell you all sorts of things about yourself.


probably the most complimentary thing I have been compared to.

"you are going to have a boy/girl"
"you are carrying high/low"
"you look like an olive on a toothpick/beached whale"
"you probably have some of your husband's DNA/baby's cells in your brain now."

huh?

That last one requires a little more explanation. How could foreign cells from outside get into my brain? First of all, there is the blood-brain barrier, which prevents your own blood cells from mixing in with your neurons; second, there is the placental barrier, which prevents your blood from mixing with the baby's blood.

Neither of these barriers is perfect. Certain drugs and chemicals can cross the blood-brain barrier, and drugs and chemicals that a pregnant woman ingests can cross the placental barrier to reach the baby. But are these barriers so leaky that whole cells can get through?

Apparently they are. Dawe et al. (2007) explain possible ways this can happen.

The placenta, up close. (Dawe et al., 2007, Figure 1)
The placenta develops with the fetus, so it is a hotbed of new, growing cells early in pregnancy. It is made up of a combination of cells that contain the mother's DNA and cells that contain the new baby's DNA. However, it is not clear exactly how baby cells get transferred to the mom. In the authors' words:

"The mechanism by which cells are exchanged across the placental barrier is unclear. Possible explanations include deportation of trophoblasts, microtraumatic rupture of the placental blood channels or that specific cell types are capable of adhesion to the trophoblasts of the walls of the fetal blood channels and migration through the placental barrier created by the trophoblasts." (Dawe et al., 2007)

It is also not clear how these baby cells, once in the mother, could cross the blood-brain barrier. In fact, it is not perfectly clear (as of this 2007 paper) that these cells do get into the mother's brain in humans, though studies have shown fetal DNA-containing cells in the brains of mice.

So in conclusion, if you have ever been pregnant, you probably still have some of that baby's DNA (and consequently some of the baby's father's DNA) in your body. If you were pregnant with a boy, then you probably have Y chromosomes in some of your cells! It even seems that mothers can transfer cells from previous babies into future babies. This means that if you have an older brother or sister, you might have some of their DNA in your body as well.


The next question is: do these cells with foreign DNA have a meaningful impact on your body?

© TheCellularScale



ResearchBlogging.org
Dawe GS, Tan XW, & Xiao ZC (2007). Cell migration from baby to mother. Cell Adhesion & Migration, 1 (1), 19-27. PMID: 19262088


Monday, June 10, 2013

The Ultimate Simulation

You may have noticed that things have slowed down here at The Cellular Scale.

The reason is that I have been really busy making the ultimate simulation of a human brain. I've worked on this day and night for the past 8 months. It is exhausting work and is starting to take all my energy now that it is nearly complete.

Pretty soon I will have to push this simulation out my uterus, and then all my effort will be spent on simulation support and maintenance. I may write some posts, but they won't be appearing regularly for a while.

© TheCellularScale

Monday, May 27, 2013

What is an Experiment?

What is an experiment?
Experimenting with color (source)

People use the term 'experiment' to mean a lot of things. One may say "she experimented with drugs" or "she performs experimental music". Someone might 'experiment with one's hair' or 'perform a thought experiment'.

All of these uses of the word experiment have distinct connotations, but most of them essentially mean 'to try something and see what happens'. In the examples above, most of the phrases also imply that the experiment is something new. If she experiments with her hair, she's probably trying some new style and seeing if she likes it. If she performs experimental music, she's probably not following the conventional rules for the music she is playing.

Are these kinds of 'experiments' different from the experiments that scientists do? Well, yes and no. The basic definition of an experiment as 'trying something [often new] and seeing what happens' is pretty much what scientists do. So what's different? Why isn't someone's 'hair experiment' publishable in a scientific journal?

Mythbusters would have you believe that the only difference between science and screwing around is writing it down:


And that is sort of true.

But what really makes a scientific experiment scientific is controls. In our hair example, you can experiment with your hair by dying it black, and seeing if you like it. But that's not the scientific experiment. To be scientific you would have to decide how to measure how much you like your new hair color. You could do this by filling out a survey each day asking you how many times you thought you were pretty or rating your confidence on a scale of 1-10. You could fill this survey out for a week and then dye your hair and fill the survey out for another week. You could then compare the scores and decide if the new black hair had a 'significant' impact on your self-image. 

Let's say it does impact your self-image, and you report higher self-confidence that second week. But what if you feel different just because you have a new hair color, not because you have black hair?

Well, you would want to do a control experiment, which controls for the newness of the hair color. You could control for novelty by dying your hair yet a different color, and taking the survey for another week. Or you could take the survey two months after you dyed your hair black to see if you still report higher confidence or if your confidence has dropped back down to normal.
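To make the bookkeeping concrete, here is a minimal sketch of that hair-color 'experiment' in Python. Every number below is invented purely for illustration, and the week lengths and group names are my own assumptions, not anything from a real study:

```python
# Hypothetical daily self-confidence ratings (scale of 1-10), one per day.
# None of these numbers are real data; they only illustrate the design.
from statistics import mean

baseline  = [5, 6, 5, 4, 6, 5, 5]  # week before any dye
black_dye = [7, 8, 7, 6, 8, 7, 7]  # week after dyeing hair black
other_dye = [7, 7, 8, 6, 7, 8, 7]  # control: week after a *different* new color

def effect(test_week, reference_week):
    """Mean rating in the test week minus the mean in the reference week."""
    return mean(test_week) - mean(reference_week)

print(effect(black_dye, baseline))  # effect of black hair specifically?
print(effect(other_dye, baseline))  # or just the novelty of any new color?
```

If the control week shows roughly the same boost as the black-dye week, novelty, not the color itself, is the better explanation, which is exactly the kind of ambiguity a control group exists to resolve.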


This is not a perfect experiment by any means; it's not even a clever or well-designed one. But it is somewhat scientific, and it illustrates what I think is the most important difference between experimenting as in trying something new and experimenting as in trying to find something out:

The control group


In addition, here is a great example of how important the control group is in science. (See the epilogue)

© TheCellularScale



Sunday, May 12, 2013

The Inadvertent Psychological Experiment


Escape from Camp 14 is deeply disturbing, and I highly recommend it.

Escape from Camp 14 by Blaine Harden
Escape from Camp 14 is a chilling tale of Shin Dong-hyuk's escape from a North Korean prison camp. What is so interesting about Shin Dong-hyuk's story, as written by Blaine Harden, is that he was born inside this North Korean prison camp. Apparently the camps allow breeding between prisoners as a reward for 'good behavior.'

Escape from Camp 14 reveals the obscene violations of human rights that occur in North Korean prison camps. It was especially poignant to me because I am about the same age as Shin Dong-hyuk and could directly compare my memories of the specified years to his. For example, he escaped on January 2nd, 2005, and I couldn't help but think of the New Year's party I was at that year and how absurdly different my life has been from his.

This book struck me in a way that reading about the horrors of the Holocaust never could. Those atrocities happened long before I was born. But the atrocities in North Korea are happening right now. I mean right this minute in a prison camp, a child is likely being beaten, a woman is likely being raped by a guard (later to be killed if she happens to become pregnant), someone may be picking undigested corn kernels from cow dung to ease hir starving belly, and maybe two lucky prisoners are getting to have 'reward breeding' time. Right now. This minute. That is just nuts.

The other thing that struck me about this whole situation is that having children born into a hostile prison environment is an inadvertent psychological experiment. These children are raised without love and without trust. One of the sharpest points in the book is the reveal that Shin Dong-hyuk turned his own mother and brother in to the guards for planning an escape. He watched his mother's execution shortly thereafter and felt nothing but anger at her for planning an escape.

When he finally escaped, it was shocking to him to see people talking and laughing together without guards coming over to (violently) stop it. In Camp 14, gatherings of more than two people were forbidden. These prison children are raised on fear of the guards and suspicion of each other. One of the easiest ways to be rewarded is to tattle on another prisoner (for stealing food, for example), and the children learn this quickly.

If something drastic happens and North Korea dissolves, these children raised in prison camps will have a near impossible time trying to adjust to a life of freedom and will have a difficult time forming attachments and trusting others (as seen in Shin Dong-hyuk and other refugees from North Korea). Their personalities and psychological profiles could be fundamentally different from any other group on earth. These atrocities should be stopped and these people should be studied and rehabilitated.

© TheCellularScale

ResearchBlogging.org
Lee YM, Shin OJ, & Lim MH (2012). The psychological problems of north korean adolescent refugees living in South Korea. Psychiatry investigation, 9 (3), 217-22 PMID: 22993519

Monday, May 6, 2013

Everyone should learn everything.

Today I am getting on a bit of a soapbox about things.  Specifically about things scientists should learn.
Scientists should learn everything (source)
In an ideal world everyone would be good at everything, but as you have probably noticed, this is NOT the case. Some people are good at lots of things, some are really good at specific things but terrible at others, and some unfortunate people are generally bad at a lot of things and mediocre at a few.

Recently, I've been hearing increasing noise for scientists (or scientists-in-training) to learn X, whatever X is: 'scientists should learn art'; 'scientists should learn creative writing'; 'scientists should learn how to communicate to the public more clearly'; 'scientists should learn managerial skills'; and so forth.

This bothers me for a couple of reasons.

1. Why should the scientists learn all this stuff? Why aren't people clamoring for artists to learn microbiology, or for novelists to brush up on their molecular genetics?

and

2. What is wrong with some people being good at science and NOT being good at much else?

Yes, if waving a magic wand could suddenly make scientists good communicators, artists, and managers, I wouldn't object. But these things (like science itself) take training. And god knows, graduate students already get a lot of training.

And yes, running a lab takes managerial skills, and grant writing requires clear communication and storytelling skills. But instead of requiring one person to be good at all these things, why not divide up the labor a little and have a 'lab manager' help run the lab and a 'departmental grants guru' help polish the grants?

It is really easy to say 'scientists should learn X' because...

1. there is a perception that scientists are smart and can learn things easily

and

2. it is impossible to argue that things wouldn't be better if scientists were good at X. (Wouldn't it be great if all scientists were excellent public speakers? Yes, of course.)

The problem is implementing the extensive training in X that a scientist should have, and deciding what current training it would replace. Therefore I propose that 'scientists should learn X' statements be adjusted to say 'scientists should get extensive training in X rather than Y.'

© TheCellularScale

Wednesday, April 17, 2013

Van Gogh was afraid of the moon and other lies

I remember the first time I realized just how easily false information gets spread about.

A terrifying starry night
I was in French class in high school. Our homework had been to find out 1 interesting fact about Van Gogh and tell it to the class. When it was my turn, I said some boring small fact that I no longer remember. My friend sitting behind me, however, had a fascinating fact: When Van Gogh was a young child, he was actually afraid of the moon.

The teacher and the class were all quite impressed and thought about how interesting that was and how that fact might be reflected in the way that he paints the Starry Night. Though this fact was new to everyone, including the teacher, no one even thought to question its truth.

In fact, the teacher was so enthralled by this idea that she passed the information on to all the other French classes that day.

When talking to my friend later that day, he admitted that he had not done the assignment, and just made the 'fact' up. I was completely surprised, not only that someone had not done their homework *gasp*, but that I hadn't even thought to question whether this was true or not. 
The best lies have an element of truth (source)
Misinformation like this spreads like wildfire and is exceptionally difficult to undo. The more things you can link a piece of information to in your brain, the more true you might think it, and even after you learn that it's not true, you still might inadvertently believe it or fit new ideas into the context it creates. The myth that the corpus callosum is bigger in women than in men is just one of those things that is easy to believe.

An interesting paper by Lewandowsky et al. (2012) explains how this kind of persistent misinformation is detrimental to individuals and to society, using the example of vaccines causing autism. This particular piece of misinformation is widely believed despite numerous attempts to publicize the correct information and the most recent scientific findings showing no evidence for a link between the two.

The authors of this paper give some recommendations for making the truth more vivid and effectively replacing the misinformation with new, true information. For example:
"Providing an alternative causal explanation of the event can fill the gap left behind by retracting misinformation. Studies have shown that the continued influence of misinformation can be eliminated through the provision of an alternative account that explains why the information was incorrect." Lewandowsky et al. (2012)
Misinformation can be replaced with information, but it takes more work to replace a 'false fact' than to have the truth out there in the first place. It is much better when misinformation is never spread around at all than when it is retroactively corrected.

This paper is also covered over at The Jury Room.


© TheCellularScale


ResearchBlogging.org
Lewandowsky, S., Ecker, U., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing Psychological Science in the Public Interest, 13 (3), 106-131 DOI: 10.1177/1529100612451018

Sunday, April 7, 2013

LMAYQ: Scales

The word "scale" can mean many things, and The Internet can't yet use context to tell the difference. So for this issue of Let Me Answer Your Questions, here are questions about scales that The Internet thinks The Cellular Scale can answer. As always these are real true search terms, and all the posts in the LMAYQ series can be found here

A Question of Scale (source)


1. "Can you give a rat scales?"

I have never thought to ask this question, but it is an interesting one. If you can grow weird things on mice, like ears, then why not scales? Well, here's the thing: the 'ear mouse' is growing skin just like it normally does; the skin is simply growing over an ear-shaped mold. It would actually be harder to make a rat grow scales. If it is possible, it would take some mastery of genetic manipulation...

Bee-Rat, the ultimate achievement in genetic manipulation (source)

Some sniffing around on Wikipedia taught me that scales have evolved several times (fish, reptiles, arthropods, etc.). It might be possible to make a rat (or mouse) grow scales by isolating the scale genes from these other animals and inserting them into the rat genome. However, since rats already grow fur, teeth, and nails, which are related to scales, it might be possible to manipulate those features already in the rat to become more scale-like.

But to answer your question, no. I am pretty sure we can't give a rat scales yet.


2. "Does the giant squid have scales?"

Another interesting question. The quick answer is no: giant squid and colossal squid (like their normal-sized counterparts) have smooth skin without scales. This isn't too surprising, because squid aren't fish; they are cephalopods (like octopuses and cuttlefish). Cephalopods sometimes have shells, but not scales.

Zoomed in view of Squid Skin (source)
Instead of protective scales, cephalopods use pigment in their skin to camouflage themselves or confuse predators.

Blue Octopus, Eilat Israel (I took this picture)
This octopus turning blue sure confused me.


3. "How to turn your cell phone into a scale."

There are a couple of ways you might think a cell phone could be used as a scale. One is via the touchscreen sensor. However, most smartphones now have capacitive touchscreens, which respond to the electrical change your finger induces on the screen. That means the amount of pressure applied doesn't matter, so you couldn't use a smartphone as a scale that way.

Another way is through the accelerometer. Smartphones also have accelerometers, which you could possibly use to measure the force of something moving. But force alone wouldn't tell you the mass of the object unless you also knew the acceleration (force = mass × acceleration).

But really the only way that seems to actually work (albeit slowly and with questionable accuracy) is using the 'tilt sensor' of the smart phone.

But really, you might just as well make your own scale if you are weighing out small amounts of something.

Most importantly it's helpful to know what some typical objects around the house weigh, so you can use them to calibrate a phone or homemade scale.  Here are some useful weights:

1. US penny: 2.5 g
2. US nickel: 5 g
3. 1 ml water: 1 g
4. Euro coin: 7.5 g
5. British pound coin: 9.5 g
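As a sketch of how those reference weights could be used, here is a hypothetical Python calibration. The raw sensor readings below are entirely made up, and the linear least-squares fit is just one simple way to map readings to known masses:

```python
# Hypothetical calibration of a homemade scale using objects of known mass.
# The 'readings' are invented raw sensor values, not real measurements.
known_mass = {"US penny": 2.5, "US nickel": 5.0, "Euro": 7.5, "UK pound": 9.5}  # grams
readings   = {"US penny": 103, "US nickel": 201, "Euro": 298, "UK pound": 377}

# Ordinary least-squares fit of: mass = slope * reading + intercept
xs = [readings[k] for k in known_mass]
ys = [known_mass[k] for k in known_mass]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
        / sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

def to_grams(raw_reading):
    """Convert a raw sensor reading to an estimated mass in grams."""
    return slope * raw_reading + intercept

print(round(to_grams(250), 1))  # estimated mass for an unknown object's reading
```

With more reference objects the fit gets better, which is the whole point of knowing what household items weigh.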



4. "What is the scale on the cellular level?" 

Finally a relevant question! Most cells are measured in microns, with a blood cell being about 6-8 microns in diameter.

blood (source)
Neurons on the other hand can have somas (cell bodies) ranging from tiny (5 micron diameter) to large (50 micron diameter). But even for neurons with small somas, the dendritic or axonal arbors can be gigantic. 

Some neurons in Aplysia (a sea slug) can have somas up to 1 mm (1,000 microns) in diameter, which is ridiculously huge for a neuron. For perspective, C. elegans, a nematode frequently used in neuroscience research, is about 1 mm in length. The whole animal! Including its 302 neurons!

© TheCellularScale



Tuesday, April 2, 2013

Reviewers and Citations: an update

A few weeks ago, I asked whether reviewers should be required to cite sources in their summary statements for submitted journal articles.
I heart citations (source)
I was frustrated about my paper reviews (not an uncommon sentiment) because one reviewer made some serious claims against my basic assumptions, but did not include any citations to back these claims up. A few people in the comments suggested that I write to the editor.

So I did. I wrote a nice, polite letter saying that citations from the reviewer would be helpful. The editors wrote to the reviewer and then wrote back to me with the reviewer's reply. I was actually surprised at this, because I had a feeling that once the editor puts a paper on the 'will not consider for publication' list, they might not respond at all. But I got some citations from the reviewer, so I could find out where I got lost in my own literature search. And much to my delight, after scouring the papers the reviewer cited, I still feel that I am right and that my paper has a sound foundation. I have now re-submitted the paper (to another journal) and feel pretty confident about it.

Though of course I always feel more confident about a submitted paper than I should.

Submitting a beautifully written paper for review (what should we call grad school)

© TheCellularScale


Tuesday, March 26, 2013

Advice vs Victim-blaming: a proposed study on #safetytipsforladies

So there has been a lot of noise about whether giving women 'safety tips' to avoid being raped is a form of 'victim blaming'.

Don't get Raped (source)
This culminated in a great hashtag (as many things do). Follow #safetytipsforladies to see some lovely tips for avoiding rape.

For example, some suggest simply not being a woman, not ever drinking anything, not ever wearing anything (but not being naked either), not ever leaving the house (or, since many rapes happen inside the house, not ever being home). And so forth.

The main point is that it's absurd to tell women to not get raped. Rape by definition is NOT under the victim's control.

Yet people still tend to blame the victim in rape cases. An interesting study was published in 2011 showing that people were more likely to blame the victim in a rape case than in a robbery case. The authors gave people short vignettes describing either a rape or a robbery, and had these participants fill out a perpetrator blame scale and a victim blame scale.

Bieneck and Krahé (2011), Figure 4
Interestingly, but maybe not surprisingly, rape always drew more victim blame and less perpetrator blame than robbery, and this difference increased with how close the victim and perpetrator were to each other (stranger, acquaintance, ex-partner).

Now some people say, 'Hey, I'm just trying to keep women safe by telling them to avoid dark places and not take drinks from strangers.' But here's the thing: maybe the mere suggestion that women can do something to avoid being raped is enough to subtly nudge one's opinion toward thinking that if a woman got raped, she should have done something to avoid it and is therefore somewhat to blame.

So I propose the following study:

Have one group of people read a short article of tips for women to avoid being raped (a serious and well-meaning one), and another group read an unrelated article. Then have both groups read rape vignettes similar to the ones used in the Bieneck and Krahé study and fill out the victim- and perpetrator-blame scales. They would also fill out a scale for how much punishment the perpetrator should receive in a court of law.

I hypothesize that simply reading a list of well-meant tips for how women can avoid being raped would increase victim blame and make people more lenient in their prescribed punishment for the perpetrator.
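For what it's worth, the analysis for this hypothetical experiment could be as simple as comparing mean victim-blame ratings between the two groups. Here is a minimal sketch using a permutation test; all of the ratings, group sizes, and the 1-7 scale are invented for illustration:

```python
# Hypothetical sketch of the proposed design: one group reads safety tips,
# the other reads an unrelated article, and both then rate victim blame
# on scales like those in the Bieneck & Krahé study.
# All numbers below are invented; a real study would collect actual data.
import random

random.seed(1)

def permutation_test(a, b, n_iter=10000):
    """Two-sided permutation test on the difference of group means."""
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = a + b
    count = 0
    for _ in range(n_iter):
        random.shuffle(pooled)
        perm_a = pooled[:len(a)]
        perm_b = pooled[len(a):]
        diff = sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b)
        if abs(diff) >= abs(observed):
            count += 1
    return observed, count / n_iter

# Invented victim-blame ratings (1-7): tips group vs. control group
tips_group = [4, 5, 3, 5, 4, 6, 4, 5]
control_group = [3, 2, 4, 3, 2, 3, 4, 2]

diff, p = permutation_test(tips_group, control_group)
print(f"mean difference = {diff:.2f}, p = {p:.4f}")
```

A permutation test is just one reasonable choice here; the real comparison would presumably also control for participant characteristics, as the original vignette studies do.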

Somebody please do this experiment!

© TheCellularScale


ResearchBlogging.org
Bieneck S, & Krahé B (2011). Blaming the victim and exonerating the perpetrator in cases of rape and robbery: is there a double standard? Journal of Interpersonal Violence, 26(9), 1785-1797. PMID: 20587449




Tuesday, March 19, 2013

What is up with the "Dopamine Project"?

Someone is trying to make me eat my words.

yum. (source)
That someone is the Dopamine Project. I am on record as saying "It is better for the public to learn simplified bite-size science morsels than to learn nothing at all." And my specific example was that it's better for people to know that 'dopamine is a reward molecule' than to not even know the term dopamine.

But sometimes things just go too far. The "Dopamine Project" is a website run by Charles Lyell with a stated 'self-help' purpose:
"The Dopamine Project was founded to foster positive change by encouraging open-minded individuals to share readily available research into the connections between dopamine and a growing list of addictive behaviors." -About Tab
Doesn't sound too terrible, right? Share research about dopamine? Sign me up! ... However, I don't see ANY research, or even references to research, on this website. In fact, it's quite wootastic. Going through the posts, you find gems like

"A Message from you Dopamine Angel"

and

"Keeping a Dopamine Diary: Wrestling with Dopamine-Induced Ignorance"

It's all about how 'good dopamine' makes you want things you should want (food) and 'bad dopamine' makes you want things that will hurt you later (addictive drugs for example). Basically the website's message is a self-help, self-control one with the word dopamine sprinkled all over it.

The worst part is that not only does the website not include a single citation to a research paper, it actively rails against science.

"The future depends on how long it takes scientists to discover what they haven’t been interested in discovering so far. Rather than wait for the mainstream scientists and media to get started, we’re reaching out to anyone interested in fostering positive change by raising dopamine awareness." -Welcome to the Dopamine Project
Trust me, scientists want to understand dopamine. At the IBAGS conference, half the talks related to dopamine, and there is a conference completely devoted to dopamine coming up in May. The specific actions of dopamine are really, really complex, and scientists are working really, really hard to unravel them. This Charles Lyell guy is pulling out a typical woo card: implying that he knows what scientists don't want you to know.
"If the thought of fostering positive change through dopamine awareness triggers a shot of dopamine that brings a smile to your face, this might be your chance to be among the top .001% who go on record as the first to understand and apply what we know about dopamine to make a difference."  -Welcome to the Dopamine Project
He also seems to feel personally attacked by Steven Poole's New Statesman article on Neurobollocks.

Is the 'Dopamine Project' ridiculous and unscientific? Absolutely.

Is it harmful and dangerous to people? ... Honestly, I'm not sure. Reading it certainly makes me want to throw up, but there are worse things for pseudoscience to encourage than self-control. I'm not sure if I should devour my earlier words quite yet.

© TheCellularScale

To read more on the confusing line between science and pseudoscience, see Michael Shermer's Scientific American article:


ResearchBlogging.org
Shermer M (2011). What is pseudoscience? Scientific American, 305 (3) PMID: 21870452

Tuesday, March 12, 2013

Should reviewers be required to cite their sources?

When I got back from the IBAGS conference, I was greeted by a 'paper rejection email'.

Failure with a capital F (source)
I was disappointed of course, but I slept off my jetlag and then built my self-confidence back up by saving the universe. I will retool the paper and submit it somewhere else.

However, the reviews for this paper were particularly infuriating (aren't they always?). Here's a summary:

I say: "Thing X is true (citation, citation), so we did thing Y which uses thing X."

Reviewer says: "You act like thing X is true, but it's not (no citations)."

The reviewer did this for two specific aspects of the paper, saying that the basis for our model and our ideas just isn't true, but giving no citations. In both cases, I have citations in the paper to back up my claim that these things ARE true.

This particular form of irritating review has not happened to me before; I've always received well-cited responses to my claims. It's common courtesy to cite some papers when you say that someone is completely wrong about something, but I guess it's not required.

Anyone have any thoughts on this? Has it happened to you? Am I just having the normal 'grrr' response to a negative review?

© TheCellularScale
 

Monday, March 4, 2013

Honoring a Legend

The Cellular Scale is at the International Basal Ganglia Society meeting this week (#IBAGS2013), and finally has internet!

Sunrise over the Gulf of Aqaba (I took this picture)

It's already been two days of conferencing, and I plan to mainly write some follow up posts when I get back. But I will just briefly mention the "Lifetime Member" lecture that was given on the first evening of the conference.

Mahlon Delong (source)
This year's lifetime member is Mahlon DeLong.
I've written before about deep brain stimulation (DBS) as a treatment for Parkinson's Disease, and DeLong has done some fascinating work that led up to DBS in the subthalamic nucleus (STN).

One particular treat was to see a video during the talk of the very first attempt at alleviating Parkinson's symptoms through a subthalamotomy, the lesion of the subthalamic nucleus.

A Parkinson's Disease monkey was given the subthalamotomy on only one side of the brain, and the video shows Mahlon DeLong interacting with the monkey and noting that its treated side is less stiff than the untreated side. A second video shows the monkey later able to move its arm with no problems.

It was exciting to see this sort of 'moment of discovery' from 1989. There were no cries of "Eureka!" or anything; it was more of a 'hm, interesting' tone. You can actually hear his post-doc on the video saying (paraphrasing from memory) "the right side has better tone, at least Mahlon thinks so" and then starting to laugh.

(source)


One other cool thing about Dr. DeLong is that he is Muhammad Ali's physician.

© TheCellularScale

Sunday, February 24, 2013

Scientizing Art

I've always been fascinated with the way the eye moves around a piece of art.

Andrew Wyeth's "Christina's World" (or as I looked up "that painting of a girl in a field looking at a house")

This piece by Andrew Wyeth is an obvious example of an artist completely controlling your gaze. There are pretty much no options here: you look at the girl, then follow her gaze to the house. You probably then take a quick glance at the other house/barn to the left, and maybe follow the edge of the light circle around the houses. (It's my opinion that this is how the eye should travel over this painting, but I have no eye-tracking data to support it.)

A paper published last year in PLoS One tries to 'scientize' this process by testing which factors determine eye movements and the 'clusters' where the eye tends to fall. Massaro et al. (2012) compare dynamic and static images, and images that contain human subjects versus nature subjects. Their cluster analyses overlaid on classic paintings make for quite interesting images:

The next installment at MoMA

This one is a dynamic human image. Each patch of color shows a part of the painting where the eye lingers (face, hands, ...crotch...). The authors perform all sorts of interesting analyses on this and other paintings, having participants rate each painting for 'movement' or 'aesthetic value.' And since the paper is open access, anyone can read the whole thing here, even without university access to journal subscriptions.

One interesting finding is that pictures containing humans have fewer clusters than pictures of nature. I expect this is because certain aspects of humans (faces, hands, ...crotches...) are so salient that the brain focuses directly on them, while all the branches of a tree, for example, carry about equal 'meaning' for a person.
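To make the idea of counting fixation clusters concrete, here is a toy greedy clustering sketch: a fixation joins an existing cluster if it lands within some pixel radius of that cluster's first point, otherwise it starts a new cluster. The coordinates, radius, and 'face'/'hand' labels are all invented; the paper's actual cluster analysis is more sophisticated.

```python
# Toy illustration of fixation-cluster counting: group gaze fixations that
# fall within `radius` pixels of an existing cluster's founding point.
# All coordinates are invented; real data would come from an eye tracker.

def count_clusters(fixations, radius=50.0):
    """Greedy clustering: each (x, y) fixation joins the first cluster
    whose founding point is within `radius` pixels, else starts a new one."""
    centers = []
    for x, y in fixations:
        for cx, cy in centers:
            if ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 <= radius:
                break  # joins this existing cluster
        else:
            centers.append((x, y))  # founds a new cluster
    return len(centers)

# Invented fixations: tight groups around a 'face' and a 'hand',
# plus two isolated background glances
face = [(100, 120), (105, 118), (98, 125), (110, 122)]
hand = [(400, 300), (405, 305), (398, 298)]
background = [(250, 50), (600, 400)]

print(count_clusters(face + hand + background))  # → 4
```

This matches the intuition in the text: a few salient regions (face, hand) absorb many fixations into few clusters, while scattered background glances each count separately.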

science creates modern art
Another great image from this paper: the authors show, as a heat map, how much gazing was directed at different parts of a painting. This one is a human static image. The end result is actually quite haunting, because the place where you want to look is blanked out (sort of like a Magritte painting).

So here are my questions: If someone looks at a blank page, where does their eye naturally go? Is there some sort of common pattern that most people use just to scan an area? Do chimpanzees use a similar pattern to scan a blank page? Does everyone have their own unique scanning pattern? Or is it just pretty much random? 

And here's an idea for artists: Buy yourself an eye tracker and have customers come use it and stare at a blank page. Trace their eye movements and then create a dynamic painting (or T-shirt, or napkin drawing) that follows the person's natural scanning patterns. This would be the ultimate in commissioned custom art! (Then give me one for free, because I think this sounds like fun.)

© TheCellularScale

ResearchBlogging.org
Massaro D, Savazzi F, Di Dio C, Freedberg D, Gallese V, Gilli G, & Marchetti A (2012). When art moves the eyes: a behavioral and eye-tracking study. PLoS ONE, 7(5). PMID: 22624007


Thursday, February 14, 2013

It's not you, it's my birth control

So, Valentine's Day, what better time to question the foundations of your relationship?

It's my brain that loves you (source)
Well, part of your relationship may be based on your Major Histocompatibility Complex (MHC) compatibility. The MHC is a cluster of genes that determines which antigens get expressed on white blood cells, and it is thought to control the body's ability to recognize pathogens as 'other.' It is also thought that the more varied the genes in your MHC are, the more resistant you are to pathogens or parasites.

So what does the MHC have to do with your love life?
Well, the most popular theory goes like this: if you want to have a healthy baby, you want to give it a varied MHC, and therefore you want to find a man whose MHC is very different from your own.

And... maybe you can detect whether a man's MHC is the same as or different from yours through smell (maybe vision too). In 2005, a paper came out showing that MHC type can be detected through smell and, importantly, that women prefer the smell of men whose MHC differs from their own. (However, another paper in 2008 did not replicate this preference.)

Possible new fragrance? 

Now here's the real kicker: taking oral contraceptives (birth control pills) might mess this preference up. Roberts et al. (2008) show that in an armpit-sweat test (like this one), women on birth control showed more of a preference for MHC-similar men than women not on birth control. If true, this could have implications for women starting relationships when they are either on or off birth control. To take this to the greatest sensationalist extreme: you might pick the WRONG GUY because you were on birth control. However, just as I don't believe in destined, fated true love, I don't believe you need opposite MHCs to have a good relationship or healthy children.


Roberts et al. 2008 Figure 2C: Odor desirability ratings.
And not only that, I have somewhat of a problem with this graph and their data. As far as I can tell (I found the description pretty confusing), the white bars represent 'session 1,' in which NO ONE was on the pill, and the gray bars represent 'session 2,' when the women labeled 'pill' were actually on the pill but the women labeled 'non-pill' were still not on the pill. (Following this?) AND a score of 0 means they liked the MHC-similar and MHC-dissimilar guys equally, negative means they liked the similar guys more, and positive means they liked the dissimilar guys more... (I told you this was confusing.)

So my question is: why are the non-pill and pill users so different to begin with? Unless I am completely misunderstanding this graph, the white bars should be similar, as they both represent women who are not on birth control. The huge difference between groups before the experimental treatment should be a red flag: something is already different between these women.

However, the pill session 1 (white) and pill session 2 (gray) bars are indeed different, and that is the paper's main result: women on the pill had a slight overall odor preference for MHC-similar men, and the same women, when not on the pill, preferred MHC-dissimilar men.
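The preference score itself is easier to follow as a calculation than as a figure caption. Here is a minimal sketch of the scale as described above (mean desirability rating for MHC-dissimilar men minus that for MHC-similar men); the ratings and the before/after framing are invented for illustration:

```python
# Hypothetical illustration of the odor-preference score: 0 means no
# preference, positive means a preference for MHC-dissimilar men, and
# negative means a preference for MHC-similar men.
# All ratings below are invented.

def mhc_preference(similar_ratings, dissimilar_ratings):
    """Mean rating for MHC-dissimilar odors minus mean for MHC-similar odors."""
    return (sum(dissimilar_ratings) / len(dissimilar_ratings)
            - sum(similar_ratings) / len(similar_ratings))

# One woman's invented desirability ratings (1-7) before starting the pill...
print(mhc_preference(similar_ratings=[3, 2, 4], dissimilar_ratings=[5, 6, 5]))  # positive
# ...and after: the same index can flip sign
print(mhc_preference(similar_ratings=[5, 5, 6], dissimilar_ratings=[3, 4, 3]))  # negative
```

Framed this way, the paper's puzzle is clear: the two white bars (both measured before anyone was on the pill) should give similar scores, yet they don't.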

So should you worry this Valentine's Day? Should you break up with your boyfriend because you were on birth control when you met? Should you spend a lot of time smelling your boyfriend's worn shirts, analyzing how 'desirable' a scent they give off?

Probably not (unless you really like smelling sweaty shirts). There is more to relationship compatibility than histocompatibility, and making life-changing decisions based on possible olfactory disruptions due to birth control is just not a good idea.

Though if you are worried, you can read more about it at:

Context and Variation "will the pill mess up my ability to detect my one true love?"

and

First Nerve "pill goggles"

© TheCellularScale

ResearchBlogging.org
Roberts SC, Gosling LM, Carter V, & Petrie M (2008). MHC-correlated odour preferences in humans and the use of oral contraceptives. Proceedings of the Royal Society B: Biological Sciences, 275(1652), 2715-2722. PMID: 18700206

Sunday, February 10, 2013

Why scientists should play games

I have just finished reading Jane McGonigal's book Reality is Broken: Why Games Make Us Better and How They Can Change the World. It is a fascinating book that presents a strong case for games (including video games) doing good in the world.

Reality is Broken by Jane McGonigal

I have to admit, part of me wanted to read this book to make me feel better about my own video game habit. It certainly helped solidify the vague ideas I had about what good they might be doing me.

Specifically, the book made me think that scientists of all people might benefit greatly from playing games. There is one major reason why:

Games make you more resistant to failure

If there is one thing scientists need to persist in their research, it's resilience in the face of failure. If you didn't know this already, just start following some 'life in academia' bloggers on Twitter. Failure is a staple of scientific life.

Just yesterday I awoke to a small grant rejection. I started thinking about just how many things I have applied for during my (still new) scientific career, and what proportion of those applications resulted in rejections. I tallied it up on a chart (similar to a failure C.V.) and discovered that for about every 3.5 things I have applied for, only one was successful. This includes grant applications, travel fellowships, paper submissions and re-submissions, and miscellaneous things like applying to be an SfN Neuroblogger. (I did not include abstract submissions or applications to graduate school.) I actually think this is a relatively good ratio, and I expect it to get worse in the future, because the competition for the things I am applying for will only get tougher.

Part of the reason I wanted to calculate my success/attempt ratio was to see how many things I had actually applied for. I was glad that the list was long, and that I applied for lots of things, even if it means that my 'ratio' is the worse for it. I would posit that having a good success/attempt ratio is not really that great if you only ever apply for a few things that are easy to get.
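The tally itself is trivial to automate if you keep a running list of applications and outcomes. A minimal sketch, where every entry is an invented placeholder rather than my actual record:

```python
# A minimal failure-C.V. tally: count applications and successes and
# compute the attempts-per-success ratio described above.
# The entries below are invented placeholders, not a real record.

applications = [
    ("grant", False), ("grant", False), ("travel fellowship", True),
    ("paper submission", False), ("paper resubmission", True),
    ("grant", False), ("SfN Neuroblogger", True),
]

attempts = len(applications)
successes = sum(1 for _, succeeded in applications if succeeded)
print(f"{attempts} attempts, {successes} successes, "
      f"ratio = {attempts / successes:.1f} attempts per success")
```

Keeping the list itself, not just the ratio, is the point: a long list of attempts is evidence of applying widely, even if the ratio suffers for it.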

In science, you will fail; there is absolutely no scientist EVER who hasn't been rejected from something.

So back to games. Reality is Broken explains that games teach you to persist in the face of failure, and that games increase your optimism.

"Learning to stay urgently optimistic in the face of failure is an important emotional strength that we can learn in games and apply to our real lives. When we're energized by failure, we develop emotional stamina. And emotional stamina makes it possible for us to hang on longer, to do much harder work, and to tackle more complex challenges. We need this kind of optimism in order to thrive as human beings." -Reality is Broken, chapter 4

When I think of my own resistance to failure (which is decent, but could be better), I think of my time spent learning from games that failure is not the end of the world. Ever since I repeatedly failed to jump Mario over the first Goomba, video games were teaching me to try again, and again, and again.

Mario and Goomba level 1. (source)
Jane McGonigal brings up Tetris, one of the most popular video games of all time. Tetris is a game with no possible outcome except failure. You keep playing until you lose, and yet the game is immensely fun and ultimately rewarding. Each time you fail you want to try again, and you feel that you will probably do better next time.

In summary, games reward persistence and desensitize you to failure. When you play video games you learn implicitly that trying again is worth it and that failing isn't the end of the world. These skills are great to have in life and are essential to have in an academic career.

Reality is Broken lists 13 other ways that games 'fix' reality. Some of these fixes are about personal betterment (like persistence in the face of failure), but some of these fixes are about how games can ultimately change the larger reality. Games that combat global warming, for example, or games like Fold-It that actually further scientific progress and human knowledge. Whether you already play games or not, you can get something out of this book.

A nice addition to this book is the appendix "Practical advice for gamers," in which Jane McGonigal lays out some guidelines for getting the most out of games. For example, one rule is to never play more than 21 hours in a week. While video games have benefits, compulsive play can cause problems, and you shouldn't think that you are doing something healthy if you play video games for 50 hours a week and completely ignore reality. The idea is that playing games can help you function in reality; if you never venture into reality, you won't make any use of the benefits the games might have given you.


© TheCellularScale

Here are further reviews of Reality is Broken:

ResearchBlogging.org
Ferguson, C. (2011). Reality is broken, and the video game research field along with it. PsycCRITIQUES, 56 (48) DOI: 10.1037/a0026131
 
Farhangi, S. (2012). Reality is broken to be rebuilt: how a gamer’s mindset can show science educators new ways of contribution to science and world? Cultural Studies of Science Education, 7 (4), 1037-1044 DOI: 10.1007/s11422-012-9426-y