When my advisor told me that it takes a semester just to graduate, she seriously wasn't kidding.
Here's the math:
To graduate this December, I had to have everything signed and turned in on December 6th. Doesn't sound so bad, right?
But that basically means I have to defend in November.
And my college/department has a mandatory pre-defense which must be a MONTH before the real defense.
/*Note on the pre-defense: I am not sure how many universities require a pre-defense. It has some pros and cons.
Pros: you have everything in a state of readiness a month before you really need to, and if anything is glaringly horrible and you might not graduate because of it, you find that out before your defense and likely before you tell everyone you are going to graduate. The pre-defense is private, so your presentation is critiqued and likely better for the public real defense, which is to everyone's benefit.
Cons: you have to have everything ready a month earlier than you really need to. Your committee might use it as an excuse to tell you to do extra things, because you have a month. Your committee also has to sit through basically the same talk twice, which is sort of a waste of their time.
end of note*/
Thus the pre-defense must occur in October.
And your committee needs to read the dissertation before you pre-defend it, so you really need to give it to them 2 weeks before the pre-defense.
This means that to graduate in December, you basically need your dissertation in a state of readability and relative completeness by the end of September! And since the semester starts at the beginning of September, you have essentially 4 weeks of the fall semester to work on your dissertation.
And the same goes for a spring graduation.
If you want to graduate in May, you defend in April, pre-defend in March, and have everything turned in by the end of February.
Plan accordingly.
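As a sanity check, the backward scheduling above can be written as a tiny Python sketch. The dates and lead times are illustrative guesses based on my own December timeline; plug in your university's actual rules.

```python
# Work backward from the paperwork deadline to find the real "done" date.
# Lead times are rough (my December 2013 timeline); adjust for your school.
from datetime import date, timedelta

paperwork_due = date(2013, 12, 6)                  # everything signed and turned in
defense = paperwork_due - timedelta(weeks=3)       # real defense, in November
pre_defense = defense - timedelta(weeks=4)         # mandatory pre-defense, a month before
to_committee = pre_defense - timedelta(weeks=2)    # committee needs 2 weeks to read

for label, day in [
    ("Dissertation to committee", to_committee),
    ("Pre-defense", pre_defense),
    ("Defense", defense),
    ("Paperwork due", paperwork_due),
]:
    print(f"{label}: {day:%B %d, %Y}")
```

Run it and the "dissertation to committee" date lands in early October, which is exactly why the end of September is the real deadline.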
© DrCellularScale
Thursday, September 12, 2013
Use Imposter Syndrome to become an excellent grad student
Let's talk about Aristotle for a minute.
[Image: School of Athens. Aristotle is the one in blue.]
Many people mis-attribute this quote to him:

"We are what we repeatedly do. Excellence therefore is not an act, but a habit." -Will Durant

But really this quote is from someone summarizing Aristotle. It's a great summary, and it seems to say what Aristotle means, just more concisely.
Aristotle does say:

"For these reasons the virtues are not capacities either; for we are neither called good nor called bad, nor are we praised or blamed, insofar as we are simply capable of feelings. Further, while we have capacities by nature, we do not become good or bad by nature." (Nicomachean Ethics, Book II, 5.5)

Ok, so what does this have to do with grad school?
Well, lots of people are starting grad school right now with lots of potential. Tons of potential, probably; it's what got them into grad school in the first place.
But here's the thing: your potential doesn't mean anything unless you live up to it (or at least come close). Basically, Aristotle says that your feelings, intentions, and capabilities do not make you excellent; your actions do.
The real lesson here is that you ARE what you DO. If you want to be a good person, think 'what would a good person do in this situation?' and then do that thing. Simple, really. So in grad school, this translates to:
Make Imposter Syndrome work in your favor.
Imposter Syndrome is when someone thinks, 'I'm not good enough to be where I am, and I'm just minutes away from the moment my colleagues find out.' It is apparently a plague on many grad students, and there are plenty of blog posts around on how to combat it.
But guess what? Playing dress-up can make you smarter. People wearing a white coat described as a lab coat did better on attention tasks than people wearing the same white coat described as a painter's coat (Adam and Galinsky, 2012). These are the same researchers who did the perspective-taking experiments showing that when you pretend to be something, you become more like it. (See item number 4 on this post.)
Pretending to be what you want to be is actually a completely valid and useful way to become what you want to be. This doesn't mean go into class and pretend you are the professor (that's not a good idea). It means go into class and pretend you are the BEST student in that class.
So go put on those 'smart person clothes' and make believe that you are the best student that school has ever seen. If you run into a dilemma think to yourself 'what would an excellent grad student do in this situation?' or better yet think 'what would an excellent scientist do in this situation?' and then do that thing.
© TheCellularScale
Adam H, & Galinsky AD (2012). Enclothed cognition. Journal of Experimental Social Psychology. DOI: 10.1016/j.jesp.2012.02.008
Sunday, July 7, 2013
Male DNA in the Female Brain
When you are pregnant, people like to tell you all sorts of things about yourself.
"you are going to have a boy/girl"
"you are carrying high/low"
"you look like an olive on a toothpick/beached whale"
"you probably have some of your husband's DNA/baby's cells in your brain now."

[Image: probably the most complimentary thing I have been compared to.]

Huh?
That last one requires a little more explanation. How could new, foreign cells get into my brain? First of all, there is the blood-brain barrier, which prevents your own blood cells from getting mixed in with your neurons; second of all, there is the placental barrier, which prevents your blood from mixing with the baby's blood.
Neither of these barriers is perfect. Certain drugs and chemicals can cross the blood-brain barrier, and drugs and chemicals that a pregnant woman ingests can cross the placental barrier to reach the baby. But are these barriers so leaky that whole cells can get through?
Apparently they are. Dawe et al. (2007) explain possible ways this can happen.
The placenta develops with the fetus, and so it is a hotbed of new, growing cells early in pregnancy. It is made up of a combination of cells that contain the mother's DNA and cells that contain the new baby's DNA. However, it is not clear exactly how baby cells get transferred to the mom. In the authors' words:

"The mechanism by which cells are exchanged across the placental barrier is unclear. Possible explanations include deportation of trophoblasts, microtraumatic rupture of the placental blood channels or that specific cell types are capable of adhesion to the trophoblasts of the walls of the fetal blood channels and migration through the placental barrier created by the trophoblasts." (Dawe et al., 2007)

[Image: The placenta, up close. (Dawe et al., 2007, Figure 1)]

It is also not clear how these baby cells, once in the mother, could cross the blood-brain barrier. In fact, it was not perfectly clear (as of this 2007 paper) that these cells do get into the mother's brain in humans, though studies have shown fetal DNA-containing cells in the brains of mice.
So, in conclusion: if you have ever been pregnant, you probably still have some of that baby's DNA (and consequently some of the baby's father's DNA) in your body. If you were pregnant with a boy, then you probably have Y chromosomes in some of your cells! It even seems that mothers can transfer cells from previous babies into future babies, which means that if you have an older brother or sister, you might have some of their DNA in your body as well.
The next question is: do these foreign cells have a meaningful impact on your body?
© TheCellularScale
Dawe GS, Tan XW, & Xiao ZC (2007). Cell migration from baby to mother. Cell Adhesion & Migration, 1(1), 19-27. PMID: 19262088
Monday, June 10, 2013
The Ultimate Simulation
You may have noticed that things have slowed down here at The Cellular Scale.
The reason is that I have been really busy making the ultimate simulation of a human brain. I've worked on this day and night for the past 8 months. It is exhausting work and is starting to take all my energy now that it is nearly complete.
Pretty soon I will have to push this simulation out my uterus, and then all my effort will be spent on simulation support and maintenance. I may write some posts, but they won't be appearing regularly for a while.
© TheCellularScale
Monday, May 6, 2013
Everyone should learn everything.
Today I am getting on a bit of a soapbox about things. Specifically about things scientists should learn.
In an ideal world everyone would be good at everything, but as you have probably noticed, this is NOT the case. Some people are good at lots of things; some are really good at specific things but terrible at others; and some unfortunate people are generally bad at a lot of things and mediocre at a few.
[Image: Scientists should learn everything (source)]
Recently, I've been hearing increasing calls for scientists (or scientists-in-training) to learn X, whatever X is: "Scientists should learn art"; "Scientists should learn creative writing"; "Scientists should learn how to communicate with the public more clearly"; "Scientists should learn managerial skills"; and so forth.
This bothers me for a couple of reasons.
1. Why should scientists have to learn all this stuff? Why aren't people clamoring for artists to learn microbiology, or for novelists to brush up on their molecular genetics?
and
2. What is wrong with some people being good at science and NOT being good at much else?
Yes, if waving a magic wand could suddenly make scientists good communicators, artists, and managers, I wouldn't object. But these things (like science itself) take training. And god knows, graduate students already get a lot of training.
And yes, running a lab takes managerial skills, and grant writing requires clear communication and story-telling skills. But instead of requiring one person to be good at all these things, why not divide up the labor a little and have a 'lab manager' help run the lab and a 'departmental grants guru' help polish the grants?
It is really easy to say 'scientists should learn X' because...
1. there is a perception that scientists are smart and can learn things easily
and
2. it is impossible to argue that things wouldn't be better if scientists were good at X. (Wouldn't it be great if all scientists were excellent public speakers? Yes, of course.)
The problem is deciding how to implement the extensive training in X that a scientist supposedly should have, and which current training to replace. Therefore I propose that all 'scientists should learn X' statements be adjusted to 'scientists should get extensive training in X rather than Y'.
© TheCellularScale
Wednesday, April 17, 2013
Van Gogh was afraid of the moon and other lies
I remember the first time I realized just how easily false information gets spread about.
I was in French class in high school. Our homework had been to find one interesting fact about Van Gogh and tell it to the class. When it was my turn, I said some boring little fact that I no longer remember. My friend sitting behind me, however, had a fascinating fact: when Van Gogh was a young child, he was actually afraid of the moon.
The teacher and the class were all quite impressed and thought about how interesting that was and how that fact might be reflected in the way that he paints the Starry Night. Though this fact was new to everyone, including the teacher, no one even thought to question its truth.
In fact, the teacher was so enthralled by this idea that she passed the information on to all the other French classes that day.
When talking to my friend later that day, he admitted that he had not done the assignment, and just made the 'fact' up. I was completely surprised, not only that someone had not done their homework *gasp*, but that I hadn't even thought to question whether this was true or not.
Misinformation like this spreads like wildfire and is exceptionally difficult to undo. The more things you link a piece of information to in your brain, the more true it can seem; even after you learn that it's not true, you still might inadvertently believe it or fit new ideas into the context it creates. The myth that the corpus callosum is bigger in women than in men is just one of those things that is easy to believe.
[Image: A terrifying starry night]
[Image: The best lies have an element of truth (source)]
An interesting paper by Lewandowsky et al. (2012) explains how this kind of persistent misinformation is detrimental to individuals and to society, using the example of the claim that vaccines cause autism. That particular piece of misinformation is still widely believed despite numerous attempts to publicize the correct information and scientific findings showing no evidence for a link between the two.
The authors of this paper give some recommendations for making the truth more vivid and effectively replacing the misinformation with new, true information. For example:
"Providing an alternative causal explanation of the event can fill the gap left behind by retracting misinformation. Studies have shown that the continued influence of misinformation can be eliminated through the provision of an alternative account that explains why the information was incorrect." (Lewandowsky et al., 2012)

Misinformation can be replaced with information, but it takes more work to replace a 'false fact' than to have the truth out there in the first place. It is much better when misinformation is never spread around in the first place than when it has to be retroactively corrected.
This paper is also covered over at The Jury Room.
© TheCellularScale
Lewandowsky, S., Ecker, U., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. DOI: 10.1177/1529100612451018
Tuesday, April 2, 2013
Reviewers and Citations: an update
A few weeks ago, I asked whether reviewers should be required to cite sources in their summary statements for submitted journal articles.
I was frustrated about my paper reviews (not an uncommon sentiment) because one reviewer made some serious claims against my basic assumptions, but did not include any citations to back these claims up. A few people in the comments suggested that I write to the editor.
So I did. I wrote a polite letter saying that citations from the reviewer would be helpful. The editors wrote to the reviewer and then wrote back to me with the reviewer's reply. I was actually surprised at this, because I had a feeling that once the editor puts a paper on the 'will not consider for publication' list, they might not respond at all. But I got some citations from the reviewer, so I could find out where I got lost in my own literature search. And much to my delight, after scouring the papers the reviewer cited, I still feel that I am right and that my paper has a sound foundation. I have now re-submitted the paper (to another journal) and feel pretty confident about it.
Though of course I always feel more confident about a submitted paper than I should.
© TheCellularScale
[Image: I heart citations (source)]

[Image: Submitting a beautifully written paper for review (what should we call grad school)]
Tuesday, March 12, 2013
Should reviewers be required to cite their sources?
When I got back from the IBAGS conference, I was greeted by a 'paper rejection email'.
I was disappointed of course, but I slept off my jetlag and then built my self-confidence back up by saving the universe. I will retool the paper and submit it somewhere else.
[Image: Failure with a capital F (source)]
However, the reviews for this paper were particularly infuriating (aren't they always?). Here's a summary:
I say: "Thing X is true (citation, citation), so we did thing Y which uses thing X."
Reviewer says: "You act like thing X is true, but it's not (no citations)."
The reviewer did this for two specific aspects of the paper, saying that the basis for our model and our ideas just isn't true, but giving no citations. In both cases, I have citations in the paper to back up my claim that these things ARE true.
This particular form of irritating review has not happened to me before; I've always received well-cited responses to my claims. It's common courtesy to cite some papers when you say that someone is completely wrong about something, but I guess it's not required.
Anyone have any thoughts on this? Has it happened to you? Am I just having the normal 'grrr' response to a negative review?
© TheCellularScale
Friday, January 11, 2013
On Selling and Over-Selling Science
[Image: Science!!! (source)]
All these questions rise up in science blogs and on twitter and then fade back into the background. Then something happens and a flurry of posts about communicating science float to the surface again.
I have decided to join this party, and have written a Guest Editorial at the Biological Bulletin.
It's called "On Selling and Over-Selling Science" and is about trying to find that perfect balance between communicating a scientific finding accurately and accessibly.
I'd love to hear new opinions on this. So feel free to follow the link and leave a comment about it here.
© TheCellularScale
I was not able to use my 'blogging name' like Neuroskeptic was, so here is the article and my identity along with it:
Evans RC (2012). Guest editorial on selling and over-selling science. The Biological Bulletin, 223(3), 257-258. PMID: 23264470
Thursday, December 27, 2012
Holiday Hiatus
If you have been wondering where new blog posts are, I am taking a holiday hiatus. I'll see you in January when biweekly posting will be resumed, and the cellular scale will celebrate its first birthday!
In the meantime, thank you for reading The Cellular Scale this year.
Thursday, December 20, 2012
Video game shooting vs Real shooting
Video game shooting is different from real shooting.
[Image: Battle Rifle, my Halo weapon of choice (source)]

[Image: As a female Halo player myself, I think these Lady Spartans are awesome! (source)]
Personally, I like shooter video games. I'm playing Halo 4 like the rest of the world right now and I played the heck out of Mass Effect earlier in the year. I have also shot real guns.
And guess what? Shooting real guns is just not really my thing. I find it a little bit scary and not that fun or exciting. The idea of going to a shooting range and shooting at paper targets for an hour sounds really boring to me. Shooting skeet, or something moving like an animal, also sounds pretty boring.
I am skeptical about the idea that the dopamine released during shooting video games transfers to more enjoyment while shooting real guns. I am willing to change my mind upon seeing some data, but having seen nothing to support this direct transfer, I don't think it exists.
This post is written in response to "Addicted to the Bang: The neuroscience of the gun." by Steve Kotler and Jim Olds. (They don't actually claim that dopamine release during video game shooting directly causes addiction to real shooting, but I think that someone might get that idea from the article.)
© TheCellularScale
Saturday, October 13, 2012
SfN Neuroblogging 2012: The long hard road to the Big Easy
You want to get to SfN on time, so here's a pro tip about how airlines work.
I know it's a little late for this, but maybe you can take heed of this advice for next year's SfN.
[Image: How do airlines work?]
Airlines oversell their flights, ALWAYS. Especially if everyone in the universe is trying to get to the same place for a gigantic conference. So just because you bought a ticket, you don't actually have a guaranteed seat on that flight.
I witnessed the horrible tragedy of a woman not getting to her satellite event talk because she didn't have a seat on a plane for which she had bought a seat. They asked my flight for 8 volunteers to bump to another flight to New Orleans. 8 people! That means that 8 people who BOUGHT tickets did not have seats on the plane.
They didn't have enough volunteers so some people were involuntarily bumped! Including our noble heroine who bravely decided to drive from our connecting city all the way to New Orleans to at least get there the same day, but unfortunately not in time to give her talk. (7 hour drive)
Ok so how can you avoid being in this situation?
As far as I can tell, the best way to lock in your seat is to check into your flight as early as possible. That usually means going online and printing your boarding pass. But that alone is not enough: the online check-in process usually opens 24 hours before the flight, so my advice is to set an alarm for exactly 24 hours before your flight and check in online right then.
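A quick sketch of that alarm arithmetic in Python (the departure time below is made up for illustration):

```python
from datetime import datetime, timedelta

# Hypothetical departure time for a conference flight
departure = datetime(2012, 10, 13, 9, 30)

# Online check-in typically opens 24 hours before departure,
# so the alarm should go off at exactly that moment.
checkin_opens = departure - timedelta(hours=24)

print(checkin_opens)  # 2012-10-12 09:30:00
```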
One of the airlines I ended up taking doesn't even automatically give boarding passes to the last 15 people who check in. They give 'security passes' so you can get through security and to the gate but you do not have official permission to board your flight.
So check in early and if you are presenting on Saturday afternoon get a flight on Friday. DO NOT count on everything going smoothly at the airport.
What I have just described is pretty much a worst-case scenario, and obviously plenty of people made it to New Orleans without much trouble (check out the #SfN12 tag to read about all the fun things people are doing there).
So yeah, I am writing this at the airport...but at least I'm not driving. And thank goodness my poster is not until Wednesday.
© TheCellularScale
Wednesday, September 26, 2012
you can't trust your brain: memory
"Flashbulb" memories are those vivid memories of specific salient events. The 'everyone remembers exactly where they were when...' sort of events. In the USA, and depending on how old you are, you might remember the assassination of JFK, or Martin Luther King Jr. in this way. In this century, most Americans remember exactly where they were when they heard about the 9/11 attacks on the world trade center and pentagon.
It's widely acknowledged these days that the brain is not really a safe place to store information. Memories of events change over time. But for a while the "flashbulb" memory was thought to be immune from the memory-altering properties of time. Think about your own memories of 9/11 or another highly meaningful event. I bet you are pretty certain about the details. I, for example, was in my second year of college and I know exactly who told me that the first tower was hit, exactly where I was standing on the quad, and exactly what class I was going to....
...or do I?
A study in 2003 tested the consistency of flashbulb memories over time and compared the details to 'control memories' of everyday events. They specifically recorded memories from people during the day after the 9/11 attacks, and then recorded memories of the same events from subsets of those same people 1 week, 6 weeks, and 32 weeks later. They found that the flashbulb memories did have different properties when compared to control memories, but that consistency was not one of them.
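As a toy illustration of what 'consistency' means here (my invented memory details and a simplified score, not the authors' actual method), each later report can be compared detail by detail against the day-after report:

```python
# Toy consistency scoring: compare each later report against the
# initial day-after report, detail by detail. All details invented.
initial_report = {"who_told_me": "roommate", "where": "quad", "class": "chemistry"}

later_reports = {
    7: {"who_told_me": "roommate", "where": "quad", "class": "biology"},
    224: {"who_told_me": "friend", "where": "dorm", "class": "biology"},
}

def consistency(initial, later):
    """Fraction of details in the later report that match the initial one."""
    matches = sum(initial[k] == later.get(k) for k in initial)
    return matches / len(initial)

for days, report in later_reports.items():
    print(f"day {days}: {consistency(initial_report, report):.2f}")
```

With made-up details like these, the score simply drifts downward over time, which is the pattern the study measured in both flashbulb and everyday memories.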
In their Figure 1a, Talarico and Rubin show that the flashbulb memories and the everyday memories had the same time-dependent decay (the x-axis is in days), demonstrating that the flashbulb memory did not have some special property that protected it from corruption.
However, they did find that the level of confidence in the memory was higher for flashbulb memories than for everyday memories. People thought (incorrectly) that their memories of the 9/11 attacks were more accurate than their other memories.
So again we learn the lesson that we cannot trust ourselves.
In the authors' words:
"The true 'mystery,' then, is not why flashbulb memories are so accurate for so long,... but why people are so confident in the accuracy of their flashbulb memories." Talarico and Rubin (2003)
But I think the most interesting finding in this paper was that the flashbulb memories of 9/11 were more likely to be recalled 'through one's own eyes' than the everyday memories. Everyday memories were seen 'through one's own eyes' at the beginning and at 1 week, but at 6 and 32 weeks they were more likely to be seen 'from an outside observer perspective.' The flashbulb memories, on the other hand, were seen 'through one's own eyes' at all time points. Indeed, when I think of my own 9/11 memory, I still see it through my own eyes.
The authors don't go into why that might be or what it might mean, so we are left to wonder.
© TheCellularScale
Talarico JM, & Rubin DC (2003). Confidence, not consistency, characterizes flashbulb memories. Psychological science, 14 (5), 455-61 PMID: 12930476
Monday, August 6, 2012
Let me answer your questions: Part 1
One of the most entertaining things about writing this blog is seeing the search terms that people have used to find it. Some of them leave me completely befuddled ("Giraffe eating man"), and some are amusing combinations of words that happened to be in a particular post. But my favorite ones take the form of questions.
I am pretty sure that most of the questions people have used to find my blog are not answered in my blog, and I hate the thought of disappointing people. So I thought I would take some time and answer a few of my favorites.
(All of these are actual 'search terms' that showed up on my blogger stats, I am not making these up.)
1. "Do neurons make you smart?"
This is a surprisingly interesting question. My answer is probably. All the evidence points to us needing neurons to think. When neurons get damaged in certain parts of the brain, things start going badly in the 'smart' department. If you had no neurons, you probably couldn't think.
There are two caveats that make this question interesting.
1. Sometimes things that do not have neurons act 'smart' (see my post about the almost-neurons of the Venus Fly trap)
2. If neurons make you smart, they also make you stupid. You need neurons to perform a 'stupid' action just as much as you need them to perform a 'smart' one. For example, when I am trying to drive somewhere I don't often go, I might accidentally find myself on my way to work instead. I might think, "how stupid of me, I wasn't even thinking," and it would be true: the stupid action of turning the wrong way happens because my striatal neurons have encoded the drive to work so strongly.
2. "Does Shrek wear pants?"
This question directed someone to my post "How animals, Shrek, and Yoda stimulate your neurons."
And yes, in fact, Shrek does wear pants, though I had to google that term myself to find out. The issue here is that Shrek's pants are a dark olive color, close in hue to his skin tone, making it hard to tell and remember that he is wearing pants under that short tunic thing.
UPDATE: 8/6/12 I have been informed by an astute reader that Shrek does not in fact wear 'pants' but rather 'tights'. I suppose this is a more accurate description of his attire, so I formally apologize for spreading Shrek-related misinformation.
3. "Why is it better to play female Mass Effect?"
This is sort of answered by "4 reasons all women should play Mass Effect" But the question is specifically referring to playing the game as a woman rather than as a man.
A lot of people play Mass Effect as a woman because they think the voice-acting is better. (I agree with this)
I imagine most women play Mass Effect as a woman for the same reason most men play Mass Effect as a man. It's more fun to be a character when you can relate yourself to the character. I think it would be great if everyone played Mass Effect as a woman in a perspective taking experiment.
And I think Commander Shepard makes an excellent role model for young girls aspiring to one day save the galaxy.
(FemShep Barbie, pure genius from Introverted Wife, would be such great friends with my Computational Neuroscientist Barbie!)
Readers, I hope you have found this post informative. I plan to continue answering these important questions for you in the future.
© TheCellularScale
Monday, July 30, 2012
I Know Why the Caged Rat Runs
I know what the caged rat feels, I say,
when the moon shines bright upon the brush
when the sunset sends out its last ray
with the earth still pulsing the warmth of day.
When the spring comes quietly, in no rush
and the seeds emerge as delicious meals,
I know how the caged rat feels
I know why the caged rat builds its nest
when the bedding will just be changed,
its work its effort its strength invest
in a task that requires its all, its best
each time its world gets rearranged,
it goes back to work with little rest
I know why the caged rat builds its nest
I know why the caged rat runs, I know
paws beating, pounding on the wheel
thinking, knowing it has a place to go
a goal to reach, something to show
it stops for moments to let wounds heal
then resumes its race for its reasons
I know why the caged rat runs.
Wednesday, June 27, 2012
Science + Art at Artomatic
As much as I may complain about misrepresentations of literature in science or misrepresentation of science in entertainment, I love artwork inspired by science. Which is why I was delighted by the many science and art connections to be seen at Artomatic this year.
The work I was most excited to see was from Artologica. Michele Banks makes gorgeous watercolor paintings of neurons and microbes. I love how she brings out the natural beauty of bacteria. It reminds me how beautiful and sufficient the natural world is.
Another fantastic exhibit was by Sarah Noble, a research scientist at NASA and artist.
She has some amazing portraits of planets and abstract rockets, but I really love her 'earth from space' series, especially "Our Earth." I love the stark whites, the hint of blue on the earth, and the feeling of loneliness it evokes. It reminds me of the scene in Ursula K. Le Guin's The Left Hand of Darkness where two characters are traveling alone on a glacier that extends as far as the eye can see.
The 30 Computers Project uses discarded computer parts to make large 3D models of viruses and molecules and other exciting science things!
You can go HERE to see how they made this large sculpture.
Another favorite was Erika Rubel's Had Matter kitchen bugs. These insects, made with salvaged vintage kitchen utensils, remind me of the little steampunk robots in Girl Genius. The only thing that could make them cooler is if they were controlled by rat neurons.
Another delightful exhibit was from Duncan Guthery, quite possibly the coolest 11 year old ever.
He makes lego mosaics of familiar characters, including a Street Fighter character.
Also on exhibit were the Beatles' Yellow Submarine and a big Totoro. Not exactly science-related, but turning legos into pixels is pretty cool.
And finally, the great Peep Diorama Contest submissions were all on display. Although the "OccuPeepDC" diorama won the contest, my favorite was the peep CERN lab.
There was so much more at Artomatic than I can possibly cover here. I was there for 4 hours or so and still only managed to see 3 of the 11 floors, each bursting with art exhibits. I am sure I missed some amazing science-related art. If you were there, or are a science-related artist, please comment or email to let me know about your work.
© TheCellularScale
(I took all of the pictures here except "our earth" which I got from Sarah Noble's website)
Wednesday, June 13, 2012
A note about comments
I recently started moderating comments on this blog, but at the same time I changed something about my email notifications from blogger. So, long story short: while I was sitting around wondering why no one was commenting on my blog, a small trove of interesting, funny, and insightful comments was piling up behind a hidden email curtain.
I have fixed this problem and published the comments. Sorry about the delay, and rest assured that future comments will be moderated and published in a much more timely fashion.
The Cellular Scale
Tuesday, April 17, 2012
Why I type in Dvorak and you should too
The Dvorak keyboard is an alternative to the traditional Qwerty layout. Proponents (like me) claim that it is faster and easier to use. Dvorak himself claimed in a 1943 National Business Education Quarterly paper "There is a better typewriter keyboard" that experts could type 35% faster in the Dvorak layout than in the Qwerty layout. (value cited in this paper, I could not locate original)
I started using Dvorak during my freshman year of college because some guy told me it was cool. I converted my computer's keyboard format to Dvorak and re-arranged all the keys of my 1st generation iMac.
[Image: I feel old.]
I was not much of a 'typer' before attempting Dvorak. I was a step above 'hunt and peck' (I used multiple fingers), but I couldn't type without looking at the keyboard. It wasn't long before I became much faster typing in Dvorak than in Qwerty, and could touch-type for the first time in my life.
I now change all computers I use to Dvorak, but do not change the physical keys on the keyboard. This has resulted in some lovely events such as my work-study boss in college thinking her computer was 'haunted' because I forgot to change the format back before leaving the office. It has also resulted in some embarrassing moments for me when I am forced to return to a Qwerty layout. During a presentation on some new neuro-software, I volunteered to test it out. This was a bad idea, because of course the presenter's computer was set to Qwerty. I not only typed super-slowly, but I couldn't put in a familiar password at one point. I knew the password by touch, and without the letters showing up as feedback, I literally could not type it correctly.
Despite the occasional problem, I love typing in Dvorak. I find it much easier and more natural than typing in Qwerty. However, since I have been typing in Dvorak since iMacs were cool, my favoritism is probably due to familiarity more than some inherent 'betterness.' I can hardly be objective here.
For some real objective analysis we need some peer-reviewed studies. Luckily the Human Factors and Ergonomics Society cares about this sort of thing.
In a 2009 paper Anderson et al. investigated just how steep the learning curve was for a variety of alternative keyboards.
[Anderson et al., 2009, Figure 1: chord, contoured split Qwerty, Dvorak, and angle split Qwerty keyboards]
[Anderson et al., 2009, Figure 3]
This study says nothing about how 'experts' type on any of these keyboards, so I decided to test myself.
Online, you can test your typing speed by typing in random words or passages for 1 minute.
I tried these tests 3 times each in Dvorak and Qwerty (alternating). Not surprisingly, I was much better in Dvorak.
[My results: open symbols = random words test, filled symbols = passages test]
The random words test is much easier than the passages test which includes punctuation, but in both tests I was faster in Dvorak.
But of course I don't type in Qwerty regularly, so this isn't exactly the right comparison. To rectify this, I got help from a Qwerty user who was kind enough to try the passages test 3 times for me. My Dvorak passages tests were slightly better than the Qwerty user's (filled red circles compared to blue squares). One person per group is hardly proof and couldn't even count as preliminary data, so don't quote this figure as evidence that Dvorak is faster or anything. It could just as easily be proof that people with brown eyes (me) are better typists than people with blue eyes (the Qwerty user). This was just some good old-fashioned dorky fun-with-data.
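If you want to tally your own trials the same way, here is a minimal sketch of the averaging (the words-per-minute numbers below are invented, not anyone's real scores):

```python
# Invented WPM scores: three trials per keyboard layout on the passages test
trials = {
    "dvorak": [68, 72, 70],
    "qwerty": [41, 39, 43],
}

# Mean words per minute for each layout
means = {layout: sum(scores) / len(scores) for layout, scores in trials.items()}

for layout, mean in means.items():
    print(f"{layout}: {mean:.1f} wpm")  # e.g. "dvorak: 70.0 wpm"
```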
If you want to add data points to my table, go ahead and take the typing tests yourself:
Random words
Passages
Both sites are annoyingly stuffed with ads, but you can take the test without clicking on any of them.
Then let me know whether you are a Dvorak or Qwerty user, which test you took, and how many words per minute you typed.
© TheCellularScale
Monday, April 2, 2012
3 months of blogging
I love reading other blog posts about ridiculous scientific (and unscientific) claims. They are usually entertaining and always informative. (3 notable examples: Neuroskeptic, Respectful Insolence, Neurocritic)
Originally when I started this blog (waaaay back in January 2012), I thought I would do something similar, find outrageous claims in the press or the scientific literature and explain what was wrong with them. The "Cellular Scale" was supposed to imply the weighing of these claims and judging them on their scientific worth. This name would have been delightfully clever if I had actually stuck to this original plan.
I suppose there are 3 reasons why this didn't happen:
1. I didn't immediately find many outrageous claims specific to neurons (most of the claims are a little 'zoomed out' from the cellular scale and involve whole human brain areas), so I only managed to produce one (not very) skeptical post.
2. I got sidetracked by all the - totally - cool - things - that - cells - do.
3. Mass Effect 3 came out, so I had to play it and blog about it....twice.
I am pretty happy with how the blog is going so far because I have:
A. Stuck to cellular-level neuroscience for the most part. Even though my 3 most - popular - posts are not about cells at all.
B. Posted something about twice a week.
C. Not run out of ideas. I was worried about this at first, but now every time I hear something interesting I think 'I could blog about that,' and I actually have a list of ideas that is growing faster than I am posting.
Over the next 3 months I want to:
i. Get back to my original plan and clear up some misconceptions people might have about cells.
ii. Get more comfortable on Twitter. Right now it is like being at a party eavesdropping on a super-interesting conversation between people I don't know.
iii. Post more pictures of my dog.
Thanks for reading!
TheCellularScale
© TheCellularScale
Friday, March 23, 2012
How to Tell a Story: Science Edition
Recently I watched a really great Ph.D. dissertation defense, and it got me thinking:
What was so great about it?
While many factors go into a good presentation (and you can read all about them at Neurodojo: the Zen of Presentations, parts 1 through 1 bazillion), I think there is a single golden rule to which all others are secondary:
Use as few words as possible.
This doesn't mean 'say as little as possible'.
It means say a lot, but make your points using the minimum number of words necessary.
Aside from the obvious cutting out 'ummms' and 'likes' that can be distracting, simply saying what you are going to say without caveats and without extra phrases is always the best way to go.
Here's a secret about me: In my pre-neuroscience life, I spent 2 years teaching special education at an elementary school. This job involved herding distracted children and trying to teach them things in the most interesting and engaging way possible. The idea being that if the students are engaged with the lesson they will be less likely to throw their pencil box across the room or knock their desk over (both unfortunately frequent occurrences).
One infinitely transferable lesson I learned was 'how to tell a story'.
The method was something like this:
1. Read the story to yourself.
2. Write the story down in 10 sentences.
3. Write the story down in 5 sentences.
4. Write the story down in 1 sentence.
5. Tell the story at the level of detail appropriate for the situation.
Simple, right? And obviously applicable to scientific communication. This is a step-by-step method for crafting a good elevator story. It is also something everyone should do before making a poster, giving a presentation, or even writing a paper. In fact, you should stop what you are doing right now and try to write down your dissertation or current project in 10 sentences, 5 sentences, and 1 sentence.
Becoming an expert on something is not simply knowing all the details about it. It's also knowing which ones are critical to the main point and which ones are not.
© TheCellularScale