Friday, May 16, 2014

LMAYQ: Cellular Cells

Time to get back to Answering Some Questions, where I attempt to answer the search-engine queries that have led you to The Cellular Scale. I mainly try to answer questions that I am sure are not answered anywhere on this blog.

1. "What does a cellular cell look like?" 

Good question. First of all, what is a cellular cell? Is it different from a regular cell?
Cells look like all sorts of things. Some look like footballs, some look like sea coral. Some cells look like little rafts, drifting down a river.

Blood rafts (source)

Hitch a ride on a blood cell this summer (source)

2. "What song does Shrek sing in Shrek 1?"

 I have no idea. However, I remember the song "I like big butts" being in that movie. I am not certain that Shrek sings it. It might be the donkey.

As a side note, I am glad that this blog has become the go-to place for Shrek questions. After all, Shrek and Yoda do stimulate your neurons.

3. "Why does golgi not stain all cells?"

This is a real true very good question. The strength of the Golgi stain lies precisely in its sparse labeling. If it labeled all cells it would be useless because you wouldn't be able to see the elaborate morphology of a neuron's dendrites.

However, even though it has been around for roughly 140 years, it is still unclear why the Golgi stain labels only a subset of cells. It is not clear how it 'decides' to stain one cell and not the one next to it. This is always a bit of a problem for people wanting to use the Golgi stain, because you always have that nagging feeling that maybe you are seeing only the sick neurons, or only the neurons with some random undiscovered quality X, and so forth. But it is a well-respected technique, and journals regularly publish scientific articles that rely on the Golgi stain.

© TheCellularScale

Saturday, February 15, 2014

A Hop, Skip, and a pre-synaptic Patch

This new technique is just too cool not to blog about. 

Novak et al. 2013, Figure 1A: pre-synaptic patch clamp

The synapse is the connection between two neurons. The pre-synaptic part is from the neuron sending a signal and the post-synaptic part is from the neuron receiving the signal.

If you want to learn about the connection between two neurons, you want to know what is happening on both sides of the synapse. It's relatively easy to record signals from the post-synaptic side using patch clamp or sharp electrode recording, but it is much, much harder (basically impossible until now) to record from the pre-synaptic side.

Wednesday, February 5, 2014

When being mean is actually being nice ... and when it is just being mean

It's a harsh world out here in academia. We all already know that academic science is not a carebear tea party, and apparently things are now worse than ever as far as potential jobs and funding for Ph.D.s go.

A brief interruption, for an ad:
Use Grammarly's plagiarism checker online because it's better to have a computer criticize you than a person.
 And now back to your regularly scheduled blogramming.

Scicurious has a great post up at Neurotic Physiology about what it is like to be out in the 'real world' and out of academia. She has some fascinating points about how academia has skewed her perspective on things, but one in particular jumped out at me: That now she has to re-learn how to take criticism.

Scicurious says:
"I remember a time when I took criticism well. I did a lot of theater and music, it was something you HAD to take well. I took it, I improved, worked harder, fixed things, and did better. Sometime during grad school, however, criticism began to paralyze me. Every critique felt like a critique of me, as a scientist. Since a scientist was what I WAS, all criticism began to feel like criticism of me, as a person. Sometimes it was indeed phrased that way. You are careless. You are not smart enough, why don't you get this?! You are not focused."
This got me thinking because, honestly, I feel exactly the opposite. I think I learned how to take criticism in grad school partly by learning how to give it.

It's for your own good! (source)

When I am editing a paper or grant for someone, I am trying to help them. The more critical I am, the better their paper or grant will be. The paper is headed to peer review, which will determine whether it gets published or not, and the grant is headed to a study section, which decides whether it gets funded. Both are grueling and rigorous examinations of quality and scientific merit. These review processes are so important because published papers and funded grants are 'science currency' and will determine your future. In some cases the funding status of a grant can determine whether a lab stays open or a PI gets tenure.

If there is a paragraph that doesn't make sense, or (gasp) a typo, it is obviously better for me to catch it than for someone important to see it and get confused or frustrated.

Understanding this concept, that constructive criticism is the nicest thing one scientist can do for another, taught me to take criticism much better than I had previously. I was one of those students who was always 'better enough' than the other students that teachers rarely bothered to push me to true excellence. So I was really not used to criticism, and the initial slings and arrows in graduate school did sting. However, at one point it really sank in that these criticisms were making me better... better at everything: writing, presenting, scientific thinking.

So that is when being mean is actually being nice.

That said, I never had anyone tell me I wasn't 'smart enough,' as Scicurious describes. Constructive and thorough critique can sound mean while actually being nice, but that doesn't mean there is no such thing as 'meanness' in academia.

Sometimes being mean is really just being mean. A criticism that does not help me improve in any way is just mean. 'You are not smart enough to be a scientist' does not help anyone become a better scientist. It is a completely different kind of criticism from 'you really need to read more about X, because you don't understand how X works.' Both are directed at 'you' personally, but one says you can't do it, while the other says you can do it and even suggests how.

© TheCellularScale

Monday, December 9, 2013

It takes a semester just to graduate

When my advisor told me that it takes a semester just to graduate, she seriously wasn't kidding. 

Here's the math:

To graduate this December, I had to have everything signed and turned in on December 6th. Doesn't sound so bad, right?

But that basically means I have to defend in November.

And my college/department has a mandatory pre-defense which must be a MONTH before the real defense.

/*Note on the pre-defense: I am not sure how many universities require a pre-defense. It has some pros and cons.
Pros: you have everything in a state of readiness a month before you really need to, and if anything is glaringly horrible and you might not graduate because of it, you find that out before your defense and likely before you tell everyone you are going to graduate. The pre-defense is private, so your presentation is critiqued and likely better for the public real defense, which is to everyone's benefit.  
Cons: you have to have everything ready a month earlier than you really need to. Your committee might use it as an excuse to tell you to do extra things, because you have a month. Your committee has to sit through basically the same talk twice, which is sort of a waste of their time.
end of note*/

Thus the pre-defense must occur in October

And your committee needs to read the dissertation before you pre-defend it, so you really need to give it to them two weeks before the pre-defense.

This means that to graduate in December, you basically need your dissertation in a state of readability and relative completeness by the end of September!  And since the semester starts at the beginning of September, you have essentially 4 weeks of the fall semester to work on your dissertation.

And the same goes for a spring graduation.
If you want to graduate in May, you defend in April, pre-defend in March, and have everything turned in by the end of February.

Plan accordingly.
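For the scheduling-inclined, the backward math above can be sketched in a few lines of Python. The function name and the exact offsets are my own rough assumptions based on the deadlines described in this post (defend about two weeks before paperwork is due, pre-defend a month before that, dissertation to the committee two weeks earlier still):

```python
from datetime import date, timedelta

def dissertation_deadlines(paperwork_due: date) -> dict:
    """Work backward from the final paperwork deadline.

    Offsets are rough assumptions taken from the post, not official
    university policy.
    """
    defense = paperwork_due - timedelta(weeks=2)       # defend in November
    pre_defense = defense - timedelta(days=30)         # mandatory month gap
    to_committee = pre_defense - timedelta(weeks=2)    # committee reading time
    return {
        "dissertation to committee": to_committee,
        "pre-defense": pre_defense,
        "defense": defense,
        "paperwork due": paperwork_due,
    }

# The December 2013 example from this post:
for milestone, when in dissertation_deadlines(date(2013, 12, 6)).items():
    print(f"{milestone}: {when}")
```

Swap in your own paperwork deadline to see how little of the semester is actually left for writing.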

© DrCellularScale


Thursday, November 7, 2013

Official SfN Neurobloggers 2013

Due to starting a major simulation this summer, I am not going to the annual Society for Neuroscience meeting in San Diego this year, and therefore I won't be neuroblogging it like I did last year.

I look forward to reading the posts and tweets from the official neurobloggers this year.

Here they are:

From Brains to Beyonce by @Spork15

House of Mind by @houseofmind

Neuron Physics by @Eric_Melonakos

NeuroscienceDC by @NeuroscienceDC

Neurolore by @TheMrsZam

NeuroCultureBlog by @LaSaks87

Churchland lab by @anne_churchland

Dormivigilia by @beastlyvaulter

Neurorexia by @ShellyFan

On Psychology and Neuroscience by @astroglia

Más Ciencia por México by @mrenteria_

Imagining Science by @DrImmySmith

Corona Radiata by @JohnKubie

Follow the #SfN13 hashtag on twitter to find all the unofficial coverage of the conference.

© TheCellularScale

Thursday, September 12, 2013

Use Imposter Syndrome to become an excellent grad student

Let's talk about Aristotle for a minute.

School of Athens. Aristotle is the one in blue.

Many people misattribute this quote to him:

"We are what we repeatedly do. Excellence therefore is not an act, but a habit." -Will Durant
But really this quote is from Will Durant summarizing Aristotle. It's a great summary, and it seems to say what Aristotle means, just more concisely.

Aristotle does say:
"For these reasons the virtues are not capacities either; for we are neither called good nor called bad, nor are we praised or blamed, insofar as we are simply capable of feelings. Further, while we have capacities by nature, we do not become good or bad by nature." Nicomachean Ethics Book II 5.5
Ok, so what does this have to do with grad school?

Well, lots of people are starting grad school right now with lots of potential. Tons of potential, probably; it's what got them into grad school in the first place.

But here's the thing: your potential doesn't mean anything unless you live up to it (or at least come close). Basically, Aristotle says that your feelings, intentions, and capabilities do not make you excellent; your actions do.

The real lesson here is that you ARE what you DO. If you want to be a good person, think 'what would a good person do in this situation?' and then do that thing. Simple, really. So in grad school this translates to:

Make Imposter Syndrome work in your favor.

Imposter Syndrome is when someone thinks 'I'm not good enough to be where I am, and I'm just minutes away from the moment my colleagues find out.' It is apparently a plague among grad students, and there are plenty of blog posts around on how to combat it.

But guess what? Playing dress-up can make you smarter. People wearing a white coat described as a lab coat did better on attention tasks than people wearing the same white coat described as a painter's coat (Adam and Galinsky 2012). These are the same researchers who did the perspective-taking experiments showing that when you pretend to be something, you become more like it. (See item number 4 on this post.)

Pretending to be what you want to be is actually a completely valid and useful way to become what you want to be. This doesn't mean go into class and pretend you are the professor (that's not a good idea). It means go into class and pretend you are the BEST student in that class. 

So go put on those 'smart person clothes' and make believe that you are the best student that school has ever seen. If you run into a dilemma think to yourself 'what would an excellent grad student do in this situation?' or better yet think 'what would an excellent scientist do in this situation?' and then do that thing.

© TheCellularScale
Adam and Galinsky (2012). Enclothed Cognition. Journal of Experimental Social Psychology. DOI: 10.1016/j.jesp.2012.02.008

Tuesday, August 27, 2013

Philosophy of Computational Neuroscience

Just like experimental neuroscience, computational neuroscience can be done well or poorly.

computational models look beautiful (source)
This post was motivated by Janet Stemwedel's recent post at Adventures in Ethics and Science about the philosophy of computational neuroscience. There seem to be three views of the use of computational models in biology and neuroscience:

1. All models are bullshit.
2. Models rely on MATH, so of course they are right.
3. Some models are good and some are bad.

Obviously the first two are extremes, usually posited by people who don't know much about computational neuroscience, and I am clearly advocating the third view. The only problem is that it is hard to tell whether a model is good or bad unless you know a lot about it.

So here are some general principles that can help you divide the good and the bad in computational neuroscience.

1. The authors use the correct level of detail.

devil's in the details (source)
If you are trying to test how brain regions interact with each other, you don't need to model every single cell in each region, but you need to have enough detail to differentiate the brain regions from one another. Similarly, if you are trying to test how molecules diffuse within a dendrite, you don't need to model a whole cell, but you need to have enough detail to differentiate one molecule type from another. If you are trying to test how a cell processes information, you need to have a cell, as you may have learned in how to build a neuron.  Basically a model can be bad simply because it is applied to the wrong question.

2. The authors tune and validate their model using separate data.

When you are making a model you tune it to fit data. For example, in a computational model of a neuron you want to make sure your particular composition of channels produces the right spiking pattern. However, you also want to validate it against data. So how is tuning different from validating? Tuning is when you change the parameters of the model to make it match data. Validating is when you check the tuned model to see if it matches data. Good practice in computational neuroscience is to tune your model to one set of data, but to validate it against a different set of data.
For example, if a cell does X and Y, you can tune your model to reproduce X, and then check that the parameters that make it do X also make it do Y. Sometimes this is not possible; maybe there is not enough experimental data out there. But if it is not possible, you should at least test the robustness of your model (see point 3).
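Here is a minimal sketch of the tune/validate split, using a toy threshold-linear 'f-I curve' model. The model form, the synthetic data, and all parameter names are made up purely for illustration:

```python
import numpy as np

# Toy "f-I curve": firing rate = gain * max(current - threshold, 0).
def firing_rate(current, gain, threshold):
    return gain * np.maximum(current - threshold, 0.0)

rng = np.random.default_rng(0)
true_gain, true_threshold = 2.0, 1.0

# Two separate synthetic "recordings" standing in for experimental data.
i_tune = np.linspace(0, 5, 20)
r_tune = firing_rate(i_tune, true_gain, true_threshold) + rng.normal(0, 0.1, 20)
i_val = np.linspace(0, 5, 15) + 0.3   # a different stimulus set
r_val = firing_rate(i_val, true_gain, true_threshold) + rng.normal(0, 0.1, 15)

# TUNE: grid-search parameters against the tuning data only.
gains = np.linspace(0.5, 4, 50)
thresholds = np.linspace(0, 2, 50)
best = min(
    ((g, t) for g in gains for t in thresholds),
    key=lambda p: np.mean((firing_rate(i_tune, *p) - r_tune) ** 2),
)

# VALIDATE: check the tuned model against data it never saw.
val_error = np.mean((firing_rate(i_val, *best) - r_val) ** 2)
print(f"tuned (gain, threshold) = {best}, validation MSE = {val_error:.3f}")
```

The key point is that `r_val` never enters the tuning step; a low validation error is then evidence that the model captures the cell's behavior rather than memorizing one data set.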

3. The authors test the robustness of their model.

A robust computational model can be delicious (source)
One problem with computational models is that the specific set of parameters you've found by tuning the model might not be the 'right ones.' In fact they probably aren't the right ones. There are many different sets of parameters that can make a neuron spike slowly, for example.  And the chance that you hit on exactly the correct combination of things is very low. But that doesn't mean the model is not useful. You can still use the model to test effects that are not strongly altered by small changes in these parameters. So you need to test whether the specific effect you are testing is robust to parameter variation. If you are testing effect Q, you can increase the sodium channels by 10%, or the network size by 20% and see if you still get effect Q. In other words is 'effect Q' robust to changes in sodium channels or network size? If it is, then great! Your effect is not some weird fluke due to the exact combination of parameters that you have used.
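A robustness check of this kind can be sketched as follows, again with a made-up toy model; the parameter names ('sodium channel density', 'network scale') and 'effect Q' itself are purely illustrative:

```python
import numpy as np

# Toy model: firing rate as a function of a "sodium channel density"
# parameter and a "network scale" parameter (made-up relationship).
def firing_rate(current, na_density, network_scale=1.0):
    return network_scale * na_density * np.maximum(current - 1.0, 0.0)

def effect_q(na_density, network_scale):
    """'Effect Q': the model fires faster at 4 nA than at 2 nA (toy claim)."""
    return firing_rate(4.0, na_density, network_scale) > \
           firing_rate(2.0, na_density, network_scale)

baseline_na, baseline_net = 2.0, 1.0
assert effect_q(baseline_na, baseline_net)  # effect holds at baseline

# Robustness check: perturb each parameter by +/-10% and +/-20%
# and see whether effect Q survives every combination.
robust = all(
    effect_q(baseline_na * (1 + dna), baseline_net * (1 + dnet))
    for dna in (-0.2, -0.1, 0.1, 0.2)
    for dnet in (-0.2, -0.1, 0.1, 0.2)
)
print("effect Q robust to parameter variation:", robust)
```

If `robust` comes back false for some perturbation, the effect may be a fluke of the exact parameter combination you tuned, which is precisely the worry described above.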

These are the main things I try to pay attention to, but I am sure there are other important things to keep in mind when making models and reading about them. What are your thoughts?

© TheCellularScale