Tuesday
Dec 14, 2010

The Autoimmune Genetics Laboratory in 2010

All the members of the Autoimmune Genetics Laboratory, at our end of year dinner.

Wednesday
Oct 13, 2010

The historic quandary of antibody production

The mechanism by which antibodies are formed was once one of the most perplexing mysteries of immunology. The capacity of the immune system to generate specific antibodies against any foreign challenge – even artificial compounds which had never previously existed – defied the known laws of genetics.

Three major models of antibody production were proposed before the correct model was derived. The first was the “side-chain” hypothesis put forward by Ehrlich in 1900, in which antibodies were essentially a side-product of a normal cellular process (Ehrlich 1900). Rather than a specific class of proteins, antibodies were just normal cell-surface proteins that bound their antigen merely by chance, and the elevated production in the serum after immunisation was simply due to the bound proteins being released by the cell so that a functional, non-bound, protein could take its place. In this model antibodies “represent nothing more than the side-chains reproduced in excess during regeneration and are therefore pushed off from the protoplasm”.

 

Figure 1. The “side-chain” hypothesis of antibody formation. Under the side-chain hypothesis, antibodies were normal cell-surface molecules that by chance bound antigens (step 1). The binding of antigen disrupted the normal function of the protein so the antigen-antibody complex was shed (step 2), and the cell responded by replacing the absent protein (step 3). Notably, this model explained the large generation of specific antibodies after immunisation, as surface proteins without specificity would stay bound to the cell surface and not require additional production. The model also allowed a single cell to generate antibodies of multiple specificities.

 

The “side-chain” model was displaced by the “direct template” hypothesis of Breinl and Haurowitz in 1930. Under this alternative scenario, antibodies were a distinct class of proteins but with no fixed structure. The antibody-forming cell would take in antigen and use it as a mould on which to cast the structure of the antibody (Breinl and Haurowitz 1930). The resulting fixed-structure protein would then be secreted as an antigen-specific antibody, and the antigen reused to create more antibody. Unlike the “side-chain” hypothesis, the “direct template” hypothesis explained the enormous potential range of antibody specificities and the biochemical similarities between them, but it lacked any mechanism to explain immunological tolerance.

 

Figure 2. The “direct-template” hypothesis of antibody formation. The direct-template hypothesis postulated that antibodies were a specific class of proteins with highly malleable structure. Antibody-forming cells would take in circulating antigen (step 1) and use this antigen as a mould to modify the structure of antibody (step 2). Upon antibody “setting”, the fixed structure antibody was released into circulation and the antigen cast was reused (step 3). In this model specificity is cast by the antigen, and a single antibody-producing cell can generate multiple different specificities of antibody. 

 

A third alternative model was put forward by Jerne in 1955 (Jerne 1955). The “natural selection” hypothesis is, in retrospect, quite similar to the “clonal selection” hypothesis, but uses the antibody, rather than the cell, as the unit of selection. In this model healthy serum contains minute amounts of all possible antibodies. After exposure to antigen, those antibodies which bind the antigen are taken up by phagocytes, and the antibodies are then used as templates to produce more antibodies (the reverse of the “direct template” model). As with the “direct template” model, this hypothesis was useful in explaining many aspects of the immune response, but strikingly failed to explain immunological tolerance.

 

Figure 3. The “natural selection” hypothesis of antibody formation. The theoretical basis of the natural selection hypothesis is the presence in the serum, at undetectable levels, of all possible antibodies, each with a fixed specificity. When antigen is introduced it binds only those antibodies with the correct specificity (step 1), which are then internalised by phagocytes (step 2). These antibodies then act as a template for the production of identical antibodies (step 3), which are secreted (step 4). As with the clonal selection theory, this model postulated fixed-specificity antibodies; however, it allowed single cells to amplify antibodies of multiple specificities.

 

When Talmage proposed a revision with more capacity to explain allergy and autoimmunity in 1957 (Talmage 1957), Burnet immediately saw the potential to create an alternative cohesive model, the “clonal selection model” (Burnet 1957). The elegance of the 1957 Burnet model was that by maintaining the basic premise of the Jerne model (that antibody specificity exists prior to antigen exposure) and restricting the production of antibody to at most a few specificities per cell, the unit of selection becomes the cell. Critically, each cell will have “available on its surface representative reactive sites equivalent to those of the globulin they produce” (Burnet 1957). This would then allow only those cells selected by specific antigen exposure to become activated and produce secreted antibody. The advantage of moving from the antibody to the cell as the unit of selection was that concepts of natural selection could then be applied to cells, both allowing immunological tolerance (deletion of particular cells) and specific responsiveness (proliferation of particular cells). As Burnet wrote in his seminal paper, “This is simply a recognition that the expendable cells of the body can be regarded as belonging to clones which have arisen as a result of somatic mutation or conceivably other inheritable change. Each such clone will have some individual characteristic and in a special sense will be subject to an evolutionary process of selective survival within the internal environment of the cell.” (Burnet 1957)

 

Figure 4. The “clonal selection” hypothesis of antibody formation. Unlike the other models described, the clonal selection model limits each antibody-forming cell to a single antibody specificity, with the antibody presented on the cell surface. Under this scenario, antibody-forming cells that never encounter antigen are simply maintained in the circulation and do not produce secreted antibody (fate 1). By contrast, those cells (or “clones”) which encounter their specific antigen are expanded and start to secrete large amounts of antibody (fate 2). Critically, the clonal selection theory provides a mechanism for immunological tolerance, based on the principle that antibody-producing cells which encounter specific antigen during ontogeny would be eliminated (fate 3).

 

It is important to note that while the clonal selection theory rapidly gained support as explaining the key features of antibody production, for decades it remained a working model rather than a proven theory. Key support for the model had been generated in 1958, when Nossal and Lederberg demonstrated that each antibody-producing cell has a single specificity (Nossal and Lederberg 1958); however, a central premise of the model remained pure speculation – the manner by which sufficient diversity in specificity could be generated such that each precursor cell would be unique. “One aspect, however, should be mentioned. The theory requires at some stage in early embryonic development a genetic process for which there is no available precedent. In some way we have to picture a “randomization” of the coding responsible for part of the specification of gamma globulin molecules” (Burnet 1957). Describing the different theories of antibody formation in 1968, ten years after the original hypothesis was put forward, Nossal was careful to add a postscript after his support of the clonal selection hypothesis: “Knowledge in this general area, particularly insights gained from structural analysis, are advancing so rapidly that any statement of view is bound to be out-of-date by the time this book is printed. As this knowledge accumulates, it will favour some theories, but also show up their rough edges. No doubt our idea will seem as primitive to twenty-first century immunologists as Ehrlich’s and Landsteiner’s do today.” (Nossal, 1969).

It was not until the research of Tonegawa, Hood and Leder that the genetic principles of antibody gene rearrangement were discovered (Barstad et al. 1974; Hozumi and Tonegawa 1976; Seidman et al. 1979), rewriting the one gene–one protein law of genetics and providing a mechanism for the most fragile of Burnet’s original axioms. The Burnet hypothesis, more than 50 years old and still the central tenet of the adaptive immune system, remains one of the best examples in immunology of the power of a good hypothesis to drive innovative experiments.

 

References

Barstad et al. (1974). "Mouse immunoglobulin heavy chains are coded by multiple germ line variable region genes." Proc Natl Acad Sci U S A 71(10): 4096-100.

Breinl and Haurowitz (1930). "Chemische Untersuchung des Präzipitates aus Hämoglobin und Anti-Hämoglobin-Serum und Bemerkungen über die Natur der Antikörper." Z Physiol Chem 192: 45-55.

Burnet (1957). "A modification of Jerne's theory of antibody production using the concept of clonal selection." Australian Journal of Science 20: 67-69.

Ehrlich (1900). "On immunity with special reference to cell life." Proc R Soc Lond 66: 424-448.

Hozumi and Tonegawa (1976). "Evidence for somatic rearrangement of immunoglobulin genes coding for variable and constant regions." Proc Natl Acad Sci U S A 73(10): 3628-32.

Jerne (1955). "The Natural-Selection Theory of Antibody Formation." Proc Natl Acad Sci U S A 41(11): 849-57.

Nossal and Lederberg (1958). "Antibody production by single cells." Nature 181(4620): 1419-20.

Nossal (1969). Antibodies and immunity.

Seidman et al. (1979). "A kappa-immunoglobulin gene is formed by site-specific recombination without further somatic mutation." Nature 280(5721): 370-5.

Talmage. (1957). "Allergy and immunology." Annu Rev Med 8: 239-56.

Friday
Aug 13, 2010

2010's worst failure in peer review

Even though it is only August, I think I can safely call 2010's worst failure in the peer review process. Just as a sampler, here is the abstract:

Influenza or not influenza: Analysis of a case of high fever that happened 2000 years ago in Biblical time

Kam LE Hon, Pak C Ng and Ting F Leung

The Bible describes the case of a woman with high fever cured by our Lord Jesus Christ. Based on the information provided by the gospels of Mark, Matthew and Luke, the diagnosis and the possible etiology of the febrile illness is discussed. Infectious diseases continue to be a threat to humanity, and influenza has been with us since the dawn of human history. If the postulation is indeed correct, the woman with fever in the Bible is among one of the very early description of human influenza disease.

If you read the rest of the paper, it is riddled with flaws at every possible level. My main problems with this article are:

1. You can't build up a hypothesis on top of an unproven hypothesis. From the first sentence it is clear that the authors believe in the literal truth of the Bible and want to make conclusions out of the Bible, without drawing in any natural evidence. What they believe is their own business, but if they don't have any actual evidence to bring to the table they can't dine with scientists.

2. The discussion of the "case" is completely nonsensical. The authors rule out any symptom that wasn't specifically mentioned in the Bible ("it was probably not an autoimmune disease such as systemic lupus erythematousus with multiple organ system involvement, as the Bible does not mention any skin rash or other organ system involvement") because medical observation was so advanced 2000 years ago. They even felt the need to rule out demonic influence on the basis that exorcising a demon would be expected to cause "convulsion or residual symptomatology".

This really makes me so mad. The basis for getting published in science is really very simple - use the scientific method. The answer doesn't have to fit dogma or please anyone, but the question has to be asked in a scientific manner. How on earth did these authors manage to get a Bible pamphlet past what is meant to be rigorous peer review? Virology Journal is hardly Nature, but with an impact factor of 2.44 it is at least a credible journal (or was, until this catastrophe). At least the journal has apologised and promised to retract the paper:

As Editor-in-Chief of Virology Journal I wish to apologize for the publication of the article entitled ''Influenza or not influenza: Analysis of a case of high fever that happened 2000 years ago in Biblical time", which clearly does not provide the type of robust supporting data required for a case report and does not meet the high standards expected of a peer-reviewed scientific journal.

Okay, Nature has also made some colossally stupid mistakes in letting industry-funded pseudo-science into their pages, but in the 21st century you would hope that scientific journals would be able to tell the difference between evidence-based science, and faith-based pseudo-science.

Tuesday
Jul 27, 2010

Juvenile Diabetes Research Foundation

Good news in funding appears to come in pairs. The Juvenile Diabetes Research Foundation is supporting the Autoimmune Genetics Laboratory through a Career Development Award. This is a grant that I am particularly happy to receive, not just for the science that will come out of it, but because I have been a long-time admirer of the JDRF, who tirelessly raise money for research on type 1 diabetes. They are not only the leading sponsor of type 1 diabetes research (spending over $1.4 billion on research since 1970), but also take an active role in coordinating researchers and integrating patients into trials to ensure that the best results come from the money spent. As a PhD student with Chris Goodnow, I always joined in the Walk for the Cure fundraiser, and JDRF sponsored my conference travel to the International Immunology Congress in 2004.

Now the JDRF is supporting our research project on the contribution of non-hematopoietic defects to autoimmune diabetes:

The Non-obese diabetic (NOD) mouse is one of the best studied models of common autoimmune disease in humans, with the spontaneous development of autoimmune diabetes. Similar to the way multiple autoimmune diseases run in the families of diabetic patients, the NOD mouse strain is also susceptible to multiple autoimmune diseases, with the specific disease that develops depending on slight alterations in environment and genetics. These results demonstrate the complexity of autoimmune genetics – in both human families and inbred mouse strains there appears to be one subset of genetic loci that skews the immune system towards dysfunction, and an additional subset that directs this immune damage to a particular target organ. In the case of NOD mice and type 1 diabetic patients these additional genetic factors result in damage to the beta islets of the pancreas. While previous research on type 1 diabetes focused strictly on the immune system, this model suggests an important role for the pancreas in the disease process. If certain individuals harbour genetic loci that increase the vulnerability of pancreatic islets to immune-mediated damage, the combination of immune and pancreatic loci could provoke a pathology not caused by either set of genes alone.

Current approaches to genetic mapping in both mice and humans are confounded by the large number of small gene associations and are not able to discriminate between these functional subsets of genetic loci. However, we have developed an alternative strategy for functional genetic mapping. Instead of mapping diabetes as the sole end-point, with small genetic contributions by multiple genes, we map discrete functional processes of diabetes development. This has three key advantages. Firstly, as simpler sub-traits there are fewer genes contributing, each with larger effects, making mapping to particular genes more feasible. Secondly, by mapping a functional process within diabetes we start out with functional information for every gene association we find. Thirdly, by mapping a series of functional processes and then building up this genetic information into diabetes as an overall result we gain a more comprehensive view of diabetes, as a network of genetic and environmental influences that cause disease by influencing multiple systems and processes.

In this project we propose to use the functional genetic mapping approach to probe the role of the pancreatic beta islets in the development of diabetes in NOD mice. We have developed a transgenic model of islet-specific cellular stress which demonstrates that NOD mice have a genetic predisposition to increased vulnerability of the pancreatic islets to death, and hence to the development of diabetes. This is a unique model to analyse the genetic, cellular and biochemical pathways that can be altered in the pancreas of diabetes-susceptible individuals, shedding light on the role the beta islets play in the development of disease.

Saturday
Jul 24, 2010

A breakthrough for HIV prevention?

This week a breakthrough for HIV prevention was announced in Science. AIDS researchers in South Africa just completed a long-term study of Tenofovir Gel, and found that the gel, inserted into the vagina before sex, reduced HIV acquisition in women by 39%. With 900 women being followed up for 30 months, the results look very solid, and the true protection is potentially even better than the headline figure of 39%. As with all such studies, the protection rate given is with average usage, not ideal usage. The average study participant actually used the gel for only ~75% of sexual intercourse occasions. For the "high adherers", the group using the vaginal gel for >80% of sexual intercourse occasions, the protection rate was 54%. How important is this breakthrough? In a way, it is both bigger and smaller than the headlines would suggest.
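The arithmetic linking adherence to the headline protection rate can be sketched with a simple multiplicative model. This is my own back-of-envelope illustration, not the trial's statistical analysis, and the adherence figures are the approximate ones quoted above:

```python
# Back-of-envelope model (not the trial's analysis): if the gel cuts
# per-act transmission risk by a fixed fraction, and is used in a fraction
# `adherence` of acts, overall protection is roughly the product of the two
# (a reasonable approximation when per-act risks are small).

def effective_protection(per_use_efficacy, adherence):
    """Approximate overall protection given partial adherence."""
    return per_use_efficacy * adherence

# Working backwards from the headline figure: 39% overall protection at
# ~75% average adherence implies a per-use efficacy of roughly 52%.
implied_per_use = 0.39 / 0.75
print(f"implied per-use efficacy: {implied_per_use:.0%}")

# That per-use efficacy at ~85% adherence would predict ~44% protection;
# the observed 54% among high adherers suggests this simple multiplicative
# model, if anything, understates the benefit of consistent use.
print(f"predicted at 85% adherence: "
      f"{effective_protection(implied_per_use, 0.85):.0%}")
```

The gap between the predicted and observed rates for high adherers is one reason the headline figure likely underestimates the protection achievable with ideal usage.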

A new tool to fight HIV spread

In the age of vaccines with efficacy rates of >99%, a ~40% protection rate sounds rather poor. Furthermore, this is currently a form of protection only against heterosexual transmission of HIV to women, with no data yet on any protection granted to men having sex with an HIV+ woman, or on use as an anal gel against male homosexual transmission. HIV acquisition by non-sexual routes, such as intravenous drug use, will of course be unaffected by the gel. The efficacy rate is also very poor when compared to condom use: a Cochrane meta-analysis determined that consistent use of condoms results in an 85% protection rate against HIV, which can rise as high as 95% with correct usage. The protective effect is only on par with that of male circumcision, which multiple randomized trials have found protects males from heterosexual HIV transmission at a rate of around 60%.

Is the new gel then completely redundant? A downgrade from the condom? No – not for a key population group: the women of southern Africa. The ten countries of southern Africa together account for 35% of global HIV cases, with HIV reaching a hyper-endemic situation in which 10-30% of adults are infected. In this region heterosexual spread is the dominant form of HIV transmission, and at the population level the single greatest risk factor is, in fact, being a married woman. Condom usage in Africa is generally very poor, with an average of only 4.6 condoms available per man per year, due to low demand. Only 7% of women in southern Africa reported using a condom the last time they had sexual intercourse with a regular partner. In particular, women who are food insecure are 70% less likely to use a condom when having sex, having less personal control over their sexual relationships. Other women may not use a condom during sex for more personal reasons – such as trying to conceive. A vaginal gel therefore provides (partial) HIV protection for the first time to women who would not otherwise use a condom during sex, whether because of personal choice, lack of sexual control, or a desire to become pregnant.

The other important consideration is that any level of protection prevents more cases across the population than its efficacy for the individual would suggest. This is because each infection stopped also prevents the flow-on infections that would have spread from the infected individual. It has been estimated that a weakly protective vaccine, with only a 50% protection rate and given to only 30% of the population, would reduce new HIV infections by more than half over 15 years. These figures are comparable to the results for Tenofovir Gel, so if its maximal potential is realized, this breakthrough has the ability to halve new African HIV cases.
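The flow-on effect can be illustrated with a toy branching-process model. The numbers here are hypothetical (this is not the model behind the cited estimate), but they show how blocking a modest fraction of transmissions compounds over successive generations of infection:

```python
# Toy model: each infection seeds R secondary infections per "generation".
# If a fraction `coverage` of transmissions is blocked with probability
# `efficacy`, the effective reproduction number falls to
# R * (1 - coverage * efficacy).

def cumulative_infections(r, generations, seed=1000):
    """Total new infections over a number of transmission generations."""
    total, current = 0, seed
    for _ in range(generations):
        current *= r
        total += current
    return total

r0 = 1.1                        # hypothetical near-endemic transmission rate
r_gel = r0 * (1 - 0.30 * 0.50)  # 30% coverage, 50% efficacy -> R ~ 0.94

baseline = cumulative_infections(r0, 15)
with_gel = cumulative_infections(r_gel, 15)

# The cumulative reduction is far larger than the naive 15%
# (coverage x efficacy), because averted infections stop onward chains.
print(f"reduction in new infections: {1 - with_gel / baseline:.0%}")
```

With these illustrative parameters the cumulative reduction comes out well above 50%, consistent with the qualitative point that intervention impact compounds through prevented transmission chains.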

A tool that will sit idle?

The problem, of course, is that the potential of this gel will not be realized. In many ways, the HIV epidemic is not a problem waiting for a medical solution, but rather a problem waiting for a social and political solution. Consider mother-to-child HIV prevention. Current medical treatment of HIV+ women during pregnancy and after birth reduces the transmission rate to the child by more than 99%. Even in developing countries, the treatment program has over 98% efficacy. And yet these cases, almost entirely preventable under current treatment, make up 15% of global HIV cases and 40% of HIV cases in southern Africa, since only 33% of pregnant HIV+ women in Africa get any form of anti-HIV treatment, let alone the recommended treatment program.

Other strategies, already proven to work, could make similar impacts if broadly implemented. Widespread male circumcision would reduce HIV rates by 60% in males and, by reducing prevalence, 30% in females. Comprehensive sexual education focused on preventing new infections can be highly successful. An aggressive campaign of universal HIV testing and near-universal antiretroviral treatment would be capable of reducing new HIV infections by 95% within 5 years. Even the simple treatment of individuals with genital herpes with current antiherpetic drugs could be expected to reduce transmission of HIV in southern Africa by 50%.

No, a new tool to fight HIV is not going to stop the virus. Realistically, the current tools available could cut new HIV cases by 99% within the decade, if only they were implemented. The true scourge of HIV is that it attacks the marginalised in society, hitting regions of great poverty, infecting those on the receiving side of racial and sexual discrimination. The people that, quite frankly, too many people feel deserve to be sick. Being interwoven with issues of sexuality, drugs, race and poverty, people in power have not only been slow to move - they have often moved in the wrong direction, such as the $15 billion pledged in aid by George W. Bush, with its focus on replacing effective condom use with ineffective "abstinence only" programs.

A major part of the problem is certainly lack of resources, both funding and public health infrastructure. The response to HIV has been delayed, fragmented, inconsistent and grossly under-resourced. Lesotho launched a national voluntary counselling and testing campaign aiming at universal testing, which fell through due to a lack of resources. In South Africa only 28% of HIV+ people have access to antiretrovirals. In Zimbabwe only 4.4% of HIV+ pregnant women are receiving antiretroviral treatment to prevent mother-to-child transmission. In Nigeria 10% of all HIV transmission events are due to lack of funds for hospitals to screen transfused blood, a situation which requires only funding to remedy. However, funding is not the only impediment to an efficient HIV prevention campaign. Policy makers have repeatedly failed to spend limited resources on HIV prevention, concentrating on medical treatment without adequate care and support. This is despite the cost of most HIV prevention techniques being well under the $4770 per infection prevented needed to break even compared with the cost of treatment alone. What is needed to end the HIV crisis is, in fact, simple in health terms and is difficult only in political implementation – a coordinated and adequately funded approach to integrate evidence-based HIV prevention strategies, in concert with major social and economic development efforts to eliminate gender disparities, race- and sexuality-based discrimination and extreme poverty.

Thursday
Jul 22, 2010

European Research Council funding

A major investment of my time last year and this year was putting together an application for a European Research Council Starting Grant. The process was quite an ordeal, with both a substantial written grant and a challenging oral defense, probably consuming over 100 hours of my time. Fortunately, with excellent independent researchers in the laboratory, great research continued while I was locked away with the computer.

Being open to researchers across Europe, in any discipline, the competition is fierce. However, there are some large advantages to the ERC Starting Grant process: 1) the committee looks favourably upon large ideas, rather than safe ideas; 2) the competition is segregated according to career stage, so that I was only competing with other researchers less than five years out from their PhD; 3) the funding is sufficient in scale and duration to really put forward a grand plan. Just recently I found out that the application was approved, and the VIB put out the following press release:

VIB receives high score from European Research Council (ERC)
Two young top researchers awarded €1.5 million research grants!

Leuven - VIB landed two research grants worth 1.5 million euros each. The prestigious grants are courtesy of the European Research Council (ERC) and are aimed at giving talented young scientists the opportunity to develop their own research team. The honor fell to Adrian Liston and Patrik Verstreken, both recently transferred to VIB-K.U.Leuven from abroad.

The European Research Council
ERC was created to encourage excellent research in Europe. ERC starting grants give young talented researchers the opportunity to develop a research group. At present, there are still too few opportunities in Europe for young scientists to initiate and lead their own research, which is extremely unfortunate as it results in top researchers leaving the region to develop their careers elsewhere.

Adrian Liston studies autoimmune diseases.
The immune system is our body's defense system and allows it to fight off foreign substances and micro-organisms. In people with an autoimmune disease, the immune system has gone awry: it can no longer distinguish between the body's own and foreign substances and ends up attacking vital tissues and organs. Adrian Liston studies immune system cells (T cells) that are responsible for this malfunction. With his ERC research grant, he plans to bridge the gap between his research on mice models and humans. This may be a first step in the development of new therapies for autoimmune diseases.

Patrik Verstreken explores the communication between brain cells.
Brain disorders take a major toll on society. Many brain diseases are caused by the disruption of communication between brain cells. Finding a solution depends on understanding this communication in the smallest detail. Patrik Verstreken uses the fruit fly as his model organism for studying genes involved in the communication between brain cells. The ERC research grant gives him the opportunity to expand his research to more complex neural communication networks that control behavior. This step is crucial if we are to understand neurological disorders such as Parkinson's disease.

Monday
Mar 22, 2010

One year as a junior faculty member

One year in numbers:

62: the number of grants I have reviewed for various foundations
19: the number of articles I have reviewed for different journals

25: the number of grants I have submitted
7: grants accepted
4: grants declined
14: grants pending
1,029,685: euros given in grants
830,493: euros spent in research

10: invited talks
3: conferences
3: lectures

13: article submissions
9: articles published or in press

5: PhD projects started
11: number of permanent staff in the lab
8: number of full-time permanent researchers in the lab

0: number of days I've spent doing experiments

Wednesday
Feb 24, 2010

Negotiating a start-up package

After my previous posts on science careers I was asked about negotiating a start-up package. Unfortunately here I have little input - for a new faculty member there is very little negotiation that can take place. The faculty will have a budget set aside for recruitment and this is not going to change in any substantial way. There are a few minor points to consider:

1. The edges can be flexible.
The net value of the start-up package is unlikely to change, but a one-size-fits-all package may be adapted to your circumstances. Will it be possible to have no teaching commitments in the first year? A discount on departmental services? Perhaps your start-up fund could be made open-ended rather than time-limited. Look carefully at the package being offered and find any conditions that could be an issue for you – and only ask about changes that will make a real difference to your research. Often the hardest part is working out what is important to you, since you will not be familiar with the inner workings of the department in advance.

2. Negotiate for the research, not for yourself. If you talk about changes in terms of things you would like, the faculty will weigh this up against how much they want you. Instead phrase the changes in terms of how they can add to your research. Why will this change make your research output substantially better? The faculty will be much more willing to make changes if they can see the value to your research output - after all they want you to succeed.

3. Don't grandstand. These are your colleagues and your requests will typically come at a cost to them, either in terms of faculty subsidies or extra workload. Do not make a little issue into a big issue. Also, don't bluff. In my negotiations with one faculty I did have one "make or break" issue. There were a few things that would have been nice but I could live without - these I let go when they were turned down. But when I discussed one particular clause I explained exactly why this would make my particular research program untenable, and when they couldn't change that one clause I walked away. Don't make an issue "make or break" unless it is literally a deal-breaker.

4. Get it in writing. Okay, this is not exactly in line with #3 about being considerate in negotiations, but a contract should be in writing. If a faculty is happy to agree to a condition there is absolutely no reason why it shouldn't be written down in your contract. Things change over five years. Departmental heads leave and get replaced by new heads. Memories on exactly what was agreed become hazy over time.

Wednesday
Feb 17, 2010

Applying for faculty positions

I've had some occasion recently to contemplate the strategies for applying for faculty positions. In 2008 I interviewed at eight different universities for a faculty position, and two of those experiences in particular were very illuminating - the IRIC (Institute for Research in Immunology and Cancer) and VIB (Flemish Institute of Biotechnology) held open applications where all the applicants were interviewed together. This gave me a fascinating insight into the faux pas made and the important criteria for being offered a faculty position.

These are the three criteria I recommend post-docs consider:

3. Publications. Yes, telling people they need Nature papers is useless advice - everyone knows the importance of publications. Actually, I have put publications at #3 because I think they are much less important than the other two criteria. I interviewed back-to-back against post-docs with outstanding publication records that I couldn't match - multiple major Cell papers that redefined a field and opened up new technologies. Yet I've seen these same people fail at criteria #1 and #2 and miss out to people with less outstanding publication lists. I see publications almost as a threshold effect. For a post-doc to be competitive at a high-level institute they will need multiple papers in JEM or better journals. But in a way it is more important to have a diverse portfolio of publications. Primary papers from multiple laboratories demonstrate an ability to research in different environments. Middle-author publications demonstrate a willingness to collaborate. Review papers show a grasp of the field. The risk for an applicant with a few good first-author Nature papers is that the credit will go to the last author. Having a broader repertoire, with senior authorships and multiple laboratories, tells the selection panel that you have carved out your own research niche and that you were more than a PhD student put on a lucky project.

2. Experience outside benchwork. The enormous importance placed on publications tends to drive post-docs into a fundamental mistake - you cannot learn to be a PI from the bench. Once you have a faculty position the amount of research time you have available will drop precipitously. Skills are needed in setting up a lab, writing grants, working to a budget, mentoring students, teaching undergrads, faculty business, and so on. The selection panel is well aware of this: they are not looking for a post-doc to work in their lab, but for someone who can run a successful operation, someone who can translate their previous first-author success into future last-author success. One applicant I interviewed with had an outstanding publication record but didn't get a job offer because it was clear that they were an outstanding post-doc but would be a terrible PI. When asked about supervision experience this candidate said "Oh, my PI gave me a technician, and I've trained her to sit behind me and pass me solutions and pipettes reset to the right volumes. It is great, I can now do research twice as fast as before". Perhaps - but how would he fare when he was tied to a computer writing grants and relying on his technician to produce data? It is important for post-docs to show that they have the skill set to run a lab - a different skill set from that of a post-doc. Mentor students, write fellowships or grants, train technicians, teach classes - show the selection panel that you have already been running a sub-lab within a larger lab and are now ready to expand your operation.

1. Emotional intelligence. We work in science, so the bar is pretty low - but still I have seen the stunned look on faces as applicants showed zero emotional intelligence. I'm putting this at number 1 because an applicant who is above average but not genius on publications and management experience can shoot to the top of the list if they have emotional intelligence. This stuff should be simple, but it obviously isn't. I remember standing around at a coffee break during the interview day and listening to a selection panel member ask an applicant how they were finding the experience. The reply? "Actually, to be honest it is terrifying, there are so many good people here that I feel like a fraud". Okay, the feeling is not uncommon - a study by the American Astronomical Society found that more than 50% of graduate students admit to being afraid their peers will find out how little they know, and only 5% strongly disagreed. But don't confide in the selection panel. Every interaction with the selection panel or any faculty member, regardless of how informal, is part of your assessment. An applicant needs to work out what each person is after and show them, in both body language and responses, that they can deliver. Be calm, authoritative and deliberative without being aggressive, flighty or nervous. Consider that every panel member is looking for something different. A good selection panel wants the best person for the department and also the best person for their laboratory in particular. They are picking a long-term colleague, so show them that you have skills they can use and knowledge they can draw on, that you are willing to collaborate, and that you have an ability to "value-add" to the department. The right applicant in the right place will not only bring in their own research value, but will also increase the research value of other laboratories in the faculty. An applicant should research the faculty and the faculty members, think about collaborative potential, and engage each individual they interview with on their own terms.

Now the corollary to this advice - don't fake it too much. If writing grants and mentoring students feels like an annoying distraction from benchwork, think again about whether you want to be a PI. If you are not genuinely excited about the collaborative prospects in a department, don't send in an application there. The interview is not just about the selection panel interviewing you, it is about you subtly working out whether the department will be good for you, so if you have to make promises you don't want to keep you are looking in the wrong place.

Saturday
Jan162010

Sex determination

If yeast sex is simple, how complicated is sex in multicellular organisms? Actually, the act of sexual reproduction in multicellular organisms - fungi, plants and animals, including humans - is essentially identical to that of yeast. Multicellular organisms have two different copies of the genome in every cell. Like yeast, to undergo sex we need meiosis to occur. Specialised sexual cells duplicate the two genomes, cut and paste them into four unique genomes, and then divide into four daughter cells. These cells are either large cells containing a single genome and lots of energy (the female egg) or small cells containing little more than a single genome (the male sperm). When they combine, the new cell has two genomes, one from the female parent and one from the male parent. Importantly, just like yeast, the offspring that results has a unique genetic composition. The two genomes the offspring possesses are each novel, created by the combination of the two unique genomes in each parent.

While sex for multicellular organisms is identical to yeast at the cellular level, at the sex determination level things get very complicated. There are many different ways of determining whether an individual is male or female. As a measure of the broad diversity of ways in which species have solved the sex determination problem we can look at a few different examples: clownfish, crocodiles, humans, whiptail lizards and Komodo dragons. And this is not including some of the really complicated systems that exist, such as in earthworms, bees and platypuses.

Clownfish and crocodiles

Clownfish and crocodiles both have non-genetic sex-determination systems. Males and females have the same genetic make-up, and every genome has the potential to encode either a male or a female individual. The physical manifestations of sex arise from environmental influences that determine which set of genetic controls is activated. In clownfish the important environmental influence is social interaction with other clownfish. All clownfish start out as males. When the sole female in the group dies, the largest male undergoes a rapid sex change and becomes a female. Interestingly, this sex change is reversible – a female moved into a new group where she is no longer the largest will revert back to a male. This plasticity ensures that there is always a breeding female in every group, and that the female comes from the most successful individual in the group.

Like clownfish, crocodiles have no genetic difference between males and females. Unlike clownfish, however, there is no sex plasticity. A male hatches as a male and stays a male for life; a female hatches as a female and stays a female for life. The important environmental influence in this case is the temperature of the egg. If the temperature of the egg is between 31.7°C and 34.5°C the embryo is set as a male; if the temperature is outside this range the embryo is set as a female. This sex determination system has placed several important constraints on crocodile evolution. Firstly, the crocodilian mother has become very active in nest maintenance, as a temperature far from the threshold will result in hatchlings of a single sex. Secondly, this sex determination system has forced crocodiles to maintain a link to land. Other aquatic species, such as dolphins and sea snakes, have been able to become entirely aquatic by giving live birth in the water. These species all have genetic sex determination systems (below). By contrast, crocodiles and turtles need to return to land to lay eggs because a temperature-dependent sex determination system is incompatible with live birth – internal body temperatures are too stable to give the diversity of temperatures required.

Humans and whiptail lizards

Humans and whiptail lizards both use the XX/XY sex determination system. In this system, sex is determined by the combination of sex chromosomes inherited from the parents: XX results in a female and XY results in a male. As females can only pass on an X chromosome, while males can pass on either an X (50% chance) or a Y (50% chance) chromosome, this system results in roughly equal numbers of females and males being born. It is very important to note that differences between the sexes are largely not due to genetic differences. Both males and females have the X chromosome, and while females have two copies, one of these copies is “inactivated”, making them equivalent to males. The only substantial genetic difference between males and females is the presence of the Y chromosome in males. This Y chromosome is tiny (only 2% of the human genome) and is mostly made up of junk. The only essential gene on the Y chromosome is the SRY gene.
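The roughly equal sex ratio falls directly out of these inheritance rules. As a quick illustration, here is a minimal Python sketch of the XX/XY lottery (the function name and the simulation itself are mine, added for illustration, not part of the original genetics):

```python
import random

def offspring_sex_xy():
    """Mother always contributes an X; father contributes X or Y
    with equal probability (simplified XX/XY model)."""
    paternal = random.choice(["X", "Y"])
    return "female" if paternal == "X" else "male"

counts = {"female": 0, "male": 0}
for _ in range(100_000):
    counts[offspring_sex_xy()] += 1

print(counts)  # each sex comes out close to 50,000
```

Nothing in the simulation forces the balance; it emerges purely from the father's 50:50 chance of passing on an X or a Y.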

All human embryos, whether XX or XY, spend the first six weeks as females. At this point embryos with the XY genome express the SRY gene in the genital tissue, starting the development of testes. The testes then produce testosterone, which the embryo detects through the androgen receptor. The effect of this production is a complete remodelling of the genitalia from female into male between 7 and 12 weeks of gestation. Many different genes are used to initiate the “male” program instead of the “female” program, but only the SRY gene is on the Y chromosome. In other words, females have all the genes required to develop the physical attributes of a male, males have all the genes required to develop the physical attributes of a female, and only a single gene decides which program is used. When thinking about physical differences between males and females it is not helpful to think about genetic variation, such as exists between different populations of humans. Instead the best comparison is to think about your heart and your liver. Heart cells and liver cells have the same genome, the same genetic code, but the two cell types have initiated different programs from that code so that they can perform different functions.

Most whiptail lizards use the same XX/XY system as humans, with XX lizards being female and XY lizards being male. However, 15 species of whiptail lizards have reverted to an asexual system of reproduction. These species consist only of XX females. The females still undergo sexual meiosis to create an egg with a single X chromosome; in the absence of sperm these eggs spontaneously duplicate their genome to become XX females, in an asexual process called parthenogenesis. It is unclear why these whiptail lizards have evolved to abandon the advantages of sexual reproduction, but a clue may be the environment they live in – the dry deserts of North America. It is likely that, at the low population densities of lizards living in a desert, finding a mate becomes very difficult. By breeding through parthenogenesis a female can still reproduce even if she fails to find another lizard, and furthermore every individual offspring is capable of bearing young, allowing more efficient use of resources during dry times and faster population growth during wet ones.

Komodo dragons

The ZW/ZZ sex determination system used by Komodo dragons is essentially the opposite of the XX/XY system. Here, ZW results in a female while ZZ results in a male. As with the Y chromosome, the W chromosome is a minor chromosome with few functions beyond sex determination. When breeding, a female Komodo dragon can pass on either a Z or a W chromosome while a male Komodo dragon can only pass on a Z chromosome, resulting in a 50:50 ratio of females to males. Interestingly, the Komodo dragon has also developed parthenogenesis, like the whiptail lizard. A female Komodo dragon kept alone will have spontaneous genome duplication of an egg. The outcome, however, is the opposite of that in the whiptail lizard. The whiptail lizard female, using the XX/XY sex determination system, can only pass on an X chromosome, so duplication results in an XX female. The female Komodo dragon, using the ZW/ZZ system, produces eggs that either carry a Z chromosome and duplicate to become a ZZ male, or carry a W chromosome and duplicate to become an unviable WW embryo. In practice, therefore, female Komodo dragons that revert to parthenogenesis will always generate ZZ males. This means that a lone female washed up on a new island will generate male offspring by parthenogenesis, allowing later sexual reproduction. What is the advantage of this system of parthenogenesis? There are two likely possibilities. The first is that it avoids the spiral into an inbred population that occurs in whiptail lizards, with only a single necessary parthenogenetic generation interrupting sexual reproduction. This may be a more appropriate adaptation to the “rich uninhabited island” scenario, with the XX parthenogenetic strategy more suitable for the “low population desert” context. Alternatively, and equally plausibly, the ZW parthenogenesis strategy may simply be less efficient than XX parthenogenesis in both contexts (or vice versa). Since evolution always works from the current genetic situation in incremental steps, non-ideal compromises are common.
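The contrast between the two parthenogenesis outcomes can be captured in a few lines of code. This is a hypothetical sketch (the function name and data representation are my own) that simply applies the genome-duplication rule described above to each system:

```python
def parthenogenesis_offspring(system):
    """Possible offspring when a lone female's egg duplicates its
    single sex chromosome (simplified model of parthenogenesis)."""
    if system == "XX/XY":             # whiptail lizard: females are XX
        egg_chromosomes = ["X"]       # every egg carries an X
    elif system == "ZW/ZZ":           # Komodo dragon: females are ZW
        egg_chromosomes = ["Z", "W"]  # eggs carry either a Z or a W
    else:
        raise ValueError(f"unknown system: {system}")
    offspring = []
    for chromosome in egg_chromosomes:
        genotype = chromosome * 2     # spontaneous genome duplication
        viable = genotype != "WW"     # WW embryos are not viable
        offspring.append((genotype, viable))
    return offspring

print(parthenogenesis_offspring("XX/XY"))  # [('XX', True)] - all daughters
print(parthenogenesis_offspring("ZW/ZZ"))  # [('ZZ', True), ('WW', False)] - only sons survive
```

The asymmetry is entirely a consequence of which sex carries the mismatched chromosome pair: duplication can only ever recreate the homogametic genotype, which is female under XX/XY but male under ZW/ZZ.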