Debunking myths on genetics and DNA

Thursday, May 31, 2012

Genomic superspreaders


Endogenous retroviruses (ERVs) are "a unique combination of pathogen and selfish genetic element [1]." I discuss ERVs often on the blog because they truly intrigue me. A couple of weeks ago I talked about retrotransposons, which are transposable sequences derived from ancient viral infections through the integration of the viral genome into the germline. A recent PNAS paper [1] states that ERVs that lose the env gene behave like retrotransposons.
"ERVs can replicate both as transposable elements (TEs) and viruses. Some lineages replicate by an entirely intracellular mechanism and are functionally indistinguishable from the class of TEs called LTR-retrotransposons, whereas others do so within the host germline using cell reinfection in the same manner as the copying within somatic cells of exogenous retroviruses (XRVs)."
I now realize that I had missed one important point about ERVs. I thought that, once the virus was inserted into one of the germline cells, it had to be that same infected cell that got fertilized in order for the viral sequence to become part of the host's genome. My new understanding, in light of this last paper I read, is that once integrated into the germline, the virus can replicate within the cell line, making its presence in any future fertilized egg a much more likely event.

The protein responsible for viral cell entry is the envelope, encoded by the env gene. Whether the ERV will replicate as a retrotransposon or by reinfection is determined by the integrity of its env protein.
"We can assume that an ERV lineage with a functional env is reinfecting, whereas an ERV lineage with a disintegrated env is retrotransposing (whether reinfection can include germline cells in other host individuals of the same or other species is not known). Some retroviruses with a defective env are able to reinfect by “hitchhiking” the functional env of a coinfecting retrovirus, a mechanism known as “complementation”. However, complementation does not appear to be common in ERVs."
In [1], Magiorkinis et al. look at different ERV lineages within 38 mammalian genomes and ask what controls their relative abundance. They focus on a relatively young group of ERVs, called IAPs. These sequences were originally discovered with a degraded env, and hence behaved as retrotransposons, but similar loci with an intact env were later identified within the same group, and one in particular was shown to be able to reinfect cells.
"We find repeated transformations from reinfecting into retrotransposing ERVs and show that this transformation results in a rapid proliferation within the genome. Considering our results together with those from studies of transmission diversity in infectious disease epidemics, we propose that retrotransposition is the trait that leads ERVs to become genomic superspreaders."
The results presented in [1] suggest that, once integrated, ERVs initially expand in the host through reinfection, though eventually the env loses its functionality and they become intracellular retrotransposons. A legitimate question is whether the loss of env is a cause or a consequence of the shift to retrotransposition: at least in mouse IAPs, loss of env has been shown to be a consequence. Env degradation enhances intracellular mobility but lowers the chances of interhost transmission. Even though in the literature there have been cases where ERVs with a degraded env were able to capture the functional env of a coinfecting retrovirus, none were found among the IAPs in this study.

The most interesting result in this study is the positive correlation between loss of env and proliferation. This could be because reinfection is more likely to confer a disadvantage to the host, thus favoring the shift from reinfection to retrotransposition. It could also be that, compared to reinfection, retrotransposition favors integration into the germline.

[1] Magiorkinis, G., Gifford, R., Katzourakis, A., De Ranter, J., & Belshaw, R. (2012). From the Cover: Env-less endogenous retroviruses are genomic superspreaders Proceedings of the National Academy of Sciences, 109 (19), 7385-7390 DOI: 10.1073/pnas.1200913109


Monday, May 28, 2012

Bacteria, biodiversity, and allergies.


You may not have heard of gammaproteobacteria, but I'm sure the names Salmonella, Escherichia coli, plague, and cholera do ring a bell. These are all bacteria, or diseases caused by bacteria, belonging to the class Gammaproteobacteria. Hanski et al. took small skin samples from 118 Finnish adolescents and found a variety of bacteria, the most represented being Actinobacteria, Bacilli, Clostridia, Betaproteobacteria, Alphaproteobacteria, and Gammaproteobacteria.

"Ew," you're probably thinking. Well. . . think again.

On an average human there are an estimated 10^12 bacteria that make their home in the outer layers of our epidermis and in our hair follicles. And yes, you've guessed it: these guys are very much needed. In their study [1], Hanski et al. correlated the lack of biodiversity in skin microbiota with allergic disposition. The study subjects came from towns and villages of different sizes, offering a diverse range of exposure to bacteria. To analyze the skin microbiota, the researchers took DNA samples from the epidermis on the inside of the arm. To test allergy predisposition they measured IgE antibody levels after exposure to a mixture of common inhalant allergens, and used a cutoff point to define atopic individuals (the ones that showed a predisposition toward allergic hypersensitivity). A side note: IgE antibodies are responsible for the over-stimulation of the mast cells and basophils that trigger allergic reactions. Atopic individuals can have up to ten times the normal IgE levels, though that doesn't exclude individuals with normal IgE levels from having an allergic reaction.

In order to test their hypothesis, Hanski et al. did a principal component analysis in which they compared the number of bacterial genera found in the skin samples with land use in the immediate surroundings (whether agricultural, forest, built area, etc. within 3 km of the subject's home).
"The PC1_env of the land use types was significantly (P = 0.0033) related to PC2_bac, indicating that the generic diversity of proteo-bacteria was higher on the skin of individuals living in an environment with more forest and agricultural land compared with those living in built areas and near water bodies."
PC1 and PC2 in the above are the first and second principal components. Next, the researchers repeated a similar principal component analysis to assess the correlation between diversity in skin microbiota and atopy. One thing to ask when carrying out this kind of analysis is whether the atopic subjects in the study are evenly distributed across agricultural and urban areas. If the distribution is skewed (for example, if most atopic subjects live in the city and only a few in agricultural areas), this could clearly bias the results. The researchers checked this and found no correlation between atopy and spatial distribution. They also checked for other possible confounders (other factors that might skew the analysis), such as passive smoking and pets, but none were significantly correlated with atopy.
"Atopic individuals had highly significantly (P = 0.0003) lower generic diversity of gammaproteobacteria on the skin compared with healthy individuals."
Furthermore, the researchers found "one significant correlation, between the relative abundance of gammaproteobacteria and IL-10 expression in healthy individuals (P = 0.015)." IL-10 is an anti-inflammatory cytokine (a signaling protein).

Overall, an interesting paper, as it reinforces the hypothesis that by limiting what our immune system is exposed to we are somehow altering our ability to build appropriate responses to the environment. We are indeed seeing a decline in the biodiversity of the environment we live in and, at the same time, witnessing an increasing prevalence of allergies. I do wonder about the number of subjects (118) versus the high number of tests the researchers conducted. And I also wonder whether the researchers tried a logistic regression fit as an alternative to the principal component analysis.
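To make that alternative concrete, here is a minimal sketch of the kind of logistic regression I have in mind. Everything in it is made up: the variable names, the simulated data, and the assumed relationship between diversity and atopy are my own illustration, not the study's analysis.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 118                                  # same number of subjects as in the study
diversity = rng.poisson(6, size=n)       # hypothetical gammaproteobacteria genus counts per subject
# Simulate lower diversity raising the odds of atopy, just so there is something to fit.
p_atopy = 1 / (1 + np.exp(0.5 * (diversity - 6)))
atopy = rng.binomial(1, p_atopy)         # 1 = atopic, 0 = healthy

model = LogisticRegression().fit(diversity.reshape(-1, 1), atopy)
print("change in log-odds of atopy per extra genus:", model.coef_[0][0])
# A negative coefficient would mean: the more diverse the skin microbiota,
# the lower the odds of being atopic.

One appeal of this formulation is that confounders such as passive smoking or pets could simply be added as extra columns and adjusted for within the same model.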

[1] Hanski, I., von Hertzen, L., Fyhrquist, N., Koskinen, K., Torppa, K., Laatikainen, T., Karisola, P., Auvinen, P., Paulin, L., Makela, M., Vartiainen, E., Kosunen, T., Alenius, H., & Haahtela, T. (2012). Environmental biodiversity, human microbiota, and allergy are interrelated Proceedings of the National Academy of Sciences DOI: 10.1073/pnas.1205624109

This post was chosen as an Editor's Selection for ResearchBlogging.org

Friday, May 25, 2012

DNA vaccines: a work in progress


You are all familiar with the idea behind vaccines: an attenuated form of the pathogen stimulates the immune system to produce T-cells and antibodies specific to that particular antigen. These immune responses then become part of our memory T- and B-cells, cells that have previously encountered a certain antigen and have already specialized to recognize it. The challenge behind a vaccine is to use a form of the antigen that's weak enough not to cause the actual disease, but strong enough to prompt the appropriate immune response. An efficient immune response has to be broad (it has to recognize all possible strains of the antigen) and strong (enough T-cells and antibodies have to be produced in order to clear the infection).

First generation vaccines use the whole organism as the antigen. Unfortunately, weakened forms may still induce full infection in immunocompromised people. Second generation vaccines use portions of the organism. For example, in HIV, one protein that's been used a lot in vaccine trials is env, the envelope protein: this is the outer shell of the virus, and the part most visible to the immune system. The so-called "DNA vaccines" are the third generation of vaccines. The idea is to inject a circular molecule of DNA (a plasmid) that encodes the specific antigen proteins. DNA is rapidly taken up by cells and, once inside, it can use the cell machinery to assemble the proteins it encodes. Just like in a viral infection, these proteins are then displayed on the cell's surface and presented for recognition by the immune system. The advantage of a DNA vaccine is obvious: there is no risk that the DNA itself will trigger the actual disease. Furthermore, studies have so far shown that no anti-DNA antibodies are produced.

Some DNA vaccines are already in use in veterinary medicine. In humans, though safe and well tolerated, they seem to have lower immunogenicity than other vaccines, and hence their potential hasn't been fully exploited yet. While the reason for this is still unknown, several studies have attempted to use other genes and proteins in combination with the vaccine to improve immunogenicity, in particular, genes and proteins that are involved in immune recognition pathways and cell-signaling pathways.
"Advancements in antigen design, improved formulations, inclusion of molecular adjuvants, and physical methods of delivery have greatly enhanced the immunogenicity of DNA vaccines [1]."
In [1], Ferraro et al. review the current studies in this field, specifically for vaccines targeting influenza, human papillomavirus (HPV), and HIV. In the case of influenza, the appeal of a DNA vaccine is that it would considerably shorten the preparation time. In terms of immune responses, DNA vaccines have not been able to trigger good antibody responses but, on the other hand, tend to perform well in triggering cellular responses (recruiting natural killer cells, T-cells, and phagocytes). In HIV in particular, both antibody and T-cell responses are needed, both broad enough to cover the variability of the virus. A promising strategy, therefore, is to combine a DNA vaccine with a protein one.
"Combining a DNA prime and viral boost creates a synergistic enhancement in the magnitude of antigen-specific CD81 T-cell responses. A phase I trial that combined a multi-clade DNA vaccine prime with an Ad5 boost demonstrated that this strategy was capable of eliciting humoral responses in addition to cellular responses."

[1] Ferraro, B., Morrow, M., Hutnick, N., Shin, T., Lucke, C., & Weiner, D. (2011). Clinical Applications of DNA Vaccines: Current Progress Clinical Infectious Diseases, 53 (3), 296-302 DOI: 10.1093/cid/cir334


Wednesday, May 23, 2012

Hail to the Bonobos!


Is human nature prone to violence, or is cooperation the dominant trait? The latest issue of Science magazine is dedicated to "Human Conflict" and touches on a variety of topics, from racism to terrorism, addressing the question: are we good or evil? Is our true nature aggressive and violent, but tamed by social constraints, or is it the other way around, and we are in fact naturally inclined towards empathy and cooperation, while violence and aggression are the exceptions?

Concepts like "survival of the fittest" and the "selfishness of genes" (a concept I truly dislike, I'm just quoting it here because it seems to be a widespread view) have reinforced the general idea that our true nature is geared towards conflict. Life is conquered through competition.

In [1], de Waal argues that while the dominant view after World War II was that human nature is fundamentally aggressive, there isn't much evidence to support it.
"During most of our prehistory, we were nomadic hunter-gatherers, whose cultures are nowadays not particularly known for warfare. They do occasionally raid, ambush, and kill their neighbors, but more often trade with them, intermarry, and permit travel through their territories. Hunter-gatherers illustrate a robust potential for peace and cooperation."
Some of our closest relatives, like the chimpanzee Pan troglodytes, can be very aggressive, and their territorial encounters are often lethal. However, guess which ape is just as closely related to us? The bonobo! If you are not familiar with this wonderful animal, check out what the Wikipedia page says:
"The bonobo is popularly known for its high levels of sexual behavior. Sex functions in conflict appeasement, affection, social status, excitement, and stress reduction. It occurs in virtually all partner combinations and in a variety of positions. This is a factor in the lower levels of aggression seen in the bonobo when compared to the common chimpanzee and other apes. Bonobos are perceived to be matriarchal; females tend to collectively dominate males by forming alliances and use sexuality to control males. A male's rank in the social hierarchy is often determined by his mother's rank."
Such behaviors force us to review the original theory that conflict is the driving force of survival. If that were truly the case, why do some chimpanzees show reconciliatory behaviors such as kissing and hugging after a fight? Furthermore, de Waal argues, all functions that are necessary for survival, such as sex, eating, nursing, and socializing, are associated with a sense of fulfillment. This is not the case with killing and aggression.

Bonobos show signs of empathy not only in their behavior (such as comforting one another after a disturbance), but also in the anatomy of their brains:
"This species has more gray matter in brain regions involved in the perception of distress, including the right dorsal amygdala and right anterior insula, and a better developed circuitry for inhibiting aggression."
Though highly speculative, evidence of empathy also comes from the fact that monkeys, like humans, have "mirror neurons," neurons that fire when a stimulus is experienced as well as when it is merely observed. Since empathy involves embracing another individual's feelings, mirror neurons are often considered a sign of empathy. A wide range of empathy-based behaviors has been observed in primates, mice, and elephants, from mice discarding food in order to help a trapped companion, to incentive-free assistance in apes. And finally, contrary to aggression and violence, altruism is often followed by a sense of fulfillment.

EDIT: I really appreciate the discussion that this particular post sparked. I just want to add one more thought. I think the general conception has always been that aggression, lethal confrontations, and competition for resources are intrinsic to primitive animal behaviors. On the other hand, we think of sentiments like love, compassion, and empathy in particular as "higher" sentiments, something that pertains to civilization and hence to higher intelligence. This particular paper intrigues me because, without proving anything, it provides evidence to the contrary: it seems to indicate that empathy pertains to all animals, from mice to apes, not just to humans, and that evolution favors cooperation while disfavoring disruptive confrontations. It is true that with limited resources aggressive behaviors increase, but if you take a society like the bonobos' and observe it in equilibrium with its own environment, then one can't help but wonder: if the bonobos manage to resolve their issues peacefully, why can't we do the same? It seems to me that while empathy is shared across the animal kingdom, envy, jealousy, and greed are uniquely human. That certainly puts the phrase "higher intelligence" in perspective.

[1] de Waal, F. (2012). The Antiquity of Empathy Science, 336 (6083), 874-876 DOI: 10.1126/science.1220999



Monday, May 21, 2012

Did you see the eclipse last night?

I did, but I also learned that capturing an eclipse is no easy task.
These are my best shots. Notice that they look very dark because I used a shutter speed of 1/5000 s or faster and I also had a polarizing filter on, but at least from where I stood I didn't notice any waning in luminosity. Even that thin ring of sun could fully brighten the day!


And a couple post-eclipse shots :-)


Sunday, May 20, 2012

Juggling languages and sounds


Growing up, my dominant language was Italian. However, thanks to my dad's sabbaticals, on more than one occasion we lived in English-speaking countries for long periods of time. My brain would reset to Italian as soon as we returned, but I never forgot English, and it was definitely easier to transition again when, as an adult, I moved to the US. However, to this day, I have yet to get rid of that feeling of inadequacy that sticks to me in many situations, whether here or back in Italy. You have no idea how many times I've said the wrong thing at the wrong time and, even worse, had no idea what I'd said but could only guess from the snickers and smirks around me. Well, it turns out, being bilingual is no easy task, but it has its advantages, too.
"With improved juggling ability, novice jugglers demonstrate structural enhancements in a cortical region associated with processing and storage of complex visual motion. Similarly, the bilingual, a mental juggler of two languages, shows structural and functional enhancements in cortical regions involved in language use and executive control, likely resulting from a lifetime of communicating in two languages. [. . .] The need to constantly control two languages confers advantages in the executive system, the system that directs cognitive processing [1]."
"Mental juggler of two languages" . . . I love it, I can totally relate to this metaphor. BTW, I never felt truly bilingual until I started mixing up it's and its, your and you're... !

Krizman et al. [1] reasoned that since musical training enhances cognitive and sensory processing, a similar neural enhancement would be noticed in bilinguals.
"To test this prediction, performance on a task of integrated visual and auditory sustained selective attention was compared between highly-proficient Spanish-English bilinguals and English monolinguals."
The researchers tested differences between bilinguals and monolinguals in the auditory system. I'm not at all surprised that you would find a difference. For example, I've noticed on more than one occasion that because Italian is a phonetic language (words are pronounced as they are written), many Italians who learn English as adults tend to pronounce English words as they'd read them in Italian. To me, this bias introduced by knowing the written word, or rather, the inability to pronounce a word unless you see it written, attests to the fact that while growing up a language is a set of sounds one learns to reproduce, but once you are an adult this ability is somehow lost. Since the sound is per se "unrecognizable," the brain switches to a visual stimulus instead (the written word).

Back to the paper: as the specific auditory stimulus, the researchers used the syllable "da" presented in the context of "multitalker babble relative to a quiet acoustic background." The results:
"Bilinguals, relative to monolinguals, showed enhanced sub-cortical representation of the fundamental frequency of the speech sound as well as improved sustained selective attention."
Like music training, language learning impacts subcortical sound processing. The evidence collected by Krizman et al. suggests that because bilinguals constantly juggle sounds between two languages, they possess a neural enhancement in the way sound is encoded in their brains. As with musicians, the researchers claim, their auditory system is highly efficient, flexible, and better focused on sound processing.

I just want to add a personal note: we tend to think of language as a way to convey our thoughts, but would we have thoughts at all without a language? I didn't realize how lucky I'd been to learn English in my childhood until, as an adult, I tried to learn German. Languages do shape the way you think, and in order to learn a new language you truly have to change the way you're thinking. I still can't quite grasp the meaning of a German dependent clause until I get to the end of the sentence (a verb!!! please give me a verb!). It feels like one of those super-high stacks where I'm longing for a "pop operation" to happen. . . And once I have a verb I have to stop and reconstruct the sentence. It's not until you learn to think that way that you truly master the language.

I'm fascinated by the intrinsic relationship between thought, consciousness, and language. Feel free to chip in with your thoughts. In English, please. ;-)

[1] Krizman, J., Marian, V., Shook, A., Skoe, E., & Kraus, N. (2012). From the Cover: Subcortical encoding of sound is enhanced in bilinguals and relates to executive function advantages Proceedings of the National Academy of Sciences, 109 (20), 7877-7881 DOI: 10.1073/pnas.1201575109


Wednesday, May 16, 2012

Jumping genes and epigenetics


Speciation is the evolutionary process by which new species emerge. This can happen through several mechanisms: for example, a population can become geographically split and no longer interbreed. The two subgroups will then slowly form new species by random drift alone. In general, any change that forces a split in interbreeding will have a similar effect, such as a strong selective sweep or a chromosomal rearrangement.

Each chromosome is made of a molecule of double-stranded DNA. If the double strand breaks, mechanisms within the cell will attempt to repair the molecule. However, the repair is not perfect and can end up joining the wrong broken ends, in which case a new chromosomal rearrangement takes place. Rearrangements can arise through deletions, duplications, inversions, and translocations. If the change is such that it prevents a subgroup of the population from interbreeding with the rest, speciation may indeed occur.
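To make the four classes of rearrangement concrete, here is a toy sketch (my own illustration, not from the paper) that treats a chromosome as a string of gene labels and shows what each event does to it. The labels are arbitrary.

chromosome = "ABCDEFG"

deletion    = chromosome[:2] + chromosome[4:]                          # ABEFG     (CD lost)
duplication = chromosome[:4] + chromosome[2:4] + chromosome[4:]        # ABCDCDEFG (CD copied)
inversion   = chromosome[:2] + chromosome[2:5][::-1] + chromosome[5:]  # ABEDCFG   (CDE flipped)

# A reciprocal translocation swaps segments between two different chromosomes.
chrom1, chrom2 = "ABCDEFG", "tuvwxyz"
translocation = chrom1[:4] + chrom2[4:] + " / " + chrom2[:4] + chrom1[4:]  # ABCDxyz / tuvwEFG

for name, seq in [("deletion", deletion), ("duplication", duplication),
                  ("inversion", inversion), ("translocation", translocation)]:
    print(f"{name:>13}: {seq}")

If a rearranged chromosome can no longer pair properly with its original counterpart at meiosis, carriers and non-carriers may stop producing fertile offspring together, which is exactly the kind of reproductive barrier discussed above.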

In [1], Rebollo et al. discuss how transposable elements can induce speciation by causing chromosomal rearrangements. A quick refresher: a transposable element (TE) is a DNA sequence that can move from one place to another within the same genome. It can make copies of itself that are then inserted at different DNA locations. We "inherited" some of them when an ancient virus got "stuck" in our germ line and became part of our genome (see here and here).

Rebollo et al. note how epigenetic responses to environmental stimuli can trigger a burst of TE transposition, which in turn can induce chromosomal rearrangements. They therefore suggest that bursts of TE transposition might drive speciation in response to environmental changes.

TEs are intriguing because on the one hand they are highly mutagenic (they induce mutations) and in many cases have been associated with cancer and disease. On the other hand, if they were truly deleterious to organisms, you would expect them to slowly disappear from genomes. Instead, not only are they quite abundant in mammalian genomes, they seem to increase with genomic complexity, indicating that they may impart an evolutionary advantage.
"TE abundance, TE-derived genomic features and chromosomal rearrangements involving TE sequences are frequently lineage specific and, therefore, suggest that TEs have contributed to the process of speciation, either as a cause, or an effect."
"Significant TE activity is observed in several species, often during periods of radiation, suggesting that massive speciation and massive TE activity may be associated. The genetic distance between two organisms is calculated as a function of their genetic divergence, so every episode that creates divergence, such as lineage-specific transposition events, could contribute to the reproductive isolation of those organisms. TE patterns that differ between individuals of the same species, whether as a cause or a consequence of genetic differentiation, may not only provide genetic markers for researchers, but also constitute evidence of a speciation process occurring within the species concerned."
Transposons are strictly regulated by epigenetic mechanisms acting through different pathways, from noncoding RNAs to chromatin remodeling and DNA methylation. Interestingly, despite being very rigorous, this epigenetic regulation of TEs is also prone to variation. Its flexibility can allow TE transposition in the germline and, as a consequence, chromosomal rearrangements.
"Although the benefit is not immediate, transposition might have a long term advantage. Indeed, transposition bursts have numerous consequences, resulting in a renewal of genetic diversity, which is the major prerequisite for genome evolution and selection to occur."

[1] Rebollo, R., Horard, B., Hubert, B., & Vieira, C. (2010). Jumping genes and epigenetics: Towards new species Gene, 454 (1-2), 1-7 DOI: 10.1016/j.gene.2010.01.003


Friday, May 11, 2012

Flat tori in 3D


Note: re-edited thanks to Steven Halter's wonderful input. Please check his comments below. 

It's been a while since I've read a pure math paper, but when I saw the picture I knew I had to pick this one up. For the pure mathematicians out there: I haven't done pure math since my grad years, so feel free to pitch in and correct me if I misunderstood any of the following!

"Torus" is mathematics for donut. Take a very flexible square -- imagine it's made of rubber -- roll it, then glue together the circles at the two ends. Congratulations. You've made a torus.

Now suppose you live on the torus and you need a map that takes you from A to B. Think of an atlas that shows you all the streets and cities on the torus. How do you map the torus onto a flat surface so that you can actually hold the map in your hands? Well, you go back to that square you used to make the torus, right? Cut the donut open vertically, then horizontally, and you've got your square back, and now you can map all the streets you want.

Problem: the distances on your map will now be distorted, just like continent sizes are distorted on world maps (have you ever seen this image?). So now let's go back to the rubber square we've used to make the torus. Imagine you can move continuously from one edge to the opposite one, both vertically and horizontally. That's what mathematicians call a square flat torus. The problem we want to solve is the following: we want to map this flat torus into 3-D space with the additional constraint that all distances are preserved.
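To give a feel for what "all distances preserved" means here, below is a minimal sketch (my own toy example, not from the paper) of distances on the square flat torus: on the unit square with opposite edges identified, the shortest path between two points is allowed to wrap around either pair of edges.

import math

def flat_torus_distance(p, q):
    # Shortest distance between two points on the unit-square flat torus.
    dx = abs(p[0] - q[0])
    dy = abs(p[1] - q[1])
    dx = min(dx, 1 - dx)   # the short way may wrap around the left/right edges
    dy = min(dy, 1 - dy)   # ... or around the top/bottom edges
    return math.hypot(dx, dy)

a, b = (0.05, 0.5), (0.95, 0.5)
print(flat_torus_distance(a, b))   # 0.1: wrapping across the edge beats going 0.9 across the square

An isometric embedding in 3-D has to reproduce exactly these wrap-around distances, something the ordinary smooth donut of revolution cannot do.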

Well, it turns out, there's a famous theorem, the Nash embedding theorem, that states that any Riemannian manifold (replace that with the torus we were talking about) can be isometrically embedded into a Euclidean space. Isometrically here means in such a way that it preserves the distances.

The above theorem tells us that there's a way to map the flat torus into 3-D space while preserving the distances. Problem: how do we visualize it? See, that's always been my issue with math. It's so beautiful at telling you what exists and what doesn't, but then you get to the practical side, as in, "Okay, now give me such a map," and the mathematician shrugs and looks at you all weird: "I told you it exists, aren't you happy with that?"

Sorry, I'm joking, let's get serious again.

Here's the news: we now have a visualization of the flat torus (the square) in three-dimensional space. In the latest issue of PNAS, Vincent Borrelli, Said Jabrane, Francis Lazarus, and Boris Thibert present an isometric embedding of the flat torus in three-dimensional space. Forget the jargon and just look at the picture: how cool is that? And here is the best part: see all those corrugations in the figure? It's because the image is a "smooth fractal" surface, a sort of hybrid between a fractal and a smooth surface. The embedding is
"a continuously differentiable map that cannot be enhanced to be twice continuously differentiable. As a consequence, the image surface is smooth enough to have a tangent plane everywhere, but not sufficient to admit extrinsic curvatures."
Yes. I'm still a mathematician at heart, because I read this and got all excited. Of course, the rest of the paper went right past my head, but any of you willing to add a few more insights, you are more than welcome to do so in the comments below. Thanks! A few more details on the paper here.

Borrelli, V., Jabrane, S., Lazarus, F., & Thibert, B. (2012). From the Cover: Flat tori in three-dimensional space and convex integration Proceedings of the National Academy of Sciences, 109 (19), 7218-7223 DOI: 10.1073/pnas.1118478109


Thursday, May 10, 2012

Hijacking dendritic cells


Dendritic cells are antigen-presenting cells: their main function is to patrol in search of "foreign objects" (the antigens). When a dendritic cell finds an antigen, it "chops it up" into fragments that are then presented on its surface. In more technical terms, it takes up the antigen by either phagocytosis or receptor-mediated endocytosis and transfers it to the cytosol, where further degradation may occur via the proteasome. From here the resulting antigen peptides enter the endoplasmic reticulum and, ultimately, are exposed on the cell surface. At this point the dendritic cell migrates to the lymph nodes, which are rich in T-cells. The antigen fragments it carries on its surface are like red flags: once a T-cell recognizes a specific fragment, it gets "activated" and new T-cells with the same antigen specificity are created in order to mount an immune response against the invader.

Now, as you know, HIV preferentially infects T-cells. However, as a sexually transmitted virus, it first enters the body through the genital mucosa. How does the virus find T-cells from there? Easy. It hijacks dendritic cells and takes a ride to the lymph nodes, where the T-cells are.

But wait... the dendritic cell is supposed to kill the virus, not give it a ride...

Unfortunately, HIV has developed a mechanism that allows it to escape the usual degradation process inside the dendritic cell. It is capable of hitchhiking the dendritic cell without compromising its infectivity by residing "in an invaginated domain within the cells that is both contiguous with the plasma membrane and distinct from classical endocytic vesicles [1]." These are small membrane vesicles that are referred to as exosomes. They may be released in the extracellular milieu, following fusion of the multivesicular bodies with the plasma membrane. Once inside the dendritic cell, the virus can infect a T-cell via a mechanism called trans-infection, where the virus is passed from one cell to the other through the release/fusion mechanism of the exosome.

As you can imagine, these mechanisms are very interesting to study, because if we could block the "hijacking" of dendritic cells at the mucosal level, we could possibly stop the virus from spreading to the T-cells and initiating the infection. A team of researchers from Spain and Germany has studied this mechanism extensively. In a 2010 paper [1], Izquierdo-Useros et al. suggested that mature dendritic cell trans-infection could play an important role in augmenting "viral dissemination in the lymphoid tissue and significantly contribute to HIV disease progression." Mature dendritic cells encounter many T-cells every hour, with contacts that last several minutes, and as a consequence they have the potential to infect a broad number of T-cells. This could explain why productive HIV infection is more likely in subjects with a pre-existing sexually transmitted infection: the pre-existing mucosal inflammation could be responsible for the mobilization of a higher number of dendritic cells, which, in turn, could favor the spread of the HIV virus.

In a paper published last month [2], the research team showed that a particular class of lipids on the HIV surface favors the uptake of the virus into dendritic cells. These lipids, called gangliosides, are a group of glycosphingolipids comprising a ceramide linked to an oligosaccharide chain. They are basic components of the host cell's plasma membrane, and they get incorporated into the viral envelope (the outer shell of the virus) when a new viral particle buds out of the cell. Izquierdo-Useros et al. [2] used artificial virus-like particles to show that only viruses with these lipids present on their surface were able to get into the dendritic cells.

Together, these findings pave the way for novel strategies to block the spread of the HIV virus in the body, as well as for a possible dendritic cell based vaccine.

Edit: As I was browsing the latest PNAS issue, I noticed an independent paper that reports the same finding that glycosphingolipid GM3 on the HIV-1 envelope allows for viral capture by mature dendritic cells. I've included the citation below [3].

[1] Izquierdo-Useros, N., Naranjo-Gómez, M., Erkizia, I., Puertas, M., Borràs, F., Blanco, J., & Martinez-Picado, J. (2010). HIV and Mature Dendritic Cells: Trojan Exosomes Riding the Trojan Horse? PLoS Pathogens, 6 (3) DOI: 10.1371/journal.ppat.1000740

[2] Izquierdo-Useros, N., Lorizate, M., Contreras, F., Rodriguez-Plata, M., Glass, B., Erkizia, I., Prado, J., Casas, J., Fabriàs, G., Kräusslich, H., & Martinez-Picado, J. (2012). Sialyllactose in Viral Membrane Gangliosides Is a Novel Molecular Recognition Pattern for Mature Dendritic Cell Capture of HIV-1 PLoS Biology, 10 (4) DOI: 10.1371/journal.pbio.1001315.

[3] Puryear, W., Yu, X., Ramirez, N., Reinhard, B., & Gummuluru, S. (2012). HIV-1 incorporation of host-cell-derived glycosphingolipid GM3 allows for capture by mature dendritic cells Proceedings of the National Academy of Sciences, 109 (19), 7475-7480 DOI: 10.1073/pnas.1201104109
 
This post was chosen as an Editor's Selection for ResearchBlogging.org

Monday, May 7, 2012

More on the viruses inside us: retrotransposons


I've talked about jumping genes, and I've talked about endogenous retroviruses, so now it's time to combine the two and talk about retrotransposons.

Retrotransposons are a subclass of transposons, genetic elements that can make copies of themselves and, once copied, can "jump" to different places in the DNA (hence the name "jumping genes"). Retrotransposons in particular are first transcribed to RNA, and then the RNA is copied back to DNA through reverse transcription, just like retroviruses do.
"Close to half of the human genome is derived from retrotransposons replicating by the copy-and-paste mechanism used by exogenous retroviruses such as HIV. These genetic invaders are both essential motors of evolution and threats whose uncontrolled spread would be fatal to their host [1]."
There are so many intriguing facts to be mentioned about retrotransposons: we seem to have accumulated them through our evolutionary history, as they tend to increase with evolutionary complexity, going from 3% of the genome in yeast to 44% of the genome in humans. The active ones are responsible for a significant chunk of spontaneous mutations. They are quite abundant in plants, and in fact they were first discovered in maize. And finally, while many retrotransposons derive from endogenous retroviruses that got inserted into the germ line (as I explained last week), there's evidence that the opposite mechanism takes place as well: ancient retrotransposons in the genome acquired an envelope gene from a viral source and found an escape back to the viral world [2].

Pretty cool, right?

And here's the icing on the cake, which ties it all with yet another earlier post of mine:
"Retroelements are permanently inactivated during embryonic development so as to exhibit a transcriptionally silent state in adult tissues and in the germ line. Their tight regulation is important to prevent insertional mutagenesis and needs to withstand zygotic genome activation, which takes place at the two-cell stage, shortly after fertilization, as well as the ensuing DNA demethylation that is required for reprogramming [1]."
I know, it gets all very complicated, but if you go back to this post, and to the figure in that post in particular, you'll remember how cells "reprogram" themselves during embryogenesis. All epigenetic markers are reset so that cells can start fresh and differentiate into the various tissues needed to make a new being. Well, it turns out, retrotransposons are involved in this process too, and retroelements are silenced through cytosine methylation during embryonic development. The pathways involved are extremely complex and the various elements interact with one another in multiple ways, as "development itself is in fact controlled by temporal and spatial expression and repression of retroelements, and some cellular genes crucial during this period are co-regulated with ERVs and related entities [1]."


[1] Rowe, H., & Trono, D. (2011). Dynamic control of endogenous retroviruses during development Virology, 411 (2), 273-287 DOI: 10.1016/j.virol.2010.12.007

[2] Malik HS, Henikoff S, & Eickbush TH (2000). Poised for contagion: evolutionary origins of the infectious abilities of invertebrate retroviruses. Genome research, 10 (9), 1307-18 PMID: 10984449


Thursday, May 3, 2012

The viruses inside us


One of my first and still most popular posts was on endogenous retroviruses, or ERVs: these are viral sequences that got integrated into the host DNA and became part of the noncoding genome. With time, Mother Nature found a way to reuse some of these viral sequences, for example in the placenta, as I explained in that earlier post, and some of the viral proteins became expressed again.

I was at a conference last week, and one of the talks discussed the evolution of these endogenous viral elements (EVEs) and how they have become part of a co-evolutionary process. The speaker compared the phylogenetic trees of many EVEs across different species with the phylogenetic trees of the species themselves, and the trees were topologically similar, meaning that the viruses and their hosts have coevolved. In other words, whenever there has been a divergence event in the evolutionary history of a certain species, that event is also reflected in the evolutionary history of the virus hosted by that species. This is not surprising if you think about it: as the host evolves, the virus has to evolve too in order to survive (the Red Queen effect I talked about here).
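To give a concrete feel of what "topologically similar trees" means, here is a minimal sketch with toy trees and made-up labels (my own illustration, not the speaker's analysis). Each tree is written as nested tuples, and we simply count how many clades of the host tree are mirrored in the tree of the viral elements.

def clades(tree):
    # Return (leaf set, set of clades) for a tree written as nested tuples of leaf names.
    if isinstance(tree, str):            # a single leaf
        return frozenset([tree]), set()
    leaves, all_clades = set(), set()
    for child in tree:
        child_leaves, child_clades = clades(child)
        leaves |= child_leaves
        all_clades |= child_clades
    leaves = frozenset(leaves)
    all_clades.add(leaves)
    return leaves, all_clades

# Toy host tree and the tree of the endogenous viral elements found in those hosts.
host_tree  = ((("human", "chimp"), "macaque"), "lemur")
virus_tree = ((("EVE_human", "EVE_chimp"), "EVE_macaque"), "EVE_lemur")

_, host_clades  = clades(host_tree)
_, virus_clades = clades(virus_tree)
# Relabel each viral element with its host so the two trees share the same labels.
virus_clades = {frozenset(name.replace("EVE_", "") for name in c) for c in virus_clades}

shared = host_clades & virus_clades
print(f"{len(shared)} of {len(host_clades)} host clades are mirrored in the virus tree")

Full overlap is what you expect under long-term codivergence; branches that disagree would instead point to events like cross-species transmission.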

How did these endogenous viral sequences end up in our genome? In order to replicate, retroviruses undergo reverse transcription, which turns their RNA into DNA, and then the DNA gets integrated into the host genome. When this happens in a germline cell, the integrated viral DNA is replicated along with the rest of the host genome and passed through meiosis to the gametes; as a result, the viral genome is stuck there and gets passed on -- as a non-coding sequence -- to the offspring.

This explains the presence of endogenous retroviruses in our genome. More intriguing is how RNA viruses got there, given that those viruses replicate without getting integrated into the host genome. In fact, they never get transcribed into DNA. We still don't know how such viruses could have been integrated into the host genome, though one hypothesis is that reverse transcription (as a rare event) could have been carried out by the reverse transcriptase enzyme naturally encoded by cellular retroelements.
"The endogenous viral elements (EVEs) we know today must only be a small subset of those that have existed in the past; many others will have been lost by the chance process of genetic drift, which is the fate of most mutations at low frequency, even those that are selectively advantageous. [. . .] Other EVEs may have been removed by purifying selection because they reduce organismal fitness. In particular, human endogenous retroviruses are usually located in genomic regions away from genes, whereas the integration sites of (presumably recent) exogenous retroviruses are often close to genes, suggesting that there is a selective cost in having EVEs located too close to genic regions [1]."
Most endogenous viral elements are defective and hence are found in non-coding regions of the genome. However, as I discussed in my earlier post, it's not unusual for these sequences to find a new function and become expressed again. When this happens, the sequences could become advantageous to the host and hence get fixed in the population. For example, some endogenous viruses trigger protection in the host against similar exogenous viruses by interacting with the infecting virions and causing them to be defective.

These findings have greatly informed our understanding of viral evolution. Indeed, endogenous viral sequences represent a "fossil record" of past infections.
"The key point here is that once integrated into host genomes, EVEs cease to evolve with the very high substitution rates that characterize exogenous RNA and small DNA viruses and instead replicate using high-fidelity host DNA polymerases and probably experience fewer replications per unit time. This will result in a dramatic reduction in evolutionary rate, from the virus scale (usually around 10e-03 nucleotide substitutions per site, per year) to the host scale (~10e-09 subs/site/year)."
Basically, even though viruses evolve at a much faster rate than their hosts, once those sequences are integrated into the germline, from there on they evolve at the same rate as the host genome, which is much slower. That's how they become "fossils" compared to their exogenous counterparts. For example, studies looking at primate lentiviruses (such as SIV and HIV) have estimated the age of these viruses to be a few thousand years at most. However, endogenous lentiviral elements in lemurs indicate that these viruses have been circulating for over a million years. Additionally, there is evidence of selective pressure, derived from the fitness cost of viral infections, that also points to the antiquity of some viral families.
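To see how dramatic that rate difference is, here is a quick back-of-the-envelope sketch; the two rates are just the orders of magnitude quoted above, nothing more precise than that.

years = 1_000_000
virus_rate = 1e-3   # substitutions per site per year, typical exogenous RNA virus
host_rate  = 1e-9   # substitutions per site per year, typical host nuclear DNA

print(f"exogenous virus: ~{virus_rate * years:.0f} substitutions per site")   # ~1000: the original sequence is long gone
print(f"endogenous copy: ~{host_rate * years:.3f} substitutions per site")    # ~0.001: still essentially the ancestral sequence

After a million years the free-living virus has rewritten every site many times over, while the endogenized copy is essentially frozen, which is exactly what makes EVEs usable as a fossil record of ancient infections.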

[1] Holmes, E. (2011). The Evolution of Endogenous Viral Elements Cell Host & Microbe, 10 (4), 368-377 DOI: 10.1016/j.chom.2011.09.002
