Impact of HLA-Driven HIV Adaptation on Virulence in Populations of High HIV Seroprevalence

  • Rebecca Payne,
  • Maximilian Muenchhoff,
  • Jaclyn Mann,
  • Hannah E. Roberts,
  • Philippa Matthews,
  • Emily Adland,
  • Allison Hempenstall,
  • Kuan-Hsiang Huang,
  • Mark Brockman,
  • Zabrina Brumme,
  • Marc Sinclair,
  • Toshiyuki Miura,
  • John Frater,
  • Myron Essex,
  • Roger Shapiro,
  • Bruce D. Walker,
  • Thumbi Ndung'u,
  • Angela R. McLean,
  • Jonathan M. Carlson,
  • Philip J. R. Goulder

Proceedings of the National Academy of Sciences of the United States of America


Is Drug Rollout Reshaping Pathogenesis?

Here’s a provocative thought: as we roll out drugs to the sickest people first, are we selecting for weaker viruses, i.e., those that don’t make people sick and are thus less likely to be subjected to drug therapy? We don’t have direct evidence for this, but when we compare Botswana to South Africa, we see higher CD4 counts (healthier immune systems) per level of viral load (viral concentration) or viral replicative capacity (how well the virus grows in a lab). Perhaps related (perhaps not), we also see an increased burden of circulating HLA escape mutations. At the very least, this increased burden appears to have wiped out B*57’s ability to modulate relative viral control. Might it also have weakened the virus?

Citation and access

Impact of HLA-driven HIV adaptation on virulence in populations of high HIV seroprevalence

Rebecca Payne, Maximilian Muenchhoff, Jaclyn Mann, Hannah E. Roberts, Philippa Matthews, Emily Adland, Allison Hempenstall, Kuan-Hsiang Huang, Mark Brockman, Zabrina Brumme, Marc Sinclair, Toshiyuki Miura, John Frater, Myron Essex, Roger Shapiro, Bruce D. Walker, Thumbi Ndung’u, Angela R. McLean, Jonathan M. Carlson and Philip J. R. Goulder

Proceedings of the National Academy of Sciences, doi:10.1073/pnas.1413339111, December 2014.

Abstract

It is widely believed that epidemics in new hosts diminish in virulence over time, with natural selection favoring pathogens that cause minimal disease. However, a tradeoff frequently exists between high virulence shortening host survival on the one hand but allowing faster transmission on the other. This is the case in HIV infection, where high viral loads increase transmission risk per coital act but reduce host longevity. We here investigate the impact on HIV virulence of HIV adaptation to HLA molecules that protect against disease progression, such as HLA-B*57 and HLA-B*58:01. We analyzed cohorts in Botswana and South Africa, two countries severely affected by the HIV epidemic. In Botswana, where the epidemic started earlier and adult seroprevalence has been higher, HIV adaptation to HLA including HLA-B*57/58:01 is greater compared with South Africa (P = 7 × 10⁻⁸²), the protective effect of HLA-B*57/58:01 is absent (P = 0.0002), and population viral replicative capacity is lower (P = 0.03). These data suggest that viral evolution is occurring relatively rapidly, and that adaptation of HIV to the most protective HLA alleles may contribute to a lowering of viral replication capacity at the population level, and a consequent reduction in HIV virulence over time. The potential role in this process played by increasing antiretroviral therapy (ART) access is also explored. Models developed here suggest distinct benefits of ART, in addition to reducing HIV disease and transmission, in driving declines in HIV virulence over the course of the epidemic, thereby accelerating the effects of HLA-mediated viral adaptation.

Overview

This paper has generated a lot of press, so I’ll explain it a bit here. This was a project with Philip Goulder. As with all papers (but especially this one), it’s important to separate out the data from the interpretation. Generally, the former is more reliable than the latter; though the latter is way more interesting! So let me caveat this paper by saying the data are sound, and while I believe the interpretations, they are just that, so take them with a grain of salt.

OK, so here’s what we saw. Philip has lots of data from two key countries: Botswana, where the epidemic has been raging for a long time, but where drug therapy has been available for some time as well; and South Africa, which has a very high prevalence, but is about 10 years “younger” in terms of the epidemic, and which was very slow in rolling out drugs. So what differences do we see?

First, we see that, for a given CD4 count, the viral load (VL) and viral replicative capacity (VRC) are much lower in Botswana. For these purposes, you can think of CD4 count as a measure of immune health (it’s the concentration of CD4+ T cells in the blood), while VL and VRC are measures of virus “fitness” (VL is the concentration of virus in the blood, and thus provides a measure of how well the virus can grow in the local environment of the host, while VRC is a measure of how well the virus can grow in a controlled lab environment). That’s how we present it in the paper, but you can also flip it on its head: two viruses of similar fitness will yield healthier immune systems in Botswana than in South Africa. Overall, the correlation between VL and CD4 is reduced in Botswana compared to South Africa. Why would that be?

Second, we see that circulating viruses in Botswana carry a much higher burden of HLA escape to the local population than do viruses in South Africa. A striking example is the high prevalence of circulating B*57 escape, notable because B*57 is not protective in Botswana, even though it is in just about every other cohort anyone has looked at.
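To make the “decoupling” idea concrete, here’s a minimal sketch, with entirely made-up numbers (not the paper’s data), of what a weaker VL–CD4 correlation in one cohort looks like. The `coupling` parameter and all constants are invented for illustration:

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def simulate_cohort(n, coupling, seed):
    """Simulate (log10 VL, CD4) pairs. 'coupling' controls how strongly
    higher VL drags CD4 down; a lower value gives a 'decoupled' cohort."""
    rng = random.Random(seed)
    vl, cd4 = [], []
    for _ in range(n):
        v = rng.gauss(4.5, 0.8)                       # set-point log10 VL
        c = 800 - coupling * 100 * (v - 3) + rng.gauss(0, 80)
        vl.append(v)
        cd4.append(c)
    return vl, cd4

# Hypothetical cohorts: tight VL-CD4 coupling ("SA-like") vs loose ("BW-like").
vl_sa, cd4_sa = simulate_cohort(500, coupling=1.5, seed=1)
vl_bw, cd4_bw = simulate_cohort(500, coupling=0.6, seed=2)

r_sa = pearson(vl_sa, cd4_sa)
r_bw = pearson(vl_bw, cd4_bw)
print(f"VL-CD4 correlation, SA-like cohort: {r_sa:.2f}")
print(f"VL-CD4 correlation, BW-like cohort: {r_bw:.2f}")
```

Both correlations come out negative (higher VL, lower CD4), but the “BW-like” one is substantially weaker, which is the qualitative pattern described above.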

So how to interpret this? Are the VL/CD4 numbers and HLA escape related? We propose two explanations: (1) The high burden of HLA escape leads to lower-fitness viruses. This is weakly supported by our data here (high escape in Botswana; lower fitness), but that’s quite circumstantial. Many other studies, though, have shown a link between HLA escape and lower fitness. So this is plausible, though I am skeptical of this interpretation. Basically, I am a big believer in HIV’s ability to adapt: while single escapes reduce fitness, all escapes we’ve ever studied have compensatory mutations that rescue the virus. I suspect that, at a population level and over time, those compensations will be “fixed” and any fitness burden will be lost. Furthermore, as we noted in the paper, there was very little correlation between burden of escape and VL or VRC WITHIN either cohort.

So what other possibilities could there be? My guess is that it’s the rollout of drug therapy. National and WHO drug therapy guidelines generally target CD4 counts: that is, preferentially give drugs to patients with the lowest CD4 counts, as they’re the sickest. Now, once we give someone drugs, they generally stop transmitting the virus. Which is the worst possible outcome, as far as HIV is concerned. Thus, we are by definition placing evolutionary pressure on the virus to NOT reduce CD4 counts. Put another way, we are preferentially wiping out the “strongest” viruses (VRC is negatively correlated with CD4 counts). We can take this one step further by noting that high VL increases the odds of transmission. Although VL is related to CD4 counts, it’s a noisy relationship. Indeed, there are examples of individuals with high VL and high CD4 counts, and this is in fact what is observed in natural SIV infection. Thus, from first principles, I would argue that we should expect CD4-driven ART rollout to select for viruses that yield high VL and high CD4 counts. Which is to say, a decoupling of the linkage between VL and CD4; or, if you prefer, healthier immune systems for any given viral load. This is exactly what we’re seeing in Botswana relative to South Africa.
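The selection argument can be sketched as a toy generational model (all parameters invented for illustration, not fit to anything): each host’s CD4 count falls with the virus’s VRC, the fraction of hosts with the lowest CD4 counts start ART and stop transmitting, and the survivors seed the next round of infections:

```python
import random

def art_filter_generation(vrc_values, art_fraction, rng):
    """One 'generation': each virus's host gets a CD4 count that falls
    with VRC (plus noise); the art_fraction of hosts with the lowest
    CD4 start ART and stop transmitting. Survivors seed the next round."""
    scored = []
    for vrc in vrc_values:
        cd4 = 900 - 400 * vrc + rng.gauss(0, 100)   # CD4 falls with VRC
        scored.append((cd4, vrc))
    scored.sort()                                   # lowest CD4 first
    n_treated = int(art_fraction * len(scored))
    survivors = [vrc for _, vrc in scored[n_treated:]]
    # survivors transmit; resample with a little mutational noise
    return [max(0.0, rng.choice(survivors) + rng.gauss(0, 0.02))
            for _ in range(len(vrc_values))]

rng = random.Random(0)
pop = [rng.uniform(0.4, 1.0) for _ in range(2000)]  # initial VRC spread
start_mean = sum(pop) / len(pop)
for _ in range(15):                                 # 15 transmission cycles
    pop = art_filter_generation(pop, art_fraction=0.3, rng=rng)
end_mean = sum(pop) / len(pop)
print(f"mean VRC: {start_mean:.2f} -> {end_mean:.2f}")
```

Mean VRC in the transmitting population drifts steadily downward, because the highest-VRC viruses keep landing their hosts below the treatment threshold. This is only a cartoon of the selection pressure described above, not the paper’s model.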

So, two possible explanations. With two cohorts, we can’t prove which (if either) it is. Of the two, I favor the ART hypothesis, as it makes a very coherent evolutionary explanation, and fits nicely with modeling studies arguing that the virus needs to maximize odds-per-exposure of transmission (roughly, high VL) while simultaneously maximizing the number of exposures (roughly, avoiding death or drug therapy). We included some such modeling in our paper, courtesy of Angela McLean’s group. Here’s a nice review of such modeling approaches. Of course, this interpretation assumes there are viral genetic configurations that lead to high VL and high CD4 count. As I noted, there appear to be some examples of this in humans, but it’s possible those are driven by human genetics/environment, not viral genetics. If the virus doesn’t have “control” of this, then the effect may simply be to lower viral fitness at the population level (again, consistent with what we report here, though difficult to be sure with cross-sectional samples, as it could be that the cohorts were enrolled at different disease stages). If the virus is making a tradeoff (so to speak) of accepting a loss of fitness to avoid ART, then HLA-driven escape may provide a convenient substrate: the virus escapes from HLA anyway, and the added ART-derived selection pressure basically reduces the advantage of compensation or reversion within the host or following transmission.
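The tradeoff at the heart of those models can be written down in a few lines: per-contact infectivity rises with set-point VL while the duration of untreated infectiousness falls, so their product (total transmission potential) peaks at an intermediate VL. The functional forms and constants below are illustrative stand-ins, not taken from any particular study:

```python
import math

def infectivity(log_vl):
    """Per-contact transmission probability: rises with VL, saturating.
    Hill-type curve with made-up constants."""
    return log_vl ** 3 / (log_vl ** 3 + 4.5 ** 3)

def duration(log_vl):
    """Years of untreated infectiousness: falls as VL rises.
    Made-up declining exponential."""
    return 25.0 * math.exp(-0.5 * (log_vl - 2.0))

def transmission_potential(log_vl):
    """Expected transmissions ~ per-contact risk x time spent infectious."""
    return infectivity(log_vl) * duration(log_vl)

# Scan set-point log10 VL from 2 to 7 and find the optimum.
grid = [2.0 + 0.01 * i for i in range(501)]
best = max(grid, key=transmission_potential)
print(f"optimum set-point log10 VL ~ {best:.2f}")
```

With these toy curves the optimum lands at an intermediate set-point VL, neither maximal nor minimal, which is the qualitative point: selection on transmission alone need not favor the “strongest” virus.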

In any case, these data raise some interesting hypotheses that will need to be further tested. I can’t wait to see what new data show!

Principal collaborators

Oxford University

University of KwaZulu-Natal

In the news

This has blown up in the news. Here’s a non-exhaustive list.

It’s interesting how some have focused on the ART story, others on the immune escape story. Most have focused on the idea of HIV getting weaker over time. As I said above, I favor the ART story. Also, I’m not sure if “weaker over time” is the best way to look at it, or if it’s more useful to think of it as a decoupling of VL from CD4 (the ability of the virus to grow, from the ability of the virus to cause disease). My thinking is evolving on this. Will be interesting to see where new data take us.