Stem Cell Special Report: Germlines and gene-editing

After engineering SuperRat, is Superman far behind?

Early this year, Junjiu Huang and colleagues in China triggered a cascade of controversy when they decided to apply gene-editing technologies to human embryos, methods that had been extensively applied to embryos of other species. As they stated in the abstract of their paper in Protein & Cell, they wanted to better understand DNA repair mechanisms in early embryos as well as the efficiencies and fidelities of these methods—in this case, CRISPR—in preimplantation embryos.
 
“Because ethical concerns preclude studies of gene editing in normal embryos,” the authors wrote, acknowledging the slippery landscape onto which they were stepping, “we decided to use tripronuclear zygotes, which have one oocyte nucleus and two sperm nuclei.”
 
As discussed in the recent Out of Order commentary Asilomar remembered (May 2015 issue of DDNews), response to the publication was swift and loud, with several research organizations raising concerns over the ethical considerations and calling for a moratorium until the implications of such research are understood more fully.
 
In June, the Alliance for Regenerative Medicine added its voice, suggesting that this type of research should not be pursued at this time.
 
“Going forward, we encourage rigorous and transparent legal and policy discussions, as well as continued public debate about the science, safety and ethics of modifying human embryos or germline cells,” they wrote in a position statement that effectively echoed the ISSCR and comments made in op-ed pieces in Science and Nature.
 
Even the White House got into the act, issuing a comment through the Office of Science and Technology Policy.
 
“Research along these lines raises serious and urgent questions about the potential implication for clinical applications that could lead to genetically altered humans,” wrote John Holdren, director of the office and assistant to the U.S. president for science and technology. “The full implications of such a step could not be known until a number of generations had inherited the genetic changes made—and choices made in one country could affect all of us.”
 
A week earlier, the National Academy of Sciences and the newly named National Academy of Medicine announced a major initiative to guide decision making about human gene editing. Central to this launch was the call for an international summit this autumn—essentially a new Asilomar Conference—of leading scientists, ethicists and policy specialists to discuss the implications of this research.
 
Within the month, NAS/NAM announced the membership of its advisory group whose mission was to “identify and gather information and advice from the scientific and medical communities that will enable the academies to guide and inform researchers, clinicians, policymakers and the public.”
 
Perhaps not surprisingly, three members of that group were authors of the original summary statement arising from Asilomar in 1975: Paul Berg, David Baltimore and Maxine Singer.
 
There is no denying the ethical and moral implications of altering the germline, but realistically, how close are researchers to achieving that safely and effectively?
 
Making the cut
 
At their most basic, the three predominant gene-editing platforms—ZFNs, TALENs and CRISPR—share two components, explains Jochen Hartner, director of scientific project management at Taconic Biosciences.
 
The first is a nuclease, which introduces double-stranded breaks into DNA sequences with which it interacts. In the case of TALENs and ZFNs, the nuclease is a derivative of the FokI enzyme, while in CRISPR it is a microbial Cas9 protein. But these nucleases are non-specific, cutting anywhere in the genome.
 
“Of course, we want to control that to insert targeted mutations at specific sites,” says Hartner. “So you need a specificity component.”
 
For TALENs and ZFNs, this specificity component is a protein sequence that is subdivided into sequence-specific DNA-binding domains. For zinc fingers, the individual domains bind triplets of DNA whereas in TALENs, a separate domain targets each nucleotide.
 
Thus, with these two systems, every time you want to target a new gene sequence, you need to create a new targeting peptide. And despite the best efforts by system designers to turn these domains into so many genetic Lego bricks, the methods can be tricky for those less familiar with protein design.
 
And this is part of the reason why CRISPR made such a splash when it first became available. Rather than require a protein to identify the gene sequence, CRISPR uses a small guide RNA sequence homologous to the target DNA sequence to align the nuclease.
 
“It became as simple as designing PCR primers,” enthuses Namritha Ravinder, senior scientist at Life Technologies.
 
But even relying on homology doesn’t mean you can tackle any DNA sequence with CRISPR, cautions Eric Rhodes, chief technology officer of Horizon Discovery. CRISPR also requires a short triplet sequence called a PAM site—an NGG sequence, for example—to sit immediately adjacent to the region of homology.
 
“So, in terms of design density—the term that’s used for how often you can design per base—TALENs can at least theoretically be designed to every base in a genome,” Rhodes continues. “Zinc fingers are designed about one every 50 bp or so. And CRISPR is theoretically about one every 12 to 20 bp, depending on what PAM site you’re using and things like that.”
 
But even that reliance on a specific PAM site is changing for CRISPR, according to Ravinder, as several labs look to engineer or discover Cas9 proteins that will recognize alternative PAM sites.
 
“If you can remove that constraint, engineer those proteins to detect different PAMs, then that challenge of finding the right target site is eliminated,” she continues.
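To make the PAM constraint and the design-density numbers above more concrete, here is a minimal sketch, in Python, that scans the plus strand of a sequence for candidate SpCas9 targets, defined as 20 bases of protospacer followed immediately by an NGG PAM. The sequence, function name and lack of any off-target scoring are illustrative assumptions, not a real design tool.

```python
import re

def find_spcas9_sites(dna, guide_len=20):
    """Toy plus-strand scan for SpCas9-style targets: `guide_len` bases of
    protospacer followed immediately by an NGG PAM. Real design tools also
    scan the minus strand and score specificity and off-target risk."""
    dna = dna.upper()
    # A lookahead lets overlapping candidate sites be reported.
    pattern = re.compile(r"(?=([ACGT]{%d})[ACGT]GG)" % guide_len)
    return [(m.start(), m.group(1)) for m in pattern.finditer(dna)]

# Illustrative sequence only; in practice you would scan your locus of interest.
seq = "ATGGCTAGCTAGGACGTTACGGTTAGCATGCCGGTACGATCAGGCTAAGGGCATTACGG"
for pos, protospacer in find_spcas9_sites(seq):
    print(f"candidate guide at {pos}: {protospacer}")
```

Counting how many positions such a scan returns per kilobase, on both strands, is essentially the design-density calculation Rhodes describes.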
 
Once the selected DNA sequence has been cleaved, the cell’s natural repair mechanisms attempt to mend the break, using either non-homologous end joining (NHEJ) or homologous recombination (HR).
 
NHEJ is the molecular equivalent of duct-taping any two ends together, which makes it particularly useful for gene knockout experiments, as the repair frequently introduces small insertions or deletions that disrupt the gene.
 
To make specific changes like point mutations, however, a second fragment of DNA can be added that contains the change of interest as well as long regions of homology to the cut DNA. The cell then uses this fragment as a template to repair or swap out the cleaved sequence.
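To make the template idea concrete, here is a minimal sketch of assembling such a donor fragment, with homology arms flanking the replacement bases. The function name, coordinates and arm length are hypothetical assumptions; real donor design also has to keep the nuclease from re-cutting the repaired allele and respect the size limits of the delivery format.

```python
def make_hdr_donor(reference, edit_start, new_bases, old_len=1, arm_len=60):
    """Assemble a naive repair template: left homology arm, replacement
    bases, right homology arm. Illustrative sketch only."""
    left = reference[max(0, edit_start - arm_len):edit_start]
    right = reference[edit_start + old_len:edit_start + old_len + arm_len]
    return left + new_bases + right

# Example: introduce a single-base change at position 30 of a stand-in locus.
locus = "ACGT" * 20
donor = make_hdr_donor(locus, edit_start=30, new_bases="A")
print(len(donor), donor)
```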
 
The efficiency with which cleavage and repair take place can vary for a variety of reasons, including what platform you use, delivery efficiency, nuclease levels and even cell type (see also Factors impacting genome editing success and failure in the October 2014 issue of DDNews). Rhodes suggests that the average efficiency for the three methods is around 5 to 20 percent, with CRISPR being the most active, but the higher you push efficiency, he warns, the more likely you are to experience off-target issues.
 
For this reason, screening of successful clones is increasingly important.
 
“An ideal scenario would be if I as a researcher made a specific change and then I could sequence the whole genome,” says Ravinder. “Not just the gene sequences but the proteomics, gene expression levels, mRNA, metabolomics.
 
“Then you have a perfect delivery unit that you can administer into these for therapeutic purposes.”
 
Last year, researchers at the University of California, San Francisco took the opposite tack, reducing the levels of nuclease in their experiments to minimize off-target effects and looking for ways to identify rare mutational events against a high background of wild-type cells (see sidebar How’d they do that? below). As they described in Nature Methods, they combined sib-selection, a cell-enrichment technique more often used in yeast cultures, with droplet digital PCR and a TaqMan assay that used fluorescent probes for the mutant and wild-type alleles.
 
Combining the methods with either TALENs or CRISPR, they achieved an almost 300-fold enrichment of mutant iPS cells that maintained their pluripotency. And initial sequencing efforts suggested no off-target changes.
 
“The easy aspect of just cutting the gene is one thing,” says Ravinder, but the downstream characterization requires much more investment.
 
Humanizing models
 
Gene editing, whether with ZFNs, TALENs or CRISPR, has made great strides in most areas of stem cell research; for example, helping scientists convert cell lines from wild type to mutant while otherwise maintaining an isogenic background (see also Model citizens in the August 2013 issue of DDNews). These cells can then be used in screening assays against panels of potential drugs or to better understand the pathophysiology of disease.
 
The platforms are also seeing increased use in modifying model-organism genomes via the germline to create better models of human health and disease, in some cases by outright replacing the animal gene with its human equivalent.
 
“Traditionally, people have been able to engineer mice using mouse ES cell technology,” explains Rhodes. In this case, modified stem cells are injected into early-stage embryos, where they become part of the whole organism, creating a chimeric animal. Animals are then cross-bred to generate true-breeding strains carrying the modification.
 
Increasingly, however, researchers are skipping the ES cell step and instead are modifying the embryos or the germ cells directly. This cuts model development time considerably, says Rhodes.
 
In March, for example, Nam-Hyung Kim and colleagues at Chungbuk National University used CRISPR technology to examine the impact of knock-in and knockout modifications of several genes in preimplantation pig embryos. As they described in PLoS One, they could even insert genes for green fluorescent protein that would allow them to monitor gene expression during embryogenesis with live-cell imaging.
 
Similarly, Texas A&M University’s Mingyao Liu and colleagues at East China Normal University used direct injection of RNA into single-celled rat embryos to create rat strains carrying mutations in multiple genes via CRISPR. As they reported in Nature Protocols, they could generate knock-out and knock-in strains of rats within six weeks, from target design to identification of mutation-carrying founders.
 
At the same time, Hartner expresses concern about the hype surrounding these methodologies.
 
“People thought for a while that you could do everything by CRISPR in half the time at half the cost,” he says. “And that turned out to be not true.”
 
“You’ll probably hear that you can create knockout mice and rats in eight weeks, but that’s not really reflecting reality,” he continues. “You can get the first transgenic animals after a couple of weeks, but those animals are not ready to study.”
 
As he explains, these animals rarely carry the gene mutation on both alleles, and they may also be mosaics, carrying different alleles of the same genes in different cells of the body. Thus, several rounds of breeding will likely still be required to obtain non-mosaic, homozygous animals.
 
“If you really want to get to a mouse that is equivalent to one created by conventional gene targeting, you’re looking at about half a year until you have a mouse you can really do something with,” Hartner concludes.
 
Timelines aside, the goal is to build models of human disease in these animals, continues Rhodes.
 
“So you might introduce, for example, a particular cancer mutation that you would find in a human population into a mouse gene and then see if that gives those mice a propensity to develop cancer,” he adds. “Then you can try to use experimental treatments to treat that cancer on the animals to see if you can have a different effect.”
 
Again, Hartner agrees, but is cautious.
 
“We do see an increasing demand for more complex mouse models with more complex modifications,” he says. “It’s not just deleting a gene or inserting a point mutation. For instance, we replace an entire mouse gene with a human gene, so human-specific compounds can be tested on the human protein in the mouse.”
 
But, he continues, this remains tricky with gene-editing technology.
 
Horizon’s SuperRat, however, shows what can be accomplished.
 
“What we’re doing is systematically going in and knocking out the rat liver enzymes and replacing those with human enzymes,” Rhodes explains. “The rats are perfectly fine, but now they start to process drugs the way that a human would, even though it’s being done in the rat.”
 
With an eye to human clinical trials, Horizon hopes to leverage the SuperRat as a more human-relevant ADME-Tox screen than typical animal testing in that final step before first-in-human trials.
 
According to Rhodes, the company either uses the homologous recombination approach described earlier to modify rat genes to more closely resemble their human homologs, or it uses essentially the same technology to insert the human genes elsewhere in the rat genome. Once the human gene is in this safe harbor, as he describes it, they can then go and knock out or remove the rat gene.
 
“Once you’ve created one rat, you can do this sequentially or you can do it at the same time at a bunch of different genes, and then you can cross-breed all those rats,” he continues, explaining how you can mix and match the sequences within a model. “That’s another advantage of using the animal models.”
 
As these models continue to be developed, however, others turn their gazes to therapeutic applications.
 
Into humans
 
As gene-editing technologies move into humans, the logical first target is to modify cells ex vivo: removing them from a patient, correcting the genetic error or adding an enhancement gene, and reintroducing the modified cells into the patient, suggests Rhodes.
 
This has been a particularly hot topic of late in the area of immuno-oncology, where researchers are looking to add chimeric antigen receptors (CARs) to patient T cells in the hope of triggering an immune response to a tumor (see also Body, heal thyself in the June 2015 issue of DDNews).
 
In January, Novartis signed a five-year collaborative agreement with Intellia Therapeutics for just this purpose, leveraging the latter’s CRISPR experience to engineer CAR T cells (CARTs).
 
“CARTs and HSCs represent two of the most immediate opportunities for CRISPR therapeutic development,” said Intellia CEO and co-founder Dr. Nessan Bermingham in announcing the agreement.
 
French biotech Cellectis—the center of a gene-editing IP dispute (see sidebar Shifting IP landscape below)—is similarly following suit by leveraging its expertise with TALENs to develop CART products, but with a twist. Rather than use autologous cells (patient receives his or her own cells back), Cellectis is angling toward an allogeneic or off-the-shelf model where patients receive modified cells originally provided by healthy donors.
 
Sangamo, meanwhile, continues its work with Biogen to apply ZFNs ex vivo to CD34+ stem cells to better regulate hemoglobin production in patients with beta-thalassemia and sickle cell disease. The goal is to essentially shut down expression of the disease-causing adult hemoglobin gene while upregulating the fetal gene. In February, the FDA accepted Sangamo’s IND application to initiate clinical trials of this platform.
 
“We believe that a single treatment with SB-BCLmR-HSPC has the potential to provide a lasting therapeutic solution for transfusion-dependent beta-thalassemia with significant safety advantages over existing transplant therapies that involve hematopoietic stem progenitor cells from a matched related donor,” said Sangamo’s executive vice president of research and development, Geoff Nichol, in the announcement. “We know that elevated production of fetal globin can ameliorate disease symptoms of hemoglobinopathies such as beta-thalassemia. We are also developing this strategy for sickle cell disease.”
 
Still a bit down the road, suggests Rhodes, are the true gene therapies where you inject a gene directly into the body with the genome-editing machinery.
 
In a recent review published in Nature Medicine, Feng Zhang and colleagues at MIT discussed the relative merits of ex-vivo and in-vivo genome editing, suggesting the latter offered two potential advantages.
 
“First, in-vivo editing can be applied to diseases in which the affected cell population is not amenable to ex-vivo manipulation,” the authors wrote. “Second, in-vivo delivery has the potential to target multiple tissue types, potentially allowing for the treatment of diseases that affect multiple organ systems.”
 
Thus, they suggested, in-vivo treatment could likely be applied to a wider array of diseases than ex-vivo.
 
Rhodes is quick to point out, though, that with in-vivo editing, the clinical team quickly loses control over the system and cannot select appropriately modified cells from those with off-target modifications as one can with ex-vivo manipulation.
 
Zhang and colleagues acknowledged this as well, citing amongst other things the mutagenic potential of the nucleases.
 
“While delivery of nucleic acids and proteins are both capable of achieving transient expression in target cell types, protein delivery is likely to provide the best control over nuclease dosage, since there is no signal amplification,” they suggested.
 
Because of safety concerns like this, where modified tissues that might prove faulty or dangerous to patients cannot be retrieved, true gene therapy in this vein will likely require a significantly deeper understanding of, and improvements in, the efficiency and fidelity of gene-editing technologies.
 
And these are just the changes to somatic tissue, cells that can go no further than the patient in whom they reside and cannot be passed on to future generations. What, then, is the realistic likelihood of applying the same technology to germline cells and particularly embryos, both of which admittedly can be worked on ex vivo?
 
Much ado?
 
Although everyone agrees with the idea of holding discussions on the future of germline modification, not everyone agrees on how big a priority such conversations are. Ironically, one of the co-signatories of the Science opinion piece thinks that there are more pressing concerns with the research being done today.
 
Within weeks of the Science article’s publication, Hank Greely, director of Stanford University’s Center for Law and the Biosciences, published a follow-up opinion on the center’s blog. Discussing the thinking behind the Science piece, he ranked the issues surrounding genome editing, placing human germline modification as the least important.
 
“Frankly, although the fuss has been about human germline genomic modifications, I think that attention has been misplaced,” he offered.
 
For Greely, two simple practical issues preclude germline editing from becoming a serious issue: safety and medical demand.
 
“You’d have to be criminally reckless, or insane, to try to make a baby this way unless and until we’ve had a decade or more of preliminary research, with human tissues and with nonhuman animals (including certainly primates and maybe even some of the nonhuman apes), showing that it is safe,” he wrote. “If the moral risk isn’t enough of a deterrent, the potential legal liability should be.”
 
Likewise, given the wealth of other simpler methods to facilitate the birth of healthy children to parents carrying alleles of human disease, Greely just doesn’t see a very large demand for germline modification. And as to concerns about genetically engineered superhumans, he suggested that “it turns out that, after hundreds of billions of dollars spent, we know surprisingly little about the genetics of disease. We know almost nothing about the genetics of ‘enhancement’.”
 
Human somatic modification requires much more attention, according to Greely, if only because it is a practical and present reality. No one questions the opportunities available from such research, and perhaps because the biological impact of any given therapy reaches no further than the clinical subject, enthusiasm for this avenue of exploration is largely unencumbered by controversy.
 
Greely offers his greatest reservations, however, for the application of genome editing in nonhuman species where many of the natural clinical cautions are muted or eliminated.
 
“As it gets cheaper and easier to modify genomes, nonhuman genomes offer freedom from a lot of regulation, liability and political controversy, while offering plenty of opportunities to improve the world, become famous or make money—with combinations of all of the above,” he suggested.
 
But as he points out, the very real implication that such efforts could “reshape the biosphere” gives pause.
 
“As the ability to make carefully engineered genomic changes becomes more widely accessible, the possibility of insufficiently controlled or considered experiments increases dramatically,” he wrote. “And so, of course, does the chance of more controlled interventions. I would like to see much more focus on this issue, of great practical importance, instead of so much attention on the sexier issue of germline genome modification in humans.”
 
Regardless of his personal hierarchy of concern, however, Greely is still very much in agreement with the call for a germline-modification moratorium, at least until a meeting has been convened to discuss its implications.
 
Like him, we will just have to wait to see what arises from the summit later this year.
 

 
How’d they do that?
 
A challenge of nuclease-driven gene editing is that the antibiotic selection typically used to find edited cells can interfere with cellular phenotype. Without this selection, however, it can be difficult to identify the rare few cells that have been altered.
 
One way around this problem would be to increase nuclease activity to increase DNA cleavage, but this comes with the potential cost of increased off-target activity.
 
To address this challenge, Bruce Conklin and colleagues at the University of California, San Francisco combined droplet digital PCR (ddPCR) with sib-selection cell plating to identify and enrich rare gene-editing modifications promoted by TALENs and CRISPR, describing their efforts last year in Nature Methods.
 
Combining ddPCR with the TaqMan PCR assay, using fluorescent probes for both wild-type and mutant alleles, the researchers noted they could detect as little as 0.1 percent mutant gene in a wild-type background.
 
With sib-selection, a population of cells is sub-divided into duplicate plates to be tested for a mutation. The mutation-containing subpopulation is further subdivided and tested repeatedly until the rare event cell is purified.
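As a rough feel for why this works, the toy simulation below plates a population at about one expected mutant cell per well, scores every well (as ddPCR would) and carries the best well into the next round. The plating density, number of rounds and random seed are assumptions for illustration, not the published protocol, and the real experiment summarized below performed better.

```python
import random

def sib_selection(start_freq=0.00023, wells=96, rounds=3, seed=7):
    """Toy sib-selection: each round, split cells across wells, measure the
    mutant fraction in each well and keep the best well as the next round's
    starting population. Parameters are illustrative assumptions only."""
    rng = random.Random(seed)
    freq = start_freq
    for rnd in range(1, rounds + 1):
        cells_per_well = max(10, round(1.0 / freq))  # aim for ~1 mutant per well
        # Mutant count per well is a simple binomial draw in this toy model.
        best = max(
            sum(1 for _ in range(cells_per_well) if rng.random() < freq)
            for _ in range(wells)
        )
        best = max(best, 1)  # guard against a vanishingly unlikely all-zero draw
        freq = best / cells_per_well
        print(f"round {rnd}: best well {best}/{cells_per_well} (~{freq:.2%} mutant)")
    return freq

sib_selection()
```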
 
After just three rounds of sib-selection, the scientists found they could increase mutant allele frequency from 0.023 percent of the cell population to 6.8 percent, an almost 300-fold enrichment. Furthermore, characterization of two isolated mutant cell lines showed the cells maintained their pluripotency.
 
Selective sequencing suggested that there were no off-target changes, but the researchers acknowledged complete analysis would require whole-genome sequencing. Nonetheless, the results suggested that rather than isolating more than 2,000 clones to identify a single mutant, the new method would require the isolation of only 11 clones.
 
“As our approach is based on a 96-well plate format, we can easily multiplex mutagenesis so that a single technician could generate six to 10 lines at once,” the authors concluded. “Moreover, subsequent cloning is only attempted when successful [homologous recombination (HR)] has occurred, avoiding the potentially wasted effort of directly isolating iPS cell clones even when HR has not occurred.”
 

Shifting IP landscape
 
Even as companies providing gene-editing services or tools—or directly applying gene editing to discovery of therapeutics—continue to multiply, the intellectual property (IP) ground on which they have taken their stands has been shaken this past year.
 
In December, the U.S. Patent and Trademark Office issued a patent to Boston Children’s Hospital and Institut Pasteur that their licensing partner Cellectis suggests covers all extant gene-editing tools. The position immediately sent tremors through the industry, and most recently started rumors of a €1.6-billion takeover of Cellectis by Pfizer, which already owns a stake in the French biotech.
 
As the initial dust settled, however, a number of companies have taken a more wait-and-see position on the legal questions, moving ahead with existing projects or, in some cases, doubling down on gene editing.
 
“I don’t believe the Cellectis IP reads on CRISPR in any way, shape or form,” offers Horizon Discovery Chief Technology Officer Eric Rhodes, suggesting his company’s IP strength comes in its decision to do business broadly. “Horizon has tried to be very proactive in this space, so we’ve already licensed from the Broad, we’ve licensed from ERS Genomics, Caribou Biosciences and from Harvard University.”
 
For Taconic Biosciences’ Jochen Hartner, director of scientific project management, security also comes in not relying on a single technology platform.
 
“Gene editing has not really changed the demand for the type of models we generate because all of the models we generate today, we generate by conventional gene targeting in ES cells,” he says. “And in fact, many of those models cannot yet be generated by any of the gene-editing technologies.”
 
“In the worst case,” he muses, “if we did not have the legal capability to use the technology, we could still go back to conventional gene-targeting methodologies.”
 
The biopharmas have been similarly defiant.
 
Within a week of the Cellectis patent, Novartis announced its collaboration with Intellia Therapeutics to apply the latter’s CRISPR platform to develop therapeutics based on its chimeric antigen receptor T cells and hematopoietic stem cells. The five-year agreement will see the companies jointly advance multiple programs, including Intellia’s in-house HSC pipeline.
 
Later that same month, AstraZeneca formed agreements with the Wellcome Trust Sanger Institute, the Innovative Genomics Initiative, Thermo Fisher Scientific and the Broad Institute to apply CRISPR technology to various preclinical projects, including humanized disease models, target identification and screening.
 
“For the therapeutic people, they’ve got a long timeline here of probably 10 years before they are commercializing something,” offers Rhodes to explain biopharma’s apparent lack of concern. “I think they can afford to let some of this play out and once all of the chips have fallen, then we can figure out who actually needs to be dealt with.”
 
But even as the IP discussions occur, the landscape itself will continue to change. Thus, Hartner is somewhat sanguine about the tumult.
 
“By the time the CRISPR patent disputes have been settled, there might already be a new method on the horizon,” he suggests. “We should remind ourselves: who thought there would be TALENs when zinc fingers first appeared, and who thought there would be CRISPR at the time TALENs entered the stage?”

