30 March 2015

Politburo Approved

Viral tropical diseases are really cool because they have great names, e.g. Dengue (a.k.a. breakbone fever) or Chikungunya ("that which breaks up").  The great thing about many viral diseases is that they are dependent upon proteases for many things. (And yes, I know how that sounds.)  Proteases have nice, well-defined active sites that you can fill quite well and shut them down. In this paper, the authors use fragment-peptide merging to inhibit the Dengue protease.

This is really an extension of previous work.  The original work used capped peptides bearing a warhead, with very good potency (down to 43 nM).  They then investigated retro, retro-inverse, semiretro-inverse, and nonretro di- and tripeptides.  This led them to a tripeptide (Arg-Lys-Nle) carried through two generations of hybrids: first an arylcyanoacrylamide, then N-substituted 5-arylidenethiazolidinones (thiazolidinediones and rhodanines).  These second-generation hybrids had increased membrane permeability, in vitro binding, and in cellulo antiviral activity.  Based on docking, they decided to investigate Nle sitting in P1', in contrast to previous site preferences, and then to merge it with fragments from an optimized capping moiety.
1.  Starting Point Hybrid Peptide
The investigation of Nle replacements led to the phenylglycine molecule, with 4x greater affinity:
9.  Phenyl-glycine hybrid
They then chose three hybrids (including 9) and put two different caps on them:
Rhodanine Cap
Acrylamide cap

Compared to the benzoyl cap, the acrylamide was 2x better, while the rhodanine was 5x better.  But wait, doesn't the Politburo condemn all uses of rhodanines?  Of course not.  In this case, the rhodanine was selected through rigorous analysis, and the compounds show selectivity (worth noting, since the assay is fluorogenic).  The authors are perfectly aware of the general distaste people have for rhodanines and address the concerns. All of this together leads to the final compound (below).
This is a really nice piece of work: starting with a tool (covalent peptides) and working toward drug-like molecules with favorable properties.

25 March 2015

We read these papers so you don't have to

Glycogen phosphorylase is one of those systems that you hear about all the time; it was the first allosteric enzyme discovered.  It's been discussed here and here previously on this blog.  It is a ubiquitous enzyme and has been the subject of a lot of research looking for allosteric modulators.  The majority of allosteric inhibitors are heterocyclic compounds with a well-known history.  This paper wants to add to that history.
The authors start with what appears to be a dreadful understanding of what fragment-based hit generation is.
"Lead-like discovery refers to the screening of low molecular weight libraries with detection of weak affinities in the high micromolar to millimolar range".
Maybe it's just me, but we've been over this before.  Lead-like molecules, as Kubinyi showed, are large and decorated; fragments are not.  So they got the low-molecular-weight thing right, but the name of the method wrong. Maybe an error in the proofing...
Building on previous work, they chose a 21-member heterocycle library (Figure 1) to investigate a morpholine-based peptide mimetic.
Figure 1.  Fragment Library
Activity was determined by an enzymatic assay with a maximal compound concentration of 222 µM.  They also tested 22, 56, and 111 µM, leading to Table 1 and some crazy SAR (N-Boc-ing 8 yielded 9, with a >200x gain in potency).
Table 1. 
The key compound is 7, with an IC50 of 25 µM, while 6 (7 minus the methyl ester) is 1000x less potent.  Strange things are afoot at the Circle K.  They then docked 7 (and a few other "second tier" compounds).  They see "moderate" binding for all compounds; yet one of these compounds is more than 50x more potent than the others.  We've been down this road before...

23 March 2015

Rad fragments revisited

Two years ago we highlighted a paper in which Cambridge University researchers identified fragments that bind to the protein RAD51, which in turn binds to the protein BRCA2 to protect tumor cells from radiation and chemotherapeutics. In a new paper in ChemMedChem, Marko Hyvönen and colleagues describe how they have grown these fragments into low micromolar binders.

One of the best fragments identified in the previous work was L-tryptophan methyl ester (compound 1), so the researchers naturally tried substituting the methyl group. A phenethyl ester (compound 5c) gave a satisfying 10-fold boost in potency, but this turned out to be the best they could get: shorter or longer linkers were both less active, and modifications around the phenyl ring gave marginal improvements at best. Also, changing the ester to an amide decreased affinity. They were, however, able to improve potency another order of magnitude by acylating the nitrogen (compound 6a).


At the same time, the researchers made a more radical change to the initial fragment by keeping the indole and replacing the rest with a sulfonamide (compound 7a). This also boosted potency. Further optimization of the sulfonamide substituent improved the affinity to low micromolar (compound 7m) and increased ligand efficiency as well.

The original fragments had been characterized crystallographically bound to the protein, but the researchers were unable to obtain structures of the more potent molecules, though they did sometimes see tantalizing hints of electron density. Competition studies with known peptide inhibitors also suggested that the molecules do bind in the same site as the initial fragments.

The thermodynamics of binding were characterized using isothermal titration calorimetry (ITC). Although the initial fragments owed their affinity largely to enthalpic interactions, the more potent molecules were more entropically driven. This, the researchers suggest, could partially account for the failure of crystallography despite extensive efforts: the lipophilic molecules can bind in a variety of conformations.
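As a reminder of the bookkeeping here (standard thermodynamics, nothing specific to this paper), the observed binding free energy partitions as

\[
\Delta G = \Delta H - T\Delta S = RT \ln K_d ,
\]

so two compounds with identical Kd values can have very different enthalpic and entropic components, and optimization that improves Kd can simultaneously shift the balance from ΔH toward −TΔS, as it did here.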

Some have argued that enthalpic binders should be prioritized, but this study illustrates one of several problems: even if you start with an enthalpic binder, there’s no guarantee it will stay that way during optimization.

This is a nice paper, but I do wonder how much affinity there is to be had at this site on RAD51. Given the micromolar affinity of the natural peptides, nanomolar small-molecule inhibitors may not be possible. Then again, like other difficult PPIs such as MCL-1, perhaps the right molecule just hasn’t been made. How long – and how hard – should you try?

18 March 2015

Mass Spec Screening in Solution

Mass spectrometry is a technique most people are familiar with as a QC tool.  It has also been demonstrated as a screening/validation tool.  Native mass spectrometry (nMS) has been discussed here, weak affinity chromatography (WAC) here, and hydrogen-deuterium exchange (HDX) here.  All of these methods have advantages and disadvantages.  A "new" method is ligand-observed MS screening (LO-MS).  [I put "new" in quotes because I know of at least one company that has been using this method for screening for years via a CRO.]

The concept of LO-MS is straightforward (Figure 1) and very similar to WAC.  A mixture of fragments (384 in this case) is mixed with the target (NS5B), incubated, and then ultrafiltered (50 kDa cutoff).  This eliminates the immobilization step required in WAC, preserving the native conformation.  The fragments were at 25 µM, while the target was at 50 µM.
Figure 1.  Fragments of MW 165 and 130 are binders; MW 162 and 150 are not.
Retained fragments are then dissociated with 90% methanol, and those showing intensity higher than the protein-minus control are considered binders (S/N > 10).  In their library, 5% of the compounds were not amenable to mass spec detection, but they were included anyway to increase the complexity of the mixture.  In the end, they got 20 binders in 20 minutes!  They repeated the screen with smaller mixtures (50 and 84 fragments) and found 12 binders (a subset of the original 20).  As a follow-up, they ran the binders by SPR, validating 10 of them (50%).  Five of these 10 gave usable crystals (observable electron density for the fragment) (50%).  They also show how the data can be used to generate Kd values (as with WAC).
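For the record, the binder-calling logic boils down to a simple comparison. Here is a minimal sketch in Python; the function and variable names are mine, and the paper does not spell out exactly how it defines S/N, so treat the noise estimate as an assumption:

```python
def call_binders(fragments, snr_cutoff=10.0):
    """Flag LO-MS binders: a fragment counts as a binder if its MS intensity
    in the protein-containing ultrafiltrate exceeds the protein-minus control
    with signal-to-noise above the cutoff (S/N > 10 in the paper).

    `fragments` maps a fragment ID to (intensity_with_protein, intensity_control).
    """
    binders = []
    for frag_id, (i_plus, i_minus) in fragments.items():
        noise = max(i_minus, 1e-9)  # assumption: use the control as background
        if i_plus > i_minus and i_plus / noise >= snr_cutoff:
            binders.append(frag_id)
    return binders

# Toy example matching Figure 1: MW 165 and 130 retained, 162 and 150 not.
hits = call_binders({165: (5.0e5, 2.0e4), 130: (3.0e5, 1.5e4),
                     162: (2.5e4, 2.0e4), 150: (1.0e4, 1.8e4)})
print(hits)  # [165, 130]
```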

This method raises some issues with me, but first let me say: it sure seems to work, and fast to boot.  The people I know who have used it to screen have been very happy.  Here is what bothers me: self-competition in the tube, as discussed here and here; it is a non-equilibrium method (the protein concentration varies during ultrafiltration); and it is an indirect method.  For me, I prefer methods that directly detect ligand-target interactions, like NMR, SPR, and nMS.

16 March 2015

Fragments vs p97

The protein p97 is important in regulating protein homeostasis, and thus a potential anti-cancer target. But this is no low-hanging fruit: the protein has three domains and assembles into a hexamer. Two domains, D1 and D2, are ATPases. The third (N) domain binds to other proteins in the cell. All the domains are dynamic and interdependent. Oh, and crystallography is tough. Previous efforts have identified inhibitors of the D2 domain, but not the others. Not to be put off by difficult challenges, a group of researchers at the University of California San Francisco (UCSF) led by Michelle Arkin and Mark Kelly have performed fragment screening against the D1 and N domains, and report their adventures in J. Biomol. Screen.

Within UCSF, the Small Molecule Discovery Center (SMDC) has assembled a fragment library of 2485 commercial compounds from Life Chemicals, Maybridge, and Asinex. These have an average molecular weight of 207 Da and 15 heavy atoms, with ClogP ~1.5. The researchers used both biophysical and virtual screening.

For the physical screening, the researchers started with surface plasmon resonance (SPR), with each fragment at 0.25 mM. This resulted in 228 primary hits – a fairly high hit rate. Full dose-response studies revealed that 160 of these fragments showed pathological behavior such as concentration-dependent aggregation or superstoichiometric binding. A further 30 showed weak or no binding, 13 were irreversible, and 5 bound nonspecifically to the reference surface, leaving only 20 validated hits, which were then repurchased.

The 228 primary hits were also assessed by STD NMR, each at 0.5 mM when possible (some fragments were not sufficiently soluble). Of these, 84 gave a strong STD signal, and 14 of these were also among the 20 SPR-validated hits.

The 20 repurchased fragments were further tested by both SPR and STD NMR, and 13 of them reconfirmed by both methods. The paper includes a table listing all 20 compounds, and one observation that struck me was the fact that all but one of the hits – which had dissociation constants ranging from 0.14 to 1.7 mM – are larger than the library average. Such results could argue for including larger fragments in libraries, though this goes against both molecular complexity theory as well as extensive experience at groups such as Astex.
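Since the paper reports both dissociation constants and sizes, it is worth converting to ligand efficiency to see why larger hits at these affinities are not necessarily a bargain. The conversion itself is standard; the heavy-atom counts below are illustrative assumptions, not values from the paper:

\[
\mathrm{LE} = \frac{-\Delta G}{N_{\mathrm{HA}}} = \frac{-RT \ln K_d}{N_{\mathrm{HA}}},
\qquad
K_d = 0.14\ \mathrm{mM} \;\Rightarrow\; \Delta G \approx -5.3\ \mathrm{kcal\,mol^{-1}}\ \text{at}\ 298\ \mathrm{K}.
\]

At that affinity, a 20-heavy-atom hit has LE ≈ 0.26 kcal/mol per heavy atom, below the usual ~0.3 rule of thumb, whereas a 15-heavy-atom fragment with the same Kd would come in around a healthier 0.35.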

Next, the researchers sought information on the binding sites. Three fragments could be competed by ADP, suggesting that they bind in the nucleotide-binding site of D1. To narrow things down further, the researchers turned to 13C-1H-methyl-TROSY NMR, in which specific side chain methyl groups of Ile, Leu, Met, Val, and Ala were labeled, and chemical shifts were examined in the presence and absence of fragments. Two of the proposed nucleotide-binding-site fragments showed shifts similar to those caused by AMP or ADP, further supporting a common binding mode (the third was too weak to test). This was not an easy experiment: the hexamer has a mass of 324 kDa, well above where most people do protein-detected NMR.
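That comparison amounts to asking whether a fragment perturbs the same set of methyl peaks as the nucleotide does. Below is a hedged sketch of that kind of analysis; the combined-shift weighting factor and cutoff are common conventions rather than values from this paper, and all names are mine:

```python
import math

def methyl_csp(d_h, d_c, alpha=0.25):
    """Combined 1H/13C chemical shift perturbation (ppm) for one methyl.
    alpha scales the 13C shift onto the 1H scale; ~0.25 is a common
    choice for methyls, but conventions vary (assumption)."""
    return math.sqrt(d_h ** 2 + (alpha * d_c) ** 2)

def shared_site_score(csp_fragment, csp_ligand, cutoff=0.02):
    """Fraction of methyls significantly shifted by a reference ligand
    (e.g. ADP) that are also shifted by the fragment -- a crude indicator
    of a shared binding site."""
    ref = {k for k, v in csp_ligand.items() if v > cutoff}
    frag = {k for k, v in csp_fragment.items() if v > cutoff}
    return len(ref & frag) / max(len(ref), 1)

print(shared_site_score({"I100": 0.05, "L200": 0.01},
                        {"I100": 0.06, "V50": 0.03}))  # 0.5
```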

Independent of all the biophysical screens, virtual screens were conducted using Glide XP, which suggested that the nucleotide binding site would be the hottest hot spot. Happily, all three fragments that appear to bind to this site scored highly in the in silico work, with two of these within the top 100 fragments. However, the binding sites for the other ten confirmed fragments remain obscure.

This paper serves as a useful guide for how fragment screening is performed on a tough target in a top-tier research group. Although difficult, it is not impossible to advance fragments in the absence of structure. While it remains to be seen whether that will be the case for any of these fragments, the researchers have provided a wealth of data for those who wish to try.

11 March 2015

The Sequel is Never as Good as the Original

We are living in a target-driven environment in Pharma, for both good and bad.  The low-hanging fruit have been plucked, and the high-hangers are tough.  But fragments have proven highly useful for liganding these targets.  One drawback of target-based screening is cellular activity: while it may be easy to generate good activity against the isolated target, in the end you need activity in the cell/animal.  Back in the good ole days, people just skipped the target and went straight into cells: compounds are put on bacterial plates, and the microbes die if the compound is antimicrobial.  This is the simplest example of phenotypic screening, the phenotype here being "dead cells". [For a discussion of the history of phenotypic screening, go here.]  Fragments could be the worst-case scenario for phenotypic screening, as fragment-target interactions are very weak and very commonly do not exert a biological effect.

In this paper from Rob Leurs and colleagues, including Iota, they describe a fragment-based phenotypic screening process.  This work is a follow-on to previous work from this group (discussed here), which I quite liked.  So, they have a target (PDEB1) but immediately follow their screening with the phenotypic part.  For the phenotypic screen, they used several different parasitic PDEs and an MRC5 cell line as a counter-screen. I won't bore you with the experimental details. The compounds recapitulate known molecules, like Benadryl.  Now, I really wanted to like this paper, at least from a process standpoint.  It appears to my eyes that all the compounds are pretty much equipotent and cytotoxic.  This is a really disappointing paper in that it doesn't really do anything new.  They had shown previously that you could get non-cytotoxic compounds with good inhibition of PDEB1; they didn't repeat that here.  There is no X-ray, though there was before.  The compounds are wholly uninteresting, and it stretches the imagination to see them as compounds "with a lot of potential to grow into antiparasitic compounds".

09 March 2015

Are PrATs privileged or pathological?

Pan assay interference compounds – PAINS – have received quite a bit of attention at Practical Fragments. In addition to being a fun topic, the hope is that publicizing them will allow researchers to recognize them before wasting precious resources.

But not all PAINS are created equal. Some, like toxoflavin, simply do not belong in screening libraries due to their tendency to generate reactive oxygen species. I would put alkylidene rhodanines in the same category due to their ability to act as Michael acceptors, their tendency to undergo photochemistry, and their hydrolytic instability. The nice thing about these sorts of molecules is that their clear mechanistic liabilities justify excluding them.

But things are not always so simple, and in a recent paper in J. Med. Chem. Martin Scanlon and co-workers at Monash University, along with J. Willem Nissink at AstraZeneca, describe their experiences with a more ambiguous member of the PAINS tribe: 2-aminothiazoles. (See here for In the Pipeline’s discussion of this paper.)

That 2-aminothiazoles (2-ATs) should be PAINS is not obvious: at least 18 approved drugs contain the substructure. Thus, it was not unreasonable to include 2-ATs in the 1137-fragment library assembled at Monash. But after screening 14 targets by STD-NMR and finding a 2-AT hit in every campaign, the researchers started to become suspicious. They gathered a set of 28 different 2-ATs and screened these against six structurally diverse proteins using surface plasmon resonance (SPR). Many of the 2-ATs bound to 5 of the proteins, and a couple bound to all six. The researchers used 2D-NMR (HSQC-NMR) to further characterize binding and found that the 2-ATs bind to multiple sites on the proteins rather than the desired one-to-one binding mode.

A common source of artifacts is the presence of reactive impurities, so the researchers resynthesized some of the 2-ATs and showed they behave the same, ruling out this mechanism. Solubility was also not a problem. Finally, the ligand-based NMR experiments revealed that the 2-ATs really did appear to be binding to the proteins, ruling out interference from unreacted starting materials or decomposition products.

One structure-activity relationship did emerge: acylation of the amino group dramatically reduced the promiscuity of the 2-ATs. However, in the case of 2-ATs with a free amino group, there was little meaningful SAR. Thus, the researchers propose calling these molecules PrATs, or promiscuous 2-aminothiazoles.

Further analysis of high-throughput screening data from the Walter and Eliza Hall Institute and AstraZeneca revealed that 2-ATs were also over-represented among hits. What’s spooky about this result is that most of the screens were done at 10 micromolar – far lower than typical fragment screens.
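Over-representation claims like this are usually backed by a simple contingency-table test. Here is a minimal sketch of how one might check such a claim; the counts are invented for illustration and are not the papers' actual numbers:

```python
from scipy.stats import fisher_exact

# Hypothetical counts: hits vs. non-hits, with vs. without the 2-AT substructure.
table = [[40,   960],      # hits:     40 contain a 2-AT, 960 do not
         [400, 98600]]     # non-hits: 400 contain a 2-AT, 98600 do not

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio {odds_ratio:.1f}, one-sided p = {p_value:.2g}")
```

A large odds ratio with a small p-value says the substructure turns up among hits far more often than library composition alone would predict, which is exactly the pattern the authors report.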

The researchers freely admit that they have no mechanism for why PrATs bind to so many proteins. I suspect there is something fundamental to be learned about intermolecular interactions here, though how to extract these lessons is beyond me. One gets the impression that the authors themselves have been burned by pursuing PrATs, as they conclude:
On the basis of our findings reported here and our unsuccessful attempts to optimize these fragments against different targets, we have removed 2-ATs from the fragment library.
This paper serves as a thorough, cautionary analysis. As evidenced by multiple approved drugs, PrATs can be advanceable, and we certainly won’t be PAINS-shaming papers that report them as screening hits. If you can advance one to a potent lead, then bless your heart. But be warned that this is likely to be even more difficult than normal.

04 March 2015

Way Down in the X-Ray Weeds

So, what I know about the details of crystallography can fit on the head of a pin... a small pin.  You put pure protein in multiwell plates and then run a huge matrix of crystallization conditions until tiny little crystals form.  Big crystals are best, but with recent advances in technology you can actually collect data from tiny crystals or seeds.  Then, through some wizardry (some sort of inverse transform) you turn spots into electron density; then, with willpower and what used to be SGI machines, you thread your protein sequence in, et voila, a model of the structure.  I typically don't go for methods papers in fields where I have almost no clue, but this one intrigued me.

This paper aims to increase the efficiency of soaking fragments into crystals to take advantage of third-generation synchrotrons. These machines/labs/setups/doohickeys use acoustic droplet ejection (ADE), which many people may already be aware of.  In this approach, each fragment is soaked into a protein crystal either directly on the data-collection medium or on a moving conveyor belt, which then delivers the crystals to the X-ray beam.  One source of inefficiency is the time required to soak the fragment into the crystals (for setups where the apparatus is inside the X-ray station; I have no idea what that means, but here is a google image search that might give you an idea).  A second is evaporative dehydration during the fragment soak.

Using the model systems lysozyme and thermolysin, they identified factors that can increase efficiency: namely, smaller crystals can be used to decrease the soak time.  By small crystals, they are talking about things that are 100 microns or less.  The authors go on to state that:
These techniques efficiently use fragment chemicals (~2.5 nL per screened condition), protein (~25 nL per screened condition), space (1120 screened conditions per standard shipping Dewar; no limits using a conveyor belt), and synchrotron beam time (less than 1 second/screened condition).  Evaporative dehydration of the protein crystal limits these fragment screening applications to systems where the fragment soak time is not prohibitive. Slow-binding compounds can be screened (without time constraint) in trays using ADE, but will consume significantly more resources such as purified protein and chemical compounds (~1 µl per screened condition). Hence, it is desirable to identify promising cases where the cost-efficient on-micromesh or on-conveyor soaking methods are adequate.
So, what did I really get out of this paper?  I am amazed by the miniaturization and automation that exists at synchrotrons.  It is really amazing. It's good to get out and read the literature of another field; it can be enlightening.  If I am reading this correctly, they can screen 1000 fragments with ~1 mg of protein (see the back-of-the-envelope arithmetic below).  With that said, how many people screen for fragments this way?  It seems not to be resource intensive, if you are sitting at a synchrotron.  But how much does it cost to sit at that synchrotron?  Are the problems called out here something people encounter every day, or is this a "First World problem" for those sitting at synchrotrons?
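That estimate follows directly from the paper's ~25 nL of protein per screened condition, assuming a typical crystallization stock of ~40 mg/mL (the concentration is my assumption, not a number from the paper):

\[
1000 \times 25\ \mathrm{nL} = 25\ \mu\mathrm{L},
\qquad
25\ \mu\mathrm{L} \times 40\ \mathrm{mg\,mL^{-1}} = 1\ \mathrm{mg}.
\]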

What really got me was that the majority of the authors are high school students and undergraduates. This emphasizes, to me, the commoditization of X-ray, and really of all services.  There is a very high level of training that goes into solving the structure; I get that.  But it seems that many of the steps are commodities, if you will.  When I was at Merck, they had a directive called (in some form) I-C-E: Innovation-Commoditization-Experimentation.  The concept was that the highly trained (and highly paid) scientists needed to focus on innovation.  Once innovation was achieved, it led to experimentation (figuring out how to run it routinely).  After that, it was a commodity and should be outsourced, to enable those scientists to go back to innovating.  It makes sense from a business standpoint, but it's scary from the scientist's standpoint.  I am all for full employment for scientists in industry (trust me on this), but outsourcing can co-exist with it. Look at the growth in providers from 2011 to 2014. Not sure where I am going with this, but food for thought.

02 March 2015

Fragments vs Factor XIa

Blood clotting is something we’re all familiar with, but the details are devilishly complex; lots of different proteins play a role. Physiologically this makes sense: the many components make for a finely tuned system, and you want clotting to happen when it needs to and then stop. Start too late and you might bleed to death. Start too early (or don’t stop) and you could develop a fatal clot. Not surprisingly, lots of things can go wrong, and many of the enzymes involved are drug targets. In a paper recently published in PLoS One, Ola Fjellström and colleagues at AstraZeneca describe their efforts on one of these.

Factor XI is involved in the “amplification phase” of coagulation, and the activated form (FXIa) is a potential antithrombotic and profibrinolytic target. A high-throughput screen had failed to find anything useful, so the researchers turned to fragments.

The team started with a computational screen of 65,000 in-house compounds with molecular weights < 250 Da. They used Schrödinger's Glide software and previously determined crystal structures of the protein. The top 1800 fragments were then tested using ligand-detected NMR in pools of 6, with each fragment present at 0.1 mM. The researchers were trying to avoid strongly basic compounds, and they found 13 hits with calculated pKa < 9. Next, 600 structurally related analogs of the hits were screened, resulting in 50 hits total. These were then triaged using inhibition in solution (an SPR technique described here) and taken into crystallography trials. Two fragments gave high-resolution structures and were prioritized. Satisfyingly, the two fragments bound as had been predicted by the initial virtual screen.

Fragment 5 was particularly interesting because it had never been observed as a hit in the S1 pocket of a serine protease. Many enzymes in the coagulation cascade share a conserved S1 pocket. This has a predilection for highly positively charged species, so the neutrality of this fragment was attractive.
Separately, the team found a Bristol-Myers Squibb patent application describing compound 9, which they made and characterized crystallographically. The structure suggested merging a portion of compound 9 with fragment 5, and the resulting compound 13 turned out to be one of the most potent FXIa inhibitors reported.

To better understand the system, the researchers took a deconstruction approach to compound 9, testing the portion (compound 15) that had been used in the merging. This bit has low affinity by itself. Yet, when linked to fragment 5, the resulting compound 13 binds roughly 200-fold more tightly than simple additivity would predict. Similarly dramatic fragment deconstruction results have been reported previously for the related enzymes factor Xa and thrombin.
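To put that 200-fold in energetic terms (just the standard RT ln conversion; the decomposition is the textbook way of bookkeeping linked fragments, not an analysis taken from the paper):

\[
\Delta G_{13} = \Delta G_{5} + \Delta G_{15} + \Delta G_{\mathrm{link}},
\qquad
\Delta G_{\mathrm{link}} \approx -RT \ln 200 \approx -3.1\ \mathrm{kcal\,mol^{-1}}\ \text{at}\ 298\ \mathrm{K},
\]

i.e., joining the two pieces contributes roughly 3 kcal/mol of binding energy beyond what the separate fragments provide, consistent with a well-preorganized linkage.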

Unfortunately compound 13 has fairly low membrane permeability, high efflux, and high clearance in rats, though preliminary SAR suggests this is the fault of the Bristol-Myers Squibb piece rather than the new fragment. At any rate, this is another nice example of using fragment screening to replace one portion of a known molecule with a new fragment.