CHI’s Drug Discovery Chemistry
meeting took place over four days last week in San Diego. This was easily the
largest one yet, with eight tracks, two one-day symposia, and nearly 700
attendees; the fragment track alone had around 140 registrants. On the plus
side, there was always at least one talk of interest at any time. On the minus
side, there were often two or more going simultaneously, necessitating tough
choices. As in previous years I won’t attempt to be comprehensive but will
instead cover some broad themes in the order they might be encountered in a
drug discovery program.
You need good chemical matter to
start a fragment screen, and there were several nice talks on library design. Jonathan Baell (Monash
University) gave a plenary keynote on the always entertaining topic of PAINS.
Although there are some 480 PAINS subtypes, 16 of these accounted for 58% of
the hits in the original paper, suggesting that these are the ones to
particularly avoid. But it is always important to be evidence-based: some of
the rarer PAINS filters may tag innocent compounds, while other bad actors
won’t be picked up. As Jonathan wrote at the top of several slides, “don’t turn
your brain off.”
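As an aside, if you want to check your own compounds, the PAINS substructures are built into RDKit's filter catalog. Here is a minimal sketch of how one might use it; the benzylidene rhodanine SMILES is just an illustrative member of a frequently flagged class, not a compound from Jonathan's talk, and as he would remind us, any flag still deserves a look by eye.

    from rdkit import Chem
    from rdkit.Chem.FilterCatalog import FilterCatalog, FilterCatalogParams

    # Build a catalog containing the PAINS A, B, and C filter families
    params = FilterCatalogParams()
    params.AddCatalog(FilterCatalogParams.FilterCatalogs.PAINS)
    catalog = FilterCatalog(params)

    # A 5-benzylidene rhodanine, chosen as an illustrative member of a
    # commonly flagged PAINS class (not a compound from the talk)
    mol = Chem.MolFromSmiles("S=C1NC(=O)C(=Cc2ccccc2)S1")

    match = catalog.GetFirstMatch(mol)
    if match is not None:
        # Report which PAINS substructure fired; the filter is evidence,
        # not a verdict, so inspect the structure yourself
        print("PAINS alert:", match.GetDescription())
    else:
        print("No PAINS alert")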
Ashley Adams described the
reconstruction of AbbVie's fragment libraries. AbbVie was early to the field,
and Ashley described how they incorporated lessons learned over the past two decades. This included adding more compounds with mid-range
Fsp3 values, which, perhaps surprisingly, seemed to give more potent
compounds. A 1000-member library of very small (MW < 200) compounds was also
constructed for more sensitive but lower throughput biophysical screens. One
interesting design consideration was whether fragments had potential sites for selective C-H activation to facilitate fragment-to-lead chemistry.
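For those keeping score at home, Fsp3 is simply the fraction of carbons that are sp3-hybridized, and it is easy to compute. Below is a short RDKit sketch over a few arbitrary fragments; the molecular-weight and "mid-range" Fsp3 cutoffs are purely illustrative, not AbbVie's actual criteria.

    from rdkit import Chem
    from rdkit.Chem import Descriptors, rdMolDescriptors

    # A few arbitrary fragments for illustration (not from AbbVie's library)
    fragments = {
        "indole": "c1ccc2[nH]ccc2c1",                      # flat aromatic, Fsp3 = 0
        "5-phenyl-2-pyrrolidinone": "O=C1CCC(N1)c1ccccc1",
        "4-hydroxypiperidine": "OC1CCNCC1",                # fully sp3
    }

    for name, smi in fragments.items():
        mol = Chem.MolFromSmiles(smi)
        mw = Descriptors.MolWt(mol)
        fsp3 = rdMolDescriptors.CalcFractionCSP3(mol)      # sp3 carbons / total carbons
        # Illustrative flags: a "small fragment" pool and a mid-range Fsp3 window
        small = mw < 200
        mid_fsp3 = 0.3 <= fsp3 <= 0.6
        print(f"{name}: MW = {mw:.0f}, Fsp3 = {fsp3:.2f}, "
              f"MW<200 = {small}, mid-Fsp3 = {mid_fsp3}")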
Tim Schuhmann (Novartis)
described an even more “three-dimensional” library based on natural products and
fragments. Thus far the library is just 330 compounds and has produced a very low
hit rate – just 12 hits across 9 targets – but even a single good hit can be
enough to start a program.
Many talks focused on fragment-finding methods, old and new.
We’ve written previously about the increasingly popular technique of microscale
thermophoresis (MST), and Tom Mander (Domainex) described a success story on the
lysine methyltransferase G9a. When pressed, however, he said it did not work as
well on other targets, and several attendees said they had success in only a
quarter to a third of targets. MST appears to be very sensitive to protein
quality and post-translational modifications, but it can rapidly weed out
aggregators. (On the subject of aggregators, Jon Blevitt (Janssen) described a
molecule that formed aggregates even in the presence of 0.01% Triton X-100.)
Another controversial
fragment-finding technique is the thermal shift assay, but Mary Harner gave a
robust defense of the method and said that it is routinely used at BMS. She has
seen a good correlation between thermal shift and biochemical assays, and
indeed sometimes outliers were traced to problems with the biochemical assay.
The method was even used in a mechanistic study to characterize a compound that
could bind to a protein in the presence of substrate but not in the presence of
a substrate analog found in a disease state. Compounds that stabilized a
protein could often be crystallized, while destabilizers usually could not, and in one project several
strongly destabilizing compounds turned out to be contaminated with zinc.
Crystallography continues to
advance, due in part to improvements in automation described by Anthony Bradley
(Diamond Light Source and the University of Oxford): their high-throughput crystallography
platform has generated about 1000 fragment hits on more than 30 targets. Very
high concentrations of fragments are useful; Diamond routinely uses 500 mM with
up to 50% DMSO, though this obviously requires robust crystals.
Among newer methods, Chris Parker
(Scripps) discussed fragment screening in cells, while Joshua Wand (U. Penn)
described nanoscale encapsulated proteins, in which single protein molecules
could be captured in reverse micelles, thereby increasing the sensitivity in
NMR assays and allowing normally aggregation-prone proteins to be studied. And
Jaime Arenas (Nanotech Biomachines) described a graphene-based electronic sensor
to detect ligand interactions with unlabeled GPCRs in native cell membranes. Unlike
SPR, the technique is mass-independent, and although current throughput is low,
it will be fun to watch this develop.
We recently discussed the
impracticality of using enthalpy
measurements in drug discovery, and this was driven home by Ying Wang (AbbVie).
Isothermal titration calorimetry (ITC) measurements suggested low micromolar binding affinity
for a mixture of four diastereomers that, when tested in a displacement
(TR-FRET) assay, showed low nanomolar activity. Once the mixture was resolved
into pure compounds, the values agreed, highlighting how sensitive ITC is to
sample purity.
If thermodynamics is proving to
be less useful for lead optimization, kinetics
appears to be more so. Pelin Ayaz (D.E. Shaw) described two Bayer CDK inhibitors bearing either a bromine or a trifluoromethyl substituent. They had similar biochemical affinities, and the bromine-containing molecule had better pharmacokinetics, yet the trifluoromethyl-containing molecule performed better in xenograft studies. This was ultimately traced to a slower off-rate for the trifluoromethyl-substituted compound.
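A back-of-the-envelope way to see why off-rate matters even when affinity doesn't: assuming simple 1:1 dissociation with no rebinding, target occupancy after free drug is cleared decays as exp(-koff*t), so residence time (1/koff) rather than Kd determines how long the target stays engaged. The sketch below uses made-up off-rates for two hypothetical analogs, not the actual Bayer numbers.

    import math

    # Hypothetical off-rates (per second) for two analogs assumed to have
    # similar Kd values; the numbers are illustrative, not the Bayer data.
    koff = {"Br analog": 1e-2, "CF3 analog": 1e-4}

    for name, k in koff.items():
        residence_time = 1.0 / k            # tau = 1/koff, in seconds
        half_life = math.log(2) / k         # occupancy half-life after washout
        occupancy_1h = math.exp(-k * 3600)  # fraction still bound one hour later
        print(f"{name}: tau = {residence_time:,.0f} s, "
              f"t1/2 = {half_life:,.0f} s, "
              f"occupancy after 1 h = {occupancy_1h:.1%}")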
The conference was not lacking
for success stories, including MetAP2 and MKK3 (both described by Derek Cole,
Takeda), LigA (Dominic Tisi, Astex), RNA-dependent RNA polymerase from
influenza (Seth Cohen, UCSD), and KDM4C (Magdalena Korczynska, UCSF). Several
new disclosures will be covered at Practical
Fragments once they are published.
But these successes should not
breed complacency: at a round table chaired by Rod Hubbard (Vernalis and
University of York) the topic turned to remaining challenges (or opportunities).
Chief among these was advancing fragments in the absence of structure.
Multiprotein complexes came up, as did costs in terms of time and resources
that can be required even for conventional targets. Results from different
screening methods often conflict, and choosing the best fragments, both when building a library and when triaging hits, is not always obvious. Finally, chemically modifying
fragments can be surprisingly difficult, despite their small size.
I could go on much longer but in
the interest of space I’ll stop here. Please add your thoughts, and mark your
calendars for next year, when DDC returns to San Diego from April 2-6!
Thanks for the summary, Dan. It's good to see that FBDD has become a mainstream approach, incorporated by most Pharma companies into their programmes, and that the discussion is now as much about the compounds generated by FBDD as about the methodologies used to identify hits.
The effect of using ligand mixtures in ITC titrations is an important lesson. Ronan O'Brien, Thomas Lundback and I described some of the issues in an old MicroCal application note ('In The Mix'), but I think it may not be currently available. In summary, when the different species bind with similar enthalpies, the titration curves (ligand-into-protein) can resemble simple 1:1 titrations, but the fitted Kd will be closest to that of the weakest-binding component. Ensuring compound purity (including stereochemical purity) is important for all biophysical assays (i.e. those that detect binding by observing a property of the protein rather than simply activity), not just ITC.