I would like to draw your attention to a session on the "Use of
Toxicological Information in Drug Design" at the upcoming ACS in Washington
DC (August 20-24, 2000). The session is being organized by the Chemical
Information (CINF) division. Below are the abstracts for the session,
beginning with the keynote address by Dr. Joseph Contrera, associate
director of the Office of Testing and Research at the FDA.
I hope to see you there.
1. Keynote Address: "Application of toxicology databases in drug development"
by Joseph F. Contrera and Edwin J. Matthews (FDA's Office of Testing and Research)
In drug development, the application of combinatorial chemistry and
high-throughput screening of compounds has resulted in an unprecedented increase
in the number of compounds identified with potentially desirable
pharmacological properties. The selection of lead compounds is currently
hampered by limitations in the available methods for assessing toxicity.
CDER files are a unique repository of the results of clinical and
non-clinical toxicology studies. With the major advances in computer and
information technology this unique scientific resource can be more
effectively used to improve the scientific basis of regulatory decisions and
product development. A current challenge is developing means to identify
useful relationships and insights from large datasets.
Under a cooperative research and development agreement (CRADA), the FDA CDER
modified and enhanced the capability of Multicase computational toxicology
software to predict the carcinogenic potential of molecules based on
chemical structure. Such software has potential regulatory and drug
development applications that can ultimately benefit the public health.
2. "Computational toxicology and virtual development in drug design"
by Dale E. Johnson and Grushenka H. I. Wolfgang (ddplatform LLC)
The attrition rate of lead compounds that are optimized to development
candidates and that eventually reach early clinical trials is still
alarmingly high. One key reason for the high failure rate is undesired or
unpredicted toxicity - either in animals or humans. Currently, a wide array
of predictive tools is being developed with the goal of improving lead
compound selection at the earliest stage. These new in silico approaches are
changing toxicology into a knowledge- and information-based science where
structure-toxicity relationships are beginning to be elucidated. Predicting
potential toxicity requires knowledge of both ADME and pharmacokinetic data
across species as well as an understanding of mechanisms of action and the
chemical species that elicits the initial toxic event. Although the toxicity
"bottleneck" is formidable, virtual approaches using simulation algorithms
are expected to accelerate and increase the success of early drug
development within the next 3-5 years.
3. "The paradigm shift from traditional to virtual"
by Stephen K. Durham (Bristol-Myers Squibb)
The intensely competitive global pharmaceutical business environment has
made the early and successful selection of viable drug candidates imperative
and forced the abandonment of the traditional development paradigm. The
enormous drug developmental costs necessitate the identification of
toxicologic liabilities of novel compounds during the initial and least
expensive phase of the discovery process. The incorporation of in silico
programs to predict toxicity, accompanied by aggressive model development,
will be a fruitful avenue worth pursuing in drug candidate evaluation.
4. "Application of Computational Toxicology (ComTox) and Multicase (MCASE)
Software to the FDA Mission"
by Edwin J. Matthews and Joseph F. Contrera (FDA's Office of Testing and Research)
Under a CRADA between FDA and Multicase, Inc., new ComTox software programs
have been developed that estimate chemical-toxic responses and dosages in
animals using animal toxicology studies, and in humans using pre-market
clinical-trial data and post-market adverse-event data for pharmaceuticals.
This talk will examine two software modules that predict potential
carcinogenicity and the maximum tolerated dose (MTD) in rats and mice, and
it will elaborate on the experimental parameters that account for the
program's high predictive performance and excellent coverage of
FDA-regulated substances. The parameters include: (1) large control data
sets (n>1000), (2) separate modules for each study cell (rat & mouse), (3)
the use of biological potency scales, and (4) a human expert system. The
talk will also review the current and future regulatory applications of
ComTox within the Agency.
5. "Data mining of toxic chemicals and database-based toxicity prediction"
by Jiansuo Wang and Luhua Lai (Peking University)
In the early stage of drug discovery, especially for computer-aided drug
design, a large number of molecules will be proposed as potential leads and
the bioactivity risk of these molecules is expected to be evaluated prior to
synthesis. Rule-based expert systems have been used for this purpose, while
mining the large body of available toxicological data provides another approach.
From the perspective of pharmacologists and toxicologists, toxicants are drugs
that cause vital harm. Therefore, the biochemical basis of toxic chemicals is the same
as that of drugs and there exist toxicant-receptor systems just like
drug-receptor systems. Under this notion, we introduce concepts and
technologies developed in drug design into the study of toxic chemicals,
and have conducted the following work.
I. We have studied the structural features of toxic chemicals in the RTECS
database that are associated with specific toxicities. Potential active
frameworks, groups, and structural patterns for specific toxicities are
obtained by computational chemistry approaches. These structural features
will be helpful for understanding the activities of toxic chemicals and
useful for predicting the toxicity of chemicals, especially in the early
stage of drug discovery.
II. We take a two-step strategy to explore noncongeneric toxic chemicals
from the database RTECS: the screening of structure patterns and the
generation of detailed structure-activity relationships. Judging from the
performance of the overall procedure, this stepwise scheme proves feasible
and effective for mining a database of toxic chemicals.
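The two-step scheme above can be illustrated with a minimal, self-contained
Python sketch. The SMILES strings, potency values, and the "N=O" alert
pattern below are invented for illustration (a real implementation would use
a SMARTS query and the actual RTECS records); step 1 screens for a structural
pattern, step 2 derives a crude structure-activity trend within the hits.

```python
# Toy sketch of the two-step mining strategy: pattern screening followed by
# structure-activity analysis. All data here are hypothetical placeholders.
RTECS_LIKE_DB = [
    ("CCN(CC)N=O", 9.2),       # contains the alerting pattern, high potency
    ("CCCCN(C)N=O", 8.7),
    ("CCO", 0.4),              # no alerting pattern
    ("CCCCCCN(CC)N=O", 7.9),
]

ALERT = "N=O"  # illustrative substring standing in for a real SMARTS query

def screen(db, alert):
    """Step 1: keep only molecules whose SMILES contains the alert pattern."""
    return [(smi, pot) for smi, pot in db if alert in smi]

def trend(matches):
    """Step 2: least-squares slope of potency vs. SMILES length
    (a crude stand-in for a real molecular-size descriptor)."""
    n = len(matches)
    xs = [len(smi) for smi, _ in matches]
    ys = [pot for _, pot in matches]
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

hits = screen(RTECS_LIKE_DB, ALERT)
print(len(hits), round(trend(hits), 3))
```

In this toy data the slope comes out negative, i.e. potency falls as the
molecules grow; the point is only the division of labor between a cheap
screening pass and a more detailed analysis restricted to the hits.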
III. We have developed a program, dbToxPre, as a database-based toxicity
predictor for chemicals. For a query molecule, the program first retrieves
a set of structurally related molecules by quick shape comparison against
the molecules in the toxicological database, then carries out a detailed
structure-toxicity-relationship analysis on that set and produces a
toxicity prediction for the query molecule. The program mainly includes four
parts: a) a fast and efficient clustering of molecules based on molecular
shape, b) field-based similarity computation of molecular structure based on
shape cluster, c) flexible CoMFA analysis of molecules based on shape
cluster, and d) a database of toxic chemicals suitable for such procedure.
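The retrieve-then-analyze idea behind dbToxPre can be sketched in a few
lines of plain Python. The descriptor vectors and potency scores below are
invented stand-ins for the shape and field descriptors in the abstract, and
the similarity-weighted average is a deliberately crude substitute for the
CoMFA analysis of part c):

```python
# Minimal sketch of a database-lookup toxicity predictor in the spirit of
# dbToxPre: retrieve the most shape-similar database molecules, then predict
# toxicity from them. All numbers are hypothetical placeholders.
import math

# Hypothetical toxicity database: descriptor vector -> toxic potency score.
DATABASE = [
    ([0.9, 0.1, 0.4], 2.1),
    ([0.8, 0.2, 0.5], 1.8),
    ([0.1, 0.9, 0.7], 0.3),
    ([0.2, 0.8, 0.6], 0.4),
]

def similarity(a, b):
    """Cosine similarity between descriptor vectors (a coarse shape match)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def predict_toxicity(query, k=2):
    """Step 1: retrieve the k most similar database molecules.
    Step 2: similarity-weighted average of their toxicity scores."""
    top = sorted(DATABASE, key=lambda e: similarity(query, e[0]), reverse=True)[:k]
    total = sum(similarity(query, desc) for desc, _ in top)
    return sum(similarity(query, desc) * tox for desc, tox in top) / total

print(round(predict_toxicity([0.85, 0.15, 0.45]), 2))
```

A query vector close to the first two database entries inherits a potency
near theirs; the real program replaces each stage with fast shape
clustering, field-based similarity, and flexible CoMFA as listed above.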
6. "In Silico Toxicology Screening of Estrogenic Compounds as Potential
by William J. Welsh (University of Missouri-St. Louis)
Specific estrogen-like compounds known as Selective Estrogen Receptor
Modulators (SERMs) have attracted tremendous interest in the pharmaceutical
industry and elsewhere as potential therapeutic agents for myriad
applications in medicine. We have employed an integrated array of
ligand-based and receptor-based approaches to design a large number of
estrogen agonists and antagonists spanning several chemical classes. As part
of this drug discovery program, we have developed and applied various in
silico strategies for rapid screening of these compounds in terms of their
toxicological and environmental effects. My talk will discuss growing
concerns about the possible environmental impact of estrogenic compounds. It
will also describe our computer-based models for predicting the biological
activity and toxicological profile for the novel set of compounds under
development in this laboratory.
Robert W. Snyder, Ph.D.
Director, Chemistry Marketing
MDL Information Systems
email: bobs at mdli.com
Never stop searching.