GCG v.10 - without databases?

Dr. Greg Quinn greg at franklin.burnham-inst.org
Fri Feb 12 19:40:24 EST 1999

Stuart, good point.
This idea of a centralized resource has been knocking around for a few years. Given
the very low cost of hard disk space, I haven't found providing space for the databases
to be a problem. I don't keep BLAST databases, since our users can perform a BLAST
search at the NCBI site in a fraction of the time it would take on our Solaris box. In
fact, I'd be interested to know how many sites do keep local GCG BLAST databases, other
than specialized ones; not that many, I would imagine.

I'm guessing that it won't be too long before there are a few commercially run centers
of bioinformatics tools and databases; buying access to one of these will be more
cost-effective than each site maintaining its own, and the service might also include
consultation on data-analysis results. Hmm, could be an opening for some adventurous folk ...

Stuart Brown (browns02 at mcrcr.med.nyu.edu) wrote:
: I've just upgraded our system from GCG 9.1 to GCG 10 and I'm getting the hang of
: the new programs.  To my mind, the most significant new program is NetFETCH.
: NetFETCH solves one of my most annoying long-term problems with GCG -
: inconsistencies in accession numbers etc. between results from BLAST/ENTREZ
: searches vs. FETCH from my local GCG database (updated nightly by a script that
: FTPs from GenBank).  Now users can use NetFETCH to easily grab a bunch of
: sequences found by BLAST and load them into their local directory for
: manipulation with other GCG tools.  Previously, this was one of my main
: arguments in favor of using FASTA in GCG vs. BLAST on the web.  [We got rid of
: local BLAST databases a long time ago.]
: Now I'm wondering how much longer it is going to be worthwhile for us to keep
: our local copies of GenBank, SwissProt, PIR etc.  Keeping these databases is
: one of the most significant expenses in the running of our computer center,
: forcing us to budget for a doubling of disk space each year, ad infinitum.
: Now that I think about it, if we didn't keep local databases, we wouldn't need
: to upgrade our server either.
: So, what would GCG have to do in order to create a fully functional program
: that did not require local databases?  There would have to be a remote server
: somewhere that did the FASTA/TFASTA/FASTX/TFASTX searches on GenBank (and its
: subsets), similar to the way GCG now handles remote BLAST searches at NCBI.
: This server would also have to handle the infrequent but processor-intensive
: FINDPATTERNS (and the new MOTIFSEARCH) searches as well.  Rather than the
: free-to-the-world NCBI BLAST server, I am thinking of a paid service.  How
: much would you pay...???  Well, certainly the price of my time for the
: maintenance of local databases, plus the cost of new hard drives, plus the
: cost of processor upgrades for our server....  This would have to cost more
: than the current "deluxe" database service.  But still, I think it would be
: worth it.
: Any thoughts?
: Cheers
: -Stuart Brown
: -- 
: Stuart M. Brown
: Bioinformatics Consultant
: NYU Medical Center, 550 First Ave, NY, NY
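For readers curious about the nightly-update arrangement Stuart describes, it could be sketched roughly as below. This is a minimal sketch only: the FTP host, directory, and flat-file naming scheme are assumptions for illustration, not a description of Stuart's actual script or the NCBI FTP layout of the time.

```python
# Hypothetical nightly GenBank mirror sketch.
# HOST, REMOTE_DIR, and the file-naming scheme are assumptions.
import os
from ftplib import FTP

HOST = "ftp.ncbi.nlm.nih.gov"   # assumed NCBI FTP host
REMOTE_DIR = "/genbank"         # assumed flat-file directory

def release_files(division, parts):
    """Build the numbered flat-file names for one GenBank division,
    e.g. gbpri1.seq.gz, gbpri2.seq.gz, ... (assumed naming scheme)."""
    return ["gb%s%d.seq.gz" % (division, i) for i in range(1, parts + 1)]

def mirror(files, local_dir="genbank"):
    """Fetch each flat file over anonymous FTP into local_dir."""
    os.makedirs(local_dir, exist_ok=True)
    ftp = FTP(HOST)
    ftp.login()                 # anonymous login
    ftp.cwd(REMOTE_DIR)
    for name in files:
        with open(os.path.join(local_dir, name), "wb") as out:
            ftp.retrbinary("RETR " + name, out.write)
    ftp.quit()

# Example (would hit the network, so not run here):
#   mirror(release_files("pri", 3))
```

Run from cron each night, a script like this keeps the local flat files current; reformatting them into GCG's database format would be a separate step.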

Computational Biology Group
The Burnham Institute
(formerly La Jolla Cancer Research Inst.)
10901 North Torrey Pines road
La Jolla
Phone:(619) 646 3103
Email: greg at franklin.ljcrf.edu

More information about the Info-gcg mailing list
