I've just upgraded our system from GCG 9.1 to GCG 10 and I'm getting the hang of
the new programs. To my mind, the most significant new program is NetFETCH.
NetFETCH solves one of my most annoying long-term problems with GCG:
mismatches in accession numbers etc. between results from BLAST/ENTREZ
searches and my local GCG database (updated nightly by a script that
FTPs from GenBank). Now users can use NetFETCH to easily grab a bunch
of sequences found by BLAST and load them into their local directory
for manipulation with other GCG tools.
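The grab-a-bunch-of-sequences step can be scripted, too. Here is a minimal sketch that pulls accession numbers out of BLAST hit lines so they can be handed to NetFETCH or a fetch script of your own; the "gb|ACCESSION|LOCUS" identifier style is the NCBI convention, but the exact report layout varies between BLAST versions, and the sample hit lines below are made up for illustration.

```python
import re

# NCBI-style identifiers look like gb|U49845.1|SCU49845; capture the
# accession (with optional version suffix) between the pipes.
ACCESSION_RE = re.compile(r"gb\|([A-Z]+\d+(?:\.\d+)?)\|")

def accessions_from_blast(report_lines):
    """Return the unique accession numbers found in BLAST report lines,
    in the order they first appear."""
    seen = []
    for line in report_lines:
        for acc in ACCESSION_RE.findall(line):
            if acc not in seen:
                seen.append(acc)
    return seen

# Hypothetical hit lines, wrap and scores invented for the example:
hits = [
    "gb|U49845.1|SCU49845  Saccharomyces cerevisiae TCP1-beta   2661  0.0",
    "gb|M10051.1|HUMINSR   Human insulin receptor mRNA           180  2e-44",
]
```

The resulting list can then be fed, one accession per line, to whatever fetch program you use.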
Previously, this was
one of my main arguments in favor of using FASTA in GCG vs BLAST on the web.
[We got rid of local BLAST databases a long time ago.]
Now I'm wondering how much longer it is going to be worthwhile for us
to keep local copies of GenBank, SwissProt, PIR, etc. Keeping these
databases is one of the most significant expenses in running our
computer center, forcing us to budget for a doubling of disk space
each year, ad infinitum. Now that I think about it, if we didn't keep
local databases, we wouldn't need to upgrade our server either.
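That doubling compounds quickly. The starting size and disk price in this sketch are made-up numbers, but the shape of the curve is the point:

```python
def disk_budget(start_gb, cost_per_gb, years):
    """Cumulative disk cost when capacity must double annually.
    Illustrative arithmetic only; inputs are assumed figures."""
    size, total = start_gb, 0.0
    for _ in range(years):
        total += size * cost_per_gb  # to double, buy as much again as we have
        size *= 2
    return size, total

# With an assumed 10 GB footprint and $100/GB, five years of doubling:
size, total = disk_budget(10.0, 100.0, 5)  # -> (320.0, 31000.0)
```

Each year's purchase equals everything bought before it, so the annual disk bill doubles right along with the databases.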
So, what would GCG have to do in order to create a fully functional
program that did not require local databases? There would have to be a
remote server that did the FASTA/TFASTA/FASTX/TFASTX searches on
GenBank (and its subsets), similar to the way GCG now handles remote
BLAST searches at NCBI. This server would have to handle the
infrequent but processor-intensive FINDPATTERNS (and new MOTIFSEARCH)
searches as well. Rather than the free-to-the-world NCBI service, I am
thinking of a paid service. How much would you pay...??? Well,
certainly the price of my time for the maintenance of local databases,
plus the cost of new disks, plus the cost of processor upgrades for
our server.... This would have to cost more than the current "deluxe"
database service. But still, I think it would be worth it.
Stuart M. Brown
NYU Medical Center, 550 First Ave, NY, NY