X, etc...

Sat Feb 16 13:31:00 EST 1991

        Recently some people have posted about the pros and cons of Unix. I
just wanted to point out that Unix is fine for software developers, but not
so much for users, at least in its raw form. This is the reason why X is
such a good choice.

        On the other hand, X places a considerable load on the network.
Someone also mentioned the difficulty of finding 68030 and 80386 computers,
as well as Ethernet networks. All I can say is that the first problem solves
itself. Just give a person the choice to use a faster computer and he will
curse his old machine and try to get the new one. In our Institute the job
was done by putting an SE/30, instead of an SE, as the server for the laser
printer. Now we have almost one computer per lab, and nearly 80% of them are
SE/30s. (This is somewhat more difficult for a 386, since it can't work at
its best under MS-DOS.)
        For the second problem, installing an Ethernet is not much more
expensive than installing other networks. This is JUST a question to be
decided by the computer staff. The LAN at the school of Medicine here is a
thick Ethernet to which several departmental AppleTalk subnets are joined.
That was the only easy choice when it was built several years ago. But
today YOU HAVE NO EXCUSE to select anything else other than small price
differences. Now it is as easy to connect through Ethernet as it formerly
was through small specialized networks.
        I would advocate trying to enforce the use of faster networks,
mainly because we must plan for research, not business. And in research,
especially in Molecular Biology, progress is so fast that you must plan at
least five years in advance, and that means planning for things that no one
can even dream of now. Building a slower network, sadly, is of no use at
all, since it will be unusable in a couple of years (like buying a
PC-Junior instead of an AT).

        My view is that in the near future there will be lots of fast LANs
joined through fast links (we are planning here to use 4 Mbaud links to
join several centers at the national level), with most researchers having
at least one currently powerful computer (say an SE/30 or Mac II, a 386 or
486, or a workstation), and surely one or two secondary machines for
general use (say word processing, drawing, controlling experiments, or
sequencing).

        Working in such an arena, grown from many different systems, will
require a common interface. Using all the machines efficiently will require
a distributed system capable of running processes on remote machines under
different operating systems. Storage will become a valuable resource, and
will need to be shared securely.

        I bet that X Windows, maybe Unix, Kerberos, NFS and so on may be a
good solution. But that only depends on us and our support. The more we
stress standardization, the sooner we'll arrive there and forget about
interconnection and communication headaches.
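As a rough sketch of how those pieces already fit together on a Unix LAN
today (all host, program, and path names below are hypothetical examples,
not a description of our actual setup):

```shell
# Let a remote host open windows on our local X display.
xhost +seqhost

# Run an X client on the remote machine, displayed on our screen
# ("analysis" and the host names are made-up examples).
rsh seqhost analysis -display mylab:0

# Share storage by mounting an NFS volume exported by the server
# (must be run as root).
mount -t nfs seqhost:/export/data /mnt/data
```

With Kerberos added on top, the rsh and NFS traffic can also be
authenticated instead of trusting host names alone.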

        On the other hand, someone recently asked about running his
molecular modelling software from a remote station. I don't think that is a
good idea unless you have a direct, fast connection and a good graphics
device. What I would advise is creating the model on the host, downloading
the structure, and manipulating it on the workstation, which should work
faster and give you an approximate idea of what you are doing. I know there
are some good modelling packages for the Mac II, though I have not tried
any of them yet. I hope to receive some soon; maybe then I could send an
overview. Or you could read Peter Markiewicz's summary.
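The download step of that workflow is just an ordinary file transfer; a
minimal sketch with ftp (host, account, and file names are hypothetical
examples):

```shell
# Fetch the structure computed on the host, then view it locally
# (-n suppresses auto-login so we can supply the account ourselves).
ftp -n modelhost <<'EOF'
user guest guest
binary
get model.pdb
quit
EOF
```

From there the structure file can be opened in whatever local modelling
package you have, with all the rotation and rendering done on your own
machine instead of across the network.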

                J. R. Valverde
        Biomedical Research Institute
                Madrid - SPAIN
