Date: Aug. 7, 2000, 19:19 MET
To: TIME magazine
Subject: Michael Lemonick: Adventures in Antigravity (TIME, vol. 156, no. 6, p. 54)
Thanks to Michael Lemonick for calling attention to this important
aspect of astrophysics.
I've always been terribly frustrated by the one-track-mindedness
of those astrophysicists who insist on expecting a slowdown in
the expansion of our region of the universe. They are simply
hypnotized by the Single Big Bang theory although, in blatant
contradiction to the very fundamentals of physics, this requires
a miraculous creation of matter and energy out of nothing.
In an endless universe, in which Big Bangs and Big Crunches happen
all over the place all the time, the outer regions of the blow-up
of one Big Bang would inevitably experience external gravitational
attraction from matter of neighboring Big Bangs. What we perceive
as "negative" gravitation is the same old Newtonian "positive"
gravitation, but exerted by masses outside our own local Big Bang
blow-up.
The only remaining mystery, actually, is when and how a Big Crunch
inverts into a Big Bang. If astrophysicists hadn't preoccupied
themselves with that Single Big Bang red herring, they would probably
have solved this one long ago.
Best regards,
Waruno Mahdi
Postscript: printed in TIME of
Sept. 4, 2000 (Europe edition).
The inversion of a Big Crunch into a Big Bang has meanwhile been theoretically described by
Martin Bojowald;
see his publication:
Zurück vor den Urknall. Die ganze Geschichte des Universums
[Back before the Big Bang: The whole history of the universe].
Frankfurt am Main: S. Fischer (2009),
ISBN 978-3-10-003910-1
Date: Dec 6, 1997, 17:21 MET
To: EvolutionLanguage (mailing list)
Subject: Re: on humanness of language
> Sent by: R.G.
>
> > Sent by: Waruno Mahdi
> >
> > Yes, for some reason, people say "language" when they mean "human
> > language", but then think nothing of saying "machine language"
> > (which is quite "non-human").
>
> What is non-human about machine language? It is invented by humans
> for devices used/designed by humans. Humans input to the machine, the
> machine communicates in its own way (machine language) to other
> components, the machine outputs to varying degrees of user-friendly
> formats. All these are very human stories seems to me.
The stories are human all right (even very much so :-)), and the
machines are human-made and human-operated; so far, so good. It is the
"language" of these machines that is non-human, for many reasons.
The trivial ones are:
(1)
words in machine language, like in non-human animal signaling,
have strictly defined meanings; in human language they are
characterized by polysemy (as a consequence, machine language
and non-human animal signaling cannot tolerate
homonyms/homographs). In non-human animal and computer signaling
systems there is a finite, concrete number of well-defined meanings,
for each of which there is a defined expression. In human language
the meanings are not strictly defined and enumerable like that, and
there is no such strict tit-for-tat relationship between symbol and
symbolized (we have synonyms, metonyms, homonyms, figurative speech,
etc.).
(2)
machine language, like non-human animal signaling systems, cannot
change without loss of functionality. When a new version of a
machine language is introduced (a new dialect in animal signaling),
it cannot be used with older machines, whereas the new machines
cannot handle old programs unless the old language persists as a
subclass of the new version. Human language, on the other hand,
not only changes without stop, but each speaker is constantly
code-switching between several social dialects, including the
age-group dialect of his or her own generation, that of the previous
generation, and optionally a professional slang, a regional
dialect, etc. (S)he can also cope with situations in which
several of these dialects are mixed. Machines would go into a
stupor, from which they can only be extricated with a reboot.
Analogously, we may even understand a foreigner speaking broken
English, or a drunkard. Machines likewise play dead in such
situations. In some European languages you can say "he didn't
go, they went him" (meaning "they made him go"), although it is
just as ungrammatical there as it is in English. Machines don't
put up with such disdain for syntax.
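This intolerance is easy to demonstrate. A minimal sketch (in Python, chosen here only for illustration; the letter names no particular machine language): a single ungrammatical token makes the whole input unusable to the machine, whereas a human reader would simply skip over it.

```python
def machine_accepts(source):
    """Return True if the machine can parse the source at all."""
    try:
        # compile() is the "machine" here: it parses but does not run the code
        compile(source, "<input>", "exec")
        return True
    except SyntaxError:
        # one broken token and the machine rejects the entire input
        return False

print(machine_accepts("x = 1 + 2"))    # well-formed: accepted
print(machine_accepts("x = 1 + + +"))  # "disdain for syntax": rejected outright
```

Unlike the human listener, the parser makes no attempt to recover the obvious intended meaning; it simply refuses.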
(3)
machine language does not distinguish styles (archaic, poetic,
bookish, colloquial), and I don't think non-human animal
signaling systems do either, although it is imaginable that,
for the animals, signal quality may express emotionality
or some other condition.
So you see, machine language is much more animal (i.e. non-human)
than it is human. After all, you can also talk with your pet. In
fact, I've seen pet dogs and cats communicate with their owners much
more "human"-ly than my computer does with me (or anybody's computer
with anybody :-)). When compared with human languages alone, machine
language comes closest to artificial languages (e.g. Esperanto),
but this only lasts as long as one doesn't use such an artificial
language as a natural language. The moment one does, it will
transform into a normal flexible, variable, changing,
idiosyncratically irregular, dialect-diversifying natural language
(look what happened to Hebrew since the founding of Israel), because
only then does the "human touch" come in.
Those were the trivial points. The point that I see as being the
one of principle (not just the principal one) is:
(4)
every utterance in a human language first of all establishes
a social relationship between speaker and listener/reader, i.e.
it is an act of social communication; only secondly, and
optionally, does it convey some informative content that can be
inferred by a formal analysis of the code. Even when you are giving
a lecture, you are establishing yourself as lecturer and your
listeners as students, and, depending upon whether you strike a
more mentoring or a more jovial tone, you also indicate how you
would like to see this lecturer-student relationship. It doesn't
matter what percentage of your students understand your
lecture; they'll all understand the social part of the message.
Being impersonal in one's speech is not easy (or, when it comes
naturally, one should perhaps consult a psychiatrist). That is
perhaps one reason why speaking announcements into the intercom
requires appropriate training (try letting some untrained layperson
announce something over the intercom). In machine language it is
the other way round: it is the formal content that counts. The
flowers may be inserted after special "comment" signs for the
benefit of the (human) programmer. Such a "comment" sign indicates
to the machine that it should ignore everything that follows on that
line.
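For instance (a sketch in Python, which is only an illustrative choice; the letter names no particular language), the "#" comment sign:

```python
# Everything after the "#" sign on a line is ignored by the machine;
# it exists purely for the benefit of the human programmer.
total = 3 + 4  # dear reader, please forgive this trivial arithmetic
print(total)   # the machine only ever acts on the code itself
```

The social "flowers" in the comments never reach the machine at all; stripping them changes nothing about what the program does.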
And connected with this point is another: |
(5)
withdrawal from human language communication can lead to mental
depression. One form of mobbing (workplace bullying) is that none
of the colleagues speak with the victim anymore. But if nobody
uses a computer, it doesn't suddenly break down. Non-human animals,
in this regard, seem to be closer to humans than machines are.
The main reason why people consider machine language to be closer to
human language seems to be that both have syntax and, as a consequence,
the limited number of available words can be organised into more
or less complicated meaningful sentences, and varieties of these can
in turn be ordered in sequence to build lengthy monologues (programs).
But, although we do not seem to know of any non-human animal signaling
system with elaborate syntax, I don't think we should, as a matter of
principle, exclude the possibility of syntax in animal signaling at
sub-human levels. I don't know, for one, whether one can safely exclude
syntax in whale and dolphin signaling (I'd be grateful for comment from
biologists in the know). I also don't know what came first in human
evolution: syntax, or conscious social organization (other than by
biological instinct).
Apart from that, of course, machine language syntax differs from human
language syntax as indicated in (2). When Sapir said that all grammars
leak, he meant grammars in human language. Machines cannot cope with
leaking grammars. Theirs don't.
Regards to all, Waruno