string as vector of fixnums?
Date: Sat, 12 Dec 87 16:26:03+0900
From: Masayuki Ida <a37078%tansei.cc.u-tokyo.junet%utokyo-relay.csnet@RELAY.CS.NET>
A question on the history:
When trying to get a component of a string as a number, we must first
extract it as a character and then coerce it to a numeric type.
I think this is ridiculous.
This problem has bitten me several times when writing code for
data conversion among many machines, while trying to conform to Common Lisp 100%.
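The two-step conversion described above can be sketched as follows (a minimal example; the value 65 assumes an ASCII-based implementation, since CHAR-CODE's exact numbering is implementation-defined):

```lisp
;; Extracting a string element yields a character object, not a number:
(char "ABC" 0)              ; => #\A
;; A separate coercion step is needed to get the numeric code:
(char-code (char "ABC" 0))  ; => 65 in an ASCII-based implementation
```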
If you're doing code conversion, you probably don't want to be using
characters at all, but rather arrays of unsigned bytes. (I've often
said that the only way to write a portable file is with (UNSIGNED-BYTE 8)).
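One way to follow that advice, sketched with a hypothetical file name (the value 193 is the standard EBCDIC code for the letter A):

```lisp
;; Write, then read back, a file of raw octets.  With an element type
;; of (UNSIGNED-BYTE 8), no character-set interpretation can lose
;; bits or signal errors.
(with-open-file (out "scratch.bin"           ; hypothetical file name
                     :direction :output
                     :element-type '(unsigned-byte 8)
                     :if-exists :supersede)
  (write-byte 193 out))                      ; 193 = EBCDIC code for A

(with-open-file (in "scratch.bin"
                    :direction :input
                    :element-type '(unsigned-byte 8))
  (read-byte in))                            ; => 193, untranslated
```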
Consider reading a file of EBCDIC in a Lisp which only supports 7-bit ASCII
for its characters, for example. You could get errors or lose bits if you try
to read it into a string! (I doubt there are any implementations which only
store 7 bits in their strings, but they would be quite legitimate.) Certainly,
many EBCDIC codes may be undefined values for characters in an ASCII Lisp,
and the byte 65 is unlikely to really mean #\A in EBCDIC.
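To illustrate, a conversion program working over octets might translate with an explicit table; this sketch shows only a few entries, taken from the standard EBCDIC code assignments:

```lisp
;; A few EBCDIC code points translated explicitly to characters.
;; Note that 65 does NOT map to #\A: EBCDIC A is 193 (#xC1).
(defun ebcdic-byte-to-char (byte)
  (case byte
    (193 #\A)       ; EBCDIC A     = #xC1
    (129 #\a)       ; EBCDIC a     = #x81
    (240 #\0)       ; EBCDIC 0     = #xF0
    (64  #\Space)   ; EBCDIC space = #x40
    (t (error "Untranslated EBCDIC code ~D" byte))))

(ebcdic-byte-to-char 193)  ; => #\A
```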
(Historically, many dialects of Lisp could get a component of a string as
a number easily.
On Symbolics machines there are ART-STRING array types, with which we can
treat a character of a string as a number.)
This hasn't been true for a year. The types you describe were part of
old ZetaLisp, and do not exist in Release 7 at all.
I feel Common Lisp is a step backward in its treatment of characters.
Or perhaps there was a decision on this that I did not know about,
or CLtL leaves the issue implementation-dependent.
I feel that what Common Lisp does here is exactly the right thing.
However, I believe CL needs to add facilities for specifying the external
coding of character files, as part of OPEN.
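As a sketch of what such a facility might look like: the :EXTERNAL-FORMAT keyword shown below is what ANSI Common Lisp eventually standardized for OPEN, though :DEFAULT is the only format name the standard requires, and any richer names (an :EBCDIC format, say) would be implementation-defined extensions:

```lisp
;; :EXTERNAL-FORMAT controls how octets in the file are decoded into
;; characters.  :DEFAULT is the only portable value; the file name
;; here is hypothetical.
(with-open-file (out "note.txt" :direction :output
                     :if-exists :supersede
                     :external-format :default)
  (write-line "hello" out))

(with-open-file (in "note.txt" :external-format :default)
  (read-line in))  ; => "hello"
```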
Any opinions? I would appreciate it if someone could explain the
process or the history behind this decision.