
"fonted" characters in CL

I propose we eliminate char-font-limit and the associated concepts from
the language.  Either that, or make the specification considerably
tighter and more detailed, saying how >numeric< values are interpreted.
Or else specify some >symbolic< notion of "fonts" we can live with.

Symbolics found that the CL notion of "fonts" is not very portable, nor
is it very useful.  Therefore, char-font-limit => 1 and (char-font
<char>) =always=> 0.  This is in accordance with CLtL, but it doesn't
help us move code to other Common Lisp implementations, and it doesn't
help others port to our system.  
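Concretely, the behavior described above looks like this at a listener
(a sketch; the MAKE-CHAR result follows CLtL's rule that it returns NIL
when no character with the requested attributes exists, so the exact
behavior is implementation-dependent):

	char-font-limit			; => 1
	(char-font #\A)			; => 0  (always, on our system)
	(make-char #\A 0 1)		; => NIL -- no font 1 exists,
					; so "fonted" code won't port

Any program that stores information in the font attribute silently
loses it here, which is exactly the portability problem at issue.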

What we have done instead (note I don't want to push this, and I'm not
sure the development staff does either, since only beta-test sites have
seen this so far) is to define a character to have the following:
	A character set
	A code within the character set
	Bits
	A style
The character set and the code within it together correspond roughly to
char-code.  Bits are as per CLtL.  Style is a symbolic notion of what
the characters LOOK like, for example bold, italic, small, very-large,
fixed-width, etc., and combinations of these.  A "font" is a set of
glyphs; a font is selected by the triple of character set, style, and
output device.  There are probably some lies in this description; our
documentation is clearer and more verbose.
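The model above can be sketched roughly as follows.  The structure and
function names here are illustrative only, not the actual Symbolics
interface:

	(defstruct (styled-char)		; hypothetical
	  char-set	; e.g. :latin, :greek -- which character set
	  code		; position within that character set
	  (bits 0)	; control/meta/super/hyper bits, as per CLtL
	  (style nil))	; symbolic look, e.g. (:bold :italic :large)

	;; A font is a set of glyphs, reached only at output time:
	;; (select-font char-set style output-device) => font

The point of the sketch: no character carries a bare font >number<;
the glyphs are resolved from symbolic information plus the device.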

My point is that the current numbering scheme is a holdover from 1970's
text formatters (TJ6, R, etc.) and from the simplistic mapping of those
onto MIT Lisp Machine editor buffers.  The numbers in those systems are
relative to something; the numbers in CLtL aren't relative to anything.
Those ideas don't hold up in real production systems.