"fonted" characters in CL
Date: Wed, 20 Aug 86 08:40 EDT
From: David C. Plummer <DCP@QUABBIN.SCRC.Symbolics.COM>
I propose we eliminate char-font-limit and the associated concepts from
the language.
I agree, vehemently.
Either that, or make the specification considerably tighter and more
detailed, saying how >numeric< values are interpreted; or else specify
some >symbolic< notion of "fonts" we can live with.
I don't think this is a good idea for CL, see more below.
Symbolics found that the CL notion of "fonts" is not very portable, nor
is it very useful.
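For reference, the "numeric" notion being criticized is CLtL's font
attribute on characters.  Roughly, it surfaces like this (a sketch
only; the actual values are implementation-dependent):

    char-font-limit      ; a constant, at least 1: how many numeric
                         ; "fonts" the implementation supports
    (char-font #\A)      ; font attribute of #\A, below char-font-limit
    (make-char #\A 0 2)  ; a character like #\A with bits 0 and font 2,
                         ; or NIL if no such character can be represented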
......
What we have done instead (note: I don't want to push this, and I'm not
sure the development staff does either, since only beta-test sites have
seen this so far) is to define a character to have the following
attributes:
A character set
A code within the character set
Bits
Style
The character set and the code within that set together correspond
roughly to char-code.  Bits are as per CLtL.  Style is a symbolic
notion of what the characters LOOK like: for example, bold, italic,
small, very-large, fixed-width, etc., and combinations thereof.  A
"font" is a set of glyphs.  A font is selected by the triple of
character set, style, and output device.
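To make that structure concrete, here is a rough sketch in plain
Common Lisp of the model just described.  The names are hypothetical
illustrations, not the actual Symbolics interface:

    ;; Hypothetical sketch only -- NOT the real Symbolics interface.
    (defstruct styled-char
      char-set     ; which character set, e.g. :latin
      code         ; code within that character set
      bits         ; control/meta/super/hyper bits, as per CLtL
      style)       ; symbolic appearance, e.g. (:bold :small)

    ;; A "font" is a set of glyphs, reached only through the triple
    ;; (character set, style, output device).
    (defvar *font-table* (make-hash-table :test #'equal))

    (defun find-font (c device)
      (gethash (list (styled-char-char-set c) (styled-char-style c) device)
               *font-table*))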
Let me dramatize for a second to make a point.  I don't see "size",
"rotation", "shading", "projection", "mask", "color", ...  Why don't we
just put TROFF into format macros and throw that in too?  That would
give us a full composition language....
Now don't get me wrong.  I agree completely with the MOTIVATION for
the new Symbolics way of representing characters.  It acknowledges
that characters are complex objects with lots of related
attributes, and that quick little hacks with font "bits" and
numbers up to some limit won't work for really high-quality output.
However, I see no reason for this to become part of a language
standard.  The users of Common Lisp are NOT primarily typographers.
High-quality screen output should be discussed w.r.t. a window system
standard.  (Note: there seems to be no de facto standard here.)  High-
quality printer output should be discussed w.r.t. an output device
standard.  (Note: there are two emerging standards here... PostScript
and Interpress.  Trademarks of somebody...)  Without these notions,
having "font" information is just clutter in the language.
Concretely: I think char-font-limit = 1 and (char-font <char>) = 0
should always be the case, and that ultimately we should drop
the font concept from the language.  (Like I said, I agree vehemently...)
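Spelled out with the existing CLtL functions, that degenerate case
amounts to (a sketch, not a test from any particular implementation):

    char-font-limit      ; => 1    only font number 0 exists
    (char-font #\A)      ; => 0    every character's font attribute is 0
    (make-char #\A 0 1)  ; => NIL  a character in any other font
                         ;         simply cannot be constructed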
...mike beckerle
Gold Hill Computers