Keyword extensions to Compile
Date: Sat, 17 May 1986 20:25 EDT
From: "Scott E. Fahlman" <Fahlman@C.CS.CMU.EDU>
... I think it is legal for you to extend your implementation in
this way as the language is currently defined ...
Let's stop talking "legal" for a while and talk "rational".
There are lots of gaping holes in the language, and it's one thing to agree
that a Common Lisp whose behavior is "suspect" is technically correct because
the manual doesn't contradict it. It's another thing altogether to suggest
that implementations not yet written should be encouraged to be equally loose
with the wording.
We want to be tolerant of those who took the manual at face value as a spec
they could implement faithfully and be proud of, but we also want to keep
in mind that there's much which we've learned since creating that spec and
it would be silly not to encourage people to use this new knowledge when
designing new CL systems.
It continues to amaze me that people have made "upward-compatible"
extensions to the core language. This just makes it hard to debug
portable programs. CLtL has primitives built in which allow extensions
to be made in a non-intrusive way and I think everyone should use
those in their implementation and leave the standard language alone.
Put another way, I think that every implementation should pick some other
package and put all their extensions on that package and leave the LISP
package pure. We've hashed this one out at length. If someone doesn't
think that there was overwhelming consensus that this was the right
thing, I'd like to hear from them.
VAXLISP, to pick an example at random, does not currently do this.
They throw lots of symbols in with the standard LISP package. I hope
they have plans to change that. Symbolics almost does it right by
having an SCL package for its extensions, but still doesn't go all the
way by shadowing the symbols in LISP which it intends to extend
incompatibly. I'm trying to get that changed.
My recommendation to anyone who wants to extend the argument spec of COMPILE
(or of any other standard function) would be simply to shadow COMPILE on
another package, FOO, and do
(DEFUN LISP:COMPILE (NAME &OPTIONAL DEFINITION)
(FOO:COMPILE NAME DEFINITION))
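As a sketch of the full setup being recommended (the package name FOO and the
:VERBOSE keyword are purely illustrative, not any vendor's actual extension),
the shadowing arrangement might look like this in Common Lisp:

```lisp
;; Hypothetical sketch: put the extended COMPILE on a separate package
;; that shadows the standard symbol, so LISP:COMPILE stays pure.
(make-package "FOO" :use '("LISP"))
(shadow 'compile (find-package "FOO"))

(in-package "FOO")

;; Extended version: accepts an illustrative :VERBOSE keyword in
;; addition to the standard arguments.
(defun compile (name &optional definition &key verbose)
  (when verbose
    (format t "~&Compiling ~S~%" name))
  (lisp:compile name definition))
```

Callers who say (FOO:COMPILE 'BAR :VERBOSE T) get the extension; callers of
LISP:COMPILE get only the standard behavior, and the portability boundary is
visible in the package prefix.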
This way, callers of LISP:COMPILE cannot possibly get non-standard features
in code that's intended to be portable. People who know about the extensions
can also know what package they need to use in order to make the extensions
available. Life becomes much simpler that way.
The problem with making upward compatible extensions is that someone might
accidentally use them and not remember the fact. They'd typically get no
diagnostics from their compiler -- in the case of functions (especially
in light of APPLY or the ability to have a variable evaluate to a keyword),
it's not possible to detect all such uses at compile time. The result is that
the first time they find out they're losing is when they port to another
dialect and it rejects the extended argument syntax. ... if they're lucky.
If they're not lucky, it doesn't reject the syntax. Maybe it thinks it
understands the extended syntax -- and if it thinks it does, maybe it's
right -- then again, maybe it's not. Maybe you eventually get an error
signalled -- or maybe you just get the wrong answer.
I'm not just making this problem up. It has already really happened to me
in real-life attempts to port things. It is a royal pain to debug. If we
don't take steps to alleviate it, we're doing users a disservice.
I've spent some time in-house talking to our software support staff about
problems they're going to come up against when dealing with CL customers
and the worst one is going to be the fact that customers using CL are going
to expect that porting CL code should be a breeze and are going to be quite
surprised at the difficulties they can encounter. In my opinion, that's
a fault of CL, not a fault of Symbolics or DEC or Lucid or whomever. In
the long run, every vendor is going to have some set of users who are upset
at the vendor because their program "which ran fine in some other implementation"
doesn't work in the new implementation. More often than not, it's going to
really be due to the fact that the "other implementation" was too forgiving
and let them run non-portable code. My experience says that the user will
blame the vendor who gave them the new, more restrictive implementation
for their headaches, not the one that let them run the bogus code in the
first place. If it's any vendor's fault that lossage arises, it's probably
the less restrictive implementation, not the more restrictive one. But
I don't think it's any vendor's fault at all. It isn't fair for the
individual vendors to take this heat because they've had no advice about
how to avoid this problem, or even that it's a problem to be avoided.
The fault belongs to the designers. It's time we started asking serious
questions about what portability is really about, and giving good advice
about how implementors should proceed in the future, before the problem
gets any worse.
Again, I'm not talking about declaring existing implementations to be "wrong".
I'm just talking about laying down guidelines for how people can and should
build better implementations in the future. For now it's just friendly advice.
Nothing legalistic involved... yet. That advice -should- be part of a later
spec, I think, and implementations which claimed to adhere to that revised
spec and still didn't heed that advice would at that time be declared to be
non-conforming.