
free variable references in interpreter.

In NIL we had at least two years of experience with automatic special
declarations of free variables.  What used to happen was that a free
reference would, on its first occurrence, cause a warning message to
be printed and a special proclamation to be performed.  This was a
big pain and caused a fair amount of grief.

Now, what happens is that a free variable reference generates a
free-variable-reference error.  The result is that the user gets to
choose (via the debugger) to globally declare the variable special,
to use the special value "just this once" (i.e., the error will
occur again on the next reference), or to take any other general
debugger action.  It
seems fair to me that you should be warned about a free variable which
is not declared within the current lexical scope.  Remember that the
compiler cannot tell what dynamic scope the function will be called
in, so (in the interpreter) blindly and silently using whatever
dynamic value is around doesn't seem to me to be such a hot idea.
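To illustrate (this is standard Common Lisp syntax; the exact error
name and debugger interaction in NIL will of course differ):

```lisp
;; X is referenced free here: it is neither bound nor declared
;; special anywhere in the function's lexical scope.
(defun show-x ()
  x)

;; Under the scheme described above, evaluating (show-x) in the
;; interpreter signals a free-variable-reference error, and the
;; debugger then offers the choices described: proclaim X special
;; globally, use the current dynamic value just this once, etc.
(show-x)
```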


And, as MLY said, we also have special dispensation for the toplevel
null lexical environment to utilize free variables without this
interference, so that one can do things like
	(setq x ...)
without globally corrupting the usage of X.  (If you do want X
proclaimed special globally, use defparameter instead.)
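Concretely, the toplevel dispensation amounts to something like this
(illustrative, assuming the behavior described above):

```lisp
;; At toplevel, in the null lexical environment, this just assigns X
;; without proclaiming it special -- lexical bindings of X elsewhere
;; are unaffected.
(setq x 3)

;; If you do want X proclaimed special globally, say so explicitly:
(defparameter x 3)
```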

I would appreciate it if other people would refrain from describing
what NIL does when they are at all unfamiliar with the development
version.  It is unfortunate that the latest public NIL release came
out 1.5 years ago and the published benchmarks for some reason were
those for the version before that, so I don't need any additional
help in the spread of misinformation.