To build HOL on Poly/ML 5.9.1 on macOS M1, when the standard invocation fails with
poly < tools/smart-configure.sml
...
Making mllex
ld: LINKEDIT overlap of start of LINKEDIT and symbol table in '/private/var/folders/k5/_29n92td6yqcvzm2pf69n1n40000gn/T/polyobj.8335.o' in '/private/var/folders/k5/_29n92td6yqcvzm2pf69n1n40000gn/T/polyobj.8335.o'
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Failed to make mllex (Fail "Command failed")
the problem is in polyc. What to do:
Build Poly/ML as usual, except that in the ./configure invocation you set LDFLAGS:
$ LDFLAGS=-ld_classic ./configure
After that, invoke make in polyml/. This creates the file polyc.
Edit polyc so that LINK, a new LINKOPT line, and EXTRALDFLAGS read:
LINK="g++"                  # line 5
LINKOPT="-std=gnu++11"      # new line 6
EXTRALDFLAGS=-ld_classic    # line 16
Edit lines 44 and 46 so those lines begin with
"${LINK}" ${LINKOPT}
instead of
"${LINK}"
Invoke make install as per usual.
In the HOL directory, add the line
val EXTRA_POLY_LDFLAGS = ["-ld_classic"];
to the file tools-poly/poly-includes.ML. In a completely fresh installation, this file doesn’t already exist, so create it.
Now you can build HOL via the usual invocation:
$ poly < tools/smart-configure.sml
$ bin/build
A typical packaged installation puts the Moscow ML binaries in /usr/bin/, and the Moscow ML library files in /usr/lib/.
If configure
is not guessing the correct directories,
you will need to write a config-override
file in the
root directory of the HOL installation, and put in it the line
val mosmldir = "/usr/bin/"
You should then be able to run
$ mosml < tools/smart-configure.sml
and
$ bin/build
as normal.
Chr exception when trying to build my (large) theories!
This is a known problem in Moscow ML 2.01. There is a patch, and some instructions on how to apply it.
The problem is almost certainly that your copy of Moscow ML hasn’t been built with dynamic linking correctly enabled. This in turn means that the Muddy BDD package that HolBdd uses can’t load. You can test that this is the problem by running hol, and then trying
- load "HolBdd";
This should fail with an error message about not being able to
load muddy.so
.
If you decide you do want HolBdd (there is nothing else in HOL that depends on dynamic linking) you will probably need to build Moscow ML yourself. Binaries from the central site in Denmark don’t seem to work. However the install.txt file in the distribution does talk about things you need to do to binaries in order to get dynamic linking to work for them, and it would be interesting to hear that someone had got this to work. In any case, to build from sources, you must make sure that you alter the src/Makefile.inc in the MoscowML distribution in line with the instructions in the comments there.
This advice seems to solve this problem for 90% of the people reporting it. If it doesn’t in your case, please get back to us.
Neither
> File "Term.sml", line 1328, characters 10-42: > ! let val {const=Const(r1,_),theory,place} = const_decl name > ! ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ > ! Warning: pattern matching is not exhaustive
nor the following sort of output, when configuring the system before building:
> /local/scratch/kxs/144/bin/mosmlyac: 4 shift/reduce conflicts.
represents a problem.
How do I stop Holmake from proving false theorems?!
The default behaviour of Holmake
is to cause
failing proofs (from calls to prove
and store_thm
) to be asserted as true
with mk_thm
(such results get oracle tags recording this
fact). This behaviour allows for more of a script file to be checked
because the first failing proof doesn’t abort the script.
If you don’t want this behaviour, use the --qof
(“quit on failure”) flag to Holmake
, or use
an OPTIONS = QUIT_ON_FAILURE
line in
a Holmakefile
.
These versions of the system no longer build successfully
because of the Overflow
bug, caused by the fact that
more than 2^30 seconds have passed since 1 January
1970.
In the file src/portableML/Portable.sml, replace the
local-in-end
block that defines mk_time
,
dest_time
and some others with
local
  open Time
  val two_30 = Math.pow(2.0, 30.0)
in
  val timestamp : unit -> time = now
  val time_to_string : time -> string = toString
  fun dest_time t =
    let
      val adjusted = Real.-(toReal t, two_30)
      val sec = Real.trunc adjusted
    in
      {sec=sec, usec=0}
    end
  fun mk_time {sec,usec} = fromReal (Real.+(real sec, two_30))
  fun time_eq (t1:time) t2 = (t1 = t2)
  fun time_lt (t1:time) t2 = Time.<(t1,t2)
end
(In Kananaskis-1 this is at line 89; in Taupo-6, it is at line 85.)
If you are using Windows, you will also have to upgrade to Moscow ML 2.01.
This is a Poly/ML bug and seems predominantly to affect Linux installations; recent repository versions (later than 5.7.1) make the problem go away. Check the GitHub page for Poly/ML for recent commits.
You are probably seeing mis-interpreted UTF8. You can either turn Unicode off, or fix the environment in which you are running HOL.
If you are using Emacs, you should add the following lines to
your .emacs
:
(set-selection-coding-system 'utf-8)
(set-next-selection-coding-system 'utf-8)
(setq process-coding-system-alist '((".*" . utf-8)))
If you are using a Windows console, you might try issuing the command
chcp 65001
before starting HOL. According to Albert Lai, one should be aware
that even then, the console font lacks some pretty common symbols
.
If you are using a Unix shell, you should make sure that
your LANG
environment variable includes the
string UTF-8
as a suffix.
Use
val _ = set_trace "Unicode" 0;
You can put this line into a .hol-config.sml
file in
your home directory in Kananaskis-6 and later.
What does the output of term_grammar() mean?
The term_grammar()
function returns the grammar
controlling the parsing and printing of HOL terms. Its printed form
doesn’t capture everything inside the grammar, but does provide a
pretty good picture of what happens to your string when it’s
parsed, and what happens to your term when it’s printed.
The first part of the grammar output is something that looks like:
(0) TM ::= "LEAST" <..binders..> "." TM | "?!" <..binders..> "." TM | "?" <..binders..> "." TM | "!" <..binders..> "." TM | "@" <..binders..> "." TM | "\" <..binders..> "." TM (2) TM ::= "let" TM "in" TM [ _ let] (4) TM ::= TM "::" TM (restricted quantification operator) (5) TM ::= TM TM (binder argument concatenation) (7) TM ::= "case" TM "of" TM [case__magic] ... (50) TM ::= TM "," TM (R-associative) (70) TM ::= "if" TM "then" TM "else" TM [COND] (80) TM ::= TM ":-" TM (non-associative) (100) TM ::= TM "<=/=>" TM | TM "<=>" TM | TM "=" TM (non-associative) (200) TM ::= TM "==>" TM (R-associative) ... (450) TM ::= TM "<<=" TM | TM "PSUBSET" TM | TM "SUBSET" TM | TM ">=" TM | TM "<=" TM | TM ">" TM | TM "<" TM | TM "RSUBSET" TM | TM "<>" TM (non-associative) (460) TM ::= TM "with" TM [record update] | ... (R-associative) (480) TM ::= TM "++" TM (L-associative) (490) TM ::= TM "::" TM [CONS] | TM "INSERT" TM | TM "LEX" TM | TM "##" TM (R-associative) (500) TM ::= TM "DELETE" TM | TM "DIFF" TM | TM "UNION" TM | TM "-" TM | TM "+" TM | TM "RUNION" TM (L-associative) ...
This is a description of the way in which terms can be constructed from tokens and recursive invocations. For example, the rule at precedence level 70 is
TM ::= "if" TM "then" TM "else" TM [COND]
This rule is saying that if you input ``if P then Q else
R``
, the concrete syntax phase treatment of this phrase will turn
it into COND P Q R
. If there is no name
in square brackets after a rule, then the head operator name is the
same as the (one) token in the rule. Thus the rule for +
is
just TM ::= TM "+" TM
.
The numbers associated with rules indicate their precedence. The
higher the number, the more tightly the operators “pull
at” TM
s to their left or right. Because +
is at level 500, and *
is at 600, arithmetic expressions
such as ``2 * x + 1``
parse as you’d expect. Rules that
can pull in both directions at once (“infixes”, such as +
and <=
) are also given associativities.
Because /\
is
right-associative, ``p /\ q /\ r``
parses
to (/\ p (/\ q r))
.
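As a quick check on how precedence and associativity were resolved, you can pull a parsed term apart (dest_comb and strip_comb are standard functions; the terms here are just illustrative):
dest_comb ``2 * x + 1``;     (* the outermost operator is +, because * (level 600) binds more tightly *)
strip_comb ``p /\ q /\ r``;  (* right-associated: the arguments of the outer /\ are ``p`` and ``q /\ r`` *)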
The next section of the term_grammar()
output is the
list of known constants. For example:
Known constants: ! ## & () * ** *, + ++ , - /\ 0 :- :> < <<= <= <=/=> <=> <> = =+ ==> > >= ? ?! @ ABS_num ABS_prod ABS_sum AC ALL_DISTINCT APPEND ARB ASSOC Abbrev BIGINTER BIGUNION BIJ BIT1 BIT2 BOUNDED CARD CHOICE COMM COMPL COND CONS CR CROSS CURRY Cong DATATYPE DELETE DIFF DISJOINT DIV DIV2 DIVMOD DIV_2EXP DROP EL EMPTY EMPTY_REL EQC EVEN EVERY EXISTS EXP ... stmarker sum_case sum_size symmetric the_fun the_value total transitive tri trichotomous unint wellfounded ~
This is the list of names that cannot be used to name free
variables. Instead, if this name occurs (after concrete syntax has
been processed away), it will turn into a constant (or a pattern form,
see below). Thus, you cannot write ``EQC + 6``
(when relationTheory
is loaded) because
the EQC
is the name of a constant, and it does not have a
numeric type. If a name appears in a binding position, it can be
anything. So you can write ``λEQC. EQC + 6``
.
I wouldn’t recommend it, but it’s allowed.
The final section is a list of mappings from constant “names” (in the sense of the previous paragraph) to actual term values. Not every such binding is listed. In particular, if a string maps to one constant that has the same name, then this listing will not appear. Each line is of the form
<name> -> <constant_1> ... <constant_n>
For example, there is a line
** -> EXP
meaning that the name **
will map to the
constant EXP
(from arithmeticTheory
). Note
that there is also a mapping from the string EXP
to the
same constant. This mapping isn’t listed because there is
less cause for confusion there.
Another sort of situation is illustrated with the line
++ -> APPEND ++
which means that the name ++
can map to either the
list constant APPEND
or the constant ++
.
(How do you find out which constant that last one is?
Try map dest_thy_const (decls "++")
.)
Finally, an overloading mapping may also take a string to a
pattern, which is a λ-expression. This is illustrated by the line
for <>
:
<> -> (\(x :'a) (y :'a). bool$~ (min$= x y))
This means that the name <>
maps to a pattern
taking two arguments and turning them into a negated equality.
What is the difference between `...` and ``...``? It seems like some functions want one and others the other, but there’s no obvious difference between the uses to me...
The ``...``
form is really the application of the
function Parse.Term
to the quotation `...`
.
A quotation is really just a string (except it also supports
anti-quotations, see below). The reason it
isn’t just a string is that SML strings require you to backslash quote
backslashes, and to backslash quote newlines. Both of these would
interfere with HOL syntax (conjunction, disjunction, and lambda in our
ASCII notation all involve backslashes; we often want to input goals
over multiple lines).
The Term
function turns a quotation into a term. It’s
the system’s parser.
If you write something like
tactic_taking_argument ``my term`` THEN some_other_tactic
the `my term`
quotation is parsed entirely independently
of the goal. This can be a pain. Say, you’re proving something about
numbers, and your goal includes variables x
and y
. If your quotation is `x ≠ y`
,
you’d like those variables to have the type :num
. But they won’t
because the parser’s default behaviour is to assign variables
polymorphic types.
But, the tactics that take quotations can arrange to call the parser in a special way, passing in information about the goal’s free variables, so that when you write
Q.tactic_taking_argument `x ≠ y` THEN some_other_tactic
you get x
and y
parsing to have type :num
instead of :'a
.
Most things that take quotations are tactics, but there are some
other situations where it makes sense to be parsing the terms in
augmented contexts. Also, some functions
(e.g., proofManagerLib.g
, Q.store_thm
and Q.prove
) don’t exploit a context, but do enforce a
type on what is parsed (goals must be of type :bool
).
They also reduce screen-clutter.
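A minimal sketch of the contrast, using the standard EXISTS_TAC and Q.EXISTS_TAC in a goalstack session (the goal is just illustrative):
(* suppose the current goal is   ?x. x < y   with y : num *)
e (EXISTS_TAC ``z``);    (* ``z`` is parsed in isolation, so z gets type :'a and the tactic fails *)
e (Q.EXISTS_TAC `z`);    (* the quotation is parsed against the goal, so z gets type :num *)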
$∉
isn’t recognised as the “not-an-element” term,
but $NOTIN
is.
The following session seems to exhibit a strange behaviour:
- type_of ``$NOTIN``;
<<HOL message: inventing new type variable names: 'a>>
> val it = ``:α -> (α -> bool) -> bool`` : hol_type
- type_of ``$∉``;
> val it = ``:α`` : hol_type
- dest_term ``$∉``;
<<HOL message: inventing new type variable names: 'a>>
> val it = VAR ("∉", ``:'a``) : lambda
In fact, this is a known consequence of the design.
The $
-quoting of symbols turns them into things that
aren’t treated specially by the surface syntax parsing (which handles
infix operators etc). Thus, when dollar-quoted,
the ∉
gets passed through to the name resolution phase
unchanged. The name resolution phase then decides that ∉
is not a name it knows anything about, so it passes through that phase
and becomes a variable.
Name resolution doesn’t know anything about ∉
because
the system is set up so that the surface syntax phase turns it into
the name NOTIN
. You can see this by examining the
output of term_grammar()
:
- term_grammar();
> val it =
    ...
    (425)  TM ::= ... | TM "∉" TM [NOTIN] | ...
    ...
The presence of NOTIN
in the square brackets
indicates that this rule maps the form with ∉
into the
name NOTIN
.
A similar mapping from surface syntax to name happens to turn
if-then-else
into COND
. If you
write $if
, you get a variable called if
, not
the constant COND
.
Why do I get an error when I enter ``<->``?
The problem looks like this:
- ``<->``;
Don't expect to find a -> in this position
after a < on line 1318, characters 3-4
and at line 1318, character 2.
! Uncaught exception:
! HOL_ERR
The treatment of -
changed to support unary minus, and so
you get this odd splitting behaviour if you give the
parser <->
when it’s “unprepared”. You can
“prepare” it by making calls to set_fixity
and/or overload_on
.
Thus:
val _ = set_fixity "<->" (Infixl 500); val _ = overload_on ("<->", ``myconst``);
Either of these will do the trick; doing both (a common use-case) is also fine.
let-and
declaration syntax? I don’t
use let-and
much, so is there a way to remove
the let-and
treatment of “and” from the term
grammar?
Yes, if the let-and
syntax is enabled (as it is by
default), there will be a competing “and” infix at fixity level 9 in
the grammar. (You can see this if you look at the output of a call
to term_grammar()
.) The way to get rid of it is with
remove_termtok {term_name = GrammarSpecials.and_special, tok = "and"};
Why do I get “No rule for [~]” when I enter ``MAP ~ x``?
The message means that the parser wants to do a reduction involving just the special symbol
~
, and it can’t find a rule that
allows this. The rule in the grammar is
TM ::= ~ TM
~
is treated specially, and not just as a function
that you’d apply to arguments normally, for two reasons:
first, so that if you write ~f x, it is parsed as ~(f x); and second, so that you can write ~~b (you don’t need to write ~(~b)).
To make ~
lose its special status, you should prefix
it with a $
. ``MAP $~ x``
will work in
the example, assuming that x
has type :bool
list
.
Use grammar manipulation functions such as
set_fixity
to alter its parsing information.
These changes can be made to persist when the current
theory is exported. For example:
- load "listTheory"; - set_fixity "MEM" (Infix(NONASSOC,425)); > val it = () : unit - listTheory.MEM; > val it = |- (!x. x MEM [] = F) /\ !x h t. x MEM h::t = (x = h) \/ x MEM t : thm
To print a term t
try
print_term_by_grammar boolTheory.bool_grammars t
This still retains pretty-printing for all of the constants in
boolTheory
, including COND
. If you really
want to see even /\
, !
etc in their raw
form, try
print_term_by_grammar min_grammars t
For example:
- print_term_by_grammar min_grammars (concl AND_CLAUSES);
bool$! (\t.
   bool$/\ (bool$/\ bool$T t = t)
     (bool$/\ (bool$/\ t bool$T = t)
        (bool$/\ (bool$/\ bool$F t = bool$F)
           (bool$/\ (bool$/\ t bool$F = bool$F) (bool$/\ t t = t)))))
> val it = () : unit
and
- print_term_by_grammar min_grammars (concl COND_CLAUSES);
bool$! (\t1.
   bool$! (\t2.
      bool$/\ (bool$COND bool$T t1 t2 = t1) (bool$COND bool$F t1 t2 = t2)))
> val it = () : unit
Note that even in min_grammars
, =
,
@
and ==>
are treated
specially.
I want the parser to turn ``P[e/v]`` into ``(\v. P) e``.
First a warning: this is not likely to do what you want. For example, you wouldn’t get any nice way of stating the Hoare assignment axiom:
{ P[e/v] } v := e { P }
In order to state the above, you’d need to actually write
{ (P v)[e/v] } v := e { P v }
to make it clear that v
might exist in the post-condition.
And given this, you might as well write:
{ P e } v := e { P v }
(The problem boils down to the fact that the literal HOL term ``(\v. P) x``, where P is a variable, reduces under beta-conversion to P.)
Incidentally, if this is the sort of thing you are doing, you might be interested in looking at Peter Homeier’s Sunrise system, which implements a VCG in HOL.
If you really do want to implement your parsing trick above (turning
``P[e/v]``
into ``(\v. P) e``
), you could do
this by implementing your own parser. Moreover, this might not be
impossibly difficult, because the various phases of the existing
parser can be called independently. Here’s a sketch of what you’d
need to do:
First, add rules for your new syntax to the grammar (using add_rule), mapping it to some head name, say foo. Note that
your particular choice of syntax, involving square brackets and
/ will conflict with the use of these symbols in the theories of
lists and integers. The best way around these problems is to
explicitly load these theories, remove their syntax (see
remove_termtok
), and then install your own. This
will ensure that things won’t break if someone loads your
theory/library and then later loads lists and/or integers.
Next, call the first phase of parsing, Absyn, which will turn P[e/v] into abstract syntax corresponding to ``foo P e v``. You then traverse this abstract syntax looking for the pattern and turning it into (\v. P) e. You would also raise an error at this stage if v wasn’t a variable.
Finally, map the abstract syntax to a term with Parse.absyn_to_term (term_grammar()). Indeed, the standard Term function is essentially
fun Term q = absyn_to_term (term_grammar()) (Absyn q)
so all you are doing is inserting an extra phase into the process after calling Absyn, but before calling absyn_to_term.
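For instance, here is a minimal sketch of such a wrapper; my_absyn_transform is a hypothetical function you would write to traverse the abstract syntax and rewrite the foo-pattern:
(* my_absyn_transform : Absyn.absyn -> Absyn.absyn  is your own traversal *)
fun myTerm q =
  Parse.absyn_to_term (Parse.term_grammar()) (my_absyn_transform (Parse.Absyn q));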
Antiquoting a type into another type works directly:
- val ty = Type`:bool`;
and then
- Type`:num -> ^ty`;
However, you may wish to also introduce type antiquotations into term parses (as a type constraint on a variable, for example). The problem is that you can’t just write
^ty
in this
context, because the parsing function’s type insists that all
antiquotes be of type term. You must use the
ty_antiq
function, which magically makes a type
appear as a term:
- val antity = ty_antiq ty;
> val antity = `(ty_antiq(`:bool`))` : term
Then:
- val x = Term `x:^antity`;
It’s quite easy to figure out when and why
ty_antiq
is required if one understands the type of the parsing functions.
For example, Term
and (fn q => -- q --)
have type
term frag list -> term
while
Type
and (fn q => == q ==)
have type
hol_type frag list -> hol_type
The
frag list
refers to the fact that there is a
quotation being consumed. A quotation is something that appears
between back-quotes. A quotation consists of strings and
antiquotations. A term frag list
must have
antiquotations that are of type term
. A
hol_type frag list
must have antiquotations that are
of type hol_type
. Thus, if you use
Term
to parse a term, you can’t directly antiquote
in types, because this would violate the typing rules (you can
only antiquote in terms if you call Term
, because it
requires a term frag list
).
So, in order to antiquote types into terms there is a bit of
magic called ty_antiq
, which if you look at its type
(hol_type -> term
) turns a type into a term. The
term formed is completely bogus in a logical sense (it’s actually
a variable with a special name and the given type), but can be
pushed into the parsing function Term
so as to give
the effect of antiquoting a type.
The function Hol_datatype
takes a hol_type
frag list
, so if you want to antiquote a type into the
quotations it takes, you will not need to use
ty_antiq
.
I wanted to write tac1 THEN tac2
instead
of tac1 THENL ((fn t => [t,t]) tac2)
but when I tried the version with THEN
I got an error.
The fact that the infixes THEN
, THEN1
(a version of THENL
) are left-associative occasionally causes problems.
(That there is a problem stems from the fact that the THEN
/THENL
combination is not associative.)
You can usually get around this problem by bracketing your tactics differently.
The typical problem is when you apply tac1
, generating
two goals. You then apply tac2
to one of these
generating a further two goals. You construct tac3
and tac4
to handle these. Then you realise that
tac2 THENL [tac3, tac4]
would correctly handle the second goal that was generated
by tac1
as well. So, you write
tac1 THEN tac2 THENL [tac3, tac4]
and are annoyed when it fails with a THENL
error. The
reason is that at the point when the THENL
gets applied,
there are actually four goals to work with, not just two. The same
sort of thing happens with THEN1
instead
of THENL
. You can solve the problem by writing
tac1 THEN (tac2 THENL [tac3, tac4])
But your problem is superficially slightly different. You say that
tac1 THENL ((fn t => [t,t]) tac2)
works, but
tac1 THEN tac2
doesn’t. This will be because tac2
involves
a THENL
or a THEN1
. Say, it is
tac21 THEN1 tac22 THEN tac23
When inside a THENL
branch, the two copies of this tactic
do the right thing. When you use THEN
, you have
tac1 THEN tac21 THEN1 tac22 THEN tac23
After tac1
has been applied you have two goals. After
tac21
you have four goals. After tac22
, you
have three goals, the second of which you wanted to be solved
by tac22
. From that point on, the tactics behave
differently. In particular, tac23
may well do the wrong
thing when applied to the goal that tac22
was supposed
to see but didn’t.
The fix: write
tac1 THEN (tac21 THEN1 tac22 THEN tac23)
What’s the difference between simplifying with p ⇔ q and with the two theorems p ⇒ q and q ⇒ p?
If the simplifier has been told to rewrite p ⇔ q
, then if it sees occurrences of terms matching (the pattern) p
, it will rewrite to q
. Thus, simplifying with the theorem
⊢ (LENGTH l = 0) ⇔ (l = [])
will find instances of LENGTH l = 0
in the goal and rewrite those instances to l = []
.
If you instead provide the theorem ⊢ (LENGTH l = 0) ⇒ (l = [])
, the simplifier will look for occurrences of l
and attempt to rewrite them to []
, after first attempting to prove LENGTH l = 0
as a side condition.
An instance of l
(a variable) is any term of list type, so this rewrite may be attempted a great many times.
Similarly, if you give
⊢ (l = []) ⇒ (LENGTH l = 0)
the system will attempt to rewrite occurrences of LENGTH l
to 0
by proving the side condition l = []
.
In almost all circumstances, the iff-rewrite will be better, and trying to use the implications will just cause pain.
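For example, listTheory.LENGTH_NIL is an iff of exactly this shape; assuming bossLib is loaded, you can watch it fire as a left-to-right rewrite:
SIMP_CONV (srw_ss()) [listTheory.LENGTH_NIL] ``LENGTH l = 0``;
(* returns  |- (LENGTH l = 0) <=> (l = []) *)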
Use
show_assums := true;
This flag, along with several others that control HOL behaviour, can
be found in the Globals
structure.
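For example, in an interactive session (show_types is another commonly used flag in the same structure):
val _ = Globals.show_assums := true;   (* print the assumptions of theorems *)
val _ = Globals.show_types := true;    (* also annotate variables and constants with types *)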
This has been a much-discussed topic by HOL users over the years. The info-hol archives have the details. The received opinion seemed to be that numbered assumptions were great at proof-creation time, but had potential for being a nightmare when existing proofs had to be changed.
In any case, the assumptions have to be dealt with. My current favourite approach is to use "PAT_ASSUM", as follows:
PAT_ASSUM tm MATCH_MP_TAC
or
Q.PAT_ASSUM q MATCH_MP_TAC
PAT_ASSUM
is like UNDISCH_THEN
in that it removes an assumption and makes it available as a theorem.
However, it is better than UNDISCH_THEN
since the quotation argument it takes is treated as a pattern to match against.
The free variables of the goal are used to help constrain the match.
It is often the case that very complex assumptions can be singled out by giving quite simple patterns to PAT_ASSUM
.
If you are unsatisfied with the way hypotheses are handled in proofs, numbered (or named) assumptions can be added in a number of ways.
Perhaps
the simplest is to write a version of UNDISCH_THEN
, call it UNDISCH_NTH
having type
int -> (thm -> tactic) -> tactic
UNDISCH_NTH k ttac
grabs the kth assumption, converts it to a theorem th, and applies ttac to th to get the desired tactic.
Here’s one way to define it:
fun UNDISCH_NTH k ttac (g as (asl,_)) =
  UNDISCH_THEN (Lib.el (k+1) (rev asl)) ttac g;
Use
TypeBase.axiom_of ``:mytypename``
Hopefully you won’t need to do this very much because many of the contexts in which you previously needed the axiom are dealt with more smoothly. For example, cases, induction, one-one and distinctness theorems are now proved automatically, and are available from the TypeBase. For example,
TypeBase.one_one_of ``:mytypename``
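For instance, with the built-in list type (the same calls work for your own datatypes once they are registered in the TypeBase):
TypeBase.induction_of ``:'a list``;
TypeBase.nchotomy_of ``:'a list``;
TypeBase.distinct_of ``:'a list``;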
How do I use computeLib.add_convs?
Best thing currently is to look at the implementation of
REDUCE_CONV
, which uses add_convs to deal with
DIV
and MOD
: the invocation
add_conv (M,n,c) compset
means (roughly):
when you see constant M applied to n subterms (i.e., M is fully applied to its arguments), apply conversion c
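Here is a minimal sketch; it uses stringLib.string_EQ_CONV (discussed later in this FAQ), and keying it on the polymorphic equality constant is purely for illustration:
val cs = computeLib.bool_compset ();
val _  = computeLib.add_conv
           (``$= : string -> string -> bool``, 2, stringLib.string_EQ_CONV) cs;
computeLib.CBV_CONV cs ``"foo" = "bar"``;   (* should reduce the equality to F *)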
IN. What theory is it defined in?
IN
is declared in boolTheory
. IN
is not an ML identifier, so we don’t index it. Perhaps we could index
constants to their place of definition though.
You can always figure out the originating theory by calling
dest_thy_const : term -> {Theory : string, Name : string, Ty : hol_type}
Using decls name
will give you all the declared constants that
have the given name.
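For example ($IN suppresses IN’s special parse status so it can be written on its own; the exact record field names may differ slightly between versions):
dest_thy_const ``$IN``;            (* reports that IN comes from theory "bool" *)
map dest_thy_const (decls "IN");   (* every constant with the name IN *)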
In the emacs
mode, you should use M-x customize-face
, and then type in the name of the “face” that you wish to change.
For example, there are faces hol-free-variable
, hol-bound-variable
, hol-type-variable
and hol-type
.
If using the vim
mode, the colour-printing comes from ANSI sequences being sent to the terminal that HOL is running in. You should put something like the following in your .hol-config.sml
file:
val my_colours =
  let open term_pp_types
  in
    { bv = OrangeRed, fv = VividGreen, tyv = PinkPurple, tyop = White, tysyn = White }
  end
val mybackend = PPBackEnd.ansi_terminal "mybackend" my_colours
val _ = Parse.current_backend := mybackend
The colour names are given in
src/parse/term_pp_types.sml
(and correspond to those officially supported by ANSI).
If expression M
is causing the grief, then write
(M) handle e => Raise e
Often the parentheses around
M
will not be required.
It is sometimes convenient to use the try
function as
well. It takes a function and an argument, applies the function to the
argument, and (on success) returns the result or (on failure) prints the
exception. Note however, that try f x
will not catch errors
arising from the evaluations that lead to f
or
x
.
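For example (DECIDE is just a convenient way to provoke a HOL_ERR here):
(DECIDE ``x < x``) handle e => Raise e;
try DECIDE ``x < x``;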
My interactive session doesn’t end when I type quit().
This arises because the quotation-filter, which handles the
``...``
and ``:...``
syntax, doesn’t
understand the quit
command’s significance. Rather than use quit()
, end your interactive
process by sending an EOF
signal. On Unix, type
Control-D
; on Windows, type Control-Z
,
followed by RETURN
.
Both the Holmake
program and the various hol
scripts take -I <dir>
flags to indicate that the
system should look in the specified directories for object files
(whether theories or libraries). You can rebuild theories in later
directories by calling Holmake
with the appropriate
-I
flags. Directories that should be examined for
“include files” can also be specified with an
INCLUDES=
directive in a Holmakefile
.
See the DESCRIPTION manual for more on this.
This is done implicitly in a script
file, usually by referring to an ancestor theory’s
theorems in the course of proof. If you want a theory to be an
ancestor because you use an exported type (say string
from stringTheory
), but never refer to any of the
theory’s theorems, then you can explicitly refer to it. The
standard idiom we have adopted is to write
local open stringTheory in end;
This makes the dependency on the ancestor theory explicit. If you
omit this, and have no other ML-level dependency in your script file,
then Holmake
will fail to build your script, probably
failing with a parse exception when it encounters a type name such as
string
that it doesn’t think is in scope.
How do I print the contents of the *HOL* buffer (in emacs), complete with Unicode characters?
The following elisp code is (will be) part of the standard hol-mode
.
(defcustom hol-unicode-print-font-filename
  "/usr/share/fonts/truetype/ttf-dejavu/DejaVuSans.ttf"
  "File name of font to use when printing HOL output to a PDF file."
  :group 'hol
  :type '(file :must-match t))

(defun hol-region-to-unicode-pdf (filename beg end)
  "Print region to FILE as a PDF, handling Unicode characters."
  (interactive "FFile to write to: \nr")
  (if (and transient-mark-mode (not mark-active))
      (error "No active region"))
  (let* ((temp-ps-file (make-temp-file "annot" nil ".ps"))
         (lpr-switches (list "-font" hol-unicode-print-font-filename
                             "-out" temp-ps-file))
         (lpr-add-switches nil)
         (lpr-command "uniprint"))
    (lpr-region beg end)
    (shell-command (concat "ps2pdf " temp-ps-file " " filename))
    (delete-file temp-ps-file)))
This relies on having the uniprint
and ps2pdf
tools installed.
On Debian-based systems such as Ubuntu, the former is part of the yudit
package, which can be installed with a command like
sudo apt-get install yudit
The code also relies on a hard-coded path pointing to a font file (the hol-unicode-print-font-filename
).
This can be adjusted using M-x customize
and navigating to the hol
group (see the External
sub-category).
To use this code, select a region of an emacs buffer, and type
M-x hol-region-to-unicode-pdf
The command will prompt for a file to save to, and then write a PDF file to that location.
Thanks to Dan Synek for the problem, and subsequent inspiration for the code above.
Use GSYM.
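For example, to use an equation right-to-left when rewriting (arithmeticTheory.ADD_ASSOC is just an illustration):
REWRITE_TAC [GSYM arithmeticTheory.ADD_ASSOC]
(* rewrites instances of  m + n + p  back to  m + (n + p) *)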
To achieve exactly this objective, use stringLib.string_EQ_CONV
, which reduces equalities over string literals to either T
or F
.
For example, to demonstrate that ``(if "foo" = "bar" then 3 else 1) = 1``
, you could write:
- val t = ``(if "foo" = "bar" then 3 else 1)``; > val t = ``(if "foo" = "bar" then 3 else 1)`` : Term.term - (DEPTH_CONV stringLib.string_EQ_CONV THENC REWRITE_CONV []) t; > val it = |- (if "foo" = "bar" then 3 else 1) = 1 : Thm.thm
Note though that the stateful simplification set behind SRW_TAC
and srw_ss()
will automatically handle equalities of this form, so that SIMP_CONV (srw_ss()) []
would serve just as well above.