Token_parser.Make

State : User state.
Token : Token (pure token without position information). The generated parser receives an actual token together with information about its location in the file.
Final : Final result of a successful parse.
Syntax : Represents what has been syntactically expected and has not been received.
Semantic : Semantic error message (triggered by fail).

module State : Fmlib_std.Interfaces.ANY
module Token : Fmlib_std.Interfaces.ANY
module Final : Fmlib_std.Interfaces.ANY
module Semantic : Fmlib_std.Interfaces.ANY
module Parser :
Interfaces.FULL_PARSER
with type state = State.t
and type token = Position.range * Token.t
and type expect = string * Indent.expectation option
and type final = Final.t
and type semantic = Semantic.t
type state = State.t
type semantic = Semantic.t
p >>= f
First parse the input according to the combinator p. In case of success, feed the value returned by p into the function f to get the combinator to parse next.
let* x = p in f x is equivalent to p >>= f.
The let* combinator lets us express parsing sequences conveniently. Example:
let* x = p in (* parse [p], result [x] in case of success. *)
let* y = q x in (* parse [q x], result [y] ... *)
let* z = r x y in (* ... *)
...
return (f x y z ...)
The wildcard let* _ = ... can be used to ignore the results of intermediate parsing steps.
map f p
Try combinator p. In case of success, map the returned value x to f x. In case of failure, do nothing.
map f p is equivalent to let* x = p in return (f x).
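For illustration, a minimal sketch; the combinator ident, returning the name of an identifier token, is hypothetical:

(* Hypothetical: [ident : string t] recognizes an identifier token and
   returns its name; [map] turns the name into its length. *)
let ident_length : int t =
    map String.length ident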
map_and_update f p
Try combinator p. In case of success, map the resulting state state and the returned value a to f state a. In case of failure, do nothing.
val succeed : 'a -> 'a t
succeed a
Succeed immediately without consuming any token. Return the object a as result.
val return : 'a -> 'a t
return a is equivalent to succeed a.
unexpected expect triggers a syntax error signalling the expectation expect.
val clear_last_expectation : 'a -> 'a t
clear_last_expectation a
Clear the last failed expectation and return a.
This is useful e.g. after stripping whitespace. Since stripping whitespace means skip_one_or_more ws or skip_zero_or_more ws, the parser can still expect more whitespace after skipping it. Therefore there is a failed expectation *whitespace* on the stack. However, you rarely want this expectation to be reported.
p </> q
Try the combinator p first. In case of success, or of failure after consuming tokens, p </> q is equivalent to p.
If p fails without consuming any token, then p </> q is equivalent to q.
choices p [q; r; t; ...] is equivalent to p </> q </> r </> t </> ....
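A sketch of how the biased choice might be used; the combinators number, identifier and parenthesized_expression for the alternatives of an atomic expression are hypothetical:

(* Each alternative is tried only if the previous ones failed without
   consuming a token. *)
let atom : Expression.t t =
    number </> identifier </> parenthesized_expression

(* Equivalently, using [choices]: *)
let atom : Expression.t t =
    choices number [identifier; parenthesized_expression]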
p <?> expect
Try the combinator p. In case of success, or of failure after consuming tokens, p <?> expect is equivalent to p.
If p fails without consuming any token, then the failed expectations are replaced with the failed expectation expect.
Usually p is a combinator implementing a choice between various alternatives of a grammar construct. The <?> combinator makes it possible to replace the set of failed grammar alternatives with a more abstract failed expectation. E.g. instead of getting the failed expectations identifier, '(', -, ... we can get the failed expectation expression.
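Continuing the hypothetical example from above, the low-level expectations of the individual alternatives can be collapsed into a single, more abstract one:

(* Instead of reporting "number", "identifier", "(" ... as failed
   expectations, report just "atomic expression". *)
let atom : Expression.t t =
    (number </> identifier </> parenthesized_expression)
    <?> "atomic expression"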
no_expectations p
Parse the combinator p.
- p fails: no_expectations p fails with the same error.
- p succeeds without consuming tokens: no_expectations p succeeds without any added expectations.
- p succeeds and consumes some tokens: no_expectations p succeeds without any expectations.
Many combinators can succeed with expectations. E.g. the combinator optional p expects a p and succeeds even if it does not encounter a construct described by p. All repetitive combinators like one_or_more try to consume as many items as possible; at the end they are still expecting an item.
This combinator makes it possible to clear such unneeded expectations. It is particularly useful when removing whitespace, because the expectation of whitespace is not a meaningful error message to the user.
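A typical use is wrapping the whitespace stripper so that the pending expectation of more whitespace does not show up in error messages. A minimal sketch, assuming a hypothetical combinator ws that recognizes one whitespace token:

(* Skip any number of whitespace tokens without leaving the failed
   expectation "whitespace" behind. *)
let strip_whitespace : int t =
    no_expectations (skip_zero_or_more ws)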
get_and_update f
Get the current user state and then update it using f. The returned value is the old state.
state_around before p after
If s0 is the initial state, then execute p with the start state before s0 and update the final state to after s0 a s1, where a is the value returned in case of success and s1 is the final state after executing p.
optional p
Try the combinator p. The result is:
- Some a, where a is the returned value, in case p succeeds.
- None, in case p fails without consuming any token.
zero_or_more_fold_left start f p
Try the combinator p as often as possible. Accumulate the results into the start value start using the folding function f.
one_or_more_fold_left first f p
Try the combinator p at least once and then as often as possible. Feed the first value returned by p into the function first to obtain the start value, and accumulate the subsequent values into it using the folding function f.
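A sketch of both folding variants, assuming a hypothetical combinator digit : int t that recognizes one digit token and returns its value:

(* Sum of zero or more digits, starting from 0. *)
let digit_sum : int t =
    zero_or_more_fold_left 0 (fun sum d -> return (sum + d)) digit

(* A number consisting of at least one digit, folded into its decimal value. *)
let number : int t =
    one_or_more_fold_left
        return                                    (* the first digit is the start value *)
        (fun value d -> return (10 * value + d))  (* accumulate the remaining digits    *)
        digit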
zero_or_more p
Parse zero or more occurrences of p and return the collected results in a list.
one_or_more p
Parse one or more occurrences of p and return the collected results as a pair of the first value and a list of the remaining values.
skip_zero_or_more p
Parse zero or more occurrences of p, ignore the results and return the number of occurrences.
skip_one_or_more p
Parse one or more occurrences of p, ignore the results and return the number of occurrences.
val one_or_more_separated :
('item -> 'r t) ->
('r -> 'sep -> 'item -> 'r t) ->
'item t ->
'sep t ->
'r t
one_or_more_separated first next p sep
Parse one or more occurrences of p separated by sep. Use first to convert the first occurrence of p into the result and use next to accumulate the subsequent occurrences into it.
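A sketch of parsing a comma-separated list into an OCaml list, assuming hypothetical combinators expression and comma:

(* Collect the elements in reverse order and reverse the list at the end. *)
let expression_list : Expression.t list t =
    map
        List.rev
        (one_or_more_separated
             (fun e -> return [e])                  (* first element            *)
             (fun lst _sep e -> return (e :: lst))  (* the separator is ignored  *)
             expression
             comma)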
counted min max start next p
Collect between min and max elements recognized by the combinator p and accumulate them with the folding function next into the start value start.
val parenthesized :
('lpar -> 'a -> 'rpar -> 'b t) ->
'lpar t ->
(unit -> 'a t) ->
('lpar -> 'rpar t) ->
'b t
parenthesized make lpar p rpar
Parse an expression recognized by the combinator p enclosed within parentheses. lpar recognizes the left parenthesis and rpar recognizes the right parenthesis. The value returned by lpar is given to rpar. With that mechanism it is possible to recognize matching parentheses of different kinds.
After successful parsing the function make is called with the final value (and the parentheses).
The combinator p is entered as a thunk in order to be able to call it recursively. Within parenthesized the combinator p is called only guardedly. Therefore p can contain nested parenthesized expressions.
Precondition: The combinator lpar has to consume at least one token in case of success.
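A sketch of matching brackets of different kinds; the combinators lpar (recognizing '(' or '[' and returning its kind), rpar (accepting only the closing token of the given kind) and expression are hypothetical:

(* [expression] is entered as a thunk, so it may itself contain nested
   bracketed expressions. *)
let bracketed_expression : Expression.t t =
    parenthesized
        (fun _lpar e _rpar -> return e)   (* keep only the inner expression *)
        lpar
        (fun () -> expression)
        rpar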
val operator_expression :
'exp t ->
'op t option ->
'op t ->
('op -> 'op -> bool t) ->
('op -> 'exp -> 'exp t) ->
('exp -> 'op -> 'exp -> 'exp t) ->
'exp t
operator_expression
    primary          (* Parse a primary expression *)
    unary_operator   (* Parse a unary operator *)
    binary_operator  (* Parse a binary operator *)
    is_left          (* Is the left operator binding stronger? *)
    make_unary       (* Make a unary expression from the operator and its operand *)
    make_binary      (* Make a binary expression from the operator and its operands *)
Parse an operator expression by using the following combinators:
- is_left o1 o2 decides whether the operator o1 on the left has more binding power than the operator o2. I.e. if the unary operator u has more binding power than the binary operator o, then u a o b is parsed as (u a) o b. If the binary operator o1 has more binding power than the binary operator o2, then a o1 b o2 c is parsed as (a o1 b) o2 c.
- make_unary u a makes the unary expression (u a).
- make_binary a o b makes the binary expression (a o b).
- primary parses a primary expression.
- unary_operator parses a unary operator.
- binary_operator parses a binary operator.
Precondition: primary, unary_operator and binary_operator have to consume at least one token in case of success. Otherwise infinite recursion can happen.
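A sketch of a small arithmetic expression parser built on this combinator; the combinators atom, unary_minus and binop, the precedence function and the Expr constructors are all hypothetical:

(* Binding power of the hypothetical binary operators. *)
let precedence (op : string) : int =
    match op with
    | "*" | "/" -> 2
    | "+" | "-" -> 1
    | _         -> 0

let expression : Expr.t t =
    operator_expression
        atom                        (* primary expressions      *)
        (Some unary_minus)          (* optional unary operator  *)
        binop                       (* binary operators         *)
        (fun o1 o2 -> return (precedence o1 >= precedence o2))
        (fun op a -> return (Expr.unary op a))
        (fun a op b -> return (Expr.binary a op b))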
backtrack p expect
Try the combinator p. In case of failure after consuming tokens, push the consumed tokens back to the lookahead and let it fail without consuming any token. Use expect to record the failed expectation.
Backtracking reduces performance, because the tokens pushed back to the lookahead have to be parsed again. Try to avoid backtracking whenever possible.
followed_by p expect
Parses p and backtracks (i.e. all tokens consumed by p are pushed back to the lookahead). In case p succeeds, the followed_by parser succeeds without consuming any token. Otherwise it fails without consuming any token.
not_followed_by p expect
Parses p and backtracks (i.e. all tokens consumed by p are pushed back to the lookahead). In case p succeeds, the not_followed_by parser fails without consuming any token. Otherwise it succeeds without consuming any token.
followed_by and not_followed_by can be used to peek into the token stream without consuming tokens.
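A sketch of peeking at the token stream, assuming hypothetical combinators keyword_end and statement:

(* Parse statements as long as the next token is not the keyword 'end'.
   [not_followed_by] only peeks; the keyword itself is not consumed. *)
let statements : Statement.t list t =
    zero_or_more
        (let* _ = not_followed_by keyword_end "no 'end' keyword" in
         statement)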
val step :
Stdlib.String.t ->
(State.t -> Position.range -> Token.t -> ('a * State.t) option) ->
'a t
step expect f
Elementary parsing step.
expect describes the expectation in case the parsing step fails.
Failure can happen in 3 cases:
- f indicates an unexpected token by returning None.
- There is no token left in the input.
- The token does not satisfy the current indentation and alignment requirements.
The function f is called with the current state, the position range of the token and the token itself. f has to return an object of type 'a and a new state if it accepts the token, or None if the current token does not satisfy its expectation.
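A sketch of an elementary step, assuming a hypothetical token type with a constructor Identifier of string:

(* Accept an identifier token and return its name. Any other token is
   rejected by returning [None], which makes the step fail with the
   expectation "identifier". The user state is passed through unchanged. *)
let identifier : string t =
    step
        "identifier"
        (fun state _range token ->
             match token with
             | Identifier name -> Some (name, state)
             | _               -> None)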
val expect_end : 'a -> 'a t
expect_end a
Parse the end of input and return a in case of success.
If p is a parser which parses some construct, then p >>= expect_end succeeds if p succeeds and is immediately followed by the end of input.
CAUTION: This combinator should almost never be used. It might be useful in partial parsers (see Partial Parsing for details). Usually the Token_parser handles the end of input internally.
Never ever backtrack over this combinator.
val located : 'a t -> (Position.range * 'a) t
located p
Parse p and return the parsed value together with its location in the file.
indent i p
Parse p indented at least i columns relative to its parent.
Precondition: 0 <= i
The indentation of p is defined by the indentation of its leftmost token. The leftmost token has to be indented at least i columns relative to the parent of p.
align p
Use the start position of the first token of p to align it with other constructs.
In an aligned construct the first token is the leftmost token.
Alignment makes sense if there are at least two combinators which are aligned and indented. E.g. suppose there are two combinators p and q. Then we can form
indent 1 (
    align (
        let* a = align p in
        let* b = align q in
        return (a, b)
    ))
This combinator parses p, whose first token has to be indented at least one column relative to its parent, and then parses q, whose first token must be aligned with the first token of p.
The indentation decouples the alignment of p and q from other aligned siblings or parents. indent 0 ... can be used to make the indentation optional.
Like align, but the leftmost token has to be at the lowest allowed column.
If a whole sequence of aligned constructs has to be left aligned, then at least the first item of the sequence has to be left aligned.
detach p
Parse p as if there were no indentation and alignment requirements.
make s p
Make the parser from the combinator p and start it in state s.
The parser parses a construct described by the combinator p followed by the end of input. Note that there must not be any combinator generated from expect_end within the combinator p. The function make adds the corresponding expect_end for you.
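A sketch of building and running the parser; the run functions put, put_end, has_succeeded and final are assumed to be part of the generated Parser module (Interfaces.FULL_PARSER), and top and initial_state are hypothetical:

(* Feed a list of positioned tokens into the parser made from the top level
   combinator [top : Final.t t], started in [initial_state]. *)
let parse (tokens : (Position.range * Token.t) list) : Final.t option =
    let p = make initial_state top in
    let p = List.fold_left (fun p tok -> Parser.put tok p) p tokens in
    let p = Parser.put_end p in
    if Parser.has_succeeded p then
        Some (Parser.final p)
    else
        None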
make_partial s p
Make a partial parser from the combinator p and start it in state s.
The generated partial parser ends whenever it encounters an error or has recognized a construct described by the combinator p, even if the input does not end after the construct. An end of input after the construct is not consumed by the parser. See section Partial Parsing for details.