Due to the use of border functions, flat automata are more compact than standard automata when intervals of characters are attached to transitions, and the standard algorithms on them are simpler. We give the standard algorithms for tokenizer construction with automata, namely construction by regular operations, determinization, and minimization, and prove their correctness.

The algorithms work with intervals of characters, but are not more complicated than their counterparts on single characters. It is easy to generate C++ code from the final deterministic automaton. All procedures have been implemented in C++ and are publicly available. The implementation has been used in applications and in teaching. This presentation was given at GandALF 2022. These are the slides.

Maphoon is a toolbox that consists of two parts: The first
part is a library that contains an automaton-based implementation
of *classifiers*. A classifier cuts the input
into pieces and classifies the pieces.
The classifier does not create a complete tokenizer, because that would
result in a lack of flexibility that would make the resulting
tokenizers unusable in practice.
Classifiers are easy to use and flexible, and it is possible to display the
automata, which is important for teaching. If efficiency is needed,
the classifier can be translated into C++ source code, which can be compiled
without the library.

The second part is a *parser generator*. It creates a
runnable parser from the description of a grammar with added action
code. Grammar rules are applied from right to left, and
when a grammar rule is applied, the corresponding action
code is executed. The action code computes the meaning of the recognized
part of the input. The constructed parser is bottom-up. I think that bottom-up parsing
is better than top-down parsing, because it allows more
grammars, and one can think about the grammar purely as a grammar,
not as something procedural where the order of trying matters.
There exist other parser generators that work with C++,
but Maphoon fully supports RAII and moving, allows run-time definition of
operators, and has tunable error reporting.
For details, slides, and the sources, I refer to
my site on compiler tools.

I argue that logic reveals a general problem of existing programming languages: we do not yet know how to handle type union and partiality in programming languages, and how to combine them with static type checking. Standard programming languages use dynamic dispatch, enumeration types, pointer casts, variants, or union types. Logicians traditionally prefer matching, but matching also lacks flexibility, because cases cannot be regrouped. Instead of these standard approaches, I propose to handle union by case analysis on recursively defined subtypes, combined with static type checking based on these recursively defined subtypes.

In the current talk, I discuss a part of the static type checking procedure, namely a procedure that decides inclusions between recursively defined subtypes. The procedure is tableaux-based and uses blocking, so that termination is guaranteed.

The talk was given at Instytut Informatyki in Wrocław. These are the slides. (A talk with very similar content was given at PSSV 2021 in November 2021 in Kazan/Novosibirsk.)

The talk was given at Instytut Informatyki in Wrocław. This is the abstract, and these are the slides.

The following problem is well-known to be NP-hard: "Given a set of propositional clauses C, find an interpretation that makes all clauses in C true". Despite its NP-hardness, modern implementations are able to solve large instances in a short time. Other search problems can be solved efficiently by translating them to SAT. I explain the modern approach to SAT-solving, and demonstrate the use of MiniSat. Because of its fundamental nature, and the fact that there exist efficient implementations, I think that SAT-solving should be part of the standard curriculum.

Slides as pdf

Download as pdf.

In Landau's introduction of addition and multiplication, there is a strange thing: both are introduced without reference to the fact that Nat is a free data type. Especially the introduction of multiplication is a mystery. In order to check the proofs, we first give a precise description of Landau's introduction of addition and multiplication. After this, the proof appears correct to us.
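For reference, these are the usual recursion equations for addition and multiplication (in the Grundlagen the natural numbers start at 1, and x' denotes the successor of x); it is the existence and uniqueness of functions satisfying such equations that is normally justified by Nat being freely generated:

```latex
x + 1 = x', \qquad x + y' = (x + y)',
```
```latex
x \cdot 1 = x, \qquad x \cdot y' = x \cdot y + x.
```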

In 1977, the complete Grundlagen were verified in the Automath system. So we want to know: what is the mechanism used in Automath for introducing recursive functions? Did Van Benthem Jutting perhaps use some kind of additional recursion axiom for introducing addition and multiplication?

We look into the sources of Van Benthem Jutting's translation, and see that the translation follows Landau's proof very carefully and that no additional properties were used.

So the question remains: How did Landau/Kalmar manage to get away without using the fact that natural numbers are freely generated? Are there more functions definable in that way?

Download as pdf.