#+title: Cognition
#+author: Preston Pan
#+description: Other languages are inflexible and broken. Let's fix that.
#+html_head: <link rel="stylesheet" type="text/css" href="../style.css" />
#+html_head: <link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png">
#+html_head: <link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">
#+html_head: <link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png">
#+html_head: <link rel="manifest" href="/site.webmanifest">
#+html_head: <link rel="mask-icon" href="/safari-pinned-tab.svg" color="#5bbad5">
#+html_head: <meta name="msapplication-TileColor" content="#da532c">
#+html_head: <meta name="theme-color" content="#ffffff">
#+html_head: <meta name="viewport" content="width=1000; user-scalable=0;" />
#+language: en
#+OPTIONS: broken-links:t
* Introduction
Cognition is an active research project that Matthew Hinton and I have been working on for the past
couple of months. Although my commit history for [[https://github.com/metacrank/cognition][this project]] has not been impressive, we came up with
a lot of the theory together, working alongside each other in order to achieve one of the most generalized
systems of syntax we know of. Let's take a look at the conceptual reason why cognition needs to exist, as
well as some /baremetal cognition/ code (you'll see what I mean by this later). There's a paper about this language
available in the repository, for those interested. Understanding cognition might require a
lot of background in parsing, tokenization, and syntax, but I've done my best to write this in a very understandable way.
The repository is available at https://github.com/metacrank/cognition, for your information.
* The problem
Lisp programmers claim that s-expression code, combined with lisp's featureful macro system, makes the language a
metaprogrammable and generalized system. This is of course true, but there's something very broken with lisp: metaprogramming
and programming /aren't the same thing/, meaning there will always be rigid syntax within lisp
(its parentheses, or the fact that it needs to have characters that tell lisp to /read ahead/). The left parenthesis tells
lisp that it needs to keep reading until the matching right parenthesis before it can stop
and evaluate the whole expression. This makes the left and right parenthesis unchangeable from within the language (not
conceptually, but in practice under some implementations), and, more importantly, it makes the process of retroactively
changing the sequence in which these tokens are delimited /impossible/ without a heavy amount of string processing. Other
languages have other ways in which they need to read ahead upon seeing a certain token in order to decide what to do.
This process of having a program read ahead based on current input is called /syntax/,
and as long as you read ahead, or assume a default way of reading ahead, you fall into the trap of having some form of syntax.
Cognition is different in that it uses an antisyntax that is fully /postfix/. This has similarities with concatenative
programming languages, but concatenative programming languages also suffer from two main problems: first, the introduction
of the left and right bracket characters (which are in fact prefix notation, as they require reading ahead of the input stream),
and second, the quote character for strings. This is unsuitable for such a general language. You can even see the same problem
in lisp's C syntax implementation: escape characters everywhere, and awkward mandatory spaces delimiting the start and end
of certain tokens (and where they are absent, post-processing is required). The Racket programming language has its macro
system, but it is not /runtime dynamic/: it still relies on preprocessing.
So, what's the precise solution to this conundrum? Well, it's beautiful; but it requires some /cognition/.
* Baremetal Cognition
Baremetal cognition has a couple of peculiar attributes, and it is remarkably like the /Brainfuck/ programming language.
But unlike its look-alike, it has the ability to do some /serious metaprogramming/. Let's take a look at what the
bootstrapping code for a /very minimal/ syntax looks like:
#+begin_example
ldfgldftgldfdtgl
df
dfiff1 crank f
#+end_example
And *do* note the whitespace (line 2 has a trailing space after ~df~, and the newlines matter). Erm, okay. What?
So, our goal in this post is to get from a syntax that looks like /that/ to a syntax that looks like [[file:stem.org][Stem]].
But how on earth does this piece of code even work? Well, we have to introduce two new ideas: delimiters, and ignores.
** Tokenization
Delimiters allow the tokenizer to figure out when one token ends and another begins. The list of delimiter characters
is public, allowing it to be modified and read from within cognition itself. Ignored characters are characters
that are completely skipped by the tokenizer in the first stage of every read-eval-print loop; that is, at the start of
collecting a token, it first skips over the set of ignored characters. By default, every single character is a delimiter, and
no characters are ignored. Each of these character lists lets you toggle a flag that tells the tokenizer
to treat the given characters as a blacklist or as a whitelist, adding brevity (and practicality) to the language.
Let's take the first line of code as an example:
#+begin_example
ldfgldftgldfdtgl
#+end_example
Because of the delimiter and ignore rules set by default, every single character is read as a token, and no character
is skipped. We therefore read the first character, ~l~. By default, Cognition works off a stack-based programming language
design. If you're not familiar, see the [[file:stem.org][Stem blogpost]] for more detail (in fact, if you're not familiar, this /won't work/
as an explanation for you, so you should read it, or read up on the /Forth/ programming language).
We call our stacks /containers/, though, as they are more general than stacks. Additionally, in this default environment, /no/
word is executed except for the special /faliases/, which we will cover later.
Therefore, the character ~l~ gets read in and put on the stack. Then the character ~d~ is read in and put on the stack.
But ~f~ is different. In order to understand how words are executed in Cognition, we must take a look at the falias system.
** Faliases
Faliases are a list of words that get executed the moment they are put on the stack (or container, as we will call it from now on).
Each of them in effect executes the equivalent of stem's ~eval~, but immediately upon being placed on its container. That means that when
~f~, the default falias, is read, it doesn't go on the container, but rather executes the top of the container, which is ~d~.
~d~ changes the delimiter list to the string value of a word, meaning that here it changes the delimiters to /blacklist/ only
the character ~l~ as a delimiter. Everything else remains a delimiter, because by default everything is parsed
into single-character words.
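The falias rule is small enough to model in a few lines of Python. This is an illustrative sketch only, not the real C implementation: the function name ~run~ and the toy ~d~ builtin below are assumptions made for the example.

```python
# Toy model of falias execution in the default environment (no crank):
# ordinary tokens are pushed onto the container; an falias is never
# pushed, it evaluates the current top of the container instead.
def run(tokens, faliases=("f",), builtins=None):
    builtins = builtins or {}
    stack = []
    for tok in tokens:
        if tok in faliases:
            top = stack.pop()          # the falias itself never lands on the stack
            if top in builtins:
                builtins[top](stack)   # builtins act on the container
            else:
                stack.append(top)      # other words evaluate to themselves
        else:
            stack.append(tok)
    return stack

# Mimic "ldf": 'l' and 'd' are pushed; 'f' evaluates 'd', which here
# consumes 'l' as its argument (a stand-in for the delimiter builtin).
delims = []
run(list("ldf"), builtins={"d": lambda s: delims.append(s.pop())})
print(delims)  # → ['l']
```

Note how ~f~ itself never appears on the container: it only triggers the evaluation of whatever is already on top.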
** Delimiter Caveats
Delimiters have an interesting rule: the delimiter character is excluded from the tokenized word,
unless no character was ignored in the tokenization loop, in which case we collect the character as part of
the current token and keep going. This is in contrast to a third kind of tokenization category, the singlet, which
/includes/ itself in the token before skipping itself and ending token collection.
In addition, remember what I said about the /blacklist/? Well, you can toggle between /blacklisting/ and /whitelisting/
your lists of delimiters, singlets, and ignored characters. By default, there are no /blacklisted/ delimiters, no
/whitelisted/ singlets, and no /whitelisted/ ignored characters.
All other characters simply collect themselves into the current token without ending the loop,
so collection continues until the loop halts via the delimiter or singlet rules.
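To make these rules concrete, here is a small Python model of one pass of the tokenizer. This is a sketch under assumptions, not the real implementation: the function name, the flag defaults, and the exact handling of a terminating delimiter (left in the input here) are inferred from the description above.

```python
def tokenize(src, delims="", ignores="", singlets="",
             dflag=False, iflag=True, sflag=True):
    """Collect one token from src; return (token, rest).

    dflag=False means `delims` is a blacklist (every other character is
    a delimiter); iflag/sflag=True mean `ignores`/`singlets` are
    whitelists. These defaults match the boot environment.
    """
    is_delim   = (lambda c: c in delims)   if dflag else (lambda c: c not in delims)
    is_ignore  = (lambda c: c in ignores)  if iflag else (lambda c: c not in ignores)
    is_singlet = (lambda c: c in singlets) if sflag else (lambda c: c not in singlets)

    i = 0
    while i < len(src) and is_ignore(src[i]):
        i += 1                      # stage 1: skip ignored characters
    skipped = i > 0
    token = ""
    while i < len(src):
        c = src[i]
        if is_singlet(c):
            return token + c, src[i + 1:]   # singlet joins the token, then ends it
        if is_delim(c):
            if token == "" and not skipped:
                token, i, skipped = c, i + 1, True  # first-character exception
                continue
            return token, src[i:]   # delimiter ends the token, excluded from it
        token += c
        i += 1
    return token, src[i:]

print(tokenize("ldfg"))              # → ('l', 'dfg')  boot defaults: one char per token
print(tokenize("gldf", delims="l"))  # → ('gl', 'df')  'l' non-delimiter: "gl" is one token
```

Under the boot defaults every character is its own token, while blacklisting ~l~ as a delimiter makes ~gl~ come out as a single word, matching the bootstrap trace below.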
** Continuing the Bootstrap Code
So far, we looked at this part of the code:
#+begin_example
ldf
#+end_example
which simply creates ~l~ as a non-delimiter. Now, for the rest of the code:
#+begin_example
gldftgldfdtgl
df
dfiff1 crank f
#+end_example
~gldf~ puts ~gl~ on the stack due to ~d~ being a delimiter, and ~f~ is called on it, meaning that now ~g~ and ~l~ are
the only non-delimiters. Then, ~tgl~ gets put on the stack and they become non-delimiters with ~df~. ~dtgl~ gets
put on the stack, and the newline becomes the only non-delimiter with ~\ndf~ (yes, the newline is actually a part of
the code here, and the spaces need to be as well in order for this to work). Then, due to how delimiter
rules work (if no character was ignored, the first character is collected normally even if it is a delimiter),
the space character and ~\n~ get put on the stack as one word. Then, another ~\ \n~ word is tokenized (you might not see it, but there's another
space on line 3). The current stack looks like this (bottom to top):
#+begin_example
3. dtgl
2. [space char]\n
1. [space char]\n
#+end_example
~df~ sets the non-delimiters to ~\ \n~. ~if~ sets the ignores to ~\ \n~, which ignores these characters at the start
of tokenization. ~f~ executes ~dtgl~, which is a word that toggles the /dflag/, the flag that stores the whitelist/blacklist
distinction for delimiters. Now, all non-delimiters are delimiters and all delimiters are non-delimiters.
Finally, we're put in an environment where spaces and newlines are the delimiters for tokens, and they are ignored at the
start of tokenizing a token. Next, ~1~ is tokenized and put on the stack, and then the ~crank~ word, which is then executed
by ~f~ (the ~1~ token is treated as a number in this case, but everything textual in cognition is a word).
We are done with our bootstrapping sequence! Now, you might wonder what ~crank~ does. That we will explain in a later section.
* Bootstrapping Takeaways
From this, we see a couple of principles. First, cognition is able to change how it tokenizes on the fly, and it can do so
programmatically: you could write a cognition program that automates the process of changing
these delimiters, singlets, and ignores. This is something impossible in other languages: being able to
/program your own tokenizer for some foreign language from within cognition/, and have
/future code be tokenized exactly how you want it to be/. This is solely possible because the language is postfix
and doesn't read ahead, so it doesn't require more than one token to be parsed before an expression is evaluated. Second,
faliases allow us to execute words without having prefix words or any default execution of words.
* Crank
The /metacrank/ system allows us to set a default way in which tokens are evaluated as they arrive on the stack. The ~crank~ word takes
a number ~n~ as its argument and in effect evaluates every ~n~-th token you put on the stack. To make
this concept concrete, let's look at some code (running in what we call /crank 1/, as we set our environment to
crank one at the end of the bootstrapping sequence):
#+begin_example
5 crank 2crank 2 crank
1 crank unglue swap quote prepose def
#+end_example
The crank 1 environment allows us to stop using ~f~ in order to evaluate tokens: instead, every /1/ token that is
tokenized is evaluated. Since we programmed in a newline- and space-delimited syntax, we can safely read this code
intuitively.
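Before walking through the code, the crank rule itself can be sketched as a tiny Python state machine. This is an illustrative model only; the class name, the ~feed~/~evaluate~ methods, and modeling only ~crank~ as a builtin are assumptions made for the sketch.

```python
class Machine:
    """Toy model: in "n crank", every n-th token read is evaluated;
    all other tokens are pushed onto the stack."""
    def __init__(self, crank=0):
        self.stack, self.crank, self.counter = [], crank, 0

    def feed(self, token):
        self.counter += 1
        if self.crank and self.counter >= self.crank:
            self.counter = 0
            self.evaluate(token)      # the n-th token is evaluated, not pushed
        else:
            self.stack.append(token)

    def evaluate(self, token):
        if token == "crank":          # the only builtin in this sketch
            self.crank = int(self.stack.pop())
            self.counter = 0
        else:
            self.stack.append(token)  # non-builtins evaluate to themselves

m = Machine(crank=1)                  # the state right after bootstrap
for t in "5 crank a b c 1 crank".split():
    m.feed(t)
print(m.stack)   # → ['a', 'b', 'c']
```

In the driver above, ~5 crank~ switches to crank 5, so ~a b c 1~ are pushed and only the fifth token (~crank~) executes, consuming the ~1~ and returning us to crank 1, exactly the shape of the walkthrough below.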
The code begins by trying to evaluate ~5~, which evaluates to itself as it is not a builtin. ~crank~ evaluates and puts
us in 5 crank, meaning every /5th/ token evaluates from here on. ~2crank~, ~2~, ~crank~, ~1~ are all put on the stack,
leaving us with a stack that looks like so (notice that ~crank~ doesn't get executed even though it is a builtin, because
we set ourselves to using crank 5):
#+begin_example
4. 2crank
3. 2
2. crank
1. 1
#+end_example
~crank~ is the 5th word, so it executes. Note that this puts us back in crank 1, meaning every word is evaluated.
~unglue~ is a builtin that gets the value of the word at the top of the stack (~1~ was used up by the ~crank~ we
just evaluated), so it gets the value of ~crank~, which is a builtin. In effect, this fetches the function
pointer associated with the crank builtin. Our new stack looks like this:
#+begin_example
3. 2crank
2. 2
1. [CLIB]
#+end_example
Where CLIB is our function pointer that points to the ~crank~ builtin. We then ~swap~:
#+begin_example
3. 2crank
2. [CLIB]
1. 2
#+end_example
then ~quote~, a builtin that quotes the top thing on the stack:
#+begin_example
3. 2crank
2. [CLIB]
1. [2]
#+end_example
then ~prepose~, a builtin like ~compose~ in stem, except that it prepends instead of appending, and it puts the result in what we call a VMACRO:
#+begin_example
2. 2crank
1. ( [2] [CLIB] )
#+end_example
then we call ~def~. This defines a word ~2crank~ that puts ~2~ on the stack and then calls a function pointer pointing
us to the crank builtin. Now, we still have to define what VMACROs are, and in order to do that we might have to explain
some differences between the cognition stack and the stem stack.
** Differences
In stem, putting words on the stack directly is allowed. In cognition, words are wrapped in containers when
they are put on the stack without being evaluated. This means words like ~compose~ in stem work on words (or, more accurately,
containers holding a single word) as well as on other containers, making the API for this language more consistent.
Additionally, words like ~cd~ make use of this container concept.
*** Macros
Macros are another difference between stem quotes and cognition containers. When a macro is evaluated, everything in
the macro is evaluated, ignoring the crank. If bound to a word, evaluating that word evaluates the macro, which ignores
the crank completely and only increments the cranker by one in total, while evaluating each statement in the macro. Macros
are useful for writing crank-agnostic code, and expanding macros is very useful for optimization, although
we will have to write the word ~expand~ from more primitive words later on (hint: it uses recursive ~unglue~).
** More Code
Here is the rest of the code in ~bootstrap.cog~ in ~coglib/~:
#+begin_example
getd dup _ concat _ swap d i
_quote_swap_quote_compose_swap_dup_d_i eval
2crank ing 0 crank spc
2crank ing 1 crank swap quote def
2crank ing 0 crank endl
2crank ing 1 crank swap quote def
2crank ing 1 crank
2crank ing 3 crank load ../coglib/ quote
2crank ing 2 crank swap unglue concat unglue fread unglue evalstr unglue
2crank ing 1 crank compose compose compose compose VMACRO cast def
2crank ing 1 crank
2crank ing 1 crank getargs 1 split swap drop 1 split drop
2crank ing 1 crank
2crank ing 1 crank epop drop
2crank ing 1 crank INDEX spc OUT spc OF spc RANGE
2crank ing 1 crank concat concat concat concat concat concat =
2crank ing 1 crank
2crank ing 1 crank missing spc filename concat concat dup endl concat
2crank ing 1 crank swap quote swap quote compose
2crank ing 2 crank print compose exit compose
2crank ing 1 crank
2crank ing 0 crank fread evalstr
2crank ing 1 crank compose
2crank ing 1 crank
2crank ing 1 crank if
#+end_example
Okay, the syntax still doesn't look so good, and it's still pretty hard to see what this is doing. But the
basic idea is that ~2crank~ is a macro and is therefore crank-agnostic, and we guarantee its execution with ~ing~, another
falias (named that because it's funny). Then, we execute an ~n crank~, which standardizes which crank each line is in. (You might
wonder how ~ing~ and ~f~ interact with the cranker: each just guarantees the evaluation of the previous
item, so if the previous item was already evaluated, ~f~ and ~ing~ both do nothing.) In any case, this code defines
useful words such as ~load~, which loads a file from the coglib. It does this by ~compose~-ing things into quotes and
then ~def~-ing those quotes.
The crank, and by extension, the metacrank system is needed in order to discriminate between /evaluating/ some tokens
and /storing/ others for metaprogramming without having to use ~f~, while also keeping the system postfix. Crank
is just one word that allows for this type of behavior; the more general word, ~metacrank~, allows for much more
interesting kinds of syntax manipulation. We have examples of ~metacrank~ down the line, but for now I should explain
the /metacrank word/.
** Metacrank
~n m metacrank~ sets a periodic evaluation, with period ~m~, for the element ~n~ items down the stack. The ~crank~ word is therefore
equivalent to ~0 m metacrank~. Only one token can be evaluated per tokenized token, although /every/ metacrank is incremented
per token, and lower metacranks get priority. This means that if you set two different metacranks, only /one/ of them
can execute per tokenized token, and the lower metacrank wins. Note that metacrank, and by extension crank,
doesn't /just/ apply to tokenized words; it also applies while evaluating word definitions recursively, meaning that if a word
is evaluated in ~2 crank~, one out of every two words executes at each level of the evaluation tree. You can play around
with this in the repl to get a sense of how it works: run ~../crank bootstrap.cog repl.cog devel.cog load~, use stem-like
syntax to define a function, and then run that function in ~2 crank~. You will see that the evaluation tree
respects cranking in the same way that the program file itself does.
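The scheduling rule (every metacrank counter ticks on every token, at most one fires, and the lowest depth wins) can be modeled as follows. This is a sketch under assumptions: in particular, keeping a blocked metacrank's counter saturated so it fires on a later token is my guess, not documented behavior.

```python
def step(metacranks):
    """Advance all metacrank counters by one token.

    metacranks maps stack depth n -> [period, counter]. Returns the
    depth whose element is evaluated on this token, or None.
    Lower depths get priority, and at most one fires per token.
    """
    fired = None
    for n in sorted(metacranks):
        pc = metacranks[n]
        if pc[0] == 0:
            continue                  # period 0: this metacrank is disabled
        pc[1] += 1                    # every metacrank increments per token
        if fired is None and pc[1] >= pc[0]:
            pc[1] = 0                 # reset only the one that fires
            fired = n
    return fired

mc = {0: [1, 0], 1: [2, 0]}           # crank 1 plus a metacrank at depth 1
print([step(mc) for _ in range(3)])   # → [0, 0, 0]  depth 0 wins every token
```

With crank 1 active (depth 0, period 1), the depth-1 metacrank never gets a turn, which is exactly the priority rule described above.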
Metacrank allows for not only metaprogramming in the form of code building, but also
direct syntax manipulation (i.e. /I want to execute this token once I have read n other token(s)/). The advantages to
this system compared to other programming languages' systems are clear: you can program a prefix word and ~undef~ it
when you want to rip out that part of syntax. You can write a prefix character that doesn't stop at an ending character
but /always/ stops when you read a certain number of tokens. You can feed user input into a math program and feed the
output into a syntax system like metacrank. The possibilities are endless! And with that, we will slowly build up the
~stem~ programming language, v2, now with macros and from within our own /cognition/.
* The Stem Dialect, Improved
In this piece of code, we define the /comment/:
#+begin_example
2crank ing 0 crank ff 1
2crank ing 1 crank cut unaliasf
2crank ing 0 crank 0
2crank ing 1 crank cut swap quote def
2crank ing 0 crank
2crank ing 0 crank #
2crank ing 0 crank geti getd gets crankbase f d f i endl s
2crank ing 1 crank compose compose compose compose compose compose compose compose compose
2crank ing 0 crank drop halt crank s d i
2crank ing 1 crank compose compose compose compose compose VMACRO cast quote compose
2crank ing 0 crank halt 1 quote ing 1 quote ing metacrank
2crank ing 1 crank compose compose compose compose VMACRO cast
2crank ing 1 crank def
2crank ing 2 crank # singlet # delim
2crank ing 1 crank #comment: geti getd gets crankbase '' d '' i '\n' s ( drop halt crank s d i ) halt 1 1 metacrank
#+end_example
and it is our first piece of code that builds something /truly/ prefix. The comment character is a prefix word that drops
all the text up to the next newline character; in other words, it is a word that tells the parser to /read ahead/. This is our
first indication that everything we thought was possible within cognition truly /is/.
But before that, we can look at the first couple of lines:
#+begin_example
2crank ing 0 crank ff 1
2crank ing 1 crank cut unaliasf
2crank ing 0 crank 0
2crank ing 1 crank cut swap quote def
2crank ing 0 crank
#+end_example
which simply unaliases ~f~ from the falias list, with ~ing~ being the only remaining falias. In cognition, even these
faliases are changeable.
Since we can't put ~f~ directly on the stack (if we try by just using ~f~, it would execute), we instead utilize some
very minimal string processing to do it, putting ~ff~ on the stack and then cutting the string in half to get two copies
of ~f~. We then want ~f~ to mean false, which in cognition is just an empty word. Therefore, we make an empty word by
calling ~0 cut~ on this string, and then ~def~-ing ~f~ to the empty string. The following code is where the comment is
defined:
#+begin_example
2crank ing 0 crank #
2crank ing 0 crank geti getd gets crankbase f d f i endl s
2crank ing 1 crank compose compose compose compose compose compose compose compose compose
2crank ing 0 crank drop halt crank s d i
2crank ing 1 crank compose compose compose compose compose VMACRO cast quote compose
2crank ing 0 crank halt 1 quote ing 1 quote ing metacrank
2crank ing 1 crank compose compose compose compose VMACRO cast
2crank ing 1 crank def
2crank ing 2 crank # singlet # delim
2crank ing 1 crank #comment: geti getd gets crankbase '' d '' i '\n' s ( drop halt crank s d i ) halt 1 1 metacrank
#+end_example
Relevant background: ~halt~ just puts you in 0 for all metacranks, and ~VMACRO cast~ just turns the top thing on the stack from a
container into a macro. ~geti~, ~getd~, and ~gets~ get the ignores, delims, and singlets respectively as strings; ~drop~ is
~dsc~ in stem. ~singlet~ and ~delim~ set the singlets and delimiters. ~endl~ is defined within ~bootstrap.cog~ and just
puts the newline character as a word on the stack. ~crankbase~ gets the current crank.
We call a lot of ~compose~ words in order to build this definition, and we make the ~#~ character a singlet delimiter in
order to allow for spaces after the comment. In the ~#~ definition, we put ourselves in ~1 1 metacrank~ while altering
the tokenization rules beforehand, so that everything up to a newline is tokenized as one token and ~#~ is called on that word,
effectively dropping the comment and putting us back in the original crank and metacrank. Thus, the brilliant
~#~ character is written, operating on a token that is tokenized /in the future/, with completely default postfix syntax.
With the information above, one can work out the specifics of how it works; the point is that it /does/, and one can test
that it does by going into the ~coglib~ folder and running ~../crank bootstrap.cog repl.cog devel.cog load~, which will load
the REPL and load ~devel.cog~, which will in turn load ~comment.cog~.
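Stripped of the metacrank machinery, the observable effect of ~#~ is easy to model in Python. This is a behavioral sketch only; the real definition works by retokenizing the rest of the line as one future token and dropping it, rather than by string processing.

```python
def strip_comments(src):
    """Behavioral model of '#': drop everything from '#' to end of line."""
    out_lines = []
    for line in src.split("\n"):
        head, _, _ = line.partition("#")  # keep only the text before '#'
        out_lines.append(head)
    return "\n".join(out_lines)

print(strip_comments("1 2 + # adds them\n3"))  # → "1 2 + \n3"
```

The interesting part, of course, is that cognition achieves this behavior from within the language itself, with no special case in the parser.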
** The Great Escape
Here we define a preliminary prefix escape character:
#+begin_example
2crank ing 2 crank comment.cog load
2crank ing 0 crank
2crank ing 1 crank # preliminary escape character \
2crank ing 1 crank \
2crank ing 0 crank halt 1 quote ing crank
2crank ing 1 crank compose compose
2crank ing 2 crank VMACRO cast quote eval
2crank ing 0 crank halt 1 quote ing dup ing metacrank
2crank ing 1 crank compose compose compose compose
2crank ing 2 crank VMACRO cast
2crank ing 1 crank def
2crank ing 0 crank
2crank ing 0 crank
#+end_example
This allows for escaping, so that we can put a word on the stack even if it would otherwise be evaluated,
but we want to redefine this character eventually to be compatible with stem-like quotes. We're
even using our comment character in order to annotate this code by now! Here is the full quote definition (once we have
this definition, we can use it to improve itself):
#+begin_example
2crank ing 0 crank [
2crank ing 0 crank
2crank ing 1 crank # init
2crank ing 0 crank crankbase 1 quote ing metacrankbase dup 1 quote ing =
2crank ing 1 crank compose compose compose compose compose
2crank ing 0 crank
2crank ing 1 crank # meta-crank-stuff0
2crank ing 3 crank dup ] quote =
2crank ing 1 crank compose compose
2crank ing 16 crank drop swap drop swap 1 quote swap metacrank swap crank quote
2crank ing 3 crank compose dup quote dip swap
2crank ing 1 crank compose compose compose compose compose compose compose compose
2crank ing 1 crank compose compose compose compose compose \ VMACRO cast quote compose
2crank ing 3 crank compose dup quote dip swap
2crank ing 1 crank compose compose compose \ VMACRO cast quote compose \ if compose
2crank ing 1 crank \ VMACRO cast quote quote compose
2crank ing 0 crank
2crank ing 1 crank # meta-crank-stuff1
2crank ing 3 crank dup ] quote =
2crank ing 1 crank compose compose
2crank ing 16 crank drop swap drop swap 1 quote swap metacrank swap crank
2crank ing 1 crank compose compose compose compose compose compose compose compose \ VMACRO cast quote compose
2crank ing 3 crank compose dup quote dip swap
2crank ing 1 crank compose compose compose \ VMACRO cast quote compose \ if compose
2crank ing 1 crank \ VMACRO cast quote quote compose
2crank ing 0 crank
2crank ing 1 crank # rest of the definition
2crank ing 16 crank if dup stack swap 0 quote crank
2crank ing 2 crank 1 quote 1 quote metacrank
2crank ing 1 crank compose compose compose compose compose compose compose compose
2crank ing 1 crank compose \ VMACRO cast
2crank ing 0 crank
2crank ing 1 crank def
#+end_example
Um, it's quite the spectacle how Matthew Hinton ever came up with this thing, but alas, it exists. Then, we use it
to redefine itself, but better, as the old quote definition can't do recursive quotes
(we can do this because the definition is /used/ before the word is redefined, due to postfix ~def~, a
development pattern seen often in low-level cognition):
#+begin_example
\ [
[ crankbase ] [ 1 ] quote compose [ metacrankbase dup ] compose [ 1 ] quote compose [ = ] compose
[ dup ] \ ] quote compose [ = ] compose
[ drop swap drop swap ] [ 1 ] quote compose [ swap metacrank swap crank quote compose ] compose
[ dup ] quote compose [ dip swap ] compose \ VMACRO cast quote compose
[ dup dup dup ] \ [ quote compose [ = swap ] compose \ ( quote compose [ = or swap ] compose \ \ quote compose [ = or ] compose
[ eval ] quote compose
[ compose ] [ dup ] quote compose [ dip swap ] compose \ VMACRO cast quote compose [ if ] compose \ VMACRO cast
quote compose [ if ] compose \ VMACRO cast quote quote
[ dup ] \ ] quote compose [ = ] compose
[ drop swap drop swap ] [ 1 ] quote compose [ swap metacrank swap crank ] compose \ VMACRO cast quote compose
[ dup dup dup ] \ [ quote compose [ = swap ] compose \ ( quote compose [ = or swap ] compose \ \ quote compose [ = or ] compose
[ eval ] quote compose
[ compose ] [ dup ] quote compose [ dip swap ] compose \ VMACRO cast quote compose [ if ] compose \ VMACRO cast
quote compose [ if ] compose \ VMACRO cast quote quote
compose compose [ if dup stack swap ] compose [ 0 ] quote compose [ crank ] compose
[ 1 ] quote dup compose compose [ metacrank ] compose \ VMACRO cast
def
#+end_example
Okay, so now we can use recursive quoting, just like in stem. But there are still a couple of things missing that we
want: a good string quote implementation, and escape characters that work inside the brackets. Also, since Cognition
utilizes macros, we probably want a way to notate those as well, and a way to expand macros. We can do
all of that! First, we will have to redefine ~\~ once more:
#+begin_example
\ \
[ [ 1 ] metacrankbase [ 1 ] = ]
[ halt [ 1 ] [ 1 ] metacrank quote compose [ dup ] dip swap ]
\ VMACRO cast quote quote compose
[ halt [ 1 ] crank ] VMACRO cast quote quote compose
[ if halt [ 1 ] [ 1 ] metacrank ] compose \ VMACRO cast
def
#+end_example
This piece of code defines the bracket, but for macros (~split~ just splits a list into two):
#+begin_example
\ (
\ [ unglue
[ 11 ] split swap [ 10 ] split drop [ macro ] compose
[ 18 ] split quote [ prepose ] compose dip
[ 17 ] split eval eval
[ 1 ] del [ \ ) ] [ 1 ] put
quote quote quote [ prepose ] compose dip
[ 16 ] split eval eval
[ 1 ] del [ \ ) ] [ 1 ] put
quote quote quote [ prepose ] compose dip
prepose
def
#+end_example
We want these macros to automatically expand, because it's more efficient to bind already-expanded macros to words,
and they evaluate identically either way (~isdef~ just checks whether a word is defined, returning a boolean where true is
a non-empty string and false is an empty string):
#+begin_example
\ (
( crankbase [ 1 ] metacrankbase dup [ 1 ] =
[ ( dup \ ) =
( drop swap drop swap [ 1 ] swap metacrank swap crank quote compose ( dup ) dip swap )
( dup dup dup \ [ = swap \ ( = or swap \ \ = or
( eval )
( dup isdef ( unglue ) [ ] if compose ( dup ) dip swap )
if )
if ) ]
[ ( dup \ ) =
( drop swap drop swap [ 1 ] swap metacrank swap crank )
( dup dup dup \ [ = swap \ ( = or swap \ \ = or
( eval )
( dup isdef ( unglue ) [ ] if compose ( dup ) dip swap )
if )
if ) ]
if dup macro swap
[ 0 ] crank [ 1 ] [ 1 ] metacrank ) def
#+end_example
And you can see that as we define more things, our language is beginning to look more or less like it has syntax!
There are more things in this ~quote.cog~ file we have been looking at, but the bulk of it is pretty much done.
From here on, I will just explain the syntax programmed by quote.cog instead of