Tacit Talk
Tacit Talk is a podcast about programming languages, combinators, algorithms and more!
Episode 33: From List Calculus to Array Calculi (Bird's Laws, AoP, MoA & SaC) 🟦
In this episode, "From List Calculus to Array Calculi" (generated by GPT 5.4) is read by the Speechify text-to-speech app.
Socials
- Tacit Talk YouTube Playlist
- Conor Hoekstra: LinkTree / Bio
Show Notes
Date Released: 2026-03-18
- From List Calculus to Array Calculi
- An Introduction to the Theory of Lists (1986)
- Mathematics of Arrays (MoA)
- The Algebra of Programming (1996)
- Algebraic Identities for Program Calculation (1989)
- Algorithm Design with Haskell (2020)
- Bird-Meertens Formalism (BMF)
- The School of Squiggol: A History of BMF (2019)
- Single Assignment C (SaC)
- Lectures on Constructive Functional Programming
Welcome to episode 33 of Tacit Talk, released on March 18th, 2026. In today's episode, we're going to do something similar to what we did in episode 32. We had GPT 5.4 compile a deep survey on the path from the list calculus presented in An Introduction to the Theory of Lists by Richard Bird in 1986, all the way through to the Mathematics of Arrays and Single Assignment C, and, along the way, the Algebra of Programming. Hope you enjoy.
SPEAKER_01: From List Calculus to Array Calculi: Bird's algebraic laws, the theory of lists, the algebra of programming, and the MoA/SaC lineage. Compiled from Bird, Meertens, Gibbons, Mullin, and Scholz. March 2026.

Introduction. This document is a deep-dive survey of a connected family of ideas: Richard Bird's algebraic laws for program calculation; the list calculus of An Introduction to the Theory of Lists; the notational and methodological shift from early Bird-Meertens Squiggol to the relational style of The Algebra of Programming; Bird's lesser-known extension of the same program-calculation story from lists to arrays in the 1988 lectures; and the parallel but distinct array-calculus line running from APL through Lenore Mullin's Mathematics of Arrays (MoA) and psi calculus to Sven-Bodo Scholz's SaC.

The central claim of this survey is that these are not identical traditions, but they are close neighbors. Bird's list work and Mullin's array work share an algebraic ambition: begin with a clear specification, identify a small set of structural operators, and derive efficient implementations by equational reasoning. They diverge in what they take to be the primary structure of data. Bird starts from inductive structure and recursion patterns, such as map, fold, scan, and homomorphism. Mullin starts from shape, indexing, and layout-neutral array transformations.

Accordingly, this paper separates four layers that are often conflated. 1. Bird's list calculus: a calculus of list combinators and algebraic laws. 2. Classic Squiggol (BMF) notation: the historical notation used in the mid-1980s list monographs. 3. The Algebra of Programming (AoP): a later relational and categorical reformulation that is in the same lineage but not written in the old list notation. 4. Array calculi: a broader family including APL, Bird's own array lectures, and the MoA/psi-calculus/SaC line. The practical payoff of disentangling these layers is conceptual clarity.
It explains why a page from the theory of lists looks APL-like, why a page from The Algebra of Programming does not, why Bird can still be said to have an array calculus in PRG-69, and why MoA and SaC should be understood as related but separate descendants of the same algebraic impulse.

1. Genealogy in one page. The short historical arc. The shortest accurate genealogy runs as follows. 1. APL established whole-array operators such as reduce and scan as first-class notation. 2. Bird's mid-1980s list monographs built a calculus of list combinators and algebraic laws. 3. The Squiggol notation of that era gave the calculus its concise symbolic surface. 4. Bird's 1989 paper restated the same laws in ordinary functional notation. 5. The Algebra of Programming reformulated the approach relationally and extended it to optimization problems. 6. Mullin's MoA and psi calculus developed a different, array-centered algebra based on shape and indexing. 7. SaC pursued a practical functional array language with compiler optimizations, and explicitly cited MoA and psi reduction in early work.

What is shared? Across these lines, the shared themes are: whole-program transformation by algebraic law; compact notation as a tool of thought; elimination of intermediate structures; concern with parallelism and structural decomposition; and a preference for datatype-generic or dimension-generic operators over ad hoc recursion.

What differs? The crucial differences are structural. List calculus reasons over an inductive data type with constructors, concatenation, and folds. AoP reasons relationally and categorically over far more general datatype structure. MoA reasons by shape vectors, index maps, and array-layout semantics. SaC is a programming language and compiler framework, not merely a pen-and-paper calculus.

Once this genealogy is kept in view, many apparent contradictions disappear. AoP is both in the Bird-Meertens tradition and not written in classic Squiggol. MoA is both an array calculus and not Bird's list calculus generalized. SaC is both indebted to MoA and a separate language-design project with its own implementation agenda.

2. Bird's list calculus. Primary source: Richard Bird, An Introduction to the Theory of Lists, PRG-56, 1986 (Springer version 1987). What list calculus means here. In Bird's setting, a calculus is not merely a notation.
It is a package of primitive list operators, equational definitions, algebraic laws that permit rewriting, and standard derivation patterns for turning slow specifications into fast programs. So list calculus here means an algebra of list operators in which proofs and program derivations proceed by substitution of equals for equals. The central operators are not arbitrary. They are chosen because they capture recurring recursion schemes: concatenation, map, reduce (fold), filter, directed folds, accumulations, scans, prefixes, suffixes, and segments.

Primitive combinators. Bird's early list calculus revolves around a small set of combinators: f* (map f over a list); ⊕/ (reduce a list using an associative operator ⊕); p◁ (filter by a predicate p); directed reductions, corresponding to what later became foldl and foldr; directed accumulations, corresponding to what later became scanl and scanr; and inits, tails, and segs. This is one of the reasons the notation looks close to APL. The slash is not a cosmetic flourish; it advertises the idea that aggregation is a structural operator in its own right.

Why it counts as a calculus. Bird explicitly wanted a system in which common derivation patterns could be used the way one uses standard rules in integral calculus. The point was not simply to define map and fold, but to establish reusable theorems: push a map through a concatenation; replace repeated reductions on overlapping prefixes with a single scan; identify when a function is a homomorphism; exploit distributivity to collapse a quadratic specification into a linear fold. This is what separates the list calculus from mere functional programming notation.

The Bird-Meertens connection. Jeremy Gibbons's The School of Squiggol is especially helpful here.
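To make the vocabulary concrete, here is a minimal Python transliteration of these combinators (the function names and the particular definition of segs are my choices, not Bird's notation), together with two of the laws the survey discusses, checked on sample data:

```python
from functools import reduce

def mapl(f, xs):            # f*  : apply f to every element
    return [f(x) for x in xs]

def filterl(p, xs):         # p◁ : keep the elements satisfying p
    return [x for x in xs if p(x)]

def foldl(op, e, xs):       # directed reduction, left to right, with unit e
    return reduce(op, xs, e)

def scanl(op, e, xs):       # directed accumulation: every foldl prefix result
    out = [e]
    for x in xs:
        out.append(op(out[-1], x))
    return out

def inits(xs):              # all prefixes, from [] up to xs itself
    return [xs[:i] for i in range(len(xs) + 1)]

def tails(xs):              # all suffixes, from xs itself down to []
    return [xs[i:] for i in range(len(xs) + 1)]

def segs(xs):               # all contiguous segments: tails of every prefix
    return [t for p in inits(xs) for t in tails(p)]

add = lambda a, b: a + b
xs = [3, 1, 4, 1, 5]

# Accumulation lemma: scanl op e == map (foldl op e) . inits
assert scanl(add, 0, xs) == mapl(lambda p: foldl(add, 0, p), inits(xs))

# Last-fold identity: last . scanl op e == foldl op e
assert scanl(add, 0, xs)[-1] == foldl(add, 0, xs)
```

The two assertions mirror the scan laws stated later in the survey: the scan is exactly the list of prefix folds, and its last element is the full fold.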
He describes BMF as a concise functional notation for equational derivation and emphasizes that the notation was wide-spectrum: some of it was close to executable code, but some of it was designed as a higher-level notation for calculation. That is exactly the role the list calculus plays in PRG-56. Significance. Bird's list calculus is the clearest locus of the early BMF ideal. It is the place to look if one wants the most direct answer to questions like: What are the core laws? Why are slash and star used the way they are? What is meant by the scan lemma or promotion? Why does the maximum segment sum derivation work?

3. Bird's core algebraic laws. Bird's later papers vary their notation, but the core laws remain remarkably stable. This section gives the canonical list-level laws in modernized form, together with the role each law plays.

Map and composition (map fusion). map f . map g = map (f . g). This is the law that lets two full traversals collapse into one when the only effect is element-wise transformation.

Map promotion. map f . concat = concat . map (map f). This is Bird's promotion idea in its most elementary form: a map over a compound structure can be promoted into the components.

Reduction and homomorphism (reduce promotion). fold (⊕) . concat = fold (⊕) . map (fold (⊕)), for associative ⊕ with a unit. This is the reduction counterpart of map promotion. It is the law that makes divide-and-conquer and chunked evaluation algebraically visible.

Homomorphism lemma. If h satisfies h (xs ++ ys) = h xs ⊕ h ys, then h can be factored as h = fold (⊕) . map f for a suitable f, and conversely, every such factorization is a list homomorphism. This law is the bridge between structural decomposition and parallel computation. Once a function is a homomorphism, chunking and recombination are algebraically justified.

Directed folds and duality (duality lemma). foldl (⊕) e = foldr (⊗) e . reverse, where x ⊗ y = y ⊕ x.
This law states that left and right reduction are not alien operators but duals.

Scans and accumulations (accumulation lemma). scanl (⊕) e = map (foldl (⊕) e) . inits. This is one of Bird's most practically important insights. The left side is a linear-time accumulation. The right side is a clear but naive specification in terms of all prefixes and a fold per prefix. The law is therefore both a specification theorem and an optimization theorem.

Last-fold identity. last . scanl (⊕) e = foldl (⊕) e. The scan contains the fold result as its final element.

Distributivity and Horner's rule. Horner's rule: if ⊗ distributes backwards over ⊕, then a nested reduce over tail patterns can be collapsed into a single directed fold. In Bird's list calculus idiom, this is the move that turns a "reduce each tail, then reduce again" specification into a linear-time computation. It is called Horner's rule because the structure mirrors polynomial evaluation, but Bird deploys it as a general derivation pattern rather than a one-off numerical trick.

Fold fusion. Bird's 1989 paper and later books make fold fusion explicit in ordinary functional notation: f . foldr (⊕) e = foldr (⊗) e', provided the usual side conditions hold. This law removes intermediate results between a producer and a consumer. In practical optimization folklore, it is one of the most reusable laws in the entire tradition.

Fold-scan fusion. Bird's 1989 paper also gives a representative law fusing a fold after a scan into a single fold over a pair-valued state. This is the law that makes the maximum segment sum derivation end not merely with a scan, but with an intermediate-structure-free linear pass.

Significance. These laws are the stable nucleus of Bird's program-calculation enterprise. They are the reason later notation changes matter less than they first appear to. Whether the symbols are APL-like, squiggly, Miranda-like, Haskell-like, or categorical, the same small body of structural facts keeps reappearing. 4.
Why the laws matter: the maximum segment sum. The most famous Bird derivation is the maximum segment sum (MSS) problem. In its specification-first form, mss = maximum . map sum . segs. Read literally, this says: 1. Generate every segment of a list. 2. Compute the sum of each segment. 3. Take the maximum. This is clear, but cubic if implemented naively.

The standard derivation pattern. Bird's derivation proceeds by chaining laws. 1. Expand segs into concat . map tails . inits. 2. Use map promotion and reduce promotion. 3. Apply Horner's rule to the inner tails computation. 4. Use the scan lemma to replace repeated work over prefixes. 5. Optionally use fold-scan fusion to remove the intermediate scan result. The magic is not one clever trick but a small, disciplined vocabulary of reusable laws.

What the example demonstrates. The MSS example demonstrates three things at once: laws can turn clarity into efficiency without ad hoc invention; scans are not merely prefix sums, but optimization devices; and the calculus is useful because several laws compose. This is why MSS became the canonical BMF example. It is short enough to fit on a few pages, but rich enough to exhibit promotion, accumulation, Horner's rule, and fusion in one derivation.

Significance. If one wants the emotional center of the Bird tradition, MSS is it. Many communities have a signature derivation that shows why the style is worth learning. For BMF, MSS plays that role.

5. Notation: APL, Squiggol, Miranda, Haskell. One of the most common sources of confusion in this area is the assumption that notation and method move in lockstep. They do not. The method stays relatively stable while the notation changes several times.

APL origins. APL contributed the most visible symbolic inheritance: +/ for reduction by addition, ×/ for product, +\ for scan, and array-wide operators as first-class notation. This is the direct ancestor of Bird's use of slash for reduce. The Squiggol period.
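The MSS story above can be replayed concretely: the cubic specification and the one-pass program that the derivation justifies agree on sample data. A Python sketch (my transliteration; the linear pass is the familiar Kadane-style fold that the Horner and scan-fusion steps lead to):

```python
def segs(xs):
    # every contiguous segment: tails of every prefix
    prefixes = [xs[:i] for i in range(len(xs) + 1)]
    return [p[j:] for p in prefixes for j in range(len(p) + 1)]

def mss_spec(xs):
    # mss = maximum . map sum . segs  (clear but cubic)
    return max(sum(s) for s in segs(xs))

def mss_linear(xs):
    # derived form: one pass carrying (best segment ending here, best so far)
    ending_here, best = 0, 0
    for x in xs:
        ending_here = max(ending_here + x, 0)   # Horner-style collapse over tails
        best = max(best, ending_here)           # fold fused with the scan
    return best

xs = [31, -41, 59, 26, -53, 58, 97, -93, -23, 84]
assert mss_spec(xs) == mss_linear(xs) == 187
```

Both versions admit the empty segment (sum 0), which is why the linear pass clamps at zero; on an all-negative input both return 0.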
In the mid-1980s, Bird and Meertens used a more explicitly symbolic notation: a star for map, a slash for symmetric reduce, directional arrows over the slash for left and right folds, and a doubled slash with arrows for scans. This is what people usually mean by Squiggol notation. It is list-calculus friendly, concise, and historically tied to the BMF era.

The retreat to functional syntax. By 1988 and 1989, Bird had largely retreated to a more accessible notation: map, foldl, foldr, scanl, scanr. Jeremy Gibbons's history explains this shift as part of a broader ebb and flow: the squiggles were elegant for insiders, but increasingly off-putting for most readers.

Why AOP looks different. By the time of The Algebra of Programming, the notation had shifted again, this time not just from squiggles to ordinary functional syntax, but from a list-focused algebra of functions to a relational, categorical algebra over much broader datatype structure. So the important distinction is: early Bird list papers, same method and often squiggly notation; the 1989 Bird paper, same method, ordinary functional notation; AOP, same broad lineage, but a different formal surface and a broader mathematical setting.

Significance. The lesson is that Bird's laws, Squiggol, and AOP notation are not synonyms. They overlap historically, but they are not coextensive.

6. The Algebra of Programming is in the lineage, but it is not early Squiggol. Primary source: Bird and de Moor, The Algebra of Programming, 1996 (dated 1997). The apparent puzzle. Readers often meet the older list papers first, then open AOP and feel they have landed in a different tradition altogether. That impression is understandable. AOP pages are filled with relational composition; operators such as pair, zip, listr, outl, outr, and nil; categorical operators and datatype functors; and point-free relational calculations rather than list-operator equations. The appearance is different because the mathematical framework is different. What changed? AOP makes three large moves.
1. From functions to relations as the base mathematical object. 2. From list-specific recursion operators to datatype-generic constructions such as catamorphisms and anamorphisms. 3. From derivations centered on scans and folds to a larger theory of greedy algorithms, dynamic programming, and optimization.

What did not change. AOP is still recognizably in the Bird-Meertens school in three senses. It is still about program derivation by equational law. It still values compact, point-free expression. It still generalizes the same basic insights about structural operators and fusion. Jeremy Gibbons's history captures this exactly: AOP is presented as the culmination of the BMF effort, but also as a point at which the community began to lose enthusiasm for both the squiggly notation and the relational complexity.

Why the user-level impression is correct. So the right answer to the common observation "AOP isn't using Squiggol" is: yes, that is correct. AOP is not written in the old list-calculus notation. It is best understood as a later generalization in the same research line, not as a mere reprint of the 1986 list theory in new typography.

Significance. This distinction matters historically. It explains why there was no real "AOP in Haskell" edition or "AOP in BMF" edition. There was one AOP book, and it sat between earlier list-calculus papers and later Haskell-oriented presentations of similar algorithmic ideas.

7. Bird's own array generalization. This matters because it shows that the Bird line does contain its own array story; it is just not the same thing as MoA. What Bird generalizes. PRG-69 develops array accumulations along different axes, orthogonal accumulation laws, an array analog of Horner's rule, segment problems for arrays, and later extensions to trees. The array move therefore mirrors the list story: identify structural operators, prove their laws, use them to derive efficient programs.
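The flavor of accumulation along different axes can be illustrated with 2-D prefix sums: accumulating along rows and then along columns (or in the other order) builds a summed-area table, a toy instance of the orthogonality idea. A Python sketch over nested lists (the encoding is my illustration, not PRG-69's notation):

```python
def scan_rows(op, grid):
    # accumulate left-to-right within each row
    out = []
    for row in grid:
        acc = []
        for x in row:
            acc.append(x if not acc else op(acc[-1], x))
        out.append(acc)
    return out

def transpose(grid):
    return [list(col) for col in zip(*grid)]

def prefix_sums_2d(grid):
    # orthogonality: row accumulation and column accumulation commute,
    # so the 2-D table can be built one axis at a time, in either order
    add = lambda a, b: a + b
    rows_done = scan_rows(add, grid)
    return transpose(scan_rows(add, transpose(rows_done)))

grid = [[1, 2],
        [3, 4]]
assert prefix_sums_2d(grid) == [[1, 3],
                                [4, 10]]
```

Each output cell holds the sum of the rectangle from the origin to that cell, which is what makes such accumulations useful for array-level segment problems.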
What stays list-like. Even in the array lectures, Bird's mental model remains recognizably tied to the program-calculation tradition of folds, scans, and segmentation. The generalization proceeds from known structural recursion ideas rather than from an indexing algebra. This is the sharp difference from MoA. Bird's arrays are a generalization of the functional derivation story. Mullin's arrays are an index and shape calculus from the ground up.

Why this section matters. People sometimes say there is no Bird-style array calculus because the famous list papers are list-centric. That is too strong. PRG-69 shows that Bird did indeed pursue arrays, but he pursued them in his own idiom: accumulations, orthogonality, segmentation, and generalized Horner reasoning.

Significance. Bird's array lectures are the missing bridge between the list calculus and the broader question: is there an array calculus? The answer is yes, but there are multiple such calculi, and Bird's is only one member of the family.

8. Array calculi beyond Bird. If one asks for an array calculus in the way one asks for Bird's list calculus, the answer is plural rather than singular. There are several.

The APL tradition. APL is the oldest and most visible ancestor. Arrays are fundamental values rather than lists in disguise. Rank and axis matter directly. Operations such as reduce, scan, transpose, reshape, and catenate are primitive. Notation is rank-polymorphic and whole-array oriented. APL therefore functioned as a practical array algebra long before most PL research communities framed the issue in those terms.

What an array calculus has to explain. Compared with list calculus, an array calculus must address extra structure: shape vectors, rank, and axis; indexing, permutation, and transpose; reshape and layout; reduction along a selected axis; broadcasting or shape agreement; and cell structure rather than merely head/tail decomposition. This is why list calculus laws do not simply scale up without change.
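The extra structure is visible even in a tiny example: on a list, reduction is one operation, but on a rank-2 array the axis must be named, and the result shape is the input shape with that axis removed. A Python sketch over plain nested lists (my encoding, for illustration only):

```python
from functools import reduce

def shape(a):
    # shape vector of a rectangular nested list
    s = []
    while isinstance(a, list):
        s.append(len(a))
        a = a[0]
    return s

def reduce_axis(op, a, axis):
    # reduce a rank-2 array along axis 0 (down columns) or axis 1 (across rows)
    if axis == 1:
        return [reduce(op, row) for row in a]
    return [reduce(op, col) for col in ([list(c) for c in zip(*a)])]

a = [[1, 2, 3],
     [4, 5, 6]]
add = lambda x, y: x + y
assert shape(a) == [2, 3]
assert reduce_axis(add, a, 0) == [5, 7, 9]   # result shape [3]
assert reduce_axis(add, a, 1) == [6, 15]     # result shape [2]
```

The list-calculus reduce corresponds to just one of these choices; the other is invisible until shape and axis become part of the theory.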
The broad PL research family. Modern array-language research often presents these ideas in typed or compiler-oriented forms: data-parallel calculi, second-order array combinators, rank-polymorphic calculi, shape-aware intermediate languages. Historically, however, the two most relevant names for the present discussion are still APL and MoA.

Significance. The right comparison is therefore not Bird's list calculus versus no array calculus, but Bird's list calculus versus a family of array calculi whose primitive notions are shape, rank, and indexing rather than list induction.

9. Mathematics of Arrays and the psi calculus. Primary source: Lenore M. Restifo Mullin, A Mathematics of Arrays, PhD dissertation, Syracuse, 1988. The core idea. Mullin's thesis treats arrays in terms of shape, indexing, and symbolic verification of n-dimensional array expressions. Its headline technical move is to use the indexing function psi as a basis for defining array operations. In later MoA exposition, the psi calculus is described as a calculus of indexing with shapes.

The structural contrast with Bird. Bird's list calculus begins with constructors and reductions over inductive structure. Mullin's MoA begins with the fact that arrays are indexed objects of fixed shape. Put bluntly, Bird asks how to calculate with recursion patterns; Mullin asks how to calculate with shape and indexing. This difference is not superficial. It changes the most natural laws. In Bird, scanl (⊕) e = map (foldl (⊕) e) . inits is archetypal. In MoA, the central simplifications involve rewriting array expressions through shape-index identities and reducing them toward lower-level indexing forms.

DNF and ONF. Later MoA descriptions often distinguish a denotational normal form, where arrays are viewed abstractly via shapes and index-to-value functions, from an operational normal form, where layout and machine realization are made explicit.
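The psi idea can be caricatured in a few lines: treat an array as a shape vector plus an index-to-value function, and define operations such as transpose as pure index maps, so that composed operations simplify to a single indexing function without materializing intermediates. A toy Python sketch, loosely in the spirit of MoA (all names here are mine, not MoA's notation):

```python
class Arr:
    """An array as a shape vector plus an index-to-value function (a toy psi)."""
    def __init__(self, shape, psi):
        self.shape = shape      # e.g. [2, 3]
        self.psi = psi          # tuple index -> value

    def to_nested(self):
        # materialize only at the very end, by enumerating all indices
        def build(prefix, dims):
            if not dims:
                return self.psi(tuple(prefix))
            return [build(prefix + [i], dims[1:]) for i in range(dims[0])]
        return build([], self.shape)

def from_nested(a):
    sh, x = [], a
    while isinstance(x, list):
        sh.append(len(x))
        x = x[0]
    def psi(ix):
        v = a
        for i in ix:
            v = v[i]
        return v
    return Arr(sh, psi)

def transpose(arr):
    # transpose is pure index bookkeeping: no data moves
    return Arr(arr.shape[::-1], lambda ix: arr.psi(ix[::-1]))

a = from_nested([[1, 2, 3], [4, 5, 6]])
assert transpose(a).to_nested() == [[1, 4], [2, 5], [3, 6]]
# a double transpose composes two index reversals, i.e. the identity index map
assert transpose(transpose(a)).to_nested() == [[1, 2, 3], [4, 5, 6]]
```

The double transpose never builds an intermediate array; the two index reversals cancel inside the composed psi, which is the flavor of simplification psi reduction performs on whole expressions.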
This is one of the reasons MoA appealed to high-performance and parallel-computing researchers. It offered a bridge from algebraic expression to concrete, layout-sensitive implementation.

Why psi matters. The importance of psi is not merely symbolic. It expresses the thesis that array operations can be reduced to disciplined reasoning about indexed access and shape transformation. In that sense, psi is for MoA what slash and star are for Bird's early list calculus: a compact emblem of the core structural move.

Significance. MoA is one of the strongest candidates for a genuine array calculus in the same broad intellectual sense as Bird's list calculus: a compact structural theory, explicit transformation laws, and a path from clear array specifications to efficient implementation.

10. SaC and shape-invariant array programming. Primary sources: Sven-Bodo Scholz's early SaC papers from the 1990s, and the later JFP article Single Assignment C, 2003. What SaC is. SaC is a strict functional language with syntax close to C and arrays as its dominant data structure. It aims to combine high-level array programming, efficient compilation, dimension-invariant or shape-invariant operators, and fusion-like elimination of intermediate arrays. SaC is thus not just an array calculus, but a vehicle for implementing one.

The important connection to MoA. An early Scholz paper from 1994 makes the connection quite plainly. When discussing the elimination of intermediate arrays, Scholz writes that this kind of optimization has a mathematical basis in the Mathematics of Arrays (Mullin, 1988), where it is called psi reduction. That sentence is important because it establishes, in a primary SaC source, that MoA predates SaC, that psi reduction was recognized as relevant to array optimization, and that the SaC line explicitly cites Mullin's work rather than silently duplicating it. What SaC adds. SaC contributes a different emphasis.
Language design: familiar C-like syntax, with with-loops for shape-invariant array construction and transformation. Compiler technology: loop fusion, scalarization, and intermediate-array elimination. Practical performance claims against Fortran and Sisal. So if MoA is a mathematically explicit array calculus, SaC is closer to a programmable and compilable array language informed by such theory.

Where SaC sits relative to Bird. SaC is much closer in surface spirit to APL-descended array languages than to Bird's list calculus. Yet it shares with Bird the transformational ambition: high-level structure should remain visible long enough for algebraic or compiler transformations to remove unnecessary overhead.

Significance. SaC belongs in this survey because it demonstrates that the algebraic array-programming dream did not stop at notation or theorem proving. It became part of a language and compiler agenda.

11. Comparative synthesis. The deepest difference is not syntax but the unit of reasoning. Bird reasons by structural decomposition of recursive data. Mullin reasons by indexed access to fixed-shape objects. AOP abstracts upward into relational and categorical structure. SaC descends downward into a language and compiler implementation.

The family resemblance. Despite the differences, they share a family resemblance: concise structural operators, derivation over ad hoc coding, laws before low-level implementation, concern with intermediate-structure elimination, and compatibility with parallel and high-performance thinking.

Significance. This is why these traditions are best studied together. They illuminate different answers to the same large question: how can data-structure-sensitive algebra guide program construction?

12. Recommended reading order. For a reader coming to the area fresh, the following order is much more illuminating than publication order alone. First pass. 1. Bird, Algebraic Identities for Program Calculation, 1989. 2.
Bird, An Introduction to the Theory of Lists, 1986. 3. Gibbons, The School of Squiggol, 2020. This gives the laws, the full calculus, and the historical frame. Second pass. 4. Bird, Lectures on Constructive Functional Programming, 1988, especially the array lecture. 5. Bird and de Moor, The Algebra of Programming, 1996/1997. This shows both the expansion to arrays and trees and the later relational turn. Third pass. 6. Mullin, A Mathematics of Arrays, 1988. 7. Later MoA/psi expositions. 8. Early SaC papers, plus the 2003 JFP article. This gives the separate array-calculus line and its implementation-oriented descendant.

Significance. Read in this order, the technical story becomes much easier to parse: first the laws, then the list calculus, then the notational history, then the array generalizations, then the separate MoA/SaC array lineage.

Conclusion. The phrase "Bird's algebraic laws" names a stable body of derivational laws that first achieved their clearest list-level statement in the mid-1980s. Those laws constitute a genuine list calculus: a compact algebra of list combinators used to calculate programs by equational rewriting. The phrase "Squiggol" names, more narrowly, the historical style in which much of that work was first notated. It is therefore possible for a work to belong to the Bird-Meertens tradition without looking squiggly on the page. That is exactly what happens in The Algebra of Programming. AOP is a late and powerful development of the same broad derivational tradition, but it is not a return to the 1986 list syntax. It is relational, point-free, and categorical. On the array side, there is no single universally accepted array calculus, but there are several serious candidates. Bird's own 1988 lectures show one path: generalize the fold-scan-segment story to arrays. Mullin's MoA and psi calculus show another: build the theory around shape and indexing from the outset.
SaC then carries part of this array-oriented ambition into a practical functional language and compiler design. Taken together, these works form not one straight line but a braided history. The braid has recurring themes (concise structural notation, equational transformation, elimination of intermediate structures, and performance derived from laws), but its strands do not collapse into a single notation or a single formal system. Understanding that distinction is the key to reading the literature without confusion.